Slack is scraping your messages and files to develop AI by default
"To develop AI/ML models, our systems analyse Customer Data (e.g. messages, content and files) submitted to Slack..."
See our updated story of Monday, May 20 here for clarification from Slack on its approach to generative and non-generative AI.
Slack is scraping customers’ messages and file content to "develop" AI/ML models – and has been doing it by default since at least October 2023.
Customers need to opt out by emailing the company. Many have only just noticed.
(The UK Information Commissioner’s Office (ICO) warns companies in its online guidance on consent: “Consent requires a positive opt-in... Don’t use pre-ticked boxes or any other method of default consent.”)
A furore on Hacker News and social channels suggests the realisation has landed badly with many customers, despite Slack’s claimed safeguards.
These include the claim that “When developing AI/ML models or otherwise analysing Customer Data, Slack can’t access the underlying content.”
See also: Dropbox’s AI integration with OpenAI turns into a messaging mess – as Amazon’s CTO apologises over data protection post
Slack reveals the behaviour on its privacy principles page. The Stack reviewed archived versions of the page on the Wayback Machine and found identical language as early as October 2023.
(The Stack has attempted to reach Slack to ask when it started using customer data to develop AI/ML models. As Slack customers ourselves, we do NOT recall being clearly notified and may end our use of the product as a result.)
Slack: "To develop AI/ML models, our systems analyse..."
“To develop AI/ML models, our systems analyse Customer Data (e.g. messages, content and files) submitted to Slack,” the page says.
"Contact us to opt out. If you want to exclude your Customer Data from Slack global models, you can opt out. To opt out, please have your org, workspace owners or primary owner contact our Customer Experience team at [email protected] with your workspace/org URL and the subject line ‘Slack global model opt-out request’. We will process your request and respond once the opt-out has been completed" says Slack in its Privacy Principles.
Slack says: “For any model that will be used broadly across all of our customers, we do not build or train these models in such a way that they could learn, memorise, or be able to reproduce some part of Customer Data... Slack aggregates and disassociates Customer Data such that Slack’s use of Customer Data to update the Services will never identify any of our customers or individuals as the source of any of these improvements to any third party, other than to Slack’s affiliates or sub-processors.”
(Its sub-processors include Amazon, Google, and Zendesk.)
The company also collects user data to “identify organisational trends and insights,” the Salesforce-owned firm’s privacy policy adds.
In an April blog, however, a trio of Slack engineers said explicitly that “we do not train large language models (LLMs) on customer data”; that Slack is able to “host and deploy closed-source large language models (LLMs) in an escrow VPC, allowing us to control the lifecycle of our customers’ data and ensure the model provider has no access to Slack’s customers’ data”; and that for its Slack AI product, “conversation summaries and search answers all generate point-in-time responses that are not stored on disk.”
The Stack has requested clarification on a number of points. What appears to be poor communication, and perhaps a lack of joined-up thinking across engineering, legal, and communications teams, has previously caused problems for organisations like Dropbox looking to add AI augmentations to their products for customer convenience.
“What Slack's actually created is a jurisdictional headache for any team who uses it. Say you're a global remote distributed agency with staff in 30 countries working for clients in almost as many countries. How is Slack's AI going to know what data lives where?” noted one privacy expert.
“How are they going to segment any opted-out data from the data they hoovered up American-style? What also concerns me is whether or not Slack is asking permission after the fact that they've already fed everyone's client data into an AI. [For customers just noticing Slack’s behaviour and opting out] how are they going to take that out?”
Slack says in its privacy principles: "Our guiding principle as we build this product is that the privacy and security of Customer Data is sacrosanct, as detailed in our privacy policy, security documentation and SPARC and the Slack Terms." A review by The Stack shows that not a single one of those four documents makes any mention of AI or machine learning.
More to follow.
Views? Get in touch.