Slack under attack over sneaky AI training policy
On the heels of ongoing issues around how big tech is appropriating data from individuals and businesses in the training of AI services, a storm is brewing among Slack users upset over how the chat platform is charging ahead with its AI vision. The company, like many others in this space, is tapping its own user data to train some new AI services, but it turns out that if you don’t want Slack to use your data, you have to email the company to opt out.
The terms of that engagement are tucked away in what appears to be an out-of-date, confusing privacy policy that no one was paying attention to until a miffed user posted about them on a community site hugely popular with developers, and the post went viral.
The story kicked off last night with a post on Hacker News raising the issue, which sparked a conversation revealing that Slack opts users in to its AI training by default and requires them to email a specific address to opt out. The Hacker News thread spurred further conversations and questions on other platforms, exposing confusion and surprise among current Slack users.
According to Slack’s privacy policy, the company is using customer data specifically to train global models to power channel and emoji recommendations and search results. Slack said it does not use customer data to train Slack AI.
Despite the company’s assurances, users are questioning the transparency and clarity of Slack’s policies, prompting calls for clearer communication about how user data feeds into AI development. As the industry navigates evolving AI technologies, user data privacy remains critical to maintaining trust and accountability in tech platforms.