By Bobby Jefferson on Monday, 30 September 2024
Category: Tech News

Slack (And Others) is Training Its AI On Your Messages: Here's How To Opt Out


AI can't think for itself; it needs huge amounts of data to train on. When you ask generative AI a question, it creates output that resembles training data scraped from the internet. This is why every company now wants to record all of your interactions and scan your documents: they need more data!

Companies Need Your Permission to Use Your Data

Services that train AI on their users' data include Slack, Grammarly, Stack Overflow, and, most recently, LinkedIn. Many more are following suit, so don't be surprised when your favorite app or website offers AI functionality (at the cost of letting it use your usage data to feed its AI models).

Fortunately, companies that want to use your conversations, documents, and other data need your permission to use them for anything other than their original purpose.

Companies Are Completely Ignoring This Requirement

Unfortunately, many companies are completely ignoring this requirement when it comes to public data. Multiple AI companies have been caught scraping public social media posts to train AI models without the authors' permission.

Other companies comply with data privacy regulations by quietly updating their terms of service and privacy policies, but unless you regularly review those documents and delete your account when AI training stipulations are added, you're unlikely to realize that your data is now being used in this way.

Slack is nice enough to let you opt out of having your potentially sensitive conversations used to train its AI, but it doesn't make it easy.

Rather than offering a checkbox or button to withhold your data from AI model training, Slack requires your workspace owner to email its support team (yes, really). The email must use the subject line "Slack global model opt-out request" to opt out.

As with most cybersecurity and privacy concerns, your best protection is vigilance. If a product you use has AI features, there's a good chance it uses your data to power them, so hunt around in the app's settings and see what privacy-invading features you can switch off (or check its documentation to see what inconvenient opt-out process it has devised to discourage you from denying it your data).

Above and beyond AI, there are some quick ways you can improve your cybersecurity if you're looking to keep your data more private and secure.

(Originally posted by Brad Morton)