Slack trains machine-learning models on user messages, files and other content without explicit permission. The training is opt-out, meaning your private data will be leeched by default.
Sounds like a lot of this is for non-generative AI. It’s for dumb things like that frequently used emoji feature.
Knowing how the legal teams at my tech companies have worked, I'd bet a lawyer updated the terms language to comply with privacy legislation, but they did a shit job and didn't clarify what specifically was being covered in the TOS. They were lazy and crafted something broad so they wouldn't have to actually talk to product or marketing people in their org.
What is it like to live in a place with privacy legislation? Here we must sell our healthcare data for food, and sell our food for healthcare.
Where do you live?