I also had a brief read of the bill you linked and some relevant articles. The bill only cites “national security” yet doesn’t explain what national security risk TikTok actually poses.
The Bloomberg article gives a few reasons, but none of them convinced me that a ban is justified. For example, the first reason points out that the feed-generation algorithm is advanced and addictive. So they should be punished for writing an effective algorithm?
Yes, there is, and has been, content on TikTok ranging from dumb to harmful. However, I think that’s a content moderation issue, not a national security issue. I’ve heard people can find CSAM on Twitter and Discord; harmful and damaging as that is, should those platforms also be banned over “national security” concerns? It just smells unfair.
Just my two cents.
Disclosure: I don’t use Facebook, Instagram, Twitter, or TikTok. I do have a Discord account.
They’re not worried about CSAM. They’re worried about TikTok users being influenced during an election campaign.
And yes, it is a moderation issue. Specifically, the US doesn’t want the current moderation team to be in charge of moderation.
To put it in perspective, about a quarter of the US population uses TikTok. And politics is a major discussion point, with the political content you’re exposed to selected by an algorithm that is opaque and constantly changing.
It absolutely can be used to change the result of an election. And China has meddled in elections in the past (not least of all their own elections, but also foreign ones):
“China has been interfering with every single presidential election in Taiwan since 1996, either through military exercises, economic coercion, or cognitive warfare, including disinformation or the spread of conspiracies”
It’s not uncommon to see misinformation and fabricated information appear on many social media platforms, including Facebook and Twitter. It’s also not unheard of for Russia to use social media to influence elections via popular US-based platforms. All social networks are subject to the same problem; TikTok just has more active users and thus a farther reach. But again, this is a content moderation problem, not an inherent fault of TikTok itself. Who should perform content moderation is a business decision. It should not be dictated by law, though lawmakers can set moderation standards that companies need to comply with. I think it’s a bit unfair to target only TikTok; the rules should be universal.
EDIT:
political content you’re exposed to selected by an algorithm that is opaque and constantly changing
Hasn’t TikTok opened access to its algorithm for review?
Actually, it is not solely a content moderation problem. While dumb and physically harmful content should be subject to moderation, speech should be protected. Isn’t America all about the word “Freedom”? People should be free to say what they believe, right?
However, the recommendation algorithms might need some regulation that categorizes content and applies relevant display policies. For example, political content, whether user-generated or advertising, should be distributed equally across viewpoints (i.e. a user would see content for all candidates for roughly the same amount of time). The “addictive” part shouldn’t be regulated, since that is the whole point of the algorithm: maximize user engagement. There could, however, be a rating system similar to game ratings that determines at what age someone can use which platform. Otherwise, people should be free to get addicted to something, as long as it doesn’t cause physical harm to themselves or others.
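To make the “equal display time” idea concrete, here’s a rough sketch of how a feed could pick the next piece of political content by favoring whichever candidate a given user has seen the least so far. Everything in it (names, data shapes) is hypothetical and just for illustration; a real recommender would be far more complex.

```python
from collections import defaultdict
import random

class EqualTimePolicy:
    """Toy "equal display time" policy: surface the candidate this user has watched the least."""

    def __init__(self, candidates):
        self.candidates = list(candidates)
        # seconds of political content already shown, per user, per candidate
        self.watch_time = defaultdict(lambda: defaultdict(float))

    def next_candidate(self, user_id):
        times = self.watch_time[user_id]
        least = min(times[c] for c in self.candidates)
        # break ties randomly so no candidate is systematically favored
        pool = [c for c in self.candidates if times[c] == least]
        return random.choice(pool)

    def record_view(self, user_id, candidate, seconds):
        self.watch_time[user_id][candidate] += seconds

# example: after watching candidate_a for two minutes,
# the next political slot goes to one of the others
policy = EqualTimePolicy(["candidate_a", "candidate_b", "candidate_c"])
policy.record_view("user_1", "candidate_a", 120)
print(policy.next_candidate("user_1"))  # candidate_b or candidate_c
```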
I have a question: how would it be moderated, and by whom? In an age where the War Thunder forums have classified info leaked practically every month, and the US is increasingly losing the cybersecurity war because people can’t do simple things like not plug random USB drives they found on the side of the road into their work computers, I don’t really understand why it’s hard to believe TikTok is a threat to national security.
The permissions it asks for on your phone are kind of a red flag, specifically access to the camera and microphone. Mostly because, with it being controlled by the CCP (as most successful Chinese businesses are), it is absolutely trivial for them to gather information “anonymously” about their users, de-anonymize it, and then target those users with anything and everything, including pro-CCP propaganda. That alone is reason enough for me to understand why federal employees aren’t allowed to use TikTok on any federal device (work phones and computers, for instance).
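For what it’s worth, you don’t have to take the permission list on faith; on Android you can dump what an app actually requests. Here’s a rough sketch (it assumes adb is installed with a device connected, and that the package name below really is TikTok’s; swap in whatever app you want to inspect):

```python
import subprocess

# Assumed package id for TikTok; replace with any app you want to check.
PACKAGE = "com.zhiliaoapp.musically"

def requested_permissions(package: str) -> list[str]:
    # `adb shell dumpsys package <pkg>` lists requested and granted permissions
    out = subprocess.run(
        ["adb", "shell", "dumpsys", "package", package],
        capture_output=True, text=True, check=True,
    ).stdout
    perms = set()
    for line in out.splitlines():
        line = line.strip()
        if line.startswith("android.permission."):
            perms.add(line.split(":")[0])  # drop any "granted=true" suffix
    return sorted(perms)

if __name__ == "__main__":
    for perm in requested_permissions(PACKAGE):
        print(perm)
```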
I don’t necessarily think forcing them to sell to another entity will fix the problems with TikTok. I think this bill is intended to be a “solution” to placate people, mostly because it doesn’t seem like it’s been written by people who understand the technology. But I also wouldn’t say that TikTok is harmless or blameless.
Why does TikTok need to gather information about what banking apps I use? What healthcare apps I use? Why does it need my GPS location? Why can it collect this data without my consent? Why and how does it collect information on people who don’t use TikTok, who have never used TikTok?
On top of that, TikTok got caught spying on reporters with the intent of tracking down their sources. That’s terrifying.
https://www.welivesecurity.com/2023/03/24/what-tiktok-knows-you-should-know-tiktok/
the US is increasingly losing the cybersecurity war because people can’t do simple things like not plug random USB drives they found on the side of the road into their work computers
I’m not surprised at this when Americans refused to wear a simple medical mask during COVID.
That would be easier to answer if we had a list of companies that can afford to buy it (that’s a short list) and are also willing to buy it (an even shorter list).
I don’t necessarily think forcing them to sell to another entity will fix the problems
Sure - it obviously depends who buys it. Elon Musk, for example, would probably be a bad steward.
But what about Alphabet? That might not be so bad. As a fan of YouTube, I’d love to see the “shorts” feature killed off and all that content moved to a separate service where I can go the rest of my life without ever seeing a short repeating video.
Whoever buys it, if the US can force TikTok to be sold once, they can do it again if the buyer proves to also be problematic.
The invasion of privacy is bad regardless of who does it, though. This bill isn’t about protecting consumer privacy. It’s about sticking it to the CCP. Alphabet should also be considered a company that needlessly invades the privacy of its users, and laws should be made to protect those users. Just because TikTok is worse doesn’t mean any company doing this isn’t bad.
I’ll also say that YouTube Shorts views pay more than TikTok views for established creators, by a significant margin. I would rather the creators I enjoy get paid decently. Not that YouTube doesn’t have a lot of problems and anti-creator policies of its own. But $0.04 per 1K views is a lot worse than $18.00 per 1K views.
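For scale (plugging in the per-1K rates quoted above, which are this thread’s numbers, not official figures), here’s a quick back-of-the-envelope comparison for a hypothetical million views:

```python
views = 1_000_000          # hypothetical view count

tiktok_rate = 0.04         # $ per 1K views, as quoted above
shorts_rate = 18.00        # $ per 1K views, as quoted above

print(f"TikTok payout: ${views / 1000 * tiktok_rate:,.2f}")   # $40.00
print(f"Shorts payout: ${views / 1000 * shorts_rate:,.2f}")   # $18,000.00
```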
Thanks.