The verdicts against social media companies in California and New Mexico over the past two days reveal a disturbing trend: Americans are increasingly willing to view speech as a “product,” subject to regulation in the same way physical substances like alcohol or tobacco are.
Whenever these tech companies come under fire for privacy abuses, dominant market share, declining mental health, or algorithmic polarization (and let's be clear: those things are definitely happening), the debate focuses only on how we can regulate them, sue them, have the government break them up into smaller companies, etc.
It's super weird that nobody ever talks about how we can just choose not to use their services.
I get it, some of those services are pretty great and you'll definitely miss them after you've moved on or migrated to services that aren't as good, but this "learned helplessness" phenomenon has got to stop. It's our responsibility to solve our own problems, not to wait for someone else to solve them for us.
Sorry guys, there’s a lot of nuance you’ve left out here. Can you tell your readers what Operation MetaPhile was and what it showed in the New Mexico case?
But aren’t the social media companies trying to have their cake and eat it? If the algorithm is speech (I agree, it is) it shouldn’t get sec. 230 protection. We need some way to deal with the obvious harm social media is causing our society. Products liability litigation isn’t the best way to do that for the reasons you state, but it’s what we get when the richest companies in the world are given liability shields for unprotected speech amplified by algorithms.
Instagram is not an idea. It is not speech. It is a product designed to addict and manipulate. I hope they get sued into oblivion.