• 3 Posts
  • 11 Comments
Joined 1 year ago
Cake day: June 15th, 2023


  • Lemmy has pretty much all the same problems as reddit does but at a much smaller scale because it’s just not as big. Would you suggest Google use Lemmy?

    I agree, and I covered that in my blog. Lemmy is astroturfed and may even be easier to astroturf than reddit. I would like to see a more diversified “discussions and forums” section, not just reddit links.

    In general, privately-owned forums (running Xenforo, etc.) seem much better run than most reddit subs. I have never experienced on forums the plethora of problems I’ve had on reddit. I think it’s harder to spam and astroturf forums, and the owners and moderators have different incentives than reddit mods.

    The bar to entry as a new person on smaller forums was often high.

    I don’t remember experiencing that, but it makes me think of the bar to entry for running a reddit sub. Anyone can instantly create one for free, do whatever they want with it, and get to the top of search results pretty quickly. Setting up your own forum is a lot more difficult and more of a commitment. I think there are benefits to that.

    I agree with your last paragraph. I think the warnings Twitter implemented are a decent idea. In general, people need more warnings that what they see on reddit and other social media is not policed for legality – people can and do say whatever they like, and much of it is misinformation and disinformation.

    I don’t think most people realize that reddit and other social media platforms have no obligation to take down illegal content. People seem WAY too trusting of things they read on reddit. If Google is going to be highlighting reddit results and putting them at the top, then they bear some responsibility for this.

    Since the CDA’s passage in 1996, § 230(c) has been consistently interpreted by U.S. courts to provide broad immunity to platforms for hosting and facilitating a wide range of illegal content—from defamatory speech to hate speech to terrorist and extremist content. Notice of illegal content is irrelevant to such immunity. Thus, even if a platform like YouTube is repeatedly and clearly notified that it is hosting harmful content (such as ISIS propaganda videos), the platform remains immune from liability for hosting such harmful content.