• Chozo@fedia.io · 3 months ago

    An argument being made in another social media case (involving TikTok) is that algorithmic feeds of other users’ content are effectively new content, created by the platform. So if Twitter does anything other than chronological sorting, it could be considered to be producing its own, deliberately-curated content, since it now controls what you see and when you see it. Depending on how the TikTok argument is interpreted by the courts, it could affect how Twitter is allowed to operate in the future.

    • wewbull@feddit.uk · 3 months ago

      It’s certainly arguable that the algorithm constitutes an editorial process, which would open them up to libel law and liability.

      Fair point.

    • jaybone@lemmy.world · 3 months ago

      Let’s say this goes through: how is a company going to prove it is not using an “algorithmic feed” unless it open-sources its code and/or provides some public interface to test and validate feed content?

      Plus, even without an “algorithmic feed”, couldn’t some third party use bots to manipulate a simple chronological or upvote/like-based feed? Then those third parties, via contracts and agreements, would be controlling the content rather than the social media owner itself.
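      For what it’s worth, the “public interface to test and validate feed content” part doesn’t have to be complicated. Here’s a minimal sketch of what an external auditor could run against a feed endpoint: it just checks that posts come back newest-first, i.e. that no reordering (algorithmic or otherwise) has been applied. The `created_at` field and the feed structure are made up for illustration; a real platform’s API would differ.

```python
from datetime import datetime

def is_chronological(feed):
    """Return True if the feed lists posts newest-first with no reordering."""
    timestamps = [post["created_at"] for post in feed]
    # Every post must be at least as recent as the one below it.
    return all(a >= b for a, b in zip(timestamps, timestamps[1:]))

# Hypothetical feed snapshot, as an auditor might fetch it.
feed = [
    {"id": 3, "created_at": datetime(2024, 6, 3)},
    {"id": 2, "created_at": datetime(2024, 6, 2)},
    {"id": 1, "created_at": datetime(2024, 6, 1)},
]

print(is_chronological(feed))  # True
```

      Of course this only validates ordering, not what got silently omitted from the feed, which is the harder part of jaybone’s point.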

      • Toribor@corndog.social · 3 months ago

        unless they open source their code and/or provide some public interface to test and validate feed content

        This honestly seems like a good idea. I think one of the ways to mitigate the harm of algorithmically driven content feeds is openness and transparency.

        • jaybone@lemmy.world · 3 months ago

          Well, for end users and regulators it’s a great idea. But the companies aren’t going to go along with it.

          • tarsisurdi · 3 months ago

            Then they must be held liable for what they allow to spread on their platforms.