• wewbull@feddit.uk · 3 months ago

      Under what law?

      The UK currently holds the people who post things liable for their own words. X, the platform, just relays what is said. Same as Lemmy. Same as Mastodon.

      If you ban X, I don’t see why those other platforms wouldn’t be next.

      Now, should people/organisations/companies leave X? Absolutely! Evacuate like the house is on fire. Should it be shut down by legal means? No.

      • Chozo@fedia.io · 3 months ago

        An argument being made in another social media case (involving TikTok) is that algorithmic feeds of other users’ content are effectively new content, created by the platform. So if Twitter does anything other than chronological sorting, it could be considered to be making its own, deliberately produced content, since it’s now in control of what you see and when you see it. Depending on how the TikTok argument gets interpreted in the courts, it could affect how Twitter can operate in the future.
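        To make the distinction concrete, here’s a minimal sketch (my own illustration, not anything from the case) of chronological sorting versus an engagement-weighted ranking; the post data and weights are made up:

        ```python
        from dataclasses import dataclass

        @dataclass
        class Post:
            author: str
            text: str
            timestamp: int  # seconds since epoch
            likes: int
            reposts: int

        posts = [
            Post("alice", "first post", 100, likes=2, reposts=0),
            Post("bob", "viral post", 200, likes=50, reposts=9),
            Post("carol", "newest post", 300, likes=1, reposts=0),
        ]

        # Chronological feed: ordering by time alone -- arguably pure relaying.
        chronological = sorted(posts, key=lambda p: p.timestamp, reverse=True)

        # Engagement-ranked feed: the platform picks the scoring function, so it
        # decides what you see and when -- the basis of the editorial argument.
        def engagement_score(p: Post) -> float:
            return 1.0 * p.likes + 2.0 * p.reposts  # weights are the platform's choice

        algorithmic = sorted(posts, key=engagement_score, reverse=True)

        print([p.author for p in chronological])  # ['carol', 'bob', 'alice']
        print([p.author for p in algorithmic])    # ['bob', 'alice', 'carol']
        ```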

        • wewbull@feddit.uk · 3 months ago

          It’s certainly arguable that the algorithm constitutes an editorial process, which would open the platform up to libel law and liability.

          Fair point.

        • jaybone@lemmy.world · 3 months ago

          Let’s say this goes through: how is a company going to prove it is not using an “algorithmic feed” unless they open source their code and/or provide some public interface to test and validate feed content?

          Plus, even without an “algorithmic feed”, couldn’t some third party using bots steer a simple chronological or upvote/like-based feed? Those third parties, via contracts and agreements, would then be the ones manipulating the content rather than the social media owner itself.
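          As a toy illustration (mine, not from any filing; all the numbers are invented): even a “non-algorithmic” likes-only ranking can be steered by whoever controls enough accounts, with no change on the platform’s side:

          ```python
          # Toy model: a feed ranked purely by like count, no ranking algorithm.
          feed_likes = {"organic_post": 120, "sponsored_narrative": 3}

          def top_of_feed(likes: dict) -> str:
              # Whatever has the most likes floats to the top.
              return max(likes, key=likes.get)

          print(top_of_feed(feed_likes))  # organic_post

          # A hypothetical third-party bot farm inflates one post's likes...
          BOT_ACCOUNTS = 500
          feed_likes["sponsored_narrative"] += BOT_ACCOUNTS

          # ...and now controls the ordering without touching the platform's code.
          print(top_of_feed(feed_likes))  # sponsored_narrative
          ```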

          • Toribor@corndog.social · 3 months ago

            > unless they open source their code and/or provide some public interface to test and validate feed content

            This honestly seems like a good idea. I think one of the ways to mitigate the harm of algorithmically driven content feeds is openness and transparency.
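            For what it’s worth, here’s a rough sketch of what such a public validation interface could look like (everything here is hypothetical, not any real API): publish the ranking rule, expose the candidate posts, and let anyone recompute the feed and diff it against what was actually served.

            ```python
            # Hypothetical scheme: the platform open-sources its ranking rule and
            # exposes the raw candidate posts, so outsiders can verify the feed.

            def published_ranking(posts: list) -> list:
                # The published rule: strictly newest-first (chronological).
                return [p["id"] for p in sorted(posts, key=lambda p: p["ts"], reverse=True)]

            def feed_matches_rule(candidates: list, served_feed: list) -> bool:
                return published_ranking(candidates) == served_feed

            candidates = [{"id": "a", "ts": 1}, {"id": "b", "ts": 2}]
            print(feed_matches_rule(candidates, ["b", "a"]))  # True: matches the rule
            print(feed_matches_rule(candidates, ["a", "b"]))  # False: quietly reordered
            ```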

            • jaybone@lemmy.world · 3 months ago

              Well, for the end users and any regulators it’s a great idea. But the companies aren’t going to go along with this.

              • tarsisurdi · 3 months ago

                Then they must be held liable for what they allow to spread on their platforms.

      • daddy32@lemmy.world · 3 months ago

        Twitter (or rather Musk) chooses what it “relays” or boosts. Unlike Lemmy, unlike Mastodon.

      • Hanrahan@slrpnk.net · 3 months ago

        The Australian Government issued a bunch of takedown notices to Twitter, and Musk said no.

        https://www.abc.net.au/news/2024-04-23/what-can-the-government-do-about-x/103752600

        Musk decided to block them in Australia only, which didn’t satisfy the Australian Government.

        He took them to court, and the court sided with Twitter (X).

        https://variety.com/2024/digital/news/australian-court-elon-musk-x-freedom-of-speech-row-1236000561/

        The complexity and contradictions were illustrated by Tim Begbie, the lawyer representing the eSafety Commissioner in court. He said that in other cases X had chosen of its own accord to remove content, but that it resisted the order from the Australian government.

        “X says […] global removal is reasonable when X does it because X wants to do it, but it becomes unreasonable when it is told to do it by the laws of Australia,” Begbie told the court.

      • OldWoodFrame@lemm.ee · 3 months ago

        Here’s the thing about nation-state governments. They can pass laws. It’s kind of the main thing they do.

        • Echo Dot@feddit.uk · 3 months ago

          They retain authority by having some air of legitimacy. They can’t just change laws; there has to be due process. Just changing laws without a process is literally a dictatorship.

      • Skvlp@lemm.ee · 3 months ago

        I agree. It would set a terrible precedent, even if it’s terribly tempting. I’d say it’s better to ask people to leave instead.