• WashedAnus [he/him]@hexbear.net

    reddit-logo-famous sex worker, took some goofy gnome pics like a decade ago, posted about doing acid, is part of the Effective Altruist cult

    • Rom [he/him]@hexbear.net

      Effective Altruism

      It’s like they correctly realized capitalism is a death cult propagating global suffering, but instead of seeing if anyone else has ever worked out alternatives before them (Karl Marx), they came up with the brilliant idea to…pick the right careers and make donations? Without any actual material analysis of why things around the world are the way they are. Also apparently they have an algorithm that they say can predict world events or something.

      What an amazing prediction, no one else could have possibly seen that one coming, I’m sure.

      Apparently a bunch of techbros like Sam Bankman-Fried were heavily involved, so that’s how you know it’s a joke ideology.

      • SSJ2Marx@hexbear.net

        pick the right careers and make donations

        At first blush there’s a certain amount of logic to this, but then the EA people all “realized that AI is the most dangerous thing ever” and started giving all of their money to each other’s AI startups, so whatever it started as, it ended up being a techbro grift.

      • disco [any]@hexbear.net

        There is a certain logic to the original ideas of effective altruism. If you want to make positive change in the world, you will be better able to do so if you are a powerful person.

        The problem is you can essentially use it as a justification for endlessly pursuing your own personal power. After all, the more powerful you become, the more good you can do! In fact it makes it easy to justify committing immoral acts in the pursuit of power-- think of it from a utilitarian point of view! I am going to do so much good once I’m rich that it will far outweigh the harm I’m doing now. In other words, an okay idea that is used and abused by the greedy and power-hungry to convince themselves they aren’t bad people.

        • JohnBrownNote [comrade/them, des/pair]@hexbear.net

          yeah like some charities are more effective than others, and if you’re already an Engels maybe it’s some kind of better to find some billable hours and pay people to “volunteer” at the soup kitchen rather than do it directly yourself, but those guys all went fucking nuts about Pascal’s wager instead.

      • Philosoraptor [he/him, comrade/them]@hexbear.net

        Yeah, there’s the seed of a sensible idea in there, which is just “if you’re going to try to improve the world, you should think carefully about where and how to spend your limited resources to have the most impact.” That’s fine–good, even–but these ghouls have used that idea and their own ideological preconceptions to bootstrap themselves into some absolutely insane positions.

        A big part of the problem is that they extend their utility calculations indefinitely into the future with no temporal discounting whatsoever. That leads to a hyperfixation on “existential risks” and trying to optimize for lives 100,000 years in the future at the expense of lives today. Their “leading philosophers” say unhinged things about this: one of them claims that delaying the technological singularity is “costing” us something like 250 billion lives per second, since in the future when we’re an intergalactic civilization with 500 trillion people, that’s how many people will die every second before we have post-Star Trek levels of technology. Somehow this is taken by actual thinking human beings to be a compelling argument instead of a reductio ad absurdum of the position.

        The lack of any temporal discounting means that they see things that actually matter–like climate change, a lack of healthcare, or the ravages of global capitalism–as fundamentally unimportant. After all, if you can save 30 trillion lives in the year 45,000, who cares if a few billion die in the year 2060? (There’s a toy sketch of that arithmetic at the end of this comment.) The eventual impact of this is that most of them have talked themselves into thinking that AI “alignment” is where they should all be focusing their resources, since Robot God will either save us all from death forever or usher in human extinction, and thus its future utility numbers are either infinitely positive or infinitely negative. Therefore, they all insist that anyone not giving all their money and time to science fiction AI grifts like MIRI is fundamentally irrational and that trying to help actual people who are actually suffering right now is dumb, short-sighted, and based in emotion (pejorative) instead of reason.

        They’re fucking infuriating grifters.
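
        To make the no-discounting point concrete, here is a toy sketch of the kind of expected-value arithmetic being described. The helper function, the probabilities, and the discount rate are all made up for illustration; the 500 trillion people and the year 45,000 are just the numbers from the paragraph above, not anything from an actual EA model.

        ```python
        # Toy illustration only: hypothetical numbers, not any real EA calculation.
        def expected_lives_saved(lives, probability, years_out, discount_rate=0.0):
            """Probability-weighted lives helped, discounted per year into the future."""
            return probability * lives / (1.0 + discount_rate) ** years_out

        # Helping people alive today: near-certain, "merely" a billion lives at stake.
        now = expected_lives_saved(lives=1e9, probability=0.9, years_out=0)

        # Speculative far-future payoff: a one-in-ten-thousand shot at 500 trillion people.
        later = expected_lives_saved(lives=5e14, probability=1e-4, years_out=43_000)
        print(f"no discounting: now={now:.2e}  later={later:.2e}")  # later wins ~55x

        # Even a 1% annual discount rate makes the year-45,000 term effectively zero.
        later_d = expected_lives_saved(lives=5e14, probability=1e-4,
                                       years_out=43_000, discount_rate=0.01)
        print(f"1% discounting: now={now:.2e}  later={later_d:.2e}")
        ```

        With a zero discount rate the speculative far-future term dominates no matter how unlikely it is; with any positive rate it vanishes, which is why the refusal to discount does all the work in these arguments.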