• Uriel238 [all pronouns]@lemmy.blahaj.zone · 2 days ago

    Whatever made that crater was an extinction-level event (ELE), bigger than Chicxulub.

    We have plenty of great filters to navigate:

    We end war, or we die.

    We restore the atmosphere and rebuild global ecology, or we die.

    We end stratified society and power disparity, or we die.

    Where are all the aliens? Fermi asked. The first question is how we navigate our way to becoming a space-faring, world-colonizing species ourselves. It’s turning out to be pretty difficult for the common hominid.

    • jsomae@lemmy.ml · 2 days ago

      Not to mention Skynet. It always bothers me when people leave AI out of lists of x-risks. I guess it’s because a popular sci-fi movie predicted it would happen, so nobody takes it seriously. Or perhaps it’s just that AI is so unpopular now that nobody wants to devote any time to thinking about the ramifications of it becoming smarter.

      • Uriel238 [all pronouns]@lemmy.blahaj.zone · 2 days ago

        Courtesy of XKCD: long before we have to contend with unfriendly AI (committees of AI techs are already working on that problem), we’ll have to contend with someone like Musk or Bezos who is determined to own everything and capable of creating an AI-controlled army of killer robots.

        We’re not sure how rogue AI is going to manifest. We are sure that rogue, power-seeking humans exist all the time, and that positions of power are commonly filled by them. (That’s the primary argument for selecting officials by sortition, i.e. by lottery.)

        • jsomae@lemmy.ml · 1 day ago

          Ok, so to be clear, you’re saying that AI x-risk is already partially or even mostly bundled under “We end stratified society and power disparity, or we die”?

          • Uriel238 [all pronouns]@lemmy.blahaj.zone · 21 hours ago (edited)

            That is a good assessment. Yes.

            In fact, the race between capitalist interests to bypass safety and reach operational AGI first is entirely about seizing that power so they can use it to hold everyone else hostage.

            • jsomae@lemmy.ml · 17 hours ago

              Yeah, fair enough. I do agree that this is largely driven by capitalism, and if we didn’t have a capitalist society we would hopefully be going about this more cautiously. Still, I feel it’s a unique enough situation that I would consider it its own x-risk.