• VirgilMastercard@reddthat.com · 21 hours ago

    I don’t mind AI now that I’ve vastly lowered my expectations of what it can do, and am aware that “AI” isn’t actually real. An LLM might be useful in some situations where you can reasonably expect it to give a decent answer and where your task isn’t particularly important.

    Problem is, most of it is forced on you and is not privacy friendly.

      • snooggums@lemmy.world · 19 hours ago

        There are scientific and other uses where current AI really excels: pattern matching to help find areas worth focusing on, with the real work, where accuracy matters, still done thoroughly by people.

        https://news.berkeley.edu/2022/05/24/ai-reveals-unsuspected-math-underlying-search-for-exoplanets/

        Artificial intelligence (AI) algorithms trained on real astronomical observations now outperform astronomers in sifting through massive amounts of data to find new exploding stars, identify new types of galaxies and detect the mergers of massive stars, accelerating the rate of new discovery in the world’s oldest science.

        It didn’t just spit out answers they treated as correct; it surfaced things they looked into and used to improve their methods. That is the real benefit of AI.
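
        As a rough sketch of that workflow (not the Berkeley team’s actual pipeline, just a generic "model flags candidates, humans verify" loop using scikit-learn on synthetic stand-in data):

            # Generic "model flags candidates, humans verify" sketch (scikit-learn).
            # Features and labels are synthetic stand-ins, not real survey data.
            import numpy as np
            from sklearn.ensemble import RandomForestClassifier
            from sklearn.model_selection import train_test_split

            rng = np.random.default_rng(0)
            X = rng.normal(size=(5000, 8))                    # per-source features
            y = (X[:, 0] + 0.5 * X[:, 3] > 1.5).astype(int)   # "interesting" or not

            X_train, X_new, y_train, _ = train_test_split(X, y, test_size=0.2, random_state=0)

            model = RandomForestClassifier(n_estimators=200, random_state=0)
            model.fit(X_train, y_train)

            # Score the new batch and hand only high-probability hits to a human,
            # who does the careful verification where accuracy actually matters.
            scores = model.predict_proba(X_new)[:, 1]
            flagged = np.where(scores > 0.9)[0]
            print(f"{len(flagged)} of {len(X_new)} sources flagged for human follow-up")

        The model only narrows the haystack; anything it flags still gets the thorough treatment.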

    • Smee@poeng.link · 16 hours ago

      After dabbling with AI for years, I think it should be called out for what it is: machine learning. We’re not in the ballpark of intelligence, and barely close to mimicking (artificial) intelligence.

      Though ML has a lot going for it, my most successful results have been with voice synthesis. Almost flawless, and pretty amazing considering the yield.
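
      For anyone curious, a minimal sketch of local neural voice synthesis, assuming the Coqui TTS package (pip install TTS) and one of its stock English models; exact model names and API details vary by version:

          # Minimal neural text-to-speech sketch using the Coqui TTS package.
          # The model name is one of Coqui's published English voices; any
          # other listed model can be substituted.
          from TTS.api import TTS

          tts = TTS(model_name="tts_models/en/ljspeech/tacotron2-DDC")
          tts.tts_to_file(text="Machine learning handles this part surprisingly well.",
                          file_path="demo.wav")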

    • daniskarma@lemmy.dbzer0.com · 15 hours ago

      I use it as an advanced rubber duck for coding.

      I know the answer is wrong, but it gets my brain going toward finding the right answer.

      Like: “This is a ridiculous way to do this. It would be much easier to just…”

      Getting the wrong answer sometimes speeds up the process, like some kind of dialectic.
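
      A minimal sketch of that rubber-duck loop, assuming a local OpenAI-compatible chat endpoint; the URL, model name, and question are placeholders, not anything specific from this thread:

          # Rubber-duck sketch: ask a local LLM for an approach, then argue with it.
          # Assumes an OpenAI-compatible /v1/chat/completions endpoint is running
          # locally; the URL and model name below are placeholders.
          import requests

          API_URL = "http://localhost:8080/v1/chat/completions"

          def rubber_duck(question: str) -> str:
              """Return the model's suggested approach (often wrong, still useful)."""
              payload = {
                  "model": "local-model",
                  "messages": [
                      {"role": "system", "content": "Suggest one concrete approach, briefly."},
                      {"role": "user", "content": question},
                  ],
              }
              resp = requests.post(API_URL, json=payload, timeout=60)
              resp.raise_for_status()
              return resp.json()["choices"][0]["message"]["content"]

          if __name__ == "__main__":
              # The point is the reaction, not the answer: a flawed suggestion
              # often makes the better approach obvious.
              print(rubber_duck("How should I deduplicate a 50 GB CSV without loading it all?"))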