• Balder@lemmy.world
    6 days ago

    This kind of logic never made sense to me. If an AI could build something like Netflix (even with the assistance of a mid-level software engineer), then every indie dev could build a Netflix competitor, driving the value of Netflix down. Open source tools would quickly surpass any closed source software and become very user-friendly without much effort.

    We’d see LLMs being used to rapidly create and improve infrastructure like compilers, IDEs and build systems that are currently complex and slow, to rewrite slow software in faster languages, etc. The many projects stalled today for lack of manpower would be flourishing, flooding us with new apps and features at an incredible pace.

    I’ve yet to see it happen. And that’s because for LLMs to produce anything of sufficient quality, they need someone who understands what they’re outputting, who can add the necessary context to each prompt, who can test the result and integrate it into the bigger picture without causing regressions, etc. It’s no simple work, and it even requires understanding the LLMs’ processing limitations.

    • Tar_Alcaran@sh.itjust.works
      5 days ago

      LLMs, by definition, can’t push the limit. LLMs can only ever produce things that look like what they were trained on. That makes them great for rapidly producing mediocre material, but it also means they will never produce anything genuinely novel or complex.