• billwashere@lemmy.world · 2 days ago

    I’m pretty sure the simplest way to look at it is that an LLM can only respond; it can’t generate anything on its own without prompting. I wish humans were like that sometimes, especially a few in particular. I would think an AGI would be capable of independent thought, not requiring a prompt.
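
    To put that distinction in code terms, here’s a minimal sketch (the `generate` function is a hypothetical stand-in, not any real LLM API): an LLM call is purely reactive, a function of its prompt, while anything AGI-like would need some self-driving loop that keeps going without outside input.

    ```python
    # Minimal sketch of the distinction. `generate()` is a hypothetical
    # stand-in for any LLM inference call, not a real library API.

    def generate(prompt: str) -> str:
        """Stand-in LLM call: the output is purely a function of the prompt."""
        return f"(model's response to {prompt!r})"

    # Reactive, like an LLM: nothing happens until a prompt arrives.
    print(generate("What's the difference between an LLM and an AGI?"))

    # Hypothetically AGI-like: a loop that drives itself, no outside prompt.
    def autonomous_loop(steps: int = 3) -> None:
        goal = "decide what to think about next"   # self-generated, not user-supplied
        for _ in range(steps):
            goal = generate(goal)                  # still uses the model as a component,
            print(goal)                            # but keeps acting unprompted

    autonomous_loop()
    ```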