Lugh@futurology.today to Futurology@futurology.today · English · 7 months ago
Evidence is growing that LLMs will never be the route to AGI. They are consuming exponentially increasing energy, to deliver only linear improvements in performance.
arxiv.org
Umbrias@beehaw.org · English · 7 months ago
Hallucinations are not qualia.
Please go talk to an LLM about hallucinations (you can use DuckDuckGo's implementation of ChatGPT) and see why the term is being used to mean something fairly different from human hallucinations.