- cross-posted to:
- environnement@jlai.lu
- climate@slrpnk.net
one assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.
I guess it depends on how you use chatbots. If you’re just too lazy to click on the first google result you get, it’s wasteful to bother ChatGPT with your question. On the other hand, for complex topics, a single answer may save you quite a lot of googling and following links.
Oh, well as long as it saves you from Googling it’s okay that it’s a massive ecological disaster. My mistake.
That’s the opposite of what he said. That sort of usage isn’t what ChatGPT is good for; it’s best used for other kinds of things.
It’s best to not use it. At all.
Feel free not to, I guess. But again, that wasn’t the point of my comment. You took bleistift2’s statement the opposite way it was intended. ChatGPT isn’t intended as a replacement for a search engine, so evaluating it on that basis is misleading.
That’s just like… your opinion, man.
AI is going to be an important tool in the future. Decrying it as bad is similar to saying that investing in green energy was stupid because, without economies of scale, it was expensive and inefficient.
Computers are using more energy. Instead of turning them off, let’s find ways to produce energy less destructively, such as nuclear, which would benefit EVs and all other energy usage too.
The future for the people who aren’t dying of thirst due to the lack of water?
Did you even read the rest of my post?
The part where you suggested using nuclear energy? Which also uses a huge amount of fresh water?
Yes, I read it. I chose not to mention it since I didn’t want to show that you were making my point stronger for me, but you forced my hand.
https://www.ucsusa.org/sites/default/files/attach/2014/08/ew3-freshwater-use-by-us-power-plants-exec-sum.pdf
Then solar. Wind. Geothermal. Whatever. Energy usage is never, ever going down unless population does and probably not even then. If that silicon isn’t used for AI it’ll be something else. Then what?
I mean an argument could be made here, right? Just thinking theoretically.
Maxim: we want to be as eco-friendly as possible.
Per a given task, understand the least environmentally-taxing way to accomplish the goal.
Task requires one, two, or three/four DuckDuckGo searches? DDG away.
Task requires five DDG searches, OR one LLM query? Language model it is.
(LLM may well rarely be the answer there, of course, just laying out the theory!)
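That break-even reasoning can be sketched in a few lines. A minimal sketch, assuming the article’s 4–5x ratio; the absolute per-search figure and the function name are illustrative assumptions, not measured values:

```python
# Break-even sketch: N conventional searches vs. one LLM query.
# WEB_SEARCH_WH is an assumed placeholder; only the ratio comes
# from the article above (LLM query ~4-5x a conventional search).

WEB_SEARCH_WH = 0.3           # assumed energy per web search (Wh)
LLM_RATIO = 5                 # article's upper estimate
LLM_QUERY_WH = WEB_SEARCH_WH * LLM_RATIO

def greener_choice(n_searches: int) -> str:
    """Return which option uses less energy for a task that would
    otherwise take n_searches conventional searches."""
    search_total = n_searches * WEB_SEARCH_WH
    if search_total < LLM_QUERY_WH:
        return "search"
    if search_total > LLM_QUERY_WH:
        return "llm"
    return "tie"

print(greener_choice(3))  # search: 3 searches beat one 5x query
print(greener_choice(6))  # llm: past the 5x ratio, one query wins
```

At a 5x ratio this matches the maxim above: up to four searches beat a single LLM query, five is a wash, and anything beyond favors the one query.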