I’ve recently been told that it’s only the training that demands these massive datacenters, so it’s stupid to be upset about treat printing. Surely those datacenters don’t stick around afterwards, and surely their owners don’t demand more and more of them for even more training, you silly emotional Luddites.
Techbros are irredeemable shitbags. I saw one such shitbag on lobste.rs downplaying the energy cost by saying that one inference ONLY takes as much energy as a lightbulb does in one hour.
Right before mother gaia swallows us whole for being bad children to her, I would like to drive a stake through the hearts of a bunch of these monsters.
Microshit wants a nuclear power plant to power its upcoming datacenter. I think this is the endgame of the ridiculous things capital does in its lust for money.
Part of that contract hinges on a vague promise from a tech startup that totally swears it will deliver le epic fusion power in two years.
Wow, that’s definitely not embezzling funds
Meanwhile, many people who are actually involved in the programming shudder in horror at what lies ahead of us.
I took one year of programming in college, and that’s all I needed to decide I will resist as much IoT and bazinga slop as possible. But FWIW, it was kind of cool to fool around with data.
deleted by creator
That’s what the smug bazinga fuck claimed, and he even tossed in a pithy, cherry-picked appropriation of “no investigation, no right to speak.”
deleted by creator
Just running data through the resulting model is still somewhat expensive, since these models have so many parameters. And of course, for a lot of uses you want to keep training the model on the new data you’re putting through it anyway.
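If you want to ballpark that yourself, here’s a crude back-of-envelope sketch in Python. Every number in it is an illustrative assumption (model size, reply length, accelerator speed and power), not a measurement; the only real rule baked in is the rough “about 2 FLOPs per parameter per generated token” heuristic for dense models.

```python
# Back-of-envelope sketch: why inference cost tracks parameter count.
# All numbers below are illustrative assumptions, not measurements.

PARAMS = 70e9                      # assume a 70B-parameter dense model
FLOPS_PER_TOKEN = 2 * PARAMS       # rule of thumb: ~2 FLOPs per parameter per generated token
TOKENS_PER_REPLY = 500             # assume a mid-length reply

total_flops = FLOPS_PER_TOKEN * TOKENS_PER_REPLY

# Assume an accelerator that sustains 100 TFLOP/s at ~400 W (again, illustrative).
SUSTAINED_FLOPS = 100e12
POWER_WATTS = 400

seconds = total_flops / SUSTAINED_FLOPS
joules = seconds * POWER_WATTS
watt_hours = joules / 3600

print(f"~{total_flops:.2e} FLOPs, ~{seconds:.1f} s, ~{watt_hours:.3f} Wh per reply (per accelerator)")
```

Under those made-up numbers a single reply comes out to a fraction of a watt-hour per accelerator, which is exactly why the per-query framing sounds small; the fight is over how many billions of queries and retraining runs get stacked on top of it.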
The proselytizer treated it as a gotcha, so I appreciate the additional information.
In their defense, I’m sure there are tons of actually useful machine learning models that don’t use that much power once trained.
I have an iPhone with Face ID, and I think the way they did that was to train a model on lots of people’s faces; they ship that expensive-to-train model with the operating system, and then it trains a little bit more when you use Face ID. I can’t imagine it uses that much power, since you’re running the algorithm every time you open the phone.
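That’s basically the “ship one expensively pre-trained model, then do only cheap adaptation on-device” pattern. Here’s a generic sketch of that pattern, just to show why the per-unlock cost is tiny; it is not Apple’s actual Face ID pipeline, and embed() is a made-up placeholder for the shipped embedding network.

```python
# Sketch of the "ship a pre-trained model, adapt cheaply on-device" pattern described above.
# Generic illustration only, NOT Apple's actual Face ID pipeline; embed() stands in for the
# frozen, expensively pre-trained face-embedding network shipped with the OS.
import numpy as np

def embed(face_image):
    """Placeholder for the shipped embedding model: one cheap forward pass per image."""
    vec = face_image.flatten().astype(np.float64)
    vec = vec - vec.mean()                  # crude centering so unrelated frames look dissimilar
    return vec / (np.linalg.norm(vec) + 1e-12)

def enroll(face_images):
    """The cheap on-device 'extra training': average a few embeddings into a user template."""
    template = np.mean([embed(img) for img in face_images], axis=0)
    return template / (np.linalg.norm(template) + 1e-12)

def unlock(template, face_image, threshold=0.3):
    """Each unlock is one forward pass plus a dot product -- tiny next to the original training."""
    return float(np.dot(template, embed(face_image))) >= threshold

# Toy usage with random arrays standing in for camera frames.
rng = np.random.default_rng(0)
user_scans = [rng.random((32, 32)) for _ in range(5)]
template = enroll(user_scans)
print(unlock(template, user_scans[0]))           # enrolled frame: high similarity
print(unlock(template, rng.random((32, 32))))    # unrelated frame: low similarity
```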
Any model worth anything probably does require a lot of training and energy up front. I guess whether that’s worth it really depends on the eventual utility.