The actually scary thing is when the AI suggestions become less ridiculous. It seems very unlikely that anyone would try to sauté garlic in gasoline, but at some point these suggestions are going to sound more reasonable while being just as dangerous. Say, which medications are safe. Or what to do in an emergency. Those are the things that are going to get people killed.
These have to be fake right… please tell me they are fake… please
They’re very real. There’s also one telling people to eat glue, which got a lot of attention recently. Lol
And Google will be held accountable right… Right?
I’m extremely surprised we haven’t seen the AI telling people that mixing vinegar and bleach makes a killer cleaner
See, you say that, but…
I thought they must be photoshopped… but then I messed around with it just now for, like, a minute:
To be fair, anything is edible at least once.