you know how some smartphone keyboards predict the next word that you’re going to use, and you can form a comprehensible sentence that sometimes even makes sense by simply tapping the next word on the prediction bar over and over? that’s what those language models do. they don’t actually search for anything, they just create sequences of words that sound probable.
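The keyboard analogy above can be sketched in a few lines. This is a toy bigram model (count which word follows which, then repeatedly pick a likely successor) — real LLMs do this over subword tokens with a neural network, and the training text here is made up, but the "just predict the next word" framing is the same:

```python
import random
from collections import defaultdict

def train(text):
    # record, for each word, every word that followed it in the text
    model = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        model[a].append(b)
    return model

def generate(model, start, length=8):
    # keep tapping "the next suggested word", like the prediction bar
    out = [start]
    for _ in range(length):
        options = model.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))
    return " ".join(out)

model = train("the cat sat on the mat and the cat ran off")
print(generate(model, "the"))
```

The output sounds plausible because every word pair did occur somewhere in the training text — but nothing was looked up, and nothing is checked for truth.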
It seems that the Bing chatbot searches, then reads the results and gives you the answer.
I know it’s basically predictive text, but if the prompt contains relevant info then the predictive text is likely to be the answer you’re looking for, so it works well.
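That "search, then read, then answer" idea can be sketched roughly like this. The corpus, the word-overlap scoring, and the prompt format are all made-up stand-ins (a real system uses a web index and feeds the prompt to an LLM), but it shows how stuffing retrieved text into the prompt conditions the predictive text on real information:

```python
# Tiny stand-in for a web index
CORPUS = [
    "The Eiffel Tower is 330 metres tall.",
    "Mount Everest is 8849 metres tall.",
    "The Great Wall of China is over 21000 km long.",
]

def retrieve(query):
    # pick the snippet with the most words in common with the query
    q = set(query.lower().split())
    return max(CORPUS, key=lambda s: len(q & set(s.lower().split())))

def build_prompt(query):
    # the model then "predicts" an answer that continues this prompt
    return f"Context: {retrieve(query)}\nQuestion: {query}\nAnswer:"

print(build_prompt("How tall is the Eiffel Tower?"))
```

With the right snippet in the context, the most probable continuation is also the correct answer — which is why this setup works better than prediction alone.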
Yeah, but with search results you can tell from the context — it’s just a list of random web pages — that what Google surfaces might be bollocks.
Google gives you a bunch of results and says “here, look at these”. LLMs confidently tell you things that they may have simply made up and present them as if they’re real.
I don’t get what generative AI could add to my browsing experience. However, I do think it makes a good search engine.
a “search engine” that hallucinates results, including but not limited to non-existent court cases.
And what? Half the shit on Google is completely wrong as well.
Google actually pulls results from web pages.