I think most people would consider the function of a "search engine" to be returning links to sources. A tool that *creates content* is not a "search engine."
Wait, do you have examples of search engines that don’t use AI content generation yet create false websites and include those in the results?
Or are you arguing that if a search engine returns a website that itself contains untrue info, therefore the search engine itself was not 100 percent true?
Google may not be 'making up fake websites,' but it is absolutely indexing slop sites and serving them in search results. To say nothing of it also hallucinating in its AI overviews (see the recent 'you can't lick a badger twice' discourse).
It’s not about whether results are true; it’s about where those results come from. A search engine is for finding relevant, already-published content (true or not). AI “search” generates derived content. Whether this is derived from a static dataset or the current internet is irrelevant.
If it hallucinates sources and/or misrepresents actual source content, I do not think it should be called a search engine. Being transformational (and often bs) makes it something else. Sources might not be 100 percent true, but at least they're complete representations of themselves.
It can only be a search engine if it indexes and ranks links. With no ranking, it’s like the Yahoo Search directory with a corpus of links thrown at you at random.
You probably are too young to remember those days.
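For what it's worth, here's a toy sketch of what "indexes and ranks links" means in practice (hypothetical corpus, crude term-count scoring, nothing like a real ranker): such a system can only hand back links to pages that already exist, which is the whole distinction being argued here.

```python
from collections import defaultdict

# Toy sketch, not a real engine: a hypothetical corpus of already-published
# pages, an inverted index over them, and crude term-count ranking.
corpus = {
    "https://example.com/badgers": "badger facts and badger habitats",
    "https://example.com/idioms": "common idioms and their origins",
}

# Inverted index: term -> set of URLs whose text contains that term.
index = defaultdict(set)
for url, text in corpus.items():
    for term in text.lower().split():
        index[term].add(url)

def search(query):
    """Rank indexed URLs by how many query terms they contain; return links only."""
    scores = defaultdict(int)
    for term in query.lower().split():
        for url in index.get(term, set()):
            scores[url] += 1
    return sorted(scores, key=scores.get, reverse=True)

# It retrieves and orders existing links; it cannot invent a page.
print(search("badger idioms"))
```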
I expect much more curiosity from a journalist than this. Your audience is telling you something and you are refusing to take the time to understand where they are coming from.
If something fabricates a false search result, it is not a search engine.
And yes, the Google AI overview hallucinates results.
The Google AI overview is not a search engine.