Synthetic media spills are a public health threat, and also a threat to human connection.

It’s bad enough that people encounter pseudoscience and garbage marketing dressed up as healthcare information, much of it pushed by social media influencers and by dubious actors who join social support groups with the intent to sell. The prospect that chatbots may turn search engines themselves into stochastic pseudoscience influencers is egregious. It’s now “searchers beware” when it comes to AI-assisted search engine results.

Mystery AI Hype Theater 3000: The Newsletter
May 28, 2024

Information is Relational
Google’s AI Overviews Fails Helpfully Highlight a Source of Danger
By Emily

I’ve been writing for a while, in academic papers as well as blog posts and op-eds, about why LLMs are a bad replacement for search engines, and how synthetic media spills are polluting our information ecosystem. One of the key points is that, even if the answers provided could be magically made to be always “correct” (an impossible goal, for many reasons, but bear with me), chatbot-mediated information access systems interrupt a key sense-making process.

An example I like to use is as follows: Say you put a medical query into a traditional search engine (think one that would return “10 blue links”), and the links you get point to a variety of sites. Perhaps you’re offered links to the Mayo Clinic, WebMD, Dr. Oz’s site, and a forum where people navigating similar medical questions are discussing their own experiences. As a denizen of the Internet in 2024, you have had the opportunity to form opinions about these different sites and how to situate the information provided by each. You might know the Mayo Clinic as a renowned cancer treatment center, WebMD as a commercial web property but one that works with MDs to vet information, and Dr. Oz as a charlatan. The forum is particularly interesting, because any given answer lifted from such a site might be the kind of thing you’d want to confirm before acting on it, but the potential to connect with other people living through similar medical journeys, share stories, and pool information can be invaluable.

If, instead of the 10 blue links, you get an answer from a chatbot that reproduces information from some or all of these four options (let’s assume it even does so reliably), you’ve lost the ability to situate the information in its context. And, in this example, you’ve lost the opportunity to serendipitously discover the existence of the community in the forum.