In a recent CBC interview, an anti-vaccination supporter declared, “I’ve done my research, and come to my conclusions.” The interview revealed that this “research” was incorrect, fraught with inaccuracies, and ignorant of basic sources of information, such as publicly accessible databases and inventories of information on vaccinations. In our age of ever more accessible information, how could so much be missed? Modern algorithms trick our old-fashioned brains into thinking there is more support for a minority argument than really exists.
Google launched its personalized search system in 2004 and made it part of all Google searches in 2005. Every Google search is associated with a browser cookie record, which informs the results you see on future searches. When you type in your search terms, the list of results is based not only on the words entered but also on your previous browsing history. Someone else could enter the same keywords and see different results, given their different browsing history.
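To make the idea concrete, here is a minimal sketch of how history-biased ranking could work in principle. This is an illustrative toy, not Google’s actual algorithm: the scoring rule, the `history_boost` parameter, and the example domains are all assumptions invented for this example.

```python
# Toy sketch of history-biased search ranking (NOT Google's real algorithm).
# Results matching the user's query are boosted when their domain appears
# in that user's browsing history, so two users see different orderings.

def rank_results(results, query_terms, history_domains, history_boost=2.0):
    """Score each result by query-term overlap, plus a boost (an assumed
    constant here) if its domain appears in the user's browsing history."""
    def score(result):
        text = (result["title"] + " " + result["snippet"]).lower()
        base = sum(term.lower() in text for term in query_terms)
        boost = history_boost if result["domain"] in history_domains else 0.0
        return base + boost
    return sorted(results, key=score, reverse=True)

# Hypothetical result pool; both entries match the query equally well.
results = [
    {"domain": "who.int", "title": "Vaccine safety",
     "snippet": "vaccinations are safe and effective"},
    {"domain": "skepticblog.example", "title": "Vaccinations questioned",
     "snippet": "doubts about vaccinations"},
]

# Same query, different histories, different first result.
mainstream = rank_results(results, ["vaccinations"], {"who.int"})
skeptic = rank_results(results, ["vaccinations"], {"skepticblog.example"})
```

Under this sketch, the user whose history contains the skeptic site sees that site ranked first, while another user sees the public-health source first for the identical query, which is the dynamic the article describes.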
When a climate denier or an anti-vaxxer enters the terms “climate change” or “vaccinations,” they will see information posted by skeptics of the science of climate change or vaccinations. This leads to the assumption that this information is more popular, and the views more widely held, than they actually are.
This is where our old-fashioned brain comes in. We use heuristics all the time. Heuristics are shortcuts. They can be helpful in many situations – instead of rationally thinking through every situation, we can make quick decisions based on past experiences and impressions (which is useful when crossing the street and a bus is barreling toward you). Think of the hundreds of decisions, large and small, you make in a day: heuristics keep us moving forward.
A well-studied heuristic is the availability heuristic: we use readily available information to make a decision. For example, if you see several news reports of airplane crashes, you may question whether air travel is unsafe. In reality, the news reports may be highlighting the very few cases of airline crashes because they are dramatic and, frankly, make for good TV. The availability heuristic can fool as much as it helps.
When someone with a past browsing history of climate-denial or anti-vaccination sites searches for climate change or vaccinations, their results will fool rather than help. The easy availability of the information – ranked highly by Google, no less! – will trick their brain into concluding that the information is correct, because it is so available.
In reality, climate deniers’ claims have been thoroughly debunked (even by former climate-skeptic scientists), and the only study linking vaccinations to disease was retracted after its results could not be replicated. The results these searchers receive reflect their past search history, not the facts of the issues at hand or the considerable weight of the scientific evidence.
In part because of such personalized searches, those who are already skeptical become more entrenched in their positions. Societal debate and dialogue are reduced to the vast majority trying to convince the minority to accept reality. It’s a base, black-and-white debate that no one can win.
There are, of course, other factors, such as heavily resourced mobilization campaigns to create biased views and a growing culture of skepticism. Yet they are based on similar ideas – that there is a community out there with readily available information that can prove something is dangerous. The problem is that the fact checking is now also biased, leading individuals to dangerous conclusions.