The Myth of Unbiased Search Results

The idea that there is a single truth, free of interpretation or shades of grey, seems to lie at the heart of the digital promise. Wikipedia is one example: with it, there is no need for multiple encyclopaedias with different perspectives or focuses, because the truth is, so to speak, out there. The hive mind, or algorithms, or some combination of the two, will rise above the biased perspectives of the individual. Nowhere is this promise more prominent than in search. The ever-improving algorithms have only one objective: to provide you, the user, with the best, most relevant results.

Or at least that is how search is often perceived, according to Swedish author and journalist Andreas Ekström in a recent TEDx talk in Oslo. Ekström makes a convincing case that the algorithms we trust are not as objective as we might like to think, and that particular search queries are subject to significant human influence and bias. The neutrality, objectivity and independence we have learned to associate with the internet seem never to reach further than the conscience of the designers, executives and owners of the tech companies will allow.

I can only agree that such intervention is often needed, and that “neutrality” is at best a dead end and at worst a recipe for disaster. The problem is that the principles of interference are neither transparent nor predictable, and decisions cannot be appealed. With all that power over our facts and knowledge of the world, search engines can do a much better job here. First order of business: acknowledge this power. Second: develop transparent systems and processes of accountability. Look at how the traditional press has developed principles of press ethics; much of that can also be applied to online editors and curators, and not only in search: social media feeds are ruled by presumably unbiased algorithms too. Let’s wake up from this dogmatic slumber.

FULL DISCLOSURE: Ekström was a contributor to the Swedish Netopia-project and also contributed a chapter to an anthology I edited in 2009.
