It’s not all about the user experience – not even for those who say it is.
You know there is something missing when you read the line ‘We collect personal data to give you the best possible user experience’. A more truthful account would be something like ‘We collect personal data generated not only from your use of our service but from pretty much everything we can harvest from your device – to some extent to improve your experience, but mainly because the more data we have about you, the more we can monetise, primarily by selling adverts that tend to chase you from one website to the next for days’. It’s like in hotel rooms – who actually believes the management cares about the environment and doesn’t just want to save on laundry?
But what if it were all about the user experience? What if the user experience were the only thing that mattered? Would you prefer a single company controlling all the information in the world, giving you the perfect, just-in-time, personalised user experience every time? Or would you rather settle for less-polished services from various companies, none of which has all your information? Let’s just say there are other values in life, and online, than the user experience.
A close relative of the user experience myth is the algorithm myth, as in ‘we don’t have any responsibility for the result; it’s all in the algorithm’. Yes, except you wrote it, you fed it with data and trained it, you tweaked it and keep updating it to deliver … ahem … ‘a better user experience’. It’s like in the comedy show Little Britain where a hospital receptionist takes the most obnoxious stances possible with patients – like signing up a five-year-old for double hip replacement surgery – because the ‘computer says no’. If anyone blames the algorithm, they’re playing dumb in the hope that you won’t call their bluff. Don’t buy that!
A variation of this myth is ‘The Almighty Algorithm’, as in ‘we can’t be responsible for the output of the algorithm’. Except you can. While it may be complicated, an algorithm is a set of instructions for how a computer should handle particular situations. My kids have a Lego robot called Bullen. It has a simple graphic programming interface. It’s easy to tell Bullen to, for example, first move forward 30 centimetres, then stop and turn 180 degrees, when I press the Start button. Every time my eight-year-old presses the Start button, Bullen will carry out these exact instructions. It’s predictable and we know what’s going to happen, because we told Bullen what to do – or wrote the algorithm. Algorithms for news ranking, search, dating or financial services are obviously much more complicated than Bullen’s, but they work on the same principle. The owners of those algorithms constantly tweak and alter them for various reasons – improved profits, better function, better security, sometimes even a better ‘user experience’. If you don’t like the output of the algorithm, you change it and try again. Then repeat, until it produces the results you want. When someone says the algorithm is too complicated, they may want you to think something like ‘the Lord moves in mysterious ways’, but really they’re just saying they don’t want to, or can’t be bothered to, do what you ask of them.
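To see how little mystery there is, here is a toy sketch in Python of the point above – a ranking ‘algorithm’ whose owner can change one number until it produces the results they want. Everything in it (the function name, the `ad_weight` parameter, the story data) is invented for illustration; it is not any real company’s code.

```python
def rank_stories(stories, ad_weight):
    """Score each story and return them best-first.

    `stories` is a list of dicts with made-up 'title', 'relevance' and
    'ad_revenue' fields. The owner chooses `ad_weight`; raising it
    quietly shifts the ranking from relevance toward revenue.
    """
    def score(story):
        return story["relevance"] + ad_weight * story["ad_revenue"]
    return sorted(stories, key=score, reverse=True)

stories = [
    {"title": "Local news", "relevance": 0.9, "ad_revenue": 0.1},
    {"title": "Sponsored listicle", "relevance": 0.2, "ad_revenue": 0.9},
]

# With ad_weight=0 the ranking follows relevance alone...
print(rank_stories(stories, ad_weight=0.0)[0]["title"])   # Local news
# ...but tweaking a single parameter flips the result.
print(rank_stories(stories, ad_weight=2.0)[0]["title"])   # Sponsored listicle
```

Like Bullen, this code does exactly what it is told, every time; the only question is what its owner told it to do.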
One great example of how algorithms can be biased was provided by journalist and author Andreas Ekström in a TED talk called ‘The Myth of Unbiased Search Results’. He takes two examples of internet hacktivism in Google image searches. In the first, racists tagged photos of monkeys with Michelle Obama’s name in order to make the search engine return monkey pictures alongside real images of the first lady. To its credit, Google intervened and tweaked the algorithm and image tags so that Michelle Obama picture searches would be accurate. In the second example, hacktivists similarly tagged images of dog droppings with the name of Norwegian mass murderer Anders Behring Breivik (who, in 2011, killed 77 people, many of them teenagers, in a terrorist attack on a government building and a left-wing youth summer camp). In this case, Google did not intervene. It is easy to agree with Google’s judgement in both cases – but then don’t tell us the search results are unbiased, or that you don’t know what the algorithm does.