Author Archive

Can We Make the Digital World Ethical?

Saturday, February 15th, 2014

The digital revolution connects people. Or at least, so we’re told. Our assumption about the internet and digital technology is that they are about people communicating with people. The benefits are great: anyone with a laptop and a broadband connection can offer services to a potential global audience. Or so the story goes. The conflicts online involve real people: trolls on forums, privacy issues, grooming, hackers and man-made viruses. Or that’s what we like to think.

But what if the digital revolution is about machines communicating with other machines? What if algorithms, software bots and smart devices generate most of the traffic? The stock market report in the newspaper you read with your morning coffee is not compiled by a human editor; or rather, it is, but that editor is a bot, compiling data from stock market servers. Most of the trades on that stock market are made by machines. The ads on that newspaper’s web page (and all other web pages) are published by algorithms, and bought by bots on micro-second ad exchanges. When you tweet your thoughts on the latest market trends, that tweet is read, analysed, retweeted and stored by bots (often with human-looking account names). And you haven’t even finished that morning coffee yet.

Within five years, a majority of online traffic will be machine-generated. Humans will be in the minority in terms of connectivity. That is the complete opposite of the way we think about the internet today and it raises many questions: can machines be accountable for mistakes? Small mistakes, sure, but what about medical treatment bots, self-driving trucks, or automated weapon systems? All of those technologies exist today. The trends of cloud computing, big data and smart devices accelerate this development in the direction of machines increasingly making decisions without human involvement.

Done right, this could be a blessing – scores of bots making your life easier. But the past six months’ revelations about privacy abuse by both government and private organisations suggest that it’s much more complicated. The legal consequences of this technology must be addressed, yesterday. Most importantly, what sort of ethics go into these systems?

To address these questions, Netopia publishes the report Can We Make the Digital World Ethical? to be presented at a seminar in Brussels on Tuesday. Hope to see you there!

https://www.netopia.eu/2014/01/31/event-ethics-in-the-digital-world-feb-18/

It’s progress, deal with it

Wednesday, February 5th, 2014

Financial Times columnist Martin Wolf comments on a new book by MIT researchers Erik Brynjolfsson and Andrew McAfee: The Second Machine Age (yes, Netopia will review it shortly). The question is whether intelligent technologies can be a threat to our lifestyle and society. Which, coincidentally, is the topic of Netopia’s next event, Ethics in the Digital World, Brussels, February 18th.

Many Netopia topics run together in Wolf’s column: the power concentration of big data. The threat to knowledge jobs. The decision-making machines. The winner-takes-all markets of global niche monopolies. The value of intangibles not being properly reflected in the economy. The digital revolution’s part in the economic stagnation. Nice to see, but what is the answer? Or better: what are the answers?

So far, the answer has been “it’s progress; deal with it.” But more and more voices question this definition of progress: why does it only benefit a few corporate entities, while the rest of the world pays the price of free search and networking with falling productivity, unemployment and increased social inequality? Technology is not some independent force of nature; it is the result of investment, conflicting interests, competing standards, government intervention, consumer demand, and many other factors. The future is not predetermined; we can change it. If we don’t want to see one economic ecosystem after another being harvested by the “creative disruption” of tech companies, we can put a stop to it.

Read this Wired story about online limousine service Uber. The company has an app that connects passengers with drivers, bypassing the rest of the ecosystem. Great stuff: it cuts out intermediaries, brings down prices, and brings better business to limousine companies. Except that when faced with the hassle of local taxi regulation (such as pricing and meter rules), they pulled the usual: “those rules don’t apply to us; we are only the connector” (think telcos and pirates, Google search, and defamation). How is that not saying, “We’d like to be above the law, thank you very much”? Or as the joke goes, “Gravity doesn’t work for Google.” Google Ventures is an Uber investor, by the way. Surprise.

There is a simple rule: you make money, you play by the rules. Why should some be held to different standards just because they call themselves “progress”? That joke got old really fast. So let’s stop the charade. Use the legal system to make global tech companies follow the same rules as everybody else. If that doesn’t help, make better laws. And remember, we make the future. Code is written by people, at least for a little while longer. Tomorrow, most of it will be written by machines. That will be a completely different story.

The Curse of Unlimited Information

Tuesday, January 28th, 2014

In today’s news, Swedish mash-up service Lexbase combines public data on home addresses with court data on convicted criminals, creating a map where you can see if your neighbour was ever in trouble with the law. This is possible due to Sweden’s generous policies on public data. It is also a very good illustration of why more information is not necessarily better. For all the nice buzzwords about transparency and access to data, this is nothing but a violation of privacy on a massive scale. The legal system deals with crime and punishment; once a convicted criminal has served their time, the debt to society has been paid. Sure, you can request rulings from the courts’ archives, but that access to information is far different from broadcasting it to anyone, 24-7.

Digital evangelist Clay Shirky talks about how technology can create emergent behaviours by lowering transaction costs. In Here Comes Everybody he has a beautiful example of the Coney Island Mermaid Parade, which inspired a spontaneous collective photo database online – something that would have been impossible with the higher transaction costs of analogue technology and hierarchical distribution. But if that is the case, the opposite is also true: in some cases it is important to introduce speed bumps to avoid harmful use. This was always the case; systems of press ethics have been in place for many years, and for good reason, in all forms of mass media prior to digital self-publication. It’s just that we thought the digital would somehow be different. Well, it wasn’t, and now we have to reverse-engineer the whole darn thing to make it comply with ethics and law. It has been apparent for quite some time now that information does not want to be free (it wants to be controlled by cloud giants and security services).

Schmidt Shares Netopia’s Concerns over Job Loss

Friday, January 24th, 2014

The Davos connection. This week, I wrote about technology leading to job loss, the so-called Luddite fallacy. Not sure Google Chairman Eric Schmidt reads Netopia regularly (probably not, right?), but his speech in Davos shows he shares my concern. At the World Economic Forum, Schmidt said there is a race between humans and computers – “and the humans need to win”. Hear, hear, says Netopia. Except Schmidt forgets his own role in the process: it is the cloud giants, and mainly Google, that drive the technological developments that threaten to have this impact. It is Google’s self-driving cars that will put taxi drivers out of a job. It is Google’s advertising business that pulls the rug out from under the feet of the news media. It is the obsession with cutting out intermediaries that built Google’s empire that is also driving the job loss. That Schmidt expresses this concern without considering his own responsibility is nothing short of preposterous. The excuse “if we don’t do it, somebody else will” has never convinced anyone. Instead, Schmidt points to government to fix this.

Looking then at the European Commission, it seems that Schmidt’s calls for intervention are futile. According to today’s press release, the Commission expects the digital revolution to provide the solution, stating that since last March over 2200 digital jobs have been created in Europe (and over 5000 internships!). Great for the people who got the jobs, but that does not help in the grand scheme of things. For the first and very likely last time, Netopia says: European Commission – Listen to Google!

The increased productivity of the digital era takes away far more jobs than it creates. The growth in jobs will never come from the niche monocultures of the cloud giants, but from the SMEs that Europe professes such great love for. The government should be thinking more about how to sustain a working ecosystem of SMEs than about how to support the locusts of the cloud giants, sacrificing one sector after another on the altar of efficiency. Oh, we love free search and e-mail, but if our best career prospect is The Internship, we’d much rather just watch the movie.

UPDATED: In November 2013, 26.5 million people were unemployed in the European Union. Your math homework: how many Digital Agendas are needed to get them jobs? Send answers to stromback@netopia.eu. The first correct answer wins a handshake meeting with Eric Schmidt. Or the Netopia editor. Whichever we can organise first.

Apple Faces Lawsuit for Selling Personal Info to Third Parties

Wednesday, January 22nd, 2014

There has been a lot of criticism of how cloud companies abuse their customers’ privacy in doing their business. The phrase “If you’re not paying for a service, you’re not the consumer – you’re the product being sold” is quoted more and more often. The main targets of this criticism have been Facebook and Google, while Apple has managed to mostly stay clear. Many, including this writer, probably thought that since Apple’s main business is selling hardware, it has a different model that is less vulnerable to privacy concerns. We were wrong. In a lawsuit in Massachusetts (which may be the most difficult state to spell!), Apple is accused of unlawfully collecting customers’ ZIP codes with credit card purchases and selling this information to third parties. Too bad – the fruit was rotten.

Anticipatory Shipping – Amazon Looks Deep into the Crystal Ball

Sunday, January 19th, 2014

Many of us shook our heads, thinking they had lost their marbles, upon reading Viktor Mayer-Schönberger’s and Kenneth Cukier’s theories on how prediction and probability will rule everything from insurance premiums to crime prevention in the near future, as presented in their 2013 book Big Data. For sure, we knew already that Google can predict election results and flu spread. And we were familiar with semantic analysis of social media for things like stock market recommendations. But predicting the specific behaviour of individuals ought to be something different altogether, right? Free will, and all that. Well, it turns out this future is a lot closer than most of us would like to believe. This week, Amazon received a US patent on “anticipatory shipping”. That’s right: the shipping starts before you have placed the order. By analysing things like previous purchases and browsing behaviour, Amazon seems to believe it can predict your buying decisions with enough precision to commercially justify dispatching your orders before they have been placed. If this flies, a paradigm shift is on its way for sure. Will my mates know when and which bar to turn up at for after-work drinks without making any arrangements? Will I not even have to call a taxi, but only step outside to find one conveniently waiting in the street? (That, by the way, happens to be the system in Beijing, but it has nothing to do with big data, just loads of taxis.) Will this blog write itself, because WordPress knows what I think before I write it? Or maybe that already happened, because how would you – dear Reader – ever know?
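As a purely illustrative sketch – nothing to do with Amazon’s actual patented method – the logic of anticipatory shipping boils down to scoring each customer–product pair from behavioural signals and dispatching above a threshold. All names, weights and thresholds below are made up:

```python
def purchase_probability(views: int, past_purchases_in_category: int,
                         wishlisted: bool) -> float:
    """Toy score in [0, 1] built from simple behavioural signals.

    The weights are invented for illustration; a real system would learn
    them from historical purchase data.
    """
    score = 0.1 * views + 0.3 * past_purchases_in_category
    if wishlisted:
        score += 0.4
    return min(score, 1.0)


def should_preship(views: int, past_purchases_in_category: int,
                   wishlisted: bool, threshold: float = 0.8) -> bool:
    """Dispatch the parcel before the order exists if the score clears the bar."""
    return purchase_probability(views, past_purchases_in_category,
                                wishlisted) >= threshold


print(should_preship(5, 2, wishlisted=True))    # → True: heavy signals
print(should_preship(1, 0, wishlisted=False))   # → False: a single idle view
```

The interesting design question is not the arithmetic but the threshold: set it too low and trucks fill with unwanted parcels; set it too high and “anticipatory” shipping anticipates nothing.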

POTUS Puts Leash on Surveillance Agency – Short Enough?

Saturday, January 18th, 2014

It seems to never end – only this week we learned that the NSA’s Dishfire system picks up close to 200 million text messages every day. The Americans are not the only culprits; the UK’s GCHQ also uses the system. And it also turns out that spy programs were inserted in 100,000 computers before they were shipped to customers, so the NSA and its associates could monitor them even when they were not connected to the internet. We can only expect more revelations to follow; the age of innocence is over for sure. It used to be that we could feel secure by hiding in plain sight, just because the numbers were so big: billions of users, calls, messages, e-mails, status updates, GPS records. But with big data analysis, the shoal no longer obscures the fish. Everything digital should be regarded as immediately visible to authorities and private companies, even if you’re not online. When privacy is of the essence, use non-digital communication (and I don’t mean postcards).

Oxford professor of internet governance Viktor Mayer-Schönberger should be familiar to most Netopia readers by now. His thinking, and his previous book Delete, is an obvious influence on EU data protection policies such as the “right to be forgotten”. In his more recent work, Mayer-Schönberger moves the focus from the collection of data to its use, and suggests accountability as the most effective regulatory tool. The only problem with this approach is that once the data is collected, the risk is that it will leak one way or another, never mind the best intentions of the collector.

Today, President Obama announced restrictions on the NSA’s mandate. For sure, they could be more ambitious, but the basic idea is sound. Anonymity and invisibility are not the answers to privacy. In fact, they are an illusion with today’s technology. Democracy is much better; government institutions should decide what information can be collected, for what purpose and how it can be used. Not only for government functions but also for private data collectors. The questions will then be “What should be collected?”, “Why?”, “How can it be used?” and “When should it be deleted?” Those are great questions for public debate. Much better than the current “if” there should be any restrictions whatsoever on what happens online. Great to see Obama in this camp. The next order of business should be to drop the charges against Edward Snowden. Why not bring him on board as an expert advisor to the government on these matters? You know, just like cybersecurity companies make a habit of hiring the hackers that beat their systems.

Open Tech Does Not Equal Open Society

Wednesday, January 15th, 2014

This week, Netopia contributor Ralf Grötker reviews Evgeny Morozov’s To Save Everything, Click Here. I strongly recommend it to anyone interested in the digital society – in fact, anyone interested in society – because that is Morozov’s main point: we look at the technology and not at the context. Internet-centrism is everywhere; we think the internet will change society and our lives in some predictable way, but technology is only one of many factors that influence the development. This point is fundamental: we should think about problems first, solutions second, and technology only when relevant, but today the sequence is in many cases the opposite. So while Morozov may spend a little too much ink on criticising his opponents, the main point can hardly be overstated.

One aspect of internet-centrism is the fixation with the word “open”. Who can be against open? Open is good – in the digital context it often also means free of charge (as in “open software”). It is very close to transparency, which Ralf Grötker discusses in detail in his review. Just as transparency is not always benign (there is an obvious conflict with privacy), open can be deceptive. What do we even mean when we say open? Is open the same for all, or can open in one context mean more closed in another? More importantly, open technology is not the same as an open society. The open society relies on human rights, freedom of speech, democratic institutions, legal predictability, the rule of law, independent courts and many other functions that have evolved over centuries. Open tech means that there are no built-in barriers to accessing data, adding features, building on what others have coded, or seeing what others have coded. It can be great – Netopia runs on open source software. Just remember it has nothing to do with an open society.

Selling Shovels in a Gold Rush

Monday, January 13th, 2014

To make money in a gold rush, you want to be selling the shovels. In the current gold rush, however, the shovels are digital. I’m talking about Bitcoin, of course, which is close to $1,000 again after the big drop before the holidays. Roughly every ten minutes, 25 new Bitcoins are minted, except there is no real excavation taking place. Instead, the Bitcoins are awarded to whoever first solves a complex math problem, and the puzzles become increasingly difficult over time. The mining/puzzle-solving is done with computers; in the early days a good graphics chip was enough, but now it is more common to pool computing power with others or to buy or build a dedicated system. Bitcoin mining hardware sets you back anywhere from a few hundred dollars all the way up to tens of thousands. But at the current exchange rate, even the most expensive rig needs to deliver only once or twice to give a return on investment. Of course, as more miners join and compete for new Bitcoin, the profits will increasingly be made by those who sell the rigs, rather than by the miners themselves. Same as every gold rush in history – why would digital be any different?
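The “complex math problem” is a proof-of-work puzzle: find a number (a nonce) such that a cryptographic hash of the block data meets a difficulty target. A minimal sketch in Python – a toy prefix check, not Bitcoin’s real double-SHA-256 target arithmetic:

```python
import hashlib


def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce such that SHA-256(block_data + nonce) starts with
    `difficulty` hex zeros.

    This is a simplified stand-in for Bitcoin's target comparison: each
    extra zero makes the search roughly 16 times harder, which is how
    the puzzles "become increasingly difficult over time".
    """
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1


# With difficulty 4, roughly 16**4 ≈ 65,000 hashes are tried on average.
nonce = mine("block#1|prev=abc", difficulty=4)
print(nonce, hashlib.sha256(f"block#1|prev=abc{nonce}".encode()).hexdigest())
```

Checking a solution takes one hash; finding one takes astronomically many. That asymmetry is the whole trick, and it is why the money flows to whoever sells the fastest hashing hardware.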

Computer Says No

Sunday, January 12th, 2014

In a classic Little Britain episode, a hospital receptionist suggests absurd, even horrible, treatments to patients. According to the receptionist, a five-year-old girl has an appointment for a hip replacement, but her mom says she is supposed to have her tonsils removed – to which the receptionist (played by David Walliams) only replies, “Computer says no”. Between the real-world people with their various ailments and the rigidity of the computer system that always says no, the receptionist invariably sides with the latter.

The same phenomenon appears in the 2013 blockbuster Elysium, but as dystopia rather than comedy. Set in the year 2154, it has Matt Damon playing an ex-con who tries to walk the straight and narrow but constantly gets in trouble with a legal system run by a computer mainframe and its robot agents. (More than anything, its aesthetics remind me of Schwarzenegger’s epic Total Recall.) The robots don’t understand humour, nuance or human misery; their only response is violence and imprisonment. It’s like a sadistic version of “Computer says no”.

During the holidays, I was on a break in the Balearics with my family. Since we travelled inside Schengen, we did not think twice about passports. Too bad the connecting flight going home was in London, which – obviously – is outside Schengen. It was easy enough to explain the situation to the passport police, who said it was no problem as we were going back to our home country. The issue was with the airline, whose computer systems were designed to comply with international travel regulations. It was impossible to check in to the flights without passport numbers! To make things more complicated, it was Sunday morning, so obviously no one who could bypass the system was in the office. For a while, I feared we would end up in limbo between departure terminals (like Tom Hanks in The Terminal, to drop another movie reference). It worked out in the end, thanks only to some very creative work by airline ground staff, who probably violated more than one rule in the process. But for a while, it was our own very real version of “Computer says no”.

It is very likely you’ve had a similar experience yourself. No matter how sophisticated the system, no software developer can ever foresee all the situations that may arise in real life. Let’s build a human override function into all systems from now on, okay?
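What might such an override function look like? A minimal sketch, with entirely hypothetical names and no relation to any real airline’s system: the normal rule still applies, but trained staff can record a documented exception instead of issuing a hard “no”:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Passenger:
    name: str
    passport_number: Optional[str]  # None when no passport is at hand


def can_check_in(p: Passenger, staff_override: bool = False,
                 override_reason: str = "") -> bool:
    """Normal rule: a passport number is required to check in.

    The override path lets a human accept a documented exception
    rather than leaving the passenger stuck with "computer says no".
    Requiring a written reason keeps the exception auditable.
    """
    if p.passport_number:
        return True
    if staff_override and override_reason:
        # In a real system this would be logged for later review.
        print(f"Override accepted for {p.name}: {override_reason}")
        return True
    return False


stranded = Passenger("A. Traveller", passport_number=None)
print(can_check_in(stranded))  # → False: computer says no
print(can_check_in(stranded, staff_override=True,
                   override_reason="intra-Schengen, national ID verified"))
```

The key design choice is that the override is explicit, attributed to a person, and justified in writing – a speed bump for the exception, not a hole in the rule.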