Over the past five weeks, I've followed DelftX's excellent Economics of Cybersecurity course, which concludes with an essay giving my opinion on a topic in this field. I've picked the incentives that relate to privacy, and how regulation is the only feasible way to bend the outcome toward one in line with the common good.
Coming from a technical background, my main experience in security has been with security measures (known as “controls” in this course). I have noticed that technical measures rarely tell the whole story: there is no black and white in this matter, and decision makers have no use for “this is just safer”-type arguments. This applies not least to privacy-related concerns: while it intuitively seems obvious that privacy should be a concern, it can be complicated to translate that concern into a business decision. The technology field has some uncommon incentives, most stemming from the reduced friction that technology brings, that make it impossible for a market-driven situation to shake out such that the privacy concerns of the public are served best. I therefore believe that regulation is needed that pushes liability onto the party best suited to control privacy.
As a software professional, I am well aware of the non-intuitive dynamics within the world of technology. The lack of natural friction on transactions opens the door to extremes of market share: first a huge number of small players in a race to the bottom (e.g., the current state of health- and fitness-related products and software), which later progresses into a practical monopoly for the player who understands this game best. In both situations, pre- and post-monopoly, market parties tend to make the choices that best serve their business goals. The winner-takes-all dynamics of this market mean that the effects are amplified once one party reaches a practical monopoly. From personal experience, I’ve seen that the direct business incentives around user privacy never align with those of the commons. This is exacerbated by (a) company policy, whether unwitting or intentional, to exploit the wealth of information that many market parties are currently laying their hands on, and (b) the fact that most consumers currently divulge information willingly for marginal benefits, putting no pressure on market parties to change this situation.
The current situation (customers don’t care, market parties are driven by business incentives) is an unstable balance, waiting to break. On the one hand, we see data abuses, breaches and security flaws running wild in various industries, such as insurance (Anthem), automotive (BMW) and appliances (Netatmo), which don’t get the backlash one might rationally expect. On the other, we see a quiet movement towards trust-no-one (TNO) systems, such as WhatsApp’s inclusion of TextSecure’s end-to-end encryption, producing marginal security awareness in the public.
Sometime in the near future, I expect these two trends to meet, likely in a large-scale breach whose effects will reverberate in the minds of the public. Such an event could undermine public trust in all market parties that process private information, not just the bad apples of the industry, potentially halting technological progress as the public, unable to distinguish good actors from bad, turns its back on both. The situation outlined above shows that market pressures will not produce the desired effects. Privacy, in this sense, becomes a communal asset, which needs (minimal) legislation to shepherd it. In other security settings, notably banking, it has been shown that pushing liability onto the party best suited to affect the outcome produces the proper results. The European Union’s General Data Protection Regulation (GDPR), which is expected to be adopted by most EU member states in 2015 and to become enforceable starting in 2017, provides just such a framework for placing liability with the correct party.