The Case for a European Data Policy
Subjects: Digital Economy, EU Single Market
By Richard Robert, Executive Director, ParisTech Review
The digital single market is a fact. Neither a project nor an ambition, just a fact. The only thing is, it is in no way limited to Europe’s borders. The very notion of a border has been challenged by the development of digital platforms that have made investing, buying and selling truly global activities.
Though in this open, digital world Europe’s vision of trade as a major vehicle of regional integration is not as convincing as it used to be, it is still relevant. The grammar and vocabulary of trade, which have formed our institutions’ lingua franca since 1957, can help us describe, understand and shape the EU’s place and power on a digitized planet. But they need to be inserted into a different narrative.
Take standards, for instance. For decades, they have been the main material for technical barriers to trade, and as such they are precious assets when negotiating trade agreements. Accustomed as it is to discussions on technical standards among its own member states, the EU has specialized in juggling standards and using them, alternatively, as a substitute for tariffs (think of GMOs), as a little help to its industrial champions (think of the GSM standard), or as a valuable card for its negotiators (think of CO2 emissions). Europe’s power is, as discussed by many authors after Ian Manners’ seminal article in 2002, a normative power.
But with the digital economy, Europe has lost its grip on standardization. Even the celebrated Net Neutrality is fundamentally an American regulation, made by a Federal Commission whose decisions balance the interests of the American public and several large US broadband users such as Google or Netflix.
One could thus consider the Commission’s rhetoric of creating “digital independence” (to quote Günther Oettinger) not very realistic. In the current state of affairs, spurring a greater expansion of the digital economy in Europe would just pave the way for US giants. The platform economy is a winner-takes-all market. It is hard to see what kind of European policy could create a challenger to Amazon or Google. The remaining cultural segmentation of European markets (if only because of languages) is in itself a barrier – not so much to entering digital markets, which is easy, as to growing at a rapid pace, and ultimately to winning.
Hence the EC’s temptation to use standards in the old, protectionist way. Commentators such as Philippe Legrain have highlighted this trend, pointing out the disastrous outcomes of such a short-sighted policy.
Yet there is one thing that is a matter of sovereignty, that can be regulated in an ambitious yet non-protectionist way, and where the European political culture could be an asset for future champions: data.
Notwithstanding all the buzz about Big Data, data mining is still in its infancy and technical standards are still to be decided. The definition and implementation of high-level standards should not be taken as a way to protect markets, but as a way to create markets. And to shape them.
Let us not be blind: when it comes to exploiting large sets of data, European companies are trapped in a patchwork of national laws that were designed according to a traditional vision of collection, storage and use, impossible to apply to massive data. That is why European players are at a disadvantage. But this disadvantage may turn into an advantage thanks to a smart data policy that would preserve the best of European data culture – protection, attention to privacy – while unifying the national regulations in order to define a new, consistent framework.
Health data provide a very good example of what should be done. Medical data lie at a crossroads between two worlds: the patient’s privacy and statistical epidemiology, which carries a positive use for the population, as well as multiple opportunities for smart companies – from insurance services to personal service providers. An important question is how to balance these two dimensions. This question happens to be a key technical stake as well, for any big data processing needs to simplify and unify the data (which have to be included in a minimum storage unit, then managed and sifted through the processing system): too large a quantity of data will complicate matters and slow down any selective ranking process. This is where an ethical dimension can be introduced, and this is where the law plays a defining role.
The quality of the legal framework, its capacity to impose its standards and shape the technical field, is essential. The efficiency and speed with which Europeans are able to define this framework will determine their ability to develop champions, to attract the research centers of worldwide groups, or – speaking of digital independence – to retain any control over standards of data protection that could otherwise be imposed from outside.
Three directions stand out.
The Internet of Things. Behind the current discussions on standards lies a crucial question: who will control the machine-to-machine data – the manufacturer or the information system provider? A competition is under way between the Industrial Internet Consortium (a public-private organization led by US companies, recently joined by India’s Infosys and Germany’s Bosch) and Germany’s Industrie 4.0 project. Power and speed will make the difference. European players should play as a team, and the Commission’s role should be to help them unite and speed up. There is no protectionist option here: once the standards are set, the game is global. A smart data policy is the basis of any industrial policy in this field.
Open data and smart cities. Public data have great potential value at almost no cost. Governments can use this public good and offer it for free, thus pouring an inflow of value into their economy and spurring the growth of the many startups and the few giants (Veolia, Siemens) trying to develop and export know-how. Standards won’t be of a legal nature here: they lie in a certain way of building and representing a city, an insistence on public transportation, for instance, on pedestrians, on bikes, on what is common rather than what is private. Europe certainly has a competitive advantage here, and the global smart cities market is a booming one. Opening up our public data widely – in our own languages – could give our players a real boost.
Personal data and privacy. The gathering of personal data has become a central issue. The question is not how to protect ourselves from Google or Facebook. It is to think further: with data mining becoming more and more crucial in shaping our digital environment, the game is already changing. Until now, US platforms have benefited from three things: (1) the size and consistency of their native market, (2) the service quality and the attention they pay to their customers, a distinctive feature of American culture, (3) their unrivaled capacity to enroll companies and individuals in the corporate value chain, almost for free. But they have somehow failed to provide their users with a high level of confidence in the protection of their personal data. With data mining giving way to ever more precise targeting of who we are, this failure may prove fatal. In any case, European players may take advantage of a different culture, one that pays more attention to privacy. An insurance company such as AXA is already redefining itself as a trusted third party, placing trust and confidence at the very center of its value proposition.
In this context, a consistent set of European standards could prove to be a lever for framing a digital industrial policy. In a legal framework where individuals enjoy a high level of protection, the winners will be the enterprises that inspire enough confidence to obtain users’ consent to access and use their personal data.
Trust, says Rachel Botsman, is the currency of the 21st century. Maybe. But data is the commodity of the future, and those who can combine trust with valuable data can lead the race. It has just started.