Rediscovering Europe’s Enlightenment Spirit in the Age of Quantum Technology
Subjects: Digital Economy

The Nobel Prize in Economics was awarded to Philippe Aghion, Peter Howitt and Joel Mokyr for their work on a simple but hard-to-answer question: why did economic growth take off? For centuries, people lived much the same way. There were innovations in agriculture (the heavy plough), transport (the caravel), and communication (the printing press), but these advances did not transform living standards. Long-term economic growth remained elusive.
That changed with the Industrial Revolution and the Enlightenment. Mokyr’s crucial insight is that innovation alone cannot sustain economic growth. To achieve the rates of growth to which we have become accustomed, societies must foster the acceleration and accumulation of ‘useful knowledge’, what we now call research and development (R&D). Yet for knowledge to be most useful, it must be accessible within nations and across borders. Technical standards contribute to this shared “knowledge stock,” serving as one of its most practical components.
Standards transform knowledge into a common language for innovation, codifying how technologies interact and evolve. This embodies Mokyr’s idea of cumulative, organised knowledge: each new technical specification builds on the last. Technical standards are also public documents whose success depends on their adoption: the more widely a standard is used, the greater the benefits for all. But not just any technical standard is published by standardisation bodies. The process of developing a technical standard inside standardisation bodies is infused with the ‘competition for ideas’ that ensures only the best technologies are selected.[1]
That is why ETSI’s decision to establish a new Technical Committee on Quantum Technologies, tasked with developing technical specifications for quantum communications and networks, is timely. ETSI is not alone in its efforts to develop technical standards for quantum technologies. Other standardisation bodies, including CEN and CENELEC, have already set up Joint Technical Committee 22 (JTC 22 QT) on Quantum Technologies. The IEC and ISO have also launched Joint Technical Committee 3 (JTC 3) on quantum information technologies. Incidentally, this year’s Nobel Prize in Physics was awarded to John Clarke, Michel H. Devoret, and John M. Martinis for demonstrating how quantum mechanical phenomena could underpin the next generation of digital technologies, including quantum cryptography and sensors.
Standards are not the only means of generating ‘useful knowledge’. Firms carry out vast amounts of R&D independently, without coordinating with others. Some of this knowledge is patented and published; other knowledge becomes trade secrets or is embedded in products and services, making them hard to decipher or copy.
Take the market for large language models as an example. It can be argued that to succeed, developers of such AI applications need vast amounts of data and computing power, and these conditions favour a market dominated by large firms with deep pockets. While it is true that the best-known AI firms have grown rapidly, the market could have evolved differently. Had market-driven technical standards taken hold, they might have fostered a more open ecosystem, one in which specialised firms supplied datasets and training services for large language models, while consumer-facing companies focused on building interfaces for AI-powered chatbots. For now, though, this is not the case: OpenAI, Anthropic, Google and Meta carry out most of these tasks in-house.
In light of these developments, the European Union passed the AI Act. The Act is not, strictly speaking, a government-set standard, but it comes close. Many of its provisions dictate how AI products must be developed. For instance, companies have to maintain a continuous and iterative risk-management system; ensure that training, validation and testing datasets are “relevant, sufficiently representative, and, to the best extent possible, free of errors and complete”; and implement a quality management system (QMS) to ensure compliance.
The Act suffers from the same weaknesses as government-set standards. Debates leading up to the approval of the law were political rather than technical. The resulting rules were drafted with little technical expertise or commercial input, producing vague requirements that are difficult for firms to interpret and apply. The ensuing uncertainty encouraged European firms to develop their own AI solutions in-house, hindering the emergence of a broader EU market for AI technologies.
It is unsurprising that other countries did not follow in the EU’s footsteps. Indeed, major industry groups within Europe have called for a moratorium on the implementation of the Act, while one of our colleagues argued that the law should be scrapped and rewritten from scratch. The discussion has once again fed the perception that “the US innovates, China replicates, and the EU regulates.” Painful as it may be for Europeans, there is truth in that line. By most accounts, the EU trails the US and China in AI development, and the uncertainty created by the AI Act has only deepened that gap.
Backing AI technical standards would have been a far wiser policy for Europe. The good news is that market-driven AI standards can still succeed and achieve wide adoption. Firms themselves are the first to ask for unbiased, representative datasets and transparent algorithms for their products and services.
Market-driven standards also suit the European economy far better than other solutions, such as proprietary standards or regulation. Their strength lies in enabling European firms to specialise in what they do best. This specialisation fosters the rise of firms that concentrate on research and development, knowing that their innovations can be licensed and adopted by downstream players. The experience of Nokia, Siemens, Ericsson and many other EU companies in the cellular technology industry demonstrates how this system plays to Europe’s strengths.
The development of AI offers valuable lessons for EU policymakers turning their attention to quantum technologies. These technologies could shape Europe’s future competitiveness across a range of industries. Sectors such as aerospace and defence, automotive manufacturing, electronics, chemicals, biotechnology and software are all likely to integrate quantum components. At the same time, the applications of quantum technology vary widely. While research often focuses on quantum computing, other fields, such as quantum sensing, have already reached commercial maturity. A similar gap exists across EU member states. ECIPE analysis shows that some EU countries are moving quickly in quantum research and market deployment, while others remain at earlier stages.
Market-driven technical standards are far better suited to this reality than any regulation attempting to steer Europe’s quantum technologies in a particular direction. The interoperability enabled by standards fits the European industrial ecosystem, where innovation is spread across sectors. They are also better equipped to support the development of a technology whose progress remains uneven among EU member states. True, standards take time to develop. Yet because they are industry-led and consensus-driven, the resulting frameworks are more likely to gain broad adoption and evolve in step with innovation.
Moreover, technical standards can help develop the QMS that the AI Act sought to impose through regulation. Just as EU legislation often references standards developed by CEN, CENELEC or ETSI, QMS provisions in market-driven technical standards for quantum technologies could serve as a bridge to future regulatory compliance. These technical standards could also align with existing international standards, further enhancing interoperability.
Against this backdrop, references to ‘technological leadership’ and ‘sovereignty’ in the EU Quantum Strategy and potentially in the upcoming Quantum Act are not only unrealistic but also unhelpful. As we have argued before, Europe’s interdependence with non-EU partners in quantum technology is a strength, not a weakness. Technical standards reinforce this interdependence and help European firms secure a central role in the development of quantum technology.
More could be done to encourage participation in the standardisation process by European firms of all sizes, as well as by research institutions and universities. EU policymakers must remember that technical standards do more than ensure compatibility: they enable companies to create solutions that push the technological frontier forward. In other words, they foster the acceleration and accumulation of ‘useful knowledge’.
Standardisation bodies and market-driven technical standards are the scientific societies of the digital ‘Republic of Letters’ and have helped Europe punch above its weight. Their role in shaping cellular technology and European companies’ position in that development is proof of that strength. If the EU is to reclaim its central position in technology development, it must rediscover the institutional qualities Mokyr identified: a culture open to new knowledge and competition among ideas. In the next wave of technological progress, from AI to quantum, market-driven standards are the vehicle through which Europe can regain its standing.
[1] In Mokyr’s words, technical standards are systematic and cumulative (e.g., telecommunications standards such as 2G, 3G, 4G, 5G and 6G build on one another); their principles extend beyond a single tool (e.g., Wi-Fi connects everything from mobile phones to fridges); they are empirical and testable (i.e., firms propose solutions within standardisation bodies that others scrutinise and validate); and they are instrumental, solving practical problems or enabling new techniques and processes.