Participants explored what form the possible standards would take and how we could work together to avoid some of the usual pitfalls when different SDOs diverge, for example by harmonizing quantum terminology. The meeting was extremely interesting and general audience feedback was excellent. We will therefore be organizing a follow-up joint symposium later in the year to continue the discussion. One of our hopes is that we can pave the way for IEC, ISO, ITU and other SDOs to come together to collaborate on a roadmap for the standardization of quantum technologies.
Quantum technologies have already had a significant impact on the work of IEC technical committees in areas such as lasers and semiconductors. More recently, quantum computing has emerged as a new and very exciting frontier for standardization. I am a member of a working group on quantum computing set up by IEC and ISO in their joint technical committee on information technology (JTC 1). At the IEC, I chair Subcommittee 86B on fibre optic interconnecting devices and passive components.
Quantum computing taps into quantum mechanical effects, such as the behaviour of materials at extremely low temperatures or "entangled" particles such as photons, essentially arranging for nature itself to work out specific problems that would take "classical" digital computers prohibitive amounts of time to calculate. By prohibitive, I mean, in some cases, the age of the universe.
The benefits to society are potentially huge, allowing simulations that are currently impossible at world scale to be carried out in reasonable times. As always, though, the dangers are proportional to the benefits when a powerful technology is abused by "bad actors". Quantum computers will inevitably also be used for nefarious purposes, such as cracking powerful ciphers and hacking highly secure installations. Therefore, quantum technologies need to be part of the encryption process. That is why, for example, ISO/IEC JTC 1/Subcommittee 27, which is best known for the ISO/IEC 27000 series of IT cyber security standards, is already looking at ways to develop quantum-resilient cryptography.
Quantum computing still faces a number of hurdles before it can become a mainstream technology. The exact challenges depend on what type of quantum computing is involved. There are many fundamentally different ways of exploiting quantum superposition or entanglement, whether in photons, trapped cooled ions, nitrogen vacancies in diamond (carbon) lattices or other systems, to essentially implement parallel computing on a massive scale.
Each type comes with unique practical challenges. I would say one of the main technical challenges would be to make cryogenic chambers more commercially viable (cheaper and more portable), since many desired high-fidelity effects (such as photon counting) are best achieved at near absolute zero temperatures (from hundreds of millikelvins up to around 10 kelvins).
The photonics eco-system has built up rapidly over the past two decades and many enabling technologies for quantum computation, communication and measurement can be addressed by drawing on advances in photonics. For example, photonic integrated circuits are set to become a key enabling technology for quantum devices. In the IEC fibre optics technical committee (TC 86), we address mainstream optics and photonics technologies such as fibres, connectors, passive devices (WDM) and active devices (lasers, detectors) as well as emergent technologies such as photonic integrated circuits and optical circuit boards.
Quantum computing is still very much in its infancy, with new methods of quantum computation now emerging on a frequent basis. That is why it needs total freedom to innovate, to breathe and to proliferate. This cannot be impeded, hindered or constrained in any way. There are, however, some areas in which standards would be helpful, including raising the performance benchmarks for the equipment and infrastructure required to support quantum computing, quantum measurement and quantum communication.
One prime example would be to develop standards for lower loss fibres and connectors, in order to better allow delicate quantum states, qubits, in the form of single or entangled photons, to be conveyed over longer distances with a lower chance of decoherence and disruption.
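A back-of-the-envelope calculation illustrates why fibre and connector loss matters so much for quantum links. The attenuation figures below are illustrative assumptions, not values from the article: because single photons cannot be amplified without destroying their quantum state, attenuation in dB/km translates directly into the probability that a qubit survives the link.

```python
# Illustrative sketch (assumed figures): how fibre attenuation limits
# the distance over which single photons, i.e. photonic qubits, survive.
# Transmission probability T = 10 ** (-alpha * L / 10), alpha in dB/km.

def photon_survival(alpha_db_per_km: float, length_km: float) -> float:
    """Probability that a single photon traverses the fibre unabsorbed."""
    return 10 ** (-alpha_db_per_km * length_km / 10)

# Typical telecom-band fibre (~0.2 dB/km) versus a hypothetical
# lower-loss fibre (0.15 dB/km) over a 100 km quantum link:
for alpha in (0.2, 0.15):
    print(f"{alpha} dB/km over 100 km -> {photon_survival(alpha, 100):.1%}")
```

Even a modest reduction in attenuation compounds over distance, which is why standards driving down per-kilometre and per-connector loss would pay off disproportionately for quantum communication.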
Another example would be quantum random number generators, a very simple precursor to quantum computers. The purpose of these devices is to generate purely random numbers, and there are levels of "purity" of randomness. Even a mostly random number will have a pattern that a sufficiently powerful computer could, given enough time, predict. A purely random number has no pattern, meaning no computer could crack it. It is surprisingly difficult to generate a truly random number, but many organisations are coming up with increasingly sophisticated ways of harnessing nature to produce ever purer random numbers. Such purely random numbers can be used to create completely secure keys to encode confidential information.
Standardized benchmarks on new properties such as “purity of randomness” would therefore be a useful way of assessing the suitability of a technology for an application. For example, while total randomness could be overkill for non-critical, cost-sensitive applications, others, such as highly secure military and defence intelligence systems, would require the highest levels of randomness to encrypt their data, most likely at a cost premium.
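To make the idea of benchmarking randomness concrete, here is a minimal sketch of one of the simplest statistical checks used in practice, the monobit (frequency) test from the NIST SP 800-22 suite: a purely random bit stream should contain roughly equal numbers of ones and zeros. This is an illustrative example only, not a benchmark from the article; a real standard would combine many such tests.

```python
# Minimal sketch of the monobit (frequency) test from NIST SP 800-22.
# A small p-value (e.g. < 0.01) suggests the bit stream is biased
# and therefore not purely random.
import math
import random

def monobit_p_value(bits):
    """p-value of the monobit test on a sequence of 0/1 bits."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)   # map 0 -> -1, 1 -> +1
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

biased = [1 if random.random() < 0.6 else 0 for _ in range(10_000)]
fair = [random.getrandbits(1) for _ in range(10_000)]
print(monobit_p_value(biased))  # tiny: the 60/40 bias is easily detected
print(monobit_p_value(fair))    # typically well above 0.01
```

A standardized "purity" benchmark could then be expressed as a required set of such tests, with pass thresholds scaled to the criticality of the application.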
Richard Pitwon is an entrepreneur, engineer and scientist. He holds more than 50 patents and is a frequent contributor to peer-reviewed publications. He chairs IEC TC 86/SC 86B and is a member of ISO/IEC JTC 1 WG 14.