Press the start button, switch on the monitor, grab a cup of coffee and off you go. That is pretty much how most of us experience booting up a computer. But with a quantum computer the situation is very different. So far, researchers have had to spend hours making dozens of adjustments and fine calibrations in order to set up a chip with just five quantum bits so that it can be used for experimental work. (One quantum bit or 'qubit' is the quantum physical equivalent of a single bit in a conventional computer.) Any small error in the adjustment and calibration procedure and the chip will not work.
The problem is that, not unlike musical instruments, quantum computers react to small changes in the local environment. If, for example, it is a little warmer or a little colder, or if the ambient air pressure is a little higher or a little lower than the day before, then the complex network of qubits will no longer function – the computer is detuned and has to be readjusted before it can be used. 'Up until now, experimental quantum physicists have had to sit down each day and see how conditions have changed compared to the day before. They then had to remeasure each parameter and carefully recalibrate the chip,' explains Frank Wilhelm-Mauch, Professor of Theoretical Quantum and Solid-State Physics at Saarland University. Only a very small error rate of less than 0.1 percent is permissible when measuring ambient conditions. Wilhelm-Mauch explains this sensitivity: 'That means that an error can occur in only one in a thousand measurements. If just two in a thousand measurements are in error, the software will be unable to correct for the errors and the quantum computer will not operate correctly.' With around 50 different parameters involved in the calibration process, one begins to get an idea of the sheer effort involved in calibrating a quantum computer.
Working together with his doctoral student Daniel Egger, Wilhelm-Mauch began to consider a fundamentally new approach to the problem. 'We asked ourselves the question: why is it necessary each and every day to understand how conditions differ from those of the day before? The answer we eventually came up with was that it isn't necessary. What's important is that the setup procedure produces the right results. Why it produces the right results is not so relevant.' It was this pragmatic approach that underlay the work carried out by Wilhelm-Mauch and Egger. 'For the calibration procedure we used an algorithm from engineering mathematics, strictly speaking from the field of civil and structural engineering, as that's another area in which experiments are costly,' explains Wilhelm-Mauch.
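To give a feel for this 'results over understanding' idea, here is a minimal sketch of a closed-loop calibration in miniature. It uses a derivative-free simplex search (Nelder-Mead) as one concrete example of a result-driven optimiser from engineering mathematics; whether this is the exact algorithm behind Ad-HOC is not stated in the article. The measurement function, parameter names and numbers below are purely illustrative assumptions, not the actual hardware interface.

```python
# Illustrative sketch only: closed-loop recalibration with a derivative-free
# optimiser. The "measurement" is a stand-in for reading an error signal
# from a real qubit chip; all names and numbers are hypothetical.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Unknown drift of the environment since yesterday: the true optimal
# control settings have shifted, but we never model why.
true_optimum = np.array([0.52, -0.13, 1.07, 0.91])

def measured_infidelity(params):
    """Pretend measurement: the error observed on the chip for the given
    control parameters, including a little measurement noise."""
    error = np.sum((params - true_optimum) ** 2)
    noise = 1e-4 * rng.standard_normal()
    return error + noise

# Start from yesterday's calibration, which is now slightly off.
yesterdays_settings = np.array([0.50, -0.10, 1.00, 0.95])

# Nelder-Mead needs only function values (measurements): no derivatives
# and no model of what caused the drift.
result = minimize(
    measured_infidelity,
    yesterdays_settings,
    method="Nelder-Mead",
    options={"xatol": 1e-4, "fatol": 1e-6, "maxiter": 500},
)

print("recalibrated settings:", np.round(result.x, 4))
print("remaining measured error:", result.fun)
```

Because the loop only ever asks whether the measured error went down, it makes no difference whether the drift came from temperature, air pressure or anything else – which is exactly the pragmatic stance described above.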
Using this technique, the two theoreticians were able to reduce the calibration error rate to below the required 0.1 percent threshold, while at the same time cutting the calibration time from six hours to five minutes. The Saarbrücken methodology, which goes under the name Ad-HOC (Adaptive Hybrid Optimal Control), has now been subjected to rigorous testing by a group of experimental physicists from the University of California, Santa Barbara. Their experimental work is published in the issue of Physical Review Letters that also contains the Saarbrücken paper.