Quantum Computing Era Approaches as Moore’s Law Ends
April 12, 2019
Quantum computing is coming, and it’s safe to say that only a handful of people know what it is. At NAB 2019, USC Viterbi School of Engineering Ph.D. candidate Bibek Pokharel did an excellent job of breaking down the basics. First, according to quantum computer scientists, all the computers we have used thus far are “classical computers.” Although IBM, Intel, Google, Microsoft, Rigetti and D-Wave have built quantum computers, the task is so incredibly complex that you won’t be able to purchase one at Best Buy.
A quantum computer is so difficult and expensive to build that the effort might not seem worthwhile. But Moore’s Law, the observation that the number of transistors in a dense integrated circuit doubles roughly every two years, is reaching the end of its usefulness as transistors approach atomic scale. Something different is needed to get to the next level, and that’s where quantum computing comes in.
Although it poses hardware design challenges and is error-prone, quantum computing also offers a level of computing power far beyond anything we’ve seen. As such, it could benefit everything from cryptography to artificial intelligence, and be used to design new medicines or simulate materials.
“Studying these systems is not just hard but also counter-intuitive,” said Pokharel. “Atoms do things that don’t make sense to us. The idea is to use this weirdness to build a better computer.” Seth Levenson, adaptive technology lead at USC’s Entertainment Technology Center, moderated a panel that also included Jeffrey J. Welser, vice president and lab director at IBM Research, and Yves Bergquist, Corto chief executive and head of ETC’s AI and neuroscience project.
Pokharel explained that qubits (quantum bits) are the fundamental building blocks of quantum computers. “A qubit can be both 0 and 1 at the same time — it’s a superposition of both,” he said. “With quantum particles, two or more contradicting properties can simultaneously co-exist — but only until you measure them.”
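To make superposition concrete, here is a minimal Python/NumPy sketch (our illustration, not something shown on the panel): a qubit is modeled as a two-component state vector, a Hadamard gate puts it into an equal superposition, and measurement yields 0 or 1 with the squared amplitudes as probabilities.

    import numpy as np

    # Computational basis state |0>
    zero = np.array([1, 0], dtype=complex)

    # A Hadamard gate turns |0> into an equal superposition of |0> and |1>
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    qubit = H @ zero                     # (|0> + |1>) / sqrt(2)

    # Measurement collapses the superposition; outcome probabilities
    # are the squared magnitudes of the amplitudes
    probs = np.abs(qubit) ** 2           # [0.5, 0.5]
    print(np.random.choice([0, 1], size=10, p=probs))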
The other important property is entanglement, which occurs “when two or more qubits have their fates linked … they behave in a correlated way even when separated by large distances,” a feature that adds computational power. The collective power of qubits, said Pokharel, is “much bigger than the whole.” “Even a single qubit will make the computer more powerful than adding a bit to your phone,” he said. “You always get an exponential increase in power because of entanglement.”
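Pokharel’s point about exponential scaling can be illustrated with the same kind of sketch (again ours, not the panel’s): the state of n qubits is described by 2^n complex amplitudes, and an entangled pair such as a Bell state cannot be split into two independent one-qubit descriptions.

    import numpy as np

    zero = np.array([1, 0], dtype=complex)
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)

    # Entangle two qubits into the Bell state (|00> + |11>) / sqrt(2):
    # measuring either qubit instantly fixes the outcome of the other
    bell = CNOT @ np.kron(H @ zero, zero)
    print(np.abs(bell) ** 2)             # [0.5, 0, 0, 0.5]

    # A classical simulator must track 2**n amplitudes for n qubits
    for n in (10, 20, 30, 40):
        print(f"{n} qubits -> {2 ** n:,} amplitudes")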
It gets even more complicated, which is why quantum studies are the purview of physicists, but the potential for uses in the M&E industry is clear. IBM’s Welser reported that, with the IBM Q quantum computing system, his company is involved with broadcast encryption. “Quantum computing is especially good at factoring integers,” he noted. “We’re about five years away from using quantum computing for machine learning.”
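Why factoring matters for encryption: widely deployed public-key schemes such as RSA rely on the difficulty of factoring large integers, which is the problem Shor’s quantum algorithm targets. A toy, deliberately insecure Python sketch (our example, not IBM’s system) shows that anyone who can factor the public modulus can recover the private key:

    # Toy RSA with tiny primes -- insecure, for illustration only (Python 3.8+)
    p, q = 61, 53                        # the secret factors
    n = p * q                            # public modulus
    e = 17                               # public exponent
    d = pow(e, -1, (p - 1) * (q - 1))    # private key: easy once p and q are known

    message = 42
    ciphertext = pow(message, e, n)      # encrypt with the public key
    print(pow(ciphertext, d, n))         # decrypting recovers 42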
Currently there is a race to achieve quantum advantage. “Billions of dollars are being invested in this,” said Welser.
Bergquist posed the question of whether quantum computing can accelerate the development of artificial intelligence. “It’s a probabilistic machine for a probabilistic world,” he said. “It’s a statistician’s dream. It can do the kind of work DeepMind was doing, but more efficiently, so quantum computing could help us a lot.”
He pointed to AIXI, a universal reinforcement-learning agent created by theoretical mathematician Marcus Hutter. “AIXI’s deep reinforcement learning is behind Go, chess, StarCraft, all of which were considered impossible AI problems to solve.” It took “a gargantuan amount of computing” to create, said Bergquist, suggesting that, with quantum computing, more such impossible problems will be solved.