Quantum computing is making us change the way we think about computers. Quantum bits, a.k.a. qubits, can make it possible to solve problems that would otherwise be intractable with current computing technology.
Dancing with Qubits is a quantum computing textbook that starts with an overview of why quantum computing is so different from classical computing and describes several industry use cases where it can have a major impact. From there it moves on to a fuller description of classical computing and the mathematical underpinnings necessary to understand such concepts as superposition, entanglement, and interference. Next up are circuits and algorithms, both basic and more sophisticated. It then moves on to a survey of the physics and engineering ideas behind how quantum computing hardware is built. Finally, the book looks to the future and gives you guidance on understanding how further developments will affect you.
Really understanding quantum computing requires a lot of math, and this book doesn't shy away from the necessary math concepts you'll need. Each topic is introduced and explained thoroughly, in clear English with helpful examples.
You can read the e-book in Legimi apps or in any app that supports the following format:
Page count: 639
Publication year: 2019
BIRMINGHAM – MUMBAI
Copyright © 2019 Packt Publishing
All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.
Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the author, nor Packt Publishing or its dealers and distributors, will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.
Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.
Producer: Andrew Waldron
Development Editor: Ian Hough
Peer Review Acquisitions: Suresh Jain
Project Editor: Tom Jacob
Proofreading: Safis Editing
Cover Designer: Sandip Tadge
First published: November 2019
Production reference: 1251119
Published by Packt Publishing Ltd. Livery Place 35 Livery Street Birmingham B3 2PB, UK.
ISBN 978-1-83882-736-6 (Paperback edition) ISBN 978-1-83882-525-6 (eBook edition)
www.packt.com
Subscribe to our online digital library for full access to over 7,000 books and videos, as well as industry leading tools to help you plan your personal development and advance your career. For more information, please visit our website.
Why subscribe?
Spend less time learning and more time coding with practical eBooks and Videos from over 4,000 industry professionals
Learn better with Skill Plans built especially for you
Get a free eBook or video every month
Fully searchable for easy access to vital information
Copy and paste, print, and bookmark content
Did you know that Packt offers eBook versions of every book published, with PDF and ePub files available? You can upgrade to the eBook version at www.Packt.com and as a print book customer, you are entitled to a discount on the eBook copy. Get in touch with us at [email protected] for more details.
At www.Packt.com, you can also read a collection of free technical articles, sign up for a range of free newsletters, and receive exclusive discounts and offers on Packt books and eBooks.
To Judith, Katie, and William, to whom my debt is beyond computation.
Robert S. Sutor has been a technical leader and executive in the IT industry for over 30 years. More than two decades of that have been spent in IBM Research in New York. During his time there, he worked on or led efforts in symbolic mathematical computation, optimization, AI, blockchain, and quantum computing. He is the co-author of several research papers and the book Axiom: The Scientific Computation System with the late Richard D. Jenks.
He also was an executive on the software side of the business in areas including emerging industry standards, software on Linux, mobile, and open source. He’s a theoretical mathematician by training, has a Ph.D. from Princeton University, and an undergraduate degree from Harvard College. He started coding when he was 15 and has used most of the programming languages that have come along.
I want to thank:
My wife, Judith Hunter, and children, Katie and William, for their love and humor while this book was being written,
John Kelly, Arvind Krishna, Dario Gil, Jay Gambetta, Jamie Thomas, Tom Rosamilia, and Ken Keverian for their leadership of the IBM Q program and their personal support,
the following for their conversations, insight, and inspiration regarding the breadth of quantum computing science, technology, business, and ecosystem:
Abe Asfaw, Alexis Harrison, Ali Javadi, Amanda Carl, Andrew Cross, Anthony Annunziata, Antonio Corcoles-Gonzalez, Antonio Mezzacapo, Aparna Prabhakar, Bill Minor, Brian Eccles, Carmen Recio Valcarce, Chris Lirakis, Chris Nay, Christine Ouyang, Christine Vu, Christopher Schnabel, Denise Ruffner, Doug McClure, Edwin Pednault, Elena Yndurain, Eric Winston, Frederik Flöther, Hanhee Paik, Heather Higgins, Heike Riel, Ingolf Wittmann, Ismael Faro, James Wootten, Jeanette Garcia, Jenn Glick, Jerry Chow, Joanna Brewer, John Gunnels, Jules Murphy, Katie Pizzolato, Lev Bishop, Liz Durst, Luuk Ament, Maika Takita, Marco Pistoia, Mark Ritter, Markus Brink, Matthias Steffen, Melissa Turesky, Michael Gordon, Michael Osborne, Mike Houston, Pat Gumann, Paul Kassebaum, Paul Nation, Rajeev Malik, Robert Loredo, Robert Wisnieff, Sarah Sheldon, Scott Crowder, Stefan Woerner, Steven Tomasco, Suzie Kirschner, Talia Gershon, Vanessa Johnson, Vineeta Durani, Wendy Allan, Wendy Cornell, and Zaira Nazario
the many authors whose works I reference throughout the book, and
the Packt production and editorial team including Andrew Waldron, Tom Jacob, and Ian Hough.
Any errors or misunderstandings that appear in this book are mine alone.
Jhonathan Romero is a quantum computing scientist and entrepreneur. Born in Barranquilla, Colombia, he received a Ph.D. in Chemical Physics from Harvard University, after earning B.S. and M.S. degrees in Chemistry from the National University of Colombia. His research has focused on the development of algorithms for quantum simulation and artificial intelligence on near-term quantum devices. Jhonathan is one of the co-founders and research scientists at Zapata Computing, a company pioneering the development of quantum algorithms and software for commercial applications. He has authored several publications in computational chemistry and quantum computing.
Everything we call real is made of things that cannot be regarded as real.
Niels Bohr [1]
When most people think about computers, they think about laptops or maybe even the bigger machines like the servers that power the web, the Internet, and the cloud. If you look around, though, you may start seeing computers in other places. Modern cars, for example, have anywhere from around 20 computers to more than 100 to control all the systems that allow you to move, brake, monitor the air conditioning, and control the entertainment system.
The smartphone is the computer many people use more than anything else in a typical day. A modern phone has a 64-bit processor in it, whatever a ‘‘64-bit processor’’ is. The amount of memory used for running all those apps might be 3 GB, which means 3 gigabytes. What’s a ‘‘giga’’ and what is a byte?
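Since this book later uses Python 3 for its examples, here is a quick, optional illustration of those quantities; the variable names are mine, not from the text:

```python
# A 64-bit processor works with values that fit in 64 binary digits (bits),
# so it can distinguish this many patterns:
distinct_values = 2 ** 64

# A byte is 8 bits; "giga" here means 2**30, so 3 gigabytes of memory is:
bytes_in_3_gb = 3 * 2 ** 30

print(distinct_values)  # 18446744073709551616
print(bytes_in_3_gb)    # 3221225472
```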
All these computers are called classical computers and the original ideas for them go back to the 1940s. Sounding more scientific, we say these computers have a von Neumann architecture, named after the mathematician and physicist John von Neumann.
It’s not the 1940s anymore, obviously, but more than seventy years later we still have the modern versions of these machines in so many parts of our lives. Through the years, the ‘‘thinking’’ components, the processors, have gotten faster and faster. The amount of memory has also gotten larger so we can run more—and bigger—apps that do some pretty sophisticated things. The improvements in graphics processors have given us better and better games. The amount of storage has skyrocketed in the last couple of decades, so we can have more and more apps and games and photos and videos on devices we carry around with us. When it comes to these classical computers and the way they have developed, ‘‘more is better.’’
We can say similar things about the computer servers that run businesses and the Internet around the world. Do you store your photos in the cloud? Where is that exactly? How many photos can you keep there and how much does it cost? How quickly can your photos and all the other data you need move back and forth to that nebulous place?
It’s remarkable, all this computer power. It seems like every generation of computers will continue to get faster and faster and be able to do more and more for us. There’s no end in sight for how powerful these small and large machines will get to entertain us, connect us to our friends and family, and solve the important problems in the world.
Except … that’s false.
While there will continue to be some improvements, we will not see anything like the doubling in processor power every two years that happened starting in the mid-1960s. This doubling went by the name of Moore’s Law and went something like ‘‘every two years processors will get twice as fast, half as large, and use half as much energy.’’
These proportions like ‘‘double’’ and ‘‘half’’ are approximate, but physicists and engineers really did make extraordinary progress for many years. That’s why you can have a computer in a watch on your wrist that is more powerful than a system that took up an entire room forty years ago.
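To see how dramatic that compounding is, here is a short Python 3 sketch (my own illustration, not from the text) of doubling every two years across four decades:

```python
# Moore's Law, roughly: processor power doubles every two years.
years = 40
doublings = years // 2       # 20 doublings in 40 years
speedup = 2 ** doublings     # relative performance factor

print(speedup)  # 1048576, i.e. roughly a million-fold improvement
```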
A key problem is the part where I said processors will get half as large. We can’t keep making transistors and circuits smaller and smaller indefinitely: the components will get so small that we approach the atomic level. The electronics will become so crowded that when we try to tell one part of a processor to do something, a nearby component will also be affected.
There’s another deeper and more fundamental question. Just because we created an architecture over seventy years ago and have vastly improved it, does that mean all kinds of problems can eventually be successfully tackled by computers using that design? Put another way, why do we think the kinds of computers we have now might eventually be suitable for solving every possible problem? Will ‘‘more is better’’ run out of steam if we keep to the same kind of computer technology? Is there something wrong or limiting about our way of computing that will prevent our making the progress we need or desire?
Depending on the kind of problem you are considering, it’s reasonable to think the answer to the last question is somewhere between ‘‘probably’’ and ‘‘yes.’’
That’s depressing. Well, it’s only depressing if we can’t come up with one or more new types of computers that have a chance of breaking through the limitations.
That’s what this book is about. Quantum computing as an idea goes back to at least the early 1980s. It uses the principles of quantum mechanics to provide an entirely new kind of computer architecture. Quantum mechanics in turn goes back to around 1900 but especially to the 1920s when physicists started noticing that experimental results were not matching what theories predicted.
However, this is not a book about quantum mechanics. Since 2016, tens of thousands of users have been able to use quantum computing hardware via the cloud, what we call quantum cloud services. People have started programming these new computers even though the way you do it is unlike anything done on a classical computer.
Why have so many people been drawn to quantum computing? I’m sure part of it is curiosity. There’s also the science fiction angle: the word ‘‘quantum’’ gets tossed around enough in sci-fi movies that viewers wonder if there is any substance to the idea.
Once we get past the idea that quantum computing is new and intriguing, it’s good to ask ‘‘ok, but what is it really good for?’’ and ‘‘when and how will it make a difference in my life?’’ I discuss the use cases experts think are most tractable over the next few years and decades.
It’s time to learn about quantum computing. It’s time to stop thinking classically and to start thinking quantumly, though I’m pretty sure that’s not really a word!
This book is for anyone who has a very healthy interest in mathematics and wants to start learning about the physics, computer science, and engineering of quantum computing. I review the basic math, but things move quickly so we can dive deeply into an exposition of how to work with qubits and quantum algorithms.
While this book contains a lot of math, it is not of the definition-theorem-proof variety. I’m more interested in presenting the topics to give you insight on the relationships between the ideas than I am in giving you a strictly formal development of all results.
Another goal of mine is to prepare you to read much more advanced texts and articles on the subject, perhaps returning here to understand some core topic. You do not need to be a physicist to read this book, nor do you need to understand quantum mechanics beforehand.
At several places in the book I give some code examples using Python 3. Consider these to be extra and not required, but if you do know Python they may help in your understanding.
Many of the examples in this book come from the IBM Q quantum computing system. I was an IBM Q executive team member during the time I developed this content.
Before we jump into understanding how quantum computing works from the ground up, we need to take a little time to see how things are done classically. In fact, this is not only for the sake of comparison. The future, I believe, will be a hybrid of classical and quantum computers.
The best way to learn about something is to start with basic principles and then work your way up. That way you know how to reason about it and don’t rely on rote memorization or faulty analogies.
1 – Why Quantum Computing?
In the first chapter we ask the most basic question that applies to this book: why quantum computing? Why do we care? In what ways will our lives change? What are the use cases to which we hope to apply quantum computing and see a significant improvement? What do we even mean by ‘‘significant improvement’’?
The first full part covers the mathematics you need to understand the concepts of quantum computing. While we will ultimately be operating in very large dimensions and using complex numbers, there’s a lot of insight you can gain from what happens in traditional 2D and 3D.
2 – They’re Not Old, They’re Classics
Classical computers are pervasive but relatively few people know what’s inside them and how they work. To contrast them later with quantum computers, we look at the basics along with the reasons why they have problems doing some kinds of calculations. I introduce the simple notion of a bit, a single 0 or 1, but show that working with many bits can eventually give you all the software you use today.
3 – More Numbers than You Can Imagine
The numbers people use every day are called real numbers. Included in these are integers, rational numbers, and irrational numbers. There are other kinds of numbers, though, and structures that have many of the same algebraic properties. We look at these to lay the groundwork to understand the ‘‘compute’’ part of what a quantum computer does.
4 – Planes and Circles and Spheres, Oh My
From algebra we move to geometry and relate the two. What is a circle, really, and what does it have in common with a sphere when we move from two to three dimensions? Trigonometry becomes more obvious, though that is not a legally binding statement. What you thought of as a plane becomes the basis for understanding complex numbers, which are key to the definition of quantum bits, usually known as qubits.
5 – Dimensions
After laying the algebraic and geometric groundwork, we move beyond the familiar two- and three-dimensional world. Vector spaces generalize to many dimensions and are essential for understanding the exponential power that quantum computers can harness. What can you do when you are working in many dimensions and how should you think about such operations? This extra elbow room comes into play when we consider how quantum computing might augment AI.
6 – What Do You Mean “Probably”?
‘‘God does not play dice with the universe,’’ said Albert Einstein.
This was not a religious statement but rather an expression of his lack of comfort with the idea that randomness and probability play a role in how nature operates. Well, he didn’t get that quite right. Quantum mechanics, the deep and often mysterious part of physics on which quantum computing is based, very much has probability at its core. Therefore, we cover the fundamentals of probability to aid your understanding of quantum processes and behavior.
The next part is the core of how quantum computing really works. We look at quantum bits—qubits—singly and together, and then create circuits that implement algorithms. Much of this is the ideal case when we have perfect fault-tolerant qubits. When we really create quantum computers, we must deal with the physical realities of noise and the need to reduce errors.
7 – One Qubit
At this point we are finally able to talk about qubits in a nontrivial manner. We look at both the vector and Bloch sphere representations of the quantum states of qubits. We define superposition, which explains the common cliché about a qubit being ‘‘zero and one at the same time.’’
8 – Two Qubits, Three
With two qubits we need more math, and so we introduce the notion of the tensor product, which allows us to explain entanglement. Entanglement, which Einstein called ‘‘spooky action at a distance,’’ tightly correlates two qubits so that they no longer act independently. With superposition, entanglement gives rise to the very large spaces in which quantum computations can operate.
9 – Wiring Up the Circuits
Given a set of qubits, how do you manipulate them to solve problems or perform calculations? The answer is you build circuits for them out of gates that correspond to reversible operations. For now, think about the classical term ‘‘circuit board.’’ I use the quantum analog of circuits to implement algorithms, the recipes computers use for accomplishing tasks.
10 – From Circuits to Algorithms
With several simple algorithms discussed and understood, we next turn to more complicated ones that fit together to give us Peter Shor’s 1994 fast integer factoring algorithm. The math is more extensive in this chapter, but we have everything we need from previous discussions.
11 – Getting Physical
When you build a physical qubit, it doesn’t behave exactly like the math and textbooks say it should. There are errors, and they may come from noise in the environment of the quantum system. I don’t mean someone yelling or playing loud music, I mean fluctuating temperatures, radiation, vibration, and so on. We look at several factors you must consider when you build a quantum computer, introduce Quantum Volume as a whole-system metric of the performance of your system, and conclude with a discussion of the most famous quantum feline.
This book concludes with a chapter that moves beyond today.
12 – Questions about the Future
If I were to say, ‘‘in ten years I think quantum computing will be able to do …,’’ I would also need to describe the three or four major scientific breakthroughs that need to happen before then. I break down the different areas in which we’re trying to innovate in the science and engineering of quantum computing and explain why. I also give you some guiding principles to distinguish hype from reality. All this is expressed in terms of motivating questions.
Karen Barad. Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. 2nd ed. Duke University Press Books, 2007.
When I want to highlight something important that you should especially remember, I use this kind of box:
This is very important.
This book does not have exercises but it does have questions. Some are answered in the text and others are left for you as thought experiments. Try to work them out as you go along. They are numbered within chapters.
Question 0.0.1
Why do you ask so many questions?
Code samples and output are presented to give you an idea about how to use a modern programming language, Python 3, to experiment with basic ideas in quantum computing.
def obligatoryFunction():
    print("Hello quantum world!")

obligatoryFunction()

Hello quantum world!

Numbers in brackets (for example, [1]) are references to additional reading materials. They are listed at the end of each chapter in which the bracketed number appears.
To learn more
Here is a place where you might see a reference to learn more about some topic. [1]
Feedback from our readers is always welcome.
General feedback: If you have questions about any aspect of this book, mention the book title in the subject of your message and email us at [email protected].
Errata: Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you have found a mistake in this book we would be grateful if you would report this to us. Please visit http://www.packt.com/submit-errata, select your book, click on the Errata Submission Form link, and enter the details.
Piracy: If you come across any illegal copies of our works in any form on the Internet, we would be grateful if you would provide us with the location address or website name. Please contact us at [email protected] with a link to the material.
If you are interested in becoming an author: If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, please visit http://authors.packtpub.com.
Now let’s get started by seeing why we should look at quantum computing systems to try to solve problems that are intractable with classical systems.