Fintech veteran and venture capitalist, Arunkumar Krishnakumar, cuts through the hype to bring us a first-hand look into how quantum computing and Blockchain together could redefine industries and life as we know it.
Key Features
Book Description
Are quantum computing and Blockchain on a collision course or will they be the most important trends of this decade to disrupt industries and life as we know it?
Fintech veteran and venture capitalist Arunkumar Krishnakumar cuts through the hype to bring us a first-hand look into how quantum computing and Blockchain together are redefining industries, including fintech, healthcare, and research. Through a series of interviews with domain experts, he also explores these technologies' potential to transform national and global governance and policies – from how elections are conducted and how smart cities can be designed and optimized for the environment, to what cyberwarfare enabled by quantum cryptography might look like. In doing so, he also highlights challenges that these technologies have to overcome to go mainstream.
Quantum Computing and Blockchain in Business explores the potential changes that quantum computing and Blockchain might bring about in the real world. After expanding on the key concepts and techniques, such as applied cryptography, qubits, and digital annealing, that underpin quantum computing and Blockchain, the book dives into how major industries will be impacted by these technologies. Lastly, we consider how the two technologies may come together in a complementary way.
What you will learn
Who this book is for
This book is for tech enthusiasts – developers, architects, managers, consultants, and venture capitalists – working in or interested in the latest developments in quantum computing and blockchain. While the book introduces key ideas, terms, and techniques used in these technologies, the main goal of this book is to prime readers for the practical adoption and applications of these technologies across various industries and walks of life.
You can read this e-book in Legimi apps or in any app that supports the following format:
Page count: 557
Year of publication: 2020
Quantum Computing and Blockchain in Business
Exploring the applications, challenges, and collision of quantum computing and blockchain
Arunkumar Krishnakumar
BIRMINGHAM - MUMBAI
Quantum Computing and Blockchain in Business
Copyright © 2020 Packt Publishing
All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.
Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the author, nor Packt Publishing or its dealers and distributors, will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.
Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.
Producers: Andrew Waldron, Jonathan Malysiak
Acquisition Editor – Peer Reviews: Suresh Jain
Project Editor: Tom Jacob
Content Development Editor: Dr. Ian Hough
Technical Editor: Karan Sonawane
Copy Editor: Safis Editing
Proofreader: Safis Editing
Indexer: Tejal Soni
Presentation Designer: Pranit Padwal
First published: March 2020
Production reference: 1270320
Published by Packt Publishing Ltd.
Livery Place
35 Livery Street
Birmingham B3 2PB, UK.
ISBN: 978-1-83864-776-6
www.packt.com
I would like to dedicate this book to the four wonderful ladies that rule my life.
My mother, who inspires and supports me unconditionally. My two little princesses, Dhwani and Diya. They have been really patient every time I got into my little den to write this book.
Most of all, my dear wife Sumi, who has been the bedrock of my life and career!
packt.com
Subscribe to our online digital library for full access to over 7,000 books and videos, as well as industry leading tools to help you plan your personal development and advance your career. For more information, please visit our website.
Did you know that Packt offers eBook versions of every book published, with PDF and ePub files available? You can upgrade to the eBook version at www.Packt.com and as a print book customer, you are entitled to a discount on the eBook copy. Get in touch with us at [email protected] for more details.
At www.Packt.com, you can also read a collection of free technical articles, sign up for a range of free newsletters, and receive exclusive discounts and offers on Packt books and eBooks.
Arunkumar Krishnakumar is an Investor at Green Shores Capital, a Venture Capital firm focusing on Deep Tech and Diversity. Arun and his team have made over 18 investments over the last four years. Arun sits on the board of four of his portfolio firms, all of which work in AI, Data Science, or Blockchain. Arun is also on the board of trustees of Aram Foundation, an NGO in India focused on water conservation and environmental initiatives.
Apart from his venture career, Arun is one of the top 100 Onalytica Fintech influencers. He is a podcast host and was a blogger on DailyFintech, which is the second most read Fintech blog in the world. Arun has contributed circa 150 blog posts over a period of 3 years at DailyFintech.
Prior to his career as a venture capital investor, Arun was working within capital markets data and technology at Barclays. He was also part of the leadership team within PwC's data analytics practice and was focused on banking and capital market clients.
I would like to begin my acknowledgement with my beginning, my parents. I wouldn't be where I am in life and career if not for them.
I would like to thank Packt and their amazing team for giving me this opportunity. It was a great experience working on this book. I would like to thank every single expert whom I have interviewed for this book. Their insights have been invaluable in getting the book to what it is.
Most of all, I would like to thank all innovators. You keep me excited, yet grounded. You help me see the future, yet keep me mindful of present challenges. Most of all, you give me hope!
David F. Beach is an American Technology Evangelist with over 15 years of experience in large-scale technology transformations for global enterprises and governments. With an undergraduate degree in Engineering, he finished his PhD in Physics with a focus on Nano-Optics at the University of Basel, followed by a postdoctoral appointment at the University of California, Los Angeles (UCLA). He has since been dedicated to helping global enterprises and governments through their technology transformation journeys, including IT, cloud, quantum computing, Data Science, security, and most recently artificial intelligence. He is also an advocate of other topics such as history and psychology.
In 2012 he launched QuantumBits.IT as a consulting platform on quantum computing, and in 2015 CentCom.Cloud delivering a single platform for a variety of tools used in global technology infrastructure. He is currently with Accenture Technology and previously worked at Oracle.
I would like to express my sincere gratitude to my mother, my true supporter at various stages of my life. Further thanks go to former teachers and colleagues whose ideas and feedback have helped shape my ideals and goals.
Preface
Who this book is for
What this book covers
Download the color images
Conventions used
Get in touch
Reviews
Introduction to Quantum Computing and Blockchain
What this book does
An introduction to quantum computing
The history of quantum mechanics
Einstein's quantum troubles
Bell's inequality
Quantum computers – a fancy idea
Déjà vu
The weirdness of quantum
A scary experiment
Einstein's photons – weirder now
Inside a quantum computer
Qubit types and properties
Blockchain and cryptography
Hashing
The bitcoin hash
Mining a bitcoin
A block
Proof of work
Transactions
Utility versus security token
Conclusion
References
Quantum Computing – Key Discussion Points
Superposition
An exponential challenge
The five coins puzzle
Entanglement – spooky action at a distance
Bloch sphere
Shor's algorithm
Grover's algorithm
Quantum annealing
Quantum tunneling
The traveling salesman
Decoherence
Quantum Error Correction
Conclusion
References
The Data Economy
The internet
The ARPANET
TCP/IP
The boom, the bust, and the boom
Social media
Big data
Structured data processing
Unstructured data processing
Big data architecture
The cloud
Artificial intelligence
Origins of AI
The imitation game
Avatars of AI
Blockchain
Decentralization
Immutability
Traceability
Quantum computing
Conclusion
References
The Impact on Financial Services
Quantum computing applications
Market risk
Credit risk
Technology limitations
Quantum-Inspired Digital Annealing
Quantum annealing
Dynamic portfolio selection
ATM replenishment
Blockchain applications
Anti-money-laundering (AML) and Know Your Customer (KYC)
Trade finance
Remittance
A SWIFT experiment
Retail remittance
Central Bank-backed digital currency
Security tokens
Conclusion
Interview with Dr. Dave Snelling, Fujitsu Fellow
Conclusion
References
The Impact on Healthcare and Pharma
The rally for innovation
The AI doctor
Last mile healthcare
Cancer diagnosis and treatment
Drug discovery
Blockchain in healthcare
Insurance
Pharmaceutical supply chains
Healthcare data exchanges
Research governance
Conclusion
Interview with Dr. B. Rajathilagam, Head of AI Research, Amrita Vishwa Vidyapeetham
Conclusion
The Impact on Governance
Social media in politics
Election modeling
Quantum machine learning
Boltzmann machine
QxBranch election model
Primary experiment
Blockchain, governance, and elections
Governance models
On-chain governance
Benevolent dictator for life
Core development team
Open governance
Smart Dubai
e-Estonia
Estonia and e-Residency
The Vienna token
United Nations and refugees
Conclusion
Interview with Max Henderson, Senior Data Scientist, Rigetti and QxBranch
Conclusion
The Impact on Smart Cities and Environment
So why do I care?
Smart cities
Smart parking
Traffic management
City planning
Waste collection
Climate modeling
Biogeography
Atmospheric circulation
Ocean currents
Earth's tilt
Quantum computing solutions
Conclusion
Interview with Sam McArdle, Quantum Computing Researcher at the University of Oxford
Conclusion
The Impact on Chemistry
Nitrogen fixation
Carbon capture
NISQ and chemistry
Blockchain in chemistry
Fostering an industry ecosystem
Disaster recovery
Conclusion
The Impact on Logistics
Traffic management systems
The Airbus quantum computing challenge
Quantum networks
Data security
Blockchain in logistics
Tracking goods
The food supply chain
Sustainable supply chains
Transport
Vehicle manufacturing
Conclusion
Interview with Dinesh Nagarajan, Partner, IBM
Conclusion
Quantum-Safe Blockchain
A race against time
The RSA algorithm
ECC
Does quantum computing mean chaos?
Quantum-Safe Cryptography
Lattice-Based Cryptography
Code-Based Cryptography
Blockchain Cryptography
Security in a post-quantum world
eXtended Merkle Signature Scheme (XMSS)
Blockchain Post Quantum Signatures (BPQS)
Winternitz One-Time Signatures (W-OTS)
Conclusion
Nation States and Cyberwars
When growth can be dangerous
Fertile grounds
Age of the machine
The cyber arms race
United States
China leading the world
Europe and the United Kingdom
The hype surrounding quantum networks
Conclusion
Conclusion – Blue Skies
Is there hype?
Is there merit?
Healthcare
Financial services
Logistics
Climate
Elections
A bumpy road ahead
The collision course
Nation states and ecosystems
The sky is blue
Other Books You May Enjoy
Index
The book offers a glimpse of the future in the sense of how different industries could benefit from the new wave of computing. The book takes readers on a journey, starting by describing the limitations of today's computers, and laying out the possible solutions presented by quantum computing to go beyond those limits.
This book is for professionals who would like to understand the potential applications of quantum computing in industrial scenarios. Readers who are looking for the "so what" of the technology, rather than get too deep into the technology itself, will find it an interesting read.
The book is also meant to address innovators across the world who are worried or curious about the potential collision course between today's cryptography algorithms and quantum computing. The book brings views from various industry experts in the fields of quantum computing, cryptography, and machine learning.
Readers will get a practical, realistic, and on-the-ground view of the potential of quantum computing. The book is not meant for readers who are looking for a primer or textbook on the math or physics behind quantum computers, but rather those who wish to take a more business-focused perspective on the topic.
Chapter 1: Introduction to Quantum Computing and Blockchain
This is the first chapter of this book, where I touch upon the technological concepts that quantum computing and Blockchain are based on. This chapter also provides a brief history of the field of quantum physics over the last century, and how this led to the evolution of quantum computing.
Chapter 2: Quantum Computing – Key Discussion Points
This chapter discusses quantum computing terminology and key concepts in greater detail. The purpose of this chapter is to provide readers with an overview of the terms they will see throughout the rest of this book.
Chapter 3: The Data Economy
We live in a data era. Over the last 30 years, the internet and the social media boom have helped create a lot of data. This has in turn helped fuel technologies like AI that are reliant on such data. We look at the role that data technologies like AI, Blockchain, IoT, and quantum computing could play in our everyday lives.
Chapter 4: The Impact on Financial Services
Financial services is a data-intensive industry. Most financial decisions around portfolio management and risk management are heavily reliant on simulations used by financial institutions. This chapter covers the potential applications of quantum computing on financial services.
Chapter 5: Interview with Dr. Dave Snelling, Fujitsu Fellow
An interview with Dave Snelling, the Program Director of AI at Fujitsu and one of the key stakeholders behind their Digital Annealer. Dave brings to life the applications of digital annealers that he is already working on, and the potential future he sees for quantum computers.
Chapter 6: The Impact on Healthcare and Pharma
This chapter brings to life applications of quantum computing in the fields of healthcare and pharmaceuticals. Drug discovery is one of the key areas where quantum computing could make a big difference. There are applications of Blockchain in the pharma industry that I discuss in this chapter too.
Chapter 7: Interview with Dr. B. Rajathilagam, Head of AI Research, Amrita Vishwa Vidyapeetham
The interview with Dr. B. Rajathilagam (BRT) gives us a view of the use cases for quantum computing and Blockchain in the emerging markets. BRT brings to the fore her firsthand knowledge of the on-the-ground challenges for villages in India.
Chapter 8: The Impact on Governance
Elections are increasingly becoming testing grounds for AI and data science technologists. We could be heading toward a future where election results can be accurately modeled using technologies such as quantum computing. This chapter explores the possibilities of the technology in future elections.
Chapter 9: Interview with Max Henderson, Senior Data Scientist, Rigetti and QxBranch
Max and his team at QxBranch modeled the 2016 American elections using quantum computing, and achieved a high degree of precision with their predictions. This interview covers Max's experience in doing so, his observations, and lessons learned from the process.
Chapter 10: The Impact on Smart Cities and Environment
We live in a world that is being affected by climate change and is in a climate emergency. This chapter looks at ways we could create smart cities using technology that can not only make our lives better, but can also help us to live more sustainably.
Chapter 11: Interview with Sam McArdle, Quantum Computing Researcher at the University of Oxford
One of the biggest challenges with quantum computers is the limited ability we have in correcting errors. In this chapter, I discuss these challenges with Sam McArdle, who is researching NISQ at Oxford University. Sam explains why error correction in quantum computers is such a hard task. He also touches upon the possibilities of using NISQ in fields like Chemistry.
Chapter 12: The Impact on Chemistry
Quantum computing is based on the principles of quantum mechanics that describes the behavior of sub-atomic particles. This is precisely why Chemistry is a very fitting application for quantum computers. In this chapter I discuss how quantum computers can be used to model the interactions between molecules during a chemical reaction.
Chapter 13: The Impact on Logistics
Logistics problems have been challenging for classical computers to solve. In this chapter I discuss how quantum computing can help solve some of the real-world problems in logistics. I also discuss the work firms like Airbus are doing to improve the aerodynamics of their aircraft. On the Blockchain side, I explain the use of the technology within the supply chain and the efficiencies it can add.
Chapter 14: Interview with Dinesh Nagarajan, Partner, IBM
Dinesh is a cyber security expert and a partner at IBM. In this chapter we discuss how technology innovation has helped enrich people's lives across the world and the cyber risks that it has brought to the table. We also discuss how firms and nation states can be more agile from a cryptographic perspective to face the threat posed by the new era in computing.
Chapter 15: Quantum-Safe Blockchain
In this chapter we talk about the elephant in the room. Is Blockchain under threat in a post-quantum era? We go a step further to touch upon how data transmission on the internet could be at risk because of quantum computing. We also look at various cryptographic techniques that are quantum ready and how they can help us protect our data.
Chapter 16: Nation States and Cyberwars
What would a country do if it was equipped with a technology that can be used to take control of most encrypted data in the world? How close are we to getting to that reality? Is that even a possibility we should be prepared for? These are the questions I address in the chapter about nation states and their efforts to dominate the data technology world. Several billion dollars have been allocated for technologies such as AI, Blockchain, and quantum computing by countries that want to dominate this space.
Chapter 17: Conclusion – Blue Skies
A closing chapter summarizing the key takeaways from the book. I discuss what I learned from the interviews and my thoughts on the comments from the experts I spoke to. I also note the efforts of top nations across the world to gain dominance in this space, and how the hype around quantum computing could affect the direction of travel of the ecosystem. I conclude by providing my views on the possibilities of this amazing technology – quantum computing.
We also provide a PDF file that has color images of the screenshots/diagrams used in this book. You can download it here: https://static.packt-cdn.com/downloads/9781838647766_ColorImages.pdf
Bold: Indicates a new term, an important word, or words that you see on the screen, for example, in menus or dialog boxes. For example: "My professor Dr. B. Rajathilagam, who taught me about Database Management Systems (DBMSes) and Object Oriented Programming 20 years ago, is now leading AI and quantum machine learning research."
Warnings or important notes appear like this.
Feedback from our readers is always welcome.
General feedback: If you have questions about any aspect of this book, mention the book title in the subject of your message and email us at [email protected].
Errata: Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you have found a mistake in this book, we would be grateful if you would report it to us. Please visit http://www.packt.com/submit-errata, select your book, click on the Errata Submission Form link, and enter the details.
Piracy: If you come across any illegal copies of our works in any form on the Internet, we would be grateful if you would provide us with the location address or website name. Please contact us at [email protected] with a link to the material.
If you are interested in becoming an author: If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, please visit http://authors.packtpub.com.
Please leave a review. Once you have read and used this book, why not leave a review on the site that you purchased it from? Potential readers can then see and use your unbiased opinion to make purchase decisions, we at Packt can understand what you think about our products, and our authors can see your feedback on their book. Thank you!
For more information about Packt, please visit packt.com.
It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair.
I am sure Charles Dickens did not foresee quantum computing or Blockchain. His words from 160 years ago, however, still apply to the ebbs and flows we have seen with these two technologies. The quantum physics that quantum computing builds on has been around for a good part of a century. In contrast, Blockchain was first introduced to the world in 2008.
Unlike the Blockchain wave that has hit us in recent years, quantum principles have been around for several decades. Quantum physics has been a much-debated field and is fundamental to quantum computing. However, the field of quantum computing has only gained momentum in recent times.
Despite the differences in the age of the two technologies, they have had interesting histories. For instance, most people who understand Blockchain agree that the framework is robust. However, the technology is still far from perfect, and that is true for quantum computing too.
The momentum behind quantum computing in the past decade has been largely due to advancements in algorithms and infrastructure. However, in my opinion, it is also because of the data age we live in, in which some of the use cases for quantum computers are becoming clearer and more relevant. In this chapter, I will cover the history of both these technologies, which have had controversial pasts. Their place in modern society as transformational technologies is hard to dispute.
The purpose of this book is to explore the overlaps between quantum computing and Blockchain. The two technologies are fundamentally based on cryptography. As a result, there is a possibility that they are on a collision course. However, when we look at the real-world applications of these technologies, they are quite complementary to one another.
In this chapter, we will discuss technical concepts that are fundamental to quantum computing and Blockchain. We will delve into quantum computing and its history, and then touch upon some of the key concepts of Blockchain that are relevant to the thesis of the book.
One of the key themes that I would like to establish in this book is that Technology is just a means to an end. While it is important to understand it, and feel excited about the possibilities, a technology can only be special if it can make a difference to people's lives.
There is a lot of hype on social media that quantum computing would kill Blockchain. In a data age, both these technologies have a place. Quantum computing can vastly improve our problem-solving abilities. In a social media age, we will need our technologies to cope with big data volumes and understand the interdependencies between variables that we analyze. Quantum computing, when it goes mainstream, should address those areas.
On the other hand, a simple way to describe Blockchain's application is Decentralized Data Integrity. An immutable record of every transaction gets maintained and managed by the network. That is the fundamental advantage of Blockchain over data storage mechanisms we have used in the past.
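To make this idea concrete, decentralized data integrity can be sketched in a few lines of Python. This is my own minimal illustration of hash chaining, not the bitcoin protocol itself: each block stores the hash of the previous block, so altering any historical record invalidates every block that follows it.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder "previous hash" for the first block

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_chain(transactions):
    """Build a chain where each block commits to the previous block's hash."""
    chain, prev = [], GENESIS
    for tx in transactions:
        block = {"tx": tx, "prev_hash": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def is_valid(chain) -> bool:
    """Re-derive every hash; tampering anywhere breaks all later links."""
    prev = GENESIS
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = make_chain(["Alice->Bob: 5", "Bob->Carol: 2"])
assert is_valid(chain)
chain[0]["tx"] = "Alice->Bob: 500"   # tamper with history
assert not is_valid(chain)           # the chain no longer verifies
```

Real Blockchains layer a consensus mechanism such as proof of work on top of this structure, so that no single party decides which blocks get appended; the hash chaining alone is what makes the record tamper-evident.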
Through industry-specific chapters and interviews with thought leaders in quantum computing, AI, and machine learning, I will try to establish the business relevance of these two technologies. In doing so, I will show that these two technologies have vertical synergies in the data-centric world we live in.
In the next section, I will go through the history of quantum computing. In the process of doing that, I will also touch upon several key concepts of the technology.
We are living through a data era, with several technologies sharing symbiotic relationships with each other. Of all the exciting technology paradigms, quantum computing has the potential to create disruption at scale. The principles of quantum physics, which are the bedrock of quantum computing, have been around for over a century.
An understanding of the evolution of quantum physics is interesting because of the personalities involved and their contradicting philosophical views. However, the history of this field also gives us an insight into the counterintuitive nature of these concepts, which challenged even the brightest minds. This chapter focuses on the story of quantum computing, and touches upon some of the basic principles of this technology.
In a conversation between an investor and a professor in academia, the investor is often left thinking, "Wow, that is great, but so what?", and the academic is wondering, "Does the investor get it?". The exploration of quantum computing has been one such experience for me, where the nerd in me wanted to delve deep into the physics, math, and the technical aspects of the discipline. However, the investor in me kept on asking, "So what's of value? What's in it for the world? What's in it for businesses?".
As a result of this tug of war, I have come up with a simplified explanation of quantum principles that lays the foundations of quantum mechanics. For a better understanding of quantum computing, we need to first study the basics of quantum information processing with respect to the flow of (quantum) bits, and how they process data and interact with each other. Therefore, let us begin with the tenets of quantum physics as the basis of quantum information processing.
Quantum physics provides the foundational principles that explain the behavior of particles such as atoms, electrons, photons, and positrons. A microscopic particle is defined as a small piece of matter invisible to the naked human eye.
In the process of describing the history of quantum mechanics, I will touch upon several of its fundamental concepts. The discovery of these concepts, and the evolution in scientists' understanding of them, have helped shape more modern thinking around quantum computing. The relevance of these concepts to quantum computing will become clear as this chapter unfolds. However, at this stage the focus is on how this complex field has continued to perplex great minds for almost 100 years.
Quantum mechanics deals with nature at the smallest scales; exploring interactions between atoms and subatomic particles. Throughout a good part of the 19th century and the early part of the 20th century, scientists were trying to solve the puzzling behavior of particles, matter, light, and color. An electron revolves around the nucleus of an atom, and when it absorbs a photon (a particle of light), it jumps into a different energy level. Ultraviolet rays could provide enough energy to knock out electrons from an atom, producing positive electrical charge due to the removal of the negatively charged electron. Source: https://www.nobelprize.org/prizes/physics/1905/lenard/facts/
Scientists observed that an electron absorbing a photon was often limited to specific frequencies. An electron absorbing a specific type of photon resulted in colors associated with heated gases. This behavior was explained in 1913 by Danish scientist Niels Bohr. Further research in this field led to the emergence of the basic principles of quantum mechanics. Source: https://www.nobelprize.org/prizes/physics/1922/bohr/biographical/
Bohr postulated that electrons were only allowed to revolve in certain orbits, and the colors that they absorbed depended on the difference between the orbits they revolved in. For this discovery, he was awarded the Nobel prize in 1922. More importantly, this helped to cement the idea that the behavior of electrons and atoms was different from that of objects that are visible to the human eye (macroscopic objects). Unlike classical physics, which defined the behavior of macroscopic objects, quantum mechanics involved instantaneous transitions based on probabilistic rules rather than exact mechanistic laws.
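To see how the orbit differences translate into color, here is a small back-of-the-envelope calculation. It is my own illustration using the standard Bohr model of hydrogen, where the energy of level n is -13.6 eV / n², and the approximation hc ≈ 1240 eV·nm:

```python
# Bohr model of hydrogen: a jump between orbits absorbs or emits a photon
# whose energy is the difference between the two levels, fixing its color.
RYDBERG_EV = 13.6   # hydrogen ground-state binding energy, in eV
HC_EV_NM = 1240.0   # Planck constant x speed of light, approx., in eV*nm

def transition_wavelength_nm(n_low: int, n_high: int) -> float:
    """Wavelength of the photon for a jump between orbits n_high and n_low."""
    delta_e = RYDBERG_EV * (1 / n_low**2 - 1 / n_high**2)  # photon energy, eV
    return HC_EV_NM / delta_e                              # wavelength, nm

# The n=3 -> n=2 jump lands near 656 nm: the red Balmer-alpha line
# seen in heated hydrogen gas.
print(round(transition_wavelength_nm(2, 3), 1))  # prints 656.5
```

Because the allowed orbits are discrete, only these specific wavelengths appear, which is exactly why each heated gas shows its own characteristic set of colors.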
This formed the basis of further studies focused on the behavior and interaction of subatomic particles such as electrons. As research identified more differences between classical physics and quantum physics, it was broadly accepted that quantum principles could be used to define the idiosyncrasies of nature (for example: black holes). Two great minds, Albert Einstein and Stephen Hawking, have contributed to this field through their work on relativity and quantum gravity. Let us now look into how Albert Einstein viewed quantum physics and its concepts. Source: https://www.nobelprize.org/prizes/physics/1921/einstein/facts/
We may have to go back some years in history to understand how Einstein got entangled (pun intended) in the world of quantum mechanics. For a layman, space is just vast emptiness, yet when combined with time, space becomes a four-dimensional puzzle that has proven to be a tremendous challenge to the greatest minds of the 19th and 20th centuries. There were principles of quantum mechanics that Einstein did not agree with, and he was vocal about it.
One of the key principles of quantum mechanics was the Copenhagen interpretation. This holds that the state of a particle is influenced by the very act of observing it; the observer thus influences the state of the particle. Einstein did not agree with this indeterminate aspect of quantum mechanics that Niels Bohr postulated.
In 1927, Einstein began his debates with Bohr at the Solvay Conference in Brussels. He believed in an objective reality that existed independent of observation. As per the principles of quantum theory, the experimenters' choice of methods affected whether certain parameters had definite values or were fuzzy. Einstein couldn't accept that the moon was not there when no one looked at it, and felt that the principles of quantum theory were incomplete. Source: https://cp3.irmp.ucl.ac.be/~maltoni/PHY1222/mermin_moon.pdf
One interesting aspect of this indeterministic nature of objects is that as babies, we tend to appreciate these principles better. This is illustrated in the peek-a-boo game that babies often love. They behave as if an object exists only while they are observing it, not yet demonstrating the cognitive ability called object permanence. However, as we grow older, we base our actions on the assumption of object permanence.
Niels Bohr believed that it was meaningless to assign reality to the universe in the absence of observation. In the intervals between measurements, quantum systems existed as a fuzzy mixture of all possible properties – commonly known as superposition states. The mathematical function that described the states that particles took is called the wave function, which collapses to one state at the point of observation.
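As a toy illustration of superposition and collapse (my own sketch, not a full quantum simulation), a single two-state system can be described by two amplitudes, and measurement picks one outcome with probabilities given by the squared amplitudes (the Born rule):

```python
import random
from math import sqrt

def measure(a: complex, b: complex) -> int:
    """Collapse a state with amplitudes (a, b), where |a|^2 + |b|^2 = 1.

    Returns 0 with probability |a|^2 and 1 with probability |b|^2.
    """
    p0 = abs(a) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition: before measurement, the state is a fuzzy mixture
# of both possibilities; each measurement collapses it to a single outcome.
a = b = 1 / sqrt(2)
samples = [measure(a, b) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.5
```

A single run always yields a definite 0 or 1, never a mixture; only the statistics over many runs reveal the underlying amplitudes, which mirrors the role of the wave function described above.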
This philosophical battle between the two scientists, Einstein and Bohr, intensified in 1935 with the emergence of the property of entanglement. It meant that the states of two entangled particles were dependent on each other (or correlated), irrespective of how far apart they were. Einstein mockingly called it "spooky action at a distance."
In response to Bohr's findings, the famous EPR paper was written in 1935 by Albert Einstein, Boris Podolsky, and Nathan Rosen. The purpose of the paper was to argue that quantum mechanics fails to provide a complete description of physical reality. Podolsky was tasked with translating it into English, and Einstein was not happy with the translation. Moreover, Podolsky leaked an advance report of the EPR paper to the New York Times, and Einstein was so upset that he never spoke to Podolsky again. Source: https://www.aps.org/publications/apsnews/200511/history.cfm
The EPR paradox identified two possible explanations for the entanglement property. The state of one particle affecting another could be due to shared properties embedded within both particles, like a gene. Alternatively, the two particles could be communicating instantaneously with each other about their states. The second explanation was thought to be impossible, as it would violate the theory of special relativity (the particles would be communicating faster than the speed of light) and the principle of locality.
The principle of locality states that an object is influenced by only its immediate surroundings.
The theory of special relativity states that the laws of physics are the same for all non-accelerating observers, and Einstein showed that the speed of light within a vacuum is the same no matter the speed at which an observer travels.
If entanglement existed, and particles could influence each other's states at a great distance, then the principle of locality would also be breached. Hence, the EPR paper challenged the assumption that particles could communicate their states instantaneously and across great distances.
Instead, the EPR paper concluded that the two entangled particles had hidden variables embedded in them, which gave them the information needed to choose correlated states when observed. Albert Einstein continued to challenge the principles of quantum mechanics.
"Quantum mechanics is certainly imposing. But an inner voice tells me that it is not yet the real thing. The theory says a lot but does not really bring us any closer to the secret of the 'old one.' I, at any rate, am convinced that He does not throw dice."
–Albert Einstein
Einstein and Bohr could not come to an agreement, even in the presence of an arbitrator. That arbitrator came in the form of John Wheeler. In 1939, Bohr and Wheeler began working together at Princeton University and shared a good working relationship. Wheeler had a pleasant persona and could speak German. Einstein – then a professor in exile at Princeton – became Wheeler's neighbor, and so there arose a possibility for these great minds to come together. Wheeler saw merit in Bohr's view of complementarity. He also agreed with Einstein's challenge to the idea that, when we view particles, we unavoidably alter them. Despite several attempts, John Wheeler never managed to come up with a theory that convinced both Bohr and Einstein.
Following on from the likes of Einstein and Bohr, John Bell entered the quantum arena in the latter half of the 20th century. Born in Belfast in 1928, Bell spent several years flirting with the theories of quantum mechanics before finally taking the plunge in 1963, when he took a sabbatical at Stanford University. He explained entanglement through the behavior of identical twins separated at birth: if they were brought together after a lifetime, they would have surprising things in common. He had come across this in a study by the Institute for the Study of Twins. This led to the thought that perhaps electrons behaved as if they had genes. At the very least, it helped the layman understand what the entanglement of quantum particles meant.
In 1964, however, Bell came up with what is now known as Bell's inequality. Through a set of experiments on electron-positron pairs, and probability theory, Bell showed that the conclusion of the EPR paper was wrong: the assumption that particles had to have properties embedded in them to explain entanglement did not seem the right way forward after all. Bell's inequality has been supported by several subsequent experiments. The probability explanation of Bell's inequality through Venn diagrams is simple, and there is an even simpler home experiment that illustrates the spooky nature of quantum mechanics, using polarizing lenses and photons.
You can check out the YouTube video of the experiment here, https://www.youtube.com/watch?v=zcqZHYo7ONs&t=887s, and it does get quite counter-intuitive.
The video shows the following:
- Look at a white background through a polarized lens A. It looks gray, indicating that a lot of light is being blocked by the lens.
- Add another polarized lens B, and you will observe less light coming through – indicated by the background getting even darker.
- Now add a third polarized lens C between A and B. You would expect the white background to look darker still, but surprisingly, it looks brighter than with just A and B.

The results of the experiment can perhaps be explained by one possibility: what if the nature of a photon changes when it passes through a filter? That could mean the way the changed photon interacts with subsequent filters is different too.
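The brightening effect can be sketched numerically using Malus's law, which says an ideal polarizer transmits a fraction cos²θ of polarized light, where θ is the angle between the light's polarization and the filter axis. The sketch below is my own illustration (not taken from the video) of the classic version of the demonstration, in which a third filter at 45 degrees sits between two crossed filters:

```python
import math

def transmitted(intensity, angles):
    """Fraction of unpolarized light transmitted through a sequence of
    ideal polarizers at the given angles (degrees), via Malus's law."""
    out = intensity * 0.5                    # the first polarizer halves unpolarized light
    for prev, curr in zip(angles, angles[1:]):
        theta = math.radians(curr - prev)
        out *= math.cos(theta) ** 2          # each subsequent filter applies cos^2
    return out

# A at 0 degrees and B at 90 degrees (crossed): no light gets through.
print(transmitted(1.0, [0, 90]))             # ~0.0

# Insert C at 45 degrees between A and B: light reappears.
print(transmitted(1.0, [0, 45, 90]))         # 0.125
```

With A and B crossed, nothing survives; slipping C in between lets 12.5% of the original light through – each filter "changes the nature" of the photons that pass it, exactly as speculated above.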
I will explain another weird behavior of light particles (photons) using the Quantum Slit experiment later in this chapter. Currently, the behavior of subatomic particles is most clearly explained through the principles of quantum mechanics. If any new alternative is to be offered, it must be more convincing than the existing principles.
Whilst the theories underlying the behavior of particles in nature were being postulated, there were a few individuals who were starting to think about the implications of simulating these behaviors using classical computers. In 1965, the Nobel Prize in Physics was awarded jointly to Sin-Itiro Tomonaga, Julian Schwinger, and Richard P. Feynman for their fundamental work in quantum electrodynamics, with deep-ploughing consequences for the physics of elementary particles. It was in the 1980s that Richard Feynman first discussed the idea "Can a classical computer simulate any physical system?". He is considered to have laid the foundations of quantum computing through his lecture titled "Simulating Physics with Computers."
In 1985, the British physicist David Deutsch highlighted the fact that Alan Turing's theoretical version of a universal computer could not be extended to quantum mechanics. You may ask what Turing's computer was.
In 1936, Alan Turing came up with a simple version of a computer, called the Turing machine. It had a tape with several boxes, with bits coded into each of them as 0s and 1s. His idea was that the machine would run along the tape, looking at one square at a time. The machine had a code book containing a set of rules, and, based on those rules, the states (0s and 1s) of the boxes would be set. At the end of the process, the states of the boxes would provide the answer to the problem the machine had solved. Many consider this to have laid the foundation for the computers we use today.
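Turing's idea can be illustrated with a toy simulator. The "code book" below is a hypothetical example of my own: a machine that simply inverts every bit on the tape until it reaches a blank marker:

```python
def run_turing_machine(tape, rules, state="start", head=0, max_steps=10_000):
    """A minimal Turing machine. `rules` maps (state, symbol) to
    (symbol_to_write, move, next_state); the machine stops in state 'halt'."""
    tape = list(tape)
    for _ in range(max_steps):
        symbol = tape[head]
        write, move, state = rules[(state, symbol)]
        tape[head] = write                   # set the state of the current box
        if state == "halt":
            break
        head += 1 if move == "R" else -1     # move along the tape
    return "".join(tape)

# Code book: invert every bit, halting on the blank marker '_'.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("0110_", invert))   # 1001_
```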
However, David Deutsch highlighted that Turing's theories were based on classical physics (0s and 1s), and that a computer based on quantum physics would be more powerful than a classical one.
Richard Feynman's idea started to gain traction when, in 1994, Peter Shor of Bell Laboratories invented an algorithm to factor large numbers on a quantum computer. Using this algorithm, a sufficiently powerful quantum computer would be able to crack even modern cryptography techniques.
In 1996, this was followed by Grover's search algorithm. In a classical computer, when an item has to be searched in a list of N items, it needs, on average, N/2 checks to recover the item. However, with Grover's algorithm, the number of checks could be brought down to √N. In a database search, this offered a quadratic improvement to the search performance. This is considered a key milestone in the field of quantum computing.
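As an illustration, Grover's algorithm can be simulated on a classical machine by tracking the amplitudes directly. The sketch below is an illustrative simulation (not an implementation on quantum hardware) that searches 1,024 items using roughly √N oracle queries:

```python
import math

def grover_search(n_items, marked):
    """Statevector simulation of Grover's algorithm searching an unsorted
    list of n_items (a power of two) for one marked index."""
    iterations = round(math.pi / 4 * math.sqrt(n_items))
    amps = [1 / math.sqrt(n_items)] * n_items       # uniform superposition
    for _ in range(iterations):
        amps[marked] = -amps[marked]                # oracle: flip the marked sign
        mean = sum(amps) / n_items                  # diffusion: invert about the mean
        amps = [2 * mean - a for a in amps]
    return iterations, amps[marked] ** 2            # queries used, success probability

queries, p_success = grover_search(1024, marked=42)
print(queries)                  # 25 oracle queries, versus ~512 classical checks
print(round(p_success, 3))
```

Finding the marked item among 1,024 entries takes about 25 quantum queries, versus roughly 512 checks on average for a classical scan – the quadratic improvement described above.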
Grover's algorithm and subsequent work in this space have since accelerated the excitement and hype around quantum computing. More recently, tech giants IBM, Google, Intel, Microsoft, and a few others have ramped up their work in quantum computing. At CES 2019, IBM showed off its prowess through the launch of an integrated system for quantum computing for scientists and businesses. IBM also has a cloud-based quantum computing infrastructure that programmers can use. More on what the tech giants are up to will be revealed in Chapter 16, Nation States and Cyberwars.
When I first looked at the picture of IBM's quantum computer replica as revealed at CES 2019, my immediate thought was Déjà vu. The previous generation witnessed the rise of the classical computing revolution, with its far-reaching impacts upon all aspects of society. We stand on the brink of another revolution; we will be fortunate enough to see the evolution of quantum computing first-hand.
Before we explore quantum computing, it would be good to understand the behavior of particles as described by quantum mechanics. Below, I describe an experiment that helps us to understand the counter-intuitive nature of quantum theory.
The famous Quantum Slit experiment describes the behavior of photons/particles and how they interact with each other and themselves. As we will see, this posed a challenge to physicists attempting to describe their behavior.
In the 19th century, a British scientist, Thomas Young, postulated that light particles traveled in waves, rather than as particles. He set up a simple experiment where he cut two slits on a piece of metal and placed it as a blocker between a light source and a screen. He knew that if light traveled in the same manner as particles, then the particles that passed through the slits would hit the screen. Those that were blocked by the metal would bounce off the surface and would not reach the screen. Effectively, if the light was made of particles, then the screen should look like a spray of paint on a stencil. Figure 1 shows the experiment and the slit formation.
He hypothesized, however, that light was formed of waves, and that the waves, when they passed through the slits, would interfere with one another and form patterns on the screen. The pattern would depend on how the waves passing through the two slits interacted.
Where peaks interfered with peaks (called constructive interference), the screen would display bright spots, and where peaks interfered with troughs (called destructive interference), dark spots would form. Hence, the pattern would be a bright band at the center followed by progressively dimmer bands to the left and the right. Young successfully proved that light traveled in waves.
Figure 1: Young's double slit experiment
Albert Einstein once more proved to be of great influence in the field of quantum mechanics. He proposed that light was made of photons – discrete quanta of light that behaved like particles. As a result, the experiment was repeated, this time passing photons through the slits one by one – and the patterns still appeared. This could only happen if:
- Photons traveled in waveforms.
- All possible paths of these waveforms interfered with each other, even though only one of these paths could actually happen.

This supports the theory that all realities exist until the result is observed, and that subatomic particles can exist in superposition. When detectors were placed to observe the photons passing through the slits, the patterns disappeared. The act of observing the particles collapses the realities into one.
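The arithmetic behind the pattern is simple: when the two paths are unobserved, their amplitudes add before being squared; once detectors identify the path, their probabilities add instead. A small sketch of this difference, using illustrative unit amplitudes:

```python
import cmath
import math

def screen_intensity(phase_diff, observed=False):
    """Relative intensity for two paths with a given phase difference.
    Unobserved: amplitudes add, then are squared (interference).
    Observed: each path's probability adds (no interference)."""
    a1, a2 = 1.0, cmath.exp(1j * phase_diff)
    if observed:
        return abs(a1) ** 2 + abs(a2) ** 2   # always 2.0: the pattern vanishes
    return abs(a1 + a2) ** 2                 # ranges from 0 to 4

print(screen_intensity(0.0))                     # 4.0 – constructive: bright fringe
print(screen_intensity(math.pi))                 # ~0.0 – destructive: dark fringe
print(screen_intensity(math.pi, observed=True))  # 2.0 – detectors kill the fringes
```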
We have discussed the three principles of quantum mechanics: superposition, entanglement, and interference. These principles are fundamental to the way in which particles are managed within a quantum computer.
Figure 2: A quantum computing timeline
The history of quantum computing and the key milestones are captured in Figure 2. The key takeaway is the contributions made to the field that have brought this technology to the brink of achieving impact at scale.
Quantum computing has quantum bits, called qubits (pronounced cue-bits), as its fundamental unit. In the classical computing world, bits take the states 0 and 1. Qubits can exist in these two states, but also in a linear combination of both states, called a superposition.
Algorithms exploiting superposition can solve some problems faster than the deterministic and probabilistic algorithms that we commonly use today. A key technical difference is that, while probabilities must be positive (or zero), the weights in a superposition can be positive, negative, or even complex numbers.
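This difference is easy to demonstrate. In the sketch below (an illustration of my own), a qubit is put through the Hadamard operation twice: after one application the measurement probabilities are 50/50, but after the second the negative weight cancels the |1⟩ path entirely – something positive-only probabilities could never do:

```python
import math

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]   # Hadamard operation: note the negative weight

def apply(gate, state):
    """Multiply a 2x2 matrix by a 2-element state vector."""
    return [sum(g * a for g, a in zip(row, state)) for row in gate]

state = apply(H, [1.0, 0.0])                 # |0> -> equal superposition
probs = [round(a ** 2, 3) for a in state]
print(probs)                                 # [0.5, 0.5] measurement probabilities

state = apply(H, state)                      # the negative weight cancels |1>
print([round(a, 3) for a in state])          # back to |0> with certainty
```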
The other important quantum mechanics principle fundamental to understanding quantum computers is entanglement. Two particles are said to display entanglement if measuring one of them – which individually behaves randomly – tells the observer how the other particle would act if a similar measurement were made on it.
This property can be detected only when the two observers compare notes. The property of entanglement gives quantum computers extra processing power and allows them to perform certain computations much faster than classical computers.
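The "comparing notes" aspect can be illustrated with a toy simulation: measuring both halves of the entangled state (|00⟩ + |11⟩)/√2 in the same basis gives results that are individually random yet always agree. Note that this classical sketch only reproduces same-basis correlations; as Bell's inequality shows, measurements in different bases cannot all be mimicked this way.

```python
import random

def measure_bell_pair(rng):
    """Simulate measuring both halves of (|00> + |11>)/sqrt(2) in the
    computational basis: a fair coin flip, shared by both particles."""
    outcome = rng.choice([0, 1])   # collapse to |00> or |11> with equal probability
    return outcome, outcome        # Alice's result, Bob's result

rng = random.Random(7)
results = [measure_bell_pair(rng) for _ in range(1000)]
print(all(alice == bob for alice, bob in results))   # True: perfect correlation
print(sum(alice for alice, _ in results) / 1000)     # ~0.5: individually random
```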
Quantum computers have similarities to, and differences from, the transistor-based hardware that classical computers use. Research in quantum computing is moving forward to find new forms of qubits and algorithms. For example, optical quantum computers using photonic qubits have seen significant progress in the research world since 2017, and they work at room temperature.
A quantum computer should satisfy the following requirements:
- Qubits need to be put into a superposition
- Qubits should be able to interact with each other
- Qubits should be able to store data and allow readout of the data

Quantum computers also typically demonstrate the following features:
- They tend to operate at low temperatures and are very sensitive to environmental noise
- They tend to have short lifetimes – the reasons are explained below

We encode qubit states into subatomic particles – electrons, in the case of semiconductor quantum computers. There are several methods for creating qubits, and each method has advantages and disadvantages. The most common and stable type of qubit is created using a superconducting loop. A superconductor is different from a normal conductor because there is no energy dissipation (no resistance) as the current passes through the conductor. Superconducting circuits operate at close to absolute zero (that is, 0 Kelvin, or -273 degrees Celsius) in order to maintain the states of their electrons.
Another qubit architecture in which transistor-based classical circuits are used is the SQUID, which stands for Superconducting QUantum Interference Device. SQUIDs are used to track and measure extremely weak signals – signals that create changes in energy as much as 100 billion times weaker than the electromagnetic energy needed to move a compass needle. They are made of Josephson junctions. One of the key application areas for SQUIDs is measuring magnetic fields in human brain imaging. Source: https://whatis.techtarget.com/definition/superconducting-quantum-interference-device
Superconducting qubits (in the form of SQUIDs) have pairs of electrons, called Cooper pairs, as their charge carriers. In this architecture, transistor-based classical circuits use voltage to manage electron behavior. In addition, a quantum electrical circuit is described by a wave function. SQUIDs are termed artificial atoms, and lasers are used to change the state of these atoms. As described earlier in this chapter, based on the principles of quantum mechanics, only light of a specific frequency can change the state of subatomic particles. Therefore, the lasers used to change the state of qubits have to be tuned to the transition frequency of the qubits.
A superconducting qubit can be constructed from a simple circuit consisting of a capacitor, an inductor, and a microwave source to set the qubit in superposition. However, there are several improvements on this simple design, and the replacement of the common inductor with a Josephson junction is a major upgrade. Josephson junctions are non-linear inductors, which make the energy spectrum non-equally spaced and thereby allow the two lowest energy levels to be singled out; these two levels form the qubit for quantum information processing. This selection of the two lowest energy levels is an important criterion in the design of qubit circuits. Without the Josephson junction, the energy levels are equally spaced, which is not practical for qubits. Source: https://web.physics.ucsb.edu/~martinisgroup/classnotes/finland/LesHouchesJunctionPhysics.pdf
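The role of the Josephson junction can be sketched numerically. The numbers below are hypothetical, chosen only to be in the rough regime of superconducting circuits: a plain harmonic circuit has equally spaced levels, so a drive pulse cannot excite one transition without exciting the others, whereas an anharmonic spectrum isolates the lowest two levels:

```python
# Hypothetical numbers, in GHz: a 5 GHz circuit with 0.3 GHz anharmonicity.
omega, alpha = 5.0, 0.3

def gaps(levels):
    """Transition frequencies between consecutive energy levels."""
    return [round(high - low, 2) for low, high in zip(levels, levels[1:])]

# A plain LC oscillator has equally spaced levels: E_n = omega * n.
harmonic = [omega * n for n in range(4)]

# A Josephson junction makes the spectrum anharmonic, sketched here as
# E_n = omega * n - (alpha / 2) * n * (n - 1), so every gap differs.
anharmonic = [omega * n - alpha * n * (n - 1) / 2 for n in range(4)]

print(gaps(harmonic))    # [5.0, 5.0, 5.0] – one frequency drives every transition
print(gaps(anharmonic))  # [5.0, 4.7, 4.4] – the 0->1 transition can be driven alone
```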
Like the gate concept in classical computers, quantum computers also have gates. However, a quantum gate is reversible. A common quantum gate is the Hadamard (H) gate that acts on a single qubit and triggers the transition from its base state to a superposition.
There are several variations of qubit circuits based on the properties here. The key properties that need consideration in the design of these circuits are:
- Pulse time: The time taken to put a qubit into superposition. The lower the pulse time, the better.
- Dephasing time: The time taken to decouple qubits from unwanted noise. The lower the dephasing time, the better; higher dephasing times lead to a greater dissipation of information.
- Error per gate: As gates are used to create transitions in the states of qubits, a faulty gate can propagate errors onto qubits that were originally correct. Hence, error per gate needs to be measured regularly.
- Decoherence time: The duration for which the state of a qubit can be maintained. Ionic qubits have the best coherence times, as they are known to hold their state for several minutes.
- Sensitivity to environment: While semiconductor qubits operate at very low temperatures, the sensitivity of the particles involved in the construction of the circuit to the environment is important. If the circuit is sensitive to the environment, the information stored in the qubit is easily corrupted.

Figure 3: Qubit circuits
IBM recently launched a 50-qubit machine, and also provides a cloud-hosted quantum infrastructure that programmers can write code against. There have also been several advances in quantum assembly languages, which act as the interface between these machines and the code that developers write. Figure 3 shows different qubit circuit types.
We've now covered the fundamentals of quantum computing, so let's move on to look at the other technology in focus for this book: Blockchain.
Unlike quantum computing, Blockchain has had a relatively short history. If quantum computing is the Mo Farah of emerging technologies, Blockchain is the Usain Bolt. Several Blockchain properties have their roots in cryptography, and it is essential to understand some of the terminologies in order to be able to enjoy the rest of the chapter.
It is important to understand how Blockchain depends on cryptography. This will help us in subsequent chapters to understand how Blockchain and quantum computing could potentially collide in the future. A detailed, yet simplified, description of some key terms of Blockchain and cryptography follows:
Hashing is a process where a collection of data is input into a function to get a fixed-length string as an output – called a hash value. We use hashes every day. When you create an email ID with a password, the password goes through a hash function, a unique string is created, and this is stored in the database of the email provider. When you try to log in again, the password you enter is put through the hashing algorithm, and the resulting string is matched with the string stored in the database of the email provider. If they match, you get to access your email.
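This login flow can be sketched in a few lines of Python using the standard hashlib library. The salt and password here are illustrative, and real systems should use a dedicated password-hashing scheme (such as bcrypt or argon2) rather than plain SHA-256:

```python
import hashlib

def hash_password(password, salt="example-salt"):
    """Fixed-length digest of a password (illustrative only)."""
    return hashlib.sha256((salt + password).encode()).hexdigest()

# At sign-up, only the hash is stored in the provider's database.
stored = hash_password("correct horse battery staple")
print(len(stored))   # 64 hex characters, regardless of the input's length

# At login, the entered password is hashed and compared to the stored string.
print(hash_password("correct horse battery staple") == stored)   # True: access granted
print(hash_password("wrong guess") == stored)                    # False: access denied
```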
Figure 4: An illustration of the transaction process for Bitcoin. Source: https://bitcoin.org/bitcoin.pdf
The bitcoin system uses a function called Hashcash. The Hashcash proof of work algorithm was invented in 1997 by Adam Back. The bitcoin hash uses two additional parameters – a nonce, and a counter. The nonce is just a random number that is added to the collection of data before it gets fed into the hashing function. So, the hash created is a combination of the previous hash, the new transaction, and a nonce. The bitcoin system requires the hash value to start with a certain number of zeros; the challenge of identifying the right hash value increases exponentially as the number of zeros increases. The counter parameter of the Hashcash function is used to record increments until the right hash value is arrived at.
The nodes in a bitcoin network work hard to find the hash value that has the correct number of zeros. They use different nonces to generate hashes until the right hash is generated. This exercise takes a lot of computing power, and when the right hash value is found, the node that achieved it is rewarded with bitcoins for identifying the right nonce.
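The nonce search can be sketched as a toy proof-of-work loop. The block data and difficulty below are illustrative; Bitcoin itself uses double SHA-256 and a numeric target rather than a simple zero-prefix check, but the principle is the same:

```python
import hashlib

def mine_block(previous_hash, transactions, difficulty=4):
    """Toy proof of work: try nonces until the SHA-256 hash of the block
    data starts with `difficulty` hex zeros."""
    nonce = 0
    while True:
        data = f"{previous_hash}|{transactions}|{nonce}".encode()
        block_hash = hashlib.sha256(data).hexdigest()
        if block_hash.startswith("0" * difficulty):
            return nonce, block_hash
        nonce += 1               # counter: increment and try again

nonce, block_hash = mine_block("0000prevblockhash", "alice->bob:0.5", difficulty=4)
print(nonce)                     # the winning nonce (thousands of attempts on average)
print(block_hash[:16])           # the hash with the required leading zeros
```

Raising `difficulty` by one multiplies the expected number of attempts by 16 (one more hex zero), which is why mining gets exponentially harder as more zeros are required.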
Determining the nonce that, when put through the hash function, results in a hash value meeting a given difficulty level is called mining. The difficulty level increases as the number of required zeros increases, and mining bitcoins has become harder over the years as more computing power is required to determine the nonce. There are only 21 million bitcoins that will ever be produced, and at the time of writing this book, about 17.5 million bitcoins have been mined. The reward for mining a block stands at 12.5 bitcoins, and about 144 blocks are mined per day. There are about 65,000 more blocks to be mined before the mining reward halves again, to 6.25 bitcoins.
A block is just a group of transactions validated together. If a bunch of transactions are not able to make it into a block in time, they get moved into the next block. The number of bitcoins that are rewarded for mining a block started at 50 and is halved with every 210,000 blocks mined.
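This halving schedule is exactly what caps the supply at around 21 million: summing the geometrically shrinking rewards (tracked in whole satoshis, as the protocol does) converges just below that figure. A short sketch of the arithmetic:

```python
def total_bitcoin_supply(initial_reward_btc=50, halving_interval=210_000):
    """Total bitcoins ever minted, summing block rewards across halvings.
    Rewards are tracked in integer satoshis (1 BTC = 100,000,000 satoshis),
    as in the real protocol, so each halving truncates any fraction."""
    reward = initial_reward_btc * 100_000_000   # satoshis per block
    total = 0
    while reward > 0:
        total += reward * halving_interval      # satoshis minted in this era
        reward //= 2                            # reward halves every 210,000 blocks
    return total / 100_000_000

print(total_bitcoin_supply())   # just under the famous 21 million cap
```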
The term proof of work was coined by Markus Jakobsson and Ari Juels in a document published in 1999. Proof of work was used in the bitcoin system to ensure that transactions are validated through sheer computing power. After a chain of blocks has been established through this method, to hack through the block would require an immense amount of computing power too.
Also, in a proof of work system, the processing power that a node has decides the control the node has over the network. For example, in the bitcoin network, one CPU is equivalent to a vote, which can be exercised at the time of decision making.
New transactions are broadcast to all nodes for validation. Transactions are collected into blocks, and nodes work on finding a proof of work for their blocks. When a node finds the proof of work, it broadcasts the block to all nodes, which accept the block only if all transactions in it are valid. The acceptance of the block results in the network starting to work on new blocks.
Hacking a block means that a new nonce needs to be identified – one that redoes the work of not just one miner, but of all subsequent miners too. Also, when there are multiple chains of blocks, the longest chain – in terms of the amount of computing power required to create it – is accepted by the network.
Several of these concepts are quite fundamental to understanding how Blockchain networks work, and you should now be able to approach the topic of Blockchain with greater confidence. With that said, we'll now discuss another key concept of Blockchain: utility and security tokens. Understanding the differences between a security and a utility token has recently proven to be a conundrum for the global Blockchain community.
As solutions based on Blockchain started raising capital, their tokens were broadly classified into two buckets: utility tokens and security tokens. A utility token is like loyalty points or digital coupons needed to use an application; unlike a security token, it is not, loosely speaking, a vehicle for distributing profits (or dividends) when a firm makes money.
On the other hand, a security token derives its value from an underlying asset. For example, a real estate fund could be tokenized, and the token can be traded. The value of the token is derived from the value of the real estate fund. In the same way, firms raising capital can issue tokens and investors would get a share of the company. This is effectively owning equity in the company and is classified as a security token.
While I have made it sound like utility and security tokens are mutually exclusive concepts, they are often not. For instance, in the case of Ether (Ethereum's token), it is more of a utility than a security as the token is used across the ecosystem in applications and largely derives its value from Ether's demand. The SEC has developed a simple methodology to identify a token as a security, as security tokens fall under their regulatory umbrella. It's called the Howey test.
The Howey test gets its name from a Supreme Court decision in 1946: SEC v. W.J. Howey Co. Howey Co. was offering service contracts for producing, harvesting, and marketing orange crops in Lake County, Florida. These contracts were sold to tourists who stayed at a hotel owned by Howey Co.; the company sold them land along with the service contracts. The court was asked whether the land purchase plus the service contract created an investment contract. The court agreed that it did, and the Howey test was born.
As per the Howey test, a transaction would be an investment contract (and therefore a security) if:
- It is an investment of money
- There is an expectation of profits from the investment
- The investment of money is in a common enterprise
- Any profit comes from the efforts of a promoter or third party

Let's take the Ethereum crowdsale in 2014 as an example. Money was invested (albeit in bitcoins), and the investment was made by at least a few participants with the view that the tokens would increase in value over time, so that they could cash out at a profit. The capital was pooled by investors in a scheme, which the SEC views as a common enterprise. And the increase in Ether's value was expected to come about through the work of Vitalik and company. Therefore, as per the Howey test, Ether should be a security.
The way the Ethereum crowdsale happened in 2014, it is easy to categorize it as a security token. However, Ethereum is now the oxygen of a big community of applications. As a result, we can say that Ether is an example of a token, which initially raised capital like a security, but due to the way the firm and the technology have evolved, it is more of a utility today. Ethereum is decentralized due to the community it has and no longer just thrives on the initial founders of the firm.
Recently, I was part of a round table discussing the challenge of categorizing tokens as utility or security. I would describe it as a progress bar: at one end is the security token, and at the other end is the utility token. Depending on how a token derives its value and how it is used by the community, it moves closer to one end of the progress bar or the other. Security versus utility shouldn't be seen as binary states of tokens.
We have discussed the background of quantum computers and touched upon some interesting Blockchain concepts too. The idea is to use these basic ideas as the building blocks before moving onto real-world applications across industries in future chapters. The cryptographic element is fundamental to these two technologies. Does that mean quantum computing makes Blockchain obsolete? We'll touch upon that question in future chapters.
The journey of Bohr, Einstein, Alan Turing, and several others almost a century back has now led to the invention of quantum computers. The hype and the headlines in this field are getting bigger every day. However, mass industry adoption of this technology is several years (if not decades) away. In this chapter, I wanted to take the reader through a journey and introduce key people, principles, events and technology components within quantum computing.
It is important to understand how qubits differ from the bits that today's computing world largely relies on. This chapter introduced the quantum methods behind the real-world applications that we will touch upon in future chapters. Applications of optical quantum computers that use photons will also be discussed in a subsequent chapter.
We briefly touched upon Blockchain and the use of cryptography. This is also critical, so that we can see the technological overlap between the two technologies. It is essential that the Blockchain community views this overlap as an opportunity rather than a major roadblock. I firmly believe that both these technologies are here to stay, and definitely here to enrich our lives by complementing each other across industries.
There are several practical applications of quantum computers across industries, including healthcare, logistics, finance, and cybersecurity in general. We will cover these in detail in this book.
