Advances in the Quantum Theoretical Approach
N. Abura’ed et al.
This assumption led to the conclusion that the average energy radiated by the object
would decrease with increasing frequency, so that as the frequency goes to infinity, the average energy
goes to zero, preserving the law of conservation of energy and reconciling theory with experimental
observations. A thorough review of the history of the derivation of Planck’s formula
that gave birth to quantum physics can be found in Boya.
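For context, the average energy per radiation mode that follows from Planck’s quantization of energy into multiples of $h\nu$ is the standard textbook expression (stated here for reference, not reproduced from the surveyed text):

\[
\langle E \rangle = \frac{h\nu}{e^{h\nu / k_B T} - 1} \;\longrightarrow\; 0 \quad \text{as } \nu \to \infty,
\]

so high-frequency modes carry vanishing average energy and the total radiated energy remains finite, avoiding the so-called ultraviolet catastrophe predicted by classical theory.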
Into the 20th century, quantum physics developed further and cemented its position as the “new” physics governing the dynamics of the atomic and subatomic world.
In 1905, Einstein successfully explained the photoelectric effect, where light
incident on an object leads to the emission of electrons from the object’s atoms. The
success of Einstein’s explanation was based on an assumption that light was made up
of quanta, or photons, rather than waves, an idea motivated by Planck’s seminal work
mentioned earlier. This work earned Einstein the Nobel Prize in physics in 1921. Furthermore, in 1913, Bohr proposed a model of the hydrogen atom by assuming discrete
or quantum values for angular momentum of the electron [Bohr 1913]. This model
explained experimentally observed properties of the hydrogen atom that other prevailing models of the time could not. Further developments in formalizing the physics of
the quanta by Born, Dirac, Heisenberg, and Schrödinger
culminated in a formal theory of quantum physics. This theory, sometimes referred
to as the old quantum theory, has been responsible for the ensuing decades of major
technological innovations such as the transistor (and hence computers) and the laser
(and hence CD-ROMs, Blu-ray, etc.). An excellent treatment of the ubiquity of the old
quantum theory can be found in Kakalios.
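Returning to Bohr’s assumption above: in standard textbook form (not reproduced from the surveyed text), it restricts the electron’s angular momentum to discrete values,

\[
L = n\hbar, \qquad n = 1, 2, 3, \ldots,
\]

which leads to the discrete energy levels of the hydrogen atom,

\[
E_n = -\frac{13.6\ \text{eV}}{n^2},
\]

whose differences $E_m - E_n$ reproduce the experimentally observed spectral lines of hydrogen.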
Since the late 1980s, quantum physics has once again come to the forefront of technological innovation in a dramatic way. In fact, it would be appropriate to say that a
“new quantum theory” has come to light in the past four decades, which sees the old
quantum theory fuse with the theory of information processing. At least one
scientific philosophy and one technological advance have led to this fusion of
information and quantum physics, with the former being Richard Feynman’s idea that
a computational perspective on quantum physics might produce an efficient method
to simulate complex quantum physical systems like molecules. This idea was further
developed by Deutsch. The latter is the trend that the number of transistors on a computer
chip has roughly doubled every two years, which has had a major role in the
development of what is now known as quantum information processing. This trend in
computer engineering, referred to as Moore’s law [Moore 1965], means that the physical size of these transistors has been decreasing rapidly and, by 2020 or so, would reach
the atomic scale. At that scale, quantum effects, which were regarded as negligible
noise in the technological innovations based on the old quantum theory, would now
have to be dealt with.
Motivated by developments such as Feynman’s idea and Moore’s law, scientists have
studied the theoretical underpinnings of quantum computers, which are essentially
Turing machines that utilize the principles of quantum mechanics, and which are expected to give dramatic computational speedups over the classical Turing machines.
For example, the algorithms of Shor and Grover [Shor 1997; Grover 1996] are designed
to run on a quantum computer with, respectively, an exponential and a quadratic
speedup over the best known classical methods for factoring integers and searching unstructured databases.
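To illustrate why efficient factoring matters for deployed cryptography, here is a minimal textbook RSA sketch in Python. The primes, exponent, and message below are toy illustrative values (real systems use moduli thousands of bits long); the point is that the private exponent d is easy to recover once the modulus n is factored, which is precisely what Shor’s algorithm does efficiently.

```python
def egcd(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y = g = gcd(a, b).
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    # Modular inverse of a modulo m (requires gcd(a, m) == 1).
    g, x, _ = egcd(a, m)
    assert g == 1
    return x % m

# Key generation from two secret primes (toy values).
p, q = 61, 53
n = p * q                 # public modulus: 3233
phi = (p - 1) * (q - 1)   # computable only with the factorization of n
e = 17                    # public exponent
d = modinv(e, phi)        # private exponent: 2753

msg = 65
cipher = pow(msg, e, n)   # encryption uses only the public key (n, e)
plain = pow(cipher, d, n) # decryption needs d, hence the factors of n
assert plain == msg
```

An eavesdropper who can factor n = 3233 into 61 × 53 recomputes phi and d immediately; the security of RSA rests entirely on that factorization being classically infeasible at realistic key sizes.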
Because no efficient classical algorithm for integer factorization is known, the presumed
hardness of the problem underpins electronic security protocols such as RSA and,
via the related discrete logarithm problem, elliptic curve–based cryptography. The fact that a quantum computer can run Shor’s
algorithm in time that is polynomial in the size of the input means that these security protocols will become obsolete once a fully functioning quantum computer becomes
available. In addition, since a brute-force search of an unstructured
database of k elements requires O(k) queries, Grover’s algorithm, which needs only O(√k) queries, offers a substantial speedup
in this case: for k = 10^6, on the order of a thousand quantum queries instead of a million classical ones. This observation caught the attention of security experts and intelligence
ACM Computing Surveys, Vol. 49, No. 4, Article 75, Publication date: February 2017.