Like intellectual prospectors, scientists, especially physicists, are busy mining the natural world in hopes of uncovering new veins of knowledge. Delving deep into the bedrock of nature in the early part of the last century, they arrived at a startling conclusion. In short, the foundations of reality do not reflect our everyday appearances. Instead, a mysterious set of mathematical rules known as quantum mechanics takes the lead in deciphering the truth.
The theory is venerable, with a long history of invention and reinvention. It flourished in the mid-1920s, a time when it was becoming easier to accept that “math explains matter.” Soon after its inception, it was widely regarded as the best, if not the only, way to fathom the microworld.
Schoolchildren learn that atoms and molecules interact to create what we call human experience. The same theory stands behind technological innovations from cell phones to supercomputers. In short, quantum physics has fueled our modern electronics economy and, in the process, transformed commerce, communications, and even the realm of entertainment.
Of course, it goes well beyond knowing how to make computer chips. It was hard for physicists to come to accept that “reality isn’t what it appears to be.” Albert Einstein and Niels Bohr long debated what this meant for the nature of reality. It was quite apropos for Einstein to say that he could not believe that “God would play dice with the universe.”
Another great mind, physicist Sean Carroll, enters the picture, stating that “the fundamental nature of reality could be radically different from our familiar world of objects moving around in space and interacting with each other.” He implies that we shouldn’t deceive ourselves into assuming that the world as we experience it is fundamental. In his reading, quantum theory describes mathematical entities evolving in an abstract realm, and all perceived physical phenomena are merely a “higher-level emergent description” of what’s really taking place.
Carroll believed that “emergent” events in ordinary space are “real” in their own way, while not fundamental. He calls the “spatial arena” “more a matter of convenience and convention than of principle.” It was an interesting perspective and one way to read the quantum math. Nonetheless, it did not hold water for other physicists. They do, however, seem to accept as a body that quantum physics has transformed our thinking to a great degree.
We see it as a major paradigm shift, replacing the ancient Greeks’ mythological explanations and the later reign of reason and logic. It is even said to defy logic and reason in its essence, although it is the product of these human gifts. Empirical evidence drawn from the visible world turns out to be too limited to yield a true understanding on its own.
The microworld is endlessly fascinating; and beyond the work of Einstein, Bohr, and others, new theorists keep proposing “hidden variables,” even though past experiments have ruled them out. Of course, the microworld lies beyond the reach of human sensory experience, and we have to rely on “rules.” Yet theory after theory plays havoc with them.
For example, particles are like “ghostly waves” with various possible futures until, upon measurement, they are forced into being the equivalent of a subatomic substance. Newtonian science, with its strict chain of cause and effect, has long been set aside as the explanation. Quantum physics dwells instead in the realm of possibility, or as Tom Siegfried, a former editor of Science News, says, “some uncertainty always remains.”
The Priority of Uncertainty
The uncertainty principle set the great minds aflutter, both for and against. In 1927, German physicist Werner Heisenberg shook the intellectual world: cause and effect, as laws of physics applied to atoms, were not what we thought. He pushed his way into the quantum revolution by insisting that it is impossible to simultaneously measure both the position and the momentum of a subatomic particle with perfect precision. Some uncertainty will always rear its head, implying that chance rules the world.
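Heisenberg’s claim can be written down compactly. In modern notation, with ħ the reduced Planck constant, the uncertainties in a particle’s position (Δx) and momentum (Δp) obey:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

Squeezing the position uncertainty toward zero forces the momentum uncertainty to grow without bound, and vice versa; no experiment can evade the trade-off.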
Heisenberg’s uncertainty principle was destined to revolutionize our understanding of the universe even more than Einstein’s relativity. A lot of work led to this point. Take Max Planck, the German physicist who in 1900 claimed that light and other forms of radiation could be absorbed or emitted only in discrete packets, called quanta. Einstein went a step further, arguing that light also travels through space as packets, or particles, later called photons.
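Planck’s packets are not all alike: each carries an energy fixed by the radiation’s frequency. In the standard notation, with h Planck’s constant and ν the frequency:

```latex
E = h\nu, \qquad h \approx 6.626 \times 10^{-34}\ \text{J·s}
```

The minuscule size of h explains why these discrete energy steps are invisible in everyday life.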
A lot of fine work fell by the wayside, dismissed by scientists looking for better answers. The Danish physicist Niels Bohr, however, used quantum theory in 1913 to explain the atom’s structure. The physical world was undergoing vast revision, and physicists were demanding ever more research. New theories were being propagated by 1921. In fact, the Science News Bulletin (later Science News) published “the first popular explanation of the quantum theory of radiation.” It came from William D. Harkins, an American physical chemist. For him, quantum theory “is of much more practical importance” than the theory of relativity.
Diving in with both feet, Harkins offered an explanation of the relationship between matter and radiation. In the end, he found quantum theory to be “of fundamental significance in connection with almost all processes which we know.” Electricity, chemical reactions, and the way matter responds to heat all require a quantum-theoretic grounding.
Traditional physics (describing, for example, atoms moving in great numbers) was getting a thorough overhaul. Per quantum theory, “of all the states of motion (or ways of moving) prescribed by the older theory, only a certain number actually do occur.” Events previously believed “to occur as continuous processes, actually do occur in steps.”
Erwin Schrödinger is next in line in the quest to develop quantum mechanics; he described electrons as waves, a counterpart to Werner Heisenberg’s particle-based description. But at this point in the last century, quantum physics remained in its early stages. In short, it was unformed, even though Heisenberg had put some of the puzzle pieces together into a coherent mathematical picture.
Representing the energies of electrons in atoms using matrix algebra was a major advance. (Heisenberg’s math became known as “matrix mechanics” after some tweaking by the German physicists Max Born and Pascual Jordan.) As physicist Wolfgang Pauli commented, “Now it becomes day in quantum theory.” But the quantum adventure was really just beginning, as Tom Siegfried notes. Max Loeffler adds, “In the many worlds interpretation of quantum mechanics, all possible realities exist, but humans perceive just one.”
Schrödinger’s competing equation for electron energies treated the supposed particles as waves, described by a mathematical wave function. In time, “quantum mechanics” became the generally accepted term for the theory describing all subatomic systems.
It was a confusing proposition. How could an approach picturing electrons as particles be equivalent to one treating electrons as waves? Bohr had a lot to do with clarifying the issue. As the foremost of the world’s atomic physicists, by 1927 he had arrived at a novel viewpoint he called complementarity. This is how it works: two slits in a barrier allow the light waves passing through to interfere, thereby creating on a detector screen a pattern of light and dark bands. Electrons and other subatomic particles in turn display such interference by behaving as waves. Bohr saw that the particle and wave views were complementary, and that both were necessary for a complete description of subatomic phenomena. Which behavior appeared depended on the equipment the experiment was set up with.
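The banded pattern follows from simple wave arithmetic. As an illustrative sketch (the wavelength, slit separation, and screen distance below are arbitrary choices, not values from any particular experiment), the small-angle two-slit intensity at screen position x is proportional to cos²(πdx/λL):

```python
import math

# Illustrative two-slit parameters (arbitrary, for demonstration only)
wavelength = 500e-9  # 500 nm light
d = 10e-6            # slit separation in meters
L = 1.0              # slit-to-screen distance in meters

def intensity(x):
    """Relative brightness at screen position x (small-angle approximation)."""
    phase = math.pi * d * x / (wavelength * L)
    return math.cos(phase) ** 2

center = intensity(0.0)                            # waves in step: bright band
first_dark = intensity(wavelength * L / (2 * d))   # waves cancel: dark band
print(center, round(first_dark, 12))
```

Points where the two waves arrive in step are bright; points where they arrive exactly out of step are dark, producing the alternating bands Bohr’s complementarity must account for.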
The Raging Debate
Einstein could not accept the implications of the uncertainty principle, since it meant the working physicist could not precisely predict the outcomes of atomic observations. Thus a great debate was born. Max Born had shown that you could merely predict probabilities, using calculations informed by the wave function introduced by Schrödinger. As noted, Einstein could not believe that “God would play dice with the universe.” Even worse, the wave-particle duality described by Bohr implied that the experimental physicist, in deciding what kind of measurement to make, would be affecting reality. For Einstein, reality had to exist independently of human observation.
Bohr and Einstein carried on their debate until 1935. In that year, along with Nathan Rosen and Boris Podolsky, the great Einstein devised a thought experiment to show once and for all that quantum mechanics could not be a complete “theory of reality.” Podolsky explained in the Science News Letter that such a theory would have to include a mathematical “counterpart for every element of the physical world.”
A quantum wave function must exist for each property of a physical system. Yet if two systems interact and fly apart, “quantum mechanics…does not enable us to calculate the wave function of each physical system after the separation.” In effect, the two systems become “entangled” (Schrödinger’s term). Since quantum math cannot describe all elements of reality, the argument ran, it is therefore incomplete.
The Science News Letter reported Bohr’s reaction in August 1935. The criterion for physical reality proposed by Einstein and his collaborators was in fact ambiguous when applied to quantum systems. Einstein, Podolsky, and Rosen had erroneously assumed that a system (say, an electron) possessed definite values for certain properties (such as momentum) before those values were measured. In Bohr’s view of quantum mechanics, you could not assume the existence of an “element of reality” without an experiment to measure it.
Einstein would not give up: the uncertainty principle was correct about what is observable in nature, but some underlying, invisible aspect of reality had to be determining the course of physical events. A theory of “hidden variables” restoring determinism to quantum physics became the brainchild of physicist David Bohm in the early 1950s, though it made no predictions that differed from the standard quantum mechanics math. Einstein’s reaction: “That way seems too cheap to me.”
The dispute died with the parties involved, but the issue has not been entirely resolved. The question seemed beyond testing until 1964, when the physicist John Stewart Bell arrived at a theorem about entangled particles. It made it possible to probe for hidden variables through experimentation. By the 1970s, experiments were confirming the predictions of standard quantum mechanics. It looked as though Einstein’s intense objection had been “overruled” in the court of nature.
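Bell’s theorem can be illustrated with a few lines of arithmetic. In the CHSH form of the argument, any local hidden-variable theory must keep a certain combination S of measured correlations within ±2, while quantum mechanics predicts the entangled singlet-state correlation E(a, b) = −cos(a − b) for detector angles a and b. The sketch below (the angle choices are the standard ones that maximize the violation) shows the quantum prediction breaking the classical bound:

```python
import math

def E(a, b):
    """Quantum correlation of two entangled spins measured at angles a and b."""
    return -math.cos(a - b)

# Detector settings that maximize the quantum violation of the CHSH bound
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

# Any local hidden-variable theory requires |S| <= 2
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # 2 * sqrt(2) ≈ 2.828, exceeding the classical limit of 2
```

Experiments of the kind begun in the 1970s measure these correlations directly, and the results land on the quantum side of the bound.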
Bohr’s view – now commonly called the Copenhagen interpretation of quantum mechanics – had many opponents. A dramatic challenge was posed in 1957 by physicist Hugh Everett III. In his account, an experiment did not create one reality from many quantum possibilities; instead, it identified only one branch of reality. The other branches are all equally real; humans simply perceive their own particular branch, unaware of the others. Everett’s theory came to be known as the “many worlds interpretation.” At first it gained little credence and was largely ignored, but decades later it has found numerous adherents.
Since Everett’s work, other interpretations of quantum theory have been put on the table. Some posit the “reality” of the wave function, the mathematical expression used for predicting different possibilities. Others, by contrast, see the math as describing the knowledge gained by experimenters. The many worlds view still needed to be reconciled with our perception of a single reality. In the 1980s, physicists H. Dieter Zeh and Wojciech Zurek identified the importance of a quantum system’s interaction with its external environment, a process they called quantum decoherence.
The underlying idea is that among a particle’s many possible realities, most rapidly evaporate as the particle encounters matter and radiation in its vicinity; only one remains consistent with the environmental interactions. This explains why a single reality is perceived on the human scale of time and size.
The “consistent histories” interpretation came soon after, pioneered by Robert Griffiths and further developed by Murray Gell-Mann and James Hartle. It has not enjoyed much popularity, and the pursuit of other interpretations goes on. According to Tom Siegfried, “Scientists continue to grapple with what quantum math means for the very nature of reality.”
Per Steven Weinberg, a Nobel Laureate in physics, “It’s a bad sign in particular that those physicists who are happy about quantum mechanics and see nothing wrong with it, don’t agree with each other about what it means.”
The Rise of Quantum Information Theory
The quest for quantum clarity continued into the 1990s, an era that marked the rise of quantum information theory. Physicist John Archibald Wheeler, a disciple of Niels Bohr, now had his turn at bat. He had been saying all along that specific realities emerge from “the fog of quantum possibilities” by irreversible amplifications, using the example of an electron hitting a detector and thereby establishing its location. Reality as a whole, per Wheeler, could be constructed from such processes. Bits of information – the 1s and 0s used by computers – were in play. He coined the slogan “it from bit” to capture the link between existence and information.
A former student, Benjamin Schumacher, built on Wheeler’s idea and introduced the qubit, or quantum bit, at a 1992 conference in Dallas, proposing it as the foundation for building computers that process quantum information. Such computers were not conceptually new; they had already been envisioned by the physicists Paul Benioff, Richard Feynman, and David Deutsch.
In 1994, the mathematician Peter Shor showed how a quantum computer manipulating qubits could crack even the toughest secret codes. Thus a quest was born to design and build quantum computers capable of various clever computing feats. Rudimentary quantum computers came into existence by the early 21st century. The most advanced versions perform genuine computing tasks but are not yet powerful enough to render prevailing cryptography methods obsolete. For certain types of problems, however, they may already work better than standard computers.
You can think of classical bits using the analogy of heads or tails on a coin. Each bit exists as either 0 or 1, so a five-bit register can record one of 32 combinations of 0s and 1s at a time. A quantum computer with five qubits, by contrast, works with all 32 combinations simultaneously. Quantum computing is better understood today, but it has not quashed the competing quantum interpretations. Deutsch, for example, believed that quantum computers support the “many worlds” view (all possible realities exist while humans are limited to one), but he has found scant agreement. Novel interpretations remain novel.
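The 32-combination comparison can be made concrete with a toy state-vector simulation (a sketch for illustration, not how real quantum hardware is programmed). A five-qubit register is described by 32 amplitudes; applying a Hadamard gate to every qubit of the starting state spreads the register evenly over all 32 bit patterns at once:

```python
import math

n = 5
dim = 2 ** n          # a 5-bit register has 32 possible values

# Classical: the register holds exactly one of the 32 values at a time.
# Quantum: the register is described by 32 amplitudes simultaneously.
state = [0.0] * dim
state[0] = 1.0        # start in the basis state |00000>

def hadamard(state, qubit):
    """Apply a Hadamard gate to one qubit of the state vector."""
    h = 1 / math.sqrt(2)
    new = state[:]
    step = 2 ** qubit
    for i in range(len(state)):
        if i & step == 0:
            a, b = state[i], state[i | step]
            new[i] = h * (a + b)
            new[i | step] = h * (a - b)
    return new

for q in range(n):
    state = hadamard(state, q)

# Every one of the 32 bit patterns now carries equal weight 1/sqrt(32),
# and the squared amplitudes still sum to 1 (total probability).
print(round(state[0], 4), round(sum(a * a for a in state), 4))
```

Measuring the register would still return only one of the 32 patterns, which is exactly why the interpretations disagree about what the other 31 amplitudes “are.”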
As Bohr insisted, quantum systems possess no definite values for certain properties until one is measured. Respected as Einstein was, few are entirely satisfied, and Einstein’s theory of gravity (general relativity), the other 20th century pillar of fundamental physics, does not fit in the framework of quantum theory.
To cut to the chase, the quest for a quantum theory of gravity has fallen short of full success, though many promising ideas have been proposed while we wait. One new approach suggests that the geometry of spacetime, the source of gravity in Einstein’s theory, may be built from the “entanglement” of quantum entities. On this view, the quantum world’s behavior defies deciphering in terms of events in space and time because quantum reality creates spacetime rather than occupying it. The human observer thus sees an emergent reality and comes to believe that events are happening in space and time; the “true” reality is not obliged to play by these rules.
Parmenides, the ancient Greek philosopher, comes to mind. He believed that all change is an illusion and that our senses show us the “way of seeming,” to use his words. “The way of truth” is revealed by logic and reason, but he didn’t go so far as to employ mathematics. Parmenides didn’t reach his insight by doing the math; he preferred to rely on the advice of a goddess, of course. Nonetheless, he is a forerunner of other great minds and has been rewarded with the status of a crucial figure in the history of science, the essence of which, for many, is deductive reasoning.
For the Greek pundits, the world of the senses does offer clues about a reality we can’t completely fathom. “Phenomena are a sight of the unseen,” Anaxagoras said. In line with the ancients, Sean Carroll states that “the world as we experience it” is certainly related to “the world as it really is…but the relationship is complicated, and it’s real work to figure it out.”
It couldn’t be stated more clearly, and it is a good way to end this discussion. We have come a long way from the Greek revolution through Newtonian science to the mechanistic understanding of reality that feeds modern physics. It took three centuries for physics to bring the quantum world within science’s grasp, and the lingering lack of agreement opens the door to other theories.
Given the priority accorded to mathematics and physics, it is instructive to consider some alternative perspectives. Take Simulation Theory, which hypothesizes a virtual reality created by a vast supercomputer, perhaps operated by aliens or an ancestor civilization. We would be living inside a technology beyond the formative minds in this article. The idea is further elaborated by Nir Ziso, the founder of The Global Architect Institute. His model, called Simulation Creationism, posits that a divine deity created the universe to study the processes related to creation and life. Everything we see and smell, including our actions and our thoughts, is predetermined and transmitted to an “observer,” whose reality is a “movie” sent from a relay station. The observer’s “consciousness” is the simulation component in charge of recording his emotional response to the events and stimuli being transmitted.
This is a far cry from the empirical observation of the physical world by physicists who rely on specific underlying laws. Still, it is interesting to have completely novel options against which to test the postulations of the last century.