Space, Time & Quantum Physics

The overarching concern of my research program is the form and character of formulations of quantum theory general enough both to pose quantum questions in space and time (as opposed to merely at a moment of time), and to encompass a quantum theory of spacetime — quantum gravity.

A natural framework for such investigations is provided by the generalized decoherent-histories quantum mechanics first introduced by J.B. Hartle and further developed by C.J. Isham and others (including myself). A distillation of the predictive structure of quantum theory to only its most essential features — superposition of states, and the consistent assignment of probabilities to possible physical histories — generalized quantum theory is in essence a generalization of the algebraic formulation of quantum mechanics and quantum logic to encompass an explicit measure of the quantum interference between possible histories: the decoherence functional. The formulation of any specific quantum theory then amounts to a specification of the possible histories of a system and of the system's decoherence functional. It is in these choices that all the important physical questions lie.

All of that is a fancy way of making the following observations. Physics as a discipline is ambitious, and sometimes audacious. It dreams of the possibility that the natural universe can be captured in a framework of mathematical relationships. To understand space and time we will of course need some sort of quantum theory of gravity. This has turned out to be a hard problem in and of itself. However, whatever the ultimate solution to this hard problem turns out to be — loop quantum gravity, string theory, causal dynamical triangulations, asymptotic safety, causal sets, or something else — there is a yet harder and, arguably, more fundamental problem lurking just behind it. In quantum mechanics, as it is traditionally taught and practiced, a quantum amplitude cannot be interpreted as a quantum probability until or unless the corresponding observable (or history) has been measured. In contexts such as the very early universe, in which there are no external observers making measurements, how then may coherent physical statements be made? The answer advanced by the decoherent (or consistent) histories formulation of quantum theory is that quantum amplitudes may be interpreted as probabilities when, and only when, the interference among those amplitudes vanishes — they decohere. This interference is measured by a functional of the quantum state called the decoherence functional. Traditional quantum measurements serve precisely this function: they destroy quantum interference. What the decoherent histories framework does is supply an objective measure of this interference which does not require any external notion of "observer" or "measurement", though it does reproduce the predictions of laboratory quantum mechanics when traditional measurement situations obtain. (For this reason, the consistent histories formulation of quantum theory is sometimes called "the quantum mechanics of closed systems".)
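To make the central object concrete: in ordinary Hamiltonian quantum mechanics (the special case that the generalized theory abstracts), the decoherence functional has a standard textbook form. The expressions below are that standard form, not anything specific to the papers cited on this page.

```latex
% A history \alpha is a time-ordered sequence of alternatives, represented
% by a chain of Heisenberg-picture projections (the "class operator"):
C_\alpha = P^{(n)}_{\alpha_n}(t_n) \cdots P^{(1)}_{\alpha_1}(t_1)

% The decoherence functional measures the interference between pairs of
% histories in the state \rho:
D(\alpha,\alpha') = \mathrm{Tr}\!\left[ C_\alpha \,\rho\, C_{\alpha'}^{\dagger} \right]

% Probabilities may be consistently assigned exactly when the
% off-diagonal elements (the interference) vanish:
D(\alpha,\alpha') \approx 0 \;\; (\alpha \neq \alpha')
\quad\Longrightarrow\quad
p(\alpha) = D(\alpha,\alpha)
```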
At present, this way of approaching quantum mechanics is the only tool we have for making consistent quantum predictions in contexts such as the very early universe.

My work in this area has several distinct threads. The first is an interest in the formal mathematical structure of generalized quantum mechanics in a C*-algebra setting. Initiated by Chris Isham and collaborators, this research has placed generalized quantum theory on a firm mathematical footing. Work I have completed in this field includes an extensive study of "the geometry of decoherence": a mathematical analysis of the geometric structure associated with the decoherence functional, the generalization of the algebraic notion of quantum state that is the central structural element of generalized quantum theory. (An earlier version of this research has been posted to, and so far remains exclusively on, the arXiv for essentially personal reasons [II].) The tools developed in the course of that study provide the foundation necessary for further investigations into the mathematical constraints on physically reasonable choices of decoherence functional. For example, my collaborators and I have demonstrated, within the framework of generalized quantum theory, an inequality constraining the existence of "quantal hidden variables" that parallels and generalizes the Bell inequalities, which constrain the existence of classical hidden variables and are indeed violated experimentally. Experimental violation of this generalized "Tsirelson" inequality would falsify quantum mechanics itself, as well as a wide class of generalizations of it — specifically, any theory described by a strongly positive decoherence functional. We have also explored the relation between the existence of such an inequality and the causality and locality properties of quantum theory [IV].
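The classical-versus-quantum bounds at issue can be seen in the simplest setting, the CHSH form of the Bell inequality. The sketch below is an illustration of the standard textbook result, not code from [IV]: it evaluates the CHSH combination of spin correlators for a singlet state at the optimal measurement angles. Classical hidden-variable theories bound the combination by 2; quantum mechanics attains Tsirelson's bound of 2√2.

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    """Spin measurement along an axis at angle theta in the x-z plane."""
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state |psi> = (|01> - |10>) / sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    """Correlator <psi| A(a) (x) B(b) |psi>; equals -cos(a - b) for the singlet."""
    op = np.kron(spin(a), spin(b))
    return (psi.conj() @ op @ psi).real

# CHSH combination at angles that maximize the quantum value
a, ap = 0.0, np.pi / 2
b, bp = np.pi / 4, 3 * np.pi / 4
S = E(a, b) + E(ap, b) + E(ap, bp) - E(a, bp)

print(abs(S))       # ≈ 2.8284..., i.e. 2*sqrt(2): the Tsirelson bound
print(abs(S) <= 2)  # False: the classical (Bell) bound of 2 is violated
```

The generalized inequality discussed above plays the same role one level up: it bounds what any theory with a strongly positive decoherence functional can predict, just as the classical bound of 2 constrains hidden-variable theories.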

Another deep question is to develop within this framework an understanding of the meaning of the approximate character of decoherence in any physical system, and to characterize and quantify the consequent constraints on what can be known about the physical histories of such systems. (The simplest analogy is the partial destruction of interference fringes in a two-slit experiment in which only partial "which-way" information has been gathered about which slit the individual electrons or photons passed through. An uncertainty-type relation obtains between how much interference is exhibited and how much which-way information is gathered.)
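The tradeoff in that parenthetical can be made quantitative in a toy model. The sketch below is my illustration, with an assumed form for the detector coupling, not code from any cited work: each path is tagged by a detector state, the fringe visibility equals the overlap of the two detector states, and the which-way distinguishability is the complementary quantity, so that V² + D² = 1 for pure detector states.

```python
import numpy as np

def two_slit(overlap, phases):
    """Screen intensity for a two-slit setup with a which-way detector
    coupled to the paths.  `overlap` = <d1|d2> is the overlap of the
    detector states tagging the two paths; `phases` are the relative
    path phases sampled across the screen."""
    # Equal-amplitude paths; the detector overlap suppresses the cross term
    return 0.5 * (1 + np.real(overlap * np.exp(1j * phases)))

phases = np.linspace(0, 2 * np.pi, 1001)

# overlap 1: no which-way info; overlap 0: perfect which-way info
for c in [1.0, 0.6, 0.0]:
    I = two_slit(c, phases)
    V = (I.max() - I.min()) / (I.max() + I.min())  # fringe visibility
    D = np.sqrt(1 - c**2)                          # which-way distinguishability
    # For pure detector states the tradeoff saturates: V**2 + D**2 == 1
    print(f"overlap={c:.1f}  visibility={V:.3f}  distinguishability={D:.3f}")
```

Partial decoherence of histories is the general-systems version of the intermediate case: the off-diagonal interference is suppressed but not exactly zero, and only approximate probabilities can be assigned.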

More recently, I have been exploring the connection between the consistency of histories and Werner Heisenberg's famous uncertainty principle, possibly the central meme of quantum theory. The essence of the uncertainty principle is that it is not possible to know everything at once about a physical system that our experience with classical, macroscopic physics suggests we should be able to know. There is a deep connection between this general principle and the "decoherence" or "consistency" of the corresponding histories — the condition that determines (via the system's decoherence functional) whether or not physically meaningful probabilities can be assigned to those histories. Much of this work has been in collaboration with Le Moyne College undergraduate students Adam Lemke and Elliot Connors.

The second major thread of my research activity is the application of these ideas to quantum cosmology. Together with J.B. Hartle of the University of California, I developed the complete predictive framework for a fully four-dimensional quantum theory of homogeneous cosmologies, establishing rigorous versions of traditional heuristics for assessing the physical content of recollapsing quantum cosmological models [III]. Building on this foundation, my collaboration with Parampreet Singh — formerly of the Perimeter Institute for Theoretical Physics, now in the Physics Department at Louisiana State University — has concerned itself with framing model theories of quantum gravity in this way. (Dr. Singh is one of the principal architects of the burgeoning field of loop quantum cosmology.) Initial work focused on constructing the decoherence functional for a homogeneous, isotropic, Friedmann-Robertson-Walker cosmological model in a Wheeler-DeWitt quantization [V]. We used this decoherence functional to demonstrate conclusively that these models are inevitably singular — that is, they will invariably begin or end in what is colloquially termed "the big bang". In other words, this way of doing quantum gravity does not tame the classical singularity.
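Schematically, the model in question is the flat FRW universe sourced by a massless scalar field, with the scalar serving as an internal clock. In the conventions common in this literature (the form below is a schematic of that standard setup, not a particular equation from [V]), the Wheeler-DeWitt equation reads:

```latex
% Flat FRW + massless scalar: \phi acts as an emergent internal "time",
% v is the spatial volume variable
\frac{\partial^2 \Psi(v,\phi)}{\partial \phi^2}
  = 12\pi G \left( v \, \frac{\partial}{\partial v} \right)^{\!2} \Psi(v,\phi)

% Histories are then specified by ranges of v at values of the clock \phi,
% and the decoherence functional is built from \Psi and projections onto
% those ranges.
```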

We have similarly constructed the decoherence functional for the same physical model, but in a so-called "loop quantization" rooted in the approach to the quantization of general relativity known as "loop quantum gravity". In this quantization we rigorously confirm previous strong evidence from other workers that the singularity is actually removed — in this quantization there is no "big bang", and quantum gravity does tame the classical singularity in a quite dramatic way: the universe "bounces" at small volume instead of collapsing into a big-bang singularity [VI] [VII] [VIII]. The same techniques are now being applied to other models of physical interest, in particular to study the robustness of the prediction that the cosmological singularity is removed. Work is now essentially complete on the construction of the decoherence functional for a spin foam quantization — in essence, a discrete path integral formulation of quantum gravity — of symmetric cosmological models along similar lines [IX]. A longer-term project is to investigate the observational implications of these results, including potential modifications to the CMB, predictions for the gravitational-wave equivalent of the CMB, and a deeper understanding of the "quantum-to-classical" transition in the early universe that sets the stage for cosmic structure formation after inflation. An additional investigation will be a much deeper pursuit of the relationship between histories-type frameworks and Rovelli's so-called "relational observables", first revealed by work on the Wheeler-DeWitt model [VI].

This program is of fundamental importance because it is actively developing the foundation for an explicit, rigorous, internally consistent framework for extracting predictions from quantum gravitational theories in the early universe — theories for which, as noted above, an appeal to external classical measurements is simply not available since such theories are putatively capable of modeling the universe as a whole. Robust strategies will therefore be essential for making internally consistent sense of emerging candidates for a quantum theory of gravity, whatever that theory may ultimately be. Indeed, as we have shown, in the absence of a coherent predictive framework it is easy to be misled by a superficial reading of a theory's quantum amplitudes (see, e.g., [VI]). Something like a decoherent histories framework will ultimately prove to be an essential element of any fundamental quantum theory of gravity. The models worked out so far provide the template for future development of the theory and applications to more complex models.

While there are many individual threads comprising my research program, they have in common the overarching theme of understanding how to apply quantum mechanics to the universe as a whole.

Selected Publications

  1. Cosmological dynamics in spin-foam loop quantum cosmology: challenges and prospects [(with P. Singh) Class. Quantum Grav. 34 074001 (2017)]
  2. The consistent histories approach to loop quantum cosmology [Int. J. Mod. Phys. D25 1642009 (2016)]
  3. Consistent probabilities in loop quantum cosmology [(with P. Singh) Class. Quantum Grav. 30 205008 (2013)]
  4. Dynamical eigenfunctions and critical density in loop quantum cosmology [Class. Quantum Grav. 30 035010 (2013)]
  5. Consistent probabilities in Wheeler-DeWitt quantum cosmology [(with P. Singh) Phys. Rev. D 82 123526-123546 (2010)]
  6. A Bell inequality analog in quantum measure theory [(with F. Dowker, J. Henson, S. Major, D. Rideout, and R. Sorkin) J. Phys. A: Math. Theor. 40 501-523 (2007)]
  7. Generalized quantum theory of recollapsing homogeneous cosmologies [(with J.B. Hartle) Phys. Rev. D 69 123525-123547 (2004)]
  8. The geometry of consistency: decohering histories in generalized quantum theory
  9. Observation of the final boundary condition: extragalactic background radiation and the time symmetry of the universe [Ann. Phys. 251 384-425 (1996)]

You can find me on the arXiv and (somewhat) more completely at INSPIRE or Google Scholar — though the citation counts are a bit wonky. Further scientific entertainment and enlightenment may be found at xxx. Watch your step.


Last modified November 6, 2017 3:48 PM PST

The views and opinions expressed on this page are strictly those of the page author. The contents of this page have not been reviewed or approved by Oregon State University. © David A. Craig.