Bulletin of the American Physical Society
2006 APS March Meeting
Monday–Friday, March 13–17, 2006; Baltimore, MD
Session D40: Focus Session: Foundations of Quantum Theory
Sponsoring Units: TGQI DCMP
Chair: Daniel Greenberger, City College of the City University of New York
Room: Baltimore Convention Center 343
Monday, March 13, 2006 2:30PM - 3:06PM |
D40.00001: From Quantum Foundations to Quantum Gravity Invited Speaker: Quantum theory is a probabilistic theory with fixed causal structure. General relativity is a deterministic theory with dynamic causal structure. It seems likely then that a theory of quantum gravity will be a probabilistic theory with dynamic causal structure. Work in the foundations of quantum theory provides insight into how to build a framework for such theories. In this way we can hope that insights coming from quantum foundations can guide us in constructing a theory of quantum gravity. [Preview Abstract] |
Monday, March 13, 2006 3:06PM - 3:18PM |
D40.00002: Quantum Mechanics in Terms of Symmetric Measurements Christopher Fuchs In the neo-Bayesian view of quantum mechanics that Appleby, Caves, Pitowsky, Schack, the author, and others are developing, quantum states are taken to be compendia of partial beliefs about potential measurement outcomes, rather than objective properties of quantum systems. Different observers may validly have different quantum states for a single system, and the ultimate origin of each individual state assignment is taken to be unanalyzable within physical theory---its origin, instead, comes from prior probability assignments at stages of physical investigation or laboratory practice previous to quantum theory. The objective content of quantum mechanics thus resides somewhere else than in the quantum state, and various ideas for where that ``somewhere else'' is are presently under debate. What is overwhelmingly agreed upon in this effort is only the opening statement. Still, quantum states are not Bayesian probability assignments themselves, and different representations of the theory (in terms of state vectors or Wigner functions or C*-algebras, etc.) can take one further from or closer to a Bayesian point of view. It is thus worthwhile thinking about which representation might be the most propitious for this point of view and might quell some of the remaining debate. In this talk, I will present several results regarding a representation of quantum mechanics in terms of symmetric bases of positive-semidefinite operators. I will also argue why this is probably the most natural representation for a Bayesian-style quantum mechanics. [Preview Abstract] |
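The symmetric bases Fuchs refers to are known in the literature as symmetric informationally complete POVMs (SIC-POVMs). As an illustrative sketch in Python (not part of the talk; the tetrahedron coordinates are one conventional choice), the qubit case can be built and its defining symmetry checked directly:

```python
import numpy as np

# Qubit SIC-POVM: four subnormalized projectors E_k = (I + n_k . sigma) / 4
# whose Bloch vectors n_k form a regular tetrahedron.
sigma = [np.array([[0, 1], [1, 0]]),        # sigma_x
         np.array([[0, -1j], [1j, 0]]),     # sigma_y
         np.array([[1, 0], [0, -1]])]       # sigma_z

tetra = np.array([[1, 1, 1],
                  [1, -1, -1],
                  [-1, 1, -1],
                  [-1, -1, 1]]) / np.sqrt(3)

E = [(np.eye(2) + sum(n[i] * sigma[i] for i in range(3))) / 4 for n in tetra]

# The four elements resolve the identity ...
assert np.allclose(sum(E), np.eye(2))

# ... and are symmetric: every pairwise overlap Tr(E_j E_k), j != k, is equal.
overlaps = {round(np.trace(E[j] @ E[k]).real, 6)
            for j in range(4) for k in range(4) if j != k}
print(overlaps)   # a single value, 1/12
```

The equal pairwise overlap is what makes the basis "symmetric" in the sense of the title; the analogous structure in dimension d uses d^2 such operators.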
Monday, March 13, 2006 3:18PM - 3:30PM |
D40.00003: Influence-free states on compound quantum systems Howard Barnum, Christopher Fuchs, Joseph Renes, Alexander Wilce Probability states for bipartite local measurements and correlations between local measurements are considered, in general and when the local systems behave quantum-mechanically. We review the facts that in general allowing local measurements conditional on classically communicated results from the other site imposes no-signalling in the direction opposite to the communication, and that in the locally quantum case, two-way no-signalling restricts states to be in the dual of the cone of unentangled states, isomorphic to that of positive maps. We show that in the ``decomposable'' subcone, generated by quantum states and their partial transposes, the extremal quantum states and extremal partial transposes remain extremal. We also show that decomposable states do not violate Cirelson inequalities. We show that locally-quantum no-signalling states must be combined in a thoroughgoing no-signalling fashion. Thus Alice and Bob cannot consistently accumulate a sequence of independent states of this nature (as they might a supply of shared Bell states to use in entanglement distillation) while having available the full panoply of quantum observables and operations at their respective sites. The relation of no-signalling to the ``closest-to-Bayesian'' conditional quantum dynamics of C. Fuchs will also be touched on. [Preview Abstract] |
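The partial transpose that generates the "decomposable" subcone can be made concrete with a two-qubit example (an illustrative sketch, not from the talk): the partial transpose of a maximally entangled state has a negative eigenvalue, so it is not itself a quantum state, which is why adjoining partial transposes genuinely enlarges the cone.

```python
import numpy as np

# Partial transpose on the second qubit of the Bell state (|00> + |11>)/sqrt(2).
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(bell, bell)

# View rho with indices (row A, row B, col A, col B), then swap B's row/col.
pt = rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)

print(np.linalg.eigvalsh(pt))   # smallest eigenvalue is -1/2
```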
Monday, March 13, 2006 3:30PM - 3:42PM |
D40.00004: Liouville mechanics with an epistemic restriction and Bohr's response to EPR Terry Rudolph, Stephen Bartlett, Robert Spekkens We introduce a toy theory that reproduces a wide variety of qualitative features of quantum theory for degrees of freedom that are continuous. Specifically, we consider classical mechanics supplemented by a constraint on the amount of information an observer may have about the motional state (i.e. point in phase space) of a collection of classical particles -- Liouville mechanics with an epistemic restriction (This may well be how Heisenberg initially understood the Uncertainty Principle). We develop the formalism of the theory by deriving the consequences of this ``classical uncertainty principle'' on state preparations, measurements, and dynamics. The result is a theory of hidden variables, although it is not a hidden variable model of quantum theory because of its locality and noncontextuality. Despite admitting a simple classical interpretation, the theory also exhibits the operational features of Bohr's notion of complementarity. Indeed, it includes all of the features of quantum mechanics to which Bohr appeals in his response to EPR. This theory demonstrates, therefore, that Bohr's arguments fail as a defense of the completeness of quantum mechanics. [Preview Abstract] |
Monday, March 13, 2006 3:42PM - 3:54PM |
D40.00005: Negativity and contextuality are equivalent notions of nonclassicality Robert Spekkens An important problem in the foundations of quantum theory is the identification of the precise manner in which quantum theories differ from their classical counterparts. Two notions of nonclassicality that have been investigated intensively are: (1) negativity, that is, the necessity of negative values in real-valued representations of quantum states such as the Wigner representation, and (2) contextuality, that is, the impossibility of a hidden variable model of quantum theory wherein the representation of measurements is noncontextual (also known as the Bell-Kochen-Specker theorem). We shall argue for a particular way of generalizing and making precise both of these notions. With the refined versions of each in hand, it becomes apparent that they are in fact one and the same notion of nonclassicality. It follows that any proof of contextuality is also a proof of negativity and vice-versa. [Preview Abstract] |
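Notion (1) can be made concrete for a single qubit (an illustrative sketch using the standard Wootters-style discrete Wigner construction, not necessarily the generalization argued for in the talk): some states are assigned a negative quasi-probability at one of the four phase-space points.

```python
import numpy as np

# Phase-point operators A = (I + a*sx + b*sy + c*sz)/2 for a qubit, with the
# four sign patterns forming a tetrahedron.
sx = np.array([[0, 1], [1, 0]])
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]])
signs = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]
A = [(np.eye(2) + a * sx + b * sy + c * sz) / 2 for a, b, c in signs]

# State whose Bloch vector points opposite the (1,1,1) phase point.
n = -np.array([1, 1, 1]) / np.sqrt(3)
rho = (np.eye(2) + n[0] * sx + n[1] * sy + n[2] * sz) / 2

W = np.array([np.trace(rho @ Aq).real / 2 for Aq in A])
print(W.round(3))   # first entry is (1 - sqrt(3))/4 < 0, yet the four values sum to 1
```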
Monday, March 13, 2006 3:54PM - 4:06PM |
D40.00006: Epistemic Excess Baggage of Hidden Variable Theories Nicholas Harrigan, Terry Rudolph In Quantum Mechanics (QM) preparations of a system are represented by density operators acting on the associated Hilbert space. An ontological (`hidden' variable) model however, views preparations as being described by probability distributions (known as epistemic states) over a set of `hidden' variables. We investigate restrictions on the efficiency of any such model of QM through studying its preparation contextuality, a property that one can prove to be possessed by ontological models. This property implies the existence of cases wherein more than one distinct epistemic state must be associated with a single density operator in order to correctly reproduce QM predictions. Traditional proofs of preparation contextuality have exhibited scenarios in which it can only be seen that an ontological model must associate more than one epistemic state with some density operator, the exact number being uncertain. We investigate the existence of upper or lower bounds on the number of distinct epistemic states that an ontological model must associate with density operators in order to reproduce QM statistics. The bounds obtained are yet another clue as to how one might quantify the non-classical nature of QM. We provide some speculation on how these results may shed light on the difficulty of simulating quantum mechanical systems on a classical computer. [Preview Abstract] |
Monday, March 13, 2006 4:06PM - 4:18PM |
D40.00007: Pushing the Experimental Limits of Bell Inequalities Joseph Altepeter, Evan Jeffrey, Paul Kwiat Using pairs of polarization-entangled photons, we report measurements of Bell's inequalities near the limits of physically allowable violations. As there are several methods by which one can judge the significance of a violation, we report the largest violation to date measured in both standard deviations (2417-sigma) and absolute size (2.826 +/- 0.005). These extremely precise and extremely non-classical results were obtained by carefully characterizing each experimental loss and inefficiency. Unfortunately, accounting for these losses and inefficiencies in the system requires auxiliary assumptions, assumptions which strictly fail to exclude local hidden variable models, and therefore also fail to rigorously test local realism. We therefore additionally report on progress towards a ``loophole-free'' test of Bell's inequality, whereby these experimental losses and inefficiencies are virtually eliminated, and along with them, the need for auxiliary assumptions about the nature of the systems being measured. [Preview Abstract] |
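For context on the quoted numbers (a sketch of the textbook bound, not the authors' analysis): in the CHSH form of Bell's inequality, local hidden variable models obey S <= 2, while quantum mechanics allows up to 2*sqrt(2) ~ 2.828, the Tsirelson bound that the measured 2.826 +/- 0.005 approaches.

```python
import numpy as np

# CHSH correlation sum for polarization-entangled singlets at the optimal
# analyzer angles; E(a, b) = -cos(2(a - b)) is the ideal quantum prediction.
def E(a, b):
    return -np.cos(2 * (a - b))

a1, a2 = 0.0, np.pi / 4                 # Alice's settings
b1, b2 = np.pi / 8, 3 * np.pi / 8       # Bob's settings

S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(S)   # 2*sqrt(2) ~ 2.8284; any local hidden variable model obeys S <= 2
```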
Monday, March 13, 2006 4:18PM - 4:30PM |
D40.00008: Can Two-photon Correlation of Chaotic Light Be Considered as Correlation of Intensity Fluctuations? Giuliano Scarcelli, Vincenzo Berardi, Yanhua Shih Unlike first-order correlation, which is considered as a coherent effect of the electromagnetic field, the second-order correlation of radiation is considered as the classical statistical correlation of intensity fluctuations. The first second-order correlation experiment was demonstrated by Hanbury Brown and Twiss (HBT), stimulating a debate about the classical or quantum nature of the phenomenon. Although quantum models of the HBT experiment have been attempted, the classical statistical interpretation has been widely accepted. The concept of intensity fluctuation has even been extended to quantum models: ``photon bunching'' is a phenomenological extension to quantum theory of the statistical correlation of photon number fluctuations. We argue that two-photon correlation phenomena, including HBT, have to be understood as a two-photon coherent effect: quantum interference between two-photon probability amplitudes. To do so, we present a ``ghost'' imaging experiment with chaotic light to show that the classical understanding in terms of intensity fluctuations does not give a correct interpretation of the observation. From a practical point of view, this experiment shows the possibility of high-contrast lensless two-photon imaging with chaotic light, suggesting imaging applications with radiation for which no effective lens is available. [Preview Abstract] |
Monday, March 13, 2006 4:30PM - 4:42PM |
D40.00009: Predictability sieve, pointer states, and the classicality of quantum trajectories Diego Dalvit, Jacek Dziarmaga, Wojciech Zurek We study various measures of classicality of the states of open quantum systems subject to decoherence. Classical states are expected to be stable in spite of decoherence, and are thought to leave conspicuous imprints on the environment. Here these expected features of environment-induced superselection (einselection) are quantified using four different criteria: predictability sieve (which selects states that produce the least entropy), purification time (which looks for states that are the easiest to find out from the imprint they leave on the environment), efficiency threshold (which finds states that can be deduced from measurements on the smallest fraction of the environment), and purity loss time (which looks for states for which it takes the longest to lose a set fraction of their initial purity). We show that when pointer states -- the most predictable states of an open quantum system selected by the predictability sieve -- are well defined, all four criteria agree that they are indeed the most classical states. We illustrate this with two examples: an underdamped harmonic oscillator, for which coherent states are unanimously chosen by all criteria, and a free particle undergoing quantum Brownian motion, for which most criteria select almost identical Gaussian states (although, in this case, the predictability sieve does not select well-defined pointer states). Reference: D.A.R. Dalvit, J. Dziarmaga, and W.H. Zurek, Phys. Rev. A 72, 062101 (2005). [Preview Abstract] |
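A minimal caricature of the predictability sieve (purely illustrative; the paper's examples are the damped oscillator and quantum Brownian motion): for a qubit dephasing in the sigma_z basis, the sigma_z eigenstates retain full purity while superpositions do not, so the sieve, which ranks states by how little entropy they produce, selects them as pointer states.

```python
# Toy dephasing channel in the sigma_z basis: the Bloch vector's transverse
# components shrink by a factor d, and purity is Tr(rho^2) = (1 + |r|^2)/2.
def purity_after_dephasing(bloch, d):
    x, y, z = bloch
    x, y = d * x, d * y
    return (1 + x**2 + y**2 + z**2) / 2

d = 0.5
print(purity_after_dephasing((0, 0, 1), d))   # 1.0: sigma_z eigenstate, pointer state
print(purity_after_dephasing((1, 0, 0), d))   # 0.625: superposition loses purity
```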
Monday, March 13, 2006 4:42PM - 4:54PM |
D40.00010: Decoherence from Spin Environments Fernando Cucchietti, Juan Pablo Paz, Wojciech Zurek We examine two exactly solvable models of decoherence -- a central spin system (i) with and (ii) without a self-Hamiltonian, interacting with a collection of environment spins. In the absence of a self-Hamiltonian we show that in this model (introduced some time ago to illustrate environment-induced superselection) generic assumptions about the coupling strengths can lead to a universal (Gaussian) suppression of coherence between pointer states. On the other hand, we show that when the dynamics of the central spin is dominant a different regime emerges, which is characterized by a non-Gaussian decay and a dramatically different set of pointer states. We explore the regimes of validity of the Gaussian decay and discuss its relation to the spectral features of the environment and to the Loschmidt echo (or fidelity). [Preview Abstract] |
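The Gaussian suppression in the no-self-Hamiltonian case can be previewed numerically (a sketch under assumed random couplings, not the paper's general argument): with each environment spin in an equal superposition, the decoherence factor is a product of cosines, and a short-time expansion turns that product into a Gaussian.

```python
import numpy as np

# Decoherence factor r(t) = prod_k cos(g_k t) for a central spin coupled to
# 200 environment spins in equal superpositions (couplings g_k drawn at
# random here purely for illustration).
rng = np.random.default_rng(1)
g = rng.normal(0.0, 1.0, 200)
t = np.linspace(0.0, 0.3, 50)

r = np.prod(np.cos(np.outer(t, g)), axis=1)
gaussian = np.exp(-t**2 * np.sum(g**2) / 2)   # short-time (Gaussian) prediction

print(np.max(np.abs(r - gaussian)))   # small: the suppression is Gaussian
```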
Monday, March 13, 2006 4:54PM - 5:06PM |
D40.00011: The Afshar Experiment and Complementarity Ruth Kastner A modified version of Young's experiment by Shahriar Afshar demonstrates that, prior to what appears to be a ``which-way'' measurement, an interference pattern exists. Afshar has claimed that this result constitutes a violation of the Principle of Complementarity. This paper discusses the implications of this experiment and considers how Cramer's Transactional Interpretation easily accommodates the result. It is also shown that the Afshar experiment is isomorphic in key respects to a spin one-half particle prepared as ``spin up along x'' and post-selected in a specific state of spin along z. The terminology ``which way'' or ``which-slit'' is critiqued; it is argued that this usage by both Afshar and his critics is misleading and has contributed to confusion surrounding the interpretation of the experiment. Nevertheless, it is concluded that Bohr would have had no more problem accounting for the Afshar result than he would in accounting for the aforementioned pre- and post-selection spin experiment, in which the particle's preparation state is confirmed by a nondestructive measurement prior to post-selection. In addition, some new inferences about the interpretation of delayed choice experiments are drawn from the analysis. [Preview Abstract] |
Monday, March 13, 2006 5:06PM - 5:18PM |
D40.00012: Robust Weak Measurements Jeff Tollaksen, Yakir Aharonov We introduce a new type of weak measurement which yields a quantum average of weak values that is robust, outside the range of eigenvalues, extends the valid regime for weak measurements, and for which the probability of obtaining the pre- and post-selected ensemble is not exponentially rare. This result extends the applicability of weak values, shifts the statistical interpretation previously attributed to weak values and suggests that the weak value is a property of every pre- and post-selected ensemble. We then apply this new weak measurement to Hardy's paradox. Usually the paradox is dismissed on grounds of counterfactuality, i.e., because the paradoxical effects appear only when one considers results of experiments which do not actually take place. We suggest a new set of measurements in connection with Hardy's scheme, and show that when they are actually performed, they yield strange and surprising outcomes. More generally, we claim that counterfactual paradoxes point to a deeper structure inherent to quantum mechanics characterized by weak values (Aharonov Y, Botero A, Popescu S, Reznik B, Tollaksen J, Physics Letters A, 301 (3-4): 130-138, 2002). [Preview Abstract] |
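The weak value underlying the talk has a simple closed form, A_w = <phi|A|psi> / <phi|psi> for pre-selected state |psi> and post-selected state |phi>. A short numerical sketch (illustrative only, not the authors' robust-measurement scheme) shows how it escapes the eigenvalue range when the two states are nearly orthogonal:

```python
import numpy as np

# Weak value of sigma_z for nearly orthogonal pre- and post-selected states
# (alpha close to pi/4 makes the overlap <phi|psi> = cos(2*alpha) small).
sz = np.array([[1, 0], [0, -1]])
alpha = 0.7
psi = np.array([np.cos(alpha), np.sin(alpha)])    # pre-selected state
phi = np.array([np.cos(alpha), -np.sin(alpha)])   # post-selected state

A_w = (phi @ sz @ psi) / (phi @ psi)
print(A_w)   # 1/cos(1.4) ~ 5.88, far outside sigma_z's eigenvalue range [-1, 1]
```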
Monday, March 13, 2006 5:18PM - 5:30PM |
D40.00013: Quantum of Information Caslav Brukner, Anton Zeilinger The violation of local realism is today a well established experimental fact. From it follows that either locality or realism or both cannot provide a foundational basis of Nature. Relaxing the locality condition would essentially not change the epistemological structure of classical physics but only extend its limits. Abandonment of reality, however, would require a radical revision of the conceptual background of all our theories so far. Is a novel conceptual basis of quantum theory feasible, in which the impossibility of defining external reality independent and prior to observation naturally emerges? We suggest the finiteness of information content of a quantum system as providing such basis. Any realistic theory that could arrive at an accurate prediction of a particular event would require the system to carry information as to which specific result will be observed for all possible future measurements. Because the system cannot carry more information than is in principle available, there must exist measurements for which individual events contain an element of irreducible randomness. Quantum entanglement arises from the possibility that information in a composite system resides more in the correlations than in properties of individuals. In the talk we will report on recent efforts towards providing derivations of the elements of the Hilbert space structure from the quantization of information. [Preview Abstract] |