Preprints
  1. David Schmid, John H. Selby, Robert W. Spekkens
    Addressing some common objections to generalized noncontextuality
    [arXiv:2302.07282 (quant-ph)]

    When should a given operational phenomenology be deemed to admit of a classical explanation? When it can be realized in a generalized-noncontextual ontological model. The case for answering the question in this fashion has been made in many previous works, and motivates research on the notion of generalized noncontextuality. Many criticisms and concerns have been raised, however, regarding the definition of this notion and the possibility of testing it experimentally. In this work, we respond to some of the most common of these objections. One such objection is that the existence of a classical record of which laboratory procedure was actually performed in each run of an experiment implies that the operational equivalence relations that are a necessary ingredient of any proof of the failure of noncontextuality do not hold, and consequently that conclusions of nonclassicality based on these equivalences are mistaken. We explain why this concern is unfounded. Our response affords the opportunity for us to clarify certain facts about generalized noncontextuality, such as the possibility of having proofs of its failure based on a consideration of the subsystem structure of composite systems. Similarly, through our responses to each of the other objections, we elucidate some under-appreciated facts about the notion of generalized noncontextuality and experimental tests thereof.

  2. Lorenzo Catani, Matthew Leifer, David Schmid, Robert W. Spekkens
    Reply to "Comment on 'Why interference phenomena do not capture the essence of quantum theory'"
    [arXiv:2207.11791 (quant-ph)]

    Our article [arXiv:2111.13727 (2021)] argues that the phenomenology of interference that is traditionally regarded as problematic does not, in fact, capture the essence of quantum theory -- contrary to the claims of Feynman and many others. It does so by demonstrating the existence of a physical theory, which we term the "toy field theory", that reproduces this phenomenology but which does not sacrifice the classical worldview. In their Comment [arXiv:2204.01768 (2022)], Hance and Hossenfelder dispute our claim. Correcting mistaken claims found therein and responding to their criticisms provides us with an opportunity to further clarify some of the ideas in our article.

  3. David Schmid, John H. Selby, and Robert W. Spekkens
    Unscrambling the omelette of causation and inference: The framework of causal-inferential theories
    [arXiv:2009.03297 (quant-ph)]

    Using a process-theoretic formalism, we introduce the notion of a causal-inferential theory: a triple consisting of a theory of causal influences, a theory of inferences (of both the Boolean and Bayesian varieties), and a specification of how these interact. Recasting the notions of operational and realist theories in this mold clarifies what a realist account of an experiment offers beyond an operational account. It also yields a novel characterization of the assumptions and implications of standard no-go theorems for realist representations of operational quantum theory, namely, those based on Bell's notion of locality and those based on generalized noncontextuality. Moreover, our process-theoretic characterization of generalized noncontextuality is shown to be implied by an even more natural principle which we term Leibnizianity. Most strikingly, our framework offers a way forward in a research program that seeks to circumvent these no-go results. Specifically, we argue that if one can identify axioms for a realist causal-inferential theory such that the notions of causation and inference can differ from their conventional (classical) interpretations, then one has the means of defining an intrinsically quantum notion of realism, and thereby a realist representation of operational quantum theory that salvages the spirit of locality and of noncontextuality.

  4. David Schmid, John H. Selby, Matthew F. Pusey, and Robert W. Spekkens
    A structure theorem for generalized-noncontextual ontological models
    [arXiv:2005.07161 (quant-ph)]

    It is useful to have a criterion for when the predictions of an operational theory should be considered classically explainable. Here we take the criterion to be that the theory admits of a generalized-noncontextual ontological model. Existing works on generalized noncontextuality have focused on experimental scenarios having a simple structure, typically, prepare-measure scenarios. Here, we formally extend the framework of ontological models as well as the principle of generalized noncontextuality to arbitrary compositional scenarios. We leverage this process-theoretic framework to prove that, under some reasonable assumptions, every generalized-noncontextual ontological model of a tomographically local operational theory has a surprisingly rigid and simple mathematical structure; in short, it corresponds to a frame representation which is not overcomplete. One consequence of this theorem is that the largest number of ontic states possible in any such model is given by the dimension of the associated generalized probabilistic theory. This constraint is useful for generating noncontextuality no-go theorems as well as techniques for experimentally certifying contextuality. Along the way, we extend known results concerning the equivalence of different notions of classicality from prepare-measure scenarios to arbitrary compositional scenarios. Specifically, we prove a correspondence between the following three notions of classical explainability of an operational theory: (i) admitting a noncontextual ontological model, (ii) admitting of a positive quasiprobability representation, and (iii) being simplex-embeddable.

  5. Robert W. Spekkens
    The ontological identity of empirical indiscernibles: Leibniz's methodological principle and its significance in the work of Einstein
    [arXiv:1909.04628 (physics.hist-ph)]

    This article explores the following methodological principle for theory construction in physics: if an ontological theory predicts two scenarios that are ontologically distinct but empirically indiscernible, then this theory should be rejected and replaced by one relative to which the scenarios are ontologically the same. I defend the thesis that this methodological principle was first articulated by Leibniz as a version of his principle of the identity of indiscernibles, and that it was applied repeatedly to great effect by Einstein in his development of the special and general theories of relativity. I argue for an interpretation of the principle as an inference to the best explanation, defend it against some criticisms, discuss its potential applications in modern physics, and explain how it provides an attractive middle ground in the debate between empiricist and realist philosophies of science.

  6. Iman Marvian and Robert W. Spekkens
    An information-theoretic account of the Wigner-Araki-Yanase theorem
    [arXiv:1212.3378 (quant-ph)]

    The Wigner-Araki-Yanase (WAY) theorem can be understood as a result in the resource theory of asymmetry asserting the impossibility of perfectly simulating, via symmetric processing, the measurement of an asymmetric observable unless one has access to a state that is perfectly asymmetric, that is, one whose orbit under the group action is a set of orthogonal states. The simulation problem can be characterized information-theoretically by considering how well both the target observable and the resource state can provide an encoding of an element of the symmetry group. Leveraging this information-theoretic perspective, we show that the WAY theorem is a consequence of the no-programming theorem for projective measurements. The connection allows us to clarify the conceptual content of the theorem and to deduce some interesting generalizations.

Articles
  1. Lorenzo Catani, Matthew Leifer, Giovanni Scala, David Schmid, Robert W. Spekkens
    Aspects of the phenomenology of interference that are genuinely nonclassical
    Phys. Rev. A 108, 022207 (2023)
    [arXiv:2211.09850 (quant-ph)]

    Interference phenomena are often claimed to resist classical explanation. However, such claims are undermined by the fact that the specific aspects of the phenomenology upon which they are based can in fact be reproduced in a noncontextual ontological model [Catani et al., arXiv:2111.13727]. This raises the question of what other aspects of the phenomenology of interference do in fact resist classical explanation. We answer this question by demonstrating that the most basic quantum wave-particle duality relation, which expresses the precise tradeoff between path distinguishability and fringe visibility, cannot be reproduced in any noncontextual model. We do this by showing that it is a specific type of uncertainty relation and then leveraging a recent result establishing that noncontextuality restricts the functional form of this uncertainty relation [Catani et al., Phys. Rev. Lett. 129, 240401 (2022)]. Finally, we discuss what sorts of interferometric experiment can demonstrate contextuality via the wave-particle duality relation.

  2. Polino, Poderini, Rodari, Agresti, Suprano, Carvacho, Wolfe, Canabarro, Moreno, Milani, Spekkens, Chaves, Sciarrino
    Experimental nonclassicality in a causal network without assuming freedom of choice
    Nature Communications 14, 909 (2023)
    [arXiv:2210.07263 (quant-ph)]

    In a Bell experiment, it is natural to seek a causal account of correlations wherein only a common cause acts on the outcomes. For this causal structure, Bell inequality violations can be explained only if causal dependencies are modelled as intrinsically quantum. There also exists a vast landscape of causal structures beyond Bell that can witness nonclassicality, in some cases without even requiring free external inputs. Here, we undertake a photonic experiment realizing one such example: the triangle causal network, consisting of three measurement stations pairwise connected by common causes and no external inputs. To demonstrate the nonclassicality of the data, we adapt and improve three known techniques: (i) a machine-learning-based heuristic test, (ii) a data-seeded inflation technique generating polynomial Bell-type inequalities, and (iii) entropic inequalities. The demonstrated experimental and data analysis tools are broadly applicable, paving the way for future networks of growing complexity.

  3. Lorenzo Catani, Matthew Leifer, Giovanni Scala, David Schmid, Robert W. Spekkens
    What is nonclassical about uncertainty relations?
    Phys. Rev. Lett. 129, 240401 (2022)
    [arXiv:2207.11779 (quant-ph)]

    Uncertainty relations express limits on the extent to which the outcomes of distinct measurements on a single state can be made jointly predictable. The existence of nontrivial uncertainty relations in quantum theory is generally considered to be a way in which it entails a departure from the classical worldview. However, this perspective is undermined by the fact that there exist operational theories which exhibit nontrivial uncertainty relations but which are consistent with the classical worldview insofar as they admit of a generalized-noncontextual ontological model. This prompts the question of what aspects of uncertainty relations, if any, cannot be realized in this way and so constitute evidence of genuine nonclassicality. We here consider uncertainty relations describing the tradeoff between the predictability of a pair of binary-outcome measurements (e.g., measurements of Pauli X and Pauli Z observables in quantum theory). We show that, for a class of theories satisfying a particular symmetry property, the functional form of this predictability tradeoff is constrained by noncontextuality to be below a linear curve. Because qubit quantum theory has the relevant symmetry property, the fact that its predictability tradeoff describes a section of a circle is a violation of this noncontextual bound, and therefore constitutes an example of how the functional form of an uncertainty relation can witness contextuality. We also deduce the implications for a selected group of operational foils to quantum theory and consider the generalization to three measurements.

  4. John H. Selby, David Schmid, Elie Wolfe, Ana Belén Sainz, Ravi Kunjwal, Robert W. Spekkens
    Accessible fragments of generalized probabilistic theories, cone equivalence, and applications to witnessing nonclassicality
    Phys. Rev. A 107, 062203 (2023)
    [arXiv:2112.04521 (quant-ph)]

    The formalism of generalized probabilistic theories (GPTs) was originally developed as a way to characterize the landscape of conceivable physical theories. Thus, the GPT describing a given physical theory necessarily includes all physically possible processes. We here consider the question of how to provide a GPT-like characterization of a particular experimental setup within a given physical theory. We show that the resulting characterization is not generally a GPT in and of itself; rather, it is described by a more general mathematical object that we introduce and term an accessible GPT fragment. We then introduce an equivalence relation, termed cone equivalence, between accessible GPT fragments (and, as a special case, between standard GPTs). We give a number of examples of experimental scenarios that are best described using accessible GPT fragments, and where moreover cone equivalence arises naturally. We then prove that an accessible GPT fragment admits of a classical explanation if and only if every other fragment that is cone-equivalent to it also admits of a classical explanation. Finally, we leverage this result to prove several fundamental results regarding the experimental requirements for witnessing the failure of generalized noncontextuality. In particular, we prove that neither incompatibility among measurements nor the assumption of freedom of choice is necessary for witnessing failures of generalized noncontextuality, and, moreover, that such failures can be witnessed even using arbitrarily inefficient detectors.

  5. Lorenzo Catani, Matthew Leifer, David Schmid, Robert W. Spekkens
    Why interference phenomena do not capture the essence of quantum theory
    Quantum 7, 1119 (2023)
    [arXiv:2111.13727 (quant-ph)]

    Quantum interference phenomena are widely viewed as posing a challenge to the classical worldview. Feynman even went so far as to proclaim that they are the only mystery and the basic peculiarity of quantum mechanics. Many have also argued that such phenomena force us to accept a number of radical interpretational conclusions, including: that a photon is neither a particle nor a wave but rather a schizophrenic sort of entity that toggles between the two possibilities, that reality is observer-dependent, and that systems either do not have properties prior to measurements or else have properties that are subject to nonlocal or backwards-in-time causal influences. In this work, we show that such conclusions are not, in fact, forced on us by the phenomena. We do so by describing an alternative to quantum theory, a statistical theory of a classical discrete field (the "toy field theory") that reproduces the relevant phenomenology of quantum interference while rejecting these radical interpretational claims. It also reproduces a number of related interference experiments that are thought to support these interpretational claims, such as the Elitzur-Vaidman bomb tester, Wheeler's delayed-choice experiment, and the quantum eraser experiment. The systems in the toy field theory are field modes, each of which possesses, at all times, both a particle-like property (a discrete occupation number) and a wave-like property (a discrete phase). Although these two properties are jointly possessed, the theory stipulates that they cannot be jointly known. The phenomenology that is generally cited in favour of nonlocal or backwards-in-time causal influences ends up being explained in terms of inferences about distant or past systems, and all that is observer-dependent is the observer's knowledge of reality, not reality itself.

  6. Beata Zjawin, Elie Wolfe, Robert W. Spekkens
    Restricted Hidden Cardinality Constraints in Causal Models
    Electronic Proceedings in Theoretical Computer Science 343, 119 (2021)
    [arXiv:2109.05656 (math.ST)]

    Causal models with unobserved variables impose nontrivial constraints on the distributions over the observed variables. When a common cause of two variables is unobserved, it is impossible to uncover the causal relation between them without making additional assumptions about the model. In this work, we consider causal models with a promise that unobserved variables have known cardinalities. We derive inequality constraints implied by d-separation in such models. Moreover, we explore the possibility of leveraging this result to study causal influence in models that involve quantum systems.

  7. Michael Grabowecky, Christopher Pollack, Andrew Cameron, Robert Spekkens, Kevin Resch
    Experimentally bounding deviations from quantum theory for a photonic three-level system using theory-agnostic tomography
    Phys. Rev. A 105, 032204 (2022)
    [arXiv:2108.05864 (quant-ph)]

    If one seeks to test quantum theory against many alternatives in a landscape of possible physical theories, then it is crucial to be able to analyze experimental data in a theory-agnostic way. This can be achieved using the framework of generalized probabilistic theories (GPTs). Here we implement GPT tomography on a three-level system corresponding to a single photon shared among three modes. This scheme achieves a GPT characterization of each of the preparations and measurements implemented in the experiment without requiring any prior characterization of either. Assuming that the sets of realized preparations and measurements are tomographically complete, our analysis identifies the most likely dimension of the GPT vector space describing the three-level system to be nine, in agreement with the value predicted by quantum theory. Relative to this dimension, we infer the scope of GPTs that are consistent with our experimental data by identifying polytopes that provide inner and outer bounds for the state and effect spaces of the true GPT. From these, we are able to determine quantitative bounds on possible deviations from quantum theory. In particular, we bound the degree to which the no-restriction hypothesis might be violated for our three-level system.

  8. Patrick J. Daley, Kevin J. Resch, Robert W. Spekkens
    Experimentally adjudicating between different causal accounts of Bell inequality violations via statistical model selection
    Phys. Rev. A 105, 042220 (2022)
    [arXiv:2108.00053 (quant-ph)]

    Bell inequalities follow from a set of seemingly natural assumptions about how to provide a causal model of a Bell experiment. In the face of their violation, two types of causal models that modify some of these assumptions have been proposed: (1) those that are parametrically conservative and structurally radical, such as models where the parameters are conditional probability distributions (termed “classical causal models”) but where one posits interlab causal influences or superdeterminism, and (2) those that are parametrically radical and structurally conservative, such as models where the labs are taken to be connected only by a common cause but where conditional probabilities are replaced by conditional density operators (these are termed “quantum causal models”). We here seek to adjudicate between these alternatives based on their predictive power. The data from a Bell experiment are divided into a training set and a test set, and for each causal model, the parameters that yield the best fit for the training set are estimated and then used to make predictions about the test set. Our main result is that the structurally radical classical causal models are disfavored relative to the structurally conservative quantum causal model. Their lower predictive power seems to be due to their tendency to mistake statistical fluctuations away from the no-signaling condition for real features and thereby overfit the data. Our technique shows that it is possible to witness quantumness even in a Bell experiment that does not close the locality loophole. It also overturns the notion that it is impossible to experimentally test the plausibility of superdeterminist models of Bell inequality violations.

  9. Noam Finkelstein, Beata Zjawin, Elie Wolfe, Ilya Shpitser, Robert W. Spekkens
    Entropic Inequality Constraints from e-separation Relations in Directed Acyclic Graphs with Hidden Variables
    Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, Proceedings of Machine Learning Research 161, 1045 (2021)
    [arXiv:2107.07087 (stat)]

    Directed acyclic graphs (DAGs) with hidden variables are often used to characterize causal relations between variables in a system. When some variables are unobserved, DAGs imply a notoriously complicated set of constraints on the distribution of observed variables. In this work, we present entropic inequality constraints that are implied by e-separation relations in hidden variable DAGs with discrete observed variables. The constraints can intuitively be understood to follow from the fact that the capacity of variables along a causal pathway to convey information is restricted by their entropy; e.g., in the extreme case, a variable with entropy 0 can convey no information. We show how these constraints can be used to learn about the true causal model from an observed data distribution. In addition, we propose a measure of causal influence called the minimal mediary entropy, and demonstrate that it can augment traditional measures such as the average causal effect.

  10. John H. Selby, David Schmid, Elie Wolfe, Ana Belen Sainz, Ravi Kunjwal, Robert W. Spekkens
    Contextuality without incompatibility
    Phys. Rev. Lett. 130, 230201 (2023)
    [arXiv:2106.09045 (quant-ph)]

    The existence of incompatible measurements is often believed to be a feature of quantum theory which signals its inconsistency with any classical worldview. To prove the failure of classicality in the sense of Kochen-Specker noncontextuality, one does indeed require sets of incompatible measurements. However, a more broadly applicable and more permissive notion of classicality is the existence of a generalized-noncontextual ontological model. In particular, this notion can imply constraints on the representation of outcomes even within a single nonprojective measurement. We leverage this fact to demonstrate that measurement incompatibility is neither necessary nor sufficient for proofs of the failure of generalized noncontextuality. Furthermore, we show that every proof of the failure of generalized noncontextuality in a prepare-measure scenario can be converted into a proof of the failure of generalized noncontextuality in a corresponding scenario with no incompatible measurements.

  11. R. Chaves, G. Moreno, E. Polino, D. Poderini, I. Agresti, A. Suprano, M. R. Barros, G. Carvacho, E. Wolfe, A. Canabarro, R. W. Spekkens, F. Sciarrino
    Causal networks and freedom of choice in Bell's theorem
    PRX Quantum 2, 040323 (2021)
    [arXiv:2105.05721 (quant-ph)]

    Bell's theorem is typically understood as the proof that quantum theory is incompatible with local hidden variable models. More generally, we can see the violation of a Bell inequality as witnessing the impossibility of explaining quantum correlations with classical causal models. The violation of a Bell inequality, however, does not exclude classical models where some level of measurement dependence is allowed, that is, the choice made by observers can be correlated with the source generating the systems to be measured. Here we show that the level of measurement dependence can be quantitatively upper bounded if we arrange the Bell test within a network. Furthermore, we also prove that these results can be adapted in order to derive non-linear Bell inequalities for a large class of causal networks and to identify quantumly realizable correlations which violate them.

  12. David Schmid, Thomas C. Fraser, Ravi Kunjwal, Ana Belen Sainz, Elie Wolfe, Robert W. Spekkens
    Understanding the interplay of entanglement and nonlocality: motivating and developing a new branch of entanglement theory
    Quantum 7, 1194 (2023)
    [arXiv:2004.09194 (quant-ph)]

    A standard approach to quantifying resources is to determine which operations on the resources are freely available, and to deduce the partial order over resources that is induced by the relation of convertibility under the free operations. If the resource of interest is the nonclassicality of the correlations embodied in a quantum state, i.e., entanglement, then the common assumption is that the appropriate choice of free operations is Local Operations and Classical Communication (LOCC). We here advocate for the study of a different choice of free operations, namely, Local Operations and Shared Randomness (LOSR), and demonstrate its utility in understanding the interplay between the entanglement of states and the nonlocality of the correlations in Bell experiments. Specifically, we show that the LOSR paradigm (i) provides a resolution of the anomalies of nonlocality, wherein partially entangled states exhibit more nonlocality than maximally entangled states, (ii) entails new notions of genuine multipartite entanglement and nonlocality that are free of the pathological features of the conventional notions, and (iii) makes possible a resource-theoretic account of the self-testing of entangled states which generalizes and simplifies prior results. Along the way, we derive some fundamental results concerning the necessary and sufficient conditions for convertibility between pure entangled states under LOSR and highlight some of their consequences, such as the impossibility of catalysis for bipartite pure states. The resource-theoretic perspective also clarifies why it is neither surprising nor problematic that there are mixed entangled states which do not violate any Bell inequality. Our results motivate the study of LOSR-entanglement as a new branch of entanglement theory.

  13. Tomáš Gonda, Robert W. Spekkens
    Monotones in General Resource Theories
    Compositionality 5, issue 7 (2023)
    [arXiv:1912.07085 (quant-ph)]

    A central problem in the study of resource theories is to find functions that are nonincreasing under resource conversions, termed monotones, in order to quantify resourcefulness. Various constructions of monotones appear in many different concrete resource theories. How general are these constructions? What are the necessary conditions on a resource theory for a given construction to be applicable? To answer these questions, we introduce a broad scheme for constructing monotones. It involves finding an order-preserving map from the preorder of resources of interest to a distinct preorder for which nontrivial monotones are previously known or can be more easily constructed; these monotones are then pulled back through the map. In one of the two main classes we study, the preorder of resources is mapped to a preorder of sets of resources, where the order relation is set inclusion, such that monotones can be defined via maximizing or minimizing the value of a function within these sets. In the other class, the preorder of resources is mapped to a preorder of tuples of resources, and one pulls back monotones that measure the amount of distinguishability of the different elements of the tuple (hence its information content). Monotones based on contractions arise naturally in the latter class, and, more surprisingly, so do weight and robustness measures. In addition to capturing many standard monotone constructions, our scheme also suggests significant generalizations of these. In order to properly capture the breadth of applicability of our results, we present them within a novel abstract framework for resource theories in which the notion of composition is independent of the types of the resources involved (i.e., whether they are states, channels, combs, etc.).

  14. David Schmid, John Selby, Elie Wolfe, Ravi Kunjwal, Robert W. Spekkens
    The Characterization of Noncontextuality in the Framework of Generalized Probabilistic Theories
    PRX Quantum 2, 010331 (2021)
    [arXiv:1911.10386 (quant-ph)]

    To make precise the sense in which the operational predictions of quantum theory conflict with a classical worldview, it is necessary to articulate a notion of classicality within an operational framework. A widely applicable notion of classicality of this sort is whether or not the predictions of a given operational theory can be explained by a generalized-noncontextual ontological model. We here explore what notion of classicality this implies for the generalized probabilistic theory (GPT) that arises from a given operational theory, focusing on prepare-measure scenarios. We first show that, when mapping an operational theory to a GPT by quotienting relative to operational equivalences, the constraint of explainability by a generalized-noncontextual ontological model is mapped to the constraint of explainability by an ontological model. We then show that, under the additional assumption that the ontic state space is of finite cardinality, this constraint on the GPT can be expressed as a geometric condition which we term simplex embeddability. Whereas the traditional notion of classicality for a GPT is that its state space be a simplex and its effect space be the dual of this simplex, simplex embeddability merely requires that its state space be embeddable in a simplex and its effect space in the dual of that simplex. We argue that simplex embeddability constitutes an intuitive and freestanding notion of classicality for GPTs. Our result also has applications to witnessing nonclassicality in prepare-measure experiments.

  15. Elie Wolfe, David Schmid, Ana Belen Sainz, Ravi Kunjwal, Robert W. Spekkens
    Quantifying Bell: the Resource Theory of Nonclassicality of Common-Cause Boxes
    Quantum 4, 280 (2020)
    [arXiv:1903.06311 (quant-ph)]

    We take a resource-theoretic approach to the problem of quantifying nonclassicality in Bell scenarios. The resources are conceptualized as probabilistic processes from the setting variables to the outcome variables having a particular causal structure, namely, one wherein the wings are only connected by a common cause. We term them "common-cause boxes". We define the distinction between classical and nonclassical resources in terms of whether or not a classical causal model can explain the correlations. One can then quantify the relative nonclassicality of resources by considering their interconvertibility relative to the set of operations that can be implemented using a classical common cause (which correspond to local operations and shared randomness). We prove that the set of free operations forms a polytope, which in turn allows us to derive an efficient algorithm for deciding whether one resource can be converted to another. We moreover define two distinct monotones with simple closed-form expressions in the two-party binary-setting binary-outcome scenario, and use these to reveal various properties of the pre-order of resources, including a lower bound on the cardinality of any complete set of monotones. In particular, we show that the information contained in the degrees of violation of facet-defining Bell inequalities is not sufficient for quantifying nonclassicality, even though it is sufficient for witnessing nonclassicality. Finally, we show that the continuous set of convexly extremal quantumly realizable correlations are all at the top of the pre-order of quantumly realizable correlations. In addition to providing new insights on Bell nonclassicality, our work also sets the stage for quantifying nonclassicality in more general causal networks.

  16. Iman Marvian, Robert W. Spekkens
    A no-broadcasting theorem for quantum asymmetry and coherence and a trade-off relation for approximate broadcasting
    Phys. Rev. Lett. 123, 020404 (2019)
    [arXiv:1812.08766 (quant-ph)]

    Symmetries of both closed- and open-system dynamics imply many significant constraints. These generally have instantiations in both classical and quantum dynamics (Noether’s theorem, for instance, applies to both sorts of dynamics). We here provide an example of such a constraint which has no counterpart for a classical system, that is, a uniquely quantum consequence of symmetric dynamics. Specifically, we demonstrate the impossibility of broadcasting asymmetry (symmetry breaking) relative to a continuous symmetry group, for bounded-size quantum systems. The no-go theorem states that if two initially uncorrelated systems interact by symmetric dynamics and asymmetry is created at one subsystem, then the asymmetry of the other subsystem must be reduced. We also find a quantitative relation describing the trade-off between the subsystems. These results cannot be understood in terms of additivity of asymmetry, because, as we show here, any faithful measure of asymmetry violates both subadditivity and superadditivity. Rather, they must be understood as a consequence of an (intrinsically quantum) information-disturbance principle. Our result also implies that if a bounded-size quantum reference frame for the symmetry group, or equivalently, a bounded-size reservoir of coherence (e.g., a clock with coherence between energy eigenstates in quantum thermodynamics) is used to implement any operation that is not symmetric, then the quantum state of the frame or reservoir is necessarily disturbed in an irreversible fashion, i.e., degraded.

  17. David Schmid, Katja Ried, and Robert W. Spekkens
    Why initial system-environment correlations do not imply the failure of complete positivity: a causal perspective
    Phys. Rev. A 100, 022112 (2019)
    [arXiv:1806.02381 (quant-ph)]

    The common wisdom in the field of quantum information theory is that when a system is initially correlated with its environment, the map describing its evolution may fail to be completely positive. If true, this would have practical and foundational significance. We here demonstrate, however, that the common wisdom is mistaken. We trace the error to the standard argument for how the evolution map ought to be defined. We show that it sometimes fails to define a linear map or any map at all and that these pathologies persist even in completely classical examples. Drawing inspiration from the framework of classical causal models, we argue that the correct definition of the evolution map is obtained by considering a counterfactual scenario wherein the system is reprepared independently of any systems in its causal past while the rest of the circuit remains the same, yielding a map that is always completely positive. In a post-mortem on the standard argument, we highlight two distinct mistakes that retrospectively become evident (in its application to completely classical examples): (i) the types of constraints to which it appealed are constraints on what one can infer about the final state of a system based on its initial state, where such inferences are based not just on the cause-effect relation between them-which defines the correct evolution map-but also on the common cause of the two; (ii) in a (retrospectively unnecessary) attempt to introduce variability in the input state, it inadvertently introduced variability in the inference map itself, then tried to fit the input-output pairs associated to these different maps with a single map.

  18. David Schmid, Robert W. Spekkens, Elie Wolfe
    All the noncontextuality inequalities for arbitrary prepare-and-measure experiments with respect to any fixed sets of operational equivalences
    Phys. Rev. A 97, 062103 (2018)
    [arXiv:1710.08434 (quant-ph)]

    Within the framework of generalized noncontextuality, we introduce a general technique for systematically deriving noncontextuality inequalities for any experiment involving finitely many preparations and finitely many measurements, each of which has a finite number of outcomes. Given any fixed sets of operational equivalences among the preparations and among the measurements as input, the algorithm returns a set of noncontextuality inequalities whose satisfaction is necessary and sufficient for a set of operational data to admit of a noncontextual model. Additionally, we show that the space of noncontextual data tables always defines a polytope. Finally, we provide a computationally efficient means for testing whether any set of numerical data admits of a noncontextual model, with respect to any fixed operational equivalences. Together, these techniques provide complete methods for characterizing arbitrary noncontextuality scenarios, both in theory and in practice.
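    At its core, the decision procedure described above reduces to testing whether a data table lies inside a polytope; the paper does this with linear programming. The following is a minimal stdlib-only sketch of just that membership step, using exact rational arithmetic and Carathéodory's theorem in place of an LP solver; the names solve3 and in_hull_2d, and the unit-square toy polytope, are illustrative and not from the paper.

```python
from fractions import Fraction
from itertools import combinations

def solve3(A, b):
    """Solve a 3x3 rational linear system by Gauss-Jordan elimination.
    Returns None if the system is singular."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = next((r for r in range(col, 3) if M[r][col] != 0), None)
        if piv is None:
            return None
        M[col], M[piv] = M[piv], M[col]
        M[col] = [x / M[col][col] for x in M[col]]
        for r in range(3):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[r][3] for r in range(3)]

def in_hull_2d(point, vertices):
    """Exact membership test for a 2D point in conv(vertices).
    By Caratheodory's theorem, the point is in the hull iff it is a
    convex combination of some 3 vertices (assuming no degenerate,
    i.e. collinear, triples are needed)."""
    for (x1, y1), (x2, y2), (x3, y3) in combinations(vertices, 3):
        lam = solve3([[x1, x2, x3], [y1, y2, y3], [1, 1, 1]],
                     [point[0], point[1], 1])
        if lam is not None and all(l >= 0 for l in lam):
            return True
    return False

F = Fraction
square = [(F(0), F(0)), (F(1), F(0)), (F(0), F(1)), (F(1), F(1))]
assert in_hull_2d((F(1, 2), F(1, 2)), square)      # inside: admits a model
assert not in_hull_2d((F(6, 5), F(1, 2)), square)  # outside: no model
```

    A real implementation would replace the triple enumeration with an LP feasibility call, which scales to the high-dimensional data-table polytopes the paper considers.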

  19. Michael D. Mazurek, Matthew F. Pusey, Kevin J. Resch, Robert W. Spekkens
    Experimentally bounding deviations from quantum theory in the landscape of generalized probabilistic theories
    PRX Quantum 2, 020302 (2021)
    [arXiv:1710.05948 (quant-ph)]

    Many experiments in the field of quantum foundations seek to adjudicate between quantum theory and speculative alternatives to it. This requires one to analyze the experimental data in a manner that does not presume the correctness of the quantum formalism. The mathematical framework of generalized probabilistic theories (GPTs) provides a means of doing so. We present a scheme for determining which GPTs are consistent with a given set of experimental data. It proceeds by performing tomography on the preparations and measurements in a self-consistent manner, i.e., without presuming a prior characterization of either. We illustrate the scheme by analyzing experimental data for a large set of preparations and measurements on the polarization degree of freedom of a single photon. We first test various hypotheses for the dimension of the GPT vector space for this degree of freedom. Our analysis identifies the most plausible hypothesis to be dimension 4, which is the value predicted by quantum theory. Under this hypothesis, we can draw the following additional conclusions from our scheme: (i) that the smallest and largest GPT state spaces that could describe photon polarization are a pair of polytopes, each approximating the shape of the Bloch sphere and having a volume ratio of 0.977±0.001, which provides a quantitative bound on the scope for deviations from the state and effect spaces predicted by quantum theory, and (ii) that the maximal violation of the Clauser, Horne, Shimony, and Holt inequality can be at most (1.3±0.1)% greater than the maximum violation allowed by quantum theory, and the maximal violation of a particular inequality for universal noncontextuality cannot differ from the quantum prediction by more than this factor on either side. The only possibility for a greater deviation from the quantum state and effect spaces, or for greater degrees of supraquantum nonlocality or contextuality, according to our analysis, is if a future experiment (perhaps following the scheme developed here) discovers that additional dimensions of GPT vector space are required to describe photon polarization, in excess of the four dimensions predicted by quantum theory to be adequate to the task.

  20. Ravi Kunjwal and Robert W. Spekkens
    From statistical proofs of the Kochen-Specker theorem to noise-robust noncontextuality inequalities
    Phys. Rev. A 97, 052110 (2018)
    [arXiv:1708.04793 (quant-ph)]

    The Kochen-Specker theorem rules out models of quantum theory wherein sharp measurements are assigned outcomes deterministically and independently of context. This notion of noncontextuality is not applicable to experimental measurements because these are never free of noise and thus never truly sharp. For unsharp measurements, therefore, one must drop the requirement that an outcome is assigned deterministically in the model and merely require that the distribution over outcomes that is assigned in the model is context-independent. By demanding context-independence in the representation of preparations as well, one obtains a generalized principle of noncontextuality that also supports a quantum no-go theorem. Several recent works have shown how to derive inequalities on experimental data which, if violated, demonstrate the impossibility of finding a generalized-noncontextual model of this data. That is, these inequalities do not presume quantum theory and, in particular, they make sense without requiring a notion of "sharpness" of measurements in any operational theory describing the experiment. We here describe a technique for deriving such inequalities starting from arbitrary proofs of the Kochen-Specker theorem. It extends significantly previous techniques, which worked only for logical proofs (based on uncolourable orthogonality graphs), to the case of statistical proofs (where the graphs are colourable, but the colourings cannot explain the quantum statistics). The derived inequalities are robust to noise.

  21. David Schmid and Robert W. Spekkens
    Contextual advantage for state discrimination
    Phys. Rev. X 8, 011015 (2018)
    [arXiv:1706.04588 (quant-ph)]

    Finding quantitative aspects of quantum phenomena which cannot be explained by any classical model has foundational importance for understanding the boundary between classical and quantum theory. It also has practical significance for identifying information processing tasks for which those phenomena provide a quantum advantage. Using the framework of generalized noncontextuality as our notion of classicality, we find one such nonclassical feature within the phenomenology of quantum minimum error state discrimination. Namely, we identify quantitative limits on the success probability for minimum error state discrimination in any experiment described by a noncontextual ontological model. These constraints constitute noncontextuality inequalities that are violated by quantum theory, and this violation implies a quantum advantage for state discrimination relative to noncontextual models. Furthermore, our noncontextuality inequalities are robust to noise and are operationally formulated, so that any experimental violation of the inequalities is a witness of contextuality, independently of the validity of quantum theory. Along the way, we introduce new methods for analyzing noncontextuality scenarios, and demonstrate a tight connection between our minimum error state discrimination scenario and a Bell scenario.

  22. Katja Ried, Jean-Philippe W. MacLean, Robert W. Spekkens, Kevin J. Resch
    Quantum to classical transitions in causal relations
    Phys. Rev. A 95, 062102 (2017)
    [arXiv:1707.06131 (quant-ph)]

    The landscape of causal relations that can hold among a set of systems in quantum theory is richer than in classical physics. In particular, a pair of time-ordered systems can be related as cause and effect or as the effects of a common cause, and each of these causal mechanisms can be coherent or not. Furthermore, one can combine these mechanisms in different ways: by probabilistically realizing either one or the other or by having both act simultaneously (termed a physical mixture). In the latter case, it is possible for the two mechanisms to be combined quantum coherently. Previous work has shown how to experimentally realize one example of each class of possible causal relations. Here, we make a theoretical and experimental study of the transitions between these classes. In particular, for each of the two distinct types of coherence that can exist in mixtures of common-cause and cause-effect relations—coherence in the individual causal pathways and coherence in the way the causal relations are combined—we determine how it degrades under noise and we confirm these expectations in a quantum-optical experiment.

  23. Anirudh Krishna, Robert W. Spekkens, and Elie Wolfe
    Deriving robust noncontextuality inequalities from algebraic proofs of the Kochen-Specker theorem: the Peres-Mermin square
    New J. Phys. 19, 123031 (2017)
    [arXiv:1704.01153 (quant-ph)]

    When a measurement is compatible with each of two other measurements that are incompatible with one another, these define distinct contexts for the given measurement. The Kochen-Specker theorem rules out models of quantum theory that satisfy a particular assumption of context-independence: that sharp measurements are assigned outcomes both deterministically and independently of their context. This notion of noncontextuality is not suited to a direct experimental test because realistic measurements always have some degree of unsharpness due to noise. However, a generalized notion of noncontextuality has been proposed that is applicable to any experimental procedure, including unsharp measurements as well as preparations, and for which a quantum no-go result still holds. According to this notion, the model need only specify a probability distribution over the outcomes of a measurement in a context-independent way, rather than specifying a particular outcome. It also implies novel constraints of context-independence for the representation of preparations. In this article, we describe a general technique for translating proofs of the Kochen-Specker theorem into inequality constraints on realistic experimental statistics, the violation of which witnesses the impossibility of a noncontextual model. We focus on algebraic state-independent proofs, using the Peres-Mermin square as our illustrative example. Our technique yields the necessary and sufficient conditions for a particular set of correlations (between the preparations and the measurements) to admit a noncontextual model. The inequalities thus derived are demonstrably robust to noise. We specify how experimental data must be processed in order to achieve a test of these inequalities. We also provide a criticism of prior proposals for experimental tests of noncontextuality based on the Peres-Mermin square.

  24. Dax Enshan Koh, Mark D. Penney, and Robert W. Spekkens
    Computing quopit Clifford circuit amplitudes by the sum-over-paths technique
    Quantum Inf. Comput. 17 (13-14) 1081-1095 (2017)
    [arXiv:1702.03316 (quant-ph)]

    By the Gottesman-Knill Theorem, the outcome probabilities of Clifford circuits can be computed efficiently. We present an alternative proof of this result for quopit Clifford circuits (i.e., Clifford circuits on collections of p-level systems, where p is an odd prime) using Feynman's sum-over-paths technique, which allows the amplitudes of arbitrary quantum circuits to be expressed in terms of a weighted sum over computational paths. For a general quantum circuit, the sum over paths contains an exponential number of terms, and no efficient classical algorithm is known that can compute the sum. For quopit Clifford circuits, however, we show that the sum over paths takes a special form: it can be expressed as a product of Weil sums with quadratic polynomials, which can be computed efficiently. This provides a method for computing the outcome probabilities and amplitudes of such circuits efficiently, and is an application of the circuit-polynomial correspondence which relates quantum circuits to low-degree polynomials.
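    The special form described above can be checked numerically in a small case. The sketch below, a stdlib-only toy (the name weil_sum is illustrative), evaluates the quadratic exponential sum by brute force and confirms the standard fact that for an odd prime p and a nonzero quadratic coefficient its magnitude is exactly sqrt(p), which is what makes the closed-form evaluation, and hence the efficient amplitude computation, possible.

```python
import cmath
import math

def weil_sum(a, b, p):
    """Sum_x omega^(a*x^2 + b*x) over x in Z_p, with omega = exp(2*pi*i/p)."""
    omega = cmath.exp(2j * math.pi / p)
    return sum(omega ** ((a * x * x + b * x) % p) for x in range(p))

# For odd prime p and a != 0 (mod p), |weil_sum(a, b, p)| = sqrt(p):
# the sum of p unit-modulus terms collapses to closed form, rather than
# requiring a sum over exponentially many computational paths.
p = 7
for a in range(1, p):
    for b in range(p):
        assert abs(abs(weil_sum(a, b, p)) - math.sqrt(p)) < 1e-9
```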

  25. John-Mark A. Allen, Jonathan Barrett, Dominic C. Horsman, Ciaran M. Lee, and Robert W. Spekkens
    Quantum common causes and quantum causal models
    Phys. Rev. X 7, 031021 (2017)
    [arXiv:1609.09487 (quant-ph)]

    Reichenbach's principle asserts that if two observed variables are found to be correlated, then there should be a causal explanation of these correlations. Furthermore, if the explanation is in terms of a common cause, then the conditional probability distribution over the variables given the complete common cause should factorize. The principle is generalized by the formalism of causal models, in which the causal relationships among variables constrain the form of their joint probability distribution. In the quantum case, however, the observed correlations in Bell experiments cannot be explained in the manner Reichenbach's principle would seem to demand. Motivated by this, we introduce a quantum counterpart to the principle. We demonstrate that under the assumption that quantum dynamics is fundamentally unitary, if a quantum channel with input A and outputs B and C is compatible with A being a complete common cause of B and C, then it must factorize in a particular way. Finally, we show how to generalize our quantum version of Reichenbach's principle to a formalism for quantum causal models, and provide examples of how the formalism works.
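    The classical factorization claim in the abstract is easy to exhibit concretely. The sketch below (all distributions are arbitrary illustrative choices, not from the paper) builds a joint distribution from a binary common cause and verifies that B and C are correlated marginally yet factorize exactly once the complete common cause is conditioned on.

```python
from itertools import product
from fractions import Fraction

F = Fraction

# A classical common-cause model: lam causes both B and C.
P_lam = {0: F(1, 3), 1: F(2, 3)}
P_B = {0: {0: F(1, 4), 1: F(3, 4)}, 1: {0: F(1, 2), 1: F(1, 2)}}  # P(B | lam)
P_C = {0: {0: F(2, 5), 1: F(3, 5)}, 1: {0: F(1, 5), 1: F(4, 5)}}  # P(C | lam)

joint = {(lam, b, c): P_lam[lam] * P_B[lam][b] * P_C[lam][c]
         for lam, b, c in product([0, 1], repeat=3)}

# B and C are correlated: the marginal joint is not a product...
P_BC = {(b, c): sum(joint[(lam, b, c)] for lam in (0, 1))
        for b, c in product([0, 1], repeat=2)}
P_Bm = {b: P_BC[(b, 0)] + P_BC[(b, 1)] for b in (0, 1)}
P_Cm = {c: P_BC[(0, c)] + P_BC[(1, c)] for c in (0, 1)}
assert P_BC[(0, 0)] != P_Bm[0] * P_Cm[0]

# ...yet conditioning on the complete common cause factorizes them,
# exactly as Reichenbach's principle demands.
for lam, b, c in product([0, 1], repeat=3):
    assert joint[(lam, b, c)] / P_lam[lam] == P_B[lam][b] * P_C[lam][c]
```

    It is precisely this factorization condition that fails for the correlations in Bell experiments, motivating the quantum generalization proposed in the paper.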

  26. Elie Wolfe, Robert W. Spekkens, and Tobias Fritz
    The Inflation Technique for Causal Inference with Latent Variables
    J. Causal Inference 7(2) (2019)
    [arXiv:1609.00672 (quant-ph)]

    The problem of causal inference is to determine if a given probability distribution on observed variables is compatible with some causal structure. The difficult case is when the structure includes latent variables. We here introduce the inflation technique for tackling this problem. An inflation of a causal structure is a new causal structure that can contain multiple copies of each of the original variables, but where the ancestry of each copy mirrors that of the original. For every distribution compatible with the original causal structure we identify a corresponding family of distributions, over certain subsets of inflation variables, which is compatible with the inflation structure. It follows that compatibility constraints at the inflation level can be translated to compatibility constraints at the level of the original causal structure; even if the former are weak, such as observable statistical independences implied by disjoint causal ancestry, the translated constraints can be strong. In particular, we can derive inequalities whose violation by a distribution witnesses that distribution's incompatibility with the causal structure (of which Bell inequalities and Pearl's instrumental inequality are prominent examples). We describe an algorithm for deriving all of the inequalities for the original causal structure that follow from ancestral independences in the inflation. Applied to an inflation of the Triangle scenario with binary variables, it yields inequalities that are stronger in at least some aspects than those obtainable by existing methods. We also describe an algorithm that derives a weaker set of inequalities but is much more efficient. Finally, we discuss which inflations are such that the inequalities one obtains from them remain valid even for quantum (and post-quantum) generalizations of the notion of a causal model.

  27. Dominic Horsman, Chris Heunen, Matthew F. Pusey, Jonathan Barrett, Robert W. Spekkens
    Can a quantum state over time resemble a quantum state at a single time?
    Proc. R. Soc. A 473(2205), p. 20170395 (2017)
    [arXiv:1607.03637 (quant-ph)]

    Standard quantum theory represents a composite system at a given time by a joint state, but it does not prescribe a joint state for a composite of systems at different times. If a more even-handed treatment of space and time is possible, then such a joint state should be definable, and one might expect it to satisfy the following five conditions: that it is a Hermitian operator on the tensor product of the single-time Hilbert spaces; that it represents probabilistic mixing appropriately; that it has the appropriate classical limit; that it has the appropriate single-time marginals; that composing over multiple time-steps is associative. We show that no construction satisfies all these requirements. If an even-handed treatment of space and time is possible, therefore, one or more axioms must be dropped. In particular, if Hermiticity is dropped, then we show that the construction is fixed uniquely up to an ordering convention.

  28. Jean-Philippe W. MacLean, Katja Ried, Robert W. Spekkens, Kevin J. Resch
    Quantum-coherent mixtures of causal relations
    Nat. Commun. 8, 15149 (2017)
    [arXiv:1606.04523 (quant-ph)]

    Understanding the causal influences that hold among parts of a system is critical both to explaining that system’s natural behaviour and to controlling it through targeted interventions. In a quantum world, understanding causal relations is equally important, but the set of possibilities is far richer. The two basic ways in which a pair of time-ordered quantum systems may be causally related are by a cause-effect mechanism or by a common-cause acting on both. Here we show a coherent mixture of these two possibilities. We realize this nonclassical causal relation in a quantum optics experiment and derive a set of criteria for witnessing the coherence based on a quantum version of Berkson’s effect, whereby two independent causes can become correlated on observation of their common effect. The interplay of causality and quantum theory lies at the heart of challenging foundational puzzles, including Bell’s theorem and the search for quantum gravity.

  29. Mark D. Penney, Dax Enshan Koh, and Robert W. Spekkens
    Quantum circuit dynamics via path integrals: Is there a classical action for discrete-time paths?
    New J. Phys. 19, 073006 (2017)
    [arXiv:1604.07452 (quant-ph)]

    It is straightforward to give a sum-over-paths expression for the transition amplitudes of a quantum circuit as long as the gates in the circuit are balanced, where to be balanced is to have all nonzero transition amplitudes of equal magnitude. Here we consider the question of whether, for such circuits, the relative phases of different discrete-time paths through the configuration space can be defined in terms of a classical action, as they are for continuous-time paths. We show how to do so for certain kinds of quantum circuits, namely, Clifford circuits where the elementary systems are continuous-variable systems or discrete systems of odd-prime dimension. These types of circuit are distinguished by having phase-space representations that serve to define their classical counterparts. For discrete systems, the phase-space coordinates are also discrete variables. We show that for each gate in the generating set, one can associate a symplectomorphism on the phase-space and to each of these one can associate a generating function, defined on two copies of the configuration space. For discrete systems, the latter association is achieved using tools from algebraic geometry. Finally, we show that if the action functional for a discrete-time path through a sequence of gates is defined using the sum of the corresponding generating functions, then it yields the correct relative phases for the path-sum expression. These results are likely to be relevant for quantizing physical theories where time is fundamentally discrete, characterizing the classical limit of discrete-time quantum dynamics, and proving complexity results for quantum circuits.

  30. Iman Marvian and Robert W. Spekkens
    How to quantify coherence: distinguishing speakable and unspeakable notions
    Phys. Rev. A 94, 052324 (2016)
    [arXiv:1602.08049 (quant-ph)]

    Quantum coherence is a critical resource for many operational tasks. Understanding how to quantify and manipulate it also promises to have applications for a diverse set of problems in theoretical physics. For certain applications, however, one requires coherence between the eigenspaces of specific physical observables, such as energy, angular momentum, or photon number, and it makes a difference which eigenspaces appear in the superposition. For others, there is a preferred set of subspaces relative to which coherence is deemed a resource, but it is irrelevant which of the subspaces appear in the superposition. We term these two types of coherence unspeakable and speakable respectively. We argue that a useful approach to quantifying and characterizing unspeakable coherence is provided by the resource theory of asymmetry when the symmetry group is a group of translations, and we translate a number of prior results on asymmetry into the language of coherence. We also highlight some of the applications of this approach, for instance, in the context of quantum metrology, quantum speed limits, quantum thermodynamics, and NMR. The question of how best to treat speakable coherence as a resource is also considered. We review a popular approach in terms of operations that preserve the set of incoherent states, propose an alternative approach in terms of operations that are covariant under dephasing, and we outline the challenge of providing a physical justification for either approach. Finally, we note some mathematical connections that hold among the different approaches to quantifying coherence.

  31. Iman Marvian, Robert W. Spekkens, and Paolo Zanardi
    Quantum speed limits, coherence and asymmetry
    Phys. Rev. A 93, 052331 (2016)
    [arXiv:1510.06474 (quant-ph)]

    The resource theory of asymmetry is a framework for classifying and quantifying the symmetry-breaking properties of both states and operations relative to a given symmetry. In the special case where the symmetry is the set of translations generated by a fixed observable, asymmetry can be interpreted as coherence relative to the observable eigenbasis, and the resource theory of asymmetry provides a framework to study this notion of coherence. We here show that this notion of coherence naturally arises in the context of quantum speed limits. Indeed, the very concept of speed of evolution, i.e., the inverse of the minimum time it takes the system to evolve to another (partially) distinguishable state, is a measure of asymmetry relative to the time translations generated by the system Hamiltonian. Furthermore, the celebrated Mandelstam-Tamm and Margolus-Levitin speed limits can be interpreted as upper bounds on this measure of asymmetry by functions which are themselves measures of asymmetry in the special case of pure states. Using measures of asymmetry that are not restricted to pure states, such as the Wigner-Yanase skew information, we obtain extensions of the Mandelstam-Tamm bound which are significantly tighter in the case of mixed states. We also clarify some confusions in the literature about coherence and asymmetry, and show that measures of coherence are a proper subset of measures of asymmetry.
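    One fact underlying the abstract's pure-state observation can be checked directly: for a pure state, the Wigner-Yanase skew information reduces to the variance of the generator, the quantity appearing in the Mandelstam-Tamm bound. A stdlib-only sketch for a single qubit (the 2x2 helper functions are illustrative):

```python
# Minimal 2x2 real-matrix helpers (stdlib only; real entries suffice here).
def mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def sub(A, B):
    return [[A[i][j] - B[i][j] for j in range(2)] for i in range(2)]

def trace(A):
    return A[0][0] + A[1][1]

# Pure state |+> = (|0> + |1>)/sqrt(2) and generator H = sigma_z / 2.
rho = [[0.5, 0.5], [0.5, 0.5]]
H = [[0.5, 0.0], [0.0, -0.5]]

# Wigner-Yanase skew information I(rho, H) = -1/2 Tr([sqrt(rho), H]^2).
# For a pure state, sqrt(rho) = rho.
comm = sub(mul(rho, H), mul(H, rho))
skew = -0.5 * trace(mul(comm, comm))

# Variance of H in the state rho: Tr(H^2 rho) - Tr(H rho)^2.
variance = trace(mul(mul(H, H), rho)) - trace(mul(H, rho)) ** 2
assert abs(skew - variance) < 1e-12  # coincide on pure states (both 0.25)
```

    On mixed states the two quantities come apart, with the skew information never exceeding the variance, which is why skew-information bounds of the kind discussed in the paper can be tighter than Mandelstam-Tamm for mixed states.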

  32. Ravi Kunjwal and Robert W. Spekkens
    From the Kochen-Specker theorem to noncontextuality inequalities without assuming determinism
    Phys. Rev. Lett. 115, 110403 (2015)
    [arXiv:1506.04150 (quant-ph)]

    The Kochen-Specker theorem demonstrates that it is not possible to reproduce the predictions of quantum theory in terms of a hidden variable model where the hidden variables assign a value to every projector deterministically and noncontextually. A noncontextual value-assignment to a projector is one that does not depend on which other projectors - the context - are measured together with it. Using a generalization of the notion of noncontextuality that applies to both measurements and preparations, we propose a scheme for deriving inequalities that test whether a given set of experimental statistics is consistent with a noncontextual model. Unlike previous inequalities inspired by the Kochen-Specker theorem, we do not assume that the value-assignments are deterministic and therefore in the face of a violation of our inequality, the possibility of salvaging noncontextuality by abandoning determinism is no longer an option. Our approach is operational in the sense that it does not presume quantum theory: a violation of our inequality implies the impossibility of a noncontextual model for any operational theory that can account for the experimental observations, including any successor to quantum theory.

  33. Ciaran M. Lee and Robert W. Spekkens
    Causal inference via algebraic geometry: feasibility tests for functional causal structures with two binary observed variables
    J. Causal Inference 5(2) (2017)
    [arXiv:1506.03880 (stat.ML)]

    We provide a scheme for inferring causal relations from uncontrolled statistical data based on tools from computational algebraic geometry, in particular, the computation of Groebner bases. We focus on causal structures containing just two observed variables, each of which is binary. We consider the consequences of imposing different restrictions on the number and cardinality of latent variables and of assuming different functional dependences of the observed variables on the latent ones (in particular, the noise need not be additive). We provide an inductive scheme for classifying functional causal structures into distinct observational equivalence classes. For each observational equivalence class, we provide a procedure for deriving constraints on the joint distribution that are necessary and sufficient conditions for it to arise from a model in that class. We also demonstrate how this sort of approach provides a means of determining which causal parameters are identifiable and how to solve for these. Prospects for expanding the scope of our scheme, in particular to the problem of quantum causal inference, are also discussed.
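    The flavor of such feasibility tests can be conveyed by a toy instance that needs no algebraic-geometry machinery (the paper's actual method uses Groebner bases and covers far richer structures). Consider one binary latent variable acting as a deterministic common cause of two binary observed variables; enumerating the extremal models yields a necessary and sufficient condition on the observed joint distribution, namely that its support contain at most two of the four (x, y) values.

```python
from itertools import product

# Toy functional causal structure: one binary latent lam, with
# X = f(lam), Y = g(lam) for deterministic f, g.
funcs = list(product([0, 1], repeat=2))  # f encoded as (f(0), f(1))

# Enumerate the supports achievable by the extremal (deterministic) models;
# mixing over lam can only combine the two point masses {(f(0), g(0)),
# (f(1), g(1))}, so every achievable support has size at most 2.
supports = set()
for f, g in product(funcs, funcs):
    supports.add(frozenset({(f[0], g[0]), (f[1], g[1])}))

def feasible(support):
    """Can a distribution with this support arise from the structure?"""
    return any(support <= s for s in supports)

assert feasible({(0, 0), (1, 1)})                      # perfect correlation: fine
assert not feasible({(0, 0), (0, 1), (1, 0), (1, 1)})  # full support: ruled out
```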

  34. Michael D. Mazurek, Matthew F. Pusey, Ravi Kunjwal, Kevin J. Resch, Robert W. Spekkens
    An experimental test of noncontextuality without unphysical idealizations
    Nat. Commun. 7, 11780 (2016)
    [arXiv:1505.06244 (quant-ph)]

    To make precise the sense in which nature fails to respect classical physics, one requires a formal notion of classicality. Ideally, such a notion should be defined operationally, so that it can be subjected to a direct experimental test, and it should be applicable in a wide variety of experimental scenarios, so that it can cover the breadth of phenomena that are thought to defy classical understanding. Bell's notion of local causality fulfills the first criterion but not the second. The notion of noncontextuality fulfills the second criterion, but it is a long-standing question whether it can be made to fulfill the first. Previous attempts to experimentally test noncontextuality have all presumed certain idealizations that do not hold in real experiments, namely, noiseless measurements and exact operational equivalences. We here show how to devise tests that are free of these idealizations. We also perform a photonic implementation of one such test that rules out noncontextual models with high confidence.

  35. Bob Coecke, Tobias Fritz and Robert W. Spekkens
    A mathematical theory of resources
    Information and Computation 250, 59 (2016)
    [arXiv:1409.5531 (quant-ph)]

    In many different fields of science, it is useful to characterize physical states and processes as resources. Chemistry, thermodynamics, Shannon's theory of communication channels, and the theory of quantum entanglement are prominent examples. Questions addressed by a theory of resources include: Which resources can be converted into which other ones? What is the rate at which arbitrarily many copies of one resource can be converted into arbitrarily many copies of another? Can a catalyst help in making an impossible transformation possible? How does one quantify the resource? Here, we propose a general mathematical definition of what constitutes a resource theory. We prove some general theorems about how resource theories can be constructed from theories of processes wherein there is a special class of processes that are implementable at no cost and which define the means by which the costly states and processes can be interconverted one to another. We outline how various existing resource theories fit into our framework. Our abstract characterization of resource theories is a first step in a larger project of identifying universal features and principles of resource theories. In this vein, we identify a few general results concerning resource convertibility.

  36. Robert W. Spekkens
    Quasi-quantization: classical statistical theories with an epistemic restriction
    In "Quantum Theory: Informational Foundations and Foils", edited by G. Chiribella and R. W. Spekkens, Springer (2016)
    [arXiv:1409.5041 (quant-ph)]

    A significant part of quantum theory can be obtained from a single innovation relative to classical theories, namely, that there is a fundamental restriction on the sorts of statistical distributions over physical states that can be prepared. This is termed an "epistemic restriction" because it implies a fundamental limit on the amount of knowledge that any observer can have about the physical state of a classical system. This article provides an overview of epistricted theories, that is, theories that start from a classical statistical theory and apply an epistemic restriction. We consider both continuous and discrete degrees of freedom, and show that a particular epistemic restriction called classical complementarity provides the beginning of a unification of all known epistricted theories. This restriction appeals to the symplectic structure of the underlying classical theory and consequently can be applied to an arbitrary classical degree of freedom. As such, it can be considered as a kind of quasi-quantization scheme; "quasi" because it generally only yields a theory describing a subset of the preparations, transformations and measurements allowed in the full quantum theory for that degree of freedom, and because in some cases, such as for binary variables, it yields a theory that is a distortion of such a subset. Finally, we propose to classify quantum phenomena as weakly or strongly nonclassical by whether or not they can arise in an epistricted theory.

  37. Katja Ried, Megan Agnew, Lydia Vermeyden, Dominik Janzing, Robert W. Spekkens, Kevin J. Resch
    A quantum advantage for inferring causal structure
    Nature Physics 11, 414 (2015)
    [arXiv:1406.5036 (quant-ph)]

    The problem of using observed correlations to infer causal relations is relevant to a wide variety of scientific disciplines. Yet given correlations between just two classical variables, it is impossible to determine whether they arose from a causal influence of one on the other or a common cause influencing both, unless one can implement a randomized intervention. We here consider the problem of causal inference for quantum variables. We introduce causal tomography, which unifies and generalizes conventional quantum tomography schemes to provide a complete solution to the causal inference problem using a quantum analogue of a randomized trial. We furthermore show that, in contrast to the classical case, observed quantum correlations alone can sometimes provide a solution. We implement a quantum-optical experiment that allows us to control the causal relation between two optical modes, and two measurement schemes---one with and one without randomization---that extract this relation from the observed correlations. Our results show that entanglement and coherence, known to be central to quantum information processing, also provide a quantum advantage for causal inference.

  38. Iman Marvian and Robert W. Spekkens
    Extending Noether's theorem by quantifying the asymmetry of quantum states
    Nature Communications 5, 3821 (2014)
    [arXiv:1404.3236 (quant-ph)]

    Noether’s theorem is a fundamental result in physics stating that every symmetry of the dynamics implies a conservation law. It is, however, deficient in several respects: for one, it is not applicable to dynamics wherein the system interacts with an environment; furthermore, even in the case where the system is isolated, if the quantum state is mixed then the Noether conservation laws do not capture all of the consequences of the symmetries. Here we address these deficiencies by introducing measures of the extent to which a quantum state breaks a symmetry. Such measures yield novel constraints on state transitions: for nonisolated systems they cannot increase, whereas for isolated systems they are conserved. We demonstrate that the problem of finding non-trivial asymmetry measures can be solved using the tools of quantum information theory. Applications include deriving model-independent bounds on the quantum noise in amplifiers and assessing quantum schemes for achieving high-precision metrology.

  39. Robert W. Spekkens
    The status of determinism in proofs of the impossibility of a noncontextual model of quantum theory
    Found. Phys. 44, 1125 (2014)
    [arXiv:1312.3667 (quant-ph)]

    In order to claim that one has experimentally tested whether a noncontextual ontological model could underlie certain measurement statistics in quantum theory, it is necessary to have a notion of noncontextuality that applies to unsharp measurements, i.e., those that can only be represented by positive operator-valued measures rather than projection-valued measures. This is because any realistic measurement necessarily has some nonvanishing amount of noise and therefore never achieves the ideal of sharpness. Assuming a generalized notion of noncontextuality that applies to arbitrary experimental procedures, it is shown that the outcome of a measurement depends deterministically on the ontic state of the system being measured if and only if the measurement is sharp. Hence for every unsharp measurement, its outcome necessarily has an indeterministic dependence on the ontic state. We defend this proposal against alternatives. In particular, we demonstrate why considerations parallel to Fine’s theorem do not challenge this conclusion.

  40. Iman Marvian and Robert W. Spekkens
    Modes of asymmetry: the application of harmonic analysis to symmetric quantum dynamics and quantum reference frames
    Phys. Rev. A 90, 062110 (2014)
    [arXiv:1312.0680 (quant-ph)]

    Finding the consequences of symmetry for open-system quantum dynamics is a problem with broad applications, including describing thermal relaxation, deriving quantum limits on the performance of amplifiers, and exploring quantum metrology in the presence of noise. The symmetry of the dynamics may reflect a symmetry of the fundamental laws of nature or a symmetry of a low-energy effective theory, or it may describe a practical restriction such as the lack of a reference frame. In this paper, we apply some tools of harmonic analysis together with ideas from quantum information theory to this problem. The central idea is to study the decomposition of quantum operations—in particular, states, measurements, and channels—into different modes, which we call modes of asymmetry. Under symmetric processing, a given mode of the input is mapped to the corresponding mode of the output, implying that one can only generate a given output if the input contains all of the necessary modes. By defining monotones that quantify the asymmetry in a particular mode, we also derive quantitative constraints on the resources of asymmetry that are required to simulate a given asymmetric operation. We present applications of our results for deriving bounds on the probability of success in nondeterministic state transitions, such as quantum amplification, and a simplified formalism for studying the degradation of quantum reference frames.

  41. M. S. Leifer and Robert W. Spekkens
    Towards a formulation of quantum theory as a causally neutral theory of Bayesian inference
    Phys. Rev. A 88, 052130 (2013)
    [arXiv:1107.5849 (quant-ph)]

    Quantum theory can be viewed as a generalization of classical probability theory, but the analogy as it has been developed so far is not complete. Whereas the manner in which inferences are made in classical probability theory is independent of the causal relation that holds between the conditioned variable and the conditioning variable, in the conventional quantum formalism, there is a significant difference between how one treats experiments involving two systems at a single time and those involving a single system at two times. In this article, we develop the formalism of quantum conditional states, which provides a unified description of these two sorts of experiment. In addition, concepts that are distinct in the conventional formalism become unified: Channels, sets of states, and positive operator valued measures are all seen to be instances of conditional states; the action of a channel on a state, ensemble averaging, the Born rule, the composition of channels, and nonselective state-update rules are all seen to be instances of belief propagation. Using a quantum generalization of Bayes’ theorem and the associated notion of Bayesian conditioning, we also show that the remote steering of quantum states can be described within our formalism as a mere updating of beliefs about one system given new information about another, and retrodictive inferences can be expressed using the same belief propagation rule as is used for predictive inferences. Finally, we show that previous arguments for interpreting the projection postulate as a quantum generalization of Bayesian conditioning are based on a misleading analogy and that it is best understood as a combination of belief propagation (corresponding to the nonselective state-update map) and conditioning on the measurement outcome.

  42. Gilad Gour, Markus P. Muller, Varun Narasimhachar, Robert W. Spekkens, Nicole Yunger Halpern
    The resource theory of informational nonequilibrium in thermodynamics
    Phys. Rep. 583, 1 (2015)
    [arXiv:1309.6586 (quant-ph)]

    We review recent work on the foundations of thermodynamics in the light of quantum information theory. We adopt a resource-theoretic perspective, wherein thermodynamics is formulated as a theory of what agents can achieve under a particular restriction, namely, that the only state preparations and transformations that they can implement for free are those that are thermal at some fixed temperature. States that are out of thermal equilibrium are the resources. We consider the special case of this theory wherein all systems have trivial Hamiltonians (that is, all of their energy levels are degenerate). In this case, the only free operations are those that add noise to the system (or implement a reversible evolution) and the only nonequilibrium states are states of informational nonequilibrium, that is, states that deviate from the maximally mixed state. The degree of this deviation we call the state's nonuniformity; it is the resource of interest here, the fuel that is consumed, for instance, in an erasure operation. We consider the different types of state conversion: exact and approximate, single-shot and asymptotic, catalytic and noncatalytic. In each case, we present the necessary and sufficient conditions for the conversion to be possible for any pair of states, emphasizing a geometrical representation of the conditions in terms of Lorenz curves. We also review the problem of quantifying the nonuniformity of a state, in particular through the use of generalized entropies. Quantum state conversion problems in this resource theory can be shown to be always reducible to their classical counterparts, so that there are no inherently quantum-mechanical features arising in such problems. This body of work also demonstrates that the standard formulation of the second law of thermodynamics is inadequate as a criterion for deciding whether or not a given state transition is possible.

  43. Iman Marvian and Robert W. Spekkens
    The theory of manipulations of pure state asymmetry I: basic tools and equivalence classes of states under symmetric operations
    New J. Phys. 15, 033001 (2013)
    [arXiv:1104.0018 (quant-ph)]

    If a system undergoes symmetric dynamics, then the final state of the system can only break the symmetry in ways in which it was broken by the initial state, and its measure of asymmetry can be no greater than that of the initial state. It follows that for the purpose of understanding the consequences of symmetries of dynamics, in particular, complicated and open-system dynamics, it is useful to introduce the notion of a state's asymmetry properties, which includes the type and measure of its asymmetry. We demonstrate and exploit the fact that the asymmetry properties of a state can also be understood in terms of information-theoretic concepts, for instance in terms of the state's ability to encode information about an element of the symmetry group. We show that the asymmetry properties of a pure state psi relative to the symmetry group G are completely specified by the characteristic function of the state, defined as chi_psi(g) = <psi|U(g)|psi>, where g is in G and U is the unitary representation of interest. For a symmetry described by a compact Lie group G, we show that two pure states can be reversibly interconverted one to the other by symmetric operations if and only if their characteristic functions are equal up to a one-dimensional representation of the group. Characteristic functions also allow us to easily identify the conditions for one pure state to be converted to another by symmetric operations (in general irreversibly) for the various paradigms of single-copy transformations: deterministic, state-to-ensemble, stochastic and catalyzed.

  44. Robert W. Spekkens
    The paradigm of kinematics and dynamics must yield to causal structure
    In "Questioning the Foundations of Physics: Which of Our Fundamental Assumptions Are Wrong?" edited by A. Aguirre, B. Foster, and Z. Merali
    [arXiv:1209.0023 (quant-ph)]

    The distinction between a theory's kinematics and its dynamics, that is, between the space of physical states it posits and its law of evolution, is central to the conceptual framework of many physicists. A change to the kinematics of a theory, however, can be compensated by a change to its dynamics without empirical consequence, which strongly suggests that these features of the theory, considered separately, cannot have physical significance. It must therefore be concluded (with apologies to Minkowski) that henceforth kinematics by itself, and dynamics by itself, are doomed to fade away into mere shadows, and only a kind of union of the two will preserve an independent reality. The notion of causal structure seems to provide a good characterization of this union.

  45. Christopher J. Wood and Robert W. Spekkens
    The lesson of causal discovery algorithms for quantum correlations: Causal explanations of Bell-inequality violations require fine-tuning
    New J. Phys. 17, 033002 (2015)
    [arXiv:1208.4119 (quant-ph)]

    An active area of research in the fields of machine learning and statistics is the development of causal discovery algorithms, the purpose of which is to infer the causal relations that hold among a set of variables from the correlations that these exhibit. We apply some of these algorithms to the correlations that arise for entangled quantum systems. We show that they cannot distinguish correlations that satisfy Bell inequalities from correlations that violate Bell inequalities, and consequently that they cannot do justice to the challenges of explaining certain quantum correlations causally. Nonetheless, by adapting the conceptual tools of causal inference, we can show that any attempt to provide a causal explanation of nonsignalling correlations that violate a Bell inequality must contradict a core principle of these algorithms, namely, that an observed statistical independence between variables should not be explained by fine-tuning of the causal parameters. In particular, we demonstrate the need for such fine-tuning for most of the causal mechanisms that have been proposed to underlie Bell correlations, including superluminal causal influences, superdeterminism (that is, a denial of freedom of choice of settings), and retrocausal influences which do not introduce causal cycles.

  46. Iman Marvian and Robert W. Spekkens
    A generalization of Schur-Weyl duality with applications in quantum estimation
    Comm. Math. Phys. 331, 431 (2014)
    [arXiv:1112.0638 (quant-ph)]

    Schur-Weyl duality is a powerful tool in representation theory which has many applications to quantum information theory. We provide a generalization of this duality and demonstrate some of its applications. In particular, we use it to develop a general framework for the study of a family of quantum estimation problems wherein one is given n copies of an unknown quantum state according to some prior and the goal is to estimate certain parameters of the given state. In particular, we are interested in whether collective measurements are useful and, if so, in finding an upper bound on the amount of entanglement which is required to achieve the optimal estimation. In the case of pure states, we show that commutativity of the set of observables that define the estimation problem implies the sufficiency of unentangled measurements.

  47. Stephen D. Bartlett, Terry Rudolph, Robert W. Spekkens
    Reconstruction of Gaussian quantum mechanics from Liouville mechanics with an epistemic restriction
    Phys. Rev. A 86, 012103 (2012)
    [arXiv:1111.5057 (quant-ph)]

    How would the world appear to us if its ontology was that of classical mechanics but every agent faced a restriction on how much they could come to know about the classical state? We show that in most respects, it would appear to us as quantum. The statistical theory of classical mechanics, which specifies how probability distributions over phase space evolve under Hamiltonian evolution and under measurements, is typically called Liouville mechanics, so the theory we explore here is Liouville mechanics with an epistemic restriction. The particular epistemic restriction we posit as our foundational postulate specifies two constraints. The first constraint is a classical analogue of Heisenberg's uncertainty principle -- the second-order moments of position and momentum defined by the phase-space distribution that characterizes an agent's knowledge are required to satisfy the same constraints as are satisfied by the moments of position and momentum observables for a quantum state. The second constraint is that the distribution should have maximal entropy for the given moments. Starting from this postulate, we derive the allowed preparations, measurements and transformations and demonstrate that they are isomorphic to those allowed in Gaussian quantum mechanics and generate the same experimental statistics. We argue that this reconstruction of Gaussian quantum mechanics constitutes additional evidence in favour of a research program wherein quantum states are interpreted as states of incomplete knowledge, and that the phenomena that do not arise in Gaussian quantum mechanics provide the best clues for how one might reconstruct the full quantum theory.

  48. Fernando G. S. L. Brandao, Michal Horodecki, Jonathan Oppenheim, Joseph M. Renes, Robert W. Spekkens
    Resource Theory of Quantum States Out of Thermal Equilibrium
    Phys. Rev. Lett. 111, 250404 (2013)
    [arXiv:1111.3882 (quant-ph)]

    The ideas of thermodynamics have proved fruitful in the setting of quantum information theory, in particular the notion that when the allowed transformations of a system are restricted, certain states of the system become useful resources with which one can prepare previously inaccessible states. The theory of entanglement is perhaps the best-known and most well-understood resource theory in this sense. Here we return to the basic questions of thermodynamics using the formalism of resource theories developed in quantum information theory and show that the free energy of thermodynamics emerges naturally from the resource theory of energy-preserving transformations. Specifically, the free energy quantifies the amount of useful work which can be extracted from asymptotically-many copies of a quantum system when using only reversible energy-preserving transformations and a thermal bath at fixed temperature. The free energy also quantifies the rate at which resource states can be reversibly interconverted asymptotically, provided that a sublinear amount of coherent superposition over energy levels is available, a situation analogous to the sublinear amount of classical communication required for entanglement dilution.

  49. M. S. Leifer and Robert W. Spekkens
    A Bayesian approach to compatibility, improvement, and pooling of quantum states
    J. Phys. A: Math. Theor. 47, 275301 (2014)
    [arXiv:1110.1085 (quant-ph)]

    In approaches to quantum theory in which the quantum state is regarded as a representation of knowledge, information, or belief, two agents can assign different states to the same quantum system. This raises two questions: when are such state assignments compatible? And how should the state assignments of different agents be reconciled? In this paper, we address these questions from the perspective of the recently developed conditional states formalism for quantum theory (Leifer M S and Spekkens R W 2013 Phys. Rev. A 88 052130). Specifically, we derive a compatibility criterion proposed by Brun, Finkelstein and Mermin from the requirement that, upon acquiring data, agents should update their states using a quantum generalization of Bayesian conditioning. We provide two alternative arguments for this criterion, based on the objective and subjective Bayesian interpretations of probability theory. We then apply the same methodology to the problem of quantum state improvement, i.e. how to update your state when you learn someone else's state assignment, and to quantum state pooling, i.e. how to combine the state assignments of several agents into a single assignment that accurately represents the views of the group. In particular, we derive a pooling rule previously proposed by Spekkens and Wiseman under much weaker assumptions than those made in the original derivation. All of our results apply to a much broader class of experimental scenarios than have been considered previously in this context.

  50. Iman Marvian and Robert W. Spekkens
    Asymmetry properties of pure quantum states
    Phys. Rev. A 90, 014102 (2014)
    [arXiv:1105.1816 (quant-ph)]

    The asymmetry properties of a state relative to some symmetry group specify how and to what extent the given symmetry is broken by the state. Characterizing these is found to be surprisingly useful for addressing a very common problem: to determine what follows from a system's dynamics (possibly open) having that symmetry. We demonstrate and exploit the fact that the asymmetry properties of a state can also be understood in terms of information-theoretic concepts. We show that for a pure state psi and a symmetry group G, they are completely specified by the characteristic function of the state, defined as chi_psi(g) = <psi|U(g)|psi>, where g is in G and U is the unitary representation of interest. Based on this observation, we study several important problems about the interconversion of pure states under symmetric dynamics such as determining the conditions for reversible transformations, deterministic irreversible transformations and asymptotic transformations.

  51. Bob Coecke and Robert W. Spekkens
    Picturing classical and quantum Bayesian inference
    Synthese 186, 651 (2012)
    [arXiv:1102.2368 (quant-ph)]

    We introduce a graphical framework for Bayesian inference that is sufficiently general to accommodate not just the standard case but also recent proposals for a theory of quantum Bayesian inference wherein one considers density operators rather than probability distributions as representative of degrees of belief. The diagrammatic framework is stated in the graphical language of symmetric monoidal categories and of compact structures and Frobenius structures therein, in which Bayesian inversion boils down to transposition with respect to an appropriate compact structure. We characterize classical Bayesian inference in terms of a graphical property and demonstrate that our approach eliminates some purely conventional elements that appear in common representations thereof, such as whether degrees of belief are represented by probabilities or entropic quantities. We also introduce a quantum-like calculus wherein the Frobenius structure is noncommutative and show that it can accommodate Leifer's calculus of `conditional density operators'. The notion of conditional independence is also generalized to our graphical setting and we make some preliminary connections to the theory of Bayesian networks. Finally, we demonstrate how to construct a graphical Bayesian calculus within any dagger compact category.

  52. Yeong-Cherng Liang, Robert W. Spekkens, Howard M. Wiseman
    Specker's Parable of the Over-protective Seer: A Road to Contextuality, Nonlocality and Complementarity
    Physics Reports 506, 1 (2011)
    [arXiv:1010.1273 (quant-ph)]

    In 1960, the mathematician Ernst Specker described a simple example of nonclassical correlations which he dramatized using a parable about a seer who sets an impossible prediction task to his daughter's suitors. We revisit this example here, using it as an entree to three central concepts in quantum foundations: contextuality, Bell-nonlocality, and complementarity. Specifically, we show that Specker's parable offers a narrative thread that weaves together a large number of results, including: the impossibility of measurement-noncontextual and outcome-deterministic ontological models of quantum theory (the Kochen-Specker theorem), in particular the proof of Klyachko; the impossibility of Bell-local models of quantum theory (Bell's theorem), especially the proofs by Mermin and Hardy; the impossibility of a preparation-noncontextual ontological model of quantum theory; and the existence of triples of positive operator valued measures (POVMs) that can be measured jointly pairwise but not triplewise. Along the way, several novel results are presented, including: a generalization of a theorem by Fine connecting the existence of a joint distribution over outcomes of counterfactual measurements to the existence of a noncontextual model; a generalization of Klyachko's proof of the Kochen-Specker theorem; a proof of the Kochen-Specker theorem in the style of Hardy's proof of Bell's theorem; a categorization of contextual and Bell-nonlocal correlations in terms of frustrated networks; a new inequality testing preparation noncontextuality; and lastly, some results on the joint measurability of POVMs and the question of whether these can be modeled noncontextually. Finally, we emphasize that Specker's parable provides a novel type of foil to quantum theory, challenging us to explain why the particular sort of contextuality and complementarity embodied therein does not arise in a quantum world.

  53. Lucien Hardy and Robert Spekkens
    Why Physics Needs Quantum Foundations
    Physics in Canada 66, 73 (2010)
    [arXiv:1003.5008 (quant-ph)]

    We discuss the motivation for pursuing research on the foundations of quantum theory.

  54. Bob Coecke, Bill Edwards, Robert W. Spekkens
    Phase groups and the origin of non-locality for qubits
    Electronic Notes in Theoretical Computer Science 270, 15 (2011)
    [arXiv:1003.5005 (quant-ph)]

    We describe a general framework in which we can precisely compare the structures of quantum-like theories which may initially be formulated in quite different mathematical terms. We then use this framework to compare two theories: quantum mechanics restricted to qubit stabiliser states and operations, and Spekkens's toy theory. We discover that viewed within our framework these theories are very similar, but differ in one key aspect - a four element group we term the phase group which emerges naturally within our framework. In the case of the stabiliser theory this group is Z4 while for Spekkens's toy theory the group is Z2 x Z2. We further show that the structure of this group is intimately involved in a key physical difference between the theories: whether or not they can be modelled by a local hidden variable theory. This is done by establishing a connection between the phase group, and an abstract notion of GHZ state correlations. We go on to formulate precisely how the stabiliser theory and toy theory are `similar' by defining a notion of `mutually unbiased qubit theory', noting that all such theories have four element phase groups. Since Z4 and Z2 x Z2 are the only such groups we conclude that the GHZ correlations in this type of theory can only take two forms, exactly those appearing in the stabiliser theory and in Spekkens's toy theory. The results point at a classification of local/non-local behaviours by finite Abelian groups, extending beyond qubits to finitary theories whose observables are all mutually unbiased.

  55. Howard Barnum, Jonathan Barrett, Lisa Orloff Clark, Matthew Leifer, Robert Spekkens, Nicholas Stepanik, Alex Wilce and Robin Wilke
    Entropy and information causality in general probabilistic theories
    New J. Phys. 12, 033024 (2010).
    [arXiv:0909.5075 (quant-ph)]

    We investigate the concept of entropy in probabilistic theories more general than quantum mechanics, with particular reference to the notion of information causality (IC) recently proposed by Pawlowski et al (2009 arXiv:0905.2292). We consider two entropic quantities, which we term measurement and mixing entropy. In the context of classical and quantum theory, these coincide, being given by the Shannon and von Neumann entropies, respectively; in general, however, they are very different. In particular, while measurement entropy is easily seen to be concave, mixing entropy need not be. In fact, as we show, mixing entropy is not concave whenever the state space is a non-simplicial polytope. Thus, the condition that measurement and mixing entropies coincide is a strong constraint on possible theories. We call theories with this property monoentropic. Measurement entropy is subadditive, but not in general strongly subadditive. Equivalently, if we define the mutual information between two systems A and B by the usual formula I(A: B)=H(A)+H(B)-H(AB), where H denotes the measurement entropy and AB is a non-signaling composite of A and B, then it can happen that I(A:BC)<I(A:B). This is relevant to IC in the sense of Pawlowski et al: we show that any monoentropic non-signaling theory in which measurement entropy is strongly subadditive, and also satisfies a version of the Holevo bound, is informationally causal, and on the other hand we observe that Popescu–Rohrlich boxes, which violate IC, also violate strong subadditivity. We also explore the interplay between measurement and mixing entropy and various natural conditions on theories that arise in quantum axiomatics.

  56. Nicholas Harrigan and Robert W. Spekkens
    Einstein, incompleteness, and the epistemic view of quantum states
    Found. Phys. 40, 125 (2010)
    [arXiv:0706.2661 (quant-ph)]

    Does the quantum state represent reality or our knowledge of reality? In making this distinction precise, we are led to a novel classification of hidden variable models of quantum theory. Indeed, representatives of each class can be found among existing constructions for two-dimensional Hilbert spaces. Our approach also provides a fruitful new perspective on arguments for the nonlocality and incompleteness of quantum theory. Specifically, we show that for models wherein the quantum state has the status of something real, the failure of locality can be established through an argument considerably more straightforward than Bell's theorem. The historical significance of this result becomes evident when one recognizes that the same reasoning is present in Einstein's preferred argument for incompleteness, which dates back to 1935. This fact suggests that Einstein was seeking not just any completion of quantum theory, but one wherein quantum states are solely representative of our knowledge. Our hypothesis is supported by an analysis of Einstein's attempts to clarify his views on quantum theory and the circumstance of his otherwise puzzling abandonment of an even simpler argument for incompleteness from 1927.

  57. Gilad Gour, Iman Marvian and Robert W. Spekkens
    Measuring the quality of a quantum reference frame: the relative entropy of frameness
    Phys. Rev. A 80, 012307 (2009)
    [arXiv:0901.0943 (quant-ph)]

    In the absence of a reference frame for transformations associated with group G, any quantum state that is noninvariant under the action of G may serve as a token of the missing reference frame. We here present a measure of the quality of such a token: the relative entropy of frameness. This is defined as the relative entropy distance between the state of interest and the nearest G-invariant state. Unlike the relative entropy of entanglement, this quantity is straightforward to calculate, and we find it to be precisely equal to the G-asymmetry, a measure of frameness introduced by Vaccaro et al. It is shown to provide an upper bound on the mutual information between the group element encoded into the token and the group element that may be extracted from it by measurement. In this sense, it quantifies the extent to which the token successfully simulates a full reference frame. We also show that despite a suggestive analogy from entanglement theory, the regularized relative entropy of frameness is zero and therefore does not quantify the rate of interconversion between the token and some standard form of quantum reference frame. Finally, we show how these investigations yield an approach to bounding the relative entropy of entanglement.

  58. Stephen D. Bartlett, Terry Rudolph, Robert W. Spekkens, and Peter S. Turner
    Quantum communication using a bounded-size quantum reference frame
    New J. Phys. 11 063013 (2009)
    [arXiv:0812.5040 (quant-ph)]

    Typical quantum communication schemes are such that to achieve perfect decoding the receiver must share a reference frame (RF) with the sender. Indeed, if the receiver only possesses a bounded-size quantum token of the sender's RF, then the decoding is imperfect, and we can describe this effect as a noisy quantum channel. We seek here to characterize the performance of such schemes, or equivalently, to determine the effective decoherence induced by having a bounded-size RF. We assume that the token is prepared in a special state that has particularly nice group-theoretic properties and that is near-optimal for transmitting information about the sender's frame. We present a decoding operation, which can be proven to be near-optimal in this case, and we demonstrate that there are two distinct ways of implementing it (corresponding to two distinct Kraus decompositions). In one, the receiver measures the orientation of the RF token and reorients the system appropriately. In the other, the receiver extracts the encoded information from the virtual subsystems that describe the relational degrees of freedom of the system and token. Finally, we provide explicit characterizations of these decoding schemes when the system is a single qubit and for three standard kinds of RF: a phase reference, a Cartesian frame (representing an orthogonal triad of spatial directions), and a reference direction (representing a single spatial direction).

  59. Robert W. Spekkens, D. H. Buzacott, A. J. Keehn, Ben Toner, G. J. Pryde
    Preparation Contextuality Powers Parity-Oblivious Multiplexing
    Phys. Rev. Lett. 102, 010401 (2009)
    [arXiv:0805.1463 (quant-ph)]

    In a noncontextual hidden variable model of quantum theory, hidden variables determine the outcomes of every measurement in a manner that is independent of how the measurement is implemented. Using a generalization of this notion to arbitrary operational theories and to preparation procedures, we demonstrate that a particular two-party information-processing task, “parity-oblivious multiplexing,” is powered by contextuality in the sense that there is a limit to how well any theory described by a noncontextual hidden variable model can perform. This bound constitutes a “noncontextuality inequality” that is violated by quantum theory. We report an experimental violation of this inequality in good agreement with the quantum predictions. The experimental results also provide the first demonstration of 2-to-1 and 3-to-1 quantum random access codes.

  60. Dennis Kretschmann, David W. Kribs and Robert W. Spekkens
    Complementarity of Private and Correctable Subsystems in Quantum Cryptography and Error Correction
    Phys. Rev. A 78, 032330 (2008)
    [arXiv:0711.3438 (quant-ph)]

    We make an explicit connection between fundamental notions in quantum cryptography and quantum error correction. Error-correcting subsystems (and subspaces) for quantum channels are the key vehicles for contending with noise in physical implementations of quantum information-processing. Private subsystems (and subspaces) for quantum channels play a central role in cryptographic schemes such as quantum secret sharing and private quantum communication. We show that a subsystem is private for a channel precisely when it is correctable for a complementary channel. This result is shown to hold even for approximate notions of private and correctable defined in terms of the diamond norm for superoperators.

  61. Robert W. Spekkens
    Negativity and contextuality are equivalent notions of nonclassicality
    Phys. Rev. Lett. 101, 020401 (2008)
    [arXiv:0710.5549 (quant-ph)]

    Two notions of nonclassicality that have been investigated intensively are: (i) negativity, that is, the need to posit negative values when representing quantum states by quasiprobability distributions such as the Wigner representation, and (ii) contextuality, that is, the impossibility of a noncontextual hidden variable model of quantum theory. Although both of these notions were meant to characterize the conditions under which a classical explanation cannot be provided, we demonstrate that they prove inadequate to the task and we argue for a particular way of generalizing and revising them. With the refined version of each in hand, it becomes apparent that they are in fact one and the same. We also demonstrate the impossibility of noncontextuality or non-negativity in quantum theory with a novel proof that is symmetric in its treatment of measurements and preparations.

  62. Gilad Gour and Robert W. Spekkens
    The resource theory of quantum reference frames: manipulations and monotones
    New J. Phys. 10, 033023 (2008)
    [arXiv:0711.0043 (quant-ph)]

    Every restriction on quantum operations defines a resource theory, determining how quantum states that cannot be prepared under the restriction may be manipulated and used to circumvent the restriction. A superselection rule is a restriction that arises through the lack of a classical reference frame and the states that circumvent it (the resource) are quantum reference frames. We consider the resource theories that arise from three types of superselection rule, associated respectively with lacking: (i) a phase reference, (ii) a frame for chirality, and (iii) a frame for spatial orientation. Focussing on pure unipartite quantum states (and in some cases restricting our attention even further to subsets of these), we explore single-copy and asymptotic manipulations. In particular, we identify the necessary and sufficient conditions for a deterministic transformation between two resource states to be possible and, when these conditions are not met, the maximum probability with which the transformation can be achieved. We also determine when a particular transformation can be achieved reversibly in the limit of arbitrarily many copies and find the maximum rate of conversion. A comparison of the three resource theories demonstrates that the extent to which resources can be interconverted decreases as the strength of the restriction increases. Along the way, we introduce several measures of frameness and prove that these are monotonically nonincreasing under various classes of operations that are permitted by the superselection rule.

  63. Robert W. Spekkens and H. M. Wiseman
    Pooling quantum states obtained by indirect measurements
    Phys. Rev. A 75, 042104 (2007)
    [arXiv:quant-ph/0612190]

    We consider the pooling of quantum states when Alice and Bob both have one part of a tripartite system and, on the basis of measurements on their respective parts, each infers a quantum state for the third part S. We denote the conditioned states which Alice and Bob assign to S by alpha and beta respectively, while the unconditioned state of S is rho. The state assigned by an overseer, who has all the data available to Alice and Bob, is omega. The pooler is told only alpha, beta, and rho. We show that for certain classes of tripartite states, this information is enough for her to reconstruct omega by the formula omega ∝ alpha rho^{-1} beta. Specifically, we identify two classes of states for which this pooling formula works: (i) all pure states for which the rank of rho is equal to the product of the ranks of the states of Alice's and Bob's subsystems; (ii) all mixtures of tripartite product states that are mutually orthogonal on S.
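
    In the commuting special case (ii), where all states are diagonal in a common basis, the pooling formula reduces to classical Bayesian pooling of independent likelihoods, and it can be checked numerically. A minimal sketch (our illustration with hypothetical likelihood values, not from the paper):

```python
import numpy as np

# Diagonal (classical) case: a prior rho over the states of S, and the
# posteriors alpha, beta that Alice and Bob obtain by conditioning rho
# on independently acquired data.
rho = np.array([0.5, 0.3, 0.2])       # prior over states of S

L_alice = np.array([0.9, 0.1, 0.4])   # Alice's likelihoods (hypothetical)
L_bob   = np.array([0.2, 0.7, 0.6])   # Bob's likelihoods (hypothetical)

alpha = L_alice * rho / np.sum(L_alice * rho)   # Alice's conditioned state
beta  = L_bob   * rho / np.sum(L_bob   * rho)   # Bob's conditioned state

# Overseer: condition on both data sets at once.
omega = L_alice * L_bob * rho / np.sum(L_alice * L_bob * rho)

# Pooling formula omega ∝ alpha rho^{-1} beta, written for diagonal states.
pooled = alpha * beta / rho
pooled /= pooled.sum()

assert np.allclose(pooled, omega)
```

    The cancellation is transparent here: alpha*beta/rho ∝ (L_alice*rho)(L_bob*rho)/rho = L_alice*L_bob*rho, which is exactly the overseer's posterior up to normalization.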

  64. Stephen D. Bartlett, Terry Rudolph, and Robert W. Spekkens
    Reference frames, superselection rules, and quantum information
    Rev. Mod. Phys. 79, 555 (2007)
    [arXiv:quant-ph/0610030]

    Recently, there has been much interest in a new kind of “unspeakable” quantum information that stands to regular quantum information in the same way that a direction in space or a moment in time stands to a classical bit string: the former can only be encoded using particular degrees of freedom while the latter are indifferent to the physical nature of the information carriers. The problem of correlating distant reference frames, of which aligning Cartesian axes and synchronizing clocks are important instances, is an example of a task that requires the exchange of unspeakable information and for which it is interesting to determine the fundamental quantum limit of efficiency. There have also been many investigations into the information theory that is appropriate for parties that lack reference frames or that lack correlation between their reference frames, restrictions that result in global and local superselection rules. In the presence of these, quantum unspeakable information becomes a new kind of resource that can be manipulated, depleted, quantified, etc. Methods have also been developed to contend with these restrictions using relational encodings, particularly in the context of computation, cryptography, communication, and the manipulation of entanglement. This paper reviews the role of reference frames and superselection rules in the theory of quantum-information processing.

  65. Mark R. Dowling, Stephen D. Bartlett, Terry Rudolph, Robert W. Spekkens
    Observing a coherent superposition of an atom and a molecule
    Phys. Rev. A 74, 052113 (2006)
    [arXiv:quant-ph/0606128]

    We demonstrate that it is possible, in principle, to perform a Ramsey-type interference experiment to exhibit a coherent superposition of a single atom and a diatomic molecule. This gedanken experiment, based on the techniques of Aharonov and Susskind [Phys. Rev. 155, 1428 (1967)], explicitly violates the commonly-accepted superselection rule that forbids coherent superpositions of eigenstates of differing atom number. This interference experiment makes use of a Bose-Einstein condensate as a reference frame with which to perform the coherent operations analogous to Ramsey pulses. We also investigate an analogous gedanken experiment to exhibit a coherent superposition of a single boson and a fermion, violating the commonly-accepted superselection rule forbidding coherent superpositions of states of differing particle statistics; in this case, the reference frame is realized by a multi-mode state of many fermions. This latter case reproduces all of the relevant features of Ramsey interferometry, including Ramsey fringes over many repetitions of the experiment. However, the apparent inability of this proposed experiment to produce well-defined relative phases between two distinct systems each described by a coherent superposition of a boson and a fermion demonstrates that there are additional, outstanding requirements to fully “lift” the univalence superselection rule.

  66. David W. Kribs and Robert W. Spekkens
    Quantum Error-Correcting Subsystems are Unitarily Recoverable Subsystems
    Phys. Rev. A 74, 042329 (2006)
    [arXiv:quant-ph/0608045]

    We show that every correctable subsystem for an arbitrary noise operation can be recovered by a unitary operation, where the notion of recovery is more relaxed than the notion of correction insofar as it does not protect the subsystem from subsequent iterations of the noise. As an application, we demonstrate that the noiseless subsystems for the composition of a unital quantum operation with its dual are precisely the correctable subsystems for the map that can be corrected by a single unitary operation. Using the recently developed structure theory for noiseless subsystems, the identification of such correctable subsystems is reduced to an algebraic exercise.

  67. Stephen D. Bartlett, Terry Rudolph, Robert W. Spekkens, and Peter S. Turner
    Degradation of a quantum reference frame
    New J. Phys. 8, 58 (2006)
    [arXiv:quant-ph/0602069]

    We investigate the degradation of reference frames, treated as dynamical quantum systems, and quantify their longevity as a resource for performing tasks in quantum information processing. We adopt an operational measure of a reference frame's longevity, namely, the number of measurements that can be made against it with a certain error tolerance. We investigate two distinct types of reference frame: a reference direction, realized by a spin-j system, and a phase reference, realized by an oscillator mode with bounded energy. For both cases, we show that our measure of longevity increases quadratically with the size of the reference system and is therefore non-additive. For instance, the number of measurements to which a directional reference frame consisting of N parallel spins can be put scales as N^2. Our results quantify the extent to which microscopic or mesoscopic reference frames may be used for repeated, high-precision measurements, without needing to be reset - a question that is important for some implementations of quantum computing. We illustrate our results using the proposed single-spin measurement scheme of magnetic resonance force microscopy.

  68. Gilad Gour and Robert W. Spekkens
    Entanglement of Assistance is not a bipartite measure nor a tripartite monotone
    Phys. Rev. A 73, 062331 (2006)
    [arXiv:quant-ph/0512139]

    The entanglement of assistance quantifies the entanglement that can be generated between two parties, Alice and Bob, given assistance from a third party, Charlie, when the three share a tripartite state and where the assistance consists of Charlie initially performing a measurement on his share and communicating the result to Alice and Bob through a one-way classical channel. We argue that if this quantity is to be considered an operational measure of entanglement, then it must be understood to be a tripartite rather than a bipartite measure. We compare it with a distinct tripartite measure that quantifies the entanglement that can be generated between Alice and Bob when they are allowed to make use of a two-way classical channel with Charlie. We show that the latter quantity, which we call the entanglement of collaboration, can be greater than the entanglement of assistance. This demonstrates that the entanglement of assistance (considered as a tripartite measure of entanglement), and its multipartite generalizations such as the localizable entanglement, are not entanglement monotones, thereby undermining their operational significance.

  69. Jonathan Oppenheim, Robert W. Spekkens and Andreas Winter
    A classical analogue of negative information
    Accepted for publication in Phys. Rev. Lett. but unpublished due to a copyright dispute.
    [arXiv:quant-ph/0511247]

    Recently, it was discovered that the `quantum partial information' needed to merge one party's state with another party's state is given by the conditional entropy, which can be negative [Horodecki, Oppenheim, and Winter, Nature 436, 673 (2005)]. Here we find a classical analogue of this, based on a long known relationship between entanglement and shared private correlations: namely, we consider a private distribution held between two parties, and correlated to a reference system, and ask how much secret communication is needed for one party to send her distribution to the other. We give optimal protocols for this task, and find that private information can be negative - the sender's distribution can be transferred and the potential to send future distributions in secret is gained through the distillation of a secret key. An analogue of `quantum state exchange' is also discussed and one finds cases where exchanging a distribution costs less than for one party to send it. The results give new classical protocols, and also clarify the various relationships between entanglement and privacy.

  70. Stephen D. Bartlett, Terry Rudolph and Robert W. Spekkens
    Dialogue Concerning Two Views on Quantum Coherence: Factist and Fictionist
    Int. J. Quantum Inf. 4, 17 (2006)
    [arXiv:quant-ph/0507214]

    A controversy that has arisen many times over in disparate contexts is whether quantum coherences between eigenstates of certain quantities are fact or fiction. We present a pedagogical introduction to the debate in the form of a hypothetical dialogue between proponents from each of the two camps: a factist and a fictionist. A resolution of the debate can be achieved, we argue, by recognizing that quantum states do not only contain information about the intrinsic properties of a system but about its extrinsic properties as well, that is, about its relation to other systems external to it. Specifically, the coherent quantum state of the factist is the appropriate description of the relation of the system to one reference frame, while the incoherent quantum state of the fictionist is the appropriate description of the relation of the system to another, uncorrelated, reference frame. The two views, we conclude, are alternative but equally valid paradigms of description.

  71. Stephen D. Bartlett, Patrick Hayden and Robert W. Spekkens
    Random subspaces for encryption based on a private shared Cartesian frame
    Phys. Rev. A 72, 052329 (2005)
    [arXiv:quant-ph/0506260]

    A private shared Cartesian frame is a novel form of private shared correlation that allows for both private classical and quantum communication. Cryptography using a private shared Cartesian frame has the remarkable property that asymptotically, if perfect privacy is demanded, the private classical capacity is three times the private quantum capacity. We demonstrate that if the requirement for perfect privacy is relaxed, then it is possible to use the properties of random subspaces to nearly triple the private quantum capacity, almost closing the gap between the private classical and quantum capacities.

  72. M. S. Leifer and R. W. Spekkens
    Pre- and Post-selection paradoxes and contextuality in quantum mechanics
    Phys. Rev. Lett. 95, 200405 (2005)
    [arXiv:quant-ph/0412178]

    Many seemingly paradoxical effects are known in the predictions for outcomes of intermediate measurements made on pre- and post-selected quantum systems. Despite appearances, these effects do not demonstrate the impossibility of a noncontextual hidden variable theory, since an explanation in terms of measurement-disturbance is possible. Nonetheless, we show that for every paradoxical effect wherein all the pre- and post-selected probabilities are 0 or 1 and the pre- and post-selected states are nonorthogonal, there is an associated proof of contextuality. This proof is obtained by considering all the measurements involved in the paradoxical effect -- the pre-selection, the post-selection, and the alternative possible intermediate measurements -- as alternative possible measurements at a single time.

  73. M. S. Leifer and R. W. Spekkens
    Logical Pre- and Post-Selection paradoxes, measurement-disturbance and contextuality
    Proceedings of QS 2004, Int. J. Theor. Phys. 44, 1977 (2005)
    [arXiv:quant-ph/0412179]

    Many seemingly paradoxical effects are known in the predictions for outcomes of measurements made on pre- and post-selected quantum systems. A class of such effects, which we call “logical pre- and post-selection paradoxes”, bear a striking resemblance to proofs of the Bell-Kochen-Specker theorem, which suggests that they demonstrate the contextuality of quantum mechanics. Despite the apparent similarity, we show that such effects can occur in noncontextual hidden variable theories, provided measurements are allowed to disturb the values of the hidden variables.

  74. Stephen D. Bartlett, Andrew C. Doherty, Robert W. Spekkens and H. M. Wiseman
    Entanglement under restricted operations: Analogy to mixed-state entanglement
    Phys. Rev. A 73, 022311 (2006)
    [arXiv:quant-ph/0412158]

    We show that the classification of bipartite pure entangled states when local quantum operations are restricted yields a structure that is analogous in many respects to that of mixed-state entanglement. Specifically, we develop this analogy by restricting operations through local superselection rules, and show that such exotic phenomena as bound entanglement and activation arise using pure states in this setting. This analogy aids in resolving several conceptual puzzles in the study of entanglement under restricted operations. In particular, we demonstrate that several types of quantum optical states that possess confusing entanglement properties are analogous to bound entangled states. Also, the classification of pure-state entanglement under restricted operations can be much simpler than for mixed-state entanglement. For instance, in the case of local Abelian superselection rules all questions concerning distillability can be resolved.

  75. R. W. Spekkens
    Contextuality for preparations, transformations, and unsharp measurements
    Phys. Rev. A 71, 052108 (2005)
    [arXiv:quant-ph/0406166]

    An operational definition of contextuality is introduced which generalizes the standard notion in three ways: (1) it applies to arbitrary operational theories rather than just quantum theory, (2) it applies to arbitrary experimental procedures, rather than just sharp measurements, and (3) it applies to a broad class of ontological models of quantum theory, rather than just deterministic hidden variable models. We derive three no-go theorems for ontological models, each based on an assumption of noncontextuality for a different sort of experimental procedure; one for preparation procedures, another for unsharp measurement procedures (that is, measurement procedures associated with positive-operator valued measures), and a third for transformation procedures. All three proofs apply to two-dimensional Hilbert spaces, and are therefore stronger than traditional proofs of contextuality.

  76. Robert W. Spekkens
    Evidence for the epistemic view of quantum states: a toy theory
    Phys. Rev. A 75, 032110 (2007)
    [arXiv:quant-ph/0401052]

    We present a toy theory that is based on a simple principle: the number of questions about the physical state of a system that are answered must always be equal to the number that are unanswered in a state of maximal knowledge. Many quantum phenomena are found to have analogues within this toy theory. These include the noncommutativity of measurements, interference, the multiplicity of convex decompositions of a mixed state, the impossibility of discriminating nonorthogonal states, the impossibility of a universal state inverter, the distinction between bipartite and tripartite entanglement, the monogamy of pure entanglement, no cloning, no broadcasting, remote steering, teleportation, entanglement swapping, dense coding, mutually unbiased bases, and many others. The diversity and quality of these analogies is taken as evidence for the view that quantum states are states of incomplete knowledge rather than states of reality. A consideration of the phenomena that the toy theory fails to reproduce, notably, violations of Bell inequalities and the existence of a Kochen-Specker theorem, provides clues for how to proceed with this research program.

  77. J.-C. Boileau, D. Gottesman, R. Laflamme, D. Poulin, R.W. Spekkens
    Robust polarization-based quantum key distribution over a collective-noise channel
    Phys. Rev. Lett. 92, 017901 (2004)
    [arXiv:quant-ph/0306199]

    We present two polarization-based protocols for quantum key distribution. The protocols encode key bits in noiseless subspaces or subsystems and so can function over a quantum channel subjected to an arbitrary degree of collective noise, as occurs, for instance, due to rotation of polarizations in an optical fiber. These protocols can be implemented using only entangled photon-pair sources, single-photon rotations, and single-photon detectors. Thus, our proposals offer practical and realistic alternatives to existing schemes for quantum key distribution over optical fibers without resorting to interferometry or two-way quantum communication, thereby circumventing, respectively, the need for high precision timing and the threat of Trojan horse attacks.

  78. Stephen D. Bartlett, Terry Rudolph, and Robert W. Spekkens
    Optimal measurements for relative quantum information
    Phys. Rev. A 70, 032321 (2004)
    [arXiv:quant-ph/0310009]

    We provide optimal measurement schemes for estimating relative parameters of the quantum state of a pair of spin systems. We prove that the optimal measurements are joint measurements on the pair of systems, meaning that they cannot be achieved by local operations and classical communication. We also demonstrate that in the limit where one of the spins becomes macroscopic, our results reproduce those that are obtained by treating that spin as a classical reference direction.

  79. Terry Rudolph and Robert W. Spekkens
    Quantum State Targeting
    Phys. Rev. A 70, 052306 (2004)
    [arXiv:quant-ph/0310060]

    We introduce a primitive for quantum cryptography that we term "state targeting." We show that increasing one's probability of success in this task above a minimum amount implies an unavoidable increase in the probability of a particular kind of failure. This is analogous to the unavoidable disturbance to a quantum state that results from gaining information about its identity, and can be shown to be a purely quantum effect. We solve various optimization problems for state targeting that are useful for the security analysis of two-party cryptographic tasks implemented between remote antagonistic parties. Although we focus on weak coin flipping, the results are significant for other two-party protocols, such as strong coin flipping, partially binding and concealing bit commitment, and bit escrow. Furthermore, the results have significance not only for the traditional notion of security in cryptography, that of restricting a cheater's ability to bias the outcome of the protocol, but also for a different notion of security that arises only in the quantum context, that of cheat sensitivity. Finally, our analysis leads to some interesting secondary results, namely, a generalization of Uhlmann's theorem and an operational interpretation of the fidelity between two mixed states.

  80. Terry Rudolph, Robert W. Spekkens, Peter S. Turner
    Unambiguous discrimination of mixed states
    Phys. Rev. A 68, 010301(R) (2003)
    [arXiv:quant-ph/0303071]

    We present the conditions under which probabilistic error-free discrimination of mixed states is possible, and provide upper and lower bounds on the maximum probability of success for the case of two mixed states. We solve certain special cases exactly, and demonstrate how the problems of state filtering and state comparison can be recast as problems of mixed state unambiguous discrimination.

  81. Stephen D. Bartlett, Terry Rudolph and R. W. Spekkens
    Classical and Quantum Communication without a Shared Reference Frame
    Phys. Rev. Lett. 91, 027901 (2003)
    [arXiv:quant-ph/0302111]

    We show that communication without a shared reference frame is possible using entangled states. Both classical and quantum information can be communicated with perfect fidelity without a shared reference frame at a rate that asymptotically approaches one classical bit or one encoded qubit per transmitted qubit. We present an optical scheme to communicate classical bits without a shared reference frame using entangled photon pairs and linear optical Bell state measurements.

  82. R. W. Spekkens and Terry Rudolph
    Quantum protocol for cheat-sensitive weak coin flipping
    Phys. Rev. Lett. 89, 227901 (2002)
    [arXiv:quant-ph/0202118]

    We present a quantum protocol for the task of weak coin flipping. We find that, for one choice of parameters in the protocol, the maximum probability of a dishonest party winning the coin flip if the other party is honest is 1/√2. We also show that if parties restrict themselves to strategies wherein they cannot be caught cheating, their maximum probability of winning can be even smaller. As such, the protocol offers additional security in the form of cheat sensitivity.

  83. R. W. Spekkens and T. Rudolph
    Optimization of coherent attacks in generalizations of the BB84 quantum bit commitment protocol
    Quantum Inf. Comput. 2, 66 (2002)
    [arXiv:quant-ph/0107042]

    It is well known that no quantum bit commitment protocol is unconditionally secure. Nonetheless, there can be non-trivial upper bounds on both Bob's probability of correctly estimating Alice's commitment and Alice's probability of successfully unveiling whatever bit she desires. In this paper, we seek to determine these bounds for generalizations of the BB84 bit commitment protocol. In such protocols, an honest Alice commits to a bit by randomly choosing a state from a specified set and submitting this to Bob, and later unveils the bit to Bob by announcing the chosen state, at which point Bob measures the projector onto the state. Bob's optimal cheating strategy can be easily deduced from well known results in the theory of quantum state estimation. We show how to understand Alice's most general cheating strategy (which involves her submitting to Bob one half of an entangled state) in terms of a theorem of Hughston, Jozsa and Wootters. We also show how the problem of optimizing Alice's cheating strategy for a fixed submitted state can be mapped onto a problem of state estimation. Finally, using the Bloch ball representation of qubit states, we identify the optimal coherent attack for a class of protocols that can be implemented with just a single qubit. These results provide a tight upper bound on Alice's probability of successfully unveiling whatever bit she desires in the protocol proposed by Aharonov et al., and lead us to identify a qubit protocol with even greater security.

  84. R. W. Spekkens and T. Rudolph
    Degrees of concealment and bindingness in quantum bit commitment protocols
    Phys. Rev. A 65, 012310 (2001)
    [arXiv:quant-ph/0106019]

    Although it is impossible for a bit commitment protocol to be both arbitrarily concealing and arbitrarily binding, it is possible for it to be both partially concealing and partially binding. This means that Bob cannot, prior to the beginning of the unveiling phase, find out everything about the bit committed, and Alice cannot, through actions taken after the end of the commitment phase, unveil whatever bit she desires. We determine upper bounds on the degrees of concealment and bindingness that can be achieved simultaneously in any bit commitment protocol, although it is unknown whether these can be saturated. We do, however, determine the maxima of these quantities in a restricted class of bit commitment protocols, namely, those wherein all the systems that play a role in the commitment phase are supplied by Alice. We show that these maxima can be achieved using a protocol that requires Alice to prepare a pair of systems in an entangled state, submit one of the pair to Bob at the commitment phase, and the other at the unveiling phase. Finally, we determine the form of the trade off that exists between the degree of concealment and the degree of bindingness given various assumptions about the purity and dimensionality of the states used in the protocol.

  85. R. W. Spekkens and J. E. Sipe
    A modal interpretation of quantum mechanics based on a principle of entropy minimization
    Found. Phys. 31, 1431 (2001)
    [arXiv:quant-ph/0003092]

    Within many approaches to the interpretation of quantum mechanics, especially modal interpretations, one singles out a particular decomposition of the state vector in order to fix the properties that are well-defined for the system. We present a novel proposal for this preferred decomposition. Given a distinguished factorization of the Hilbert space, it is the decomposition that minimizes the Ingarden–Urbanik entropy from among all product decompositions with respect to the distinguished factorization. We incorporate this choice of preferred decomposition into a framework for modal interpretations and investigate in detail the extent to which it provides a solution to the measurement problem and the extent to which it ensures that measurements whose outcomes are predictable with probability 1 reveal pre-existing properties of the system under investigation.

  86. R. W. Spekkens and J. E. Sipe
    Non-orthogonal core projectors for modal interpretations of quantum mechanics
    Found. Phys. 31, 1403 (2001)
    [arXiv:quant-ph/0003092]

    Modal interpretations constitute a particular approach to associating dynamical variables with physical systems in quantum mechanics. Given the quantum logical constraints that are typically adopted by such interpretations, only certain sets of variables can be taken to be simultaneously definite-valued, and only certain sets of values can be ascribed to these variables at a given time. Moreover, each allowable set of variables and values can be uniquely specified by a single core projector in the Hilbert space associated with the system. In general, the core projector can be one of several possibilities at a given time. In most previous modal interpretations, the different possible core projectors have formed an orthogonal set. This paper investigates the possibility of adopting a non-orthogonal set. It is demonstrated that such non-orthogonality is required if measurements for which the outcome can be predicted with probability 1 are to reveal the pre-existing value of the variable measured, an assumption which has traditionally constituted a strong motivation for the modal approach. The existing framework for modal interpretations is generalized to explicitly accommodate non-orthogonal core projectors.

  87. R. W. Spekkens and J. E. Sipe
    Spatial fragmentation of a Bose-Einstein condensate in a double-well potential
    Phys. Rev. A 59, 3868 (1999)
    [arXiv:quant-ph/9810094]

    We present a theoretical study of the ground state of a Bose-Einstein condensate with repulsive interparticle interactions in a double-well potential, using a restricted variational principle. Within such an approach, there is a transition from a single condensate to a fragmented condensate as the strength of the central barrier of the potential is increased. We determine the nature of this transition through an approximate analytic solution as well as a numerical solution of our model, in the regime where the interparticle interactions can be treated perturbatively. The degree of fragmentation of the condensate is characterized by the degrees of first- and second-order spatial coherence across the barrier.