What's Ultimately Possible in Physics? Essay Contest
Possibilistic Physics by Tobias Fritz
I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is found that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The essay ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one.

Author Bio

Tobias Fritz is a graduate student at the Max Planck Institute for Mathematics in Bonn, Germany. He works mainly on the foundations of quantum mechanics.
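The abstract's claim that the Popescu-Rohrlich box violates possibilistic Bell inequalities can be checked by brute force. Here is a minimal sketch under the usual conventions (binary inputs x, y and outputs a, b, with an outcome possible iff a XOR b = x AND y); the function names are mine, not from the essay:

```python
from itertools import product

# PR box, possibilistically: outcome (a, b) on inputs (x, y) is
# possible iff a XOR b == x AND y; all other outcomes are impossible.
def pr_possible(a, b, x, y):
    return (a ^ b) == (x & y)

# A deterministic local strategy is a pair of functions f_A, f_B from
# input bit to output bit; there are 4 * 4 = 16 such pairs.  Encode
# each function as a tuple (f(0), f(1)).
strategies = list(product(product((0, 1), repeat=2), repeat=2))

# A possibilistic local hidden variable model may only contain
# strategies whose outcome is possible for the box on *every* input
# pair (otherwise it would declare an impossible outcome possible).
allowed = [
    (fA, fB)
    for fA, fB in strategies
    if all(pr_possible(fA[x], fB[y], x, y) for x, y in product((0, 1), repeat=2))
]

# No deterministic strategy is compatible with all four input pairs:
# XOR-summing a ^ b over the four settings gives 0 on the left-hand
# side but x AND y sums to 1 on the right.  Hence the PR box admits
# no possibilistic local hidden variable model at all.
assert allowed == []
```

The empty `allowed` list is the possibilistic analogue of a Bell violation: the box's possibility table cannot be written as a union of local deterministic behaviors.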
I like this essay. However, I am not at all convinced that giving up probabilities in favor of possibilities will do any good. After all, the very basis of quantitative science (including physics) comprises the natural numbers. Natural numbers are intrinsically and fundamentally probabilistic, which is due to the random nature of the primes. What would be the fundamental counterpart of possibilities? Why give up something fundamental and specific (probabilities) for something that lacks both a fundamental basis and specificity (your possibilities)?
Thank you for your feedback. You've made a very interesting point.
I agree that most theories of physics are based on the natural numbers. There are a few exceptions, though, for example the causal set approach to quantum gravity. I do not agree, though, that the natural numbers are intrinsically and fundamentally probabilistic. Let me explain why.
First of all, certainly the primes are a very interesting subset of the natural numbers – mathematically. But how are they relevant for fundamental physics?
Then, yes, it seems that the probabilistic Cramér model accurately captures many aspects of the primes. But, as Terry Tao mentions in his "Structure and Randomness in the Primes", there are some problems with this model, such as the fact that most primes are not divisible by 5. If the primes were totally pseudo-randomly probabilistic, roughly 1/5 of them should be divisible by 5. This aspect can indeed be captured better in a possibilistic model: a natural number is considered to be a possible prime if and only if it is not divisible by 2, 3 or 5. Iterating this idea yields the sieve of Eratosthenes.
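To make the divisibility criterion concrete, here is a small sketch; the predicate name `possibly_prime` is my own, not from the discussion above:

```python
# Possibilistic model of primality: after sieving by 2, 3 and 5, a
# number is "possibly prime" -- it may or may not actually be prime,
# but "divisible by 2, 3 or 5" (and larger than them) implies
# "impossible to be prime".
def possibly_prime(n, sieved=(2, 3, 5)):
    return n > 1 and all(n % p != 0 for p in sieved if n != p)

def is_prime(n):
    return n > 1 and all(n % d != 0 for d in range(2, int(n**0.5) + 1))

# Every actual prime is possibly prime (the sieve never wrongly
# excludes a prime) ...
assert all(possibly_prime(n) for n in range(2, 1000) if is_prime(n))
# ... but not conversely: 49 = 7 * 7 passes the 2, 3, 5 sieve and is
# still composite.  The model is sound but deliberately incomplete.
assert possibly_prime(49) and not is_prime(49)
```

Extending the tuple of sieved primes step by step is exactly the sieve of Eratosthenes: each stage refines which numbers remain possible primes.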
These are the reasons why I don't believe that the natural numbers are enough justification for the use of probabilities. This, together with the arguments given in the essay, is everything that I can say right now in favor of possibility. Personally, I don't believe that the concept of possibility is necessarily superior. But I do believe that the concept of probability shouldn't be taken for granted.
Hello dear Tobias Fritz,
Very interesting essay and discussions about numbers. Congratulations.
These numbers are indeed fascinating.
I think these numbers have a correlation, a pure correlation, with physicality and their specific fractals.
The series and its oscillations are specific.
The pairs are relevant.
I really think the primes are finite, like a specific number.
For me it is the same with our quantum entanglement of spheres and our universal sphere with its cosmological spheres.
The number of spheres is specific, finite.
If our fundamental mathematics are physical, the synchronization is possible.
I have always thought that infinity, zero and the imaginaries are just a human extrapolation; physical, rational reality is finite,
thus in our series too we must insert the limits.
Of course the naturals and reals there are a multiplication or addition of primes.
Thus a finite system of primes, and inside it an infinite system of naturals, products... for the complexification. The closed system is evidently a spherical system with a center, a volume... we can insert thermodynamics, evolution, and mass too...
The link with uniqueness and the unit vector is relevant.
The series, as a kind of physical fractal beginning with the number 1, is relevant. As the series goes on, the thermodynamic link with the volume of the sphere is important: the series increases in the number of primes, but the volume proportionally decreases in its specific fractal, in the two senses, quantum and cosmological. 1 2 3 is fascinating; it is the ultimate beginning, even correlated with the Big Bang, which is for me a kind of multiplication. If I had the volume of the main central sphere, the volume of our universal sphere, and the number of cosmological spheres, it would be better and easier to calculate.
Here are some comments by Robert Spekkens on my essay, divided into two posts for length reasons:
Thank you for directing me to your paper. I largely agree with your perspective. I am also of the opinion that it is best *not* to consider the toy theory as a probabilistic theory. I have often said that all that is required for this theory is modal logic, that is, the logic of possibility and necessity. Bob Coecke and his students have been working on category-theoretic presentations of the toy theory and within that framework, one can be a bit more formal about this. I attach an article which I am writing with them and which is available on Bob's website (http://www.comlab.ox.ac.uk/people/bob.coecke/) but not yet published. The phrase "possibilistic theory" is actually used and discussed therein (see remark 4.5). See also arXiv:0808.1037v1 [quant-ph].
Despite the naturalness of thinking of the toy theory as a possibilistic theory, most people's intuitions go the other way and I am constantly asked why I didn't choose to include arbitrary convex combinations of the epistemic states. I take it that you are asking yourself the same question. Here is my response.
First, note that allowing such convex sums of epistemic states would, strictly speaking, violate the knowledge-balance principle because for such states one could not say that one has the answer to half the questions in a canonical set. I therefore disagree with your statement on p. 4 that "the knowledge balance principle advocated in [Sp] would still be satisfied".
Second, one cannot close the set of epistemic states under convex combination while simultaneously keeping a restricted notion of coherent combination without doing violence to the quality of the analogy with quantum theory. Here is what I say in section VII.C of my paper.
[quote] We have seen that there are two types of binary operations defined for epistemic states in the toy theory, analogous to convex combinations and coherent superpositions of quantum states. However, these operations are partial; they are not defined for every pair of epistemic states. It might therefore seem desirable to close the set of epistemic states in the toy theory under convex combination with arbitrary probability distributions. In this case, the set of allowed epistemic states for a single elementary system would have the shape of an octahedron in the Bloch sphere picture. Hardy's toy theory, for instance, has this feature. Such a variant of our toy theory has also been considered by Halvorson. However, there is an important sense in which such a theory is less analogous to quantum theory than the one presented in this paper. The toy theory shares with quantum theory the feature that every mixed state has multiple convex decompositions into pure states, whereas in this modified version, there are many mixed states that have unique decompositions. Similarly, in the toy theory, as in quantum theory, every mixed state has a "purification" (a correlated state between the system of interest and another of equal size such that the marginal over the system of interest is equal to the mixed state in question), whereas in the modified version, there are many mixed states that have only a single purification.
[quote] The problem with the modified theory is that although convex combination has been extended to a full binary operation rather than a partial one, the coherent binary operations have not been so extended. Moreover, although one has allowed arbitrary weights in the convex combinations, one has not allowed the analogue of arbitrary amplitudes and phases for the coherent binary operations. It is likely that a better analogy with quantum theory can be obtained only if both operations are generalized. Unfortunately, it is unclear how to do so in a conceptually well-motivated way. [end quote]
Third, I don't interpret the probabilities in the toy theory as subjective degrees of belief nor as objective propensities but rather as relative frequencies. My epistemic states describe the relative frequencies of ontic states in an ensemble of similarly prepared systems, or, equivalently, what an agent knows about the ontic state of a single system drawn from this ensemble. Unlike the subjective Bayesians, I believe that the epistemic state that an agent uses is justified and so in this sense the probabilities are not subjective. See below for some additional comments on the interpretation of probability. The upshot is that to prepare an arbitrary probability distribution over epistemic states, I would argue that one requires something like a physical die that samples from that distribution. The question then arises about whether such a die could exist in a world governed by the toy theory. Suppose we begin by presuming only those epistemic states that satisfy the knowledge-balance principle. What dice can be built out of these? A single elementary system in the toy theory can be used as an unbiased coin flip. Similarly, using a pair of elementary systems one can achieve a distribution of the form (1/4,1/4,1/4,1/4). Below is an example of how to achieve this. However, to get a biased distribution, such as (1/4,3/4), by a measurement on an epistemic state satisfying the knowledge-balance principle would require an invalid measurement. Again, see below.
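Spekkens' counting claim here can be verified by brute force. A minimal sketch, assuming the standard presentation of the toy theory for one elementary system (four ontic states, knowledge-balanced epistemic states as 2-element subsets, valid two-outcome measurements as partitions into 2-element cells); all names below are mine:

```python
from fractions import Fraction
from itertools import combinations

ONTIC = (1, 2, 3, 4)  # ontic states of one elementary toy-theory system

# Epistemic states obeying the knowledge-balance principle: the agent
# knows the answers to exactly half of a canonical question set, i.e.
# the ontic state lies in some 2-element subset.
epistemic = [set(s) for s in combinations(ONTIC, 2)]

# Valid reproducible two-outcome measurements: partitions of the four
# ontic states into two 2-element cells (there are exactly three).
measurements = [({1, 2}, {3, 4}), ({1, 3}, {2, 4}), ({1, 4}, {2, 3})]

# Relative frequency of each outcome, reading an epistemic state as a
# uniform ensemble over its ontic states (the frequency reading above).
probs = {
    Fraction(len(cell & state), len(state))
    for state in epistemic
    for cells in measurements
    for cell in cells
}

# Only certainties and fair coin flips ever occur; a biased
# distribution such as (1/4, 3/4) is unreachable without an invalid
# measurement or an invalid epistemic state.
assert probs == {Fraction(0), Fraction(1, 2), Fraction(1)}
```

The enumeration shows why a single elementary system can act only as an unbiased coin: the outcome probabilities 0, 1/2, 1 are the only ones compatible with the knowledge-balance principle.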
Another strategy would be to prepare a single elementary system, call it A, in the epistemic state of maximal ignorance and then implement a controlled operation from A to another system, call it B, that shifts the ontic state of B if the ontic state of A is 2, 3 or 4 and leaves the ontic state of B invariant if the ontic state of A is 1. One can show, however, that this sort of controlled operation also violates the knowledge-balance principle.
Note that if one had access to classical dice and classical conditional operations, then clearly any convex sum could be implemented. However, if we imagine that the dice are themselves made out of the elementary systems of the toy theory, and are subject to the same sort of restricted dynamics, then this sort of conditional operation is not possible. The toy theory does not "play well" with classical mechanics.
The principle to which I'm appealing here is that any dice or operations conditioned on dice that are presumed to be available as external resources must also be internal resources. Only then can the cut between internal and external degrees of freedom be moved arbitrarily. Such motility strikes me as a consistency condition that one would want to impose on any theory claiming universal applicability.
Let me end with a comment concerning how you describe my work. Your statement that my toy theory "is inconsistent in the usual probabilistic interpretation, but is a perfectly fine example of a possibilistic theory" suggests to the reader of your article that the interpretation that I proposed for it (presumably the "usual" one) is probabilistic. I hope that the discussion above and in particular the content of section VII.C of my paper make it clear that my favored interpretation is in fact much closer to the possibilistic one that you expound.
All the best,
Interesting answer from Spekkens. However, I do not see why modal logic automatically implies the binary, possibilistic view. Any model (or actual representation as a Kripke structure) of modal logic is equipped with a truth assignment function over all possible worlds. This function naturally leads to probabilities or, at the least, to beliefs in the sense of Dempster-Shafer theory.
See, e.g., Resconi, Germano; Klir, George J.; Pessa, Eliano: "Conceptual Foundations of Quantum Mechanics: The Role of Evidence Theory, Quantum Sets, and Modal Logic", International Journal of Modern Physics C, Volume 10, Issue 01, pp. 29-62 (1999).
See also my own essay, "The Ultimate Physics of Number Counting", which likewise starts from a modal logic perspective.
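The point about Kripke models leading to probabilities can be illustrated with a toy encoding; this is my own sketch of the standard construction, not anything from the cited paper:

```python
# A toy Kripke-style model: a finite set of possible worlds and a
# valuation saying at which worlds each proposition holds.
worlds = {"w1", "w2", "w3", "w4"}
valuation = {"p": {"w1"}, "q": {"w1", "w2", "w3"}}

def possible(prop):
    """Modal 'diamond': prop is possible iff it holds at some world."""
    return bool(valuation[prop])

def probability(prop):
    """A uniform measure over worlds turns the same valuation
    into numbers, as the comment above suggests."""
    return len(valuation[prop]) / len(worlds)

# Both p and q are possible, so a purely possibilistic theory cannot
# tell them apart -- but counting worlds can.
assert possible("p") and possible("q")
assert probability("p") == 0.25 and probability("q") == 0.75
```

On this reading, possibility is the coarse-graining "probability > 0" of a world-counting measure, which is exactly the question of whether probability or possibility is the more fundamental notion.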
Let me mention with a grin that John Baez might well have trouble replacing his negative probability with a negative possibility.
Thank you for your comment. Although I'm aware of John Baez's work, I don't know what "his negative probability" is. Could you point me to a reference?
Obviously, probabilities in the sense of asymptotic frequencies are always non-negative. And asymptotic frequencies are what the Born rule predicts. No negative probabilities here.
John Baez is a prolific writer. I am sorry; I did not consider it worth noting where I read claims of negative energy and even the most nonsensical negative probability, maybe in sci.physics.research, maybe via his homepage. While I did not find physical items without a natural zero, experts like Baez claim that virtually anything can be positive as well as negative.
I understand that negative pressure, for instance, is reasonable:
- if we consider an instantaneous value of the alternating component
- if we measure sound pressure re 20 microPascal in dB.
In general, applying formal mathematics requires obedience to common sense, if necessary in the aftermath. Joke: just 3 people are sitting in a room; then 5 of them leave it. Consequently, 2 have to come in to make the room empty.
If you have a sense of humor, read the longish lesson in two parts just written to me by Anton Biermans at my 527. I feel sick.
Dear Andreas Martin,
Unfortunately I'm having trouble retrieving the paper you mentioned... no online access, and it is not in our library.
But I am currently reading your essay, which seems to contain some intriguing observations, and I will need a little more time to fully grasp it.
So do you mean that probability emerges from a Kripke model of modal logic, just as it emerges in many-worlds interpretations of quantum mechanics? If yes, then that implies that -- in modal logic -- probability is not a fundamental concept, but merely a derived quantity. If this is what you mean when you write "this function naturally leads to probabilities", then we agree: probabilities then are an extremely useful, but non-fundamental, concept.
Also remember that I'm not claiming possibilistic physics to be realistic in any sense; I'm just trying to question established concepts and see what one can do without them.
(By the way, both the long delay in posting Spekkens' email and the missing spaces therein are completely my bad.)