CATEGORY:
Undecidability, Uncomputability, and Unpredictability Essay Contest (2019-2020)
TOPIC:
Undecidability and indeterminism by Klaas Landsman
Author Klaas Landsman wrote on Mar. 7, 2020 @ 21:09 GMT
Essay Abstract
The famous theorem of Bell (1964) left two loopholes for determinism underneath quantum mechanics, viz. non-local deterministic hidden variable theories (like Bohmian mechanics) or theories denying free choice of experimental settings (like 't Hooft's cellular automaton interpretation of quantum mechanics). However, a precise analysis of the role of randomness in quantum theory, and especially its undecidability, closes these loopholes, so that, accepting the statistical predictions of quantum mechanics, determinism is excluded full stop. The main point is that Bell's theorem does not exploit the full empirical content of quantum mechanics, which consists of long series of outcomes of repeated measurements (idealized as infinite binary sequences). It only extracts the long-run relative frequencies derived from such series, and hence merely asks hidden variable theories to reproduce certain single-case Born probabilities. For the full outcome sequences of a fair quantum coin flip, quantum mechanics predicts that these sequences (almost surely) have a typicality property called 1-randomness in logic, which is definable via computational incompressibility à la Kolmogorov and is much stronger than e.g. uncomputability. Chaitin's remarkable version of Gödel's (first) incompleteness theorem implies that 1-randomness is unprovable (even in set theory). Combined with a change of emphasis from single-case Born probabilities to randomness properties of outcome sequences, this is the key to the above claim.
Author Bio
Klaas Landsman (1963) is a professor of mathematical physics at Radboud University (Nijmegen, the Netherlands). He was a postdoc at DAMTP in Cambridge from 1989 to 1997. He mainly works in mathematical physics, mathematics (notably non-commutative geometry), and foundations of physics. His book Foundations of Quantum Theory: From Classical Concepts to Operator Algebras (Springer, 2017, Open Access) combines these interests. He is associate editor of Foundations of Physics and of Studies in History and Philosophy of Modern Physics and is a member of FQXi.
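As a rough, hands-on illustration of the incompressibility idea the abstract invokes (a heuristic proxy only: Kolmogorov complexity is uncomputable, and 1-randomness concerns infinite sequences, so a general-purpose compressor can at best gesture at the distinction), one can compare how well zlib does on a patterned versus a pseudo-random bit string:

```python
import random
import zlib

def compressed_size(bits: str) -> int:
    """Length in bytes of the zlib-compressed encoding of a bit string."""
    return len(zlib.compress(bits.encode("ascii"), level=9))

n = 10_000
# Highly regular: a very short description ("repeat '01' 5000 times") exists.
patterned = "01" * (n // 2)
# Pseudo-random: no obvious short description (though, being generated by a
# seeded PRNG, it is of course not 1-random in the formal sense).
rng = random.Random(42)
pseudo_random = "".join(rng.choice("01") for _ in range(n))

# The regular string compresses far below its length; the pseudo-random one
# stays near the information-theoretic floor of ~1 bit per character.
print(compressed_size(patterned), compressed_size(pseudo_random))
```

The compressed size here plays the role of an upper bound on description length; 1-randomness proper is defined via prefix-free Kolmogorov complexity, which no real compressor attains.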
Jochen Szangolies wrote on Mar. 8, 2020 @ 13:02 GMT
Dear Prof. Landsman,
this is a very exciting essay! I have only given it a first pass, but as far as I understand, you propose to extend the scope of Bell's theorem from the statistics of ensembles of measurement outcomes to the characteristics of individual outcome strings, thus uncovering the incompatibility of quantum mechanics with (a certain notion of) determinism.
I think this is a highly original way to think about these issues; certainly, most treatments never leave the level of statistical analysis, but of course, the statistical properties of an ensemble don't suffice to fix those of its members. I'm reminded of the old joke: the average human has one testicle and one breast, features which a 'theory' of beings that have one testicle and one breast each may well replicate; but that theory would fail badly at reproducing the characteristics of humans on an individual basis.
Again, if I understand you correctly, your main argument is that the deterministic replacements of quantum mechanics fail to replicate the typicality of individual outcome strings, while meeting the requirements posed by the Born rule. That outcome sequences of quantum mechanics must be Kolmogorov random has been argued before, in various ways---Yurtsever has argued that computable pseudorandomness would lead to exploitable signalling behavior (https://arxiv.org/abs/quant-ph/9806059), echoed by Bendersky et al., who explicitly prove that non-signalling deterministic models must be noncomputable, if they are to recapitulate the predictions of quantum mechanics (https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.118.130401). Likewise, Calude and Svozil have argued for non-computability from a generalization of the Kochen-Specker theorem (https://arxiv.org/abs/quant-ph/0611029). (And of course, there are my own efforts, see https://link.springer.com/article/10.1007/s10701-018-0221-9, which also makes use of Chaitin's incompleteness theorem, and my contribution to this contest.)
Thus, the randomness of any quantum outcome sequence can't be produced by any effective means, and hence, any deterministic theory must either fail to reproduce these outcome sequences, or otherwise incorporate this randomness by fiat (as in the Bohmian equilibrium hypothesis), which renders its origin essentially mysterious.
It seems to me that at the heart of this is really the observation that you can write any noncomputable function as a finite algorithm that has access to an infinite reservoir of (algorithmic) randomness. In this way, Bohmian mechanics can play the role of the algorithmic part, which has to be augmented by an infinite random string in order to replicate the quantum predictions.
There is, however, another way that's quite popular at present---you can also just compute any sequence whatsoever in parallel, by an 'interleaving' algorithm that just outputs all possible bit strings, running forever. A measure-1 subset of the strings produced in this way will be typical, but the overall operation is, of course, quite deterministic. This is basically the sense in which the many worlds interpretation is deterministic: if we just look at any given bitstring as a single 'world', then in general one would expect to find oneself in a world that's algorithmically random.
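The 'interleaving' production of all strings described above can be sketched as a simple length-order enumeration; the generator name below is my own choice for illustration. Run forever, it deterministically emits every finite binary string, even though almost all long outputs are individually incompressible:

```python
from itertools import count, product

def all_binary_strings():
    """Deterministically enumerate every finite binary string, in length order."""
    for n in count(1):
        for bits in product("01", repeat=n):
            yield "".join(bits)

gen = all_binary_strings()
first = [next(gen) for _ in range(6)]
print(first)  # ['0', '1', '00', '01', '10', '11']
```

The whole procedure has a constant-size description, which is exactly why no single 'world' (branch) can inherit its simplicity: picking out one typical string from the enumeration requires information the algorithm itself does not contain.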
Another observation of yours I also found highly interesting, namely, that in principle the non-signalling nature of quantum mechanics should be considered as a statistical notion, like the second law of thermodynamics. In the limit of infinitely long strings, the non-signalling principle will hold with probability 1, but for finite lengths, deviations may be possible. However, one probably couldn't get any 'useful' violations of non-signalling out of this, because one could probably not certify these sorts of violations (although perhaps one could provide a bound in such a way that Alice and Bob will agree on a certain message with slightly higher probability than merely by guessing, with that probability going to the guessing probability with the length of the message).
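The idea that non-signalling might hold only statistically can be illustrated at the level of plain frequency fluctuations: for finite runs, the observed relative frequency of a fair-coin outcome deviates from 1/2 on the order of 1/sqrt(N), vanishing only in the infinite limit. A minimal Monte Carlo sketch (illustrative only; it does not model an actual Bell experiment, and the function name is mine):

```python
import random

def frequency_deviation(n: int, trials: int = 200, seed: int = 0) -> float:
    """Average |relative frequency of heads - 1/2| over many runs of n fair flips."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        heads = sum(rng.random() < 0.5 for _ in range(n))
        total += abs(heads / n - 0.5)
    return total / trials

# Deviations shrink roughly like 1/sqrt(n): quadrupling n about halves them.
for n in (100, 400, 1600):
    print(n, round(frequency_deviation(n), 4))
```

In the same spirit, finite-length marginal statistics in a hidden-variable model could deviate slightly from the non-signalling ideal, with the deviation washed out as the run length grows.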
Anyway, thanks for this interesting contribution. I wish you the best of luck in this contest!
Cheers
Jochen
Author Klaas Landsman replied on Mar. 9, 2020 @ 08:32 GMT
Dear Jochen, Thank you for this kind and detailed post, which summarizes my paper well. I am well aware of the literature you cite (including your own 2018 paper, which I studied in detail with an honours student), but the rules for this essay contest exclude an extensive bibliography - I cite most of these papers in my Randomness? What randomness? paper, ref. [20] in my essay, available Open Access at https://link.springer.com/article/10.1007/s10701-020-00318-8
(although your 2018 paper dropped off the longlist of references included in earlier drafts, which became almost book-length, so even in [20] at the end of the day I only cited papers I directly relied on. I will comment on your paper in a separate post about your own essay in this contest later today).
You write: "It seems to me that at the heart of this is really the observation that you can write any noncomputable function as a finite algorithm that has access to an infinite reservoir of (algorithmic) randomness." This observation is the Kucera-Gacs Theorem (Theorem 8.3.2 in ref. [8] of my essay, Downey & Hirschfeldt), which states that every set is reducible to a random set (acting as an oracle). Phrased in this way, your point on Bohmian mechanics ("Bohmian mechanics can play the role of the algorithmic part, which has to be augmented by an infinite random string in order to replicate the quantum predictions.") is, as you suggest, exactly the point I make in my essay, implying that precisely because of this dependence on a random oracle (which has to come from "nature" itself? or what?) it cannot be a deterministic underpinning of quantum mechanics. And likewise for 't Hooft's or any other serious hidden variable theory.
Finally, as to your last point, "that in principle the non-signaling nature of quantum mechanics should be considered as a statistical notion, like the second law of thermodynamics.": I proposed this in my Randomness? What randomness? paper but did not include it in my current essay, although all these things are closely related. In any case, as I learnt from my friend Guido Bacciagaluppi, it was Antony Valentini who first made this point, long ago. But it should be much more widely known!
Keep it up! All the best, Klaas
Jochen Szangolies replied on Mar. 11, 2020 @ 16:24 GMT
Dear Klaas,
thanks for your reply. I seem to have gotten the gist of your argument right. I agree that, from the point of view you suggest, the 'determinism' of Bohmian mechanics (I can't comment too much on 't Hooft's cellular automata) looks sort of question-begging---sure, you can shunt all the randomness into the initial conditions, but that doesn't really make it disappear.
I did have a quick look at your 'Randomness' paper, which will surely make for some interesting reading once I get time for a more in-depth pass; that quick skim is why I got confused regarding the source of the 'statistical' nature of the non-signalling constraint. Is the Valentini paper you mention 'Signal-Locality in Hidden-Variables Theories'?
Finally, please tell the poor honors student who had to wade through my bloviations that I'm sorry. ;)
Cheers
Jochen
Dan J. Bruiger wrote on Mar. 8, 2020 @ 17:57 GMT
Dear Prof. Landsman,
I am no mathematician, but it strikes me that all attempts to define randomness as an inherent objective property of sequences in math, or of events in nature, are doomed to failure precisely because randomness (or its opposite, determinism) is the assertion of an agent (mathematician or physicist). This fact aligns it with prediction and computation, which are the actions of agents. The concept of determinism as an objective state of affairs is an unfortunate result of an ambiguity in language and thought. It confuses the purported ability of one external thing, to fix the state of another (causality), with the ability of an agent to ascertain the state in question (knowledge)—that is, to “determine” what has actually occurred. I hold that determinism is a property of human artifacts, such as equations and machines, because they are products of definition. Physicists have found it convenient to ascribe it to nature in the macroscopic realm and some would like to extend that to the micro realm. But strictly speaking, neither determinism nor non-determinism can be properties of nature itself.
On another note, as you point out, one completed or pre-existing STRING of binary digits is no more probable than another, as an arbitrary selection from the set of all strings (“an apparently “random” string like σ = 0011010101110100 is as probable as a ‘deterministic’ string like σ = 111111111111111”). In contrast, as a product of individual coin flip events, the latter sequence is certainly less probable than the former. I would point out that completed sequences (and the set of all strings) are concepts created by agents, not things existing in nature. The same must apply to the notion of prior probability as a property inhering in individual events (independent of observers).
I suspect that many dilemmas in physics would be viewed differently if the role of the observer or theorist were better taken into account. I hope these comments might be of some interest, and I apologize if they are tangential to your very impressive argument.
Cheers,
Dan Bruiger
Author Klaas Landsman replied on Mar. 9, 2020 @ 15:57 GMT
Dear Dan,
Thanks for your comments. I would agree that (in)determinism is not a property of "Nature" but of physical theories, which indeed are assertions of agents. However, this does not seem to block the possibility of randomness of sequences or other mathematical structures.
I do not understand your remark that "In contrast, as a product of individual coin flip events, the latter sequence is certainly less probable than the former." According to standard probability theory they _are_ equally probable.
Your comments are not at all merely tangential to my topic: they go to the heart of it!
Best wishes, Klaas
Dan J. Bruiger replied on Mar. 11, 2020 @ 17:20 GMT
Hi, again
I simply mean that a series of coin tosses would look much more like mixed zeros and ones than a sequence of all ones (about half and half heads and tails as opposed to all heads). I think what you mean ("standard probability theory") is that ANY sequence, itself considered as a pre-existing entity, that is arbitrarily drawn from an infinite set of possible such sequences is equally probable (or rather improbable)? But my point is that such pre-existing sequences don't exist. There is only each event of a coin toss and the records we keep of them. In a series of such records, more of them will accumulate mixed ones and zeros than a straight run of ones, for example.
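The two notions of 'probable' in this exchange can be separated numerically: under a fair coin, any particular length-n sequence has probability 2^-n, but the number of sequences with a given count of heads is a binomial coefficient, so 'about half heads' describes vastly more sequences than 'all heads'. A small sketch:

```python
from math import comb

n = 16
# Every individual length-16 sequence is equally likely...
p_any_specific = 2.0 ** -n   # e.g. '1111111111111111' or '0011010101110100'

# ...but the *classes* of sequences differ wildly in size:
p_all_heads_class = comb(n, n) * p_any_specific        # 1 sequence in this class
p_half_heads_class = comb(n, n // 2) * p_any_specific  # 12870 sequences

print(p_any_specific)       # probability of any single named sequence
print(p_all_heads_class)    # equals the above: only one all-heads sequence
print(p_half_heads_class)   # far larger: the class, not any single member
```

This is the standard resolution of the apparent disagreement: the all-ones record is exactly as probable as any particular mixed record, while a record that merely 'looks mixed' is overwhelmingly more probable because mixed-looking records vastly outnumber uniform ones.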
Thanks for your patience.
Dan
Wilhelmus de Wilde de Wilde wrote on Mar. 9, 2020 @ 09:34 GMT
Dear professor Landsman (Beste Klaas);
First: Thank you for the very clear and understandable historical introduction.
Quote: “I conclude that deterministic hidden variable theories compatible with quantum mechanics cannot exist; Bell’s theorem leaves two loopholes for determinism (i.e. nonlocality or no choice) because its compatibility condition with quantum mechanics is not stated strongly enough.” Unquote. I agree with your conclusion, only for different reasons:
1. Our past (as down-loaded in our memory) is the only deterministic entity (the seemingly cause and event line)
2. The NOW moment is still in the future for an emergent conscious agent (the flow of processing time in the emergent phenomenon reality).
3. ALL the experienced emergent time-lines are, as I argue, ONE (eternal compared to our emergent reality) dimensionless Point Zero in Total Simultaneity, from where an infinity of Choices (hidden variables) can be made by the partial consciousness of an agent. So, the future is NON-Deterministic as we are approaching Point Zero. Point Zero, however, is unreachable for any emergent phenomenon out of Total Simultaneity.
Quote: “Now a proof of the above true statement about deterministic hidden variable theories should perhaps not be expected to show that all typical quantum mechanical outcome sequences violate the predictions of the hidden variable theory, but it should identify at least an uncountable number of such typical sequences–for finding a countable number, let alone a mere finite number, would make the contradiction with quantum mechanics happen with probability zero.” Unquote. I think you are quite right here (this conclusion makes me think of the super-asymmetry theory of Sheldon Cooper… I hope I am not offending you). Making a LAW of something doesn’t mean that it will always be so in a future that emerges; the more information we gather (in trying to complete our models), the more changes we will encounter.
I liked your approach and conclusions, but there are different ways to come to the same conclusions, so I hope that you can find some time to read my approach. Thank you very much and good luck in the contest.
Wilhelmus de Wilde
Wilhelmus de Wilde de Wilde replied on Mar. 30, 2020 @ 09:29 GMT
Dear Professor Landsman,
I am still waiting for your comment.
Sorry for the comparison with "The BB theory"
Wilhelmus de Wilde
Mihai Panoschi Panoschi wrote on Mar. 17, 2020 @ 15:28 GMT
Professor Landsman,
I must admit your approach took me back in time to the ancient Greek insight that cosmos/order/certainty came out of or is grounded on chaos/disorder/uncertainty.
If randomness (as apparent lack of patterns and predictability in events) is a measure of uncertainty, and since outcomes over repeated trials of the same event often follow a probability distribution, the relative frequency over many trials is predictable (so in a way von Mises was right to derive probability theory from randomness, even though he failed in that attempt and thereby helped Kolmogorov succeed). In other words, randomness could be more fundamental than the probability theory that has permeated QM and statistical mechanics since Boltzmann, even though your concept of randomness is in the mathematical sense of Chaitin-Kolmogorov, not in von Mises' sense.
Coming back to our theme: if Gödel's theorems tell us that the fields of mathematics and physics (according to Hilbert's axiomatic programme) cannot be grounded on logic (in the classical and symbolic sense; though who knows, maybe one day they could be grounded on a different type of logic, say Brouwer-Heyting intuitionism or Moisil-Lukasiewicz many-valued logic), and Bell's theorem tells us that QM cannot be grounded on classical determinism or any underlying hidden variable theory à la de Broglie-Bohm-Vigier, then how do we know that we haven't been using undecidable results to prove our theorems in both mathematics and physics throughout the millennia (like the ones we found in Euclidean geometry, so that Hilbert had to re-axiomatize it)?
Does this mean that, ultimately, randomness and chaos could be the ground for both mathematics and physics, with their logical necessity and deterministic or indeterministic laws, and that the Greeks were right after all?...
Author Klaas Landsman replied on Mar. 18, 2020 @ 08:26 GMT
Hi Mihai, Thank you for these interesting comments. I agree with your last point: ultimately, all laws derive from randomness! A point made repeatedly by my colleague Cristian Calude is that randomness cannot imply lawlessness, since any (infinite) sequence necessarily possesses some arithmetic regularities (Baudet's Conjecture/van der Waerden's Theorem). It should be stressed that random sequences of the kind studied in Kolmogorov complexity theory are far from lawless in the sense of Brouwer: they are pretty regular in satisfying all kinds of statistical laws that follow from 1-randomness, as I explain in my essay. I am less sure about your observation that the theorems of mathematics we use in physics are grounded on undecidable results; e.g., the derivation of the incompleteness results by Gödel and Chaitin itself is based on decidable propositions only (at least as far as I know). Also, I would not say that Gödel's theorems imply that mathematics cannot be grounded on logic, except when you mean "grounded" in Hilbert's sense, namely a proof of consistency. Without knowing that e.g. ZFC is consistent, it is still a logical language in which we do our mathematics, most of which is decidable in ZFC. Best wishes, Klaas
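The van der Waerden point above (no sequence, however random, can avoid all arithmetic regularities) can be checked concretely for the smallest nontrivial case: every binary string of length 9 contains three equally spaced positions carrying the same symbol (a monochromatic 3-term arithmetic progression), while length 8 does not force one. A brute-force sketch:

```python
from itertools import product

def has_mono_3ap(s: str) -> bool:
    """True if s has three equally spaced positions i, i+d, i+2d with equal symbols."""
    n = len(s)
    for i in range(n):
        for d in range(1, (n - i - 1) // 2 + 1):
            if s[i] == s[i + d] == s[i + 2 * d]:
                return True
    return False

# Van der Waerden: the relevant number for 3-term progressions and 2 symbols is 9,
# so every binary string of length 9 contains one...
assert all(has_mono_3ap("".join(bits)) for bits in product("01", repeat=9))
# ...while a length-8 string can still avoid it:
print(has_mono_3ap("00110011"))  # False
```

So even a 1-random sequence is forced to exhibit such progressions in every sufficiently long block, which is one concrete sense in which "random" never means "lawless".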
Mihai Panoschi Panoschi replied on Mar. 20, 2020 @ 13:21 GMT
Thank you for your reply, Prof. Landsman. Indeed, I meant consistency of ZFC in Hilbert's sense: according to Gödel's second incompleteness theorem, the consistency of ZFC cannot be proved within ZFC itself (unless it is actually inconsistent). In other words, if ZFC is taken as a deductive formal axiomatic system of set theory, foundational for most of classical mathematics, then its consistency cannot be demonstrated in this system alone. Given that, so far, ZFC has proved immune to the classical set theory paradoxes of Cantor, Russell, or Burali-Forti, this of course does not imply that ZFC is absolutely (categorically) free of any potential inner contradictions that might come up one day; wouldn't you agree?...
The relativity of set-theoretic concepts, such as the ones used in ZFC, was signalled quite early on by the Löwenheim-Skolem theorem, which subsequently led to Skolem's paradox (1922): if ZFC is consistent, then its axioms must be satisfiable within a countable domain, even though they prove the existence of uncountable sets in Cantor's sense. Nowadays, we know that there are many mathematical statements undecidable in ZFC, and other axioms need to be added in order to prove results in branches of mathematics such as category theory or algebraic geometry, whose theorems are currently being used in some modern theories of physics and which work with, for instance, Tarski-Grothendieck set theory, one of many extensions of ZFC. Best wishes, Mihai
Stefan Weckbach wrote on Mar. 23, 2020 @ 17:49 GMT
Dear Klaas Landsman,
interesting essay, I like it. Especially the fact that 'randomness' does not imply 'lawlessness', a result that is often overlooked when talking about randomness.
I would be happy if you would comment on my essay, where I try to figure out similar links between logic and physics.
Fabien Paillusson wrote on Apr. 12, 2020 @ 16:13 GMT
Dear Klaas,
I found your essay truly brilliant: it combines, in a clear manner, quantum mechanics and Gödelian results, and uses them to discuss the possibility of deterministic theories underlying quantum mechanics.
Since this is not my area, I must say I was quite blown away by some statements such as " fair classical coins do not exist". I am still recovering in fact and will have to look at the references you gave in your essay.
With regards to that statement, I wanted to make sure I understood what is meant here:
- Do you mean to say that a coin whose motion is determined by Newton's (or Hamilton's) equations of motion cannot eventually give rise to a fair coin toss (unless true fair randomness is put in the initial conditions)? or
- Do you mean to say that a fair coin model within classical probability theory is actually not fair?
I believe this is the former but just want to make sure.
Finally, given that the argument relies, as far as I understood, on infinite sequences, is there a finite version of it whereby, say, a membership function (for the Kolmogorov random character of a sequence) would be in between 0 and 1 for any finite N but would tend to zero when N tends to infinity?
Congratulations again on this very nice essay.
Many thanks.
Best,
Fabien
Author Klaas Landsman replied on Apr. 12, 2020 @ 16:54 GMT
Dear Fabien,
Thank you for your kind words. I meant the former, the latter presupposes fairness. The reason is, as I explain in my essay, that a fair coin toss requires a truly random sampling of a probability measure, which classical physics cannot provide (I am not claiming that Nature can provide it! But QM can, in theory).
Your second question is very good, space limitations prevented me from discussing it. The question is about what I call "Earman's Principle" in my (Open Access) book, Foundations of Quantum Theory, see http://www.springer.com/gp/book/9783319517766, namely: "While idealizations are useful and, perhaps, even essential to progress in physics, a sound principle of interpretation would seem to be that no effect can be counted as a genuine physical effect if it disappears when the idealizations are removed." This is valid in the arguments in my essay because the definition of Kolmogorov randomness of infinite sequences guarantees it, coming from a limiting construction, as it does.
Best wishes, Klaas
Steve Dufourny wrote on Apr. 12, 2020 @ 18:23 GMT
Hello Professor Landsman,
Wowww, I must say that your essay is very relevant and general. I liked a lot how you approach this topic about our limitations. I wish you all the best. I shared it on Facebook; it is one of my favorites, together with the essays of Del Santo and Klingman.
best Regards
Robert H McEachern wrote on Apr. 12, 2020 @ 20:07 GMT
"The famous theorem of Bell (1964) left two loopholes for determinism underneath quantum mechanics..."
It left a lot more than that. The theorem is also founded upon the dubious, idealistic assumptions that (1) particles are absolutely identical and, worse still, (2) all the "measurements" of the particle states are absolutely without errors.
It is easy to demonstrate that when those two assumptions are violated, as they both must be when only a single bit of information is being manifested by an entangled pair, then Bell's correlations can be readily reproduced classically, with detection efficiencies above the supposed quantum limit. Note that the detection efficiency actually measured in the demonstration is the combined, dual detection efficiency, not the usually reported single-detector efficiency. The former cannot even be measured in a conventional Bell test.
Rob McEachern
Edwin Eugene Klingman wrote on Apr. 12, 2020 @ 21:40 GMT
Dear Klaas Landsman,
If by ‘compatible with quantum mechanics’ one means that ‘qubits’ are real, then the argument is over. But there are probably a dozen interpretations of QM with almost as many claimed ‘ontologies’.
Bell demands qubits in his first equation: A, B = +1, -1. And for spins in magnetic domains this is a good statistical model, and reasonable. Unfortunately, for the Stern-Gerlach model upon which Bell based his reasoning, it is not. The SG data shown on the “Bohr postcard” is anything but +1 and -1, and a 3-vector spin model produces almost exactly the SG data.
A number of authors are concerned whether ‘classical physics’ is truly deterministic, and if not, how is this explained.
If one assumes that the deBroglie-like gravitomagnetic wave circulation is induced by the mass flow density of the particle [momentum-density], then the equivalent mass of the field energy induces more circulation. This means that the wave field is self-interacting. For ‘one free particle’ a stable soliton-like particle plus wave is essentially deterministic. But for many interacting particles, all of which are also self-interacting, then ‘determinism’ absolutely vanishes, in the sense of calculations or predictions, and the statistical approach becomes necessary.
This theory clearly supports ‘local’ entanglement, as the waves interact and self-interact, while rejecting Bell’s ‘qubit’-based projection: A, B = +1, -1 consistent with the Stern-Gerlach data (see Bohr postcard). For Bell experiments based on ‘real’ spin (3-vector) vs ‘qubit’ spin (good for spins in magnetic domains) the physics easily obtains the correlation which Bell claims is impossible, hence ‘long distance’ entanglement is not invoked and locality is preserved.
This is not a matter of math; it is a matter of ontology. I believe ontology is the issue for the number of authors who also seem to support more ‘intuition’ in physics. My current essay, Deciding on the nature of time and space, treats intuition and ontology in a new analysis of special relativity, and I invite you to read it and comment.
Edwin Eugene Klingman
Robert H McEachern replied on Apr. 13, 2020 @ 15:29 GMT
Edwin,
Thanks for mentioning the Bohr postcard. I had never actually seen the image until your comment provoked me to search for it.
I would assert that QM is not about ontology at all. It is not describing what exists in "being", but only the statistics of a threshold-based energy-detection of the things in being. So if you draw a vertical threshold line down the middle of the "B-field on" image, you create the two states. But at the top and bottom of the image, those two states blur together and it becomes impossible to correctly distinguish between them. That is the problem with all Bell tests that I noted in my comment above. When you examine a "coin" face-on, it is easy to correctly "call it". But not when you examine it "edge-on". The actual ontology of a coin is that it is what it is - not what you observe. Thus, a polarized coin is in the ontological state of merely being polarized; it is not polarized either "up" or "down" - the latter are merely the result of "observing" the polarization, with a detector that measures a different energy in the "polarization" as a function of the angle between the coin's axis and the axis of the detector - and then introducing a threshold to "call it" one state or the other - or "none of the above", in the event that there is not enough energy to ever reliably detect the object at all, as when it is nearly edge-on and thus "too close to call".
In this context, it is useful to bear in mind, that QM got its start, when it was first observed that the photoelectric-effect behaved just as if an energy threshold exists.
Rob McEachern
John R. Cox replied on Apr. 24, 2020 @ 14:47 GMT
Hello Ed,
sorry not to have yet read your essay, I've been concentrating on practical matters and have only looked in on the contest pages. You and I have discussed the idealized Spin co-ordinate system in the past and the Bell dependence on it. So the issue of incompatibility of QM statistical models with a rational realism is perhaps inevitable. You make an excellent point in paragraph 4 about 'one free particle' ( the classic hypothetical 'free rest mass' ) being capable of local deterministic theoretical construct, while globally interactions would have to be treated statistically. So the Bell arguments come down to a simple; 'Why Not?'. QM was designed on purpose to idealize for the sake of granular simplicity. It was not intended to be realistic and its precision derives from intentional extensive reiteration. It is expedient, and Bell is yet another iteration. And so, 'why not?', there is surely room enough in physics for both realism and expediency. Best wishes, jrc
Harrison Crecraft wrote on Apr. 13, 2020 @ 16:36 GMT
Dear Dr. Landsman,
Thank you for your well written essay. I agree with your conclusion that quantum mechanics is intrinsically random, and that hidden variables or initial conditions do not adequately explain the randomness of quantum measurement results. However, I reach a different conclusion on the origin of quantum randomness.
In comparing Standard QM and Hidden Variables QM in section 4, you conclude that we have a choice between 1) random outcomes of measurements on identical quantum states, and 2) deterministic measurement outcomes on random or randomly sampled HVs.
You reject the second choice on the basis that HV theories are deterministic only at first sight, and this therefore undermines their rationale. You conclude that the randomness of measurement results reflects randomness of the measurement process itself. This is, in essence, the orthodox (Copenhagen) interpretation. The Copenhagen interpretation is essentially an empirical model describing measurement outcomes in terms of Born probabilities.
In my essay I outline a conceptual model and interpretation that provides a third option to explain the randomness of measurement results. I suggest that the randomness of measurement outcomes results from deterministic measurements on an ensemble of metastable quantum states, for example, an ensemble of identically prepared radioactive isotopes. Between its initial preparation and subsequent measurement, a metastable particle is subject to random transitions to a quantum state of higher stability. Deterministic measurements subsequent to the ensemble’s preparation will therefore reveal random outcomes—no hidden variable required. As described in the essay, the proposed conceptual model is consistent with empirical observations, it is based on empirically sound and conceptually simple assumptions, and it explains the measurement problem and many other quantum “paradoxes.” I hope you have a chance to look at it.
Best regards,
Harrison Crecraft
Peter Jackson wrote on Apr. 15, 2020 @ 13:18 GMT
Dear Klaas,
Sounds interesting. I've downloaded it to my reading list. You may have missed my last year's finalist essay showing a physical sequence can surprisingly reproduce QM's data set, in the way Bell predicted. I touch on it this year.
In the meantime, could you perhaps answer these questions for me:
1. Is a physical 'measurement' interaction more likely to be with a spinning sphere, or a 2D spinning coin? If the former, then:
2. If we approach the sphere from random directions to measure the momentum states "ROTATIONAL" (clockwise or anti-clockwise) and also "LINEAR" (left or right), will we always likely find 100% certainty for both?
3. With one at 100% certainty (say linear at the equator), will the other state not reduce, down to 50:50?
4. Now with 100 interactions in a row, will any statistical uncertainty tend to increase or decrease?
5. Did you know the rate of change of rotation speed (so momentum) of Earth's surface with latitude over 90° between pole and equator is cos(latitude)?
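[As an aside, question 4 is easy to probe numerically: for repeated 50:50 outcomes, the statistical uncertainty in the relative frequency shrinks roughly like 1/√n as interactions accumulate. A minimal sketch, under the simplifying assumption of independent fair-coin trials (the function name and parameters are just illustrative):

```python
import random

def frequency_spread(n_flips: int, n_runs: int = 200, seed: int = 0) -> float:
    """Empirical spread (standard deviation) of the relative frequency of
    'heads' over n_runs independent series of n_flips fair flips each."""
    rng = random.Random(seed)
    freqs = [sum(rng.random() < 0.5 for _ in range(n_flips)) / n_flips
             for _ in range(n_runs)]
    mean = sum(freqs) / n_runs
    return (sum((f - mean) ** 2 for f in freqs) / n_runs) ** 0.5

# Theory predicts the spread is 0.5 / sqrt(n); the estimate tracks it.
for n in (100, 1600):
    print(n, frequency_spread(n), 0.5 / n ** 0.5)
```

So under these assumptions the uncertainty in the frequency decreases with more interactions, even though each single outcome stays maximally uncertain.]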
Catch back up with you soon I hope.
Very Best
Peter
Author Klaas Landsman replied on Apr. 19, 2020 @ 12:43 GMT
Dear Peter,
These questions are very interesting but they do not really reflect on my essay, as you also say yourself, and I find them very hard to answer. The last one I do not even understand. They seem to be more general physics questions than I am able to deal with. Best wishes, Klaas
Peter Jackson replied on Apr. 19, 2020 @ 20:48 GMT
Dear Klaas,
Excellent analysis, and a new exclusion of De Broglie-Bohm and 't Hooft. Clearly and rigorously argued. My one concern was anticipated in your brilliant last paragraph, but that does then leave a THIRD option not addressed. I alluded to it above, with a proof in last year's essay and confirmation code and plot in my Trails essay. It suggests the elusive 'state after collision'. Your unique understanding, and ability to see QM may be emergent, should allow you to assess it:
I propose that your conclusion that randomness "contradicts" determinism needs better definition. My questions on spinning-sphere momenta ('OAM') above use a simple 'collision' momentum-transfer analogy. A random axis 'pair' state meets Bob's electron, and TWO-part vector addition means the A, B outcomes are independent! For collisions on the equator, amplitudes for the question "clockwise or anticlockwise?" will be deterministic, but ALSO uncertain (50:50), so 'random'. For collisions exactly at a pole, amplitudes for "left or right?" will be similarly uncertain, but also still deterministic in classical-mechanics terms! What I pointed out in Q5 was that the momentum CHANGE RATE (as Bob turns his dial) across the surface between 0 and 1 goes by the cosine of the 'angle of latitude' of the tangent point! (I then derive Cos²Theta.)
I suggest this is a very important finding, going way beyond just your essay (though it means your conclusion needs a little tweaking), but ALSO that your last paragraph is correct, as QM emerges from a 'Higgs condensate' gravity model (see my essay), and a Gell-Mann 'quasi-classical' derivation of QM should exist, though it may need your help to complete!
If you have a second, do also see my top peer-scored 2015 'Red/Green sock trick' essay.
Though I'm suggesting a minor 'incompleteness' in your analysis, of course agreement on content isn't a valid scoring matter. Well done, and thanks.
Very best
Peter
Cristinel Stoica wrote on Apr. 16, 2020 @ 06:13 GMT
Dear Klaas,
I enjoyed your essay very much, from your insightful parallels between Gödel's and Bell's theorems to your no-go theorem, which I think is amazing. I am still trying to grasp its physical implications. I'm also glad to see from your essay that you know Cris Calude. We met again when he came back to Bucharest a few months ago. He made me realize that randomness is not what we commonly think it is in physics; I realized that we use the word "randomness" pretty much randomly :D Your essay shows that this is indeed an important point, as Cris explained to me in our discussions, and one that is not well understood in physics. Despite his explanations and your eloquent essay, I am still not sure I fully understand the implications. I have a lot to digest, and I also want to find time to go deeper into your ref. [19], a heavy book I have had in my library for some time. So I may come back with some questions, but for the moment I am interested in one. Do you think, based on your analysis of the two representative examples of deterministic models and the implication of your theorem for them, that it is possible to distinguish them empirically from nondeterministic versions of quantum mechanics? My interest comes from trying to find falsifiable predictions for a single-world unitary-without-collapse model, which seems to fit in the same category as 't Hooft's cellular automata, though I interpret it differently than as denying free choice of experimental settings, as I explain in the attached pdf. In the last section I mention two possible experiments, and I am interested to see whether testing for genuine randomness can be physically done. I expect some loopholes stronger than in the EPR case, due to the fact that measurements are not sharp in general, and that the measurement device and the environment may not be stable enough to allow a comparison of repeated experiments numerous enough to tell whether the resulting randomness is genuine or not.
But I'm interested if you think this to be possible, at least in principle.
Cheers,
Cristi
attachments:
Cristi_Stoica_The_post-determined_block_universe_draft_2020-04-16.pdf
Author Klaas Landsman replied on Apr. 19, 2020 @ 12:51 GMT
Dear Cristi,
That's an interesting question. My analysis at the end of my essay suggests that the answer is no, deterministic HVB models are empirically indistinguishable from standard QM. This is not just because it is the way they are designed, but as I try to argue, the reason also lies in the unprovability of randomness, which should be the distinguishing feature. I cannot say I have fully grasped this issue, though, and you might benefit from the extremely interesting PhD thesis at the Universidad de Buenos Aires of G. Senno, A Computer-Theoretic Outlook on Foundations of Quantum Information (2017), which is easily found online.
The case of dynamical collapse models is similar - I proposed one of these, see my Open Access book Foundations of Quantum Theory, http://www.springer.com/gp/book/9783319517766, and I also worked with a group in Oxford to design an experiment to test my theory, but this failed, perhaps for different reasons: you cannot really monitor the collapse of the wave-function in real time. I will try to take a look at your PhD thesis also, though for completely different reasons. Best wishes, Klaas
Satyavarapu Naga Parameswara Gupta wrote on Apr. 17, 2020 @ 10:21 GMT
Dear Professor Klaas Landsman,
Thank you for presenting a wonderful essay written with a very smooth flow.
You state Gödel's theorem as: "Gödel proved that any consistent mathematical theory (formalized as an axiomatic deductive system in which proofs could in principle be carried out mechanically by a computer) that contains enough arithmetic is incomplete (in that arithmetic sentences φ exist for which neither φ nor its negation can be proved)."
I have a few questions about it. This theorem is applicable to quantum mechanics, but will it be applicable to COSMOLOGY?
I never encountered any such problem in the Dynamic Universe Model in the last 40 years, even though all the other conditions mentioned in that statement apply. I hope you will have a CRITICAL examination of my essay, "A properly deciding, Computing and Predicting new theory's Philosophy".
Best Regards
=snp
Member Noson S. Yanofsky wrote on Apr. 17, 2020 @ 20:02 GMT
Dear Klaas,
You really wrote a great article. Thank you!
All the best,
Noson
John David Crowell wrote on Apr. 19, 2020 @ 10:03 GMT
Dear Klaas. While reading your essay I got very excited. I am not a physicist or a mathematician; my main expertise is in creativity and its fit in the world, and in some respects its relationships with science and mathematics. You provided a very interesting overview of the "battle" between determinism and indeterminism. Several things in your essay as it relates to my essay are "very" exciting.
In my essay I describe a process that converts chaos to order, the C*s to SSCU transformation (described in the appendix of my essay), and the scale-up of the SSCU to become our (the visible) universe (described in the body of the essay). It appears to me that the C*s to SSCU transformation is the "...internal processing of atoms that enforce some particular outcome" expressed by Born (1926), though in my theory it is the internal processing of all physicality. In the Successful Self-Creation theory that "enforcement" was the self-replicating/self-organizing progression that eventually became the universe.
You also mention that the "...attempts to undermine the Copenhagen claim of randomness looked for deterministic theories underneath quantum mechanics", and you concluded that this was impossible. I agree with your conclusion. However, those looking to undermine the Copenhagen claim of randomness had it backward. The SSC theory presents a randomness (chaos) underneath, and the process that converted that randomness (chaos) into a repeating, self-replicating deterministic progression that became the multiverse that contains our universe.
There is much more that we should discuss. If you would read my essay and respond, it could be the beginning of an exciting discussion of the "musings on (in)determinism" in your essay. I am looking forward to hearing from you. John D. Crowell
Vesselin Petkov wrote on Apr. 19, 2020 @ 20:08 GMT
Dear Klaas,
Thanks for this brilliant essay. As I also "would personally expect that a valid theory of the Planck scale... would derive quantum mechanics as an emergent theory," have you thought of what seems to be a natural logical extension of the ancient idea of atomism - discreteness not only in space but in time (rather spacetime) as well? (To question a fundamental continuity - continuous existence in time - at the heart of quantum physics.)
Then the probabilistic behavior of the quantum object may be represented as a manifestation of a probabilistic distribution of the quantum object itself in the forever given spacetime: an electron, for instance, can be thought of as an ensemble of the points of its disintegrated worldline, which are scattered in the spacetime region where the electron wavefunction is different from zero. Then, in the ordinary three-dimensional language, an electron would be an ensemble of constituents which appear-disappear ∼10^20 times per second (the Compton frequency); and, obviously, such a "single" quantum object can pass simultaneously through all slits at its disposal in the double-slit experiments with single electrons and photons.
Had Minkowski lived longer he might have described such a probabilistic spacetime structure by the mystical expression "predetermined probabilistic phenomena."
It is true, the above question is more in the spirit of the Einstein-type approach to fundamental physics, whereas now the predominant approach seems to be more Minkowski-type (as we know Minkowski employed such an approach when he discovered the spacetime structure of the world).
Best wishes,
Vesselin
Vesselin Petkov replied on Apr. 20, 2020 @ 13:28 GMT
I am sorry, I wrote the comment above quickly and omitted half of the information in the last sentence of the second paragraph:
"and, obviously, such a "single" quantum object can pass simultaneously through all slits at its disposal in the double-slit experiments with single electrons and photons."
It should have been:
"and, obviously, such a "single" quantum object (i) can pass simultaneously through all slits at its disposal in the double-slit experiments with single electrons and photons and (ii) can still be detected as a point-like particle - when the first of the appearing-disappearing constituents of the electron falls in a detector, due to a jump of the boundary conditions, all other constituents continue to appear-disappear only in the detector."
Vesselin Petkov replied on May. 1, 2020 @ 02:23 GMT
Dear Klaas,
Please do not bother to react to my question above; I also know what it is like to have to try to reply to all comments while teaching. Moreover, I posted the question rather for your consideration and did not expect an answer, but I should have made that explicit. The reason for posting the question was that, personally, I think it is interesting to see what bringing the idea of atomism to its logical completion looks like.
Best wishes,
Vesselin
Member Simon DeDeo wrote on Apr. 20, 2020 @ 01:10 GMT
Hello Klaas,
Thank you for this delightful essay. Theorem 3.5 seems to be dead on. It's an unexpected move, but I think it has to be true intuitively.
That said, let me probe a little bit. I'm a Many-Worlds person (for the purposes of this comment!) For me, I get classical indeterminism from tracing over the wavefunction via decoherence. In this case, I ought *not* to expect algorithmic randomness. Indeed, what happened could be understood as the very deterministic outcome of what I happen to be tracing over. It's a thermodynamic randomness, not a Chaitin one.
In that case, I think, you can only recover the Chaitin randomness if the wavefunction itself is incompressible. (Going back and forth between K(x) and K(f) (f the wavefunction, x the realization) is something that's been on my mind a bit, and is part of the essay I did this year; I'm not sure I understand it fully yet.) I think your remarks about emergence in the final paragraph suggest you might find this idea sympathetic, if you haven't done a lot with it already!
I don't quite know how to square the thermodynamic and Chaitin notions of randomness—I think you can get pretty far by talking about K(x | f), but I'm not sure how far! The problem is that if f comes from a deterministic theory that you then trace over with an observer's ignorance, i.e.,
f(x) = sum_y d(x,g(y)) P(y)
where g is a deterministic, computable mapping, d(x,y) is a delta function, and P(y) is the observer's ignorance of the rest of the world, then I don't know what happens to K(x).
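[One computable handle on these questions: K(x) itself is uncomputable, but any off-the-shelf compressor yields an upper bound on it, which already separates a sequence generated by a short program from one that looks incompressible. A small sketch; using Python's zlib as the stand-in compressor is of course an arbitrary choice, and compression is only a crude proxy for Kolmogorov complexity:

```python
import random
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size / original size: a computable upper bound on
    Kolmogorov complexity per byte (true K(x) is uncomputable)."""
    return len(zlib.compress(data, 9)) / len(data)

periodic = b"01" * 5000                     # generated by a short program
noisy = random.Random(42).randbytes(10000)  # looks incompressible to zlib

print(compression_ratio(periodic))  # far below 1
print(compression_ratio(noisy))     # near (or slightly above) 1
```

Note the asymmetry: a low ratio certifies compressibility, but a high ratio never certifies randomness, which mirrors the unprovability of 1-randomness in the essay.]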
Yours,
Simon
Author Klaas Landsman replied on Apr. 22, 2020 @ 08:12 GMT
Dear Simon, Thank you for these comments and questions (meanwhile I have also read your own essay with interest; it will take me some time to digest the main point you make, but so far I agree with it). Since your question is posed in the context of the MWI, let me confess that in my early twenties I decided not to smoke, drink alcoholic beverages, eat meat, own a car, curse or shout in public, betray any lover, or waste my time on string theory and the many worlds interpretation (the latter two for the same two reasons: first, the religious zeal and sense of certainty of adherents of said ideas, and second, once more against the spirit of science as I see it, both assume QM as absolutely true and given, which already then I felt would need to be replaced at a more fundamental level; now that I have become familiar with the notion of emergence, I feel this even more strongly).
However, your question can be detached from the MWI, and one may validly ask what it means for a wave-function to be uncomputable: this may already happen for a two-level quantum system, if the (pure) state, seen as a point on the Bloch sphere, has non-computable coordinates. Inspired in part by the recent work of Gisin (and his collaborator Del Santo), but also by the much earlier work of the Dutch intuitionistic mathematician L.E.J. Brouwer, with my student Tein van der Lugt and others I am currently pursuing such questions, which come down to analyzing the role of real numbers in physics, including questioning even the definition of the real numbers in mathematics (which in intuitionism is very different from classical mathematics). Ultimately, I hope for a theory in which random sequences (in the sense of Kolmogorov, Chaitin, etc.), lawless choice sequences (in the sense of Brouwer), the output of repeated measurements on quantum systems, the role of (non)computability, finite precision even in classical physics, hidden variables, etc. all fall into place.
You would also be well qualified to contribute to this program.
Stay in touch! Best wishes, Klaas
Eckard Blumschein wrote on Apr. 20, 2020 @ 01:10 GMT
Congratulations, Klaas Landsman,
I see your essay as among the most mature and proficient here, if not the most. However, is Bell's theorem really the key to something new in science? What consequences does your set-theoretically based reasoning have in physics?
Perhaps you are excellently improving the map. Schlafly and I distinguish the map from the territory. Obviously, you didn't convince, for instance, McEachern and Kadin that they are wrong. Well, I too am unable to swallow Peter Jackson's alternative explanation. From Petkov's comment I learned the somewhat intentional-sounding expression "predetermined probabilistic phenomena". De Wilde seems to be still waiting for your reply. I am not in a position to judge some possibly relevant questions raised, for instance, by Crecraft, Klingman, and Gupta, and don't ask you to deal with them.
How many decades will Kadin have to wait until his prediction turns out wrong? I feel helpless against accepted mathematics that I consider questionable. When I showed in my recent essay that Fourier was partially wrong, this was an attack on the fundamental adaptation of mathematics to very basic physics.
Unfortunately, your training has made you blind to my hints at what I consider unjustified related maneuvers in mathematics. Isn't R+ reasonable even if not Hausdorff at zero?
Incidentally, while appreciating your logical reasoning, I hope you will understand me if I admit that I am arguing, so far in vain, against the use of "present state" between the alternatives past and future. I also tend to restrict "initial values" to ideal models, not to physical reality.
Best further success,
Eckard Blumschein
Member Tejinder Pal Singh wrote on Apr. 23, 2020 @ 19:36 GMT
Dear Professor Landsman,
While I am trying to understand your wonderful essay, your last paragraph caught my attention. In my recently proposed theory of Spontaneous Quantum Gravity, I indeed derive quantum theory as an emergent theory, from an underlying *deterministic* matrix dynamics at the Planck scale. This new theory actually forms the content of my essay here: The pollen and the electron. It builds on the earlier work of Stephen Adler on 'Quantum theory as an emergent phenomenon'.
My purpose in submitting this post on your page is not so much to ask you to read my essay, but rather to request that you kindly have a look at the technical references given at the end of my essay. I am curious to know what you think of this new theory, and will be so grateful for your critical comments. Also useful could be arXiv:1912.03266.
I will try to understand the proofs in your very interesting essay.
Thanks so much,
Tejinder
Steve Dufourny replied on Apr. 25, 2020 @ 10:55 GMT
Hi. See in my comments on other essays how I have explained, reached, quantified, and renormalised this quantum gravitation. See the universal balance necessary between entropy and negentropy, cold and heat, in considering the 3 main finite series of 3D coded spheres: one for the space and two fuels, photons and cold dark matter. I have respected this Newtonian mechanics. We search for this quantum gravitation, it is the holy grail, but the thinkers could be less absorbed in their own works and recognise the works of others. I repeat, it is quantified by simply changing the distances and mass; the standard model is just emergent. I will publish several papers this year. The thinkers have forgotten to think beyond the box and consider new superimposed parameters; they are in an electromagnetic and relativistic prison.
Regards
Steve Dufourny replied on Apr. 25, 2020 @ 11:01 GMT
I'd like to have relevant mathematicians here on FQXi, like Connes, Susskind, Witten, John Baez; with them we could make a revolution. They are better than me in maths, I believe, even if I have utilised these tools: the Ricci flow, the Hamilton Ricci flow, the Lie derivatives, the Lie groups, the Lie algebras, the Poincaré conjecture, the topological and Euclidean spaces, and the Clifford algebras and matrices of Dirac and of Clifford. I need help; there are maybe several errors in my mathematical extrapolations. I have also invented, with another person, the asymmetric Ricci flow to explain the unique things, probably in the smaller volumes of my finite series of 3D coded spheres where this space disappears, having the same finite number as our finite cosmological series of spheres. The universe shows us the generality and how it acts with the universal balance between heat and cold, gravitation and electromagnetism, order and disorder, entropy and negentropy. Sometimes the complexity returns to simplicity; the universe is generally simple. The human psychology and its vanity, that said, are complex and not really rational, lol.
David Brown wrote on Apr. 27, 2020 @ 10:43 GMT
“Both experts and amateurs seem to agree that Gödel’s theorem and Bell’s theorem penetrate the very core of the respective disciplines of mathematics and physics.” If nature is finite and digital then are Gödel’s theorem and Bell’s theorem fundamentally irrelevant to the reality of nature? I have suggested that my basic theory is wrong if and only if dark-matter-compensation-constant = 0.
If my basic theory is wrong, then the Koide formula and Lestone’s theory of virtual cross sections might be valid (although not in the way hypothesized by my basic theory).
Assume dark-matter-compensation-constant = 0 and string theory with the infinite nature hypothesis is empirically valid.
Lestone's theory of virtual cross sections might explain the numerical value of the fine structure constant.
Lestone, John Paul. "Possible reason for the numerical value of the fine-structure constant." Report LA-UR-18-21550. Los Alamos National Laboratory (LANL), Los Alamos, NM (United States), 2018.
Lestone, J. P. "QED: A different perspective." Los Alamos report LA-UR-18-29048 (2018).
If Lestone's theory of virtual cross sections is empirically valid, then does it require a new uncertainty principle?
According to some of the string theorists, spacetime is doomed. If spacetime is doomed then is a new uncertainty principle required? What are the criticisms of the following?
There exists a (finite) Lestone-maximum-mass > 0 such that, for any massive elementary particle in the Standard Model of particle physics,
(standard deviation of position) × (standard deviation of velocity) ≥ (reduced Planck's constant / 2) / (Lestone-maximum-mass).
If, near the Planck scale, the concepts of time and space fail, then could uncertainty in the speed of light allow Lestone’s theory to be empirically valid?
H.H.J. Luediger wrote on Apr. 28, 2020 @ 15:27 GMT
Dear Klaas Landsman,
Gödel's and Bell's works were on processes (or algorithms) in 'time', which makes the question of (in)determinism the driving notion of your essay. Determinism, as we understand it today, deviates substantially from the Classical Greek interpretation and originates from the heyday of historiography in the Enlightenment. Hume, the historian, was an exponent of the temporalization of affairs and was among the first who misinterpreted the symbol t in Newton's equations as historical time, thus initiating a ghost-debate on (in)determinism.
What happened in the Enlightenment is the change of view from the a priori (law) to the a posteriori (model), with the latter giving rise to 'time'. It is not the case that a posteriori empiricism observes and models real processes in 'time'; rather, it provokes the psychological idea of time by its inherently probabilistic and hence undecidable models. So empiricism itself is the cause of 'time', with the consequence that whatever it 'observes' MUST be indeterministic, for the reason that the 'historical future' exists only as deviation from expectation. And since the a priori natural law is devoid of deviation from expectation, it is not in 'time'.
In short, a priori laws are deterministic by being literally timeless knowledge, while a posteriori models, even be they axiomatic, animate a timeless Parmenidean world and end up in complexity, undecidability and indeterminism. So, from my point of view, your essay appears to administer pseudo-problems.
Heinz
Author Klaas Landsman replied on Apr. 28, 2020 @ 16:11 GMT
Dear Heinz,
Thank you for adding this extremely interesting perspective. Is there any relevant literature I could read to understand this issue in more detail? Surely almost every physicist, including even Newton himself, has failed to distinguish between these two notions of time, if they are indeed distinct (for the physicists may have been right in identifying them after all). What, for example, was the original Classical Greek notion of determinism, other than "order" in the Platonic sense, which seems timeless to me?
Best wishes, Klaas
H.H.J. Luediger replied on Apr. 29, 2020 @ 10:30 GMT
Dear Klaas,
major parts of the physical literature, beginning with Newton, can be taken as a reference, if one is prepared to take off one's empirically-tinted glasses. In the scholium of the Principia Newton makes a pretty clear case for an idealistic view of physics:
"Wherefore relative quantities [time, place, space and motion] are not the quantities themselves, whose names they bear.....And the expressions [time, place, space and motion] will be unusual, and purely mathematical, if the measured quantities themselves are meant." So, Newton's space and time, for instance, are not meant to be isomorphic with the everyday use of these notions and he stresses that the notions of physics (=mathematics) must not be confounded with the notions of empirical quantities [vulgaribus mensuris], which I have radicalized to the idea that natural language and physical language can only be Absolutely non-contradictory if they are incommensurable, complementary or orthogonal.
Further, one still finds very empiricism-critical approaches to relativity and quantum mechanics in the early papers of Einstein (1905) and Heisenberg (1925), respectively. Both make very clear that they speak of KINEMATIC representations, not of PHYSICAL theories. The fusion of the 'two languages' and hence the Unanschaulichkeit of modern physics begins with the coup of the Göttingen school of mathematics...
Just in case you are interested in the basics of my thinking (and happen to speak German), google for "Wird die Wissenschaft zum Feind des Wissens?" or "Das Ding an sich", both posted on Philosophie.ch
Heinz
Author Klaas Landsman replied on Apr. 30, 2020 @ 08:11 GMT
Dear Heinz, Thanks, I found your essays and will read them as soon as I have time (we are in the middle of a teaching period here, which takes far more time than normal). As to the "Unanschaulichkeit of modern physics begins with the coup of the Göttingen school of mathematics": I wrote a historical essay about that which might interest you, see https://arxiv.org/abs/1911.06630. You surely know that Einstein reprimanded both Heisenberg and his earlier (1905) self for insisting on too much empiricism, see e.g. Heisenberg's autobiography "Der Teil und das Ganze". I would say this Unanschaulichkeit of modern physics already started with Newton's law of gravity, which he saw as a purely mathematical description and warned against giving any physical interpretation. Best wishes, Klaas
Chidi Idika wrote on Apr. 29, 2020 @ 19:21 GMT
Dear professor, Landsmann,
You write
“...it may be helpful to note that in classical coin tossing the role of the hidden state is also played by the initial conditions (cf.Diaconis & Skyrms, 2018 Chapter 1, Appendix 2). The 50-50 chances (allegedly) making the coin fair are obtained by averaging over the initial conditions, i.e., by sampling. By my arguments, this sampling cannot be deterministic, for otherwise the outcome sequences appropriate to a fair coin would not obtain: it must be done in a genuinely random way. This is impossible classically, so that
(unless they have a quantum-mechanical seed) fair classical coins do not exist...”
And, concluding you, state:
“I would personally expect that a valid theory of the Planck scale (including quantum gravity or string theory, though these words are misleading here), far from assuming the Born rule and the rest of quantum mechanics (as these theories normally do), would derive quantum mechanics as an emergent theory (instead, the opposite seems to be the majority goal, i.e. deriving gravity as an emergent phenomenon from quantum theory). Thus quantum mechanics would typically be a limiting case of something else, which would, then, also render the Born rule valid in some limit only, rather than absolutely.”
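The first quoted point, that deterministic sampling of initial conditions can reproduce the 50-50 relative frequencies while failing to produce genuinely random outcome sequences, can be sketched in a few lines. The flip function and the sampling grid below are illustrative assumptions, not the coin models of Diaconis & Skyrms or anything from the essay:

```python
import zlib

# Toy deterministic coin: the outcome is a fixed function of the
# "initial condition" k. Both the function and the sampling grid are
# illustrative assumptions, not a physical model of a tossed coin.
def flip(k):
    return k % 2

# Deterministically sampling the initial conditions gives exactly
# the 50-50 relative frequencies of a fair coin...
seq = [flip(k) for k in range(1000)]
freq_heads = sum(seq) / len(seq)
print("relative frequency of heads:", freq_heads)

# ...but the resulting outcome sequence 0101... is maximally
# compressible, so it lacks the incompressibility that almost every
# outcome sequence of a genuinely fair coin should have.
raw = bytes(seq)
print("raw:", len(raw), "bytes; compressed:", len(zlib.compress(raw, 9)), "bytes")
```

The frequencies are exactly right, yet the sequence is the opposite of 1-random, which is the gap the quoted passage points at.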
My humble self is interested to see that we can indeed reduce this whole classical/quantum divide to an intuitive picture, when we think of the self-referencing state (typically every mind/observer) as, in your words, its own “quantum mechanical seed”.
That is, we can approach quantum mechanics from the perspective that it is fundamentally about self-reference, such that the observer is by definition the norm/normal, e.g. the unit or constant refractive index by which we are at any instance describing our cosmology. Modelled as Gödel’s self-referencing state, every mind simply is thus the quantum of own observables (typically the probe energy or beat frequency or quantum vacuum) at the Landauer limit.
This will be just in the exact same way that Planck’s quantum (v = E/h) is that norm/normal by which in spectroscopy we are attempting to describe the applicable black-body radiation as a unique arrow of time (the black body spectrum).
In this sense every mind would be as the phase constant own Kolmogorov incompressibility (own energy/time or wave/corpuscular uncertainty threshold); just what the constant refractive index is to all observable dispersion relations of light i.e. to all self-referential splitting of light into constructive versus destructive interference; perhaps your binary string.
Hoping that you can take a look at how I grapple with this vision (being an editor, please don’t let poor wordings distract you).
Chidi Idika (forum topic: 3531)
Chidi Idika replied on Apr. 29, 2020 @ 23:28 GMT
Sorry about the typo in the name. Meant to write Landsman.
Author Klaas Landsman replied on Apr. 30, 2020 @ 08:14 GMT
Dear Chidi, This is hard to reply to, especially since we are in the middle of a teaching period here, which takes far more time than normal. Some comments on my paper ask direct questions about it, which I try to answer quickly; others forward their own theories, which are typically only peripherally related to my paper. These take far more time to answer, however good they may be. I hope you understand this. Best wishes, Klaas
Chidi Idika replied on Apr. 30, 2020 @ 15:09 GMT
You are right. Bear with me, sir.
And when you find the time please do take a peek; I'm sure you will find the connection interesting.
Steve Dufourny wrote on May. 1, 2020 @ 09:21 GMT
Hi well, like I said your essay is a good essay with a good ranking of all the different thinkers and a mix of their ideas about the computabilities, the randomness, the mathematics; we see a very good knowledge of all their works: Solomonoff, Kolmogorov, Chaitin, Svozil, Downey, Hirschfeldt, Born, Bell, Gödel, Hilbert, Dürr, Goldstein, Zanghì, 't Hooft, and others that I forget to name.
So indeed you have well learnt their works and you play with all this, but I see in all humility maybe a big problem: you consider these strings and correlated bits and a kind of pilot wave, so it is correlated with these 1D main strings at this Planck scale and the 1D main field forms this cosmology if I can say. You search the good road to correlate the universe, the quantum mechanics and the quantum computing in trying to consider the good randomness, finite and infinite series in a kind of universal converging partition, but for me if the main essence is made of particles instead of fields like in the strings, branes, M-theory, then you are not going to converge with this universal fundamental mechanics if I can say.
We cannot affirm that all is made of fields and philosophically the same, the strings consider a 1D main field at this planck scale and extradimensions but we cannot affirm that it is the truth to explain our geometries, topologies, matters and properties.
My model considers 3D spheres coded but I am not here to speak about this; we are not here to prove philosophically who is right about these fundamental mathematical and physical objects and the correlated philosophy of this universe, we cannot affirm simply. Like I told you, your essay shows that you have well studied all these works and that you perceive the generality, it is important. But if I can: if your strings generally are not correct, then never shall you explain the quantum computing. I recognise good mathematical tools with these strings to rank the fields, but Witten has created a prison and I believe that many confound his Fields Medal, which is a very good mathematical work, with his theory of strings. I doubt that this universe is only made of photons and that they oscillate due to strings to imply the physicality. Witten and Einstein, even if the GR and SR are correct, have created a prison for the thinkers. So the landscape is complex and we cannot affirm the main cause and the philosophy.
You can utilise all that you want for a general universal landscape; if the fundamental objects and the general philosophy are not correct, then just a part is correct but not the generality. And the non-commutativity is not the problem, but the fundamental tools, objects and the general philosophy, yes. I consider, like I told you, these 3D spheres in a kind of superfluid gravitational aether, space. And 3 main primordial coded series, one for the space and two fuels, the photons and the cold dark matter, and when they merge they create these topologies, geometries, matters and properties with fields.
The wave-particle duality is respected also because all is in contact when we consider specific finite primordial coded series of 3D spheres. I formalise all this with these mathematical tools: an intrinsic Ricci flow, the Hamilton Ricci flow, an asymmetric Ricci flow that I have invented to explain the unique things, the Lie derivatives, the Lie algebras, the Lie groups, the topological and Euclidean spaces, the Clifford algebras, the Poincaré conjecture, the deformations of spheres also. Who is right, me or the string theorists? We don't know and nobody can affirm, but when I see the nature, it seems evident that these spheres, spheroids, ellipsoids, ... are the choice of this universe; why, I don't know, but this geometry is different, it is the perfect equilibrium of forces and has no angle and can create all geometries.
The big problem is philosophical in fact. It is not easy, I know, to change a road of thoughts for the thinkers, but the doubt is the real torch of generalists; if they are persuaded about unknowns, then it is a problem of vanity for me. In any case these strings and these 3D spheres can converge because they oscillate also and are in motion, and in contact due to this universal superfluidity of the space.
All this to tell you that for a quantum computer we must absolutely utilise the good fundamental objects, if not we cannot reach it. The aim is not to play with all the mathematical tools invented by all our best thinkers in maths about this randomness, the infinities, the real infinity, the finite series, the waves, fields... The aim is to find the good universal partition with the fundamental mathematical and physical objects and their universal philosophy.
Regards
Steve Dufourny replied on May. 2, 2020 @ 09:40 GMT
The problem is complex and mainly philosophical. Why do the majority of thinkers now consider that all comes from the fields? It is due to the string theory in fact and Witten; they consider that we have only photons due to an infinite heat and that this thing oscillates these photons with strings at this Planck scale in 1D, connected with a 1D main cosmic field. And after, with the geometrical algebras, they create extra dimensions towards 11D, and they explain the emergent topologies, geometries and matters. I can respect these strings and recognise good mathematical plays to rank these fields, but the generality for me is totally false, because we have probably coded particles and 3 main finite series of particles coded, merging in a superfluid to create these emergent topologies, geometries, matters and properties of waves, fields and particles. I consider a pure 3D at all scales. In fact it is logic that these particles are the main essence; we cannot have fields without particles because without motions we have particles but no field, not the opposite. This string theory and general relativity have really created a prison, and it is difficult now to discuss with these string theorists; they cannot change their line of reasoning, they are persuaded. But these strings have a big philosophical problem: they don't explain the evolution...
Steve Dufourny replied on May. 28, 2020 @ 22:01 GMT
Lol, are you just good at studying the works of known thinkers and mixing their ideas, or are you also able to go deeper in philosophy and link the generality of this universe, and of course without vanity, because we know that this parameter decreases the complementarity and evolution of the theoretical sciences community; too many forget to change their line of reasoning most of the time due to the parameter cited. I repeat, but a few number are able to create new innovative theories linking the maths, physics and philosophy, so it is probably the reason why they fear to discuss, because they are probably not made to go deeper or to discuss about these things, so they don't develop these innovative theories :) spherically yours
Rajiv K Singh wrote on May. 1, 2020 @ 10:05 GMT
My comments are based on the text presented in the essay, without a consideration to any work cited, and my limited knowledge of the subject.
1. "For a fair coin flip the probability of each string σ ∈ 2^|σ| is 2^−|σ|, so that an apparently “random” string like σ = 0011010101110100 is as probable as a “deterministic” string like σ = 111111111111111. Therefore, whatever its definition, the “randomness” of a string cannot be defined via its probability."
I am not sure why the string 111111111111111 is referred to as deterministic. If we consider the strings generated by a fair coin flip, then this is as probable as 0011010101110100, as you stated; but if we consider 111111111111111 to be deterministic because its generator mechanism produces only 1s, then all sequences are deterministic, being the outcomes of their respective generator mechanisms, not results of fair coin flips. I presume the difference in string length is typographical (16 vs. 15 bits), but if that is intended, then I missed the point. Indeed the Kolmogorov complexity of the given string 111111111111111 may be very low, but the Kolmogorov complexity of the process of generation (coin flip) has to have infinite detail to generate a truly random sequence of any length. Moreover, the complexity of even 0011010101110100 can be as low as one bit if we compare against the pre-set base sequence 0011010101110100. That is, the complexity depends on the basis of generation or comparison.
Furthermore, if we leave the definition of the randomness of a process to interpretation, then by ignoring the generator mechanism any physical process can be interpreted to have a degree of randomness.
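The tension between probability and complexity discussed above can be made concrete with a small sketch. Compressed length serves here as a rough, computable stand-in for Kolmogorov complexity (which is uncomputable), and a seeded PRNG stands in for a fair coin; both are simplifying assumptions:

```python
import random
import zlib

# Under a fair coin every length-n binary string has the same
# probability 2^-n, regardless of how "random" it looks:
sigma = "0011010101110100"
p = 2.0 ** -len(sigma)  # 1/65536, the same as for "1" * 16

# Complexity is a different matter. Compressed length is a crude,
# computable proxy for Kolmogorov complexity; a seeded PRNG stands in
# for a fair coin (both are illustrative assumptions).
random.seed(0)
typical = bytes(random.getrandbits(1) for _ in range(4096))
constant = bytes(4096)  # the all-zeros "deterministic" sequence

print("typical :", len(zlib.compress(typical, 9)), "bytes")
print("constant:", len(zlib.compress(constant, 9)), "bytes")
```

The two long sequences are equally probable under the fair-coin measure, yet only the constant one compresses to almost nothing, which is the asymmetry the quoted passage turns on.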
But, if we consider the underlying reality of quantum system to be analog or continuous with infinite possible configuration and require transitions to be always in quanta depending on the gate with discrete allowances presented by the observing system, then the process will have genuine randomness as discussed in my essay -- Mother of All Existence.
2. "Theorem 3.2 With respect to P^∞ almost every outcome sequence x ∈ 2^N is 1-random."
In quite a few places, the 2^N notation uses the set of natural numbers in the exponent rather than a number N; I could not tell whether that was intended.
Rajiv
Steve Dufourny replied on May. 1, 2020 @ 10:17 GMT
Hi, why always strings as the fundamental objects? Like it was sure that they are the fundamental objects at this Planck scale with a 1D main field? If the universe is not like that, then we cannot reach the quantum computing with the strings, so the binaries 1 and 0, even with the digits of Pi, cannot converge,
regards
George Gantz wrote on May. 5, 2020 @ 18:17 GMT
Dr Landsman -
A brilliant, erudite essay. Unfortunately I do not have the expertise to follow the mathematical reasoning and I am puzzled about the epistemological significance of the findings. I was left "in the weeds" so to speak. But I do agree that physical determinism has been shown to be false - although that result is not provable.
I have made a more ambitious effort to bridge the gap between the quantum and mathematical challenges using a three-worlds framework. I'd appreciate your review if you have the time.
Cheers - George Gantz: The Door That Has No Key: https://fqxi.org/community/forum/topic/3494
Member Emily Christine Adlam wrote on May. 16, 2020 @ 14:24 GMT
I really enjoyed this essay. It seems to me that shifting the focus from individual experiments to sequences of outcomes is a very important insight and a step that quantum foundations certainly ought to take - after all, the experimental evidence for quantum mechanics comes from relative frequencies in sequences of measurements, not single-shot experiments, so it's likely that our interpretational difficulties come in part from trying to apply a theory defined over sequences to individual probabilistic events.
That said, I didn't entirely understand the motivation for imposing the requirement of 'typicality of almost every outcome sequence of an infinitely repeated fair quantum coin flip.' Given that all the outcome sequences we have actually examined are finite, and that in any case typicality of an observed sequence can't be proved, the experimental evidence cannot justify the claim that outcome sequences must be typical. This claim, as I understand it, comes from assuming that the standard quantum-mechanical representation (based on the Born rule and the combination of independent systems by taking tensor products of the underlying Hilbert spaces or operator algebras) is the correct mathematical representation to obtain the probabilities for outcome sequences even in the infinite limit. Would the proponents of the de Broglie-Bohm and 't Hooft models not simply deny this claim, arguing that there is no reason that their models need to be compatible with quantum mechanics in an untestable limit? Requiring compatibility with the predictions of quantum mechanics even for unphysical examples seems to be assigning too much weight to the mathematical structure which we happen to have chosen to represent the observed physical results.
Author Klaas Landsman replied on May. 22, 2020 @ 13:27 GMT
Hi Emily, Nice to hear from you and the point you make is excellent, probably reinforcing the conclusion I already drew on the basis of infinite sequences, namely that deterministic underpinnings of QM are wrong but cannot be proved so. What needs to be done is find a finite-size analogue of typicality, in the spirit of what I call the Earman and Butterfield principles. But whatever it is, it will be approximate, and this gives Bohmians and 't Hooft exactly the leeway you describe.
PS I see you also submitted an essay and I will read it asap.
All the best, Klaas
Irek Defee wrote on May. 18, 2020 @ 12:38 GMT
Dear Prof. Landsman,
Your essay is brilliant, very important for closing the loopholes and finally ending determinism in QM. The question now is where the fundamental indeterminism or randomness plus undecidability comes from. You state that a theory of the Planck scale would derive QM as emergent; then hopefully we may know. Such a theory, or an ultimate theory of everything, should also be able to tell the emerging 'why' and from 'what', and this 'what' should then be kind of irreducible or self-referential. Such topics are obviously rather mind-blowing. In my essay I am focusing on the 'why' and 'what', and it turns out it is uncomputable sequences which have a curious relation with nothingness. Material in my essay is rather compressed and concepts are not stated precisely, but I hope the essence can be grasped.
Best regards,
Irek
Author Klaas Landsman replied on May. 22, 2020 @ 13:29 GMT
Dear Irek, thank you, I will do my best to read your essay asap. Best wishes, Klaas
John R. Cox wrote on Jun. 28, 2020 @ 21:37 GMT
Dr. Landsman,
Not to detract in any way from the QM arguments you elucidate (much of it above my paygrade) but when it comes to the mathematical construct of Bell's Theory it seems to me that it assumes a disconnect from the physical in the first statement. A line segment of indeterminate scale with incrementals between -1 & +1 would not be physically capable of embedding correlations on three vectors of R3. And for R3 to become embedded in S3, it would have to evolve into R4 and undergo one more half pi rotation for the three right hand axes of R4 to coincide with the three RH (assuming RH initial designation) axes in S3 without one axis becoming a pseudovector.
Whatever the physical reality of the entanglement correlations might be, and which none can adequately describe beyond endless debate, technologically it matters naught whether deterministic theories or non-local theories are the more correct. What matters is that whoever gets a functional system up and running first will corner the global market of electronic transfer of funds. And with that the preferred national currency will quickly become the global reserve currency for the coming century. Simple political science. Technology is becoming more an essential commodity than oil.
Best wishes, your essay seems to me to be a tour de force of many painstaking years. jrc
Peter Warwick Morgan wrote on Jul. 29, 2020 @ 17:13 GMT
Fervent congratulations, Klaas. I have been a passionate follower of your work since I saw you give a talk in Oxford in the early 2000s. You may recall that my focus is on algebraic QFT: algebraic QM with, as we might say, arbitrary subdivisions allowed.
On page 8, you suggest a dilemma, "The sampling is [or is not] provided by the hidden variable theory".
I suggest that there is, at least, a trilemma, including "The sampling is provided by the design and construction of the experiment."
That is, we search for materials and devices that generate signals that are not constant and are not obviously periodic, neither of which would be fit for purpose, but instead that exhibit thermodynamic transitions that are in some sense random or very nearly so, between metastable and more stable states. Given that search succeeds, we design electronic circuitry and computer software that recognizes and records details of those thermodynamic transitions as events. Then we put such devices as we construct into different surroundings and examine the statistics of events and how well they match quantum or random field models for them. Such transitions, at megahertz rates, are not determined by consciousness, but they are determined by the possibility of agency: a scientist has to be able to choose to build one apparatus rather than another (whether such a choice as that is determined or not we might not be able to determine).
To put the argument above rather more classically, when we want something close to a random sequence, we do not toss a featureless sphere, we toss a coin that has identifiably different sides.
"An algebraic approach to Koopman classical mechanics", in Annals of Physics 2020, suggests this amongst other considerations.
In such a short article very little can be said, but I also wondered how you would tackle statistical decision and parameter estimation problems such as "when will I accept that this coin is biased and by how much will I change the parameter away from ½?" We do make such decisions, and of course there's a huge literature on how to make them, both for classical and quantum states, even though we know that more experience will almost surely make us change our minds.
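A minimal sketch of the standard Bayesian answer to that parameter-estimation question, assuming a conjugate Beta prior on the coin's bias; the uniform prior and the observed counts below are illustrative choices, not anything from the essay:

```python
# Conjugate Beta-Binomial updating for a coin with unknown bias theta.
# With prior Beta(a, b), observing some heads and tails gives the
# posterior Beta(a + heads, b + tails). The uniform prior Beta(1, 1)
# and the counts below are illustrative assumptions.
def update(a, b, heads, tails):
    return a + heads, b + tails

def posterior_mean(a, b):
    return a / (a + b)

a, b = update(1.0, 1.0, heads=62, tails=38)  # start from Beta(1, 1)
print("posterior mean bias:", posterior_mean(a, b))
```

On this account one never accepts a bias value outright: the posterior merely concentrates, and further data can always move it again, in line with the remark that more experience will almost surely make us change our minds.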
I'm always a little graceless in this kind of comment. Sorry! I hope you will forgive me, but in any case thank you again for such clarity.
Member Hector Zenil wrote on Aug. 15, 2020 @ 21:14 GMT
Dear Klaas,
Congratulations on your essay and prize! Could you please tell me if this is a fair characterisation of your proposal? I think what you do is to define QM and Bell's experiment outcomes as algorithmically random and then explain that this would be incompatible with determinism for infinite 1-random sequences. However, there is some tension between the definitions of finite vs. infinite randomness, and your characterisation would apply only to infinite randomness. Finite randomness is computable and hence reproducible by finite mechanistic (computable) means, e.g. with programs of the form Print[X]. Would your argument not apply in this case, and also in the case that QM randomness is defined, as in thermodynamics, in purely statistical (and hence computable) terms? I guess this is what you mean when you ask what kind of randomness QM randomness is: making a strong uncomputability (infinite algorithmic randomness) assumption leads to the contradiction between 1-randomness generation and hidden variable theories based on computable determinism.
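The Print[X] observation, that any finite string is the output of a trivial program and so has complexity at most its own length plus a constant, can be sketched directly; the generated program text is a Python stand-in for the abstract print program:

```python
import contextlib
import io

def print_program(x: str) -> str:
    """A trivial program whose output is exactly x, witnessing K(x) <= |x| + c."""
    return f"print({x!r}, end='')"

x = "0011010101110100"
prog = print_program(x)

# Run the generated program and capture its output.
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    exec(prog)

print("output matches:", buf.getvalue() == x)
print("overhead c =", len(prog) - len(x), "characters, independent of x")
```

No such uniform shortcut exists for an infinite sequence, which is why 1-randomness only has force in the infinite limit.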
Thanks!
Dale Carl Gillman wrote on Sep. 16, 2020 @ 07:47 GMT
Hi Dr. Landsman,
I sent you an email a few weeks ago. I'd just like to make sure that you received it.
Thank you,
Dale C. Gillman
Sue Lingo wrote on Oct. 13, 2020 @ 05:15 GMT
Congratulations Klaas Landsman...
In that "Quantum Mechanics Needs a New Theory" ~ Roger Penrose, 2020 Nobel Physics Prize recipient... and "emergent" implies evolving, I thank you for your endorsement of an "Emergent Quantum Theory", and for exposing semantic issues that must be resolved to facilitate that emergence... e.g. indiscriminate usage of THEORY and MODEL, RANDOM and INDETERMINATE, etc.
A perturbative analysis, mathematical or observational, is not conducive to fundamental assessments... i.e. in that generating an infinite series to verify that the outcome of a flip of a coin has a 50-50 chance, "(allegedly) making the coin fair", is impossible, it is not a valid means to assess the underlying/fundamental nature of quantum mechanisms as being deterministic, indeterministic, or other... but coin flipping is applicable to an investigation of functions enabled by a QUANTUM ENERGY EMISSION MODEL in which spatially addressable minimum/indivisible quanta of Energy (QE), as substance, are spontaneously, harmoniously distributed within a QUANTUM SPATIAL GEOMETRY MODEL... i.e. a digital CAD environment quantized by addressable minimum/indivisible quanta of Space (QI)... on each pulse... i.e. digitally SIMulated quantum emission mechanisms.
A QUANTUM MODEL of Space-Time as an emergence from a single pulsed sourced QE emission, requires resolve of a geometry singularity and its associated field coordinate system... i.e. logic map... which implies intelligence... i.e. directives/determination... but it facilitates a fundamental principle in which spontaneous, harmonious distribution of Energy (QE) in Space-Time, is neither INDETERMINANT nor RANDOM, nor is it determinant... i.e. it must be SOLVED for the entire system, on every Q-Tick, by internal, scalable, system wide, intelligent network monitoring circuits... e.g. humans.
A QUANTUM MODEL in which system intelligence is geometrically demonstrable, warrants an investigation as to whether internal monitoring circuits can address the system-wide networked intelligence with a binary query... i.e. Yes/No... and influence the outcome of a coin flip.
I can repeatedly perform an experiment which demonstrates that a finite string of human, singular, binary queries... i.e. Yes/No... can influence the outcome of a flip of a coin, and resolve a coherent logic series... REF:
- Topic: "Modeling Universal Intelligence". Are mental functions... e.g. attention... "subliminal"?
In that attention does not occupy space as QE substance... i.e. is not a physical entity in the GEOMETRY MODEL... it is "subliminal", but as a system internal logic circuit query mechanism, it can be verified by the amount of chaos that occurs if it is not functioning... i.e. by application of a GEOMETRY MODEL-specific codec between the Space-Time logic frame and Spaceless-Timeless logic frame, attention facilitates a link, often unconscious, to networked intelligence.
Might I suggest that prior to a "New THEORY", "quantum mechanics", as fundamental QE emission and distribution mechanisms, would benefit from a GEOMETRY MODEL that facilitates an unbroken kinematic logic chain from observation to the dimensionless single point... i.e. LOGIC SINGULARITY between the Space-Time logic frame and Spaceless-Timeless logic frame... encapsulated by the 3D GEOMETRY SINGULARITY from which the GEOMETRY MODEL emerges, as the Space-Time logic map in which to spontaneously, harmoniously resolve pulse sourced QE emission and subsequent distribution... i.e. a QUANTUM MODEL that is not perturbative, inherently supports a cosmic intelligence.
"All matter originates and exist only by virtue of a force... and we must assume behind this force the existence of a conscious and intelligent mind. The (that) mind is the matrix of all matter." ~ Max Planck, Quanta Author and Physicist
S. Lingo
UQS Author/Logician
www.uqsmatrixmechanix.com