Explaining Bell violations from a statistical / stochastic quantum interpretation
1. The wave function describes a stochastic process.
Some view the wave function as representing an actual physical state that exists at a given time and changes during the course of an experiment. Viewing it this way: when unobserved, individual particles do not have definite properties and can be seen as occupying multiple different states simultaneously in superposition. Particles take on definite properties only when we measure them; however, which particle state we eventually observe when measured occurs non-deterministically. We can only know the probability that a given state will occur, which can be derived directly from the wave function.
The statistical interpretation removes the assumption that the wave function represents some unobserved physical state. Instead, it focuses on the fact that measurement outcomes for a given point in time during an experiment can be predicted using the probabilities derived directly from the wave function. Hypothetically, if we were to perform repeated experimental runs for that same particular experimental context, the long run relative frequencies of the outcomes would be what is described by these derived probabilities. By not representing a physical state per se, this makes the wave function solely a predictive model for possible measurement outcomes. Its evolution is not describing changes in some actual physical state but changes in the probabilities of finding a particle in various possible states as the experimental context changes with time. Specifically then, what the wavefunction is describing is the behavior of a stochastic (random) process: collections or sequences of random variables that represent the state of a particle over different points in time. While not representing the enigmatic quantum state as more commonly envisioned, it should be seen as no different from how stochastic processes are widely used to describe the behavior of many other real physical systems in the natural sciences.
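This frequency reading is easy to sketch numerically. Below is a minimal illustration (the qubit state and sample size are arbitrary choices of mine, not from any source): each run yields one definite outcome drawn from the Born probabilities, and the long-run relative frequencies converge on those probabilities.

```python
import numpy as np

# A qubit state |psi> = a|0> + b|1>; Born probabilities are |a|^2 and |b|^2.
psi = np.array([1.0, 1.0j]) / np.sqrt(2)   # arbitrary example state
born = np.abs(psi) ** 2                     # predicted long-run frequencies

# Simulate many identically prepared runs; each run gives a definite outcome.
rng = np.random.default_rng(0)
outcomes = rng.choice([0, 1], size=100_000, p=born)
freq = np.bincount(outcomes, minlength=2) / outcomes.size

print(born)   # [0.5 0.5]
print(freq)   # observed relative frequencies, close to the Born values
```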
A motivation for deflating the wave function's status as a representation of some specific physical state is that we can restore some intuitive aspects of "realism" with regard to particles: during an experiment, particles do have definite properties at any given point in time; because of the random nature of their behavior, we just do not know which states they are in unless we measure them. The famous wave-particle duality then becomes completely deflated, given that quantum mechanics just becomes about particles with definite states which behave according to probability distributions given by the wave function.
In orthodox interpretations, treating the wave function as a physical state is accompanied by the introduction of the collapse postulate in order to reconcile, and provide a mechanism that transitions between, the indefinite properties of the unobserved quantum state and the definite properties we observe with measurement. The statistical approach obviously has no requirement for this: on the one hand, particles have definite properties both when unobserved and measured, intuitively conforming to our sense of "realism"; on the other hand, the wave function does not represent a single given particle anyway, just probabilities regarding particle behavior under particular conditions. Without physical collapse, we don't need a special role played by observation or measurement that needs further explanation concerning when and why collapse occurs - i.e. the measurement problem. Nor do we have to deal with various ambiguities and difficulties regarding relativistic causation, which are especially salient when applying collapse to entangled pairs of particles. The emergence of the classical world is also in principle less complicated given that particles always have definite properties; it can be noted that limits where quantum mechanics approximates classical behavior can be derived without any reference to collapse.
2. Bell violations are direct consequences of non-commutativity.
An obvious question for any interpretation is: how does it approach the infamous Bell violations? Works by a number of people have indicated that the violation of Bell inequalities by a set of observables is equivalent to the absence of a joint probability distribution that encompasses all of those observables. One notable example of this work is Fine's theorem; from this, it can be inferred that Bell violations are consequences of non-commutativity, which will be defined shortly.
--- --- ---
(Some references for Fine's theorem)
[i]Hidden variables, joint probability, and the Bell inequalities
https://scholar.google.co.uk/scholar?cluster=2543155278787880428&hl=en&as_sdt=0,5&as_vis=1
Joint distributions, quantum correlations, and commuting observables
https://scholar.google.co.uk/scholar?cluster=8813695518940155915&hl=en&as_sdt=0,5&as_vis=1 [/i]
--- --- ---
Also in a similar vein:
--- --- ---
[i]George Boole's 'conditions of possible experience' and the quantum puzzle
https://scholar.google.co.uk/scholar?cluster=16366977415123888164&hl=en&as_sdt=0,5&as_vis=1
Possible Experience: From Boole to Bell
https://scholar.google.co.uk/scholar?cluster=301320604491795906&hl=en&as_sdt=0,5&as_vis=1[/i]
--- --- ---
Fine's theorem specifically states that the following are equivalent:
[i]"(3) There is one [global] joint distribution for all observables of the experiment, returning the experimental probabilities.
(4) There are well-defined, compatible joint distributions for all pairs and triples of commuting and non-commuting observables.
(5) The Bell inequalities hold."[/i]
The Kochen-Specker theorem implies that (4) above is impossible in quantum mechanics due to its Hilbert space structure. More specifically, (4) fails simply because quantum mechanics always contains non-commuting pairs of observables, and these cannot have valid pairwise joint probability distributions under the usual assumptions.
Pairs of observables without joint probability distributions can be said to be incompatible. When we try to define joint probability distributions for these incompatible pairs of observables, their distributions violate the rules of probability, notably the law of total probability which equates marginal probabilities to sums of joint probabilities:
https://en.m.wikipedia.org/wiki/Law_of_total_probability
This prevents the resultant probabilities from describing a conventionally valid probability distribution (though I suppose this doesn't necessarily stop someone using unconventional rules of probability). In contrast, compatible pairs of observables do have valid joint probability distributions.
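A toy numerical illustration of this kind of LTP violation (the two-path amplitudes below are arbitrary stand-ins of mine, loosely in the spirit of a double slit, not a model taken from any reference): summing the two single-path probabilities misses the interference cross term produced at the amplitude level.

```python
import numpy as np

# Toy two-path ("double-slit") amplitudes at screen positions x.
x = np.linspace(-5, 5, 1001)
psi1 = np.exp(1j * 2.0 * x) * np.exp(-x**2 / 4)   # amplitude via path 1
psi2 = np.exp(-1j * 2.0 * x) * np.exp(-x**2 / 4)  # amplitude via path 2

p_both = np.abs(psi1 + psi2) ** 2                  # both paths open
p_classical = np.abs(psi1)**2 + np.abs(psi2)**2    # LTP-style sum of the
                                                   # two one-path contexts
interference = p_both - p_classical                # 2*Re(conj(psi1)*psi2)

print(np.max(np.abs(interference)))   # nonzero: the probability-level sum fails
```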
Non-commutativity just means that the order in which a pair of quantum measurement operators is applied affects the measurement results for that pair: essentially, measurements of non-commuting observables disturb each other. By contrast, commuting pairs of observables do not disturb each other: the outcome for one observable in the pair is not affected by measurement of the other, so the measurement order has no effect.
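This is easy to verify directly for the spin observables that appear in Bell experiments; a minimal sketch using the Pauli matrices:

```python
import numpy as np

# Pauli matrices: spin observables along x and z for a spin-1/2 particle.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

commutator = sx @ sz - sz @ sx   # nonzero => the order of application matters
print(commutator)                # equals -2i * sigma_y

# A commuting pair: observables belonging to two different particles.
AB = np.kron(sz, np.eye(2)) @ np.kron(np.eye(2), sx)
BA = np.kron(np.eye(2), sx) @ np.kron(sz, np.eye(2))
print(np.allclose(AB, BA))       # True: order has no effect here
```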
Since (4) doesn't hold, we see that (5) doesn't either. The fact that a quantum system generates profound correlations to the extent of Bell violation is equivalent to the fact that it does not have a global joint probability distribution (also by violating the law of total probability), which is caused by the presence of incompatible pairs of variables that do not commute. Without a global joint probability description, the statistics of Bell experiments can only be described using the multiple separate joint probability spaces that describe each of the compatible pairs.
The non-commuting pairs are therefore the root of Bell violations, something that a number of recent authors have emphasized, including proofs, in the case of the CHSH inequality, that non-commutativity is both necessary and sufficient for Bell violation.
--- --- ---
[i]Get rid of nonlocality from quantum physics
https://scholar.google.co.uk/scholar?cluster=11575548674791370584&hl=en&as_sdt=0,5&as_vis=1
Making sense of Bell's theorem and quantum nonlocality:
https://scholar.google.co.uk/scholar?cluster=6010274925746687086&hl=en&as_sdt=0,5&as_vis=1
Nonlocality claims are inconsistent with Hilbert-space quantum mechanics:
https://scholar.google.co.uk/scholar?cluster=18053424246858448382&hl=en&as_sdt=0,5&as_vis=1
In praise of quantum uncertainty:
https://scholar.google.co.uk/scholar?cluster=4615841789462490335&hl=en&as_sdt=0,5&as_vis=1
Experimental Counterexample to Bell's Locality Criterion:
https://scholar.google.co.uk/scholar?cluster=4769324801739580243&hl=en&as_sdt=0,5&as_vis=1[/i]
--- --- ---
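As a concrete check of the CHSH case, the maximal quantum value can be computed directly from the Hilbert-space formalism. This is a minimal sketch, assuming the standard singlet state and the textbook measurement angles (my choices, not taken from the papers above):

```python
import numpy as np

# Pauli matrices, and the spin observable along angle a in the x-z plane.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
def spin(a):
    return np.cos(a) * sz + np.sin(a) * sx

# Singlet state (|01> - |10>)/sqrt(2).
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    """Correlation <psi|A(a) x B(b)|psi> for the singlet; equals -cos(a-b)."""
    op = np.kron(spin(a), spin(b))
    return np.real(singlet.conj() @ op @ singlet)

# Standard CHSH settings that give the maximal quantum value 2*sqrt(2).
a, a2, b, b2 = 0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))   # 2.828... > 2: the CHSH bound is violated
```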
It is worth noting briefly that in any Bell experiment, the non-commuting pairs of observables that cause the Bell correlations are always locally related, in the sense of being properties of the same particle. We therefore cannot explain Bell correlations as a direct result of disturbances between non-commuting observables acting across spatially separated locations. By contrast, the spatially separated pairs of observables in these experiments are always pairwise compatible, and this is typically interpreted as suggesting that spatially separated observables cannot disturb each other in ways that violate speed-of-light limits (non-signalling).
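The non-signalling property just mentioned can also be checked directly; a minimal sketch, again assuming the singlet state, showing that Alice's marginal statistics do not depend on Bob's measurement setting (the angles are arbitrary):

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
def spin(a):
    return np.cos(a) * sz + np.sin(a) * sx

singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
rho = np.outer(singlet, singlet.conj())   # density matrix of the singlet

def p_alice_up(a, b):
    """P(Alice gets +1 along a) when Bob measures along b (summed over Bob)."""
    Pa = (np.eye(2) + spin(a)) / 2      # projector onto Alice's +1 outcome
    Pb_up = (np.eye(2) + spin(b)) / 2   # Bob's two outcome projectors
    Pb_dn = (np.eye(2) - spin(b)) / 2
    joint = lambda Pb: np.real(np.trace(np.kron(Pa, Pb) @ rho))
    return joint(Pb_up) + joint(Pb_dn)

# Alice's marginal is 1/2 regardless of Bob's setting: no signalling.
print(p_alice_up(0.3, 0.0), p_alice_up(0.3, 1.2))
```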
It is also worth noting that the link between non-commutativity and Bell-type violations seems to occur in classical systems too, most notably in classical polarization optics. Bell-type violations have also been derived in the context of Brownian motion as a consequence of Heisenberg-like uncertainty relations. An important distinction from the quantum case is that none of the classical examples involve nonlocal correlations between spatially separated variables: they are local, intrasystem violations. While these systems are clearly not quantum, they add further evidence that Bell-type violations are a formal consequence of non-commutativity, whatever the system.
--- --- ---
[i]Quantum Mechanics and Classical Optics: New Ways to Combine Classical and Quantum Methods:
https://scholar.google.co.uk/scholar?cluster=9708108079894379453&hl=en&as_sdt=0,5&as_vis=1
Entanglement in Classical Optics:
https://scholar.google.co.uk/scholar?cluster=9687012103438601910&hl=en&as_sdt=0,5&as_vis=1
Brownian Entanglement:
https://scholar.google.co.uk/scholar?cluster=11916823875626356065&hl=en&as_sdt=0,5&as_vis=1[/i]
--- --- ---
3. Heisenberg's uncertainty principle and non-commutativity are generic properties of stochastic processes.
First, it can be noted that, generally, the canonical non-commuting variables in quantum mechanics are those corresponding to generalized coordinates (e.g. position) and generalized momenta in Hamiltonian mechanics. The major exception is the mutually non-commuting angular momentum operators along the x, y and z axes (which can then be generalized to the spin observables seen in Bell experiments). Though deriving their commutation relations requires the canonical commutation relations for position and momentum, this non-commutativity is essentially inherited from a generic non-commutativity affecting all descriptions of 3D rotation (the SU(2) / SO(3) group structure). It is also the root of Bell-type violations in classical polarization optics. Even with everyday objects, you can see that performing the same successive rotations about different axes from the same starting position will give different ending positions depending on the order: rotations about different axes just don't commute.
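The everyday version of this can be checked with ordinary 3D rotation matrices; a minimal sketch (the angle and test vector are arbitrary choices):

```python
import numpy as np

def Rx(t):  # rotation about the x axis by angle t
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def Rz(t):  # rotation about the z axis by angle t
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

t = np.pi / 2
v = np.array([0.0, 1.0, 0.0])
print(Rx(t) @ Rz(t) @ v)   # rotate about z first, then about x
print(Rz(t) @ Rx(t) @ v)   # the other order: a different end position
```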
From non-commutativity, Heisenberg's uncertainty relations can be derived. These relations dictate that the variance, or uncertainty, of measurement for one observable of a non-commuting pair is inversely related to the uncertainty for the other observable in that pair. In the statistical interpretation, we can interpret this purely in terms of probability distributions realized as long-run relative frequencies of outcomes. For instance: if, over many repeated iterations of some experimental context, the measured positions of a particle tend to bunch up in one location, then repeated measurements of momentum will give values dispersed in all directions.
There is a very simple example of this which can be described in classical optics. If we send a beam of light through a slit, the width of the slit (denoting position) is inversely related to the size of the angle or spread of directions (denoting momentum) by which the light exits the other side of the slit. In the (statistical) quantum description, this pattern is predicted when repeatedly sending single photons one by one through the slit. Importantly, it can be gleaned from the example that the Heisenberg uncertainty relation is not about measurement itself, but the experimental context which constrains the behavior of the particles. If the experimental context necessitates a particular spread of position measurements (e.g. because of the width of the slit), then this constrains (i.e. disturbs a la non-commutativity) the spread of momentum measurements, and vice versa.
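This inverse relation between position spread and momentum spread can be sketched numerically. The code below uses a Gaussian wavepacket rather than a hard-edged slit (a sharp slit has a divergent momentum variance), sets hbar = 1, and picks the grid parameters arbitrarily; the momentum distribution is obtained from the position amplitude by Fourier transform.

```python
import numpy as np

def sigma_p(sigma_x, N=2**14, L=200.0):
    """Momentum spread of a Gaussian wavepacket with position spread sigma_x."""
    x = np.linspace(-L / 2, L / 2, N, endpoint=False)
    psi = np.exp(-x**2 / (4 * sigma_x**2))        # position amplitude
    psi /= np.sqrt(np.sum(np.abs(psi) ** 2))
    phi = np.fft.fftshift(np.fft.fft(psi))        # momentum amplitude
    p = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=L / N))  # hbar = 1
    prob = np.abs(phi) ** 2 / np.sum(np.abs(phi) ** 2)
    return np.sqrt(np.sum(prob * p**2))           # rms momentum spread

for sx in (0.5, 1.0, 2.0):
    print(sx, sigma_p(sx), sx * sigma_p(sx))      # the product stays ~ 0.5
```

Narrowing the position distribution (smaller sigma_x) visibly broadens the momentum distribution, and the product sits at the Heisenberg minimum of 1/2, as expected for Gaussians.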
We can also see in an intuitive sense how this might lead to pairwise incompatibility, precluding a valid joint probability distribution. Just as the statistics behind the missing global joint probability distribution referred to earlier can only be represented in multiple separate probability spaces or contexts, joint probability distributions for position and momentum may in principle only be validly represented within distinct experimental contexts (e.g. different widths of slit), each ascribing mutually exclusive sets of variances or spreads to the different observables. Consider an experimental set-up constituted of two variables A and B, each with marginal probabilities that will be realized in the experimental outcomes. There will be incompatibility when the law of total probability (LTP) is violated, so that p(A) is not the same as [sum over B of p(A, B)] when A is considered jointly with B, perhaps under some specific measurement setting. This means B is disturbing p(A). The sum over B of p(A, B) is still a marginal probability, but the question is: for what? It will be a marginal probability for A in some specific context involving B, which must somehow differ from our original p(A), where no contexts had been explicitly differentiated in the experimental set-up. For position and momentum, we might see the disturbance as linked to the Heisenberg uncertainty relation: since the variance of one observable is inversely related to that of the other, their marginal probability distributions are necessarily constrained and altered by each other. The marginal distribution of one member of a non-commuting pair will depend on the specific distribution of the other, disturbing the notion of any context-independent marginal probabilities.
The contextual joint probabilities are then always dependent on how particular experimental contexts constrain position and momentum, so there is no possible joint probability that we can construct just using p(A) and p(B) taken at face value from the experimental set-up. Any set-up which subsumes, or fails to differentiate, multiple different contexts for non-commuting observables will have marginal probabilities which fail to produce a single valid joint probability distribution generalizing across all of those contexts simultaneously: only joint probabilities in the individual contexts induced by the disturbances.
Andrei Khrennikov has a whole series of informative papers on this, discussing the link between the law of total probability, interference and experimental contexts. The following is just one example:
https://scholar.google.co.uk/scholar?cluster=4642651957428255714&hl=en&as_sdt=0,5&as_vis=1
We can now look at specifically why position and momentum do not commute and have particular uncertainty relations. When looked at in terms of the path integral formulation, the non-commutativity in quantum mechanics can be derived directly from the fact that the particle trajectories are non-differentiable. This can be seen as a direct result of the erratic, zig-zagging nature of the paths, embodying the inherently random, probabilistic nature of measurement outcomes in quantum mechanics.
https://en.m.wikipedia.org/wiki/Path_integral_formulation (Section: path integral in quantum mechanics; canonical commutation relations)
This property seems to be directly inherited from the Wiener process / integral that the path integral formulation is related to by Wick rotation.
[i]https://en.m.wikipedia.org/wiki/Wick_rotation
Note: Where is the Commutation Relation Hiding in the Path Integral Formulation?
https://scholar.google.co.uk/scholar?cluster=11872738296616463941&hl=en&as_sdt=0,5&as_vis=1[/i]
The Wiener process is a very broadly applicable stochastic process, perhaps best known in physics as a model for Brownian motion, which describes the random behavior of a particle suspended in a medium (i.e. a liquid or gas). The evolving probability density function for the Brownian motion of a particle can then be described by a diffusion equation. Interestingly, the Schrodinger equation is also related by Wick rotation to a diffusion equation, so that it can effectively be seen as a diffusion equation with a complex constant, or as describing a diffusion process in imaginary time.
Paths realized by Wiener processes are characterized by the same kind of erratic zig-zagging, rendering them non-differentiable. This non-differentiability is well known, and led to the construction of tools such as stochastic calculus, designed specifically to deal with it. Consequently, we can derive non-commutativity properties for classical Wiener processes in much the same way as in the quantum case:
[i]Itô's stochastic calculus and Heisenberg commutation relations:
https://www.sciencedirect.com/science/article/pii/S0304414910000256[/i]
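The non-differentiability driving this can be seen numerically via the quadratic variation of a simulated Wiener path: the sum of squared increments converges to the elapsed time T rather than to zero (as it would for a smooth path), while the difference quotients blow up as the step size shrinks. A rough sketch with arbitrary discretization:

```python
import numpy as np

rng = np.random.default_rng(1)
T, N = 1.0, 1_000_000
dt = T / N
dW = rng.normal(0.0, np.sqrt(dt), N)   # Wiener increments ~ N(0, dt)

# Quadratic variation: sum of squared increments converges to T, not 0;
# a smooth (differentiable) path would give 0 in the limit dt -> 0.
qv = np.sum(dW**2)
print(qv)                              # close to 1.0 = T

# Difference quotients |dW/dt| scale as 1/sqrt(dt), diverging as dt -> 0.
slope = np.mean(np.abs(dW / dt))
print(slope)                           # grows without bound as N increases
```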
Given that the Heisenberg uncertainty relations can be derived from non-commutativity, it is no surprise that analogous uncertainty relations are well documented in diffusion processes. Not only can these be derived from the same kind of non-differentiability, but this seems to apply generically to a broad range of stochastic systems. Examples subsumed under this include hydrodynamics and well-known uncertainty relations in statistical thermodynamics:
--- --- ---
[i]Generalization of uncertainty relation for quantum and stochastic systems:
https://www.sciencedirect.com/science/article/abs/pii/S0375960118303633
Classical uncertainty relations and entropy production in non-equilibrium statistical mechanics:
https://scholar.google.co.uk/scholar?cluster=6026419417934600498&hl=en&as_sdt=0,5&as_vis=1
Non-quantum uncertainty relations of stochastic dynamics:
https://scholar.google.co.uk/scholar?cluster=12722751213412558053&hl=en&as_sdt=0,5&as_vis=1[/i]
--- --- ---
Given that quantum mechanics then appears to be a special case of the broader class of stochastic systems in which non-commutativity and uncertainty relations occur, this hints that these features are just a direct result of its random, probabilistic nature.
There have also been some attempts to derive uncertainty relations directly from the probability density functions of stochastic processes in a comparatively generic manner. Here, probability densities (analogous to generalized coordinates/position) are contrasted with the probability gradient or flow/dynamics (analogous to generalized momentum) in the context of systems characterized by random behavior. These can also be described in terms of the Fourier transform in the same way as can be done for quantum position-momentum uncertainty relations:
--- --- ---
[i]Indeterminacy relations in random dynamics:
https://scholar.google.co.uk/scholar?cluster=6176854777283805481&hl=en&as_sdt=0,5&as_vis=1
Information dynamics: temporal behavior of uncertainty measures:
https://scholar.google.co.uk/scholar?cluster=6481107725040031189&hl=en&as_sdt=0,5&as_vis=1
Parcels and particles: Markov blankets in the brain:
https://scholar.google.co.uk/scholar?cluster=13034249073504028456&hl=en&as_sdt=0,5&as_vis=1
A free energy principle for a particular physics:
https://scholar.google.co.uk/scholar?cluster=10954599080507512058&hl=en&as_sdt=0,5&as_vis=1[/i]
--- --- ---
The author of the bottom paper has given an interesting possible intuition for thinking about why these types of uncertainty relations might exist generically:
"Intuitively, if the probability mass of a particular state is concentrated around one point in phase-space, then the flow must [be vigorously rebuilding gradients in all directions to counter the dispersive effects of random fluctuations]. This means that the flow is as dispersed as the fluctuations. Conversely, if flow is limited to a small range, random fluctuations would disperse particular states over state-space. In short, the dispersion of states and their flow must complement each other at a nonequilibrium steady-state."
As an example: if you want to stop a droplet of some liquid dissolving in a glass of water, you will want forces to act on the liquid particles from all directions to keep them within a small vicinity. If force is applied solely from one direction without counteracting forces from all of the others, the liquid droplet will still be able to disperse everywhere else except from where the force came from. It would be impossible to keep the liquid particles all in one place if the forces acting on them are all acting in / from a single precise direction. Conversely, the liquid particles could never disperse across a broad range of positions if there were forces acting from every possible direction gathering them up.
The author also expresses this similarly in the more specific context of thermodynamics which also has well known thermodynamic uncertainty relations:
"Intuitively speaking, random fluctuations always increase the entropy through dispersing the ensemble density, while flow decreases entropy by rebuilding probability gradients (i.e., where probability currents flow towards regimes of greater density). In other words, random fluctuations disperse states into regimes of high surprisal (and implicitly thermodynamic potential), while gradient flows due to forces counter the implicit dispersion."
4. Do we need to give up realism, locality or free choice?
As has been said, the wave function does no more than describe long-run relative frequencies, which manifest as compatible, non-signalling joint probability distributions between the spatially separated measurements of a Bell experiment. The wave function does not describe individual particles whose superpositions physically collapse either. Combined with the fact that Bell violations are formally rooted in locally non-commuting variables, which are necessitated by stochastic systems, it is tempting to think that Bell nonlocality is more or less just a strange statistical artifact signifying the absence of global joint probability distributions (or of Boole's conditions of possible experience). This may explain why nonlocality seems to co-exist happily with non-signalling even though they prima facie contradict each other. Therefore, even though there are definitely nonlocal correlations in quantum mechanics, we seem to be able to explain them away locally, without needing to refer to spooky nonlocal forces that act between individual particles and violate speed-of-light limits. Given that the non-commutativity of locally related observables can be naturally explained as a consequence of stochasticity, there also seems to be no need to appeal to distant causes in the past influencing measurement settings a la superdeterminism, i.e. giving up free choice.
It seems that giving up realism may be what is preferable: giving up the notion that particle states have pre-existing values. Under a statistical interpretation, we can retain definite properties of particles at every point in time during any run of an experiment without contradicting ideas of contextuality, incompatibility or the notion of irrealism. This is because the wave function and superposition is about probability distributions describing long run frequencies, not individual particles. The idea that particles do not have definite pre-existing states prior to measurement is then simply replaced by, or rather, shifted toward the notion that non-commuting variables do not have meaningful joint probability distributions for their long run frequencies that are independent of the particular experimental context. Therefore, even though realism is given up in a way compatible with the requirements of Bell's theorem, particles can still retain their definite properties as individuals in a realist way.
5. Underlying causes of indeterminacy?
The main benefits of a statistical or stochastic interpretation are that there is no measurement problem and that some realism is restored, in the sense that particles have definite properties. It is also arguably the most straightforward way of interpreting the math of quantum mechanics without injecting any additional ontology into the formulas; after all, the Schrodinger equation is just a complex-number diffusion equation, and diffusion equations describe the behavior of stochastic processes. At the same time, this leaves a lot to be desired in terms of explanation: it might be asked why exactly particles behave non-deterministically, perhaps hinting at some underlying cause yet to be discovered. A deterministic explanation would not necessarily be ruled out, in the same way that stochastic models of Brownian motion are in principle explainable in a classical, deterministic way through collisions between a Brownian macro-particle and the micro-particles constituting the medium it is suspended in. Given how Brownian motion arises from particles suspended in some background medium, we might ask whether there is something analogous in quantum mechanics that could provide the kind of underlying explanation we might want. Potential hypotheses could relate to quantum foam or random fluctuations in the underlying spacetime and quantum vacuum fields.
Some view the wave function as representing an actual physical state that exists at a given time and changes during the course of an experiment. Viewing it this way: when unobserved, individual particles do not have definite properties and can be seen as occupying multiple different states simultaneously in superposition. Particles take on definite properties only when we measure them; however, which particle state we eventually observe when measured occurs non-deterministically. We can only know the probability that a given state will occur, which can be derived directly from the wave function.
The statistical interpretation removes the assumption that the wave function represents some unobserved physical state. Instead, it focuses on the fact that measurement outcomes for a given point in time during an experiment can be predicted using the probabilities derived directly from the wave function. Hypothetically, if we were to perform repeated experimental runs for that same particular experimental context, the long run relative frequencies of the outcomes would be what is described by these derived probabilities. By not representing a physical state per se, this makes the wave function solely a predictive model for possible measurement outcomes. Its evolution is not describing changes in some actual physical state but changes in the probabilities of finding a particle in various possible states as the experimental context changes with time. Specifically then, what the wavefunction is describing is the behavior of a stochastic (random) process: collections or sequences of random variables that represent the state of a particle over different points in time. While not representing the enigmatic quantum state as more commonly envisioned, it should be seen as no different from how stochastic processes are widely used to describe the behavior of many other real physical systems in the natural sciences.
A motivation for deflating the wave function's status as a representation of some specific physical state is that we can restore some intuitive aspects of "realism" with regard to particles: during an experiment, particles do have definite properties at a given point in time, we just do not know which states they are in unless we measure them because of the random nature of their behavior. The famous wave-particle duality then becomes completely deflated given that quantum mechanics just becomes about particles with definite states which behave according to probability distributions given by the wave function.
In orthodox interpretations, treating the wave function as a physical state is accompanied by the introduction of the collapse postulate in order to reconcile, and provide a mechanism that transitions between, the indefinite properties of the unobserved quantum state and the definite properties we observe with measurement. The statistical approach obviously has no requirement for this: on the one hand, particles have definite properties both when unobserved and measured, intuitively conforming to our sense of "realism"; on the other hand, the wave function does not represent a single given particle anyway, just probabilities regarding particle behavior under particular conditions. Without physical collapse, we don't need a special role played by observation or measurement that needs further explanation concerning when and why collapse occurs - i.e. the measurement problem. Nor do we have to deal with various ambiguities and difficulties regarding relativistic causation, which are especially salient when applying collapse to entangled pairs of particles. The emergence of the classical world is also in principle less complicated given that particles always have definite properties; it can be noted that limits where quantum mechanics approximates classical behavior can be derived without any reference to collapse.
2. Bell violations are direct consequences of non-commutativity.
An obvious question for any interpretation is: how does it approach the infamous Bell violations? Works by a number of people have indicated that the violation of Bell inequalities by a set of observables is equivalent to the absence of a joint probability distribution that encompasses all of those observables. One notable example of this work is Fine's theorem; from this, it can be inferred that Bell violations are consequences of non-commutativity, which will be defined shortly.
--- --- ---
(Some references for Fine's theorem)
[i]Hidden variables, joint probability, and the Bell inequalities
https://scholar.google.co.uk/scholar?cluster=2543155278787880428&hl=en&as_sdt=0,5&as_vis=1
Joint distributions, quantum correlations, and commuting observables
https://scholar.google.co.uk/scholar?cluster=8813695518940155915&hl=en&as_sdt=0,5&as_vis=1 [/i]
--- --- ---
Also in a similar vein:
--- --- ---
[i]George Boole's 'conditions of possible experience' and the quantum puzzle
https://scholar.google.co.uk/scholar?cluster=16366977415123888164&hl=en&as_sdt=0,5&as_vis=1
Possible Experience: From Boole to Bell
https://scholar.google.co.uk/scholar?cluster=301320604491795906&hl=en&as_sdt=0,5&as_vis=1[/i]
--- --- ---
Fine's theorem specifically demands the following as equivalent:
[i]"(3) There is one [global] joint distribution for all observables of the experiment, returning the experimental probabilities.
(4) There are well-defined, compatible joint distributions for all pairs and triples of commuting and non-commuting observables.
(5) The Bell inequalities hold."[/i]
The Kochen-Specker theorem implies that (4) from above is impossible in quantum mechanics due to its Hilbert space structure. More specifically, (4) fails simply because quantum mechanics always contains non-commuting pairs of observables, and these cannot have valid pairwise joint probability distributions under the usual assumptions.
Pairs of observables without joint probability distributions are said to be incompatible. When we try to define joint probability distributions for these incompatible pairs of observables, the resulting distributions violate the rules of probability, notably the law of total probability, which equates marginal probabilities to sums of joint probabilities:
https://en.m.wikipedia.org/wiki/Law_of_total_probability
This prevents the resultant probabilities from describing a conventionally valid probability distribution (though I suppose this doesn't necessarily stop someone using unconventional rules of probability). In contrast, compatible pairs of observables do have valid joint probability distributions.
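To make this failure concrete, here is a toy numerical sketch (my own illustration, not taken from the referenced papers; Python with numpy assumed): a qubit prepared in the sigma-x = +1 state gives sigma-x = +1 with certainty when measured directly, but summing its sigma-x statistics over the outcomes of an interposed sigma-z measurement yields a different marginal, so the law of total probability fails.

```python
import numpy as np

# Hypothetical toy example: a qubit prepared in the sigma_x = +1 eigenstate.
plus_x = np.array([1, 1]) / np.sqrt(2)           # sigma_x = +1 eigenstate
z_up, z_dn = np.array([1, 0]), np.array([0, 1])  # sigma_z eigenstates

# Direct probability of observing sigma_x = +1
p_direct = abs(plus_x @ plus_x) ** 2             # = 1.0

# "Total probability" computed through an intervening sigma_z measurement:
# p(+x) = sum_z p(z) * p(+x | state collapsed to z)
p_total = sum(abs(z @ plus_x) ** 2 * abs(plus_x @ z) ** 2
              for z in (z_up, z_dn))             # = 0.5

print(p_direct, p_total)  # 1.0 vs 0.5: the law of total probability fails
```

The interposed sigma-z measurement "disturbs" the sigma-x statistics, which is exactly the sense in which incompatible observables lack a common joint distribution.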
Non-commutativity just means that the order in which a pair of quantum measurement operators is applied affects the measurement results for that pair: essentially, measurements of non-commuting observables disturb each other. By contrast, commuting pairs of observables do not disturb each other: the outcome for one observable in the pair is unaffected by measurement of the other, so the measurement order has no effect.
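This is easy to see in the matrix representation: the Pauli spin matrices for the x and z axes multiply to different results depending on order (a minimal sketch using numpy):

```python
import numpy as np

# Pauli matrices for spin along x and z
sx = np.array([[0, 1], [1, 0]])
sz = np.array([[1, 0], [0, -1]])

# Nonzero commutator => the order of application matters
commutator = sx @ sz - sz @ sx
print(commutator)  # [[0, -2], [2, 0]], i.e. not the zero matrix
```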
Since (4) doesn't hold, (5) doesn't either. The fact that a quantum system generates correlations strong enough to violate Bell inequalities is equivalent to the fact that it has no global joint probability distribution (again via violation of the law of total probability), which is caused by the presence of incompatible pairs of variables that do not commute. Without a global joint probability description, the statistics of Bell experiments can only be described using multiple separate joint probability spaces, one for each of the compatible pairs.
The non-commuting pairs are therefore the root of Bell violations, a point emphasized by a number of recent physicists, including proofs, in the case of the CHSH inequality, that non-commutativity is both necessary and sufficient for Bell violation.
--- --- ---
[i]Get rid of nonlocality from quantum physics
https://scholar.google.co.uk/scholar?cluster=11575548674791370584&hl=en&as_sdt=0,5&as_vis=1
Making sense of Bell's theorem and quantum nonlocality:
https://scholar.google.co.uk/scholar?cluster=6010274925746687086&hl=en&as_sdt=0,5&as_vis=1
Nonlocality claims are inconsistent with Hilbert-space quantum mechanics:
https://scholar.google.co.uk/scholar?cluster=18053424246858448382&hl=en&as_sdt=0,5&as_vis=1
In praise of quantum uncertainty:
https://scholar.google.co.uk/scholar?cluster=4615841789462490335&hl=en&as_sdt=0,5&as_vis=1
Experimental Counterexample to Bell's Locality Criterion:
https://scholar.google.co.uk/scholar?cluster=4769324801739580243&hl=en&as_sdt=0,5&as_vis=1[/i]
--- --- ---
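To make the CHSH case concrete, here is my own numerical sketch (not code from the papers above) computing the CHSH quantity for the singlet state from the standard quantum rules, with spin measurement angles chosen in the x-z plane; the magnitude 2*sqrt(2) exceeds the classical bound of 2:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    # spin observable along angle theta in the x-z plane
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state (|01> - |10>)/sqrt(2) in the basis |00>,|01>,|10>,|11>
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    # correlation <sigma_a (x) sigma_b> in the singlet state (= -cos(a - b))
    return np.real(singlet.conj() @ np.kron(spin(a), spin(b)) @ singlet)

a1, a2, b1, b2 = 0, np.pi / 2, np.pi / 4, -np.pi / 4
S = E(a1, b1) + E(a1, b2) + E(a2, b1) - E(a2, b2)
print(S)  # ~ -2.828..., so |S| = 2*sqrt(2) > 2
```

Each of the four correlations involves only commuting (spatially separated) pairs; the violation arises because Alice's two settings, and Bob's two settings, are non-commuting among themselves.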
It is worth noting briefly that in any Bell experiment, the non-commuting pairs of observables that cause the Bell correlations are always locally related, in the sense of being properties of the same particle. We therefore cannot explain Bell correlations as a direct result of disturbances between non-commuting observables acting across spatially separated locations. By contrast, the spatially separated pairs of observables in these experiments are always pairwise compatible, which is typically interpreted as showing that spatially separated observables cannot disturb each other in ways that would violate speed-of-light limits (non-signalling).
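This pairwise compatibility across the separated wings can also be sketched numerically (again a toy illustration of my own): for the singlet state, Alice's marginal statistics come out the same whatever angle Bob measures along, i.e. non-signalling holds even while the joint correlations violate CHSH.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2)

def spin(theta):
    return np.cos(theta) * sz + np.sin(theta) * sx

def proj(theta, outcome):
    # projector onto the +1 or -1 eigenspace of spin(theta)
    return (I2 + outcome * spin(theta)) / 2

singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def p_alice_up(alice_angle, bob_angle):
    # probability that Alice sees +1, summed over Bob's outcomes
    return sum(np.real(singlet.conj()
                       @ np.kron(proj(alice_angle, +1), proj(bob_angle, b))
                       @ singlet)
               for b in (+1, -1))

# Alice's marginal is 0.5 regardless of Bob's setting
print(p_alice_up(0.3, 0.0), p_alice_up(0.3, 1.2))
```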
It might also be worth noting that the link between non-commutativity and Bell-type violations also appears in classical systems, most notably in classical polarization optics. Bell-type violations have also been derived in the context of Brownian motion as a consequence of Heisenberg-like uncertainty relations. An important distinction from the quantum violations is that none of the classical examples involve nonlocal correlations between spatially separated variables: i.e. they are local, intrasystem violations. While clearly not quantum, this perhaps adds further evidence that Bell violations are a formal necessity of non-commutativity, regardless of the system.
--- --- ---
[i]Quantum Mechanics and Classical Optics: New Ways to Combine Classical and Quantum Methods:
https://scholar.google.co.uk/scholar?cluster=9708108079894379453&hl=en&as_sdt=0,5&as_vis=1
Entanglement in Classical Optics:
https://scholar.google.co.uk/scholar?cluster=9687012103438601910&hl=en&as_sdt=0,5&as_vis=1
Brownian Entanglement:
https://scholar.google.co.uk/scholar?cluster=11916823875626356065&hl=en&as_sdt=0,5&as_vis=1[/i]
--- --- ---
3. Heisenberg's uncertainty principle and non-commutativity are generic properties of stochastic processes.
First, it can be noted that the canonically non-commuting variables of quantum mechanics generally correspond to generalized coordinates (e.g. position) and generalized momenta in Hamiltonian mechanics. The major exception is the mutually non-commuting angular momentum operators along the x, y and z axes (which generalize to the spin observables seen in Bell experiments). Though deriving their commutation relations requires the usual canonical commutation relations for position and momentum, this non-commutativity is essentially inherited from a generic non-commutativity affecting all descriptions of 3D rotation (the SU(2) or SO(3) groups). It is also the root of Bell-type violations in classical polarization optics. Even with everyday objects, you can see that successive rotations in different planes from the same starting position will end in different final positions if those same rotations are performed in different orders: the different planes of rotation just don't commute.
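The everyday demonstration can be mirrored with 3D rotation matrices (a minimal numpy sketch): a quarter-turn about the x axis followed by a quarter-turn about the z axis ends somewhere different from the same turns applied in the opposite order.

```python
import numpy as np

def Rx(t):
    # rotation by angle t about the x axis
    return np.array([[1, 0, 0],
                     [0, np.cos(t), -np.sin(t)],
                     [0, np.sin(t),  np.cos(t)]])

def Rz(t):
    # rotation by angle t about the z axis
    return np.array([[np.cos(t), -np.sin(t), 0],
                     [np.sin(t),  np.cos(t), 0],
                     [0, 0, 1]])

q = np.pi / 2  # a quarter turn
print(np.allclose(Rx(q) @ Rz(q), Rz(q) @ Rx(q)))  # False: order matters
```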
From non-commutativity, Heisenberg's uncertainty relations can be derived. These relations dictate that the variance, or uncertainty, of measurement for one observable of a non-commuting pair is inversely related to the uncertainty for the other observable in that pair. In the statistical interpretation, we can interpret this purely in terms of probability distributions realized by the long run relative frequencies of outcomes. For instance: if, over many repeated iterations of some experimental context, the measured position of a particle tends to be bunched up in one location, then the repeated measurements of momentum will give values dispersed in all directions.
There is a very simple example of this which can be described in classical optics. If we send a beam of light through a slit, the width of the slit (denoting position) is inversely related to the size of the angle or spread of directions (denoting momentum) by which the light exits the other side of the slit. In the (statistical) quantum description, this pattern is predicted when repeatedly sending single photons one by one through the slit. Importantly, it can be gleaned from the example that the Heisenberg uncertainty relation is not about measurement itself, but the experimental context which constrains the behavior of the particles. If the experimental context necessitates a particular spread of position measurements (e.g. because of the width of the slit), then this constrains (i.e. disturbs a la non-commutativity) the spread of momentum measurements, and vice versa.
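A toy numerical sketch of this slit intuition (my own construction, using a Gaussian aperture for simplicity): the narrower the position-space amplitude, the wider the spread of its Fourier transform (standing in for momentum), with the product of the two spreads roughly constant.

```python
import numpy as np

x = np.linspace(-50, 50, 4096)
dx = x[1] - x[0]
k = np.fft.fftshift(np.fft.fftfreq(x.size, d=dx)) * 2 * np.pi
dk = k[1] - k[0]

def spreads(sigma):
    # Gaussian "slit" amplitude of width sigma and its Fourier transform
    psi = np.exp(-x**2 / (4 * sigma**2))
    phi = np.fft.fftshift(np.fft.fft(psi))
    px = np.abs(psi)**2 / np.sum(np.abs(psi)**2 * dx)   # position density
    pk = np.abs(phi)**2 / np.sum(np.abs(phi)**2 * dk)   # momentum density
    dx_std = np.sqrt(np.sum(x**2 * px * dx))
    dk_std = np.sqrt(np.sum(k**2 * pk * dk))
    return dx_std, dk_std

narrow = spreads(0.5)  # narrow slit
wide = spreads(2.0)    # wide slit
print(narrow, wide)    # narrower position spread => wider momentum spread
```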
We can also see (Old Edit: ignore this paragraph; do not think this intuition is correct) in an intuitive sense how this might lead to pairwise incompatibility, precluding a valid joint probability distribution. Just as the statistics of the global joint probability distribution referred to earlier can only be represented in multiple separate probability spaces or contexts, it seems that joint probability distributions for position and momentum might in principle only be represented validly within distinct experimental contexts (e.g. different widths of slit) which each ascribe mutually exclusive sets of variances / spreads to the different observables. [b](New edit: having thought about it, I am pretty sure this paragraph is actually correct. Some experimental set up might consist of two variables A and B, each with marginal probabilities that will be realized in the experimental outcomes. There will be incompatibility when the law of total probability (LTP) is violated so that p(A) will not be the same as [sum p(A, B)] when A is considered jointly with B, perhaps under some specific measurement setting. This means B is disturbing p(A). [sum p(A, B)] is still a marginal probability, but the question is: for what? Skipping some explanation for brevity, it will just be a marginal probability for A in some specific context involving B that must somehow differ from our original p(A), where no contexts were explicitly differentiated in the experimental set up. For position-momentum, we might see the disturbance as linked to the Heisenberg uncertainty - since the variance of one observable is inversely related to that of the other, their marginal probability distributions are necessarily constrained / altered by each other. The marginal distribution of one of the non-commuting pair will depend on the specific distribution of the other, disturbing the notion of any context-independent marginal probabilities.
The contextual joint probabilities are then always dependent on how particular experimental contexts constrain position/momentum and so there would be no possible joint probability that we can construct just using p(A) and p(B) taken at face value from the experimental set up. Any experimental set up which subsumes or fails to differentiate multiple different contexts for non-commuting observables will have marginal probabilities which fail to produce a single valid joint probability distribution for those observables that attempts to generalize across all of those different contexts simultaneously... Only joint probabilities in individual contexts induced by disturbances.
Andrei Khrennikov has a whole series of papers informative on this, talking about the link between the law of total probability, interference and experimental contexts. The following is just one example:
https://scholar.google.co.uk/scholar?cluster=4642651957428255714&hl=en&as_sdt=0,5&as_vis=1 )[/b]
We can now look at specifically why position and momentum do not commute and have particular uncertainty relations. When looked at in terms of the path integral formulation, the non-commutativity in quantum mechanics can be derived directly from the fact that the particle trajectories are non-differentiable. This can be seen as a direct result of the erratic, zig-zagging nature of the paths, embodying the inherently random, probabilistic nature of measurement outcomes in quantum mechanics.
https://en.m.wikipedia.org/wiki/Path_integral_formulation (Section: path integral in quantum mechanics; canonical commutation relations)
This property seems to be directly inherited from the Wiener process / integral, to which the path integral formulation is related by Wick rotation.
[i]https://en.m.wikipedia.org/wiki/Wick_rotation
Note: Where is the Commutation Relation Hiding in the Path Integral Formulation?
https://scholar.google.co.uk/scholar?cluster=11872738296616463941&hl=en&as_sdt=0,5&as_vis=1[/i]
The Wiener process is a very broadly applicable stochastic process, perhaps best known in physics as a model for Brownian motion, which describes the random behavior of a particle suspended in a medium (i.e. a liquid or gas). The evolving probability density function for the Brownian motion of a particle can then be described by a diffusion equation. Interestingly, the Schrodinger equation is also related by Wick rotation to a diffusion equation, so that the Schrodinger equation can effectively be seen as a diffusion equation with a complex constant, or as describing a diffusion process in imaginary time.
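This relationship can be checked symbolically (a small sketch of mine using sympy): the Gaussian heat kernel satisfies the diffusion equation u_t = D u_xx, and formally substituting the complex constant D -> i*hbar/(2m) turns that same equation into the free-particle Schrodinger equation.

```python
import sympy as sp

x = sp.symbols('x', real=True)
t, D = sp.symbols('t D', positive=True)

# Gaussian heat kernel: the fundamental solution of the diffusion equation
u = sp.exp(-x**2 / (4 * D * t)) / sp.sqrt(4 * sp.pi * D * t)

# u_t - D * u_xx should vanish identically
residual = sp.simplify(sp.diff(u, t) - D * sp.diff(u, x, 2))
print(residual)  # 0: the kernel satisfies the diffusion equation
# Formally replacing D with i*hbar/(2m) gives i*hbar*psi_t = -(hbar^2/2m)*psi_xx,
# the free-particle Schrodinger equation (the Wick rotation t -> -i*tau).
```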
Paths realized by Wiener processes are characterized by the same kind of erratic, jittery fluctuations, rendering them non-differentiable. This non-differentiability is well known, and led to the construction of tools such as stochastic calculus, designed specifically to deal with it. Consequently, we can actually derive non-commutativity properties for classical Wiener processes in much the same way as in the quantum case:
[i]Itos stochastic calculus and Heisenberg
commutation relations:
https://www.sciencedirect.com/science/article/pii/S0304414910000256[/i]
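The non-differentiability itself is easy to exhibit numerically (a toy sketch of my own): Wiener increments over a time step dt have typical size sqrt(dt), so finite-difference slopes grow without bound as dt shrinks, roughly like 1/sqrt(dt).

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_abs_slope(dt, n=100_000):
    # W(t+dt) - W(t) is Gaussian with standard deviation sqrt(dt)
    increments = rng.normal(0.0, np.sqrt(dt), n)
    # average magnitude of the difference quotient |dW/dt|
    return np.mean(np.abs(increments)) / dt

coarse = mean_abs_slope(1e-2)
fine = mean_abs_slope(1e-4)
print(coarse, fine)  # the finer the time grid, the steeper the slopes
```

Shrinking dt by a factor of 100 multiplies the typical slope by about 10, so the difference quotients diverge and no derivative exists in the limit.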
Given that the Heisenberg uncertainty relations can be derived from non-commutativity, it is no surprise that analogous uncertainty relations are well documented in diffusion processes. Not only can these be derived from the same kind of non-differentiability, but this seems to apply generically to a broad range of stochastic systems. Examples include hydrodynamics and the well known uncertainty relations of statistical thermodynamics:
--- --- ---
[i]Generalization of uncertainty relation for quantum and stochastic systems:
https://www.sciencedirect.com/science/article/abs/pii/S0375960118303633
Classical uncertainty relations and entropy production in non-equilibrium statistical mechanics:
https://scholar.google.co.uk/scholar?cluster=6026419417934600498&hl=en&as_sdt=0,5&as_vis=1
Non-quantum uncertainty relations of stochastic dynamics:
https://scholar.google.co.uk/scholar?cluster=12722751213412558053&hl=en&as_sdt=0,5&as_vis=1[/i]
--- --- ---
Given that quantum mechanics is clearly a special case of stochastic systems where non-commutativity and uncertainty relations occur, this hints that they are just a direct result of the random, probabilistic nature of quantum mechanics.
There have also been some attempts to derive uncertainty relations directly from the probability density functions of stochastic processes in a comparatively generic manner. Here, probability densities (analogous to generalized coordinates/position) are contrasted with the probability gradient or flow/dynamics (analogous to generalized momentum) in the context of systems characterized by random behavior. These can also be described in terms of the Fourier transform in the same way as can be done for quantum position-momentum uncertainty relations:
--- --- ---
[i]Indeterminacy relations in random dynamics:
https://scholar.google.co.uk/scholar?cluster=6176854777283805481&hl=en&as_sdt=0,5&as_vis=1
Information dynamics: temporal behavior of uncertainty measures:
https://scholar.google.co.uk/scholar?cluster=6481107725040031189&hl=en&as_sdt=0,5&as_vis=1
Parcels and particles: Markov blankets in the brain:
https://scholar.google.co.uk/scholar?cluster=13034249073504028456&hl=en&as_sdt=0,5&as_vis=1
A free energy principle for a particular physics:
https://scholar.google.co.uk/scholar?cluster=10954599080507512058&hl=en&as_sdt=0,5&as_vis=1[/i]
--- --- ---
The author of the bottom paper has given an interesting possible intuition for thinking about why these types of uncertainty relations might exist generically:
"Intuitively, if the probability mass of a particular state is concentrated around one point in phase-space, then the flow must [be vigorously rebuilding gradients in all directions to counter the dispersive effects of random fluctuations]. This means that the flow is as dispersed as the fluctuations. Conversely, if flow is limited to a small range, random fluctuations would disperse particular states over state-space. In short, the dispersion of states and their flow must complement each other at a nonequilibrium steady-state."
As an example: if you want to stop a droplet of some liquid dissolving in a glass of water, you will want forces to act on the liquid particles from all directions to keep them within a small vicinity. If force is applied solely from one direction without counteracting forces from all of the others, the liquid droplet will still be able to disperse everywhere else except from where the force came from. It would be impossible to keep the liquid particles all in one place if the forces acting on them are all acting in / from a single precise direction. Conversely, the liquid particles could never disperse across a broad range of positions if there were forces acting from every possible direction gathering them up.
The author also expresses this similarly in the more specific context of thermodynamics which also has well known thermodynamic uncertainty relations:
"Intuitively speaking, random fluctuations always increase the entropy through dispersing the ensemble density, while flow decreases entropy by rebuilding probability gradients (i.e., where probability currents flow towards regimes of greater density). In other words, random fluctuations disperse states into regimes of high surprisal (and implicitly thermodynamic potential), while gradient flows due to forces counter the implicit dispersion."
4. Do we need to give up realism, locality or free choice?
As has been said, the wave function does no more than describe long run relative frequencies, which manifest as compatible, non-signalling joint probability distributions between the spatially separated measurements of a Bell experiment. Nor does the wave function describe individual particles whose superpositions physically collapse. Combined with the fact that Bell violations are formally rooted in locally non-commuting variables, which are necessitated by stochastic systems, it is tempting to think that Bell nonlocality is more or less a strange statistical artifact signifying the absence of global joint probability distributions (or of Boole's conditions of possible experience). This may explain why nonlocality seems to co-exist happily with non-signalling even though they prima facie contradict each other. So even though there are definitely nonlocal correlations in quantum mechanics, we seem able to explain them away locally, without appeal to spooky nonlocal forces that act between individual particles and violate speed-of-light limits. Given that the non-commutativity of locally related observables can be naturally explained as a consequence of stochasticity, there also seems to be no need to appeal to distant causes in the past influencing measurement settings a la superdeterminism, i.e. giving up free choice.
It seems that giving up realism may be preferable: giving up the notion that particle states have pre-existing values. Under a statistical interpretation, we can retain definite properties of particles at every point in time during any run of an experiment without contradicting contextuality, incompatibility or the notion of irrealism. This is because the wave function and superposition are about probability distributions describing long run frequencies, not individual particles. The idea that particles do not have definite pre-existing states prior to measurement is then simply replaced by, or rather shifted toward, the notion that non-commuting variables do not have meaningful joint probability distributions for their long run frequencies independent of the particular experimental context. So even though realism is given up in a way compatible with the requirements of Bell's theorem, individual particles can still retain their definite properties in a realist way.
5. Underlying causes of indeterminacy?
The main benefits of a statistical or stochastic interpretation are that there is no measurement problem and that some realism is restored, in the sense that particles have definite properties. It is also arguably the most straightforward way of interpreting the math of quantum mechanics without injecting any additional ontology into the formulas; after all, the Schrodinger equation is just a complex-valued diffusion equation, and diffusion equations describe the behavior of stochastic processes. At the same time, this leaves a lot to be desired in terms of explanation: it might be asked why exactly particles behave non-deterministically, perhaps hinting at some underlying cause yet to be discovered. A deterministic explanation would not necessarily be ruled out, in the same way that stochastic models of Brownian motion are in principle explainable in a classical, deterministic way through collisions between a Brownian macro-particle and the micro-particles of the medium it is suspended in. Given how Brownian motion arises from the behavior of particles suspended in some background medium, we might ask whether there is something analogous in quantum mechanics that could provide the kind of underlying explanation we might want. Potential hypotheses could relate to quantum foam or random fluctuations in the underlying spacetime and quantum vacuum fields.
Comments (60)
Not quite, if I am understanding you correctly. It's saying that because quantum mechanics under this interpretation is solely about long run statistics of many repeated occurrences, realism is not about the indefiniteness of individual particles; it is about entire probability distributions. These probability distributions are realized by those many repeated occurrences or experimental runs, each involving a particle with definite properties at any given time.
Fine's theorem proves that a Bell violation is just equivalent to a lack of joint probability distribution. It doesn't matter why there is a lack of joint probability distribution, it doesn't matter exactly what forces are acting on the system and why, as long as there is a lack of joint probability distribution in this setting, Bell inequalities will be broken as a formal requirement. It is in that way that it is an artifact of statistics; the violation is very real, just that it is an artifact of the lack of joint probability distribution. This lack of joint distribution is caused in quantum mechanics by non-commuting observables and I believe this non-commuting nature is just a necessary fact of certain types of randomly behaving systems like quantum mechanics seems to describe.
I'm pretty sure physicists call that structure quantum mechanics, because quantum mechanics explicitly predicted the results we do in fact see.
Fun fact, the Schrödinger equation is deterministic! It's a deterministic mathematical equation that determines how the wave function evolves. Quantum physics is still math, like any other physics. It's still structured, it's not just a bunch of physicists around a hookah pipe.
It seems you reject qm for what it is out of hand - perhaps, given its incredible track record for successful predictions, you could give it more of a chance than that.
Sure it does. The op of this thread is trying to come up with an alternative to taking the wave function as ontologically real - which implicitly points to the fact that in many approaches to quantum mechanics, the wave function IS real, it is causal, it evolves deterministically over time via the Schrödinger equation, etc. So yeah, qm can absolutely work that way.
It is proven under Fine's theorem here:
https://scholar.google.co.uk/scholar?cluster=2543155278787880428&hl=en&as_sdt=0,5&as_vis=1
That Bell violations are equivalent to the absence of a joint probability distribution for all variables, which is equivalent to the presence of incompatible joint probability distributions caused by non-commuting variables.
That is suggesting that Bell violations are caused by non-commuting variables and this is a completely formal result; in other words, it does not matter why the variables do not commute, they will cause Bell violations so long as they fulfil the formal conditions that characterize non-commuting observables. Quantum mechanics fulfils these exact criteria; in having non-commuting variables it will have Bell violations as a necessary mathematical consequence. That is absolutely sufficient for the "mystery", without requiring a physical explanation since the relation between Bell violations and joint probability distributions is completely formal, even if it looks really really bizarre.
The something that is "at work" is the non-commutativity in the spin measurements (it causes the absence of a global joint distribution), which has a natural local explanation in that 1) canonical position-momentum non-commutation is a generic feature of certain kinds of random dynamical systems, even classical ones, and 2) 3D descriptions of rotation inherently have non-commuting properties for formal reasons, which you can actually demonstrate for yourself by applying successive rotations in different orders even to your own hand.
In other words, you're saying math exists only to describe the things you accept as real, but I think math is the reality, and the things you like to think of as real are a consequence of the math.
What if every quantum object is just a numerical vector "moving" across a 3 dimensional (or more) array, and everything you know is just a consequence of these numbers interacting?
I personally don't think it's a coincidence that physics behaves in ways that are describable by functions. Galileo said mathematics is the language of the universe. Perhaps he was right?
Good luck in whatever you're doing, I'll speak to you later.
Slightly off-topic : If you will think of Mathematical relationships as A> a form of Information, and B> Information as "the power to enform a mind", plus C> Energy as the power to enform matter (as in E=MC^2), then the notion of a Real universe consisting of mathematical (structural) & informational (meaningful) relationships might begin to make sense. Of course, it's a great leap from Atomism & Materialism.
Some of Tegmark's Mathematical Universe conjectures are preternatural & transcendent, but the notion that reality is fundamentally Mathematical & Informational is compatible with our modern knowledge of Nature via Physics. Below are a few other thoughts on Math (information) as the fundamental element of Reality. To answer your question, the Abstract Math form of Information may-or-may-not-be inert (depending), but the Energy form of Generic Information "gives it causal efficacy". :smile:
The mass-energy-information equivalence principle :
Landauer's principle, formulated in 1961, states that logical irreversibility implies physical irreversibility and demonstrated that information is physical. Here we formulate a new principle of mass-energy-information equivalence, proposing that a bit of information is not just physical, as already demonstrated, but has a finite and quantifiable mass while it stores information.
https://pubs.aip.org/aip/adv/article/9/9/095206/1076232/The-mass-energy-information-equivalence-principle
Mathematics : Greek máthēma (μάθημα), meaning "that which is learnt", "what one gets to know",
Note : Knowledge is Information ; hence Math is information.
Is Information Theory Mathematics? :
Yes, Information Theory is a branch of mathematics
https://math.stackexchange.com/questions/1083862/is-information-theory-mathematics
Is information the only thing that exists? :
Physics suggests information is more fundamental than matter, energy, space and time
https://www.newscientist.com/article/mg23431191-500-inside-knowledge-is-information-the-only-thing-that-exists/
What is Information? :
Abstract Information : the 1s & 0s of computer language. Existence = 1, Non-existence = 0
Causal Information : Energy - e.g. the ratio between Hot & Cold. Energy is the causal power of Information. https://en.wikipedia.org/wiki/Information_causality
https://en.wikipedia.org/wiki/Information
Material Information : E=MC^2. Mass is Enformed Energy, and is an essential property of Matter. "the equation says that energy and mass (matter) are interchangeable; they are different forms of the same thing." https://www.pbs.org/wgbh/nova/einstein/ ... 2expl.html
Shannon Information : The abstract ratio of One to Zero. It yields accuracy in computation, but omits any meaning or significance. Quantity without Quality. Like language, its utility is in its ability to mean anything you want to convey.
https://informationphilosopher.com/index.30.en.html
Organic Information : Living organisms are defined and organized by their "Information Molecule", which we call DNA. https://en.wikipedia.org/wiki/DNA
Semantic Information : Meaning in a conscious mind; for example the relationship between Self and Other. It can be expressed mathematically as a numerical ratio, or emotionally as a positive/negative feeling.
https://plato.stanford.edu/entries/info ... -Semantic/
https://bothandblog6.enformationism.info/page16.html
And you don't think those rules are defined in some way that's analogous to mathematics and/or computation?
Well we have universes where the fundamental rules are defined by computation and mathematics, and those universes are called video games. Similarly, cellular automata universes like Conway's game of life.
The things you call "matter" are in theory representable by numerical data (that's exactly how they're represented in video games and in physics simulations), and able to be manipulated by computations of mathematical rules - that's how every "universe" we've manufactured works, which at least gives us a potential analogy for how any universe might work - I'm not saying our universe definitely obviously works like that and you're wrong if you think otherwise (even though you're apparently that confident of your own position for some reason), I'm saying it's a potentially strong analogy to how our universe operates.
Why?
You don't have to accept it as truth my man, I'm not trying to convince you it's the case. You asked some questions, I tried to answer them. Plenty of physicists think like this. It doesn't matter if you like it or not, I'm perfectly fine with you not thinking the universe is mathematical. It makes perfect sense to me.
But in the end it actually is a matter of likes and dislikes. It's not like there's a known objective answer to the true fundamental nature of our universe right now, so if you feel strongly that it can't be mathematical, which you seem to do, it's not because you have scientific proof that it can't be. It's because you don't like it, it's because there's some aspect of it that doesn't sit well with you, that goes against the grain of your intuitions about how the world works. It goes exactly with the grain of my intuitions, so I think it's a compelling idea.
The concept of Turing completeness provides the bedrock, for me, for the idea that we might be in a computational universe - that computation and/or mathematics are strongly analogous to the root nature of every thing and every event in this place. That makes sense to me.
I don't think there is any explicit problem with the idea. I think you may not like the idea for your own philosophical reasons, and I have no illusion that I'm capable of convincing you otherwise. I'm comfortable with that.
https://spotify.link/zMHaM1TQiDb
If the "detail" you're looking for is empirical evidence, it's probably not forthcoming. Mathematics is a language for science, not an object to be studied under a microscope. Likewise, Energy is an intangible invisible force that is observed only in its physical effects, not as a ding an sich. Both Math & Energy are now regarded, by scientists & philosophers, as forms of Generic Information. Basically most of the referenced links in my posts are philosophical/theoretical generalizations & opinions, not empirical evidence. So, the bottom line is : do you trust these theoretical scientists to know what they are talking about?
Since the advent of Quantum Theory, there has been a divergence between Theory & Practice. Classical physics, beginning around the 16th & 17th centuries, replaced the traditional philosophical observations & interpretations of Aristotle & such, with repeatable, recorded experimental evidence. But, on the quantum scale of reality, things are not that simple. Heisenberg defined the distinction between classical and quantum as "Uncertainty". For example, statistical observations cannot be just recorded as fundamental facts, they must be interpreted in the light of personal or conventional beliefs about their indirect observations. That's why Quantum Bayesianism*1*2 begins with subjective interpretations (beliefs) and adjusts the percentage of Certainty as more evidence comes in. Some empirical Quantum scientists, such as Feynman, objected to the Copenhagen incursion of philosophy into physics. Yet today, many quantum physicists are mathematical theorists (i.e. philosophers), who do no empirical work at all.
Regarding "how can that work?", I have my own personal theory, but I also post links to sites where professional scientists publish their own philosophical opinions on the "how" question*3*4. The empirical and philosophical research is ongoing on many fronts. For example, the Santa Fe Institute for the Study of Complex Systems is at the cutting edge of Quantum Information knowledge. But it also looks for enforming & causal effects in Biology & Chemistry*5. Little of this ongoing research is textbook stuff at the moment, but the direction is obvious : everything in the world is a form of Information, including Mathematics, Mind, & Matter*6. :smile:
*1. Quantum Philosophy :
In physics and the philosophy of physics, quantum Bayesianism is a collection of related approaches to the interpretation of quantum mechanics, the most prominent of which is QBism (pronounced "cubism"). QBism is an interpretation that takes an agent's actions and experiences as the central concerns of the theory.
https://en.wikipedia.org/wiki/Quantum_Bayesianism
*2. Unraveling QBism :
In QBism, the wave function represents an observer's subjective beliefs about the possible outcomes of a measurement rather than an objective description of reality.
https://medium.com/physics-philosophy-more/qbism-a-technical-discourse-34109e2b3c16
*3. Information causality :
Information causality is a physical principle suggested in 2009. . . .
https://en.wikipedia.org/wiki/Information_causality#cite_note-1
*4. Information converted to energy :
Physicists in Japan have shown experimentally that a particle can be made to do work simply by receiving information, rather than energy
https://physicsworld.com/a/information-converted-to-energy/
*5. New paper answers causation conundrum :
Called downward causation . . . . However, as soon as one spends a little time considering how this causality works, trouble arises.
https://www.santafe.edu/news-center/news/new-paper-answers-causation-conundrum
*6. Information and the Nature of Reality :
Many scientists regard mass and energy as the primary currency of nature. In recent years, however, the concept of information has gained importance.
https://www.cambridge.org/core/books/information-and-the-nature-of-reality/811A28839BB7B63AAB63DC355FBE8C81
Note --- several of the authors of this anthology are associated with Santa Fe Institute
*7. From Matter to Life, Information and Causality :
If information makes a difference in the physical world, which it surely does, then should we not attribute to it causal powers?
https://www.cambridge.org/core/books/from-matter-to-life/4DA89C33D0FF29E749E6B415739F8E5A
Note --- several of the authors of this anthology are associated with Santa Fe Institute
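For what it's worth, the Bayesian updating that QBism builds on (*1, *2 above) is simple enough to sketch in a few lines of code. This is a minimal illustration of how an agent "adjusts the percentage of Certainty as more evidence comes in"; all the numbers below are hypothetical.

```python
def bayes_update(prior, likelihood, likelihood_given_not):
    """Return P(H|E) from P(H), P(E|H), and P(E|not-H) via Bayes' rule."""
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# Start with a 50% subjective credence in some hypothesis H, then update
# on three pieces of evidence that are more likely if H is true (0.8)
# than if it is false (0.3). Credence climbs toward, but never reaches, 1.
credence = 0.5
for _ in range(3):
    credence = bayes_update(credence, likelihood=0.8, likelihood_given_not=0.3)

print(round(credence, 3))  # 0.95
```

The point of the sketch is only that certainty is revised, not declared: no finite amount of evidence drives the credence all the way to 1 or 0.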
Would you characterize the world model described above as "Materialism", or "Physicalism", or merely "Atheism"? No information, no patterns, no interrelationships, just atoms whirling in the void? The missing element is Meaning, which is significant only to evolved creatures capable of knowing, and knowing that they know, hence possessing a Self Concept, and the concept of Other Minds.
Before Shannon defined it in terms of abstract mathematical values, the original definition of Information was "knowledge in a mind"*1. The "form" part of Information simply refers to an Idea (a mental concept), rather than a material thing. For example, "uncertainty" is a state of mind, and Information Theory is intended to reduce the uncertainty of a communication*2. A world of mindless "stuff" would not know the feeling of uncertainty, only a world of persons can feel & know. Are you a thing, or a person?
A world that "knows nothing of information" is a world without Ideas, a world without Meaning, just Things doing whatever Energy forces them to do. If that is the case, what is the purpose of Philosophy? Does it put food on the table for "things that interact"? Are you saying that the only form of information you are interested in is in the form of "stuff"? What kind of "stuff" do you get from this forum? :smile:
*1. Information Etymology :
late 14c., informacion, "act of informing, communication of news," from Old French informacion, enformacion "advice, instruction," from Latin informationem (nominative informatio) "outline, concept, idea," noun of action from past participle stem of informare "to train, instruct, educate; shape, give form to"
https://www.etymonline.com/word/information
*2. Information theory :
Information theory studies the transmission, processing, extraction, and utilization of information. Abstractly, information can be thought of as the resolution of uncertainty.
https://en.wikipedia.org/wiki/Information_theory
Quoting tim wood
Another term for that "cottage industry" you mentioned is Philosophy. And yes, Philosophers & Scientists do indeed "define information in peculiar ways". One of those ways is to create imaginary "models" of reality, that are not in themselves real, but ideal*3. Another term for a mental model of reality is Theory. Do you know the real world directly, or only by means of models & theories (a la Kant)?*4
Gregory Bateson was a people-watcher, and defined Information in a "strange", but human-oriented way*5. Claude Shannon was an engineer, not a philosopher, and he redefined Information in an odd way : as a degree of uncertainty (i.e. entropy). But what is Uncertainty to a bit of stuff? What difference does Entropy make to a rock?
On this forum, do you communicate information to mindless things, or to the minds of unseen persons, for whom that knowledge might make a difference in their understanding of the world? A world with "no pattern" is a world of Random Noise. Do you hear the static, and miss the signal?*6 :cool:
*3. It from Bit :
It from bit symbolises the idea that every item of the physical world has at bottom (at a very deep bottom, in most instances) an immaterial source and explanation; that what we call reality arises in the last analysis from the posing of yes-no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe.
Note --- John Archibald Wheeler was a quantum physicist, not a philosopher. But this theory is philosophical, not scientific. The metaphysical philosophy of Materialism accepts his physical science, but rejects his metaphysical philosophy.
*4. On Reality :
"Uncertainty" is NOT "I don't know." It is "I can't know." "I am uncertain" does not mean "I could be certain." ____Werner Heisenberg
Real Things vs Appearances :
The world as it is before mediation Kant calls the noumenal world, or, in a memorable phrase, Das Ding an sich, a phrase which literally means The thing in itself, but whose sense would be more accurately caught by translating it as the thing (or world) as it really is (as distinct from how it appears to us).
https://philosophynow.org/issues/31/Kant_and_the_Thing_in_Itself
Note --- Empirical Science aspires to reality, but due to the "mediation" of the imperfect senses, must be content with "appearances" :
Appearance vs. Reality in the Sciences : https://academic.oup.com/book/7392/chapter-abstract/152237212?redirectedFrom=fulltext
*5. A bit of Information :
"What we mean by information - the elementary unit of information - is a difference which makes a difference, and it is able to make a difference because the neural pathways along which it travels and is continuously transformed are themselves provided with energy."
https://www.cs.bham.ac.uk/research/projects/cogaff/misc/information-difference.html
Note --- During the early period of Quantum science, Gregory Bateson was an English anthropologist, social scientist, linguist, visual anthropologist, semiotician, and cyberneticist whose work intersected that of many other fields. His first "difference" is physical, but the second "difference" is metaphysical (i.e. meaning), hence philosophical.
*6. "Call it the Heisenberg Uncertainty Principle of Error : we can be wrong, or we can know it, but we can't do both at the same time"
https://www.goodreads.com/quotes/tag/uncertainty-principle
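As an aside, Shannon's "odd" redefinition of Information as a degree of uncertainty is not just a metaphor; it is a number you can compute. A minimal sketch (the coin probabilities are hypothetical):

```python
import math

def shannon_entropy(probs):
    """Average uncertainty of a probability distribution, in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: one full bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each toss resolves less uncertainty.
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits
# A certain outcome resolves no uncertainty at all.
print(shannon_entropy([1.0]))        # 0.0
```

Which is exactly the point being argued over: the uncertainty here is a property of an expectation about outcomes, not of the coin itself.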
My two bits worth :
I assume you meant that the universe is lawless, and completely random. Of course, the "Laws of Physics" are human interpretations of how the world works, as experienced by highly-evolved creatures with both senses and reasons. But our sensory experience of those lawful behaviors has occurred only in the last few million years of evolution. And modern scientists have picked-out the law-like Order within a background of Randomness. Do you view the universe as a Big Accident that just happened to haphazardly produce highly-organized creatures who ask questions about the origins of Order?
Since the Big Bang, Nature has been coasting along on the angular momentum (vector) from a primordial burst of Energy of unknown etiology. The "angle" Nature takes over Time, seems to be regulated by primordial limits on the path of causal Energy. Which we know in retrospect as The Arrow of Time. Evolution is autonomous only in the details, due to random mutations (rearrangement of structure). Other than those details, the general direction was set in the initial conditions. Which included a trigger and the power to evolve, to change.
The unfolding of evolution is not impelled & guided by internal "laws". Instead, we infer the primordial "Laws" from observation of natural behavior. And some of us attribute such lawful behavior to a preternatural Lawmaker, imagined as a human king. As far as Cosmologists know though, Space & Time did not exist before the Bang. But how could nothingness "bang" without available Energy, or produce angular momentum without some input of direction? :smile:
Sounds remarkably similar to max Tegmarks mathematical universe.
It seems to me to also imply a sort of mathematical causality - if these numerical values change, that causes a different kind of universe, it causes different behaviours in the different universes.
It's talked about to some degree here, https://spotify.link/zMHaM1TQiDb
The cart before the horse. What is mathematics, first? I've been a mathematician for over a half century and cannot give a clear definition. There are numerous pages on Wikipedia that revolve around this question. How did math arise in human thought? Through language and observations of what we now consider logical - cause and effect - in the physical world?
Is there math without symbols? Well, yes, if one has the patience to express mathematical ideas through common language. What of the visual aspect of the subject? Well, there have been blind mathematicians who have been quite accomplished. I knew one: Larry Baggett, at the University of Colorado. So one could replace symbols with ordinary language, which seems to imply math is a substratum we contemplate by one or the other.
So, when Tegmark speaks of the mathematical universe - a creation whose structure is somehow mathematics - how can that be? Doesn't structure require a framework of sorts? And one would think physical. Maybe a collection of homeomorphic entities, that share a mathematical description, which in turn provides a uniform structure that somehow reifies.
Heady stuff that I predict will be left by the wayside of time. Or not.
I have the opposite intuition - doesn't a physical structure require a framework of sorts? And one would think mathematical (or computational/algorithmic).
Common language uses symbols, but of a different type from mathematics. The mathematical symbol is principally a visual object, while the symbol of common language is principally an aural object. There is a big difference in meaningful purpose, or usefulness, between these two types of symbols. The aural symbol serves to aid us in communication and assists in providing for our immediate needs. Its temporal existence is fleeting though. The visual symbol persists through time and serves as a memory aid. As an extension of memory (external memory) it enhances computational capacity.
We have developed ways to unify the two. Mathematical symbols have aural names, and aural words have letters which allow them to be written in visual form. I believe that it is this ingenuity, which provided for the combining of these two, which led to the explosion of human development in the latest era, beginning with cave painting, perhaps. The private memory aid of visual symbols became combined with the communicative power of common aural symbols, when the visual symbols were talked about, thereby producing a unification of distinct human memories which increased computational capacity exponentially.
Actually, thinking about it, I don't even really know what it means for the universe to be physical either, in a similar way, since all I have is my experiences, and my experiences definitely do not provide a direct link to the physical reality beyond my sensory boundaries. In so far as physical models are just predictive instruments that usually involve math substantially, the boundary between saying the universe is physical (in the sense of our models) or mathematical blurs. But then, maybe it's trivial, because we can impose mathematical language on almost anything in some way.
When we construct a building we lay plans, then follow through physically. So those plans underlie a physical project and provide a "framework", which can be destroyed if we wish without endangering the building. Not so in a mathematical universe. Somehow the building reifies the plans and the two can not be separated. Or whatever.
Are you using Wittgenstein as an authority to justify an evasive non-position on a philosophical question? Does that side-step imply that you have no philosophical worldview, or just that you don't want to expose your subjective personal "template" to objective critical analysis? I too, am wary of being dismissively labeled, but it's a risk I'm willing to take, in the interest of refining my beliefs in the give & take of philosophy. Perhaps you would be willing to deny the labels that don't apply to you?
By "template" you may mean Wittgenstein's "explanatory pictures" or merely "arbitrary belief systems". But humans seem to be born with a crude template to overlay on the outside world*1. That elementary world model is neither true nor false, but merely necessary to begin learning how to live in the real world. Immanuel Kant called the elements of knowledge "Categories of Understanding"*2. They are "pure" in the sense of not yet adulterated with conventional cultural belief systems.
For him that list of semantic compartments was merely a philosophical hypothesis, but modern Neuroscientists also assume that the brain stores experiences in a few pre-set categories. But, even with current technology, it's hard to locate them in the neuronal network, except in the most basic senses : taste, touch, etc. The inborn categories are general & imprecise, but become more specific with experience. Some brain scientists have even postulated a specialized "grandmother" or "Jennifer Aniston" cell {see image}. But the brain-maps are not likely to be single cells, or even that particular. Instead, they are organized into broad "meanings" or "semantic categories"*3.
Since these primitive templates (world maps) are inborn, you don't have to be "urged" to apply them to the non-self world. They automatically divide incoming sensory information into something like Kant's four categories and twelve classes. Over time, these general templates are refined into the specific concepts and comprehensive worldviews that philosophers argue over interminably. :smile:
*1. Category Learning :
We have instead evolved the ability to detect the higher-level structure of experiences, the commonalities across them that allow us to group experiences into meaningful categories and concepts. This process imbues the world with meaning. . . . . Categories represent our knowledge of groupings and patterns that are not explicit in the bottom-up sensory inputs
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3709834/
*2. Immanuel Kant's Categories of Understanding :
Kant ultimately distinguishes twelve pure concepts of the understanding, divided into four classes of three: 1. Quantity (Unity, Plurality, Totality), 2. Quality (Reality, Negation, Limitation), 3. Relation (Inherence and Subsistence (substance and accident), Causality and Dependence (cause and effect), Community (reciprocity)), and 4. Modality (Possibility, Existence, Necessity).
https://www.thephilosophyproject.in/post/immanuel-kant-s-categories-of-understanding
*3. New Map of Meaning in the Brain :
Our understanding, our knowledge about things, is actually somewhat embedded in the perceptual systems,
https://www.quantamagazine.org/new-map-of-meaning-in-the-brain-changes-ideas-about-memory-20220208/
BRAIN CATEGORIES including questionable Jennifer Aniston cells
My point was just to see if you were arguing from a well-thought-out personal worldview, or just parroting a party line (or template). For example, for all practical purposes (e.g. science & technology) I could be placed under the heading of "Materialist". But, for theoretical purposes (e.g. philosophy & ethics) I might fit better into the category of "Idealist". That's because the non-human material aspects of the world have no Ideas (words, concepts) for us to argue about : either it is, or it ain't.
Since my personal worldview is multi-faceted & complementary, I have labeled it a "BothAnd" philosophy. Yet, for Either-Or One-Siders, that broad-mindedness is confusing. Another way to look at it is : my scientific worldview is both Classical (matter/energy/objective) and Quantum (mind/observer/subjective). Since the topic of this thread is a Quantum physics question, my comments will be primarily focused on the mental interpretation. Which I suspected might clash with your views. Hence, the request for clarification. So yes, my intuition has been confirmed. But there is still room for further philosophizing. :smile:
For the purposes of this forum, Ideas are the non-things (non-stuff) that we argue about in threads such as this. And for the most part, Ideas are limited to a tiny clique in the universe, consisting mostly of the upright animals we label as homo sapiens ; implying that other animals are not wise enough to debate about the meaning of ideas. Hence, in the Real world, no questioning humans, no ideas, no philosophy ; just atoms whirling in the void. What makes ideas moot is their immaterial "substance". Material objects are seldom the topic of TPF threads. :smile:
Quoting tim wood
It's also the lack of material evidence for thingness, that limits Ideas to the central focus of philosophical forums, and only peripherally for scientific forums. The latter are supposedly reserved for those who "shut-up and calculate". And feckless philosophers are not welcome to blab on & on about Qualia which cannot be Quantified. :wink:
Quoting tim wood
Gladly! The term "quantum" was introduced into the vocabulary of science to represent the aspects of reality that were assumed, by Classical Physics, to be continuous, but in sub-atomic experiments returned dis-continuous results. The quantum pioneers didn't describe those results in terms of Magic, but of "Nature exposed to our methods of questioning" (Heisenberg). In order to deal with both the continuous and the discrete nature of sub-atomic Nature, the pioneers re-introduced philosophical methods into empirical numerical science. That qualitative method of interpretation*1 had been banished centuries ago as too entangled with Religion & Magic. Quantum physics is unavoidably statistical, returning not absolute either/or answers but relative BothAnd percentages. :cool:
*1. Measurement problem :
In quantum mechanics, the measurement problem is the problem of how, or whether, wave function collapse occurs. The inability to observe such a collapse directly has given rise to different interpretations of quantum mechanics and poses a key set of questions that each interpretation must answer.
https://en.wikipedia.org/wiki/Measurement_problem
Note --- In classical physics, light was assumed to flow like water. But the quantum measurements came back in discrete bits, now called Photons, that seem to be both discrete particles and continuous waves. It's the introduction into physics of the necessity for philosophical (statistical) interpretation, that caused 20th century physics to seem weird, and even magical. :joke:
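Those statistical percentages, by the way, come from the Born rule: the probability of each measurement outcome is the squared magnitude of its complex amplitude in the wave function. A minimal sketch, with made-up amplitudes chosen only so the arithmetic is easy to check:

```python
# Born rule in miniature: probabilities are the squared magnitudes of the
# complex amplitudes in a normalized quantum state. The state and its
# amplitudes here are hypothetical, for illustration only.
amplitudes = {"spin up": 0.6j, "spin down": 0.8}

probabilities = {state: round(abs(a) ** 2, 10) for state, a in amplitudes.items()}

# Each run of the experiment yields one definite outcome; the percentages
# (36% up, 64% down here) only appear as long-run relative frequencies.
print(probabilities)
assert abs(sum(probabilities.values()) - 1.0) < 1e-9
```

Note that the rule hands back only a probability distribution, never a mechanism, which is exactly why the interpretive arguments above exist.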
Quoting tim wood
Feynman "did not know" what quantum duality meant, because he was looking for absolute Either/Or answers, not Einsteinian BothAnd relative approximations. His attitude of "shut-up and calculate" --- while avoiding the philosophical problem --- is what has allowed modern science to produce the 21st century technology, such as atomic bombs, cell phones, and Twitter gossip, that we enjoy today --- but would have seemed magical in the 17th century.
Presumably the "quants" (number-crunchers) who process the data (unambiguous numerical information) of technology, are not distracted by "additional weirdness" (ambiguous philosophical questions). That wordy waste of time is reserved for a few philosophical forums, such as TPF. If you are mainly interested in Material Science, a pertinent question might be, what are you doing posting on an un-scientific forum? Trying to show the weirdos the error of their way? :nerd:
PS___My interest in quantum physics is mostly due to its discovery of the multiple roles of meaningful & causal Information (e.g. Ideas) in the real world*2. Quantum theory is not about Matter, but about Math. And Math is about Mind : knowable relationships, not sensible objects.
*2. Beyond Weird by Philip Ball wins Physics World Book of the Year 2018 :
Rife with science, Beyond Weird also contains a hefty helping of philosophy, as Ball attempts to reconcile quantum reality with seemingly confounding experimental results. Quantum theory may actually be a theory about information, and how we gain it. As Ball writes, a more "if this, then that" approach to understanding the outcome of an experiment may be what we need to meaningfully understand the quantum world.
https://physicsworld.com/a/beyond-weird-by-philip-ball-wins-physics-world-book-of-the-year-2018/
Is it? I've never seen, tasted, or touched an electron. All I know about those invisible entities is the published interpretations of quantum physicists*1. 17th century physicists had no concept of an electron, but they imagined fundamental particles of matter, that everyone had agreed on since the 5th century BC*2. Besides, its properties depend on how you look at it*3. Is that a "literary" interpretation? Unlike the simple atoms of Classical Physics, quantum-scale particles are subject to various interpretations*4. Is that still "hard" Science, or is it "literary" Philosophy, or both? :smile:
*1. Interpretations of quantum mechanics :
An interpretation of quantum mechanics is an attempt to explain how the mathematical theory of quantum mechanics might correspond to experienced reality.
https://en.wikipedia.org/wiki/Interpretations_of_quantum_mechanics
*2. Dalton's atomic theory :
He proposed that all matter is made of tiny indivisible particles called atoms, which he imagined as "solid, massy, hard, impenetrable, movable particle(s)".
https://www.khanacademy.org/science/chemistry/electronic-structure-of-atoms/history-of-atomic-structure/a/daltons-atomic-theory-version-2
*3. Electron mass is sometimes termed as rest mass because according to the special theory of relativity, mass of the object is said to vary according to the frame of reference.
https://byjus.com/physics/electron-mass/
*4. Subatomic Particles :
There are more than 12 subatomic particles, but the 12 main ones include six quarks (up, charm, top, Down, Strange, Bottom), three electrons (electron, muon, tau), and three neutrinos (e, muon, tau).
https://www.wondriumdaily.com/subatomic-particles-the-quantum-realm/#:~:text=There%20are%20more%20than%2012,e%2C%20muon%2C%20tau).