Quantum mechanics is universally considered to be so weird that, as Niels Bohr quipped, “if you are not shocked by it, you don’t really understand it.” One of the most shocking phenomena predicted by quantum mechanics is quantum entanglement, which Einstein called “spooky action at a distance.” He thought a more complete theory could avoid it, but in 1964 John Bell showed that if the predictions of quantum mechanics are true, then spooky action at a distance must indeed take place, given certain reasonable assumptions. Last week, in her article “Experiment Reaffirms Quantum Weirdness,” Natalie Wolchover reported that physicists are closing the door on an intriguing loophole related to these assumptions. This “freedom of choice” loophole had offered die-hards a possible way to avoid believing in spooky action at a distance.

This month’s Insights puzzle takes on the shocking weirdness of the quantum realm as implied by Bell’s theorem. It uses familiar objects and phenomena to reason about quantum particles in an intuitive way that, in my view, gets rid of the weirdness, or at least shoves it out of sight, so that the results don’t seem strange at all. Is a simple physical model of quantum mechanics possible? Perhaps! You be the judge.

But first, let’s review Bell’s theorem and introduce our puzzle:

Two students, A and B, who are polar opposites of each other, are gearing up to do a course on quantum mechanics. Thirty-seven days before the course (Day –37) they take a computer test consisting of 100 true/false questions. Every question that A answers as true, B answers as false, and vice versa — their answers are perfectly anti-correlated. At the start of the course (Day 0), the two take the same test again. Some of their answers are now different from what they were the first time, but they are still perfectly anti-correlated. Thirty-seven days later (Day +37), they take the same test for the third time. Again, some of their answers are different, but they are still perfectly anti-correlated.

You and a friend sit at separate computer terminals and compare the tests. You can bring up just one of A’s tests on your computer screen at any given time, while your friend can bring up just one of B’s. First, the two of you pull up the tests the students took on the same day, comparing A’s Day –37 test with B’s Day –37 test, and so on. Sure enough, they are all perfectly anti-correlated, with no matching answers at all. Next, you compare A’s Day 0 test with B’s Day –37 test. In this case, there are exactly 10 answers that match. Similarly, B’s Day 0 test has 10 answers that match those in A’s Day +37 test. Finally, you compare B’s Day –37 test with A’s Day +37 test. And here comes the surprise …

Question 1: What are the minimum and maximum numbers of matching answers you would expect for these two tests?

Question 2: If you found that there were 36 answers that matched, how would you explain it?

Question 3: Where do all the numbers in the above scenario (–37, 0, +37, 10 and 36) come from? (If you have no idea, read on for a hint.)
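For readers who like to experiment, the counting constraint behind Question 1 can be brute-forced in a few lines of Python (a sketch of my own, and a spoiler, so skip it if you want to solve the puzzle unaided). The key observation: because B’s answers are always the exact opposite of A’s, a match between A’s Day +37 test and B’s Day –37 test is exactly a question on which A’s own Day –37 and Day +37 answers differ.

```python
# Exactly 10 questions flip between Day -37 and Day 0, and exactly 10 flip
# between Day 0 and Day +37. Let k be the number of questions that flip in
# BOTH intervals; a double flip restores the original answer.

def matches_between_extremes(k):
    """Matches between A's Day +37 test and B's Day -37 test,
    given k questions that flipped in both intervals (0 <= k <= 10)."""
    # A's answer differs between Day -37 and Day +37 iff the question
    # flipped in exactly one of the two intervals.
    return (10 - k) + (10 - k)

possible = sorted({matches_between_extremes(k) for k in range(11)})
print(possible)                      # every even number from 0 to 20
print(min(possible), max(possible))  # 0 20
```

So the number of matches can be any even number from 0 to 20 — and no assignment of definite answers can push it higher than 20.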

OK, what does all this have to do with Bell’s theorem? To quote Wolchover:

… when two particles interact, they can become “entangled,” shedding their individual probabilities and becoming components of a more complicated probability function that describes both particles together. This function might specify that two entangled photons are polarized in perpendicular directions, with some probability that photon A is vertically polarized and photon B is horizontally polarized, and some chance of the opposite. The two photons can travel light-years apart, but they remain linked: Measure photon A to be vertically polarized, and photon B instantaneously becomes horizontally polarized, even though B’s state was unspecified a moment earlier and no signal has had time to travel between them. This is the “spooky action” that Einstein was famously skeptical about in his arguments against the completeness of quantum mechanics in the 1930s and ’40s.

In 1964, the Northern Irish physicist John Bell found a way to put this paradoxical notion to the test. He showed that if particles have definite states even when no one is looking (a concept known as “realism”) and if indeed no signal travels faster than light (“locality”), then there is an upper limit to the amount of correlation that can be observed between the measured states of two particles. But experiments have shown time and again that entangled particles are more correlated than Bell’s upper limit, favoring the radical quantum worldview over local realism.

These experiments map directly onto our puzzle. A and B’s same-day tests are the anti-correlated photons, and you and your friend are the experimenters. The days of the tests represent the angles, in degrees, of your respective polarizers. If the polarizers are at the same angle (same-day tests), the photons are 100 percent anti-correlated, just as the students are. Since the situations are isomorphic, we should be able to replicate the photon correlation results with the test correlation results — the situations should give identical numerical answers for all angles (days) under common-sense assumptions. These common-sense assumptions are: Completed tests with definite answers exist (realism), they cannot influence each other while the grading is being done (locality), and the examiners are free to compare any of A’s tests with any of B’s (freedom of choice). For polarizers at different angles, the quantum mechanical prediction, now experimentally well established, is that the correlation between them is given by the formula 1 − cos²(θ/2), where θ is the angle between the two polarizers. This innocent-looking correlation function cannot be achieved with the assumptions given above: The discrepancy is clearest if you take the value for a given angle (correlation between A’s and B’s tests taken a given number of days apart) and use it to calculate the maximum value for twice that angle (correlation between A’s and B’s tests taken twice the number of days apart), as we verified above. The correlation between the entangled photons is much higher than that possible between the students’ tests. This is an example of how quantum mechanical correlations for entangled particles breach what is known as “Bell’s inequality.”
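The breach is easy to see numerically. Here is a minimal sketch of my own in Python (the angles 37° and 74° correspond to tests taken 37 and 74 days apart): the quantum formula gives roughly 0.10 at 37° but roughly 0.36 at 74°, well above the classical ceiling of twice the 37° value.

```python
import math

def quantum_correlation(theta_deg):
    """Quantum prediction for the anti-correlated pair: 1 - cos^2(theta/2)."""
    theta = math.radians(theta_deg)
    return 1 - math.cos(theta / 2) ** 2

c37 = quantum_correlation(37)  # about 0.101 -- the "10 matching answers"
c74 = quantum_correlation(74)  # about 0.362 -- the "36 matching answers"

# Under realism, locality and freedom of choice, the correlation at 74 degrees
# can be at most TWICE the correlation at 37 degrees.
classical_ceiling = 2 * c37    # about 0.201

print(round(c37, 3), round(c74, 3), round(classical_ceiling, 3))
assert c74 > classical_ceiling  # the quantum prediction breaches the bound
```

This is where the numbers in the puzzle come from: a one-degree shift in polarizer angle per day makes the 37-day comparisons show about 10 matches, while the 74-day comparison shows 36 — far more than the 20 that definite, local, freely compared answers would allow.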

Question 4: Using the above formula, what is the largest possible difference between the actual correlation for an angle 2θ and the maximum value calculated for 2θ from the given correlation for θ, under the three assumptions described above? At what angle between the polarizers does this largest possible difference take place?
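One way to approach Question 4 is a simple numerical scan (again a sketch of my own, and a spoiler): compute the quantum correlation at 2θ, subtract the classical ceiling of twice the correlation at θ, and look for the angle where the gap peaks.

```python
import math

def quantum_correlation(theta):
    """Quantum prediction 1 - cos^2(theta/2) = sin^2(theta/2); theta in radians."""
    return 1 - math.cos(theta / 2) ** 2

# Scan theta from 0.01 to 89.99 degrees in steps of 0.01 degrees, tracking the
# largest excess of the quantum value at 2*theta over twice the value at theta.
best_gap, best_theta = max(
    (quantum_correlation(2 * t) - 2 * quantum_correlation(t), t)
    for t in (math.radians(d / 100) for d in range(1, 9000))
)

print(round(best_gap, 4), round(math.degrees(best_theta), 1))
```

The gap works out to exactly 1/4, attained when the polarizers are 60 degrees apart (so the doubled angle is 120 degrees).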

If you have followed the above calculations diligently, you cannot escape the conclusion that the polarization of both photons (represented in the figure by a red or blue color) only takes on a unique value at the instant of, and through the act of, measurement itself. There is absolutely no way to explain the results using real-world objects, is there?

But wait a minute. Let’s consider just one qualitative aspect of the quantum weirdness — the idea that the quantum attributes of an entangled pair of quantum particles are chosen at random by the act of measurement, at the instant of measurement, at potentially widely dispersed points in space. What if you pictured the photons not as solid particles but as being similar to elongated “balloon animal” balloons as shown in the illustration at the top of the page? Imagine that the horizontally polarized photon is a red balloon, and the vertically polarized photon is a blue one. In what follows, try not to focus on the mechanism of how this could be achieved with real balloons, but rather on how balloon-like objects would behave in this kind of set-up. When entangled photons supposedly rush off in opposite directions, imagine they are actually like elongating self-inflating balloons twisting tightly around each other, with each balloon projecting at the speed of light in both directions. Imagine that the balloons are rigged up (entangled) in such a way that they always deflate together and in opposite directions. Then each balloon will be accessible at both ends — if you blindly grasp one (make a measurement), you could come up with either. Imagine that when the measurement is made, it “spears” one of the two twisted balloons at random. This results in instant disentanglement and deflation of both balloons, and the non-speared one, now no longer anchored, snaps back to the opposite end (I have frequently, and painfully, experienced a similar phenomenon with balloons and rubber bands). It’s easy to see why the color of the balloon at one end turns out to be the opposite of the color at the other end. This easy-to-visualize model captures how the selection of attributes could happen only at the instant of measurement, in places widely dispersed.

What about the fact that the two measurements could potentially be carried out light years apart? Wouldn’t there be tremendous lag between the results at either end? Well, when I say the balloons disentangle instantaneously, I mean instantaneously — the above-mentioned snapping back happens faster than the speed of light! The potentially infinite extension of the particles and their superluminal snapping back in this model, though, are not really a problem: These properties are implicit in the mathematics of quantum mechanics anyway. Quantum mechanics specifies that particles can have a finite amplitude to be everywhere in the universe, and wave function collapse (represented here by the superluminal snapping back) is internal to each particle and therefore cannot transmit information. This visualization thus hides away the weird aspects of quantum mechanics and does not break any laws.

I find elastic balloons or bubbles very useful for representing quantum particles. Anyone who has played with soap bubbles in a sink, or air bubbles trapped under a plastic sheet or a carpet, has seen how large bubbles can divide into myriads of “bubblets” that are all over the place, just like particle amplitudes. These bubblets can suddenly and unexpectedly coalesce into the original-size bubble at a completely different location, just like quantum particles. Imagine a two-slit experiment where a bubble splits into two equal-size wave-borne bubblets and goes through both slits, to suddenly coalesce, fully formed, at the instant and place where the measurement is made! It’s fully faithful to the quantum mechanical idea that each particle ultimately interferes only with itself. Maybe quantum particles are like dynamic subdividing, shape-shifting bubbles trapped within a plastic-sheet universe, taking on and revealing their individual attributes only when we probe them and force them to become whole at some location. Perhaps each particle is free to fractionate into millions of dispersed parts in its own private cosmic wormhole, until a measurement forces it to become whole at some particular location, chosen probabilistically.

For now, this idea of visualizing quantum objects by means of bubbles or elastic balloons is just a fun heuristic exercise. Can we use it to build a fully deterministic theory containing real, albeit strange, internally superluminal objects, while using completely traditional probabilities? I’d like to know what readers think. And if any of you possess the deep training and expertise in this field that would be required to create a full-fledged theory, and would like to collaborate, I’d love to hear from you. Happy puzzling!

*Editor’s note: The reader who submits the most interesting, creative or insightful solution (as judged by the columnist) in the comments section will receive a *Quanta Magazine* T-shirt. (Update: The solution is now available here.) And if you’d like to suggest a favorite puzzle for a future Insights column, submit it as a comment below, clearly marked “NEW PUZZLE SUGGESTION” (it will not appear online, so solutions to the puzzle above should be submitted separately).*

*Note that we may hold comments for the first day or two to allow for independent contributions by readers.*

The balloons are a nice visual tool. One point that they clearly illustrate is that irrespective of any "unrealism", you always need some sort of nonlocality to explain violation of Bell's theorem. Saying that the particles/balloons can have indefinite states is not enough: you always get violation of the principle of strong locality (aka "local causality").

You say that this superluminal influence is not a problem because it "cannot transmit information", but it does transmit information: the state of the measured particle. The crucial point is that we cannot control the transmitted information, so there is no superluminal signalling.

For more info see, for example, https://arxiv.org/abs/1503.06413 or https://arxiv.org/abs/1506.02179

"Can we use it to build a fully deterministic theory containing real, albeit strange, internally superluminal objects, while using completely traditional probabilities? I’d like to know what readers think."

Just as I suspected… First the Sleeping Beauty problem halfers and now the psi-ontologists. I think you should just stop defending the probability theory dunces and deniers. 😉

"The potentially infinite extension of the particles and their superluminal snapping back in this model, though, are not really a problem: These properties are implicit in the mathematics of quantum mechanics anyway."

That circular and profoundly ironic "just follow the maths" excuse is one of the MWIers' favourites, but at least they don't have to try to persuade us that superluminality is "not really a problem"! (The MW misinterpretation requires only [the untestable assumption of] a baroque ontology, not an explicitly unphysical one). In that sense at least it is superior to the psi-ontic misinterpretations which do entail the abandonment of firmly established fundamental physical principles. Implicit in the maths of QM is the generalisation of probability theory needed to accommodate the fact that observation is interaction.

"Quantum mechanics is universally considered to be so weird that, as Niels Bohr quipped, “if you are not shocked by it, you don’t really understand it.” One of the most shocking phenomena predicted by quantum mechanics is quantum entanglement"

Of course quantum mechanics is not universally considered to be weird. And there is nothing shocking about entanglement*. Bohr's remark is of its time and there is no excuse for it now. What's shocking – truly shocking – is that all these years after Max Born and others exposed the fragile conceptual foundations of classical physics to a bit of hard thinking, and John von Neumann and others provided the mathematical foundations of the appropriate theoretical context in which to compare QM and CM, people are still trying to fix the imaginary weirdness they see in QM by putting real weirdness into it.

* https://web.archive.org/web/20151117174141/http://www.mth.kcl.ac.uk/~streater/EPR.html


You lost me at hello.

This article was very entertaining, and fun to visualize particles as bubbles!

I like to imagine that what we call present is a single flat sheet/layer…(Negative energy is "future bits" and positive energy is "past bits") and we can only observe the neutral energy "present bits" that make up the layer.

That spooky action at a distance, is an illusion, because relative to us there's a distance, but maybe relative to the bits, they are never "apart" time wise (present/neutral layer).

Just an amateur enthusiast’s imaginings!

Einstein's contention that hidden variables play a part in Spooky Action has now been decisively proven to be wrong. In ancient times this spookiness would have been called magic – nowadays it's just nonlocal! Come to think of it, this quantum idea of probabilities before fact is quite logical, just as in a photograph the whole picture is not clearly defined until one focuses specifically.

Some quick responses:

@Mark Alford,

Agreed. Your statement that "The crucial point is that we cannot control the transmitted information, so there is no superluminal signaling" is more accurate than what I wrote. As you say, the superluminal snapback "does transmit information: the state of the measured particle," but my attempt here is to show that it needn't be information as the modern imagination pictures it to be — i.e. translated into digital bits or qubits. It could just as easily be pictured as a simple physical process among strange objects that nevertheless have some analogy to things we are familiar with.

@phayes,

Just because you can build a successful mathematical model of probability as interaction does not mean that you have figured out what's actually happening at the physical level. It could just be a front for ignorance. Mathematical models are conceptual, after all, as I've pointed out before: the map is not the territory. Electrical engineers very successfully use models involving imaginary quantities, which enable them to conceptually map real-world properties such as phase. It would be hubristic to imagine that you understand the details of how electric current is generated by the physical interactions of millions of electrons and atoms just because you understand imaginary numbers. As for von Neumann's collapse model, it is just plain wrong physically, and has generated a lot of speculative nonsense about consciousness being involved in quantum mechanics. See: http://cosmology.com/Consciousness139.html

Blithely accepting that a conceptual model of probability as interaction is the last word about physical reality risks glossing over and ignoring really interesting phenomena regarding what really happens during a measurement, how and when "measurements" happen naturally in the environment, and the phenomenon of decoherence.

More later…

The idea that I've presented above, that every quantum object by default has its own private wormhole that it normally frolics in, might provide a natural conceptual foundation for the ER=EPR conjecture, proposed by Leonard Susskind and Juan Maldacena in 2013, which states that entangled particles are connected by a wormhole (or Einstein-Rosen bridge).

See also, the Quanta article by K.C. Cole: The Quantum Fabric of Space-Time

Experts in this subject are encouraged to weigh in…

@Pradeep Mutalik

The ideas that probability is "a front for ignorance" and the "map is not the territory" ["beware the mind projection fallacy"] were the mother and father of my rant!

What von Neumann provided the foundations for isn't "a model of probability as interaction"; it's the mathematical model of probability which generalises 'ordinary' probability theory and is capable of accommodating interaction / noncommuting observables.* And "von Neumann's collapse model" is silly psi-ontology, yes, but how is that relevant? If we discovered that Kolmogorov had favoured the 'propensity' interpretation of 'classical' probability, would we have to stop interpreting it 'Bayesianly'?

I really don't see how wishing people would read a few more words from the probability theory book before they decide it's time to throw out the physical reality book can be interpreted as wishing to have the last word on the latter.

http://blogs.discovermagazine.com/cosmicvariance/files/2011/11/banks-qmblog.pdf

https://arxiv.org/abs/quant-ph/0601158

https://terrytao.wordpress.com/2010/02/10/245a-notes-5-free-probability/

A serious objection to the concept behind the proposed puzzle has just appeared in a blogspot note:

http://motls.blogspot.gr/2017/02/entanglement-why-two-schoolkids-always.html#more

You might find this to be of some interest:

http://vixra.org/pdf/1609.0129v1.pdf

It explains quantum correlations as an unavoidable consequence of attempting to make multiple, independent measurements on an entity that cannot possibly have multiple, independent attributes: namely, a single bit of information, as defined by the limiting case of Shannon's capacity theorem in information theory.

Pradeep, I liked the article and your playing with "visual models" such as polarisation balloons, rather than letting the maths lead us by the nose without any thought to what "really happens during a measurement," or indeed an interaction.

Firstly, can I say that electrical engineers do not use models with "imaginary quantities"; they use imaginary numbers to predict, for example, the phase relationship between current and voltage in AC circuits.

Likewise, in quantum mechanics we use imaginary numbers, but we are not sure what role imaginary numbers perform, only that, for example, in Schrödinger's equation they are involved with wave functions and so probabilities, without reference to a visualisation of what's happening.

The current and voltage waves are quite acceptable as visualisation models, but waves of probability… no, not very satisfactory.

So perhaps your models of expanding and twisting "balloons" are a very good first attempt at visualisation: maybe balloons, maybe bubbles, maybe knots of electromagnetic fields…

1) Min: 0 if the ones that switched differently in (-37)->0 and (0)->(+37) are the same set of questions. Max: 20 if the two sets are disjoint.

2) The only way to explain it is that somehow observing the results of A's test changes the answers on B's test, and vice versa. The answers are entangled states, and when we observe A's state, B's state changes depending on the observation.

4) If the angle of the polarizers is pi/6 then the initial correlation is 0.25. At double the angle (pi/3) the correlation increases to 0.75. The excess correlation (that cannot be explained classically) is 0.75 – 2*(0.25) = 0.25. This excess is maximized at this angle and can be computed by maximizing: 1-cos^2(2x) – 2*(1-cos^2(x)) = cos(2x) – cos^2(2x)

3) If the initial correlation is 0.1 then the angle is .6435 radians. At twice the angle the correlation becomes 0.36 which leads us to the number 36 in our final observation. It seems we are offsetting our polarizers by a degree each day which leads to a correlation of 0.1 in 37 days.

Hi. Over and over I read people talk about the transmission of the state of one particle to the other entangled particle.

Distance is irrelevant. So take distance out of the equation. Time is also irrelevant. Take time out also. Neither apply to entangled particle.

I see it like this. Break the process of measurement into a trillion steps. As a measurement is beginning at one particle, the exact same stage of measurement is occurring at the other particle. Forget distance. Imagine they are touching. Time doesn't matter because they are touching. The motion of one particle flipping occurs identically in the other particle.

My opinion…and I'm going to get blasted for thinking outside the box here, is that in some way, even though we measure distance between the particles linearly, there is no distance between them. Perhaps our understanding of space-time is not quite right. The analogy of a piece of paper folded in half representing space itself would allow the particles to touch, yet from our perspective there is a linear distance.

I don't do drugs.

Bob

It would be interesting to know (from a string theorist maybe) if the suggestion of interpreting particles as bubbles bears any resemblance to particles as strings (aka string theory). I have heard of strings, membranes, D-branes but not sure how they relate. So instead of the balloons tying around each other to form entangled particles we just have two strings forming a rope like structure. Does string theory really treat entangled particles as "entangled" strings?

"Just because you can build a successful mathematical model of probability as interaction, does not mean that you have figured out what's actually happening at the physical level. It could just be a front for ignorance. Mathematical models are conceptual after all, as I've pointed out before: the map is not the territory."

I think I am agreeing with phayes when I say that reality/actuality is just that to which descriptions (by reference to likenesses), or, better, explanations (by reference to coincidences of likeness), of phenomena – one's experience(s)/observation(s) – refer. "The physical level" is just as conceptual as a mathematical model. Conception is all anyone ever has. To the extent that one's conceptions are accurate one has apprehended reality/actuality. That is as good as it can get. The map–territory distinction is a red herring, except in the sense that conceptions are maps and what they refer to is territory.

Question 1.

I find that the possible numbers of matching answers are 0, 2, 4, …, 18, 20.

We are told that A's answers on day 0 agree with B's answers on day -37 in exactly 10 cases. Hence, due to the mismatch, this is equivalent to saying A's answers on day 0 disagree with A's answers on day -37 in exactly 10 cases. Labelling all the disjoint subsets in a Venn diagram for the sets in which A answers T leads to my answer above, but it is too messy to write in long form, so I'll give a slightly different approach that is easier to write down.

Each of the 100 question numbers can be sorted into one of 8 buckets, according to A's possible answers on the three days. The buckets are

(T,T,T), (T,T,F), (T,F,T), …., (F,F,F),

where (T,F,T) for example is for those questions that A answers T on day 0, F on day -37, and T on day 37, etc. Because of the perfect mismatch with B, the bucket (T,F,T) also corresponds to those questions that B answers F on day 0, T on day -37, and F on day 37.

Let N(T,T,T) be the number of questions in bucket (T,T,T), etc. As pointed out before, we are told that A's answers on day 0 disagree with those on day -37 in exactly 10 cases. Disagreement means that A answered T to a question on day 0 and F to the same question on day -37, or vice versa. It doesn't matter what she answered on day 37. So,

N(T,F,T) + N(T,F,F) + N(F,T,T) + N(F,T,F) = 10.

The first two terms are the number of questions she answered T to on day 0 and F to on day -37, and the second two are vice versa.

We are also told A disagreed in her answers (via the perfect mismatch with B) exactly 10 times on days 0 and 37. So, in the same way,

N(T,T,F) + N(T,F,F) + N(F,T,T) + N(F,F,T) = 10.

From the above two equations, it is easy to show that

N(T,F,T) + N(F,F,T) + N(T,T,F) + N(F,T,F) = 20 – 2( N(T,F,F)+N(F,T,T) ),

which can clearly only take the values 0,2,4, …, 18, 20 (since the numbers in each bucket can't be negative). But this is precisely the number of times A disagrees in her answers on days -37 and 37, which is precisely the number of times A agrees with B on these days.

Question 2.

The answers to the tests must actually change each time a given pair is pulled out. Which seems impossible, because A and B gave their answers some time ago, so these answers should be 'really' there. This suggests that the computers used to bring up the answers each time have been tampered with! Possibly by astronomers in Vienna!!

I guess that one is therefore really trying to explain the behaviour of the computers and their users. If there is a quantum model, then one can use any of the interpretations of quantum mechanics to do this.

Question 3:

The answer seems to be given later on.

#(A and B agree)/100 ~ 1 – ( cos((d1-d2)/2) )^2

for days d1 and d2. So the relative frequencies are close to the quantum probabilities of the photon experiment.

Question 4:

For a large number of questions, N (not just N=100), the number of matches will be

N( 1 – (cos theta/2)^2 )

for days 0 and theta, and days 0 and -theta (and any 'days' separated by theta). By the same argument as for question 1, the number of matches for two 'days' separated by 2 theta, can be at most twice the above number. So,

maxcorr(assumptions) = 2 ( 1 – (cos theta/2)^2 ) = 2 (sin theta/2)^2 ,

in comparison to

maxcorr = 1 – (cos theta)^2 = (sin theta)^2.

The difference is then

difference = (sin theta)^2 – 2 ( sin theta/2 )^2

= 2(sin theta/2)^2 ( 2 (cos theta/2)^2 -1)

= x (1-x).

where

x = 2(sin theta/2)^2 = 1 – cos theta.

This is obviously maximised for x=1/2, i.e., for

cos theta = 1/2, corresponding to

theta = 360n +/- 60 degrees

for any integer n. Taking theta in the range [0,90] gives a value of 60 degrees.

"There is absolutely no way to explain the results using real-world objects, is there?"

If there are three assumptions as stated, realism, locality and freedom of choice, then it is logical that there are ways to explain the results by keeping realism and throwing away one of the other two assumptions.

The balloon model obviously throws away locality. So does the Bohm model of quantum mechanics – https://plato.stanford.edu/entries/qm-bohm/ .

Superdeterminism obviously throws away freedom of choice. So does the Hall model mentioned in Natalie Wolchover's article – https://arxiv.org/abs/1007.5518 .

Standard quantum mechanics obviously throws away realism. The Copenhagen interpretation, for example, only allows that preparation and measuring devices are real ("classical"), with quantum mechanics being about the correlations between such devices. In this sense the correlations are real, when measured, but there is no underlying reality of, say, "photons" that cause these correlations.

Models that throw away locality, like the balloon model and the Bohm model, seem rather like models with instantaneous "collapse" of the wavefunction, but with some determinism added in. This determinism comes at a cost, because it requires a preferred frame of reference for the deterministic instantaneousness to occur, which is against the spirit of relativity theory.

Relativistically, there exist different reference frames which disagree on which of the two measurements is "first". So there is a problem if A pricks the red balloon first in one frame, but B pricks the red balloon first in a different frame – their results will agree instead of disagree! To avoid this, there must be a preferred reference frame on which all observers agree. This is not incompatible with relativity as long as all observers agree on the preferred frame (or if one uses wormholes to modify large-scale spacetime structure), but it is an extra complication.

Another problem with throwing away locality is that one has the magic property that despite a whole lot of faster-than-light stuff going on at a quantum level, it never gets magnified up to allow actual communication faster than light. Instead, it somehow must be cancelled out at the level of the measured statistics. This need for a special argument makes such models subject to the criticism of Occam's razor.

In the balloon model, for example, one would have to explain why we are not allowed to set up a single red balloon between the two observers. Then one communicates a bit of information to the other by either measuring or not measuring the balloon. The balloon instantaneously bursts or does not burst. The message is received instantaneously, faster than the speed of light, by seeing whether there is still a balloon or not at the other end. Either the balloon model has to happily assert such communication is possible via suitable physical manipulation, or to explain how it is impossible to set up such communication in practice.

@All,

You might find Tim Maudlin’s “Quantum Non-Locality & Relativity” (3rd edition) to be of some interest. =)

@Bob Lally,

No worry for any blast, you’re not the only one who “think[s] outside the box here”: You may want to read https://www.quantamagazine.org/20161201-quantum-gravitys-time-problem/ and probably all the articles linked by that one.

In my opinion this article is the best summary I’ve read about the latest speculations/understandings (pick your choice) on the true nature of space-time and gravity (the only missing explicit item is a mention of the Wheeler–DeWitt equation, though it is implicitly present, as Page and Wootters are mentioned).

The summary of this summary is the quotation in the 4th paragraph: “I think we now understand that space-time really is just a geometrical representation of the entanglement structure of these underlying quantum systems” (Mark Van Raamsdonk).

See, it pretty much covers your comment! 😉

Pradeep, thanks for an interesting article and puzzle. In the last sentence of the first paragraph of your "students" model, you say:

"Again, some of their answers are different, but they are still perfectly anti-correlated."

What do you mean by different? I.e., do you mean different from the initial results on Day -37, or from their results on Day 0?

Thanks again, Jim Farned

In our real world we primarily move through space from one position (x, y, z) to a second position (x, y, z) in time (t). We also occupy space in our real world, which can also be considered a field. Einstein said that mass = energy, and in the quantum world energy is equivalent to mass. For instance, an electron in the quantum world is a mass of energy that can be compared to undefined smoke in space with no discernible boundaries. This undefined boundary is a field. If an electron gains energy, it jumps to a higher orbit; if it loses energy, it jumps to a lower orbit. Since this jumping occurs, it seems that space is simply an (x, y, z) location in the quantum world and travel is to location (x, y, z) at quantum time (t). Taking it a step further, the orbit of an electron could be considered a quantum time line, infinite in length, although we only see it as a circle or oval. In addition, since we have particle/wave duality, the wave in the quantum world must be energy representing the particle in our real world. This means that numbers, quanta or particles (35) must be expressed in terms of energy (11.66), which stretches to infinity on quantum time (t). Let's suppose we drop two particles into the quantum world, which immediately combine their energy representations (add their infinite energy quantum time lines to reach a total). This means we have two entangled particles that must be differentiated at the ends of their added infinite quantum time lines. This differentiation is done through opposite polarization. Information can be sent along the quantum time line, but that information travels at the speed of light.

@Jim,

I mean different from Day 0.

To summarize,

10 of A's answers on Days -37 and +37 are different from A's answers on Day 0. However, A's answers on all three days are perfectly anti-correlated with B's answers on the corresponding days.

Thanks for the clarification, Pradeep; may I further reveal my ignorance by requesting a bit more of the promised "hint" concerning the numbers 0, 10, 36, +37, -37…? I do not see that this was the subject of a previous question, so I trust I am not asking a question that you have already answered.

@Jim

The hint is that the two students scenario is similar in structure (isomorphic) to the situation in quantum mechanics where two photons are entangled, with the days between tests corresponding to the angle between polarizers in degrees. The formula for the quantum correlation is given. So try substituting the number of days between tests for the angle in the formula…
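For readers who want to check the hint numerically, here is a short Python sketch (my own illustration, not part of the puzzle; it assumes the column's formula, i.e. that the fraction of matching answers at a separation of θ degrees, or days, is sin²(θ/2), which equals 1 − cos²(θ/2)):

```python
import math

def match_fraction(theta_deg):
    """Quantum-predicted fraction of matching answers (breaks in the
    perfect anti-correlation) at a separation of theta_deg degrees/days."""
    return math.sin(math.radians(theta_deg) / 2) ** 2

# Day 0 vs. Day -37 (or Day +37 vs. Day 0): 37 "degrees" apart
print(round(match_fraction(37), 2))   # 0.1  -> the 10 changed answers
# Day +37 vs. Day -37: 74 "degrees" apart
print(round(match_fraction(74), 2))   # 0.36 -> 36 matches
```

Substituting the days between tests for the polarizer angle reproduces the 10 and 36 from the students scenario.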

Nietzsche introduced the idea of perspectivism: in the final analysis, all we really have is a manifold of interlocking perspectives. For example, consider the following toy model. If humanity is finite, represent each possible human perspective by a small non-empty subset of {1,…,n}, where n is a large natural number. Then there are minimal perspectives, but no maximal human perspective. Still, there is an ideal finite perspective which sees everything! If n = infinity, there is still an ideal infinite perspective which sees everything (God's-eye view)! If one accepts the standard quantum logic, then one has a manifold of perspectives which cannot, by Gleason's Theorem, be embedded into any single perspective! There are now maximal perspectives, but no universal perspective! (Theologically, this requires accepting polytheism!! Alternatively: even God suffers from cognitive dissonance!!)

Pradeep, pardon my puzzled persistence. Suppose we map the "two students scenario" (TSS) onto an xy-grid, with Day for A on the x-axis and Day for B on the y-axis. Let a function m(x,y,k) be defined as 1 if the kth (1-100) item of the test results for A on Day x agrees with the result for B on Day y. Summing over k, we arrive at a function M(x,y) which gives the number of matches when Day x for A is compared with Day y for B. We know M(x,x) = 0, that M(0,-37) = 10 and that M(37,0) = 10. I suppose you to be asking: Assume that M(37,-37) = 36 and consider possible explanations.

While it is true that 1 - cos^2[(37-(-37))/2] = 0.36, and that 1 - cos^2[(37-0)/2] is approximately 1/10, it would never occur to me to propose those relations as explanations of the occurrences of the 36 matchings or of the 10 matchings. I am a mathematician, not a physicist, so there may be some inside knowledge or assumption here that I am missing. I have for some time been interested in the derivation of the Bell inequality, so I continue to hope for some insight here. Thank you for your patience, and I hope that the above remarks are pertinent and clear.

In the double-slit experiment, is the number of electrons/photons going through the two slits and hitting the backboard where they are recorded exactly the same whether or not the slits are monitored to see which one each electron/photon went through?

Do the same number show up at the backboard whether the slits are monitored or not?

@Josh For the double-slit experiment, if we try to observe which slit each electron/photon goes through, then the interference pattern disappears.

https://en.wikipedia.org/wiki/Double-slit_experiment#.22Which-way.22_experiments_and_the_principle_of_complementarity

@Josh

As Ashish said, detection causes the interference pattern to be destroyed, but the number of particles detected remains the same. The double-slit experiment has even been done with very attenuated electron sources, so that we can say with certainty that only one particle at a time is sent toward the slits, and yet there is interference – the particle interferes with itself!

In my plastic enclosed bubble visualization model, two half-bubbles pass through the slits and interfere with each other. Here is how to imagine the measurement.

Rules:

1. Only full bubbles can be detected.

2. As it moves, a bubble spreads out into tiny bubblets, following a wave distribution.

3. When you probe a bubble, it is like pinching the enclosing plastic, which always reconstitutes the full bubble: Either you can "push" the bubblets away to reconstitute the bubble at a remote spot, or you "suck" the bubblets to reconstitute at the point you are probing, resulting in finding the bubble there.

Once the bubble is found (or not), the reconstituted bubble spreads out again, but does not have any phase-shifted partner to interfere with itself.

@Jim,

You said, "it would never occur to me to propose those relations as explanation…" You are right, and that is precisely the point! Those relations are impossible if we apply assumptions from our macroscopic everyday world – namely, that A's and B's completed tests exist (realism), that they do not influence each other at the time the grading takes place (locality), and that we can compare any test of A's with any of B's (freedom of choice). If A and B had decided to collude to make M(0,-37) equal to 10, and M(37,0) also equal to 10, there is no way they could also make M(37,-37) equal to 36.

But in fact, this is exactly what happens in the entangled photon scenario, which quantum mechanics correctly predicts:

If the polarizers at points A and B are both at the same angle (say, vertical) the amount of correlation of the entangled photons is 0.

If you turn the A polarizer 37 degrees counter-clockwise while keeping the one at B vertical, the correlation is 10%.

Similarly, if you keep the one at A vertical, and turn the one at B 37 degrees clockwise, the correlation is 10%.

However, if you turn the one at A 37 degrees counter-clockwise, and the one at B 37 degrees clockwise, the correlation is 36%.

This is much higher than is mathematically possible under the standard assumptions stated above, so one of the assumptions has to be ditched.

Hope that helps. I'll spell it out in a little more detail in the solution column, but this is the essence.
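A quick way to convince yourself that no collusion can work is to brute-force it. The Python sketch below is my own illustration (the function name and trial count are arbitrary): it simulates "local realist" students whose Day 0 tests are exact opposites and who each change exactly 10 answers between tests. However the changed answers are chosen, a question can match only if exactly one student flipped it, so M(37,-37) can never exceed 10 + 10 = 20, far below the 36 that quantum mechanics predicts.

```python
import random

N = 100  # questions per test

def max_matches(trials=2000):
    """Largest M(37, -37) found over random local-realist scenarios."""
    best = 0
    rng = random.Random(0)
    for _ in range(trials):
        a0 = [rng.random() < 0.5 for _ in range(N)]  # A's Day 0 answers
        b0 = [not x for x in a0]                     # B's Day 0 answers (anti-correlated)
        a_flips = set(rng.sample(range(N), 10))      # the 10 answers A changes by Day +37
        b_flips = set(rng.sample(range(N), 10))      # the 10 answers B changed since Day -37
        a37 = [a0[k] ^ (k in a_flips) for k in range(N)]
        bm37 = [b0[k] ^ (k in b_flips) for k in range(N)]
        best = max(best, sum(a37[k] == bm37[k] for k in range(N)))
    return best

print(max_matches())  # bounded above by 20 under local realism
```

The same counting argument, without the simulation, is the essence of the Bell-type bound here: matches between A's Day +37 test and B's Day -37 test occur exactly where one, but not both, of the 10-answer change sets hit a question.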

@phayes,

If all you are saying is that quantum correlations can be predicted by Von Neumann's probability model of non-commuting observables, and that people should know this, I agree with you fully. However, the fact that the predictions of quantum mechanics are correct is not a point of contention — it is a given for our discussion. What we are interested in here is how these correlations are maintained across space at superluminal speeds. Surely you do not mean to imply that Von Neumann's 1930s model explains later developments such as Bell's theorem (1964), decoherence (1970s-80s), and more modern suggestions such as Susskind and Maldacena's ER=EPR (2013). In fact, it does not, as the paper by Nauenberg (a collaborator of Bell's) that I linked to in my last reply to you shows: Von Neumann's simple model of measurement does not even correctly model devices with large numbers of degrees of freedom.

So, in short, there are a lot of things we still need to understand, notwithstanding Von Neumann. To list a few:

First, how do the mathematical correlations which are correct and may represent objects in some timeless, spaceless realm, go on to generate or get projected on to the 4-dimensional reality of ordinary space-time and the physical objects therein?

Second, how does probability, which is mathematically an ensemble phenomenon, apply to single particles, and how does a solid single particle interfere with itself? (My visualization suggestion as explained in my answer to Josh: imagine a single bubble fractionating into multiple wave-borne bubblets).

Third, what is special about measurement? (In the bubble model, every measurement results in bubblets being converted to full bubbles, either at the point of measurement or remotely. This very naturally explains “Renninger’s negative result experiment” — the fact that non-detection of a particle can count as a genuine measurement.)

Fourth, as Ethaniel pointed out, how do we reconcile the absolute time of quantum mechanics with the relative time of relativity? Something has to give, and it seems quite likely that the smooth space-time of general relativity simply does not exist at really tiny space-time intervals.

You can pooh-pooh "psi ontology" all you want, but it serves a useful visualization function and shows that quantum objects can be imagined in familiar terms. And who knows, such modeling may give rise to ideas that actually fill the gaps in our knowledge, of which, as we saw, there are plenty.

@Theophanes Raptis,

Thanks for the link. It seems to me that the objection dwells in detail on the creation of a singlet state and the consequent anti-correlation. In my example, the anti-correlation is assumed for all angles as is clearly stated. I will try to post a detailed reply at the site if I have time.

@Michael,

Very nice summary!

As you may have guessed, I don't think there is any cost to throwing away locality – as I said, non-locality is implicit in quantum mechanics: if we accept QM's predictions, we have to accept non-locality. As Ethaniel has mentioned, there is a conflict between the absolute time of quantum mechanics and the relative time of relativity. So you have to choose. I think it is increasingly likely that relativistic space-time ideas are fine for the macro world, but at the level at which quantum phenomena are generated (inside quantum objects and entangled objects), these relativistic restrictions simply do not apply, and absolute time reigns supreme. The "whole lot of superluminal stuff" going on cannot be magnified because it is internal to quantum objects.

In the balloon example, I explained the composite object as two twisted balloons, to keep the image familiar. A more accurate depiction is that the balloons temporarily form a composite single balloon that's partly red and partly blue. This composite object dissolves instantaneously (in absolute time) into two objects when measured, so no communication can be set up in practice.

To me, the versions of QM resulting from throwing away realism or freedom of choice are closed and sterile, do not answer any questions outside their scope and cannot lead to further advances. Deterministic non-local models, on the other hand, just might, some day…

@Ethaniel,

Thanks for the links. I completely agree with Bob's speculation and the quote you provided “I think we now understand that space-time really is just a geometrical representation of the entanglement structure of these underlying quantum systems” (Mark Van Raamsdonk).

QM seems to be paramount, and it seems to come from a realm that is more fundamental than space, time, relativity and even string theory.

@Alex Livingston,

I grant you that "conception is all anyone ever has." However, there is a big difference between conceptual models of mathematical objects (like Von Neumann's) and conceptual models of physical objects (like Bohm's, or my balloons).

The differences are:

1) Mathematical objects are contingent on, and follow from, a set of assumptions or postulates, and are true in all possible universes. Thus, Von Neumann's probability model of non-commuting observables is true in any real or imagined universe where there are non-commuting observables. It is mathematical. Physical objects, on the other hand, either exist in a particular universe or they do not, and have properties that are based on their fine-grained structure. So the question for this universe is: what generates the non-commuting observables in our universe?

2) In the case of physical objects, probabilistic predictions are the result of calculations on ensembles, i.e., on statistics. In other words, QM is a statistical theory, and therefore the question arises: what kind of ensemble generates these statistics? In short, I have no problem accepting physical objects with strange properties such as internal superluminality, but I cannot accept "objective probabilities." Some finer ensemble is causing those probabilities, whether we can access that physical level or not.

3) Generally (and this is where the map/territory distinction is relevant) physical objects tend to be more fine grained than usable mathematical models of them, just as physical reality has greater resolution than a map. We have not yet reached the "bottom" of our physical reality: the objects we find continue to have smaller components, which in turn, we can form mathematical models of. If we treat our mathematical models as the final basis of reality, we are ceasing our exploration of the finer grained constituents of the objects we study. Thus we give up the potential to explain behaviors these objects exhibit that do not fit current models or do not integrate with other theories.

@Pradeep Mutalik

Very interesting one, thanks. Although it is very important that we are able to use intuitive analogies to better understand complex notions such as entanglement, it becomes even more important to ensure that the analogy we pick is not misleading – I think that presenting entanglement as a binary notion (up/down, true/false) is actually going to mislead because the system which gives rise to entangled states is not binary (at least most of them aren't).

I'll try to explain, but I apologise for not using correct language as I don't have a background in this field. Use the analogy of a sphere that is spinning on a central axis (just as the earth spins, for example). If we take any point on the surface of that sphere and measure its momentum as a horizontal vector, we will find that it is moving either left or right with some observed speed. Now measure the momentum of the point directly opposite this one on the opposite surface of the sphere (the opposite point is the one at the end of the line which starts at our original chosen point and passes through the centre of the sphere). Of course this point will be observed to have a horizontal momentum of exactly the same magnitude as the original point but with the opposite direction.

This is the important bit – until you pick a point to measure on the surface of this sphere, there is no way to know how large its velocity will be or in which direction it will be going, because this sphere actually has points that range in velocity from zero (at its poles) to a certain maximum (at its equator) – so theoretically, before you pick one and measure it, all values in between are possible (as well as either direction). Once you measure the starting point, you instantly know not only about the opposite point but could also work out all other points of the system (i.e., the speed and direction of the points not just directly across the centre of the sphere but at some angle to it).

Looking at it this way, it begins to become clear that (i) the initial measurement does not define the characteristics of the entangled counterpart; rather, the properties of the system of which they are both a part set the range of possibilities (specific values then become a matter of one's point of view); and (ii) this reasoning is scale-free, so in our example the sphere could be tiny or as big as a galaxy and the observed outcomes would be identical.

"Editor’s note: The reader who submits the most interesting, creative or insightful solution (as judged by the columnist) in the comments section will receive a Quanta Magazine T-shirt."

Solution to what question? Also, please consider the advantage of copy-editing this column prior to publication.

Z. Dash: There are three questions above, indicated by "Question 1:", etc. We copy edit everything.

OK, I'm not a physicist, but my guess is that the particles (balloons) exist in another dimension that is both spaceless and timeless. When the observation of the particles is made, we force the particles out of the other dimension into ours, where both time and space exist.