I’m watching simplified lectures on quantum mechanics to try to get the gist of it, and I have a question:
In quantum mechanics, the superposition principle says that the contributions from all possible states have to be added together to determine the end state. One test uses a device that splits and recombines photons, which then have two possible paths to take. The paths interact like waves, cancelling at one output (destructive interference) and reinforcing at the other (constructive interference). It shows that even when photons are sent through one at a time, they always exit the same way, as if the two possible states were still interacting with each other.
Could it be that the probabilities aren’t totally random?
Note that under normal probability, flipping a coin is a 50-50 shot at heads or tails, but if you flip the coin 10 times you're most likely not going to get exactly 5 of each. In quantum mechanics, are you guaranteed 5 of each?
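The coin-flip point can be made concrete with the binomial distribution (this is ordinary probability, nothing quantum-specific): the chance of getting exactly 5 heads in 10 fair flips is C(10, 5) / 2^10, only about 25%.

```python
from math import comb

# Probability of exactly k heads in n fair coin flips: P(k) = C(n, k) / 2**n
n = 10
p_exact_5 = comb(n, 5) / 2**n
print(f"P(exactly 5 heads in 10 flips) = {p_exact_5:.3f}")  # ~0.246
```

So even at perfect 50/50 odds, a "perfect" outcome happens in only about one run in four; most short runs are lopsided to some degree.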
If you do a two-slit experiment with light, the light forms an interference pattern. If you calculate the exact number of photons needed to build up this pattern and shoot them through one at a time, will it always show the pattern?
Or a more realistic version: you have a low-intensity laser (close to or at one photon at a time) going through a two-slit experiment and hitting photon detectors that record the positions of the hits. You take very short time exposures and see whether you always get the same pattern, or at least one close enough to defy normal probability.
Under normal probability, there's a chance that in a very short exposure only one side or the other lights up; you'd sometimes see a perfectly legitimate off-normal distribution.
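The proposed experiment can be sketched numerically. A minimal simulation, assuming the standard QM prediction that each photon lands independently with probability density proportional to the two-slit intensity cos²: short exposures then show visible run-to-run scatter, while the accumulated histogram converges to the fringe pattern. The fringe count, detector-bin count, and exposure sizes here are hypothetical, chosen just to make the effect visible.

```python
import math
import random

def sample_hit(fringes=5.0, rng=random):
    """Draw one photon position x in [0, 1) from density proportional to
    cos^2(pi * fringes * x), via rejection sampling (standard QM assumption:
    each hit is an independent random draw from the interference pattern)."""
    while True:
        x = rng.random()
        if rng.random() <= math.cos(math.pi * fringes * x) ** 2:
            return x

def exposure(n_photons, bins=20, rng=random):
    """Histogram of n_photons independent hits across `bins` detector bins."""
    counts = [0] * bins
    for _ in range(n_photons):
        counts[int(sample_hit(rng=rng) * bins)] += 1
    return counts

random.seed(1)
short_a = exposure(50)       # two short exposures: noticeably different
short_b = exposure(50)
long_run = exposure(50_000)  # fringes emerge clearly only in the long run
print(short_a)
print(short_b)
```

Comparing `short_a` and `short_b` shows what "normal probability" predicts: individual short exposures differ from each other and from the ideal pattern, and only the long accumulation traces out the fringes.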
Anyone hear about anything like this?
Answers from physicist friend:
1) I’m not sure how to interpret your first question, but in QM, the wave function is used to describe how the probability varies as a function of space and/or time. As regards individual events, all of the experimental data would say it is perfectly random.
2) For a 50/50 situation, the QM case would behave the same as the classical coin toss case.
3) It doesn’t matter how slowly the photons are sent. You can send one every 100 years and you will still get the interference pattern.
That doesn’t just go for photons, it goes for electrons too, or atoms or anything, e.g. people. The notions of a pure wave or a true particle are only figments of the human imagination. The reality is everything has properties that we associate with both classical waves and particles. Therefore everything will generate an interference pattern.
I’ll stick with the photon as an example here simply because it’s the easiest to experiment with. So here’s what I’m thinking…
Perhaps the probabilities expressed in quantum mechanics aren't truly random. If you set up a single photon in a dual-slit experiment, we can use the probabilities to predict where it goes next, but each photon leaves a trace behind – perhaps a shift in a currently undetectable medium – and the next photon reacts to these previous traces, such that its destination would be predictable if we knew where those traces were.
If random chance plays a role, then you should only rarely get a perfect interference pattern when you shoot through just enough photons to complete it – think of the actual probability of flipping a coin 10 times and getting exactly 5 heads and 5 tails and you see what I mean.
If random chance doesn’t play a role, then every short exposure in my proposed experiment should show almost exactly the same pattern.
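Standard QM actually sides with the first alternative: each short exposure fluctuates, and the size of that fluctuation is quantifiable. A detector bin that expects λ hits per exposure scatters with a standard deviation of roughly √λ (Poisson-like counting statistics). A small illustrative simulation, with a hypothetical bin expectation of 25 hits:

```python
import math
import random

# Under standard QM, each detector bin receives an independent random count.
# Simulate many short exposures of one bin whose long-run expectation is
# `lam` hits, and compare the observed scatter with sqrt(lam).
lam = 25        # expected hits per short exposure (hypothetical value)
trials = 20_000

random.seed(0)
counts = []
for _ in range(trials):
    # A binomial(1000, lam/1000) count approximates a Poisson(lam) count.
    counts.append(sum(random.random() < lam / 1000 for _ in range(1000)))

mean = sum(counts) / trials
var = sum((c - mean) ** 2 for c in counts) / trials
print(f"mean ~ {mean:.2f}, std ~ {math.sqrt(var):.2f}, "
      f"sqrt(lam) = {math.sqrt(lam):.2f}")
```

So with 25 expected hits, exposures routinely land anywhere from about 20 to 30: under orthodox QM, short exposures should look noisy, and finding them nearly identical every time would indeed be the anomaly the proposed experiment is looking for.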
My hypothesis is that there is some kind of shiftable non-detectable medium that these photons react to in a wave-like way, but that the medium isn’t necessarily stable over time. In a short-duration test you would get close to the same pattern every time, but if you separated the photons over longer periods then something outside the experiment is bound to affect this medium and it will more closely exhibit true randomness.
Keep in mind that only a small fraction of the universe is in any way detectable, so I think something in that unknown realm is influencing detectable matter in a wave-like way, and that potentially other non-detectable particles exist that could influence it over time. Think of muons shooting through for example – those are really hard to detect and the apparatus to do so is quite bulky, yet they could potentially affect a quantum calculation.
My goal here is to create a more realistic visual model that fits the data – like picturing all particles moving through a dark stream. Note that the stream involves energy waves rather than physical matter in my view, which is why it acts like a continuous stream … electric and magnetic fields are still thought of as continuous phenomena, hence the term “fields”.
The problems that Einstein and others had with quantum mechanics are:
1. It is more of a mathematical theory than a visual one.
2. Einstein remained convinced that the physical part of a particle like an electron has only one physical location at any point in time (even though in practice the probability cloud is more useful).
3. It relies heavily on random probability. Though this fits the calculations very well over long timescales, many believe it shouldn't be the foundation of the theory and that there's a deeper story there.
I think a theory of matter moving in dark streams, built at first to explain quantum mechanics, could address these points and lead to interesting questions about how to further test and define them.