>I mean, fair's fair, many worlds doesn't give you probabilities. On the other hand, I'm not convinced saying "and then collapse happens" is an explanation for the Born rule anyway, seeing as collapse is just something magic that turns amplitudes into probability in the same way (mathematically) decoherence does.
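(For reference, the mathematical overlap the quote is pointing at, written schematically: for a state $|\psi\rangle = \sum_i c_i |i\rangle$, decoherence drives the density matrix toward the same diagonal weights that the Born rule declares to be probabilities:

    $$\rho = \sum_{i,j} c_i c_j^{*} \,|i\rangle\langle j| \;\xrightarrow{\text{decoherence}}\; \sum_i |c_i|^2 \,|i\rangle\langle i|, \qquad P(i) = |c_i|^2 \text{ (Born rule).}$$

Neither step, on its own, explains why those weights should be read as probabilities.)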
An alternative explanation with as much physical evidence as the Many Worlds (MW) interpretation (i.e., none) could be that we are, in fact, part of a simulation[0], and that the "wave function" properties of quanta aren't "real" (what is "real" in a simulation?) but are instead artifacts of speculative execution[1] by the CPU running that simulation.
The idea being that all possible branches are executed, but only the [correct|selected|randomly arrived at|etc.] outcomes are committed to "reality."
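Purely as illustration, here is a toy sketch of that picture in Python. Every name and number in it is invented, and the weighted-PRNG commit rule is just one placeholder answer to question 2 below:

    import random

    def speculative_step(state, branches):
        """Toy model: execute every possible branch of a simulation step,
        then commit exactly one outcome to 'reality' and discard the rest.
        The amplitudes and the commit rule are placeholder choices,
        not claims about physics."""
        # "Speculatively" evaluate every branch of this step.
        outcomes = [branch(state) for branch in branches]
        # Commit rule: a pseudo-random draw weighted by each outcome's
        # (made-up) amplitude squared. Could equally be a lookup table,
        # a deterministic rule, etc.
        weights = [abs(o["amplitude"]) ** 2 for o in outcomes]
        committed = random.choices(outcomes, weights=weights, k=1)[0]
        return committed["state"]  # every other branch is thrown away

    # Hypothetical usage: a two-branch "measurement" step.
    branches = [
        lambda s: {"state": s + ["spin up"],   "amplitude": 0.8},
        lambda s: {"state": s + ["spin down"], "amplitude": 0.6},
    ]
    print(speculative_step([], branches))  # ['spin up'] with probability 0.64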
That, of course, raises a number of questions:
1. How could we perceive such speculative branch execution from inside a simulation running on such a CPU?
2. What mechanism (an algorithm? [pseudo-]random number generation? a lookup table?) would determine the "actual" outcome from among all the executed branches?
3. Like the "Many Worlds" hypothesis, treating quantum uncertainty as an artifact of "speculative execution" of all possible branches of simulation code isn't testable (at least not as far as I'm aware). How, then, do we use the scientific method to identify the most likely scenario?
I'm not advocating the position that we do, in fact, live in a simulation. Nor am I advocating the MW, Copenhagen, or even pilot-wave interpretations.
Rather, my point is that none of these interpretations are "science" in the sense of having falsifiable hypotheses. Unless and until we have the appropriate concepts/technology to test such hypotheses, all are just speculation/metaphysics.
That said, I also think it's useful to examine and (where possible) investigate such hypotheses, as that might give us a better understanding of the universe(s) we occupy.
[0] https://en.wikipedia.org/wiki/Simulation_hypothesis
[1] https://en.wikipedia.org/wiki/Speculative_execution
Edit: Clarified prose.
I respect the basic idea of falsification, but the fact is we routinely draw conclusions even though each is just one of infinitely many possibilities, all of them unfalsifiable. The tool we use to distinguish among them is Occam's razor, and with it we can argue for one conclusion over another.
Quantum physicists don't agree on MW, Copenhagen, or the weirder hidden-variable theories. But very few of them hold that, say, the universe has obeyed the laws of physics until now purely by chance. That hypothesis has exactly as much evidence as every other theory; it just isn't favored by Occam's razor.
Someone favoring the Copenhagen interpretation would argue that the rest of the wave function is dead weight. Someone favoring MW would say collapse is. I don't know many hidden-variable people, but maybe they think probabilities are just more elegant.
Not only would universes operating under each of these theories be genuinely distinct; there are absolutely things we could discover that would shift the balance! Bell's theorem genuinely hurts the credibility of hidden-variable theories. MW would be absolutely deranged without the notion of decoherence. If someone published a paper tomorrow showing why world measure translates into probabilities, I'd call that a big win for MW. If someone found a clever, non-decoherence reason for collapse, it would be a big win for the Copenhagen interpretation.
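To put a number on the Bell point: any local hidden-variable theory must satisfy the CHSH bound, while quantum mechanics (and experiment) exceed it, which is why the surviving hidden-variable theories, like pilot-wave, have to be nonlocal:

    $$S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S|_{\text{local HV}} \le 2 \;<\; 2\sqrt{2} = |S|_{\text{QM}}^{\max}.$$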
Regardless, such speculation is far from metaphysics. Overly theoretical? Perhaps. But Newton wasn't doing metaphysics when he proposed gravity, and he was shown to be basically right the instant a handful of elegant equations predicted the motion of the planets.