
Thành viên:Nguyentrongphu/lưu trữ/Interpretation of Quantum Mechanics


Phu's interpretation of quantum mechanics: many physics students and physicists alike lament that quantum mechanics doesn't make any sense because it doesn't correspond to reality, our everyday experience. The great Feynman once echoed this sentiment when he said, "No one understands quantum mechanics." I'm writing an interpretation that reconciles our understanding of physics with reality without introducing new modifications to the existing theory of quantum mechanics.

Wave function collapse and its interpretations


The Copenhagen interpretation is one of the most popular interpretations of the mathematical formalism of quantum mechanics. However, since its inception, there have been many objections to it. There is no universal definition of the Copenhagen interpretation; even its two creators, Niels Bohr and Werner Heisenberg, did not fully agree on everything. There are other interpretations of quantum mechanics, but each fails to explain as many quantum phenomena as the Copenhagen interpretation does.

Schrödinger's cat


Let's say a cat is inside a box with a device that detects radioactive decay. The device is designed so that, over the course of one hour, there is a 50% chance that one of the atoms decays. If a decay occurs, the cat dies. My interpretation is that the cat is dead as soon as a radioactive decay is detected. The decay could happen in 1 hour or 10 hours or at any time, independent of any observer. The decay happens spontaneously and has a 50% probability of happening within 1 hour. Note: this is not the same as objective-collapse theory. My statement is only an interpretation of quantum mechanics, not a theory with additional modifications. The many-worlds interpretation has been criticized for its treatment of the role of probability. Another big drawback is that, since its inception, no one has been able to come up with a way to test it. An untestable interpretation is meaningless (much like Russell's teapot). The Copenhagen interpretation's deficiency is that it is not clear what constitutes an observer. The ensemble interpretation works well for the Schrödinger's cat case, but it falls short of explaining other quantum phenomena.
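
To make the 50% figure concrete, here is a minimal Monte Carlo sketch of spontaneous decay. It assumes (for illustration only) a source with a half-life of exactly one hour; note that no observer appears anywhere in the simulation.

```python
# A minimal Monte Carlo sketch of spontaneous radioactive decay.
# Assumption (illustrative): the source has a half-life of exactly 1 hour,
# so the decay probability within 1 hour is 50%.
import math
import random

HALF_LIFE_HOURS = 1.0
decay_rate = math.log(2) / HALF_LIFE_HOURS  # decays per hour

def sample_decay_time() -> float:
    """Draw one spontaneous decay time (in hours) from the exponential law."""
    return random.expovariate(decay_rate)

trials = 100_000
decayed_within_hour = sum(sample_decay_time() <= 1.0 for _ in range(trials))
print(f"Fraction decayed within 1 hour: {decayed_within_hour / trials:.3f}")
# Expected output: ~0.500, with no reference to any observer.
```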

Wave function collapse


Let's say we have a hydrogen atom with one proton and one electron. The electron forms an electron cloud orbiting the single proton. The probability density of finding the electron at a specific location is given by the absolute value of the wave function squared, |ψ|². The electron is said to be in a superposition, which means it could be anywhere around the nucleus. This does not mean that it is everywhere at once.

My interpretation is that the electron is like a continuously deforming blob of water (energy) warping itself and flowing around the nucleus. When you want to know its location at time t, you can shoot a photon at it. My proposed idea is that the photon can detect the electron only if it hits the "critical" energy density (the energy density of the electron). Anywhere below this critical energy density (threshold), the photon passes right through, unable to detect the electron. So why do some regions have a higher probability of finding the electron than others? Because the energy that makes up the electron does not warp itself around the nucleus uniformly. The probability of finding the electron is higher the closer you get to the nucleus because the blob of energy (that makes up the electron) tends to compact itself more tightly the closer it gets, due to the electromagnetic force between the electron and proton, whose strength grows as the distance shrinks. Therefore, the closer you get to the nucleus, the higher the probability that the photon hits that critical energy density and detects the electron. So basically, wave function collapse is sometimes a spontaneous process (like radioactive decay) and sometimes simply a matter of meeting the critical energy density when we use whatever apparatus to observe a particle or a phenomenon. The wave function is a real physical entity, not an abstract mathematical construct.
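
As a concrete reference point for the "higher probability near the nucleus" claim, here is a short sketch evaluating the standard hydrogen ground-state density |psi_1s|^2, which falls off exponentially with distance from the proton.

```python
# The Born rule for the hydrogen 1s state: the probability density of
# finding the electron is |psi|^2 = exp(-2r/a0) / (pi * a0^3),
# which is largest near the nucleus.
import math

A0 = 5.29177e-11  # Bohr radius, meters

def density_1s(r: float) -> float:
    """|psi_1s(r)|^2 in units of 1/m^3."""
    return math.exp(-2.0 * r / A0) / (math.pi * A0**3)

for r_in_a0 in (0.0, 0.5, 1.0, 2.0, 4.0):
    r = r_in_a0 * A0
    print(f"r = {r_in_a0:>3} a0  ->  |psi|^2 = {density_1s(r):.3e} per m^3")
# The density falls off exponentially with distance from the proton,
# matching the claim above that detection is more likely near the nucleus.
```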

So why doesn't the electron radiate energy when it's moving around the proton? The mass-energy (the energy that makes up the electron) is warping around the proton; it's not actually moving. It's like a blob of water with the proton at its center. I believe that while this blob of energy (the electron) is constantly deforming and fluctuating randomly around the proton, its center of mass remains still. That's why it doesn't radiate energy while in an atomic orbit.

Interpretation of fundamental forces


The four fundamental forces of nature are gravity, electromagnetism, and the strong and weak forces. Except for gravity, they are all force fields. They all have infinite range, although the weak and strong forces are negligible outside of an atom. On planetary scales or larger, electromagnetic forces cancel each other out. Therefore, gravity is the only force that dominates at large scales (according to general relativity, gravity is not a force but a result of space-time curvature, but for our intended purpose, it acts like a force).

According to electroweak theory, the electromagnetic and weak forces are the same at 246 GeV. At that high energy, the photon and the W and Z bosons are indistinguishable. Since they are the excitation states of the electromagnetic and weak fields respectively, if they become indistinguishable, that means the electromagnetic and weak force fields are the same.

Grand Unified Theory


I hypothesize that the strong force and weak force are the electromagnetic force in disguise. Quantum chromodynamics and electroweak theory have been quite successful over the last few decades, with observations matching their predictions. However, their limitations are a clear sign that they form an incomplete theory of nature. I suspect all three forces (electromagnetism, the strong force, and the weak force) are manifestations of the same quantum property, with different names and effects at different scales. Electromagnetism has electric charge, the strong force has color charge, and the weak force changes quark flavor (as in proton-neutron interactions).

Renormalization


Renormalization is a cheat tool. Some famous physicists, such as Feynman, have doubted its legitimacy:

"The shell game that we play is technically called 'renormalization'. But no matter how clever the word, it is still what I would call a dippy process! Having to resort to such hocus-pocus has prevented us from proving that the theory of quantum electrodynamics is mathematically self-consistent. It's surprising that the theory still hasn't been proved self-consistent one way or the other by now; I suspect that renormalization is not mathematically legitimate."

A big underlying issue with renormalization becomes apparent with the "hierarchy problem" in physics.

Theory of energy and matter


Mass-energy: this is defined as the energy that makes up a particle. Mass is equivalent to energy according to Einstein's famous equation, E = mc². Matter is basically just energy bound in space. If the energy density is high enough, energy is bound together by gravitational force to create all the elementary particles together with their associated quantum properties. I hypothesize that each elementary particle has a specific energy density. Below that energy density threshold, energy just floats freely (as photons do).

Higgs field: the photon clearly has energy, so why does it have zero mass? My hypothesis is that there is a minimum mass that can be detected by the Higgs field. In other words, there is a minimum energy density, such that anything below it has zero mass and passes undisturbed through the Higgs field.

Particles are made of energy and are not point-like, as we often depict them. The energy that makes up a given particle is not confined to a point-like object; a more accurate depiction is energy smeared out in space like a continuously deforming blob of water floating in space. The smaller the particle (in terms of mass), the longer its wavelength, and vice versa. In other words, the energy within small particles has more freedom to move and smear. The energy within bigger particles is bound together more tightly by gravitational force (according to general relativity, gravity is not a force but a result of space-time curvature, but for our intended purpose, it acts like a force), so there is less room to smear. Hence, their wavelengths are shorter than those of smaller particles.

All matter is created from energy. My proposed matter-creation mechanism: particles are created based on energy density (energy per unit volume, e.g., J/m^3; any energy unit works). For example, since matter can only be created in pairs with antimatter, an up quark and an anti-up quark would be created at a specific energy density. When energy is compacted to different levels of energy density, it can create all the elementary particle pairs (a particle and its antiparticle counterpart) known in the universe. For example, when two protons are smashed into each other in our massive particle accelerators, a large amount of energy is compacted into a tight space. The result is that many different particles of matter and antimatter are seemingly created from the vacuum, but the energy actually compacts in many different ways (reaching many different critical energy densities) to create all those particles.

To test this theory (currently only hypothetical): let's say we know the exact (or a very good approximation of the) energy density of an electron, which would require knowing its radius very accurately. Suppose we then have a device that can condense energy into a volume of space via photon collisions. Since matter and antimatter have to be created in pairs, we use this hypothetical device to condense energy, at the electron's energy density, into a sphere whose radius is twice the electron's radius. If we then observe an electron-positron pair being created, it confirms the validity of this theory.
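
Here is a back-of-the-envelope sketch of the numbers involved. The electron's true radius is unknown, so purely as a placeholder I use the classical electron radius; every figure below is illustrative, not a prediction.

```python
# Back-of-the-envelope numbers for the proposed test. The electron's radius
# is unknown; the classical electron radius below is a stand-in assumption,
# so all outputs are illustrative only.
import math

C = 2.99792458e8          # speed of light, m/s
M_E = 9.1093837e-31       # electron mass, kg
R_E = 2.8179403e-15       # classical electron radius (assumed radius), m

rest_energy = M_E * C**2                    # E = m c^2, joules
volume = (4.0 / 3.0) * math.pi * R_E**3     # assumed electron volume
critical_density = rest_energy / volume     # hypothesized "critical" density

# The test sphere has radius 2 * R_E, hence 8x the volume, so reaching the
# critical density there takes 8 electron rest energies.
test_volume = (4.0 / 3.0) * math.pi * (2 * R_E)**3
required_energy = critical_density * test_volume

print(f"electron rest energy: {rest_energy:.3e} J")
print(f"critical density:     {critical_density:.3e} J/m^3")
print(f"energy for the test:  {required_energy:.3e} J "
      f"(= {required_energy / rest_energy:.0f} electron rest energies)")
# 8 rest energies (~4.09 MeV) comfortably exceeds the 2*m_e*c^2 (~1.022 MeV)
# needed to create an electron-positron pair.
```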

Two-photon physics


This is a kind of evidence supporting my claim that matter and antimatter are created at certain energy densities. When two photons collide, most of the time they just pass right through each other. However, with high-energy photons, if the two-photon collision reaches a certain energy density, it creates a matter-antimatter pair instead of the photons just passing through each other. The pair would likely annihilate and recreate the two original photons. This process is currently thought to be spontaneous. However, I think it is not spontaneous; the mechanism of this phenomenon is based solely on energy density. Which energy densities can create which matter-antimatter pairs is currently unknown.
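
For reference, here is the standard kinematic threshold for this process (two-photon, or Breit-Wheeler, pair production): the photons' center-of-mass energy must reach twice the electron rest energy. A minimal sketch:

```python
# Kinematic threshold for two-photon (Breit-Wheeler) pair production:
# two head-on photons can create an e+/e- pair only if their
# center-of-mass energy reaches 2 * m_e * c^2.
M_E_C2_KEV = 511.0  # electron rest energy, keV

def can_create_pair(e1_kev: float, e2_kev: float) -> bool:
    """Head-on photons: s = 4*E1*E2 must reach (2 m_e c^2)^2."""
    return 4.0 * e1_kev * e2_kev >= (2.0 * M_E_C2_KEV) ** 2

print(can_create_pair(511.0, 511.0))   # True: exactly at threshold
print(can_create_pair(100.0, 100.0))   # False: photons just pass through
print(can_create_pair(50.0, 6000.0))   # True: one very energetic photon suffices
```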

Uncertainty principle


It states that it is impossible to know the exact position and momentum of a particle at the same time. This principle is an inherent property of quantum objects. My interpretation is that to know the exact momentum of a particle at time t, its full wavelength must be measured at t. Mass and energy are equivalent, and the particle is made up of energy. Since that energy is spread out when we measure its full wavelength (which is equivalent to saying the particle itself is spread out), it makes intuitive sense that it is impossible to know its exact position while the particle is spread out. Conversely, to know its exact position, we need to somehow squeeze all of its energy into a point-like particle; we then lose information about its wavelength, which means losing information about its momentum.
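
The textbook statement of this trade-off is Δx·Δp ≥ ħ/2, which a Gaussian wave packet saturates. A quick numeric sketch of how squeezing position inflates momentum spread:

```python
# A Gaussian wave packet saturates Heisenberg's bound,
# sigma_x * sigma_p = hbar / 2, so squeezing the position spread
# necessarily widens the momentum spread.
HBAR = 1.054571817e-34  # reduced Planck constant, J*s

for sigma_x in (1e-9, 1e-12, 1e-15):      # position spread, meters
    sigma_p = HBAR / (2.0 * sigma_x)      # minimum momentum spread, kg*m/s
    print(f"sigma_x = {sigma_x:.0e} m  ->  sigma_p >= {sigma_p:.3e} kg*m/s")
# Shrinking sigma_x by a factor of 1000 inflates sigma_p by the same factor.
```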

Virtual particle


There is no such thing as a virtual particle. Physicists created this concept to make fundamental interactions easier to understand (much like the aether, which was proven nonexistent in the early 20th century). Supposedly, virtual particles act as force mediators of the strong force, the weak force, and electromagnetism. There is no force mediator. There are just fields, quantum fluctuations, and quanta (units of a quantum field). We simply don't need to imagine an infinity of virtual particles flying around to mediate the fundamental forces of nature. There is a mathematical way to deal with this kind of infinity, but it is an unnecessary layer of concept that does nothing to actually advance our understanding of the universe. Fundamental forces are mediated by fields that are invisible and intangible, but whose effects are measurable. While some quantum fluctuations may resemble the characteristics of a particle, that doesn't mean we should invent a new term, "virtual particle", to explain the phenomenon. Those quantum fluctuations should just be called "quantum fluctuations", a term that stays true to their nature.

Vacuum polarization


The current explanation is that this phenomenon is caused by virtual electron-positron pairs, which the uncertainty principle allows to exist. These pairs are real, not virtual. They seemingly pop into existence with no obvious origin, so they are mistakenly labelled as virtual. It is known that the vacuum is not just empty space. My hypothesis is that energy (too small to be detected) permeates throughout all of space-time. Sometimes this energy happens to concentrate at a point in space by pure chance. When it hits a certain energy threshold, an electron is born seemingly out of nowhere. The positron is created by the same random process. They come together randomly as an electron-positron pair to create the phenomenon known as vacuum polarization. All steps in the process are random. It is possible that only an electron is created, or vice versa (only a positron). And if both are created but do not come together, the lone electron and positron simply exist as individual particles in the vacuum, created from the vacuum's own energy.

The standard picture also posits a sea of virtual quark-antiquark pairs inside a hadron. My interpretation is that there are no sea quarks. There are only a sea of gluons and the valence quarks within any hadron. The quantum state within a hadron is complex, and quantum fluctuations of this state have an effect similar to that of quark-antiquark pairs. Instead of giving these quantum fluctuations their own name, such as virtual quark-antiquark pairs (or sea quarks), let's just call them quantum fluctuations with effects similar to a sea of quark-antiquark pairs within a hadron.

Deep inelastic scattering


Let's use the proton as our example. The proton is like a ball with the radius of a proton. At low energies, the electron is like a smaller ball with the radius of an electron (I predict the electron has a finite radius; we just don't have the technology to measure it yet). When the smaller ball hits the big ball, it bounces off. However, at high energies (very small wavelengths), the smaller blob of energy that makes up the electron elongates (this makes sense given that the faster the electron moves, the more energy it contains). At high enough energies, it becomes like a needle compared to the big ball (the proton). It's all relative: to a proton, this high-energy electron looks like a needle, but to something much smaller than a proton, the electron looks like a ball. Anyway, this needle electron can probe inside the proton. We then see three distinct patterns of electron scattering that correspond to the three valence quarks inside the proton. At energies even higher than the needle electron's, we see many more constituents of the proton, which Feynman called partons. We later discovered that they're just a sea of gluons.
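
The "needle" picture tracks the standard relation between beam energy and resolving power: an ultrarelativistic electron's de Broglie wavelength is roughly λ ≈ hc/E. A quick sketch comparing it to the proton's radius:

```python
# Why a high-energy electron becomes a fine probe: its de Broglie wavelength
# shrinks below the proton radius. For ultrarelativistic electrons,
# lambda ~ h*c / E.
H_C_EV_M = 1.23984193e-6   # h*c in eV*meters
PROTON_RADIUS_M = 8.4e-16  # proton charge radius, ~0.84 fm

def wavelength_m(energy_ev: float) -> float:
    """Approximate de Broglie wavelength of an ultrarelativistic electron."""
    return H_C_EV_M / energy_ev

for energy_ev in (1e6, 1e9, 1e10, 1e11):
    lam = wavelength_m(energy_ev)
    probe = "smaller than proton" if lam < PROTON_RADIUS_M else "larger than proton"
    print(f"E = {energy_ev:.0e} eV  ->  lambda = {lam:.2e} m  ({probe})")
# Around a few GeV and beyond, the wavelength drops below the proton's radius,
# which is when deep inelastic scattering starts resolving internal structure.
```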

So why can't the needle electron detect this sea of gluons? The sea of gluons is moving around constantly, and the needle electron doesn't have enough energy to make a gluon accelerate. It just wades through this sea until it hits one of the three valence quarks. At energies above the needle electron's, however, it doesn't just wade through. It hits a gluon hard and bounces back out of the proton, causing the gluon to accelerate. The gluon carries color charge, so it radiates when accelerated, producing secondary particles we can detect, from which we infer that this collision actually happened.

I hypothesize that the neutrino is the smallest elementary particle that the Higgs field can detect. In other words, anything lighter than the neutrino would have zero mass and pass undisturbed (undetected) through the Higgs field. It's not a coincidence that the Planck length and the scale set by the uncertainty principle are of the same order of magnitude (10^-35). The Planck length is where our understanding of physics breaks down and quantum gravity's effects can no longer be ignored. The uncertainty principle states that below a certain scale, momentum and position cannot be determined precisely anymore.
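
For reference, the Planck length follows from the fundamental constants, l_P = sqrt(ħG/c³); a one-line check:

```python
# A quick check of the scale claimed above: the Planck length computed from
# fundamental constants indeed lands at ~1.6e-35 m.
import math

HBAR = 1.054571817e-34  # J*s
G = 6.67430e-11         # m^3 kg^-1 s^-2
C = 2.99792458e8        # m/s

planck_length = math.sqrt(HBAR * G / C**3)
print(f"Planck length: {planck_length:.3e} m")  # ~1.616e-35 m
```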

I predict that the neutrino has a radius of the Planck length. Conceptually, the neutrino is like a blob of energy with a radius of r_P. This blob of energy oscillates (imagine a ball getting bigger and smaller in repeated cycles) from r_P to r_max.

  • r_max - r_P = x1 + x2 + x3
  • x1 = r1 - r_P
  • x2 = r2 - r1
  • x3 = r_max - r2

Where r1, r2, and r_max are unknown constants (a quick numeric check of this bookkeeping follows below).
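
Since r1, r2, and r_max are unknown, the placeholder values below are entirely made up; the sketch only verifies that the three intervals telescope to r_max − r_P as claimed:

```python
# A trivial consistency check of the radius bookkeeping above. The values of
# r1, r2, and r_max are unknown constants; the numbers below are placeholders,
# and only the telescoping identity itself is being verified.
import math

R_P = 1.616e-35   # Planck length, m
R1 = 3.0e-35      # placeholder
R2 = 5.0e-35      # placeholder
R_MAX = 8.0e-35   # placeholder

x1 = R1 - R_P
x2 = R2 - R1
x3 = R_MAX - R2
assert math.isclose(x1 + x2 + x3, R_MAX - R_P)  # the sum telescopes
print("x1 + x2 + x3 == r_max - r_P holds for any choice of r1, r2.")
```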

Neutrino oscillation


It's known that the neutrino can oscillate among its three flavors. In this picture:

  • Electron neutrino oscillates from r_P to r1.
  • Muon neutrino oscillates from r1 to r2.
  • Tau neutrino oscillates from r2 to r_max.

The Standard Model says that all three are the same elementary particle. However, I would argue that they are three different elementary particles. If all three were in fact one particle, we would expect the probability of it oscillating into each of the three eigenstates to evolve the same way over time, regardless of its initial flavor. The three probability waves don't have to be equal to each other, but each should be the same regardless of the initial flavor. The fact that the three probability wave functions differ depending on the initial flavor implies that there are actually three different elementary particles.

If they are different elementary particles, how come they can oscillate into each other? This can be explained with the uncertainty principle. At this scale, nothing is certain anymore, so we can't tell whether an electron neutrino has oscillated into a muon neutrino or its initial flavor was muon neutrino; the two cases are identical under measurement. At this scale, the Higgs field is tricked by the oscillation. It cannot distinguish among the three different elementary particles, so it treats them as one oscillating particle.

The catch is that if its initial flavor is electron neutrino, the probability of it oscillating into either of the other two is quite low at any given time. Conversely, if it starts out as the muon or tau flavor, the probability of it oscillating into the electron neutrino is also quite low, but the probability of the muon and tau flavors oscillating into each other is roughly the same. This makes sense, since the muon and tau flavors are much more massive than the electron flavor, and their masses are probably close to each other relative to the electron flavor's. In conclusion, the probability of successfully pulling off the trick is quite low if a particle tries to impersonate something much more massive than itself, and vice versa. It's the same in real life: one has much better odds of impersonating one's own twin (assuming one has a twin) than a complete stranger. Nature doesn't like being fooled unless particles are in the realm of the uncertainty principle (the Planck length scale).
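
To make "the probability depends on the initial flavor" concrete, here is a sketch using the standard two-flavor vacuum oscillation formula; the mixing parameters are rough, textbook-level values used only for illustration.

```python
# Standard two-flavor oscillation formula:
#   P(a -> b) = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)
# with dm2 in eV^2, L in km, E in GeV. Parameter values below are rough,
# illustrative numbers.
import math

def transition_prob(theta_rad: float, dm2_ev2: float,
                    l_km: float, e_gev: float) -> float:
    return math.sin(2 * theta_rad) ** 2 * math.sin(1.27 * dm2_ev2 * l_km / e_gev) ** 2

# Rough mixing parameters (illustrative):
THETA_12, DM2_21 = 0.59, 7.4e-5   # "solar" sector, drives nu_e transitions
THETA_23, DM2_32 = 0.85, 2.5e-3   # "atmospheric" sector, nu_mu <-> nu_tau

L, E = 500.0, 1.0  # 500 km baseline, 1 GeV neutrinos
print(f"nu_e  -> other:  {transition_prob(THETA_12, DM2_21, L, E):.4f}")
print(f"nu_mu -> nu_tau: {transition_prob(THETA_23, DM2_32, L, E):.4f}")
# The two channels give very different probabilities at the same L/E,
# i.e., the oscillation pattern depends on the initial flavor.
```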

Redshift

Redshift is a continuous process that elongates a photon's wavelength. The rate of wavelength elongation decreases as the distance gets smaller. For example, consider a photon 1 million light-years from Earth, traveling toward Earth. Its wavelength is stretched continuously over time, and the rate of stretching gets smaller over time (as its distance from Earth shrinks). This is because the universe is expanding in all directions with respect to Earth, and the farther from Earth, the faster the rate of expansion.

Redshift seems to violate conservation of energy, one of the most important principles of modern physics. I propose an idea that could resolve this problem. When a photon is redshifted slightly, its energy loss is so small that it falls within the realm of the uncertainty principle, so this energy loss would be undetectable no matter how advanced technology becomes in the future. However, when the redshift is significant enough, we can detect it. I hypothesize that the photon splits into two photons once it has been redshifted enough (let's call this the "critical" split point). Every time a photon hits a critical split point, it splits in two. Over its long journey to Earth, one photon could have split into many longer-wavelength (less energetic) photons. The summed energy of all the split photons equals the total energy of the one original photon. Hence, this would protect the conservation of energy from being violated.
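
Here is a toy sketch of this splitting hypothesis. The critical split point and the 50/50 split ratio are my own placeholders; the only physics enforced is E = hc/λ and that the daughters' energies sum to the parent's.

```python
# A toy model of the photon-splitting conjecture above. The split threshold
# and the 50/50 ratio are arbitrary placeholders; only E = h*c/lambda and
# energy conservation are enforced.
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s

def photon_energy(wavelength_m: float) -> float:
    """E = h*c / lambda."""
    return H * C / wavelength_m

emitted = 500e-9                       # emitted wavelength: 500 nm
z = 1.0                                # hypothetical redshift
stretched = emitted * (1 + z)          # wavelength after stretching

parent_e = photon_energy(stretched)    # energy of the redshifted photon
daughter_e = parent_e / 2.0            # hypothetical 50/50 split
daughter_wavelength = H * C / daughter_e

print(f"parent (redshifted) energy: {parent_e:.3e} J")
print(f"two daughters, summed:      {2 * daughter_e:.3e} J  (conserved)")
print(f"each daughter's wavelength: {daughter_wavelength:.1e} m")  # 2x parent's
```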

Quantum fluctuation


I hypothesize that quantum fluctuations are caused by dark matter and by all the photons flying around in the vacuum (including the cosmic microwave background and photons from outer space). Interactions between dark matter particles can cause quantum fluctuations. Two highly energetic photons can create matter (and antimatter) seemingly out of nothing; the effect of this is a quantum fluctuation.

Dark matter and dark energy


The nature of dark matter and dark energy is currently unknown to science. Dark matter doesn't interact with any known matter (or antimatter) directly. Its existence can be inferred through its observed effects, most prominently its gravitational effects on photons and on galaxy rotation curves. The expansion of the universe is hypothesized to be caused by dark energy. I hypothesize that dark matter and dark energy are actually the same thing.

I hypothesize that dark matter is made up of a new, undiscovered elementary particle permeating the universe. Its energy is equal to the cosmological constant, and its energy density equals the cosmological constant divided by the volume of a sphere whose radius is the Planck length. This new elementary particle carries a fifth fundamental force of nature: much like gravity curves space-time, this force expands space-time in all directions around it.

The bulk of dark matter was created shortly after the Big Bang. Afterward, every time matter and antimatter annihilate each other, a small amount of dark matter is created. It is undetectable, and probably never will be detectable directly, because it lies in the realm of the uncertainty principle. However, its effects can be measured. Matter-antimatter annihilation keeps creating more dark matter, which may explain why the rate of expansion is accelerating.

Cosmological constant problem


Dark matter and the cosmological constant are hypothesized here to be the same thing. The value predicted by quantum field theory and the observed value differ by as much as 120 orders of magnitude, which has been called "the worst theoretical prediction in the history of physics." This huge discrepancy is known as the cosmological constant problem and is currently unsolved. I'm proposing a solution to this problem here.

The cosmological constant is a snapshot of the energy density of the universe (excluding all matter and antimatter); it's like freezing time and measuring it. The value predicted by quantum field theory, on the other hand, is a measurement of the same thing but taken over a nonzero time interval Δt. Since a natural system is chaotic (a deterministic nonlinear system, in the sense of chaos theory), a very small difference in one single parameter, such as Δt in this case, can result in a difference of 120 orders of magnitude. For example, the Casimir effect is an effect of quantum fluctuation attributed to the energy of the vacuum, and its effect is only measurable over a nonzero Δt. Since the time it takes to measure anything has to be greater than zero, I hypothesize that the intrinsic energy density of the universe is the cosmological constant, but the measured value will always be much greater.
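
For scale, here is the standard order-of-magnitude comparison behind the "120 orders of magnitude" figure, using a naive Planck-scale cutoff for the quantum field theory estimate (the exact exponent depends on the cutoff convention):

```python
# The size of the cosmological constant discrepancy: observed vacuum energy
# density versus a naive QFT estimate of one Planck energy per Planck volume.
import math

OBSERVED_J_M3 = 5.3e-10       # observed dark-energy density, ~J/m^3
PLANCK_ENERGY_J = 1.956e9     # Planck energy, J
PLANCK_LENGTH_M = 1.616e-35   # Planck length, m

qft_estimate = PLANCK_ENERGY_J / PLANCK_LENGTH_M**3
ratio = qft_estimate / OBSERVED_J_M3

print(f"QFT cutoff estimate: {qft_estimate:.2e} J/m^3")
print(f"observed value:      {OBSERVED_J_M3:.2e} J/m^3")
print(f"discrepancy:         ~10^{math.log10(ratio):.0f}")
# Prints ~10^123 with this crude cutoff; commonly quoted as "up to 120
# orders of magnitude" depending on conventions.
```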

Imagine an invisible cube, and say we want to measure the total energy inside it. If time were frozen, the total energy inside the cube would be some value x; that is the theoretical total energy inside the cube. As time goes on, energy flows in and out of the cube freely. However, if we want to measure it, we have to send in a photon or an electron (call it a probe particle). The act of measurement cannot happen while time is frozen. During Δt, the probe particle could have detected energy inside the cube as well as energy flowing in from outside the cube. Also, over the distance it travels, quantum fluctuations would probably affect the probe particle even more. Adding in all the extra effects that result from measuring, the end result is what quantum field theory predicts.

Interpretation of wave-particle duality of matter


Is all matter point-like or wave-like? It's neither. Sometimes it acts as a point-like particle, and sometimes it acts like a wave. We can measure the wave properties of microscopic subatomic particles or even molecules, but for macroscopic things, the wavelength is too small compared to the size of the whole object to be detected. For example, for a 200 g baseball travelling at 30 m/s, the de Broglie formula gives a wavelength of about 1.1e-34 m. That is around one Planck length, where our understanding of physics breaks down. And according to the uncertainty principle, in this minuscule realm nothing is certain anymore (wavelength is related to momentum). This sets a fundamental limit on how small a wavelength can be, no matter how much technology advances in the future. Basically, this means that measuring the wavelength of anything much bigger than a baseball is impossible. However, this is only a theoretical limit; the practical limit on wavelength may be much higher. Physical description of wavelength in quantum mechanics: wavelength is simply a measure of how smeared out something's energy is.
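
A quick check of the baseball number via λ = h/(mv):

```python
# Verifying the baseball example above with de Broglie's relation
# lambda = h / (m * v).
H = 6.62607015e-34  # Planck constant, J*s

mass_kg = 0.200     # 200 g baseball
speed_m_s = 30.0

wavelength = H / (mass_kg * speed_m_s)
print(f"de Broglie wavelength: {wavelength:.2e} m")  # ~1.10e-34 m, near Planck scale
```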