Fermion

(Figure: antisymmetric wavefunction for a fermionic two-particle state in an infinite square well potential.)

In particle physics, a fermion is a particle that follows Fermi–Dirac statistics. These particles obey the Pauli exclusion principle. Fermions include all quarks and leptons, as well as all composite particles made of an odd number of these, such as all baryons and many atoms and nuclei. Fermions differ from bosons, which obey Bose–Einstein statistics. In addition to the spin characteristic, fermions have another specific property: they possess conserved baryon or lepton quantum numbers. As a consequence of the Pauli exclusion principle, only one fermion can occupy a particular quantum state at any given time. Composite fermions, such as protons and neutrons, are the key building blocks of everyday matter. The name fermion was coined by English theoretical physicist Paul Dirac from the surname of Italian physicist Enrico Fermi.[2]
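The exclusion principle can be illustrated numerically for the two-particle state in the figure caption above. The sketch below (a well of unit width in natural units; the function names and sample points are illustrative) builds the antisymmetric two-fermion wavefunction from single-particle eigenstates and shows that it vanishes whenever the two particles coincide or occupy the same orbital:

```python
import numpy as np

L = 1.0  # well width (arbitrary choice of units)

def phi(n, x):
    """nth single-particle eigenstate of the infinite square well."""
    return np.sqrt(2.0 / L) * np.sin(n * np.pi * x / L)

def psi_antisym(x1, x2, n=1, m=2):
    """Antisymmetric two-fermion spatial wavefunction (a 2x2 Slater determinant)."""
    return (phi(n, x1) * phi(m, x2) - phi(m, x1) * phi(n, x2)) / np.sqrt(2.0)

# Pauli exclusion in action: the amplitude is zero when both particles
# coincide, and identically zero when both occupy the same orbital (n == m).
print(psi_antisym(0.3, 0.3))        # -> 0.0
print(psi_antisym(0.3, 0.7, 1, 1))  # -> 0.0
print(psi_antisym(0.3, 0.7))        # a nonzero amplitude
```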

Zero-point energy

Zero-point energy, also called quantum vacuum zero-point energy, is the lowest possible energy that a quantum mechanical physical system may have; it is the energy of its ground state. All quantum mechanical systems undergo fluctuations even in their ground state and have an associated zero-point energy, a consequence of their wave-like nature. The uncertainty principle requires every physical system to have a zero-point energy greater than the minimum of its classical potential well. This results in motion even at absolute zero. For example, liquid helium does not freeze under atmospheric pressure at any temperature because of its zero-point energy. In 1900, Max Planck derived the formula for the mean energy of a single energy radiator, e.g., a vibrating atomic unit:[5] ε = hν / (e^(hν/kT) − 1), where h is Planck's constant, ν is the frequency, k is Boltzmann's constant, and T is the absolute temperature. In his later "second theory" (1912), Planck instead obtained ε = hν/2 + hν / (e^(hν/kT) − 1). According to this expression, an atomic system at absolute zero retains an energy of ½hν.
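Planck's expression with the zero-point term, ε = hν/2 + hν/(e^(hν/kT) − 1), can be evaluated directly; note that the thermal part vanishes as T → 0 while ½hν remains. A minimal sketch (the vibration frequency is an illustrative value):

```python
import math

h = 6.62607015e-34  # Planck constant, J s
k = 1.380649e-23    # Boltzmann constant, J/K

def mean_energy(nu, T):
    """Mean radiator energy with the zero-point term: h*nu/2 + h*nu/(e^(h*nu/kT) - 1)."""
    zpe = 0.5 * h * nu
    if T == 0:
        return zpe  # the thermal term vanishes at absolute zero
    return zpe + h * nu / math.expm1(h * nu / (k * T))

nu = 1e13  # an infrared vibration frequency in Hz (illustrative)
print(mean_energy(nu, 0))    # exactly h*nu/2: the zero-point energy survives at T = 0
print(mean_energy(nu, 300))  # thermal energy on top of the zero-point term
```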

Matter wave

In quantum mechanics, the concept of matter waves or de Broglie waves /dəˈbrɔɪ/ reflects the wave–particle duality of matter. The theory was proposed by Louis de Broglie in 1924 in his PhD thesis.[1] The de Broglie relations show that the wavelength λ = h/p is inversely proportional to the momentum p of a particle, where h is Planck's constant; λ is also called the de Broglie wavelength. The frequency of matter waves, as deduced by de Broglie, is directly proportional to the particle's total energy E (the sum of its rest energy and kinetic energy): f = E/h.[2] At the end of the 19th century, light was thought to consist of waves of electromagnetic fields which propagated according to Maxwell's equations, while matter was thought to consist of localized particles (see history of wave and particle viewpoints). This picture was challenged by the quantization of light: the energy of a light quantum is E = hν, where ν is the frequency of the light and h is Planck's constant.
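The relation λ = h/p is straightforward to evaluate. A small sketch for a non-relativistic electron (the speed is an illustrative choice):

```python
h = 6.62607015e-34      # Planck constant, J s
m_e = 9.1093837015e-31  # electron mass, kg
c = 2.99792458e8        # speed of light, m/s

def de_broglie_wavelength(m, v):
    """lambda = h / p for a non-relativistic particle of mass m and speed v."""
    return h / (m * v)

# An electron at 1% of the speed of light has a wavelength of a few
# angstroms, comparable to atomic spacings, hence electron diffraction.
lam = de_broglie_wavelength(m_e, 0.01 * c)
print(lam)  # roughly 2.4e-10 m
```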

Quantum decoherence

Decoherence can be viewed as the loss of information from a system into the environment (often modeled as a heat bath),[2] since every system is loosely coupled with the energetic state of its surroundings. Viewed in isolation, the system's dynamics are non-unitary (although the combined system plus environment evolves in a unitary fashion).[3] Thus the dynamics of the system alone are irreversible. As with any coupling, entanglements are generated between the system and environment. These have the effect of sharing quantum information with, or transferring it to, the surroundings. Decoherence does not generate actual wave function collapse; it only provides an explanation for the apparent collapse, as the quantum nature of the system "leaks" into the environment. Decoherence represents a challenge for the practical realization of quantum computers, since such machines are expected to rely heavily on the undisturbed evolution of quantum coherences.

Boson

In quantum mechanics, a boson (/ˈboʊsɒn/,[1] /ˈboʊzɒn/[2]) is a particle that follows Bose–Einstein statistics. Bosons make up one of the two classes of particles, the other being fermions.[3] The name boson was coined by Paul Dirac[4] to commemorate the contribution of the Indian physicist Satyendra Nath Bose[5][6] in developing, with Einstein, Bose–Einstein statistics, which theorizes the characteristics of elementary particles.[7] Examples of bosons include fundamental particles such as photons, gluons, and W and Z bosons (the four force-carrying gauge bosons of the Standard Model), the recently discovered Higgs boson, and the hypothetical graviton of quantum gravity; composite particles (e.g. mesons and stable nuclei of even mass number such as deuterium (with one proton and one neutron, mass number = 2), helium-4, or lead-208); and some quasiparticles (e.g. Cooper pairs, plasmons, and phonons).[8]

Uncertainty principle

In quantum mechanics, the uncertainty principle asserts a fundamental limit on how precisely the position and momentum of a particle can simultaneously be known: the product of their standard deviations satisfies σx σp ≥ ħ/2, where ħ is the reduced Planck constant. The original heuristic argument that such a limit should exist was given by Heisenberg, after whom it is sometimes named the Heisenberg principle. This ascribes the uncertainty in the measurable quantities to the jolt-like disturbance triggered by the act of observation. Though widely repeated in textbooks, this physical argument is now known to be fundamentally misleading:[4][5] while the act of measurement does lead to uncertainty, the loss of precision is less than that predicted by Heisenberg's argument. The formal mathematical result remains valid, however. Since the uncertainty principle is such a basic result in quantum mechanics, typical experiments routinely observe aspects of it. Certain experiments, however, may deliberately test a particular form of the uncertainty principle as part of their main research program.
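The bound σx σp ≥ ħ/2 is saturated by a Gaussian wave packet, which can be checked numerically by measuring the position spread on a grid and the momentum spread from the packet's Fourier transform. A sketch in units with ħ = 1 (the grid size and packet width are arbitrary choices):

```python
import numpy as np

hbar = 1.0                   # natural units
N, span = 4096, 40.0         # grid points and total grid length (arbitrary)
x = np.linspace(-span / 2, span / 2, N, endpoint=False)
dx = x[1] - x[0]

sigma = 1.3                  # chosen width of the Gaussian packet
psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize to unit probability

# Position uncertainty from |psi(x)|^2
prob = np.abs(psi)**2
mean_x = np.sum(x * prob) * dx
sx = np.sqrt(np.sum((x - mean_x)**2 * prob) * dx)

# Momentum uncertainty from the Fourier transform of psi
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
prob_k = np.abs(np.fft.fft(psi))**2
prob_k /= np.sum(prob_k)
sp = hbar * np.sqrt(np.sum(k**2 * prob_k) - np.sum(k * prob_k)**2)

print(sx * sp)  # approximately hbar/2 = 0.5: the Gaussian saturates the bound
```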

Holographic principle

The holographic principle states that the description of a volume of space can be encoded on a boundary of that region. In a larger sense, the theory suggests that the entire universe can be seen as a two-dimensional information structure "painted" on the cosmological horizon, such that the three dimensions we observe are an effective description only at macroscopic scales and at low energies. Cosmological holography has not been made mathematically precise, partly because the particle horizon has a finite area and grows with time.[4][5] The holographic principle was inspired by black hole thermodynamics, which conjectures that the maximal entropy in any region scales with the radius squared, and not cubed as might be expected. In the case of a black hole, the insight was that the informational content of all the objects that have fallen into the hole might be entirely contained in surface fluctuations of the event horizon. An object with entropy is microscopically random, like a hot gas.
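The area (radius-squared) scaling of black hole entropy can be made quantitative with the standard Bekenstein–Hawking formula S = kB A / (4 lP²), which the excerpt above alludes to but does not state. A sketch (the masses are illustrative values; doubling the mass doubles the radius and quadruples the entropy):

```python
import math

G = 6.67430e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8       # speed of light, m/s
hbar = 1.054571817e-34 # reduced Planck constant, J s
kB = 1.380649e-23      # Boltzmann constant, J/K

def bh_entropy(M):
    """Bekenstein-Hawking entropy S = kB * A / (4 * lP^2) of a Schwarzschild hole."""
    rs = 2 * G * M / c**2       # Schwarzschild radius
    A = 4 * math.pi * rs**2     # horizon area
    lP2 = hbar * G / c**3       # Planck length squared
    return kB * A / (4 * lP2)

# Entropy grows with the SQUARE of the radius (and hence of the mass),
# not with the enclosed volume:
print(bh_entropy(2e30) / bh_entropy(1e30))  # -> 4.0
```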

Ehrenfest theorem

The Ehrenfest theorem relates the time derivative of the expectation value of a quantum mechanical operator to the expectation value of its commutator with the Hamiltonian. It reads[2]

d⟨A⟩/dt = (1/iħ) ⟨[A, H]⟩ + ⟨∂A/∂t⟩,

where A is some quantum mechanical operator, ⟨A⟩ is its expectation value, and H is the Hamiltonian. Ehrenfest's theorem is most apparent in the Heisenberg picture of quantum mechanics, where it is just the expectation value of the Heisenberg equation of motion. It provides mathematical support to the correspondence principle. In the Schrödinger picture, the theorem follows from differentiating ⟨A⟩ = ∫ Φ* A Φ d³x for a system in quantum state Φ, where the integral runs over all space. Often (but not always) the operator A is time independent, so that its derivative is zero and the last term can be ignored. In the Heisenberg picture the derivation is trivial: taking the expectation value of the Heisenberg equation of motion yields the theorem directly, since the state vectors are no longer time dependent. As a general example, for a massive particle moving in a potential, applying the theorem to the position and momentum operators gives m d⟨x⟩/dt = ⟨p⟩ and d⟨p⟩/dt = −⟨∂V/∂x⟩, where x is the position of the particle.
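The relation d⟨x⟩/dt = (1/iħ)⟨[x, H]⟩ = ⟨p⟩/m can be verified numerically in a finite matrix model of the harmonic oscillator (a sketch in units with ħ = m = ω = 1; the truncation size and test state are arbitrary, and the state is kept away from the truncation edge so edge effects do not matter):

```python
import numpy as np

# Truncated harmonic-oscillator matrices, hbar = m = omega = 1.
N = 80
a = np.diag(np.sqrt(np.arange(1.0, N)), 1)  # annihilation operator
x = (a + a.T) / np.sqrt(2)                  # position operator
p = (a - a.T) / (1j * np.sqrt(2))           # momentum operator
H = p @ p / 2 + x @ x / 2                   # Hamiltonian

# A normalized state supported on the low-lying levels.
rng = np.random.default_rng(0)
psi = np.zeros(N, dtype=complex)
psi[:10] = rng.standard_normal(10) + 1j * rng.standard_normal(10)
psi /= np.linalg.norm(psi)

# Ehrenfest: d<x>/dt = <[x, H]> / (i hbar) should equal <p> (since m = 1).
dxdt = (psi.conj() @ ((x @ H - H @ x) @ psi)) / 1j
p_exp = psi.conj() @ (p @ psi)
print(abs(dxdt - p_exp))  # effectively zero: the two sides agree
```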

Second quantization

Second quantization is a powerful procedure used in quantum field theory for describing many-particle systems by quantizing the fields, using a basis that describes the number of particles occupying each state in a complete set of single-particle states. This differs from first quantization, which uses the single-particle states as a basis. The starting point of this formalism is the notion of indistinguishability of particles, which leads us to use (Slater) determinants of single-particle states as a basis of the Hilbert space of N-particle states. Consider an ordered and complete single-particle basis {|ν₁⟩, |ν₂⟩, …}, where {νᵢ} is the set of all states available to a single particle. In the occupation number representation, the notation |n₁, n₂, …⟩ means that there are nᵢ particles in the state νᵢ. For fermions nᵢ can be 0 or 1, while for bosons it can be any non-negative integer. The space spanned by the occupation number basis is denoted the Fock space.
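The occupation rules above are encoded in the algebra of creation and annihilation operators, which can be illustrated with small matrices (a sketch; the bosonic mode is truncated at a finite number of quanta for illustration only):

```python
import numpy as np

# Bosonic mode truncated at nmax quanta: a|n> = sqrt(n)|n-1>.
nmax = 5
a = np.diag(np.sqrt(np.arange(1, nmax + 1)), 1)  # annihilation operator
adag = a.T                                       # creation operator
number_op = adag @ a
print(np.diag(number_op))  # -> [0. 1. 2. 3. 4. 5.]: any occupation is allowed

# Fermionic mode in the basis {|0>, |1>}: c^2 = 0 enforces Pauli exclusion.
c = np.array([[0.0, 1.0],
              [0.0, 0.0]])   # fermionic annihilation operator
cdag = c.T
print(c @ c)                 # the zero matrix: cannot annihilate twice
print(np.diag(cdag @ c))     # -> [0. 1.]: the occupation number is 0 or 1
```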

Wave–particle duality

The idea of duality originated in a debate over the nature of light and matter that dates back to the 17th century, when Christiaan Huygens and Isaac Newton proposed competing theories of light: light was thought either to consist of waves (Huygens) or of particles (Newton). Through the work of Max Planck, Albert Einstein, Louis de Broglie, Arthur Compton, Niels Bohr, and many others, current scientific theory holds that all particles also have a wave nature (and vice versa).[2] This phenomenon has been verified not only for elementary particles, but also for compound particles like atoms and even molecules. For macroscopic particles, because of their extremely short wavelengths, wave properties usually cannot be detected.[3]

(Figures: Thomas Young's 1803 sketch of two-slit diffraction of waves; particle impacts making visible the interference pattern of waves; a quantum particle represented by a wave packet.)

Planck scale

In particle physics and physical cosmology, the Planck scale (named after Max Planck) is an energy scale around 1.22 × 10^19 GeV (which corresponds by the mass–energy equivalence to the Planck mass, 2.17645 × 10^−8 kg) at which quantum effects of gravity become strong. At this scale, present descriptions and theories of sub-atomic particle interactions in terms of quantum field theory break down and become inadequate, due to the impact of the apparent non-renormalizability of gravity within current theories. At the Planck scale, the strength of gravity is expected to become comparable with the other forces, and it is theorized that all the fundamental forces are unified at that scale, but the exact mechanism of this unification remains unknown. The term Planck scale can also refer to a length scale or time scale. The Planck length is related to the Planck energy by the uncertainty principle.
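The quoted numbers follow directly from the fundamental constants, since the Planck mass is √(ħc/G) and the Planck length is √(ħG/c³). A quick check:

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J s
c = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2

m_planck = math.sqrt(hbar * c / G)     # Planck mass, ~2.18e-8 kg
l_planck = math.sqrt(hbar * G / c**3)  # Planck length, ~1.6e-35 m
t_planck = l_planck / c                # Planck time, ~5.4e-44 s
E_planck_GeV = m_planck * c**2 / 1.602176634e-10  # 1 GeV = 1.602...e-10 J

print(m_planck, l_planck, t_planck)
print(E_planck_GeV)  # ~1.22e19 GeV, the energy scale quoted above
```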

Quantum tunnelling

Quantum tunnelling or tunneling (see spelling differences) refers to the quantum mechanical phenomenon where a particle tunnels through a barrier that it classically could not surmount. This plays an essential role in several physical phenomena, such as the nuclear fusion that occurs in main sequence stars like the Sun.[1] It has important applications in modern devices such as the tunnel diode,[2] quantum computing, and the scanning tunnelling microscope. The effect was predicted in the early 20th century and its acceptance as a general physical phenomenon came mid-century;[3] after attending a seminar by Gamow, Max Born recognised the generality of tunnelling. Tunnelling is often explained using the Heisenberg uncertainty principle and the wave–particle duality of matter. Pure quantum mechanical concepts are central to the phenomenon, so quantum tunnelling is one of the novel implications of quantum mechanics.
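The size of the effect can be estimated with the standard textbook transmission coefficient for a rectangular barrier (a result not derived in the excerpt above): T = [1 + V₀² sinh²(κL) / (4E(V₀ − E))]⁻¹ with κ = √(2m(V₀ − E))/ħ. A sketch for an electron (barrier height, energy, and widths are illustrative values):

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J s
m_e = 9.1093837015e-31  # electron mass, kg
eV = 1.602176634e-19    # electron-volt in joules

def transmission(E, V0, L):
    """Transmission through a rectangular barrier of height V0 and width L (E < V0)."""
    kappa = math.sqrt(2 * m_e * (V0 - E)) / hbar  # decay constant inside the barrier
    s = math.sinh(kappa * L)
    return 1.0 / (1.0 + (V0**2 * s**2) / (4 * E * (V0 - E)))

# A 1 eV electron meeting a 2 eV barrier: classically forbidden, yet T > 0.
T1 = transmission(1 * eV, 2 * eV, 1e-9)  # 1 nm wide barrier
T2 = transmission(1 * eV, 2 * eV, 2e-9)  # doubling the width suppresses T
print(T1, T2)  # T falls off roughly exponentially with barrier width
```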

Wave function

A wave function in quantum mechanics describes the quantum state of a system; its values are typically complex numbers. However, complex numbers are not necessarily used in all treatments. Louis de Broglie in his later years proposed a real-valued wave function connected to the complex wave function by a proportionality constant and developed the de Broglie–Bohm theory. The unit of measurement for ψ depends on the system. For one particle in three dimensions, its units are [length]^−3/2. These unusual units are required so that an integral of |ψ|² over a region of three-dimensional space is a unitless probability (the probability that the particle is in that region). In the 1920s and 1930s, quantum mechanics was developed using calculus and linear algebra. Functional analysis is commonly used to formulate the wave function with the necessary mathematical precision; wave functions are usually square-integrable functions (at least locally), because this is compatible with the Hilbert space formalism.
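The role of the units can be seen in one dimension, where ψ carries units of [length]^−1/2 (the 1D analogue of the [length]^−3/2 quoted above for three dimensions), so that |ψ|² dx is a pure probability. A sketch using the ground state of a particle in a box (the box width is an illustrative value):

```python
import numpy as np

# Ground state of a particle in a 1D box of width L: psi(x) = sqrt(2/L) sin(pi x/L).
L = 2.0e-10                      # box width in metres (illustrative)
N = 100000
x = np.linspace(0.0, L, N, endpoint=False)
dx = L / N
psi = np.sqrt(2.0 / L) * np.sin(np.pi * x / L)  # units: [length]^(-1/2)

# Riemann sums for the probability integrals of |psi|^2.
total = np.sum(np.abs(psi)**2) * dx           # whole box
left = np.sum(np.abs(psi[: N // 2])**2) * dx  # left half of the box

print(total)  # ~1.0: a dimensionless probability; the particle is somewhere
print(left)   # ~0.5: by symmetry, half the probability in each half
```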

Related: QUANTUM PHYSICS