
Physics


Marine Gravity from Satellite Altimetry. Data on slight variations in the pull of gravity over the oceans are recorded with satellite altimetry and then combined to map the seafloor globally. The marine gravity map is distributed as global data grids, Google Earth overlays, and a GPlates web visualization (reference: Sandwell, D. T., et al.).

GPlates Web Visualization. These cloud-based tools are provided courtesy of Dietmar Müller on the GPlates Web Portal and require a WebGL-enabled browser. Navigating the visualization:
- left click + drag: rotates the camera around the globe (3D View) or translates the camera across the map surface (2D, Columbus View)
- right click + drag: zooms the camera in and out
- middle wheel scrolling: also zooms the camera in and out
- middle click + drag: rotates the camera around the point on the surface of the globe (3D View)
Take a Tour of the Seafloor: marine gravity model of the North Atlantic (10 mGal contours).

Marine Gravity from Space Enables Discovery aboard Ships. Molecular Mysticism. Shape of the Universe. The shape of the universe is the local and global geometry of the universe, in terms of both curvature and topology (though, strictly speaking, it goes beyond both). When physicists describe the universe as being flat or nearly flat, they're talking about geometry: how space and time are warped according to general relativity.

When they talk about whether it is open or closed, they're referring to its topology.[1] Although the shape of the universe is still a matter of debate in physical cosmology, based on the recent Wilkinson Microwave Anisotropy Probe (WMAP) measurements, "We now know that the universe is flat with only a 0.4% margin of error", according to NASA scientists.[2] Theorists have been trying to construct a formal mathematical model of the shape of the universe. In formal terms, this is a 3-manifold model corresponding to the spatial section (in comoving coordinates) of the 4-dimensional space-time of the universe.
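One way to read that flatness figure (a sketch in standard Friedmann–Robertson–Walker notation; the symbols Ωtotal and Ωk below are my assumptions, not taken from the excerpt): spatial curvature is usually tracked by the curvature density parameter

    Ωk = 1 − Ωtotal,

where Ωk = 0 corresponds to a flat universe, Ωk > 0 to an open (negatively curved) universe, and Ωk < 0 to a closed (positively curved) one. Read this way, "flat with only a 0.4% margin of error" corresponds roughly to |Ωtotal − 1| ≲ 0.004.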

Ekpyrotic universe. The ekpyrotic universe, or ekpyrotic scenario, is a cosmological model of the origin and shape of the universe. The name comes from a Stoic term ekpyrosis (Ancient Greek ἐκπύρωσις ekpurōsis) meaning conflagration or in Stoic usage "conversion into fire".[1] The ekpyrotic model of the universe is an alternative to the standard cosmic inflation model for the very early universe; both models accommodate the standard Big Bang Lambda-CDM model of our universe.[2][3] The ekpyrotic model is a precursor to, and part of, some cyclic models. The ekpyrotic model came out of work by Neil Turok and Paul Steinhardt and maintains that the universe did not start in a singularity, but came about from the collision of two branes.

This collision avoids the primordial singularity and the superluminal expansion of spacetime while preserving nearly scale-free density fluctuations and other features of the observed universe. Ghosting Energy: in (masking) radiation, electromagnetic waves travel backwards. This figure shows backward power flow lines at 21 GHz.

Credit: Cesar Monzon. (PhysOrg.com) -- Typically, electromagnetic waves travel away from their sources. For instance, a radar system emits radio waves that travel all the way to a target, such as a car or plane, before being reflected back to the source. Police officers and the military rely on the forward movement of the waves to determine the speed or location of an object. So if electromagnetic radiation started flowing in reverse, back toward its source, it could cause serious confusion. “I have been working in diverse aspects of electromagnetics for a number of years, and I just happened to stumble on some solutions of Maxwell’s equations that exhibited a very unusual behavior,” Monzon told PhysOrg.com.

In Monzon’s theoretical set-up, a number of identical electromagnetic wave sources are aligned in a row. But the fact that otherwise ordinary waves flow backwards here results in some unusual effects. Quintessence (physics). In physics, quintessence is a hypothetical form of dark energy postulated as an explanation of the observed accelerating rate of expansion of the Universe, announced in 1998. It has been proposed by some physicists to be a fifth fundamental force. Quintessence differs from the cosmological constant explanation of dark energy in that it is dynamic: it changes over time, unlike the cosmological constant, which by definition does not. It is suggested that quintessence can be either attractive or repulsive depending on the ratio of its kinetic and potential energy.
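A minimal numerical sketch of that dependence, assuming the standard scalar-field form of the equation-of-state parameter, wq = (kinetic − potential)/(kinetic + potential); the function name and the sample values are purely illustrative:

    # Equation-of-state parameter of a homogeneous scalar field:
    # wq = (kinetic - potential) / (kinetic + potential),
    # with kinetic = 0.5 * phi_dot**2 and potential = V, the value of V(phi).
    def w_q(phi_dot, V):
        kinetic = 0.5 * phi_dot ** 2
        return (kinetic - V) / (kinetic + V)

    print(w_q(0.0, 1.0))  # -1.0: potential-dominated, acts like a cosmological constant
    print(w_q(2.0, 1.0))  # about +0.33: kinetic-dominated, not repulsive

Acceleration (repulsion) requires wq < −1/3, so the field acts repulsively only while its potential energy dominates its kinetic energy.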

Specifically, it is thought that quintessence became repulsive about ten billion years ago (the universe is approximately 13.8 billion years old).[1] The equation-of-state parameter of quintessence, wq, is given by its potential energy and a kinetic term: wq = (½(dφ/dt)² − V(φ)) / (½(dφ/dt)² + V(φ)). Hence, quintessence is dynamic, and generally has a density and a wq parameter that vary with time. References: Christopher Wanjek, "Quintessence, accelerating the Universe?"; Ostriker JP, Steinhardt P (January 2001). Diffeomorphism. Figure: the image of a rectangular grid on a square under a diffeomorphism from the square onto itself. Definition. Given two manifolds M and N, a differentiable map f : M → N is called a diffeomorphism if it is a bijection and its inverse f−1 : N → M is differentiable as well. If these functions are r times continuously differentiable, f is called a Cr-diffeomorphism. Two manifolds M and N are diffeomorphic (the symbol usually being ≃) if there is a diffeomorphism f from M to N.

They are Cr diffeomorphic if there is an r times continuously differentiable bijective map between them whose inverse is also r times continuously differentiable. Diffeomorphisms of subsets of manifolds. Given a subset X of a manifold M and a subset Y of a manifold N, a function f : X → Y is said to be smooth if for all p in X there is a neighborhood U ⊂ M of p and a smooth function g : U → N such that the restrictions agree (note that g is an extension of f).
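A standard worked example of why the differentiable inverse matters (not part of the excerpt): the map f : ℝ → ℝ, f(x) = x³, is a smooth bijection, but its inverse, the cube-root function, is not differentiable at 0, so f is a homeomorphism yet not a diffeomorphism.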

Non-orientable wormhole. In topology, this sort of connection is referred to as an Alice handle. Matt Visser has described a way of visualising wormhole geometry: take a "normal" region of space; "surgically remove" spherical volumes from two regions of that space ("spacetime surgery"); then associate the two spherical bleeding edges, so that a line attempting to enter one "missing" spherical volume encounters one bounding surface and then continues outward from the other. For a "conventional" wormhole, the network of points will be seen at the second surface to be inverted, as if one surface were the mirror image of the other: countries will appear back-to-front, as will any text written on the map. The alternative, "reversed" way of connecting the surfaces makes the "connection map" appear the same at both mouths.

This configuration reverses the "handedness" or "chirality" of any objects passing through. Casimir pressure. Casimir pressure is created by the Casimir force of virtual particles. According to experiments, the Casimir force between two closely spaced neutral parallel plate conductors is directly proportional to their surface area. Therefore, dividing the magnitude of the Casimir force by the area of each conductor gives the Casimir pressure.
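For reference, the standard ideal result for perfectly conducting parallel plates (a textbook expression; the plate separation a is my notation, not defined in the excerpt) is

    P = F/A = −π²ħc / (240 a⁴),

where the negative sign encodes the attraction discussed next.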

Because the Casimir force between the conductors is attractive, the Casimir pressure in the space between them is negative. Because virtual particles are physical representations of the zero-point energy of the physical vacuum, the Casimir pressure is the difference in the density of the zero-point energy of empty space inside and outside the cavity formed by the conducting plates. Cabibbo–Kobayashi–Maskawa matrix. Figure: a pictorial representation of the six quarks' decay modes, with mass increasing from left to right. In 1963, Nicola Cabibbo introduced the Cabibbo angle (θC) to preserve the universality of the weak interaction.[1] Cabibbo was inspired by previous work by Murray Gell-Mann and Maurice Lévy[2] on the effectively rotated nonstrange and strange vector and axial weak currents, which he references.[3] In light of current knowledge (quarks were not yet theorized), the Cabibbo angle is related to the relative probability that down and strange quarks decay into up quarks (|Vud|² and |Vus|² respectively).
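As a quick numerical illustration (a sketch only; the magnitudes below are approximate, PDG-style values and are my assumptions, not figures from the excerpt), the Cabibbo angle can be recovered from |Vud| and |Vus|:

    import math

    # Approximate CKM magnitudes (illustrative values, close to published fits).
    V_ud = 0.974
    V_us = 0.225

    # tan(theta_C) = |V_us| / |V_ud|
    theta_C = math.degrees(math.atan2(V_us, V_ud))
    print(f"Cabibbo angle ≈ {theta_C:.2f} degrees")  # about 13 degrees

The result is consistent with the θC = 13.02° value quoted below.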

In particle physics parlance, the object that couples to the up quark via the charged-current weak interaction is a superposition of down-type quarks, here denoted by d′.[4] Mathematically this is d′ = Vud·d + Vus·s, or, using the Cabibbo angle, d′ = cos(θC)·d + sin(θC)·s. Using the currently accepted values for |Vud| and |Vus|, the Cabibbo angle can be calculated from tan(θC) = |Vus| / |Vud|, giving θC = 13.02°. (In the Wolfenstein parameterization of the full matrix, the expansion parameter λ equals s12, the sine of this mixing angle.) Pontecorvo–Maki–Nakagawa–Sakata matrix. In particle physics, the Pontecorvo–Maki–Nakagawa–Sakata matrix (PMNS matrix), Maki–Nakagawa–Sakata matrix (MNS matrix), lepton mixing matrix, or neutrino mixing matrix is a unitary matrix[note 1] which contains information on the mismatch of quantum states of leptons when they propagate freely and when they take part in the weak interactions.

It is important in the understanding of neutrino oscillations. This matrix was introduced in 1962 by Ziro Maki, Masami Nakagawa and Shoichi Sakata[1] to explain the neutrino oscillations predicted by Bruno Pontecorvo.[2][3] The matrix. The flavor and mass eigenstates are related by να = Σi Uαi·νi (α = e, μ, τ; i = 1, 2, 3): on the left are the neutrino fields participating in the weak interaction, and on the right is the PMNS matrix U acting on a vector of the neutrino fields that diagonalize the neutrino mass matrix. The PMNS matrix describes the probability of a neutrino of given flavor α being found in mass eigenstate i; these probabilities are proportional to |Uαi|². Based on data from 28 June 2012, the mixing angles are:[7]

Fredkin finite nature hypothesis. In digital physics, the Fredkin Finite Nature Hypothesis states that ultimately all quantities of physics, including space and time, are discrete and finite. All measurable physical quantities arise from some Planck-scale substrate for multiverse information processing. Also, the amount of information in any small volume of spacetime will be finite and equal to one of a small number of possibilities.[1] Conceptions. Stephen Wolfram, in A New Kind of Science (Chapter 9), considered the possibility that energy and spacetime might be secondary derivations from an informational substrate underlying the Planck scale.

Fredkin's "Finite Nature" and Wolfram's ideas on the foundations of physics might be relevant to unsolved problems in physics. Fredkin's ideas on inertia. According to Fredkin, "the computational substrate of quantum mechanics must have access to some sort of metric to create inertial motion." Scharnhorst effect. The Scharnhorst effect is a hypothetical phenomenon in which light signals travel faster than c between two closely spaced conducting plates. It was predicted by Klaus Scharnhorst of the Humboldt University of Berlin, Germany, and Gabriel Barton of the University of Sussex in Brighton, England. Using quantum electrodynamics, they showed that the effective refractive index, at low frequencies, in the space between the plates is less than 1 (which by itself does not imply superluminal signaling).
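To spell out that parenthetical (a short note in standard notation, not part of the source text): a refractive index n < 1 only means the phase velocity vphase = c/n exceeds c. Phase velocity does not carry information, so this alone is compatible with causality; a genuine violation would require the signal (wavefront) velocity itself to exceed c, which is the question addressed next.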

They were not able to show that the wavefront velocity exceeds c (which would imply superluminal signaling) but argued that it is plausible.[1] Explanation. Owing to Heisenberg's uncertainty principle, empty space that appears to be a true vacuum is actually filled with virtual subatomic particles, called vacuum fluctuations. As a photon travels through a vacuum it interacts with these virtual particles and can be absorbed by them to give rise to a virtual electron-positron pair. Technicolor (physics). Technicolor theories are models of physics beyond the Standard Model that address electroweak gauge symmetry breaking, the mechanism through which the W and Z bosons acquire masses. Early technicolor theories were modelled on quantum chromodynamics (QCD), the "color" theory of the strong nuclear force, which inspired their name.

In order to produce quark and lepton masses, technicolor has to be "extended" by additional gauge interactions. Particularly when modelled on QCD, extended technicolor is challenged by experimental constraints on flavor-changing neutral currents and by precision electroweak measurements. The dynamics of extended technicolor are not known. Much technicolor research focuses on exploring strongly interacting gauge theories other than QCD in order to evade some of these challenges. The mechanism for the breaking of electroweak gauge symmetry in the Standard Model of elementary particle interactions remains unknown.

Digital physics. Digital physics is grounded in one or more hypotheses about the universe, or reality, which can be ranked in order of decreasing strength. History. The hypothesis that the universe is a digital computer was pioneered by Konrad Zuse in his book Rechnender Raum (translated into English as Calculating Space). The term digital physics was first employed by Edward Fredkin, who later came to prefer the term digital philosophy.[3] Others who have modeled the universe as a giant computer include Stephen Wolfram,[4] Juergen Schmidhuber,[5] and Nobel laureate Gerard 't Hooft.[6] These authors hold that the apparently probabilistic nature of quantum physics is not necessarily incompatible with the notion of computability. Quantum versions of digital physics have recently been proposed by Seth Lloyd,[7] David Deutsch, and Paola Zizzi.[8] Overview. Some try to identify single physical particles with simple bits.
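As a purely illustrative toy (my own minimal sketch of the "bits under a simple local rule" idea; it is not a model proposed by Zuse, Fredkin, or Wolfram), a one-dimensional cellular automaton evolves a row of bits by a fixed local update:

    # Minimal 1D cellular automaton (Rule 90: each cell becomes the XOR of its
    # two neighbours, with periodic boundary conditions).
    def step(cells):
        n = len(cells)
        return [cells[(i - 1) % n] ^ cells[(i + 1) % n] for i in range(n)]

    cells = [0] * 31
    cells[15] = 1                       # a single "on" bit in the middle
    for _ in range(16):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells)

Everything about the evolving pattern is fixed by a finite set of bits and a finite update rule, which is the flavor of discreteness the hypotheses above appeal to.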

Weizsäcker's ur-alternatives. Wheeler's "it from bit".

General Relativity

Not Even Wrong: Peter Woit's blog on General Physics. Graham Farmelo has posted a very interesting interview he did with Witten last year, as part of his promotion of his forthcoming book The Universe Speaks in Numbers. One surprising thing I learned from the interview is that Witten learned calculus when he was 11 (this would have been 1962). He quite liked that, but then lost interest in math for many years, since no one gave him more advanced material to study.

After years of studying non-math/physics subjects and doing things like working on the 1972 McGovern campaign, he finally realized that physics and math were where his talents lay. He ended up doing a Ph.D. at Princeton with David Gross, starting work with him just months after the huge breakthrough of asymptotic freedom, which put in place the final main piece of the Standard Model. If only back in 1962 someone had told Witten about linear algebra and quantum mechanics, the entire history of the subject could have been quite different. About the landscape:

Lorentz group.

Pontryagin duality. Homogeneity (physics). Phase space.