
Digital physics
Digital physics is grounded in one or more hypotheses, of varying strength, that the universe, or reality, is fundamentally informational or computable.

History
The hypothesis that the universe is a digital computer was pioneered by Konrad Zuse in his book Rechnender Raum (translated into English as Calculating Space). Related ideas include Carl Friedrich von Weizsäcker's binary theory of ur-alternatives, pancomputationalism, computational universe theory, John Archibald Wheeler's "it from bit", and Max Tegmark's ultimate ensemble.

Overview
Digital physics suggests that there exists, at least in principle, a program for a universal computer which computes the evolution of the universe. Some try to identify single physical particles with simple bits. Loop quantum gravity could lend support to digital physics, in that it assumes space-time is quantized.
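The notion of a simple program whose repeated application generates complex evolution can be made concrete with a one-dimensional cellular automaton. The sketch below uses Rule 110, which is known to be Turing complete; it is a toy illustration of computation-as-evolution, not a model of any physical theory, and all names and parameters are our own choices.

```python
# Toy illustration of the "universe as computation" idea: a one-dimensional
# cellular automaton (Rule 110, known to be Turing complete) evolving a row
# of bits by a fixed local rule.

RULE = 110  # the rule number encodes the update table in its binary digits

def step(cells):
    """Apply one synchronous Rule 110 update to a list of 0/1 cells (wrapping)."""
    n = len(cells)
    out = []
    for i in range(n):
        left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighbourhood = (left << 2) | (centre << 1) | right  # value 0..7
        out.append((RULE >> neighbourhood) & 1)
    return out

def evolve(cells, steps):
    """Iterate the local rule; complex structure emerges from a simple program."""
    for _ in range(steps):
        cells = step(cells)
    return cells

# A single live cell seeds increasingly complex structure:
row = [0] * 16
row[8] = 1
print(evolve(row, 5))
```

A single fixed update table, applied everywhere at once, is the whole "program"; this is the sense in which digital physics imagines physical evolution as rule application on an informational substrate.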

Fredkin finite nature hypothesis
In digital physics, the Fredkin finite nature hypothesis states that ultimately all quantities of physics, including space and time, are discrete and finite. All measurable physical quantities arise from some Planck-scale substrate for multiverse information processing, and the amount of information in any small volume of spacetime is finite, corresponding to a small number of possibilities.[1]

Conceptions
Stephen Wolfram, in A New Kind of Science (Chapter 9), considered the possibility that energy and spacetime might be secondary derivations from an informational substrate underlying the Planck scale.

Fredkin's ideas on inertia
According to Fredkin, "the computational substrate of quantum mechanics must have access to some sort of metric to create inertial motion."

Turing completeness
In computability theory, a system of data-manipulation rules (such as a computer's instruction set, a programming language, or a cellular automaton) is said to be Turing complete or computationally universal if it can be used to simulate any single-taped Turing machine. The concept is named after Alan Turing. A classic example is the lambda calculus. Computability theory includes the closely related concept of Turing equivalence. In colloquial usage, the terms "Turing complete" or "Turing equivalent" are used to mean that any real-world general-purpose computer or computer language can approximately simulate any other real-world general-purpose computer or computer language. To show that something is Turing complete, it is enough to show that it can be used to simulate some Turing-complete system.

Formal definitions
Turing completeness: A computational system that can compute every Turing-computable function is called Turing complete (or Turing powerful).
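What it means to "simulate a single-taped Turing machine" can be sketched directly. In the snippet below, the simulator, the state names, the tape encoding, and the binary-increment example machine are all illustrative choices of ours, not from the source:

```python
# A minimal single-tape Turing machine simulator. A machine is a transition
# table mapping (state, symbol) -> (symbol to write, head move, next state).
# The example machine increments a binary number whose least-significant
# bit sits under the head.

def run(transitions, tape, pos=0, state="start", blank="_", max_steps=1000):
    """Run a Turing machine; the tape is stored sparsely as a dict."""
    tape = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(pos, blank)
        write, move, state = transitions[(state, symbol)]
        tape[pos] = write
        pos += move  # move is -1 (left), 0 (stay), or +1 (right)
    cells = [tape.get(i, blank) for i in range(min(tape), max(tape) + 1)]
    return "".join(cells).strip("_")

INCREMENT = {
    ("start", "1"): ("0", -1, "start"),  # 1 + carry -> 0, carry propagates left
    ("start", "0"): ("1", 0, "halt"),    # 0 + carry -> 1, done
    ("start", "_"): ("1", 0, "halt"),    # ran off the left edge: new high bit
}

print(run(INCREMENT, "011", pos=2))  # prints "100" (binary 3 + 1 = 4)
```

A system is shown to be Turing complete by exhibiting such a simulation within it; conversely, the few lines above show how little machinery the definition itself requires.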

Mathematical universe hypothesis
In physics and cosmology, the mathematical universe hypothesis (MUH), also known as the Ultimate Ensemble, is a speculative "theory of everything" (TOE) proposed by the cosmologist Max Tegmark.[1][2]

Description
Tegmark's mathematical universe hypothesis is: Our external physical reality is a mathematical structure. That is, the physical universe is mathematics in a well-defined sense, and "in those [worlds] complex enough to contain self-aware substructures [they] will subjectively perceive themselves as existing in a physically 'real' world".[3][4] The hypothesis suggests that worlds corresponding to different sets of initial conditions, physical constants, or altogether different equations may be considered equally real. Tegmark claims that the hypothesis has no free parameters and is not observationally ruled out. It is related to the anthropic principle and to Tegmark's categorization of four levels of the multiverse.[6]

Non-orientable wormhole
In topology, this sort of connection is referred to as an Alice handle.

Theory
"Normal" wormhole connection: Matt Visser has described a way of visualising wormhole geometry:
- take a "normal" region of space;
- "surgically remove" spherical volumes from two regions of that space ("spacetime surgery");
- associate the two spherical bleeding edges, so that a line attempting to enter one "missing" spherical volume encounters one bounding surface and then continues outward from the other.
For a "conventional" wormhole, the network of points will be seen at the second surface to be inverted, as if one surface was the mirror image of the other: countries will appear back-to-front, as will any text written on the map.

"Reversed" wormhole connection: The alternative way of connecting the surfaces makes the "connection map" appear the same at both mouths. This configuration reverses the "handedness" or "chirality" of any objects passing through.
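The chirality reversal can be phrased in linear-algebra terms: a traversal that flips handedness acts on a local frame like a reflection (an orthogonal map with determinant -1), while an ordinary traversal acts like a rotation (determinant +1). A small sketch, with example matrices of our own choosing:

```python
# Orientation is tracked by the sign of a determinant: rotations preserve
# handedness (det = +1), reflections reverse it (det = -1).

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

rotation = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]   # 90 degrees about z
mirror   = [[-1, 0, 0], [0, 1, 0], [0, 0, 1]]   # reflect the x axis

print(det3(rotation))  # 1: a right-handed frame stays right-handed
print(det3(mirror))    # -1: handedness reversed

# Two successive reversing traversals restore the original handedness,
# since det(M * M) = det(M)**2 = 1.
```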

Lamb–Oseen vortex
In fluid dynamics, the Lamb–Oseen vortex models a line vortex that decays due to viscosity. This vortex is named after Horace Lamb and Carl Wilhelm Oseen.[1]

[Figure: vector plot of the Lamb–Oseen vortex]

The mathematical model for the flow velocity in the circumferential (θ) direction in the Lamb–Oseen vortex is

    v_θ(r, t) = Γ / (2πr) · (1 − exp(−r² / (4νt))),

where r is the radius, ν is the kinematic viscosity, and Γ is the total circulation. The radial velocity is equal to zero. An alternative definition is to use the peak tangential velocity V of the vortex rather than the total circulation:

    v_θ(r) = V · (1 + 1/(2α)) · (r_V / r) · (1 − exp(−α r² / r_V²)),

where r_V is the radius at which V is attained, and the number α = 1.25643; see Devenport et al.[2] The pressure field simply ensures the vortex rotates in the circumferential direction, providing the centripetal force:

    ∂p/∂r = ρ v_θ² / r,

where ρ is the constant density.[3]

Church–Turing thesis
Several independent attempts were made in the first half of the 20th century to formalize the notion of computability:
- American mathematician Alonzo Church created a method for defining functions called the λ-calculus;
- British mathematician Alan Turing created a theoretical model for machines, now called Turing machines, that could carry out calculations from inputs;
- Austrian-American mathematician Kurt Gödel, with Jacques Herbrand, created a formal definition of a class of functions whose values could be calculated by recursion.
All three computational processes (recursion, the λ-calculus, and the Turing machine) were shown to be equivalent: all three approaches define the same class of functions.[2][3] This has led mathematicians and computer scientists to believe that the concept of computability is accurately characterized by these three equivalent processes.

Formal statement
The thesis can be stated as follows: Every effectively calculable function is a computable function.[8]
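The equivalence of the formalisms can be illustrated in miniature. The sketch below is our own toy encoding, not from the source: it defines addition of natural numbers both by recursion and via Church numerals in the λ-calculus style, and checks that the two definitions agree.

```python
# Two of the three formalisms, side by side, computing the same function.

# 1) Recursion-style addition: add(m, n) = m if n == 0 else add(m, n-1) + 1.
def add_rec(m, n):
    return m if n == 0 else add_rec(m, n - 1) + 1

# 2) Church numerals: the number n is the higher-order function that
#    applies f to x exactly n times.
def church(n):
    return lambda f: lambda x: x if n == 0 else f(church(n - 1)(f)(x))

def unchurch(c):
    """Recover an ordinary integer by counting applications of successor."""
    return c(lambda k: k + 1)(0)

# Addition in the lambda calculus: (m + n) f x = m f (n f x).
add_church = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

print(add_rec(2, 3), unchurch(add_church(church(2))(church(3))))  # 5 5
```

That both definitions yield the same function on every input is a tiny instance of the equivalence proofs mentioned above.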

Holographic principle
In a larger sense, the theory suggests that the entire universe can be seen as a two-dimensional information structure "painted" on the cosmological horizon, such that the three dimensions we observe are an effective description only at macroscopic scales and at low energies. Cosmological holography has not been made mathematically precise, partly because the particle horizon has a finite area and grows with time.[4][5]

The holographic principle was inspired by black hole thermodynamics, which conjectures that the maximal entropy in any region scales with the radius squared, and not cubed as might be expected. In the case of a black hole, the insight was that the informational content of all the objects that have fallen into the hole might be entirely contained in surface fluctuations of the event horizon.

Black hole entropy
An object with entropy is microscopically random, like a hot gas.
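The area (rather than volume) scaling of black-hole entropy can be made concrete with the Bekenstein–Hawking formula S = k_B · A / (4 l_p²), where A is the horizon area and l_p the Planck length. The sketch below uses rounded SI constants and a solar-mass example; it is an order-of-magnitude illustration only.

```python
import math

# Bekenstein-Hawking entropy of a Schwarzschild black hole, in units of k_B.
# Rounded CODATA-style constants in SI units.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s
M_sun = 1.989e30   # solar mass, kg

def horizon_entropy_in_kB(mass):
    """S / k_B = A / (4 * l_p^2): entropy scales with horizon area."""
    r_s = 2 * G * mass / c**2       # Schwarzschild radius
    area = 4 * math.pi * r_s**2     # horizon area
    l_p2 = hbar * G / c**3          # Planck length squared
    return area / (4 * l_p2)

print(f"{horizon_entropy_in_kB(M_sun):.1e}")  # roughly 1e77
```

Because the result depends on the radius squared through the area, doubling the mass quadruples the entropy: exactly the area scaling, not the volume scaling, that motivated the holographic principle.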

Scharnhorst effect
The Scharnhorst effect is a hypothetical phenomenon in which light signals travel faster than c between two closely spaced conducting plates. It was predicted by Klaus Scharnhorst of the Humboldt University of Berlin, Germany, and Gabriel Barton of the University of Sussex in Brighton, England. Using quantum electrodynamics, they showed that the effective refractive index, at low frequencies, in the space between the plates is less than 1 (which by itself does not imply superluminal signaling). They were not able to show that the wavefront velocity exceeds c (which would imply superluminal signaling) but argued that it is plausible.[1]

Explanation
Owing to Heisenberg's uncertainty principle, an empty space which appears to be a true vacuum is actually filled with virtual subatomic particles. The effect, however, is predicted to be minuscule.
