
Digital physics

Digital physics is grounded in one or more of the following hypotheses, listed in order of decreasing strength. The universe, or reality:

- is essentially informational;
- is essentially computable;
- can be described digitally;
- is in essence digital;
- is itself a computer;
- is the output of a simulated reality exercise.

History

The hypothesis that the universe is a digital computer was pioneered by Konrad Zuse in his book Rechnender Raum (translated into English as Calculating Space). Related ideas include Carl Friedrich von Weizsäcker's binary theory of ur-alternatives, pancomputationalism, computational universe theory, John Archibald Wheeler's "it from bit", and Max Tegmark's ultimate ensemble.

Overview

Digital physics suggests that there exists, at least in principle, a program for a universal computer which computes the evolution of the universe. Some try to identify single physical particles with simple bits. Loop quantum gravity could lend support to digital physics, in that it assumes space-time is quantized.

Non-orientable wormhole

A non-orientable wormhole is a wormhole connection that appears to reverse the chirality of anything passed through it. In topology, this sort of connection is referred to as an Alice handle.

Theory

"Normal" wormhole connection

Matt Visser has described a way of visualising wormhole geometry:

- take a "normal" region of space;
- "surgically remove" spherical volumes from two regions ("spacetime surgery");
- associate the two spherical "bleeding edges" with each other, so that a line attempting to enter one "missing" spherical volume encounters one bounding surface and then continues outward from the other.

For a "conventional" wormhole, the network of points will be seen at the second surface to be inverted, as if one surface were the mirror image of the other: countries will appear back-to-front, as will any text written on the map.

"Reversed" wormhole connection

The alternative way of connecting the surfaces makes the "connection map" appear the same at both mouths. This configuration reverses the "handedness" or "chirality" of any objects passing through.
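The chirality reversal described above can be illustrated with elementary linear algebra (this example is not from the article): a "reversed" connection acts on passing objects like an improper rotation, i.e. a linear map with determinant −1, which turns a right-handed coordinate frame into a left-handed one.

```python
# Illustration only: a mirror reflection (determinant -1) reverses the
# handedness of a coordinate frame, just as a "reversed" wormhole connection
# reverses the chirality of objects passing through it.

def det3(m):
    """Determinant of a 3x3 matrix given as a list of row lists."""
    a, b, c = m
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - a[1] * (b[0] * c[2] - b[2] * c[0])
            + a[2] * (b[0] * c[1] - b[1] * c[0]))

def apply(m, v):
    """Apply a 3x3 matrix to a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

# A mirror reflection through the y-z plane: x -> -x.
mirror = [[-1, 0, 0], [0, 1, 0], [0, 0, 1]]

# A right-handed frame (the standard basis) ...
frame = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(det3(frame))      # 1: right-handed

# ... becomes left-handed after passing through the reflection.
mirrored = [apply(mirror, v) for v in frame]
print(det3(mirrored))   # -1: handedness reversed
```

A second pass through the same reflection restores the original handedness, which is why only an odd number of traversals of such a connection is detectable.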

Zuse's Thesis - Zuse hypothesis - Algorithmic Theory of Everything - Digital Physics, Rechnender Raum (Computing Space, Computing Cosmos) - Computable Universe - The Universe is a Computer - Theory of Everything

Konrad Zuse (1910-1995; pronounced "Conrud Tsoosay") not only built the first programmable computers (1935-1941) and devised the first higher-level programming language (1945), but was also the first to suggest (in 1967) that the entire universe is being computed on a computer, possibly a cellular automaton (CA). He referred to this as "Rechnender Raum" (Computing Space or Computing Cosmos). Many years later, similar ideas were also published, popularized, and extended by Edward Fredkin (1980s), Jürgen Schmidhuber (1990s; see overview), and more recently Stephen Wolfram (2002) (see comments and Edwin Clark's review page). Zuse's first paper on digital physics and CA-based universes was "Rechnender Raum" (Elektronische Datenverarbeitung, 1967). Zuse is careful: on page 337 he writes that at the moment we do not have full digital models of physics, but that does not prevent him from asking, right there, what the consequences of a total discretization of all natural laws would be.
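To make Zuse's cellular-automaton picture concrete, here is a minimal elementary CA in Python (a generic sketch, not Zuse's own model): the entire state of the "world" is a row of bits, and a fixed local rule computes the next state. Rule 110, used here, is known to be Turing-complete, so even this toy substrate can in principle compute anything.

```python
# Minimal elementary cellular automaton in the spirit of "Rechnender Raum".
# Each cell's next value depends only on itself and its two neighbours.

def step(cells, rule=110):
    """Advance one generation; cells is a list of 0/1, wrapped cyclically."""
    n = len(cells)
    out = []
    for i in range(n):
        left, mid, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (mid << 1) | right  # value 0..7
        out.append((rule >> neighborhood) & 1)           # look up rule bit
    return out

# Start from a single live cell and evolve a few generations.
world = [0] * 15
world[7] = 1
for _ in range(5):
    print("".join(".#"[c] for c in world))
    world = step(world)
```

The rule number's binary digits directly encode the lookup table: bit k of 110 gives the successor of the neighbourhood whose three cells spell k in binary.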

Scharnhorst effect

The Scharnhorst effect is a hypothetical phenomenon in which light signals travel faster than c between two closely spaced conducting plates. It was predicted by Klaus Scharnhorst of the Humboldt University of Berlin, Germany, and Gabriel Barton of the University of Sussex in Brighton, England. Using quantum electrodynamics, they showed that the effective refractive index in the space between the plates, at low frequencies, is less than 1 (which by itself does not imply superluminal signalling). They were not able to show that the wavefront velocity exceeds c (which would imply superluminal signalling), but argued that it is plausible.[1]

Explanation

Owing to Heisenberg's uncertainty principle, an empty space which appears to be a true vacuum is actually filled with virtual subatomic particles. The effect, however, is predicted to be minuscule.
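The connection between an effective refractive index below 1 and a superluminal phase velocity is simple arithmetic, sketched below. The 1-part-in-10^36 deviation used here is the order of magnitude commonly quoted for plates roughly a micrometre apart; it is an assumed input in this illustration, not a derived result. Exact rational arithmetic is used because the deviation is far below double-precision resolution.

```python
# Illustrative arithmetic only: with an effective refractive index n slightly
# below 1, the phase velocity v = c/n slightly exceeds c. The size of the
# deviation (1e-36) is an assumption for illustration, not a derived value.
from fractions import Fraction

c = Fraction(299_792_458)    # speed of light in vacuum, m/s (exact by definition)
delta = Fraction(1, 10**36)  # assumed deviation of the index from 1
n = 1 - delta                # effective index between the plates, n < 1

v = c / n                    # phase velocity between the plates
print(v > c)                 # True: nominally faster than c
print(float((v - c) / c))    # fractional excess, of order 1e-36
```

Note that a phase velocity above c does not by itself permit signalling; that is why the unresolved question about the wavefront velocity matters.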

Simulated reality

Simulated reality is the hypothesis that reality could be simulated, for example by computer simulation, to a degree indistinguishable from "true" reality. It could contain conscious minds which may or may not be fully aware that they are living inside a simulation. This is quite different from the current, technologically achievable concept of virtual reality: virtual reality is easily distinguished from the experience of actuality, and participants are never in doubt about the nature of what they experience. Simulated reality, by contrast, would be hard or impossible to separate from "true" reality. There has been much debate over this topic, ranging from philosophical discourse to practical applications in computing.

Types of simulation

Virtual people

In a virtual-people simulation, every inhabitant is a native of the simulated world.

Arguments

Simulation argument

Nick Bostrom's simulation argument holds that at least one of the following propositions is very likely to be true:

1. the human species is very likely to become extinct before reaching a "posthuman" stage;
2. any posthuman civilization is extremely unlikely to run a significant number of simulations of its evolutionary history;
3. we are almost certainly living in a computer simulation.
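The bookkeeping behind the simulation argument can be sketched with toy numbers (all figures below are assumptions for illustration): if simulated observers vastly outnumber non-simulated ones, a randomly chosen observer is almost certainly simulated.

```python
# Toy illustration of the simulation argument's counting step.
# All inputs are made-up numbers; only the ratio matters.

def simulated_fraction(real_civilizations, sims_per_civilization,
                       observers_per_world):
    """Fraction of all observers who live inside a simulation."""
    real = real_civilizations * observers_per_world
    simulated = real_civilizations * sims_per_civilization * observers_per_world
    return simulated / (real + simulated)

# If each posthuman civilization ran even 1000 ancestor-simulations,
# more than 99.9% of all observers would be simulated.
print(simulated_fraction(10, 1000, 10**9))
```

The force of the argument is that the fraction depends only on the number of simulations per civilization, not on the absolute population sizes.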

Lorentz group

The mathematical form of

- the kinematical laws of special relativity,
- Maxwell's field equations in the theory of electromagnetism, and
- the Dirac equation in the theory of the electron

is invariant under Lorentz transformations.

Basic properties

The Lorentz group is a subgroup of the Poincaré group, the group of all isometries of Minkowski spacetime. Mathematically, the Lorentz group may be described as the generalized orthogonal group O(1,3), the matrix Lie group which preserves the quadratic form (t, x, y, z) ↦ t² − x² − y² − z² on R4. The restricted Lorentz group arises in other ways in pure mathematics.

Connected components

Each of the four connected components can be categorized by which of two properties its elements have: whether they preserve spatial orientation, and whether they preserve the direction of time. Lorentz transformations which preserve the direction of time are called orthochronous. The subgroup of all Lorentz transformations preserving both orientation and the direction of time is called the proper, orthochronous Lorentz group or restricted Lorentz group, and is denoted by SO+(1, 3). The set of the four connected components can be given a group structure as the quotient group O(1,3)/SO+(1,3), which is isomorphic to the Klein four-group. A representative of the orientation-reversing components is the parity transformation P = diag(1, −1, −1, −1), where diag denotes a diagonal matrix with the given diagonal entries.
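The defining property of O(1,3) can be checked numerically. The sketch below (units with c = 1 assumed) applies a boost along the x-axis to an arbitrary event and verifies that the quadratic form t² − x² − y² − z² is preserved.

```python
# Numerical check that a Lorentz boost preserves the Minkowski quadratic form.
# Units with c = 1; the event coordinates are arbitrary test values.
import math

def boost_x(v):
    """Lorentz boost along the x-axis with velocity v (|v| < 1)."""
    g = 1.0 / math.sqrt(1.0 - v * v)  # Lorentz factor gamma
    return [[g, -g * v, 0, 0],
            [-g * v, g, 0, 0],
            [0, 0, 1, 0],
            [0, 0, 0, 1]]

def apply(m, x):
    """Apply a 4x4 matrix to a 4-vector (t, x, y, z)."""
    return [sum(m[i][j] * x[j] for j in range(4)) for i in range(4)]

def quadratic_form(x):
    t, x1, x2, x3 = x
    return t * t - x1 * x1 - x2 * x2 - x3 * x3

event = [2.0, 1.0, 0.5, -0.25]
boosted = apply(boost_x(0.6), event)
print(quadratic_form(event))
print(quadratic_form(boosted))  # equal up to floating-point rounding
```

Boosts have determinant +1 and preserve the direction of time, so they lie in the restricted group SO+(1, 3); composing with P = diag(1, −1, −1, −1) moves to another connected component.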

Fredkin finite nature hypothesis

In digital physics, the Fredkin finite nature hypothesis states that ultimately all quantities of physics, including space and time, are discrete and finite. All measurable physical quantities arise from some Planck-scale substrate for multiverse information processing. Also, the amount of information in any small volume of spacetime will be finite and equal to a small number of possibilities.[1]

Conceptions

Stephen Wolfram in A New Kind of Science, Chapter 9, considered the possibility that energy and spacetime might be secondary derivations from an informational substrate underlying the Planck scale.

Fredkin's ideas on inertia

According to Fredkin, "the computational substrate of quantum mechanics must have access to some sort of metric to create inertial motion."
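The claim that any finite volume holds only a finite amount of information can be made vivid with a back-of-the-envelope count. The one-bit-per-Planck-cell model below is a naive illustration, not Fredkin's own proposal; the Planck length value is the standard CODATA figure.

```python
# Naive illustration: if space were a grid of Planck-length cells each holding
# one bit, a finite volume would contain a finite (though enormous) number of
# bits. This counting model is an assumption for illustration only.

PLANCK_LENGTH = 1.616e-35  # metres (approximate CODATA value)

def max_bits(volume_m3):
    """Bits in a volume under the one-bit-per-Planck-cell assumption."""
    cell_volume = PLANCK_LENGTH ** 3
    return volume_m3 / cell_volume

# One cubic metre: finite, but about 2.4e104 bits.
print(f"{max_bits(1.0):.1e}")
```

Finite, astronomically large numbers like this are exactly what "discrete and finite" means in the hypothesis: huge, but not infinite.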

Homogeneity (physics)

The definition of homogeneous depends strongly on the context in which it is used. For example, a composite material is made up of different individual materials, known as "constituents" of the material, but may be defined as a homogeneous material when assigned a function. For example, asphalt paves our roads, but it is a composite material consisting of asphalt binder and mineral aggregate, laid down in layers and compacted. In another context, a material is not homogeneous insofar as it is composed of atoms and molecules. A few other instances of context are:

- Dimensional homogeneity (see below): the quality of an equation having quantities of the same units on both sides.
- Homogeneity in space, which implies conservation of momentum; homogeneity in time, which implies conservation of energy.
- In the context of composite metals, an example is an alloy.

Homogeneity also plays a role in cosmology: fundamental laws of physics should not (explicitly) depend on position in space.

Translational invariance
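Dimensional homogeneity can be checked mechanically. The sketch below (an illustrative example, not from the article) represents a dimension as a map from base units to integer exponents and verifies that both sides of F = m·a carry the same units.

```python
# Small dimensional-homogeneity checker: an equation is homogeneous when both
# sides reduce to the same map of base-unit exponents.

def dim(**exponents):
    """Dimension as a map from base unit to exponent, e.g. dim(m=1, s=-1)."""
    return {k: v for k, v in exponents.items() if v != 0}

def multiply(a, b):
    """Combine dimensions of a product: exponents add."""
    out = dict(a)
    for unit, power in b.items():
        out[unit] = out.get(unit, 0) + power
    return {k: v for k, v in out.items() if v != 0}

# Check that F = m * a is dimensionally homogeneous:
force = dim(kg=1, m=1, s=-2)  # the newton, written in base units
mass_times_accel = multiply(dim(kg=1), dim(m=1, s=-2))
print(force == mass_times_accel)  # True: same units on both sides
```

Division works the same way with negated exponents, so a full checker needs only this one combination rule plus an equality test.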