
Information revolution. The term information revolution (sometimes also called the "informational revolution") describes current economic, social and technological trends beyond the Industrial Revolution.

Information revolution

Many competing terms have been proposed that focus on different aspects of this societal development. The British polymath and crystallographer J. D. Bernal introduced the term "scientific and technical revolution" in his 1939 book The Social Function of Science to describe the new role that science and technology were coming to play within society. He asserted that science was becoming a "productive force", drawing on the Marxist theory of productive forces.[1] After some controversy, the term was taken up by authors and institutions of the then-Soviet Bloc. Daniel Bell (1980) challenged this theory and advocated the post-industrial society, which would lead to a service economy rather than socialism.[3] Many other authors presented their views, including Zbigniew Brzezinski (1976) with his "Technetronic Society".[4] The second law of thermodynamics - how energy flows from useful to useless. The Second Law of Thermodynamics (or: Energy is Forever, but Not Exactly). How Everything Happens: Energy makes everything happen, and every time something happens, there is an energy change.

The second law of thermodynamics - how energy flows from useful to useless.

There are two important natural "laws of energy" that describe what happens to the energy involved in every change. We call them "laws" because countless observations and thousands of experiments have shown that they always predict what will happen. Ponder that for a moment: these laws describe how everything happens. The next few pages give an overview of the famous, but often misunderstood, Second Law. Beyond the First Law: the First Law of Thermodynamics tells us that energy is conserved. Remember that there has to be an energy transfer for something to happen; energy changes form or moves from place to place (heat flow, for example). That sounds good, doesn't it? Entropy Explained. Addendum A to "Bad Science, Worse Philosophy: the Quackery and Logic-Chopping of David Foster's The Philosophical Scientists" (2000). Introduction: The concept of entropy is generally not well understood among laymen.

Entropy Explained

With the help of several physicists, including Wolfgang Gasser and Malcolm Schreiber, I have composed the following article in an attempt to correct a common misunderstanding.[1] Contrary to what many laymen think, there is no Law of Entropy stating that order must always decrease. That is a layman's fiction, although born from a small kernel of reality.
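To make the two laws above concrete, here is a minimal sketch (my own illustration, not from either article) of a fixed amount of heat Q flowing from a hot reservoir at temperature T_hot to a cold one at T_cold. The First Law is respected, since the cold side gains exactly the energy the hot side loses, yet the total entropy change Q/T_cold - Q/T_hot is positive, which is the Second Law's one-way flow from useful to useless.

```python
# Hypothetical illustration: heat Q flows from a hot reservoir to a cold one.
# First Law: energy is conserved (the hot side loses exactly what the cold side gains).
# Second Law: total entropy change dS = Q/T_cold - Q/T_hot is positive whenever T_hot > T_cold.

def entropy_change(q_joules: float, t_hot_k: float, t_cold_k: float) -> float:
    """Total entropy change (J/K) when heat q flows from t_hot to t_cold,
    treating both sides as large reservoirs at constant temperature."""
    ds_hot = -q_joules / t_hot_k    # hot reservoir loses entropy
    ds_cold = q_joules / t_cold_k   # cold reservoir gains more entropy than the hot side lost
    return ds_hot + ds_cold

if __name__ == "__main__":
    q = 1000.0  # joules transferred
    ds = entropy_change(q, t_hot_k=500.0, t_cold_k=300.0)
    print(f"Total entropy change: {ds:.3f} J/K")  # positive: the process is irreversible
```

Running the same numbers in reverse (heat flowing spontaneously from cold to hot) would give a negative total, which is exactly what the Second Law forbids.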

List of thought processes. Nature of thought: Thought (or thinking) can be described as all of the following: An activity taking place in a brain – the organ that serves as the center of the nervous system in all vertebrate and most invertebrate animals (only a few invertebrates, such as sponges, jellyfish, adult sea squirts and starfish, do not have a brain).

List of thought processes

It is the physical structure associated with the mind. An activity taking place in a mind – an abstract entity with the cognitive faculties of consciousness, perception, thinking, judgement, and memory. Having a mind is a characteristic of humans, but it may also apply to other life forms.[1][2] Activities taking place in a mind are called mental processes or cognitive functions. An activity taking place in a computer (see automated reasoning, below) – a general-purpose device that can be programmed to carry out a set of arithmetic or logical operations automatically.

Physical information. In physics, physical information refers generally to the information that is contained in a physical system.

Physical information

Its usage in quantum mechanics (i.e. quantum information) is important, for example in the concept of quantum entanglement, which describes effectively direct or causal relationships between apparently distinct or spatially separated particles. Alphabet (Wikipedia EN). A true alphabet has letters for the vowels of a language as well as the consonants.
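The quantum-information idea mentioned above can be sketched with ordinary probability arithmetic. The toy example below (my own illustration; the state, names and numbers are assumptions, not from the source) takes the Bell state (|00⟩ + |11⟩)/√2 and shows the signature of entanglement: each qubit on its own is maximally random (one full bit of Shannon entropy), yet the two measurement outcomes always agree.

```python
import math

# Toy illustration (not from the source article): joint measurement statistics
# of the Bell state (|00> + |11>)/sqrt(2) in the computational basis.
amplitudes = {"00": 1 / math.sqrt(2), "11": 1 / math.sqrt(2)}

# Born rule: the probability of each joint outcome is |amplitude|^2.
joint = {outcome: amp ** 2 for outcome, amp in amplitudes.items()}

# Marginal distribution of the first qubit alone.
marginal_a = {"0": joint.get("00", 0) + joint.get("01", 0),
              "1": joint.get("10", 0) + joint.get("11", 0)}

def shannon_entropy(dist):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Each qubit alone is maximally random (1 bit of entropy) ...
print(round(shannon_entropy(marginal_a), 6))   # 1.0
# ... yet the outcomes are perfectly correlated: only "00" and "11" ever occur.
print(all(o[0] == o[1] for o in joint))        # True
```

For a pure two-qubit state, that one bit of entropy in either qubit's reduced statistics, combined with perfect correlation of the joint outcomes, is the standard indicator of maximal entanglement.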

Alphabet (Wikipedia EN)

The first "true alphabet" in this sense is believed to be the Greek alphabet,[1][2] which is a modified form of the Phoenician alphabet. In other types of alphabet, either the vowels are not indicated at all, as was the case in the Phoenician alphabet (such systems are known as abjads), or else the vowels are shown by diacritics or modification of consonants, as in the Devanagari used in India and Nepal (these systems are known as abugidas or alphasyllabaries).

There are dozens of alphabets in use today, the most popular being the Latin alphabet[3] (which was derived from the Greek). Many languages use modified forms of the Latin alphabet, with additional letters formed using diacritical marks. While most alphabets have letters composed of lines (linear writing), there are also exceptions, such as the alphabets used in Braille, fingerspelling, and Morse code. Ferdinand de Saussure. Ferdinand de Saussure (/sɔːˈsʊr/ or /soʊˈsʊr/; French: [fɛʁdinɑ̃ də sosyʁ]; 26 November 1857 – 22 February 1913) was a Swiss linguist and semiotician whose ideas laid a foundation for many significant developments in both linguistics and semiology in the 20th century.[2][3] He is widely considered one of the fathers of 20th-century linguistics[4][5][6][7] and one of the two major fathers (together with Charles Sanders Peirce) of semiotics/semiology.[8] Language is no longer regarded as peripheral to our grasp of the world we live in, but as central to it.

Words are not mere vocal labels or communicational adjuncts superimposed upon an already given order of things. They are collective products of social interaction, essential instruments through which human beings constitute and articulate their world.