Skepticism

Skepticism or scepticism (see American and British English spelling differences) is generally any questioning attitude towards knowledge, facts, or opinions and beliefs stated as facts,[1] or doubt regarding claims that are taken for granted elsewhere.[2] Philosophical skepticism is an overall approach that requires all information to be well supported by evidence.[3] Classical philosophical skepticism derives from the 'Skeptikoi', a school who "asserted nothing".[4] Adherents of Pyrrhonism, for instance, suspend judgment in investigations.[5] Skeptics may even doubt the reliability of their own senses.[6] Religious skepticism, on the other hand, is "doubt concerning basic religious principles (such as immortality, providence, and revelation)".[7] The word derives from the Greek σκέπτομαι (skeptomai), "to think, to look about, to consider".

Positivism

Positivism is a philosophy of science holding that information derived from logical and mathematical treatments and from reports of sensory experience is the exclusive source of all authoritative knowledge,[1] and that valid knowledge (truth) exists only in this derived knowledge.[2] Verified data received from the senses are known as empirical evidence.[1] Positivism holds that society, like the physical world, operates according to general laws. Introspective and intuitive knowledge is rejected, as are metaphysics and theology. Although the positivist approach has been a recurrent theme in the history of western thought,[3] the modern sense of the approach was developed by the philosopher Auguste Comte in the early 19th century.[4] Comte argued that, much as the physical world operates according to gravity and other absolute laws, so does society.[5]

Existentialism

Existentialism is a term applied to the work of certain late 19th- and 20th-century philosophers who, despite profound doctrinal differences,[1][2][3] shared the belief that philosophical thinking begins with the human subject—not merely the thinking subject, but the acting, feeling, living human individual.[4] In existentialism, the individual's starting point is characterized by what has been called "the existential attitude", or a sense of disorientation and confusion in the face of an apparently meaningless or absurd world.[5] Many existentialists have also regarded traditional systematic or academic philosophies, in both style and content, as too abstract and remote from concrete human experience.[6][7] There has never been general agreement on the definition of existentialism; the term is often seen as an historical convenience, as it was first applied to many philosophers in hindsight, long after they had died.

Fallibilism

Fallibilism (from medieval Latin fallibilis, "liable to err") is the philosophical principle that human beings could be wrong about their beliefs, expectations, or their understanding of the world, and yet still be justified in holding their incorrect beliefs. In the most commonly used sense of the term, this consists in being open to new evidence that would disprove some previously held position or belief, and in the recognition that "any claim justified today may need to be revised or withdrawn in light of new evidence, new arguments, and new experiences."[1] This position is taken for granted in the natural sciences.[2] In another sense, it refers to the consciousness of "the degree to which our interpretations, valuations, our practices, and traditions are temporally indexed" and subject to (possibly arbitrary) historical flux and change. Some fallibilists argue that absolute certainty about knowledge is impossible.

Ontology

Ontology is the study of the nature of being, becoming, existence, or reality, as well as of the basic categories of being and their relations. Parmenides was among the first to propose an ontological characterization of the fundamental nature of reality. While the etymology is Greek, the oldest extant record of the word itself, the New Latin form ontologia, appeared in 1606 in the work Ogdoas Scholastica by Jacob Lorhard (Lorhardus) and in 1613 in the Lexicon philosophicum by Rudolf Göckel (Goclenius). The first occurrence in English of ontology, as recorded by the OED (Oxford English Dictionary, online edition, 2008), came in a work by Gideon Harvey (1636/7–1702): Archelogia philosophica nova; or, New principles of Philosophy. Leibniz is the only one of the great philosophers of the 17th century to have used the term ontology.[6]

Pragmatism

Pragmatism is a philosophical tradition that began in the United States around 1870.[1] Pragmatism is a rejection of the idea that the function of thought is to describe, represent, or mirror reality. Instead, pragmatists consider thought to be a product of the interaction between organism and environment. Thus, the function of thought is as an instrument or tool for prediction, action, and problem solving. Pragmatists contend that most philosophical topics—such as the nature of knowledge, language, concepts, meaning, belief, and science—are best viewed in terms of their practical uses and successes. Charles Sanders Peirce (and his pragmatic maxim) deserves much of the credit for pragmatism,[2] along with later twentieth-century contributors William James and John Dewey.[3] Pragmatism enjoyed renewed attention after Willard Van Orman Quine and Wilfrid Sellars used a revised pragmatism to criticize logical positivism in the 1960s.

Nihilism

Nihilism has also been ascribed to time periods: for example, Jean Baudrillard and others have called postmodernity a nihilistic epoch,[4] and some Christian theologians and figures of religious authority have asserted that postmodernity[5] and many aspects of modernity[3] represent a rejection of theism, and that such rejection of their theistic doctrine entails nihilism. Nihilism has many definitions, and can thus describe philosophical positions that are arguably independent. Metaphysical nihilism is the philosophical theory that there might be no objects at all—that is, that there is a possible world where there are no objects at all—or at least that there might be no concrete objects at all—so that even if every possible world contains some objects, there is at least one that contains only abstract objects.
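As a rough illustration, the strong and weak forms can be stated in quantified modal logic; the notation below is one standard reading rather than anything from the source, with E(x) for "x exists" and the predicate Concrete assumed for the sketch:

```latex
% Strong metaphysical nihilism: possibly, there are no objects at all.
\Diamond\, \neg \exists x\, E(x)

% Weak form: possibly, there are no concrete objects --
% everything that exists in some possible world is abstract.
\Diamond\, \forall x\, \bigl( E(x) \rightarrow \neg\,\mathrm{Concrete}(x) \bigr)
```

Here the diamond is the possibility operator ("there is a possible world where..."), so the weak form allows every world to contain objects, provided at least one world contains only abstract ones.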

Utilitarianism

Utilitarianism is influential in political philosophy. Bentham and Mill believed that a utilitarian government was achievable through democracy. Mill thought that despotism was also justifiable through utilitarianism, as a transitional phase towards more democratic forms of governance. As an advocate of liberalism, Mill stressed the relationship between utilitarianism and individualism.[10] The importance of happiness as an end for humans has long been recognized. Although utilitarianism is usually thought to start with Jeremy Bentham, there were earlier writers who presented strikingly similar theories. Hume says that in all determinations of morality, the circumstance of public utility is principally in view. In the first three editions of his Inquiry into the Original of Our Ideas of Beauty and Virtue, Hutcheson included various mathematical algorithms "...to compute the Morality of any Actions." This pursuit of happiness is given a theological basis:[22] "…actions are to be estimated by their tendency."

Golden Rule

[Image: a book with "Dieu, la Loi, et le Roi" ("God, the Law, and the King") on one page and the golden rule on the other, by Bernard d'Agesci.]

One should treat others as one would like others to treat oneself (directive form).[1] One should not treat others in ways that one would not like to be treated (cautionary form, also known as the Silver Rule).[1] The Golden Rule describes a "reciprocal", or "two-way", relationship between one's self and others that involves both sides equally and in a mutual fashion.[3][4] It can be explained from the perspectives of psychology, philosophy, sociology, and religion. Rushworth Kidder notes that the Golden Rule can be found in the early contributions of Confucianism: Zi Gong asked, "Is there one word that may serve as a rule of practice for all one's life?", and the Master replied, "Is not reciprocity such a word?" The Golden Rule in its prohibitive form was also a common principle in ancient Greek philosophy.

Empiricism

[Image: John Locke, a leading philosopher of British empiricism.]

Empiricism is a theory which states that knowledge comes only or primarily from sensory experience.[1] One of several views in epistemology, the study of human knowledge, alongside rationalism and skepticism, empiricism emphasizes the role of experience and evidence, especially sensory experience, in the formation of ideas over the notion of innate ideas or traditions;[2] empiricists may argue, however, that traditions (or customs) arise from relations of previous sense experiences.[3] Empiricism, often used by natural scientists, says that "knowledge is based on experience" and that "knowledge is tentative and probabilistic, subject to continued revision and falsification."[4] One of its epistemological tenets is that sensory experience creates knowledge. The scientific method, including experiments and validated measurement tools, guides empirical research.

Semantics

In the late 1960s, Richard Montague proposed a system for defining semantic entries in the lexicon in terms of the lambda calculus. In these terms, the syntactic parse of the sentence John ate every bagel consists of a subject (John) and a predicate (ate every bagel); Montague demonstrated that the meaning of the sentence as a whole could be decomposed into the meanings of its parts, combined by relatively few rules of combination. The logical predicate thus obtained would be elaborated further, e.g. using truth-theoretic models, which ultimately relate meanings to a set of Tarskian universals, which may lie outside the logic. The notion of such meaning atoms or primitives is basic to the language of thought hypothesis from the 1970s. Despite its elegance, Montague grammar was limited by the context-dependent variability in word sense, which led to several attempts at incorporating context.
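To make the compositional idea concrete, here is a minimal sketch in Haskell; this is a toy model invented for illustration, not Montague's own fragment, and the entities, predicates, and names are all assumptions. "Every bagel" is treated as a generalized quantifier that consumes the property contributed by the rest of the sentence:

```haskell
-- Minimal Montague-style composition over a toy model.
-- E = entities, T = truth values (Montague's basic types e and t).
type E = String
type T = Bool

-- Toy domain, invented for illustration.
entities :: [E]
entities = ["john", "bagel1", "bagel2"]

-- One-place predicate: which entities are bagels.
bagel :: E -> T
bagel x = x `elem` ["bagel1", "bagel2"]

-- Two-place predicate "ate", curried object-first: ate object subject.
ate :: E -> E -> T
ate o s = (s, o) `elem` [("john", "bagel1"), ("john", "bagel2")]

-- "every bagel" as a generalized quantifier: a function from
-- properties of entities to truth values.
everyBagel :: (E -> T) -> T
everyBagel p = all p (filter bagel entities)

-- "John ate every bagel": the quantifier scopes over the object slot.
sentence :: T
sentence = everyBagel (\x -> ate x "john")

main :: IO ()
main = print sentence  -- True in this toy model
```

The point of the sketch is that the sentence meaning is built entirely from the meanings of its parts plus function application, which is the kind of decomposition Montague demonstrated.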

Phenomenalism

Phenomenalism is the view that physical objects cannot justifiably be said to exist in themselves, but only as perceptual phenomena or sensory stimuli (e.g. redness, hardness, softness, sweetness, etc.) situated in time and in space. In particular, some forms of phenomenalism reduce talk about physical objects in the external world to talk about bundles of sense-data. Phenomenalism is a radical form of empiricism. Its roots as an ontological view of the nature of existence can be traced back to George Berkeley and his subjective idealism, which David Hume further elaborated.[1] John Stuart Mill had a theory of perception which is commonly referred to as classical phenomenalism; this differs from Berkeley's idealism in its account of how objects continue to exist when no one is perceiving them (a view also known as "local realism"). Kant's "epistemological phenomenalism", as it has been called, is quite distinct from Berkeley's earlier ontological version.

Instrumentalism

In the philosophy of science, instrumentalism is the view that a scientific theory is a useful instrument in understanding the world. A concept or theory should be evaluated by how effectively it explains and predicts phenomena, as opposed to how accurately it describes objective reality. Instrumentalism avoids the debate between anti-realism and philosophical or scientific realism. It may be better characterized as non-realism. Historically, science and scientific theories have advanced as more detailed observations and results about the world have been made. Instrumentalism is particularly popular in the field of economics, where researchers postulate fictional economies and actors. On a logical positivist version of instrumentalism, theories about unobservable phenomena are regarded as having no scientific meaning. An instrumentalist position was put forward by Ernst Mach.

Minimalism

In the visual arts and music, minimalism is a style that uses pared-down design elements. In France between 1947 and 1948,[12] Yves Klein conceived his Monotone Symphony (1949, formally The Monotone-Silence Symphony), which consisted of a single 20-minute sustained chord followed by a 20-minute silence[13][14] – a precedent to both La Monte Young's drone music and John Cage's 4′33″. Although Klein had painted monochromes as early as 1949, and held the first private exhibition of this work in 1950, his first public showing was the publication of the artist's book Yves: Peintures in November 1954.[15][16] In contrast to the previous decade's more subjective Abstract Expressionists (with the exceptions of Barnett Newman and Ad Reinhardt), minimalists were also influenced by composers John Cage and La Monte Young, poet William Carlos Williams, and the landscape architect Frederick Law Olmsted.
