Integrative Thinking, by Roger Martin. In this primer on the problem-solving power of “integrative thinking,” Martin draws on more than 50 management success stories, including the masterminds behind The Four Seasons, Procter & Gamble and eBay, to demonstrate how, like the opposable thumb, the “opposable mind” (Martin’s term for the human brain’s ability “to hold two conflicting ideas in constructive tension”) is an intellectually advantageous evolutionary leap through which decision-makers can synthesize “new and superior ideas.”
Using this strategy, Martin focuses on what leaders think, rather than what they do. Among anecdotes and examples steering readers to change their thinking about thinking, Martin gives readers specific strategies for understanding their own “personal knowledge system” (by parsing the inherent qualities of “stance,” “tools” and “experience”), as well as for taking advantage of the “richest source of new insight into a problem,” the “opposing model.” Power (philosophy). In social science and politics, power is the ability to influence or control the behavior of people.
The term authority is often used for power perceived as legitimate by the social structure. Integrative thinking. Integrative Thinking is a field originated by Graham Douglas in 1986. He describes Integrative Thinking as the process of integrating intuition, reason and imagination in a human mind with a view to developing a holistic continuum of strategy, tactics, action, review and evaluation for addressing a problem in any field.
A problem may be defined as the difference between what one has and what one wants. Integrative Thinking may be learned by applying the SOARA (Satisfying, Optimum, Achievable Results Ahead) Process devised by Graham Douglas to any problem. Overconfidence effect. The overconfidence effect is a well-established bias in which someone's subjective confidence in their judgments is reliably greater than their objective accuracy, especially when confidence is relatively high. For example, in some quizzes, people rate their answers as "99% certain" but are wrong 40% of the time.
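The gap described above, between subjective confidence and objective accuracy, can be made concrete with a minimal sketch. The function name and the quiz data are invented for illustration; the 99%-confident, 40%-wrong respondent mirrors the example in the text.

```python
# Hypothetical illustration: measuring overconfidence as the gap between
# stated confidence and actual accuracy on a set of quiz answers.
# All data values are invented for demonstration.

def calibration_gap(answers):
    """Each answer is a (confidence, was_correct) pair, confidence in [0, 1].
    Returns mean confidence minus mean accuracy: a positive gap
    indicates overconfidence."""
    mean_confidence = sum(c for c, _ in answers) / len(answers)
    accuracy = sum(1 for _, correct in answers if correct) / len(answers)
    return mean_confidence - accuracy

# A respondent who claims 99% certainty but is wrong 4 times out of 10:
quiz = [(0.99, True)] * 6 + [(0.99, False)] * 4
gap = calibration_gap(quiz)  # 0.99 - 0.60 = 0.39, a large overconfidence gap
```

A well-calibrated respondent would show a gap near zero; the miscalibration of subjective probabilities discussed above is exactly a persistently positive gap.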
It has been proposed that a metacognitive trait mediates the accuracy of confidence judgments, but this trait's relationship to variations in cognitive ability and personality remains uncertain. Overconfidence is one example of a miscalibration of subjective probabilities. Demonstration. The most common way in which overconfidence has been studied is by asking people how confident they are of specific beliefs they hold or answers they provide. The data show that confidence systematically exceeds accuracy, implying that people are more sure they are correct than they deserve to be. Overprecision is the excessive confidence that one knows the truth. Confirmation bias. Confirmation bias, also called myside bias, is the tendency to search for, interpret, or recall information in a way that confirms one's beliefs or hypotheses.
It is a type of cognitive bias and a systematic error of inductive reasoning. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs. People also tend to interpret ambiguous evidence as supporting their existing position. A series of experiments in the 1960s suggested that people are biased toward confirming their existing beliefs. Confirmation biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. Confirmation biases are effects in information processing. Sunk costs. In economics and business decision-making, a sunk cost is a retrospective (past) cost that has already been incurred and cannot be recovered.
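The economic point about sunk costs can be sketched in a few lines: because a cost already incurred is identical under every alternative, a rational comparison depends only on prospective costs and benefits. The function, option names, and all numbers below are invented for illustration.

```python
# Minimal sketch (invented numbers): a rational choice compares only
# prospective costs and expected benefits; the sunk cost is the same
# under every option and so cannot change which option is best.

def best_option(options, sunk_cost=0.0):
    """options: dict of name -> (prospective_cost, expected_benefit).
    The sunk_cost parameter is deliberately ignored: it is already
    incurred and identical across alternatives."""
    return max(options, key=lambda name: options[name][1] - options[name][0])

projects = {
    "continue": (50.0, 60.0),   # finish the current project: net 10
    "switch":   (30.0, 55.0),   # abandon it for the alternative: net 25
}
# The 100 already spent ("sunk") does not change the ranking:
choice = best_option(projects, sunk_cost=100.0)  # "switch"
```

Choosing "continue" merely because of the 100 already spent would be the sunk-cost fallacy; the sketch makes the irrelevance of that amount explicit by ignoring the parameter entirely.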
Sunk costs are sometimes contrasted with prospective costs, which are future costs that may be incurred or changed if an action is taken. Status quo bias. Status quo bias is a cognitive bias: a preference for the current state of affairs.
The current baseline (or status quo) is taken as a reference point, and any change from that baseline is perceived as a loss. Status quo bias should be distinguished from a rational preference for the status quo ante, as when the current state of affairs is objectively superior to the available alternatives, or when imperfect information is a significant problem. A large body of evidence, however, shows that status quo bias frequently affects human decision-making.
Status quo bias interacts with other non-rational cognitive processes such as loss aversion, existence bias, endowment effect, longevity, mere exposure, and regret avoidance. Experimental evidence for the detection of status quo bias is seen through the use of the reversal test. Decision making. [Figure: sample flowchart representing the decision process for adding a new article to Wikipedia.]
Decision-making can be regarded as the cognitive process resulting in the selection of a belief or a course of action among several alternative possibilities. Every decision-making process produces a final choice, which may or may not prompt action. Decision-making is one of the central activities of management and a major part of any process of implementation; good decision-making is an essential skill for effective leadership and a successful career. The study of decision-making concerns identifying and choosing alternatives based on the values and preferences of the decision maker. Overview. Human performance in decision-making has been the subject of active research from several perspectives. Decision-making can also be regarded as a problem-solving activity terminated by a solution deemed to be satisfactory. Cognitive bias.
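"Choosing alternatives based on the values and preferences of the decision maker," as described in the decision-making passage above, is often modeled as a weighted-sum score over criteria. The criteria, weights, vendor names, and scores below are all illustrative assumptions, not from the text.

```python
# Hedged sketch of preference-based choice among alternatives:
# a simple weighted-sum (multi-criteria) model. Weights encode the
# decision maker's values; scores rate each alternative per criterion.

def choose(alternatives, weights):
    """alternatives: name -> {criterion: score}; weights: criterion -> weight.
    Returns the alternative with the highest weighted total score."""
    def total(scores):
        return sum(weights[c] * s for c, s in scores.items())
    return max(alternatives, key=lambda name: total(alternatives[name]))

weights = {"cost": 0.5, "quality": 0.3, "speed": 0.2}   # decision maker's values
alternatives = {
    "vendor_a": {"cost": 7, "quality": 9, "speed": 5},
    "vendor_b": {"cost": 9, "quality": 6, "speed": 8},
}
# vendor_a: 0.5*7 + 0.3*9 + 0.2*5 = 7.2; vendor_b: 0.5*9 + 0.3*6 + 0.2*8 = 7.9
pick = choose(alternatives, weights)  # "vendor_b"
```

Changing the weights (i.e., the decision maker's values) can reverse the choice, which is exactly the sense in which the selection depends on preferences rather than on the alternatives alone.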
A cognitive bias is a pattern of deviation in judgment, whereby inferences about other people and situations may be drawn in an illogical fashion. Individuals create their own “subjective social reality” from their perception of the input. An individual’s construction of social reality, not the objective input, may dictate their behaviour in the social world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality. Some cognitive biases are presumably adaptive.
A continually evolving list of cognitive biases has been identified over the last six decades of research on human judgment and decision-making in cognitive science, social psychology, and behavioral economics. Cognitive biases are important to study because “systematic errors” highlight the “psychological processes that underlie perception and judgement” (Tversky & Kahneman, 1999, p. 582). Anchoring. Anchoring or focalism is a cognitive bias that describes the common human tendency to rely too heavily on the first piece of information offered (the "anchor") when making decisions.
During decision-making, anchoring occurs when individuals use an initial piece of information to make subsequent judgments. Once an anchor is set, other judgments are made by adjusting away from that anchor, and there is a bias toward interpreting other information around the anchor. For example, the initial price offered for a used car sets the standard for the rest of the negotiations, so that prices lower than the initial price seem more reasonable even if they are still higher than what the car is really worth. Organizational culture. Organizational culture is the behavior of humans who are part of an organization and the meanings that people attach to their actions. Culture includes the organization's values, visions, norms, working language, systems, symbols, beliefs, and habits. It is also the pattern of such collective behaviors and assumptions that are taught to new organizational members as a way of perceiving, and even thinking and feeling.
Organizational culture affects the way people and groups interact with each other, with clients, and with stakeholders. Ravasi and Schultz (2006) state that organizational culture is a set of shared mental assumptions that guide interpretation and action in organizations by defining appropriate behavior for various situations. At the same time, although a company may have its "own unique culture", in larger organizations diverse and sometimes conflicting cultures co-exist owing to the different characteristics of the management team. Emotional intelligence. Emotional intelligence (EI) can be defined as the ability to monitor one's own and other people's emotions, to discriminate between different emotions and label them appropriately, and to use emotional information to guide thinking and behavior. There are three models of EI.
The ability model, developed by Peter Salovey and John Mayer, focuses on the individual's ability to process emotional information and use it to navigate the social environment. The trait model, developed by Konstantin Vasily Petrides, "encompasses behavioral dispositions and self-perceived abilities and is measured through self-report." The final model, the mixed model, is a combination of both ability and trait EI, focusing on EI as an array of skills and characteristics that drive leadership performance, as proposed by Daniel Goleman.