Nash equilibrium
In game theory, the Nash equilibrium is a solution concept of a non-cooperative game involving two or more players, in which each player is assumed to know the equilibrium strategies of the other players, and no player has anything to gain by changing only their own strategy.[1] If each player has chosen a strategy and no player can benefit by changing strategies while the other players keep theirs unchanged, then the current set of strategy choices and the corresponding payoffs constitutes a Nash equilibrium. The reality of the Nash equilibrium of a game can be tested using experimental economics methods. Stated simply, Amy and Will are in Nash equilibrium if Amy is making the best decision she can, taking into account Will's decision while Will's decision remains unchanged, and Will is making the best decision he can, taking into account Amy's decision while Amy's decision remains unchanged. The Nash equilibrium was named after John Forbes Nash, Jr.
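
A minimal sketch of the definition above, not taken from the article: it checks whether a pure-strategy profile of a two-player game is a Nash equilibrium, i.e. whether either player could gain by unilaterally deviating. The payoff matrices below are a standard Prisoner's Dilemma, used purely for illustration.

```python
def is_nash_equilibrium(payoff_row, payoff_col, row_choice, col_choice):
    """Return True if neither player can gain by unilaterally deviating."""
    # Row player's best attainable payoff against the fixed column choice.
    row_best = max(payoff_row[r][col_choice] for r in range(len(payoff_row)))
    # Column player's best attainable payoff against the fixed row choice.
    col_best = max(payoff_col[row_choice][c] for c in range(len(payoff_col[0])))
    return (payoff_row[row_choice][col_choice] == row_best and
            payoff_col[row_choice][col_choice] == col_best)

# Strategies: 0 = cooperate, 1 = defect (hypothetical example game).
payoff_row = [[3, 0],   # row player's payoffs
              [5, 1]]
payoff_col = [[3, 5],   # column player's payoffs
              [0, 1]]

print(is_nash_equilibrium(payoff_row, payoff_col, 1, 1))  # True: (defect, defect)
print(is_nash_equilibrium(payoff_row, payoff_col, 0, 0))  # False: each player would deviate
```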

Game theory
Game theory is the study of strategic decision making. Specifically, it is "the study of mathematical models of conflict and cooperation between intelligent rational decision-makers."[1] An alternative term suggested "as a more descriptive name for the discipline" is interactive decision theory.[2] Game theory is mainly used in economics, political science, and psychology, as well as in logic, computer science, and biology. The subject first addressed zero-sum games, in which one person's gains exactly equal the net losses of the other participant or participants. Modern game theory began with the idea of mixed-strategy equilibria in two-person zero-sum games and its proof by John von Neumann, and the theory was developed extensively in the 1950s by many scholars. Most cooperative games are presented in the characteristic function form, while the extensive and the normal forms are used to define noncooperative games.

Behavioral dynamics and influence in networked coloring and consensus
Edited by Brian Skyrms, University of California, Irvine, CA, and approved July 16, 2010 (received for review February 3, 2010). Theoretical work suggests that structural properties of naturally occurring networks are important in shaping behavior and dynamics. However, the relationships between structure and behavior are difficult to establish through empirical studies, because the networks in such studies are typically fixed. We report on human-subject experiments on the problems of coloring (a social differentiation task) and consensus (a social agreement task) in a networked setting. Social organizations often need to perform coordination tasks in a networked and decentralized fashion. We describe human-subject experiments in which decentralized coordination is modeled as the problems of coloring and consensus on a parametrized family of networks. The (behavioral) coloring problem (1) requires each player in a network to choose a color from a fixed set that differs from the choice of all of their network neighbors, while consensus requires selecting a color that agrees with all network neighbors. Decentralized coordination is a problem of long-standing interest.
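
An illustrative sketch (not the paper's code) of the two coordination tasks defined above: a coloring is solved when every pair of neighbors chose different colors, and consensus is reached when every pair chose the same color. The example graph and color names are made up.

```python
def is_proper_coloring(edges, colors):
    """Coloring task solved: every pair of neighbors chose different colors."""
    return all(colors[u] != colors[v] for u, v in edges)

def is_consensus(edges, colors):
    """Consensus task solved: every pair of neighbors chose the same color."""
    return all(colors[u] == colors[v] for u, v in edges)

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]  # a small hypothetical network
print(is_proper_coloring(edges, {0: "red", 1: "blue", 2: "green", 3: "red"}))  # True
print(is_consensus(edges, {0: "red", 1: "red", 2: "red", 3: "red"}))           # True
```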

PageRank
PageRank (PR) is an algorithm used by Google Search to rank web pages in their search engine results. It is named after both the term "web page" and co-founder Larry Page. PageRank is a way of measuring the importance of website pages: it works by counting the number and quality of links to a page to determine a rough estimate of how important the website is. Currently, PageRank is not the only algorithm used by Google to order search results, but it is the first algorithm that was used by the company, and it is the best known.[2][3] As of September 24, 2019, all patents associated with PageRank have expired.[4] A PageRank results from a mathematical algorithm based on the webgraph, created by all World Wide Web pages as nodes and hyperlinks as edges, taking into consideration authority hubs such as cnn.com or mayoclinic.org. A page's PageRank is a probability, expressed as a numeric value between 0 and 1.
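
A minimal power-iteration sketch of the basic published PageRank recurrence, PR(p) = (1 − d)/N + d · Σ PR(q)/outdegree(q) over pages q linking to p, with damping factor d = 0.85. This is only an illustration of that formula, not Google's production implementation; the example pages are hypothetical.

```python
def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - d) / n for p in pages}
        for p, outgoing in links.items():
            # Each page splits its current rank evenly among its outlinks.
            for q in outgoing:
                new_rank[q] += d * rank[p] / len(outgoing)
        rank = new_rank
    return rank

# Tiny example web graph (hypothetical pages).
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(links))  # C accumulates the most rank in this toy graph
```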

Behavioral experiments on biased voting in networks
Edited by Ronald L. Graham, University of California at San Diego, La Jolla, CA, and approved December 5, 2008 (received for review August 19, 2008). Many distributed collective decision-making processes must balance diverse individual preferences with a desire for collective unity. We report here on an extensive session of behavioral experiments on biased voting in networks of individuals. The tension between the expression of individual preferences and the desire for collective unity appears in decision-making and voting processes in politics, business, and many other arenas. The 2008 Democratic National Primary race offers a recent, if approximate, example of this phenomenon. Although there is now a significant literature on the diffusion of opinion in social networks (2–4), the topic is typically studied in the absence of any incentives toward collective unity.

Exclusive: How Google's Algorithm Rules the Web | Wired Magazine
Want to know how Google is about to change your life? Stop by the Ouagadougou conference room on a Thursday morning. It is here, at the Mountain View, California, headquarters of the world’s most powerful Internet company, that a room filled with three dozen engineers, product managers, and executives figures out how to make their search engine even smarter. This year, Google will introduce 550 or so improvements to its fabled algorithm, and each will be determined at a gathering just like this one. The decisions made at the weekly Search Quality Launch Meeting will wind up affecting the results you get when you use Google’s search engine to look for anything — “Samsung SF-755p printer,” “Ed Hardy MySpace layouts,” or maybe even “capital Burkina Faso,” which just happens to share its name with this conference room. You might think that after a solid decade of search-market dominance, Google could relax. But the biggest threat to Google can be found 850 miles to the north: Bing.

The Small-World Phenomenon: An Algorithmic Perspective
Jon Kleinberg
Abstract: Long a matter of folklore, the "small-world phenomenon" -- the principle that we are all linked by short chains of acquaintances -- was inaugurated as an area of experimental study in the social sciences through the pioneering work of Stanley Milgram in the 1960s. This work was among the first to make the phenomenon quantitative, allowing people to speak of the "six degrees of separation" between any two people in the United States. Since then, a number of network models have been proposed as frameworks in which to study the problem analytically. One of the most refined of these models was formulated in recent work of Watts and Strogatz; their framework provided compelling evidence that the small-world phenomenon is pervasive in a range of networks arising in nature and technology, and a fundamental ingredient in the evolution of the World Wide Web.

Routing
Routing is the process of selecting best paths in a network. In the past, the term routing was also used to mean forwarding network traffic among networks; however, this latter function is much better described as simply forwarding. Routing is performed for many kinds of networks, including the telephone network (circuit switching), electronic data networks (such as the Internet), and transportation networks. In case of overlapping or equal routes, the following elements are considered, in order of priority, to decide which routes get installed into the routing table:
Prefix length: a longer subnet mask is preferred (independent of whether it is within a routing protocol or over different routing protocols).
Metric: a lower metric/cost is preferred (only valid within one and the same routing protocol).
Administrative distance: a route learned from a more reliable routing protocol is preferred (only valid between different routing protocols).
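
A hedged sketch of the first criterion above, longest-prefix matching, using Python's standard ipaddress module. The prefixes and next-hop names are made up, and the metric and administrative-distance tie-breaking steps are omitted for brevity.

```python
import ipaddress

routes = [
    ("10.0.0.0/8",  "gateway-a"),
    ("10.1.0.0/16", "gateway-b"),
    ("10.1.2.0/24", "gateway-c"),
    ("0.0.0.0/0",   "default-gateway"),
]

def select_route(destination, routes):
    """Return the next hop of the matching route with the longest prefix."""
    dest = ipaddress.ip_address(destination)
    matching = [(ipaddress.ip_network(prefix), hop) for prefix, hop in routes
                if dest in ipaddress.ip_network(prefix)]
    network, hop = max(matching, key=lambda entry: entry[0].prefixlen)
    return hop

print(select_route("10.1.2.7", routes))   # gateway-c (the /24 wins)
print(select_route("192.0.2.1", routes))  # default-gateway (only /0 matches)
```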

Generalized second-price auction
The generalized second-price auction (GSP) is a non-truthful auction mechanism for multiple items. Each bidder places a bid. The highest bidder gets the first slot, the second-highest the second slot, and so on, but the highest bidder pays the price bid by the second-highest bidder, the second-highest pays the price bid by the third-highest, and so on. First conceived as a natural extension of the Vickrey auction, it does in fact conserve some of the good properties of the Vickrey auction. It is used mainly in the context of keyword auctions, where sponsored search slots are sold on an auction basis. The first analyses of GSP are in the economics literature by Edelman, Ostrovsky, and Schwarz[1] and by Varian.[2] It is employed by Google's AdWords technology. In the formal model, there are n bidders and k < n slots, where slot i has click-through rate α_i; one can think of n − k additional virtual slots with click-through rate zero, so α_m = 0 for k < m ≤ n. Each bidder i submits a bid b_i, the mechanism allocates slot i to the bidder with the i-th highest bid, and it charges each bidder a price p_i equal to the next-highest bid (this will be 0 if they didn't win a slot).
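
A minimal sketch of the GSP allocation and pricing rule described above: bidders are ranked by bid, slot i goes to the i-th highest bidder, and each winner pays the next-highest bid. The bidders, bids, and slot count below are made up for illustration.

```python
def gsp(bids, num_slots):
    """bids maps bidder -> bid; returns {bidder: (slot, price)} for winners."""
    ranked = sorted(bids.items(), key=lambda item: item[1], reverse=True)
    outcome = {}
    for slot in range(min(num_slots, len(ranked))):
        bidder, _ = ranked[slot]
        # Price is the bid of the next bidder down, or 0 if there is none.
        price = ranked[slot + 1][1] if slot + 1 < len(ranked) else 0
        outcome[bidder] = (slot + 1, price)
    return outcome

bids = {"alice": 4.0, "bob": 3.0, "carol": 1.5}
print(gsp(bids, num_slots=2))
# {'alice': (1, 3.0), 'bob': (2, 1.5)} -- each winner pays the bid just below theirs
```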

Power law
An example power-law graph demonstrates the ranking of popularity: to the right is the long tail, and to the left are the few that dominate (also known as the 80–20 rule). In statistics, a power law is a functional relationship between two quantities, where a relative change in one quantity results in a proportional relative change in the other quantity, independent of the initial size of those quantities: one quantity varies as a power of another. For instance, considering the area of a square in terms of the length of its side, if the length is doubled, the area is multiplied by a factor of four.[1] One attribute of power laws is their scale invariance. Given a relation f(x) = a x^(−k), scaling the argument x by a constant factor c causes only a proportionate scaling of the function itself: f(cx) = a (cx)^(−k) = c^(−k) f(x) ∝ f(x). That is, scaling by a constant c simply multiplies the original power-law relation by the constant c^(−k).

Clustering coefficient
In graph theory, a clustering coefficient is a measure of the degree to which nodes in a graph tend to cluster together. Evidence suggests that in most real-world networks, and in particular social networks, nodes tend to create tightly knit groups characterised by a relatively high density of ties; this likelihood tends to be greater than the average probability of a tie randomly established between two nodes (Holland and Leinhardt, 1971;[1] Watts and Strogatz, 1998[2]). Two versions of this measure exist: the global and the local. The global clustering coefficient is based on triplets of nodes. Watts and Strogatz defined the clustering coefficient as follows: suppose that a vertex v has k_v neighbours; then at most k_v(k_v − 1)/2 edges can exist between them (this occurs when every neighbour of v is connected to every other neighbour of v). Let C_v denote the fraction of these allowable edges that actually exist, and define C as the average of C_v over all vertices v.
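
An illustrative sketch of the local coefficient defined above: C_v is the fraction of the k_v(k_v − 1)/2 possible edges among v's neighbours that actually exist, and the graph's coefficient is the average over all vertices. The example graph is made up.

```python
from itertools import combinations

def local_clustering(adj, v):
    """adj maps each vertex to the set of its neighbours."""
    neighbours = adj[v]
    k = len(neighbours)
    if k < 2:
        return 0.0
    possible = k * (k - 1) / 2
    actual = sum(1 for a, b in combinations(neighbours, 2) if b in adj[a])
    return actual / possible

def average_clustering(adj):
    return sum(local_clustering(adj, v) for v in adj) / len(adj)

# Small example graph: a triangle (0, 1, 2) plus a pendant vertex 3.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(local_clustering(adj, 2))  # 1/3: only edge (0, 1) exists among 2's neighbours
print(average_clustering(adj))   # average of 1, 1, 1/3, and 0
```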

Small-world experiment
The small-world experiment comprised several experiments conducted by Stanley Milgram and other researchers examining the average path length for social networks of people in the United States. The research was groundbreaking in that it suggested that human society is a small-world-type network characterized by short path lengths. The experiments are often associated with the phrase "six degrees of separation", although Milgram did not use this term himself. Mathematician Manfred Kochen and political scientist Ithiel de Sola Pool wrote a mathematical manuscript, "Contacts and Influences", while working at the University of Paris in the early 1950s, during a time when Milgram visited and collaborated in their research. Milgram's experiment was conceived in an era when a number of independent threads were converging on the idea that the world is becoming increasingly interconnected.
