The Internet, unlike traditional computing environments, provides a wealth of arenas for strategic interaction. There is competition for the use of pipelines and routers, for the use of server and processor time, and in the increasingly popular online auctions for advertisements, goods, and services.
[Photo: EECS Professor Christos Papadimitriou (Peg Skorpinski)]

Not surprisingly, computer scientists have been adopting concepts from game theory, the study of strategic behavior, to analyze computer networks, online markets, and the Internet as a whole. At the same time, computer scientists are leaving their mark on economics, introducing new ways of understanding the effects of strategic behavior and efficient algorithms for finding market equilibria. Christos Papadimitriou has been in the vanguard of these efforts, taking key ideas from his areas of expertise—algorithms and complexity theory—and using them to yield a powerful new perspective on game theory.
In 1998, Papadimitriou—together with Professor Elias Koutsoupias of the University of Athens—combined ideas from economics and computer science to define "the price of anarchy," a measure of the loss in efficiency when a system is operated through the competitive interaction of participants rather than socially optimized by an all-knowing central controller. For example, the system might be a network in which users are choosing routes that minimize their own packets' delays but slowing the network as a whole. In this case, the price of anarchy would measure how the total network delay incurred by such "selfish routing" compares to that of the optimal routing, which minimizes overall delay.
The price of anarchy was inspired by the standard measures of efficiency for algorithms that run in restricted environments—"online algorithms," which have limited access to information, and "approximation algorithms," which assume limited computational resources. The price of anarchy measures the worst-case cost of decentralization, which has never been systematically quantified by economists. "Economists never embarked on a systematic investigation of efficiency loss, only because the social optimum is not an option," Papadimitriou says. On the other hand, "computer scientists often work with engineered systems where the social optimum is the status quo."
For a simple network model of two nodes connected by parallel links of equal speed, Papadimitriou and Koutsoupias showed that the price of anarchy is 3/2; that is, selfish routing can leave the network 3/2 times slower than an optimally managed one. For a network in which the delays along each pipeline are proportional to the flow, the price of anarchy is 4/3, as Cornell researchers showed in 2000. Such results suggest that for network routing, "an all-powerful coordinator might make things a little more efficient but not enough to be worth the trouble," Papadimitriou says. "A factor of 4/3 is absorbed by the galloping technology in a couple of months." Researchers in electrical engineering and operations research have since looked at the price of anarchy for decentralized market resource allocation and decentralized power markets.
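The 4/3 bound for proportional (linear) delays can be seen in Pigou's classic two-link example, which is not from the article but is the standard worst-case instance: one link has a fixed delay, the other a delay equal to its flow. A small numeric sketch:

```python
# Pigou's classic example of selfish routing: one unit of traffic from
# s to t over two parallel links.  Link A has constant delay 1; link B
# has delay equal to the flow it carries.  (Illustrative example, not
# taken from the article.)

def total_cost(x_b):
    """Total delay when a fraction x_b of the traffic uses link B."""
    x_a = 1.0 - x_b
    return x_a * 1.0 + x_b * x_b   # flow times per-unit delay on each link

# Selfish routing: link B's delay is below 1 whenever it carries less
# than all the traffic, so every user prefers B and x_b = 1 at equilibrium.
selfish = total_cost(1.0)    # total delay 1.0

# Social optimum: a central controller splits the traffic to minimize
# total delay (a coarse grid search is enough for this one-variable case).
optimal = min(total_cost(i / 1000) for i in range(1001))   # 0.75 at x_b = 0.5

print(selfish / optimal)     # 4/3: the price of anarchy for linear delays
```

The ratio is exactly the 4/3 of the Cornell result, and no network with linear delays does worse.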
Recently, Papadimitriou addressed a fundamental question about Nash equilibria, a central concept of game theory. A Nash equilibrium is a "steady-state" configuration in which no agent has an incentive to switch unilaterally to a different strategy. Such equilibria always exist in finite games (provided players may randomize over their strategies), and they have long been assumed to be the natural outcome in a competitive environment. Yet there is no known efficient algorithm for computing Nash equilibria. This raises a philosophical question: Is it meaningful to know that a market equilibrium exists if the participants cannot compute it? Papadimitriou thinks not. "If your laptop can't find it, then neither can the market," he says.
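The defining condition can be checked directly for small games. The sketch below tests every pure-strategy profile of an illustrative two-player game (a Prisoner's Dilemma, not from the article) for the Nash property; note that the existence guarantee requires randomized strategies, which a pure-strategy check like this does not cover:

```python
# A profile is a (pure) Nash equilibrium if neither player can gain by
# switching strategies unilaterally.  Payoff tables below are the classic
# Prisoner's Dilemma, used purely as an illustration.

C, D = 0, 1  # strategies: cooperate, defect
# payoffs[row][col] = (row player's payoff, column player's payoff)
payoffs = [[(3, 3), (0, 5)],
           [(5, 0), (1, 1)]]

def is_nash(r, c):
    """True if neither player improves by deviating from (r, c) alone."""
    row_ok = all(payoffs[r2][c][0] <= payoffs[r][c][0] for r2 in (C, D))
    col_ok = all(payoffs[r][c2][1] <= payoffs[r][c][1] for c2 in (C, D))
    return row_ok and col_ok

# Scan all four pure profiles: only mutual defection survives.
print([(r, c) for r in (C, D) for c in (C, D) if is_nash(r, c)])  # [(1, 1)]
```

The brute-force scan works only because the game is tiny; the article's point is precisely that no comparably efficient method is known in general.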
Papadimitriou has long been obsessed with understanding the complexity of computing Nash equilibria. Nearly two decades ago, he defined a seemingly esoteric complexity class with Nash equilibria in mind. In the summer of 2005, his approach finally succeeded. Together with his student, Konstantinos Daskalakis, and Paul Goldberg of the University of Liverpool, Papadimitriou showed that finding a Nash equilibrium is, in some sense, hard.
Typically, showing something is hard to compute means showing it is NP-complete. Computing Nash equilibria is unlikely to fall into this category, however, because equilibria always exist, while problems known to be NP-complete sometimes have no solution. To capture the complexity of Nash, Papadimitriou had to characterize a new class of search problems that are essentially similar to the Nash problem. "I had to ask the question: 'What is the most basic principle on which a solution to these problems is based?'" Papadimitriou says.
The shared characterization turned out to be the following: for each such problem, the candidate solutions can be viewed as the nodes of a directed graph in which every node has at most one incoming and one outgoing edge, so that the graph is a collection of paths and cycles; a starting point of one path is given, and any other endpoint of a path is an actual solution. Papadimitriou called the class of problems of this form PPAD. A growing body of evidence has since indicated that complete problems for PPAD, which include finding the fixed points guaranteed by Brouwer's famous fixed point theorem, are likely to be intractable.
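Abstractly, a PPAD search amounts to following such a path to its other end. The sketch below assumes, purely for illustration, that the graph is small and given explicitly; in a real PPAD instance the successor map is described implicitly by a circuit and the graph can be exponentially large, which is where the difficulty lies:

```python
# "End of the line", the search problem underlying PPAD, in miniature.
# Each node has at most one outgoing edge, so the graph decomposes into
# paths (and cycles); starting from a known path endpoint, the walk below
# must terminate at the other end -- a solution.  The explicit dictionary
# is an illustrative stand-in for the circuit-defined graph of a real
# PPAD instance, where this walk could take exponentially many steps.

successor = {0: 3, 3: 5, 5: 2, 2: 7}   # at most one out-edge per node

def end_of_the_line(start):
    """Follow the unique outgoing edges from start until they run out."""
    node = start
    while node in successor:
        node = successor[node]
    return node   # the path's other endpoint: a solution

print(end_of_the_line(0))  # 7
```

Existence of a solution is guaranteed by a parity argument (a path that starts must end), which is exactly why PPAD problems always have answers even when finding them is hard.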
Papadimitriou, Goldberg, and Daskalakis were able to show that computing Nash equilibria for games with three or more players is PPAD-complete. Subsequently, another group of researchers, which included a former student of Papadimitriou, showed that, even for two-player games, finding Nash equilibria is PPAD-complete.
Now that finding Nash equilibria is known to be hard, Papadimitriou is trying to resolve the philosophical dilemma in a different way: In real games, perhaps the players don't arrive at a Nash equilibrium; they only get close to one. With this in mind, Papadimitriou is exploring the complexity of finding a situation in which each player, by altering his own strategy, is able to improve his lot only by a very small amount—an approximate Nash equilibrium. "It's a realistic equilibrium concept," he says. "One would hope that it is not plagued with the same intractability problems, but in the complexity business, you never know."
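For a small two-player game, "how close" a profile is to equilibrium can be measured as the largest amount any player could gain by deviating unilaterally; a profile is an approximate Nash equilibrium when that gain is at most some small ε. A sketch using matching pennies (an illustrative game, not from the article):

```python
# Matching pennies: the row player wins (+1) on a match, the column
# player wins on a mismatch.  Illustrative game, not from the article.
A = [[1, -1], [-1, 1]]   # row player's payoffs
B = [[-1, 1], [1, -1]]   # column player's payoffs

def expected(M, x, y):
    """Expected payoff under mixed strategies x (row) and y (column)."""
    return sum(x[i] * M[i][j] * y[j] for i in range(2) for j in range(2))

def regret(x, y):
    """Most any single player could gain by deviating unilaterally."""
    row_best = max(sum(A[i][j] * y[j] for j in range(2)) for i in range(2))
    col_best = max(sum(x[i] * B[i][j] for i in range(2)) for j in range(2))
    return max(row_best - expected(A, x, y), col_best - expected(B, x, y))

# The uniform profile is the exact Nash equilibrium: zero regret.
print(round(regret([0.5, 0.5], [0.5, 0.5]), 3))  # 0.0
# A slightly lopsided row strategy is only an approximate equilibrium.
print(round(regret([0.6, 0.4], [0.5, 0.5]), 3))  # 0.2
```

Whether such ε-equilibria can always be found efficiently is the open question the passage describes; the computation above only measures closeness, it does not search.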