Perception-based Decision Analysis (PDA)

(Professor Lotfi A. Zadeh)
(ARO) DAAH 04-961-0341, BISC Program of UC Berkeley, (NASA) NAC2-117, (NASA) NCC2-1006, (ONR) N00014-99-C-0298, (ONR) N00014-96-1-0556, and (ONR) FDN0014991035

Born at the beginning of the second half of the last century, decision analysis, or DA for short, was the brainchild of von Neumann, Morgenstern, Wald, and other great intellects of that period. Decision analysis was, and remains, driven by a quest for a theory that is prescriptive, rigorous, quantitative, and precise. The question is: can this aim be achieved? A contention that is advanced in the following is that the aim is unachievable by a prolongation of existing theories. What is needed in decision analysis is a significant shift in direction--a shift from computing with numbers to computing with words and from manipulation of measurements to manipulation of perceptions.

Decisions are based on information. More often than not, the decision-relevant information is a mixture of measurements and perceptions--perceptions exemplified by: it is very unlikely that there will be a significant decrease in the price of oil in the near future. The problem with perceptions is that they are intrinsically imprecise, reflecting the bounded ability of the human mind to resolve detail and store information. More specifically, perceptions are f-granular in the sense that (1) the boundaries of perceived classes are unsharp; and (2) the values of perceived attributes are granular, with a granule being a clump of values drawn together by indistinguishability, similarity, proximity, or functionality. For example, a perception of likelihood may be described as "very unlikely," and a perception of gain as "not very high."
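To make the notion of an f-granule concrete, here is a minimal sketch in which the perception "very unlikely" is modeled as a fuzzy set over the probability interval [0, 1]. The trapezoidal membership function and its breakpoints (0.1 and 0.3) are illustrative assumptions, not part of the text; the "very" hedge is modeled, following common fuzzy-logic practice, by squaring the membership.

```python
# Sketch: the granule "very unlikely" as a fuzzy subset of [0, 1].
# Membership parameters below are assumed for illustration only.

def unlikely(p):
    """Trapezoidal membership of 'unlikely': 1 below 0.1, 0 above 0.3."""
    if p <= 0.1:
        return 1.0
    if p >= 0.3:
        return 0.0
    return (0.3 - p) / 0.2

def very_unlikely(p):
    """The hedge 'very' modeled as concentration: squaring the membership."""
    return unlikely(p) ** 2

for p in (0.05, 0.15, 0.25, 0.4):
    print(f"p={p:.2f}  unlikely={unlikely(p):.2f}  very unlikely={very_unlikely(p):.2f}")
```

The point of the sketch is that a granule is a fuzzy set, not a number: a probability of 0.15 is "unlikely" to degree 0.75 and "very unlikely" to degree 0.5625, with no sharp boundary anywhere.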

Existing decision theories have a fundamental limitation--they lack the capability to operate on perception-based information. To add this capability to an existing theory, T, three stages of generalization are required: (1) f-generalization, which adds to T the capability to operate on fuzzy sets, leading to a generalized theory denoted as T+; (2) f.g-generalization, which leads to a theory denoted as T++, and adds to T+ the capability to compute with f-granular variables, e.g., a random variable that takes the values small, medium, and large with respective probabilities low, high, and low; and (3) nl-generalization, which leads to a theory denoted as Tp, and adds to T++ the capability to operate on perception-based information expressed in a natural language.
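As a rough sketch of what computing with an f-granular variable involves, the random variable from the text (values small/medium/large with probabilities low/high/low) can be reduced to a crisp calculation by picking a single representative peak for each granule. All peak values below are assumed, and collapsing granules to peaks is a crude stand-in for full fuzzy arithmetic, shown only to make the object tangible.

```python
# An f-granular random variable: granular values with granular probabilities.
# Peak values for each granule are assumed for illustration.

peaks = {"small": 2.0, "medium": 5.0, "large": 8.0}   # assumed value peaks
prob_peaks = {"low": 0.2, "high": 0.6}                 # assumed probability peaks

rv = [("small", "low"), ("medium", "high"), ("large", "low")]

# Normalize in case the assumed probability peaks do not sum to 1.
total = sum(prob_peaks[p] for _, p in rv)
expected = sum(peaks[v] * prob_peaks[p] for v, p in rv) / total
print(f"defuzzified expected value = {expected}")
```

A genuinely f.g-generalized theory (T++) would propagate the fuzzy sets themselves rather than their peaks; this reduction only shows where the numbers would enter.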

A concept that plays a key role in perception-based decision analysis is that of precisiated natural language, PNL. Basically, PNL is a subset of a natural language, NL, which consists of propositions that can be precisiated through translation into a generalized constraint language. A generalized constraint is expressed as X isr R, where X is the constrained variable; R is the constraining relation; and r is a discrete-valued indexing variable whose value defines the way in which R constrains X. The principal types of constraints are: possibilistic (r = blank); veristic (r = v); probabilistic (r = p); random set (r = rs); fuzzy graph (r = fg); usuality (r = u); and Pawlak set (r = ps).
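One way to see the anatomy of "X isr R" is to encode it as a small record. The encoding below is an illustrative assumption of mine, not a construct from the text: X is a variable name, R is a membership function standing in for the constraining relation, and r indexes the constraint type exactly as listed above.

```python
# Illustrative encoding (assumed, not from the source) of a generalized
# constraint "X isr R".

from dataclasses import dataclass
from typing import Callable

CONSTRAINT_TYPES = {
    "": "possibilistic",
    "v": "veristic",
    "p": "probabilistic",
    "rs": "random set",
    "fg": "fuzzy graph",
    "u": "usuality",
    "ps": "Pawlak set",
}

@dataclass
class GeneralizedConstraint:
    X: str                         # the constrained variable
    R: Callable[[float], float]    # the constraining relation, here a membership function
    r: str = ""                    # indexing variable; blank means possibilistic

    def kind(self):
        return CONSTRAINT_TYPES[self.r]

# "the price of oil is low" -> possibilistic constraint on Price.
# The membership function for "low" is an assumption.
low = lambda x: max(0.0, min(1.0, (40.0 - x) / 20.0))
c = GeneralizedConstraint("Price", low, r="")
print(c.kind())     # possibilistic
print(c.R(30.0))    # degree to which a price of 30 is "low"
```

Translation of a proposition p into GCL then amounts to making X, R, and r explicit, which the record structure forces.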

More general constraints may be generated by combination, modification, and qualification. The collection of all such constraints is the Generalized Constraint Language, GCL. By construction, GCL is maximally expressive. As a consequence, PNL is the largest subset of NL which is precisiable through translation into GCL. In general, X, R, and r are implicit rather than explicit. Thus, if p is a proposition in a natural language, then its translation into GCL involves explicitation of X, R, and r in its translate, X isr R.

In PDA, precisiated natural language is employed to define the goals, constraints, relations, and decision-relevant information. An important sublanguage of PNL is the language of fuzzy if-then rules, FRL. In this language, a perception of a function, Y = f(X), is described by a collection of fuzzy if-then rules, e.g., if X is small then Y is small; if X is medium then Y is large; if X is large then Y is small. Such a collection is referred to as the fuzzy graph of f, f*.
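The three rules above can be sketched directly as a fuzzy graph f*: the membership of a point (x, y) in f* is the maximum over the rules of the minimum of antecedent and consequent memberships (sup-min composition). The triangular membership functions and their supports below are assumptions made for illustration.

```python
# Sketch of the fuzzy graph f* for the rules in the text:
#   if X is small then Y is small; if X is medium then Y is large;
#   if X is large then Y is small.
# All membership functions are assumed triangles.

def tri(a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

small_x, medium_x, large_x = tri(-5, 0, 5), tri(0, 5, 10), tri(5, 10, 15)
small_y, large_y = tri(-5, 0, 5), tri(0, 5, 10)

rules = [(small_x, small_y), (medium_x, large_y), (large_x, small_y)]

def fstar(x, y):
    """Membership of (x, y) in f*: max over rules of min(antecedent, consequent)."""
    return max(min(mu_x(x), mu_y(y)) for mu_x, mu_y in rules)

print(fstar(0, 0))   # fully on the "small -> small" patch
print(fstar(5, 5))   # fully on the "medium -> large" patch
```

The fuzzy graph is thus a coarse, granular summary of f: a union of overlapping patches rather than a pointwise curve.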

For example, if X is a random variable, then a perception of its probability distribution may be described as: if X is small then its probability is low; if X is medium then its probability is very low; and if X is large then its probability is high. More generally, employment of PNL in decision analysis adds an important capability--a capability to operate on perception-based information. This capability has the effect of substantially enhancing the ability of DA to deal with real-world problems. More fundamentally, the high expressive power of PNL opens the door to a redefinition of such basic concepts as optimality and causality. In particular, what is called into question is the validity of conventional approaches in which optimization is equated to a maximization of a scalar objective function. It is argued that to achieve a close rapport with reality, the use of PNL in optimization is a matter of necessity rather than choice.

Perception-based decision analysis represents a significant change in direction in the evolution of decision analysis. As we move farther into the age of machine intelligence and automation of reasoning, the need for a shift from computing with numbers to computing with words, and from manipulation of measurements to manipulation of perceptions, will cease to be a matter of debate. A case in point is the following example--referred to as the Robert example. There are three progressively more difficult versions of the example. The simplest version is:

Suppose that I need to call Robert by phone at home. What I know is that usually Robert returns from work at about 6:00 p.m. My questions are: (1) What is the probability that Robert is home at 6:30 p.m.? (2) What is the earliest time at which the probability that Robert is home is high? and (3) At 6:30 p.m., should I call Robert person-to-person or station-to-station, assuming that the costs are, respectively, 1 and 2?
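The point of the Robert example is that its data ("usually", "about 6:00 p.m.", "high") are perceptions, not numbers. A crisp numerical sketch is nonetheless instructive for contrast, and every ingredient below is an assumption of mine, not part of the example: arrival time follows a triangular density centered at 6:00 p.m. with spread ±60 minutes, "high" means probability at least 0.8, and a person-to-person call (cost 1) is charged only if Robert answers while a station-to-station call (cost 2) is charged either way.

```python
# Crisp stand-in for the Robert example; all modeling choices are assumed.

def arrival_cdf(t):
    """P(arrival <= t), t in minutes after 6:00 p.m.
    Triangular density on [-60, 60] peaking at 0 (assumption)."""
    if t <= -60:
        return 0.0
    if t >= 60:
        return 1.0
    if t <= 0:
        return (t + 60) ** 2 / 7200.0
    return 1.0 - (60 - t) ** 2 / 7200.0

# (1) Probability Robert is home at 6:30 p.m. (t = 30).
p_home = arrival_cdf(30)

# (2) Earliest minute at which P(home) is "high" (assumed threshold 0.8).
earliest = next(t for t in range(-60, 61) if arrival_cdf(t) >= 0.8)

# (3) Expected costs at 6:30 p.m.: person-to-person (cost 1) is charged
# only if Robert answers; station-to-station (cost 2) is charged either way.
cost_p2p = 1 * p_home
cost_sts = 2
choice = "person-to-person" if cost_p2p < cost_sts else "station-to-station"

print(f"P(home at 6:30) = {p_home:.3f}")
print(f"earliest 'high' time = 6:{earliest:02d} p.m.")
print(f"decision: {choice}")
```

A perception-based treatment would replace the assumed density, threshold, and charging rule with the granular constraints actually given ("usually", "about 6:00 p.m.", "high"), which is precisely what PDA is designed to support.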

* Professor in the Graduate School and Director, Berkeley Initiative in Soft Computing (BISC), Computer Science Division and the Electronics Research Laboratory, Department of EECS, University of California, Berkeley, CA 94720-1776; Telephone: 510-642-4959; Fax: 510-642-1712. Research supported in part by ONR Contract N00014-99-C-0298, NASA Contract NCC2-1006, NASA Grant NAC2-117, ONR Grant N00014-96-1-0556, ONR Grant FDN0014991035, ARO Grant DAAH 04-961-0341, and the BISC Program of UC Berkeley.
