Policy Capturing and Multiple Regression: A Married Couple?
Judgment analysis mainly uses linear models, particularly multiple regression, to describe how participants utilize available cues to arrive at their judgments. Although neo-Brunswikians have mostly restricted themselves to using multiple regression as a tool for describing judgments, this analysis is in principle open to testing other candidate models. Among these other models are fast and frugal heuristics, such as Take The Best (Gigerenzer & Goldstein, 1996, Psychological Review, pp. 650-669; Gigerenzer, Todd, and the ABC Research Group, in press, Simple heuristics that make us smart, Oxford University Press). Take The Best is designed for pair comparison tasks: If the most valid cue discriminates between two objects, the heuristic will choose the object the cue favors; if the most valid cue does not discriminate, the next best cue is checked, and so on.
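The lexicographic step-through described above can be sketched in a few lines. This is a minimal illustration only; the binary cue coding (1 = cue present, 0 = absent) and the assumption that cues arrive pre-sorted by descending validity are mine, not a specification from the cited papers:

```python
# Minimal sketch of Take The Best for a pair comparison task.
# Cue values are binary (1 = present, 0 = absent), and the cue
# profiles are assumed to be ordered by descending cue validity.

def take_the_best(cues_a, cues_b):
    """Return 'A', 'B', or 'guess' for one paired comparison."""
    for a, b in zip(cues_a, cues_b):
        if a > b:           # first discriminating cue favors object A
            return "A"
        if b > a:           # first discriminating cue favors object B
            return "B"
    return "guess"          # no cue discriminates: the heuristic guesses

# The most valid cue ties, so the second cue decides in favor of A.
print(take_the_best([1, 1, 0], [1, 0, 1]))  # -> A
```

Note that search stops at the first discriminating cue; later cues are never inspected, which is what makes the heuristic frugal.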
I recently investigated the following question: How powerful is standard policy capturing (i.e., policy capturing based on multiple regression) at discriminating between several strategies that could have generated choices in pair comparison tasks? Different strategies frequently lead to identical predictions, so that an individual's decision cannot be assigned unambiguously to one strategy. To illustrate this problem, I determined the overlap between the predictions of Take The Best and those of multiple regression for the environment of German cities used by Gigerenzer and Goldstein (1996). It turns out that in 96% of the comparisons in which both Take The Best and multiple regression made a prediction, their predictions were identical. Such a large overlap has implications for policy capturing. Suppose that one participant consistently uses Take The Best to make inferences for 100 paired comparisons randomly drawn from this city environment and that another participant consistently uses multiple regression on the same set. Would policy capturing detect any difference between these two participants? The answer is no: the multiple correlations and the weights in the regression equations that describe the policies of these two participants do not differ (much) from each other. One solution to this problem of separability, that is, the difficulty of discriminating between strategies, is to select the alternatives presented to participants so that the strategies are forced to make different predictions. However, selecting alternatives to minimize the overlap of the strategies' predictions often makes the item set unrepresentative and the results difficult to generalize. There is a dilemma here: either the item set is representative, with generalizable results but barely distinguishable strategies, or the item set is selective, with distinguishable strategies but possibly limited generalizability.
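The overlap problem can be illustrated with a toy simulation: random pairs of binary cue profiles are judged both by Take The Best and by a compensatory weighted-sum rule standing in for the regression model. The cue validities and weights below are invented for illustration, not the German city data, so the exact agreement rate will differ from the 96% reported above; the qualitative point is that agreement on randomly drawn pairs is high:

```python
import random

# Toy illustration of the separability problem: on randomly drawn
# pairs of binary cue profiles, Take The Best and a compensatory
# weighted-sum (regression-like) rule mostly make the same choice.
# The weights are invented and listed in descending validity order.
WEIGHTS = [0.8, 0.7, 0.6, 0.55]

def take_the_best(a, b):
    for x, y in zip(a, b):
        if x != y:                      # first discriminating cue decides
            return "A" if x > y else "B"
    return None                         # no cue discriminates

def weighted_sum(a, b):
    sa = sum(w * x for w, x in zip(WEIGHTS, a))
    sb = sum(w * y for w, y in zip(WEIGHTS, b))
    if sa == sb:
        return None                     # tied sums: no prediction
    return "A" if sa > sb else "B"

random.seed(1)
agree = total = 0
for _ in range(10_000):
    a = [random.randint(0, 1) for _ in WEIGHTS]
    b = [random.randint(0, 1) for _ in WEIGHTS]
    p_ttb, p_lin = take_the_best(a, b), weighted_sum(a, b)
    if p_ttb is not None and p_lin is not None:
        total += 1
        agree += (p_ttb == p_lin)

print(f"agreement on decided pairs: {agree / total:.0%}")
```

Counting agreement only over pairs where both strategies make a prediction mirrors the 96% figure above, which was likewise computed over comparisons where both Take The Best and multiple regression made a prediction.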
The idea of a fast and frugal lens model seems worth closer inspection, and this is a direction I will take in future research.
Accordingly, several of my other research interests and activities, listed below, also deal with these simple heuristics (to avoid a wrong impression: in most cases I am a co-author):
- Ecological rationality of fast and frugal heuristics: How does their performance depend on the structure of the information in the environment? (with Laura Martignon)
- Effect of time pressure on judgment and decision making (with Jörg Rieskamp)
- Intransitivities and rationality (with Martin Lages)
- Fast and frugal heuristics for quantitative estimation (with Ralph Hertwig)
- Hindsight bias as a by-product of an adaptive process (with Ralph Hertwig)
- Bayesian inference and the representation of information, including studies with experts such as physicians and AIDS counselors (with Gerd Gigerenzer)
- Risk-taking behavior of children in traffic (with Angelika Weber)