Heuristic (/hjuːˈrɪs.tɪk/) is an adjective describing methods that aid decision making and problem solving, and in turn learning and discovery. Such methods usually rely on experimentation and trial-and-error techniques. A heuristic method is used to arrive quickly at a solution that is reasonably close to the best possible answer, the 'optimal solution'. Heuristics are "rules of thumb", educated guesses, intuitive judgments, or simply common sense. As a noun, heuristics (/hjuːˈrɪs.tɪks/) is another name for heuristic methods.
In more precise terms, heuristics are strategies that use readily accessible, though loosely applicable, information to control problem solving in human beings and machines.[1] Forensic engineering, for example, relies on heuristic methods in tracing defects in products and processes.
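In machine problem solving, a heuristic often takes the form of a function that scores how promising a candidate step looks, so that a search procedure can examine the most promising options first. The sketch below is an illustrative Python example, not drawn from the cited sources: it uses the Manhattan distance as a heuristic to guide greedy best-first search across a small, made-up grid, finding a path quickly even though heuristics of this kind do not guarantee the shortest one.

```python
# Minimal sketch: a Manhattan-distance heuristic guiding greedy best-first
# search on a small grid. The grid, start, and goal are made-up examples.
import heapq

def manhattan(a, b):
    """Heuristic: estimated remaining cost from cell a to cell b."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def greedy_best_first(grid, start, goal):
    """Always expand the cell the heuristic rates closest to the goal.
    Fast, but the path found is not guaranteed to be the shortest."""
    frontier = [(manhattan(start, goal), start)]
    came_from = {start: None}
    while frontier:
        _, current = heapq.heappop(frontier)
        if current == goal:
            break
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (current[0] + dr, current[1] + dc)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in came_from):
                came_from[nxt] = current
                heapq.heappush(frontier, (manhattan(nxt, goal), nxt))
    # Reconstruct the path by walking back from the goal.
    path, node = [], goal
    while node is not None:
        path.append(node)
        node = came_from[node]
    return path[::-1]

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],   # 1 = obstacle
        [0, 0, 0, 0]]
print(greedy_best_first(grid, (0, 0), (2, 3)))
```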
Example
Perhaps the most fundamental heuristic is "trial and error", which can be used in everything from matching nuts and bolts to finding the values of variables in algebra problems.
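In the algebra case, a trial-and-error approach simply tries candidate values until one satisfies the equation. The snippet below is a minimal, illustrative sketch; the equation and search range are made up:

```python
# Minimal sketch: trial and error to find an integer x with 3*x + 4 == 19.
def solve_by_trial(f, target, candidates):
    for x in candidates:          # try each candidate in turn
        if f(x) == target:        # keep the first value that works
            return x
    return None                   # no candidate satisfied the equation

print(solve_by_trial(lambda x: 3 * x + 4, 19, range(100)))  # -> 5
```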
Here are a few other commonly used heuristics, from Polya's 1945 book, How to Solve It:[2]
- If you are having difficulty understanding a problem, try drawing a picture.
- If you can't find a solution, try assuming that you have a solution and seeing what you can derive from that ("working backward").
- If the problem is abstract, try examining a concrete example.
- Try solving a more general problem first (the "inventor's paradox": the more ambitious plan may have more chances of success).
Psychology
In psychology, heuristics are simple, efficient rules, hard-coded by evolutionary processes or learned, which have been proposed to explain how people make decisions, come to judgments, and solve problems, typically when facing complex problems or incomplete information. These rules work well under most circumstances, but in certain cases they lead to systematic errors or cognitive biases. Heuristic theories in psychology have motivated research by both their proponents and their critics.
For example, people may tend to perceive more expensive beers as tasting better than inexpensive ones (provided the two beers are of similar initial quality and similar style). This finding holds even when prices and brands are switched: putting the high price on the normally inexpensive brand is enough to lead subjects to perceive it as tasting better than the beer that is normally more expensive. One might call this a "price implies quality" bias. (Cf. Veblen good.)
Although much of the work of discovering heuristics in human decision-makers has been done by Amos Tversky and Daniel Kahneman,[3] the concept was originally introduced by Nobel laureate Herbert Simon. Gerd Gigerenzer focuses on how heuristics can be used to make judgments that are in principle accurate, rather than producing cognitive biases – heuristics that are "fast and frugal".[4]
A number of classic heuristic experiments have more recently (around 2003) come into dispute, because some of the experiments conducted by Amos Tversky and Daniel Kahneman were designed in ways that made it hard for subjects to judge the relevance of the stochastic variables involved. Without a clear grasp of how the stated probabilities bear on the distinct events in question, people may not apply them properly. For instance, although people do not readily grasp in the abstract that certain medical tests generate false positives, they can and do understand that there are alternative explanations for a positive result, such as cysts. Simple Bayesian calculations alone may not be enough for individuals to overcome these lapses in judgment, but experiments have shown that when individuals are aware of the causal network describing the problem, their predictions about such events improve.[5] Experiments by Joshua Tenenbaum and Tevye Krynski have disputed both the mammogram problem and the cab problem, originally formulated by Kahneman and Tversky, by adjusting the way in which subjects are led to understand the parameters involved in the predictions. Furthermore, Tenenbaum and Griffiths have shown that for everyday decisions individuals make reasonable predictions that fall within known frequency distributions, such as when they are asked how long a congressman's term in office will be, given that he has already served for 10 years.[6] These commonplace predictions fall well within the distributions that describe the occurrence of such events.
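The base-rate reasoning at stake in problems like the mammogram problem can be made explicit with Bayes' rule. The sketch below uses illustrative figures (1% prevalence, 80% test sensitivity, 9.6% false-positive rate) as stand-ins rather than the exact numbers used in the original studies:

```python
# Minimal sketch of the Bayesian calculation behind base-rate problems.
# The numbers are illustrative, not those used in the original studies.
def posterior(prior, sensitivity, false_positive_rate):
    """P(disease | positive test) via Bayes' rule."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# 1% prevalence, 80% sensitivity, 9.6% false-positive rate.
print(round(posterior(0.01, 0.80, 0.096), 3))  # ~0.078, far lower than most people guess
```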
Theorized psychological heuristics
Well known
- Anchoring and adjustment
- Availability heuristic
- Naïve diversification
- Recognition heuristic
- Representativeness heuristic
Less well known
Human-computer interaction
In human-computer interaction, heuristic evaluation is a usability inspection technique in which experts review a user interface, assess its compliance with usability heuristics (broadly stated characteristics of a good user interface), and record any aspects that violate them.
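A heuristic evaluation is often recorded as a simple list of findings, each tied to the heuristic it violates and rated for severity. The following sketch is purely illustrative; the three heuristics and the sample finding are placeholders rather than a prescribed checklist:

```python
# Minimal sketch: recording the findings of a heuristic evaluation.
# Heuristic names, severity scale, and the sample finding are placeholders.
from dataclasses import dataclass

HEURISTICS = [
    "Visibility of system status",
    "Consistency and standards",
    "Error prevention",
]

@dataclass
class Finding:
    heuristic: str   # which heuristic is violated
    location: str    # where in the interface
    severity: int    # e.g. 0 (none) to 4 (catastrophic)
    note: str

findings = [
    Finding("Error prevention", "checkout form", 3,
            "No confirmation before discarding an order"),
]

# Summarise violations per heuristic.
for h in HEURISTICS:
    count = sum(1 for f in findings if f.heuristic == h)
    print(f"{h}: {count} violation(s)")
```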
The pitfalls of heuristics
Heuristic algorithms are often employed because they appear to "work" even though they have not been mathematically proven to meet a given set of requirements.
Great care must therefore be taken when employing a heuristic algorithm. One common pitfall is that the engineer or designer fails to realize that the current data set does not necessarily represent future system states.
While the existing data can be pored over and an algorithm devised to handle it successfully, it is imperative to ensure that the heuristic method can also handle future data sets. This means that the engineer or designer must understand the rules that generate the data and develop the algorithm to meet those requirements, not merely to fit the data sets currently in hand.
A simple example of how heuristics can fail is the question "What is the next number in this sequence: 1, 2, 4?". One heuristic would say that the next number is 8, because the numbers are doubling – leading to the sequence 1, 2, 4, 8, 16, 32, ... Another, equally valid, heuristic would say that the next number is 7, because the gap between successive numbers grows by one each time – leading to the sequence 1, 2, 4, 7, 11, 16, ...
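The ambiguity is easy to see in code. The minimal sketch below implements the two competing heuristics; both reproduce the numbers seen so far but diverge on the next value:

```python
# Minimal sketch: two equally defensible heuristics for "1, 2, 4, ?".
def double_last(seq):
    """Guess that each number doubles the previous one."""
    return seq[-1] * 2                        # 1, 2, 4 -> 8

def growing_step(seq):
    """Guess that the gap between numbers grows by one each time."""
    return seq[-1] + (seq[-1] - seq[-2]) + 1  # 1, 2, 4 -> 7

seen = [1, 2, 4]
print(double_last(seen), growing_step(seen))  # 8 7
```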
When employing heuristics, statistical analyses should be conducted to estimate the probability of incorrect outcomes.
Philosophy
In philosophy, especially in Continental European philosophy, the adjective "heuristic" (or the designation "heuristic device") is used when an entity X exists to enable understanding of, or knowledge concerning, some other entity Y. A good example is a model, which, since it is never identical with what it models, is a heuristic device enabling understanding of what it models. Stories, metaphors, and the like can also be termed heuristic in this sense. A classic example is the notion of utopia as described in Plato's best-known work, The Republic. The "ideal city" depicted in The Republic is not presented as something to be pursued, or as an orientation point for development; rather, it shows how things would have to be connected, and how one thing would lead to another (often with highly problematic results), if one were to adopt certain principles and carry them through rigorously.
"Heuristic" is also commonly used as a noun to describe a rule of thumb, procedure, or method.[7] Philosophers of science have emphasized the importance of heuristics in creative thought and in constructing scientific theories.[8] (See the logic of discovery, and philosophers such as Imre Lakatos,[9] Lindley Darden, and others.)
See also
- Algorithm
- Behavioral economics – a subfield of economics in which heuristics play a central role
- Heuristic argument
- Heuristics in judgment and decision making
- Heuristic modeling
- List of cognitive biases
- Thin-slicing judgement
Notes
- ↑ Pearl, Judea (1983). Heuristics: Intelligent Search Strategies for Computer Problem Solving. New York, Addison-Wesley, p. vii.
- ↑ Polya, George (1945) How To Solve It: A New Aspect of Mathematical Method, Princeton, NJ: Princeton University Press. ISBN 0-691-02356-5 ISBN 0-691-08097-6
- ↑ Daniel Kahneman, Amos Tversky and Paul Slovic, eds. (1982) Judgment under Uncertainty: Heuristics & Biases. Cambridge, UK, Cambridge University Press ISBN 0-521-28414-7
- ↑ Gerd Gigerenzer, Peter M. Todd, and the ABC Research Group (1999). Simple Heuristics That Make Us Smart. Oxford, UK, Oxford University Press. ISBN 0-19-514381-7
- ↑ Krynski, T. R. and Tenenbaum, J. B. (2007). The role of causality in judgment under uncertainty. Journal of Experimental Psychology: General, 136(3), 430-450.
- ↑ Griffiths, Thomas L. and Tenenbaum, Joshua B. (2006). Optimal predictions in everyday cognition. Psychological Science, 17(9), 767-773.
- ↑ K. M. Jaszczolt (2006). "Defaults in Semantics and Pragmatics", The Stanford Encyclopedia of Philosophy, ISSN 1095-5054
- ↑ Roman Frigg and Stephan Hartmann (2006). "Models in Science", The Stanford Encyclopedia of Philosophy, ISSN 1095-5054
- ↑ Olga Kiss (2006). "Heuristic, Methodology or Logic of Discovery? Lakatos on Patterns of Thinking", Perspectives on Science, vol. 14, no. 3, pp. 302-317, ISSN 1063-6145
References
- How To Solve It: Modern Heuristics, Zbigniew Michalewicz and David B. Fogel, Springer Verlag, 2000. ISBN 3-540-66061-5
- Russell, Stuart J. and Norvig, Peter (2003). Artificial Intelligence: A Modern Approach (2nd ed.). Upper Saddle River, NJ: Prentice Hall. ISBN 0-13-790395-2
Further reading
- Birnbaum, M. H. (2008). Evaluation of the priority heuristic as a descriptive model of risky decision making: Comment on Brandstätter, Gigerenzer, and Hertwig (2006). Psychological Review, 115, 253-260.
- Birnbaum, M. H. (2008). Postscript: Rejoinder to Brandstätter et al. (2008). Psychological Review, 115, 260-262.
- Brandstätter, E., Gigerenzer, G., & Hertwig, R. (2008). Risky choice with heuristics: Reply to Birnbaum (2008), Johnson, Schulte-Mecklenbeck, and Willemsen (2008), and Rieger and Wang. Psychological Review, 115, 281-289 (with postscript, 289-290).
- Dougherty, M. R., Franco-Watkins, A., & Thomas, R. P. (2008). The psychological plausibility of the theory of probabilistic mental models and the fast and frugal heuristics. Psychological Review, 115, 199-211.
- Dougherty, M. R., Thomas, R., & Franco-Watkins, A. M. (2008). Postscript: Vague heuristics revisited. Psychological Review, 115, 211-213.
- Gigerenzer, G., Hoffrage, U., & Goldstein, D. G. (2008). Fast and frugal heuristics are plausible models of cognition: Reply to Dougherty, Franco-Watkins, and Thomas (2008). Psychological Review, 115, 230-237.
- Gigerenzer, G., Hoffrage, U., & Goldstein, D. G. (2008). Postscript: Fast and frugal heuristics. Psychological Review, 115, 238-239.
- Johnson, E. J., Schulte-Mecklenbeck, M., & Willemsen, M. (2008). Process models deserve process data: Comment on Brandstätter, Gigerenzer, and Hertwig (2006). Psychological Review, 115, 263-272.
External links
- The Heuristic Wiki
- Heuristics and artificial intelligence in finance and investment
- “Discovering Assumptions” by Paul Niquette
This page uses Creative Commons Licensed content from Wikipedia (view authors).