Timeline of cognitive biases

This is a '''timeline of {{w|cognitive bias}}es''', describing the development of the field's main concepts, along with illustrative research, notable cases, and literature.
  
 
== Sample questions ==

The following are some interesting questions that can be answered by reading this timeline:

* What are the different types of cognitive bias described by the timeline?
** Sort the full timeline by "Bias type".
** You will mostly see three categories: social bias, memory bias, and belief, decision-making and behavioral bias.
* What are some notable cases in history involving a cognitive bias?
** Sort the full timeline by "Event type" and look for the group of rows with value "Notable case".
* What are some events describing the development of a concept within the field of cognitive biases?
** Sort the full timeline by "Event type" and look for the group of rows with value "Concept development".
** You will see concepts such as {{w|stereotype}}, the {{w|Semmelweis effect}}, and the {{w|Bandwagon effect}}.
* What are some illustrative pieces of research related to the field?
** Sort the full timeline by "Event type" and look for the group of rows with value "Research".
* What are some books illustrating the literature on the field of cognitive biases?
** Sort the full timeline by "Event type" and look for the group of rows with value "Literature".
** You will find a number of notable authors, such as {{w|Daniel Kahneman}} and {{w|Irving Fisher}}, among others.
  
 
==Big picture==

{| class="wikitable"
! Time period !! Development summary !! More details
|-
| 1972 backward || Pre-concept-development era || Multiple concepts later included within the category of cognitive biases are developed throughout history, starting with the ancient Greek philosophers.
|-
| 1972 onward || Modern period || The notion of cognitive bias is introduced by Amos Tversky and Daniel Kahneman, who in the following years would further elaborate on several different types of cognitive biases and related concepts.
|-
| 21st century || Present time || As of 2020, there are approximately 188 recognized cognitive biases.<ref>{{cite web |title=Every Single Cognitive Bias in One Infographic |url=https://www.visualcapitalist.com/every-single-cognitive-bias/ |website=visualcapitalist.com |access-date=5 December 2020}}</ref>
|}

== Visual and numerical data ==

=== Mentions on Google Scholar ===

The following table summarizes per-year mentions on Google Scholar as of May 17, 2021.

{| class="sortable wikitable"
! Year
! Overconfidence Bias
! Self Serving Bias
! Herd Mentality
! Loss Aversion
! Framing Cognitive Bias
! Narrative Fallacy
! Anchoring Bias
! Confirmation Bias
! Hindsight Bias
! Representativeness Heuristic
|-
| 1980 || 89 || 3,060 || 102 || 1,830 || 134 || 390 || 221 || 2,150 || 420 || 136
|-
| 1985 || 144 || 3,570 || 137 || 2,500 || 311 || 557 || 320 || 2,560 || 583 || 226
|-
| 1990 || 234 || 6,410 || 268 || 3,810 || 779 || 958 || 584 || 4,780 || 1,010 || 414
|-
| 1995 || 428 || 10,600 || 502 || 5,040 || 1,610 || 1,560 || 1,100 || 7,070 || 1,660 || 539
|-
| 2000 || 824 || 18,500 || 745 || 8,590 || 3,010 || 2,550 || 1,960 || 12,400 || 2,970 || 832
|-
| 2002 || 1,090 || 20,700 || 1,020 || 11,200 || 3,850 || 2,390 || 2,560 || 12,400 || 3,430 || 898
|-
| 2004 || 1,700 || 24,200 || 1,160 || 14,000 || 5,120 || 3,300 || 3,370 || 16,200 || 4,200 || 1,130
|-
| 2006 || 2,050 || 27,300 || 1,220 || 16,900 || 6,470 || 3,570 || 4,090 || 20,500 || 4,660 || 1,500
|-
| 2008 || 2,650 || 32,300 || 1,520 || 20,700 || 8,220 || 4,690 || 5,040 || 25,600 || 5,500 || 1,580
|-
| 2010 || 3,350 || 36,700 || 1,810 || 25,500 || 10,700 || 5,320 || 6,220 || 31,300 || 6,280 || 2,270
|-
| 2012 || 4,500 || 40,100 || 2,140 || 29,200 || 13,900 || 6,180 || 7,910 || 38,500 || 7,310 || 2,820
|-
| 2014 || 5,300 || 42,400 || 2,260 || 31,800 || 17,800 || 8,890 || 9,230 || 43,800 || 8,070 || 3,440
|-
| 2016 || 6,020 || 42,600 || 2,390 || 31,600 || 19,900 || 9,160 || 10,600 || 45,100 || 8,790 || 3,700
|-
| 2017 || 6,760 || 41,600 || 2,210 || 31,000 || 21,900 || 9,570 || 11,300 || 40,300 || 9,010 || 4,090
|-
| 2018 || 7,500 || 39,700 || 2,360 || 31,200 || 23,200 || 10,300 || 12,500 || 42,200 || 9,650 || 4,300
|-
| 2019 || 8,290 || 33,800 || 2,330 || 29,700 || 24,000 || 10,200 || 13,200 || 35,400 || 7,990 || 4,490
|-
| 2020 || 9,110 || 30,100 || 2,670 || 28,000 || 25,500 || 10,200 || 15,200 || 32,500 || 9,300 || 4,590
|}

[[File:Cognitive biases.png|thumb|center|700px]]

=== Google Trends ===

The chart below shows Google Trends data for cognitive biases (topic) from January 2004 to January 2021, when the screenshot was taken.<ref>{{cite web |title=Cognitive biases |url=https://trends.google.com/trends/explore?date=all&q=Cognitive%20biases |website=trends.google.com |access-date=15 January 2021}}</ref>

[[File:Cognitive biases gtrends.jpeg|thumb|center|700px]]

=== Google Ngram Viewer ===

The chart shows Google Ngram Viewer data for "cognitive bias", from 1972 (when the concept was created) to 2019.<ref>{{cite web |title=Google Books Ngram Viewer |url=https://books.google.com/ngrams/graph?content=cognitive+bias&year_start=1972&year_end=2019&corpus=26&smoothing=3 |website=books.google.com |access-date=28 January 2021 |language=en}}</ref>

[[File:Cognitive bias ngram.png|thumb|center|700px]]

=== Wikipedia Views ===

The chart below shows pageviews of the English Wikipedia article {{w|cognitive bias}}, from July 2015 to December 2020.<ref>{{cite web |title=Cognitive biases |url=https://wikipediaviews.org/displayviewsformultiplemonths.php?page=Cognitive+biases&allmonths=allmonths-api&language=en&drilldown=all |website=wikipediaviews.org |access-date=19 January 2021}}</ref>

[[File:Cognitive biases wv.jpeg|thumb|center|450px]]
  
 
==Full timeline==

{| class="sortable wikitable"
! Year !! Bias type !! Event type !! Details !! Concept definition (when applicable)
|-
| c.180 CE || Social bias || Field development || Many philosophers and social theorists observe and consider the phenomenon of belief in a just world, going back to at least as early as the [[w:Pyrrhonism|Pyrrhonist]] philosopher {{w|Sextus Empiricus}}, writing ''circa'' 180 CE, who argues against this belief.<ref>Sextus Empiricus, "Outlines of Pyrrhonism", Book 1, Chapter 13, Section 32</ref> || "The {{w|just-world hypothesis}} is the belief that people get what they deserve since life is fair."<ref>{{cite web |title=Just-World Hypothesis |url=https://www.alleydog.com/glossary/definition.php?term=Just-World+Hypothesis |website=alleydog.com |accessdate=7 May 2020}}</ref>
|-
| 1747 || || Field development || Scottish doctor {{w|James Lind}} conducts the first systematic [[w:Controlled experiment|clinical trial]].<ref>Carlisle, Rodney (2004). ''Scientific American Inventions and Discoveries'', John Wiley & Sons, Inc., New Jersey. p. 393.</ref> || "Clinical trials are research studies performed in people that are aimed at evaluating a medical, surgical, or behavioral intervention."<ref>{{cite web |title=What Are Clinical Trials and Studies? |url=https://www.nia.nih.gov/health/what-are-clinical-trials-and-studies |website=National Institute on Aging |access-date=28 January 2021 |language=en}}</ref>
|-
| 1753 || || Field development || {{w|Anthropomorphism}} is first attested, originally in reference to the {{w|heresy}} of applying a human form to the [[w:Christianity|Christian]] [[w:God the Father|God]].<ref>{{citation |date=1753 |title=Chambers's Cyclopædia, Supplement }}</ref><ref name=oed>''Oxford English Dictionary'', 1st ed. "anthropomorphism, ''n.''" Oxford University Press (Oxford), 1885.</ref> || Anthropomorphism is "the interpretation of nonhuman things or events in terms of human characteristics".<ref>{{cite web |title=Anthropomorphism |url=https://www.britannica.com/topic/anthropomorphism |website=britannica.com |accessdate=7 May 2020}}</ref>
|-
| 1776–1799 || || Field development || The {{w|declinism}} belief is traced back to {{w|Edward Gibbon}}'s work ''{{w|The History of the Decline and Fall of the Roman Empire}}'',<ref name="Salon1">{{cite web|last1=Miller|first1=Laura|title=Culture is dead — again|url=https://www.salon.com/2015/06/14/culture_is_dead_%E2%80%94_again_its_the_end_of_civilization_as_we_know_it_and_maybe_we_feel_fine/|website=Salon|accessdate=17 April 2018|date=2015-06-14}}</ref> in which Gibbon argues that Rome collapsed due to the gradual loss of {{w|civic virtue}} among its citizens.<ref>J.G.A. Pocock, "Between Machiavelli and Hume: Gibbon as Civic Humanist and Philosophical Historian," ''Daedalus'' 105:3 (1976), 153–169.</ref> || Declinism is "the tendency to believe that the worst is to come".<ref>{{cite web |title=Why we feel the past is better compare to what the future holds |url=https://thedecisionlab.com/biases/declinism/ |website=thedecisionlab.com |accessdate=7 May 2020}}</ref>
|-
| 1796 || || Literature || French scholar {{w|Pierre-Simon Laplace}} describes in ''A Philosophical Essay on Probabilities'' the ways in which men calculate their probability of having sons: "I have seen men, ardently desirous of having a son, who could learn only with anxiety of the births of boys in the month when they expected to become fathers. Imagining that the ratio of these births to those of girls ought to be the same at the end of each month, they judged that the boys already born would render more probable the births next of girls." The expectant fathers feared that if more sons were born in the surrounding community, then they themselves would be more likely to have a daughter. This essay by Laplace is regarded as one of the earliest descriptions of the {{w|gambler's fallacy}}.<ref name="BarronLeider2010">{{cite journal|last1=Barron|first1=Greg|last2=Leider|first2=Stephen|title=The role of experience in the Gambler's Fallacy|journal=Journal of Behavioral Decision Making|url=http://www-personal.umich.edu/~leider/Papers/Gamblers_Fallacy.pdf|date=13 October 2009}}</ref> || "The Gambler's Fallacy is the misconception that something that has not happened for a long time has become 'overdue', such as a coin coming up heads after a series of tails."<ref>{{cite web |title=The Gambler's Fallacy - Explained |url=https://www.thecalculatorsite.com/articles/finance/the-gamblers-fallacy.php |website=thecalculatorsite.com |accessdate=7 May 2020}}</ref>
|-
| 1847 || || Concept development || Hungarian physician {{w|Ignaz Semmelweis}} discovers that hand washing and disinfecting at hospitals dramatically reduces infection and death in patients. His hand-washing suggestions are at first rejected by his contemporaries, often for non-medical reasons. This would give birth to the concept of the Semmelweis effect, a metaphor for the {{w|reflex}}-like tendency to reject new evidence or new knowledge because it contradicts established norms, beliefs, or {{w|paradigm}}s.<ref>{{cite journal|last1=Mortell|first1=Manfred|last2=Balkhy|first2=Hanan H.|last3=Tannous|first3=Elias B.|last4=Jong|first4=Mei Thiee|title=Physician ‘defiance’ towards hand hygiene compliance: Is there a theory–practice–ethics gap?|journal=Journal of the Saudi Heart Association|date=July 2013|volume=25|issue=3|pages=203–208|doi=10.1016/j.jsha.2013.04.003|pmc=3809478|pmid=24174860}}</ref> || The Semmelweis effect "refers to the tendency to automatically reject new information or knowledge because it contradicts current thinking or beliefs."<ref>{{cite web |title=Semmelweis Reflex (Semmelweis Effect) |url=https://www.alleydog.com/glossary/definition.php?term=Semmelweis+Reflex+%28Semmelweis+Effect%29 |website=alleydog.com |accessdate=7 May 2020}}</ref>
|-
| 1848 || Social (conformity bias) || Concept development || The phrase "jump on the bandwagon" first appears in American politics when entertainer {{w|Dan Rice}} uses his bandwagon and its music to gain attention for his political campaign appearances. As his campaign becomes more successful, other politicians would strive for a seat on the bandwagon, hoping to be associated with his success. This prefigures the term {{w|bandwagon effect}}, which is coined in the early 20th century.<ref>{{cite web |url=http://www.wordwizard.com/phpbb3/viewtopic.php?f=7&t=6642 |title=Bandwagon Effect |accessdate=2007-03-09}}</ref> || The {{w|bandwagon effect}} "is a psychological phenomenon whereby people do something primarily because other people are doing it, regardless of their own beliefs, which they may ignore or override."<ref>{{cite web |title=The Bandwagon Effect |url=https://www.psychologytoday.com/us/blog/stronger-the-broken-places/201708/the-bandwagon-effect |website=psychologytoday.com |accessdate=7 May 2020}}</ref>
|-
| 1850 || || Concept development || The first reference to "stereotype" appears as a noun meaning "image perpetuated without change".<ref name="Stereotypes Defined">{{cite web |title=Stereotypes Defined |url=https://stereotypeliberia.wordpress.com/about/stereeotypes-defined/ |website=stereotypeliberia.wordpress.com |accessdate=10 April 2020}}</ref> || Stereotype refers to "a widely held but fixed and oversimplified image or idea of a particular type of person or thing".<ref>Oxford Languages</ref>
|-
| 1860 || || Concept development || Both [[w:Weber–Fechner law|Weber's law and Fechner's law]] are published by [[w:Gustav Fechner|Gustav Theodor Fechner]] in the work ''Elemente der Psychophysik'' (''Elements of Psychophysics''). This publication, the first work in the field, is where Fechner coins the term {{w|psychophysics}} to describe the interdisciplinary study of how humans perceive physical magnitudes.<ref name="Fechner1">{{cite book |last=Fechner |first=Gustav Theodor |editor-last1=Howes |editor-first1=D H |editor-last2=Boring |editor-first2=E G |translator-last=Adler |translator-first=H E |title=Elements of psychophysics |trans-title=Elemente der Psychophysik |volume=1 |location=United States of America |publisher=Holt, Rinehart and Winston |year=1966 |orig-year=1860}}</ref> || {{w|Weber–Fechner law}} "states that the change in a stimulus that will be just noticeable is a constant ratio of the original stimulus."<ref>{{cite web |title=Weber's law |url=https://www.britannica.com/science/Webers-law |website=britannica.com |accessdate=7 May 2020}}</ref>
|-
| 1866 || Belief, decision-making and behavioral ({{w|apophenia}}) || Concept development || The German word ''pareidolie'' is used by [[w:Karl Ludwig Kahlbaum|Dr. Karl Ludwig Kahlbaum]] in his paper ''On Delusion of the Senses''.<ref>[https://books.google.com/books?id=IM06AQAAMAAJ&pg=PA238&dq=%22pareidolia%22&hl=en&sa=X&ved=0ahUKEwjPysqt0ejUAhWe14MKHbdkCdIQ6AEIXzAJ#v=onepage&q=%22pareidolia%22&f=false ] Sibbald, M.D. "Report on the Progress of Psychological Medicine; German Psychological Literature", ''The Journal of Mental Science'', Volume 13. 1867. p. 238</ref> || {{w|Pareidolia}} is "the tendency to perceive a specific, often meaningful image in a random or ambiguous visual pattern."<ref>{{cite web |title=pareidolia |url=https://www.merriam-webster.com/dictionary/pareidolia |website=merriam-webster.com |accessdate=7 May 2020}}</ref>
|-
| 1874 || Memory bias || Field development || The first documented instance of {{w|cryptomnesia}} occurs with the medium {{w|Stainton Moses}}.<ref>Brian Righi. (2008). ''Chapter 4: Talking Boards and Ghostly Goo''. In ''Ghosts, Apparitions and Poltergeists''. Llewellyn Publications."An early example of this occurred in 1874 with he medium William Stanton Moses, who communicated with the spirits of two brothers who had recently died in India. Upon investigation, it was discovered that one week prior to the séance, their obituary had appeared in the newspaper. This was of some importance because Moses's communications with the two spirits contained nothing that wasn't already printed in the newspaper. When the spirits were pressed for further information, they were unable to provide any. Researchers concluded that Moses had seen the obituary, forgotten it, and then resurfaced the memory during the séance."</ref><ref>{{w|Robert Todd Carroll}}. (2014). [http://skepdic.com/cryptomn.html "Cryptomnesia"]. ''{{w|The Skeptic's Dictionary}}''. Retrieved 2014-07-12.</ref> || {{w|Cryptomnesia}} is "an implicit memory phenomenon in which people mistakenly believe that a current thought or idea is a product of their own creation when, in fact, they have encountered it previously and then forgotten it".<ref>{{cite web |title=cryptomnesia |url=https://dictionary.apa.org/cryptomnesia |website=dictionary.apa.org |accessdate=7 May 2020}}</ref>
|-
| 1876 || Memory bias || Field development || German experimental psychologist {{w|Gustav Fechner}} conducts the earliest known research on the {{w|mere-exposure effect}}.<ref>{{cite web |title=Mere Exposure Effect |url=https://www.wiwi.europa-uni.de/de/lehrstuhl/fine/mikro/bilder_und_pdf-dateien/WS0910/VLBehEconomics/Ausarbeitungen/MereExposure.pdf |website=wiwi.europa-uni.de |accessdate=10 April 2020}}</ref> || {{w|Mere-exposure effect}} "means that people prefer things that they are most familiar with".<ref>{{cite web |title=6 Conversion Principles You Can Learn From The Mere-Exposure Effect |url=https://marketingland.com/6-conversion-principles-can-learn-mere-exposure-effect-140430 |website=marketingland.com |accessdate=7 May 2020}}</ref> It is "the tendency to express undue liking for things merely because of familiarity with them."<ref name="dsaaaa"/>
|-
| 1882 || || Concept development || The term ''specious present'' is first introduced by the philosopher E. R. Clay.<ref name="kelly">Anonymous (E. Robert Kelly, 1882) [https://archive.org/details/alternativeastu00claygoog/page/n5/mode/2up ''The Alternative: A Study in Psychology'']. London: Macmillan and Co. p. 168.</ref><ref name=andersen>{{cite journal | last1 = Andersen | first1 = Holly | last2 = Grush | first2 = Rick | name-list-format = vanc | title = A brief history of time-consciousness: historical precursors to James and Husserl | journal = Journal of the History of Philosophy | date = 2009 | volume = 47 | issue = 2 | pages = 277–307| doi = 10.1353/hph.0.0118 |url = https://web.archive.org/web/20080216100320/http://mind.ucsd.edu/papers/bhtc/Andersen%26Grush.pdf}}</ref> || {{w|Specious present}} "is the time duration wherein a state of {{w|consciousness}} is experienced as being in the {{w|present}}".<ref name=james>{{cite book | vauthors = James W | date = 1893 | url = https://archive.org/details/bub_gb_JLcAAAAAMAAJ | title = The principles of psychology | location = New York | publisher = H. Holt and Company. | page = [https://archive.org/details/bub_gb_JLcAAAAAMAAJ/page/n624 609] }}</ref>
|-
| 1885 || Memory bias || Concept development || The {{w|spacing effect}} is first identified by {{w|Hermann Ebbinghaus}}, whose detailed study of it is published in his book ''Über das Gedächtnis. Untersuchungen zur experimentellen Psychologie'' (''Memory: A Contribution to Experimental Psychology''). || "The {{w|spacing effect}} describes the robust finding that long-term learning is promoted when learning events are spaced out in time, rather than presented in immediate succession".<ref>{{cite journal |last1=Vlach |first1=Haley A. |last2=Sandhofer |first2=Catherine M. |title=Distributing Learning Over Time: The Spacing Effect in Children’s Acquisition and Generalization of Science Concepts |doi=10.1111/j.1467-8624.2012.01781.x |pmid=22616822 |url=https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3399982/ |pmc=3399982}}</ref>
|-
| 1890 || Memory bias || Concept development || The {{w|tip of the tongue}} phenomenon is first described as a psychological phenomenon in the text ''{{w|The Principles of Psychology}}'' by {{w|William James}}.<ref name="James">James, W. (1890). ''Principles of Psychology''. Retrieved from http://psychclassics.yorku.ca/James/Principles/</ref> || {{w|Tip of the tongue}} describes "a state in which one cannot quite recall a familiar word but can recall words of similar form and meaning".<ref>{{cite journal |last1=Brown |first1=Roger |last2=McNeill |first2=David |title=The “tip of the tongue” phenomenon |doi=10.1016/S0022-5371(66)80040-3 |url=https://www.sciencedirect.com/science/article/abs/pii/S0022537166800403}}</ref>
|-
| 1893 || Memory bias || Concept development || {{w|Childhood amnesia}} is first formally reported by psychologist Caroline Miles in her article ''A study of individual psychology'', published in the ''American Journal of Psychology''.<ref name=WhereOhWhere>{{cite journal|last=Bauer|first=P|title=Oh where, oh where have those early memories gone? A developmental perspective on childhood amnesia|journal=Psychological Science Agenda|volume=18|year=2004|url=http://www.apa.org/science/about/psa/2004/12/bauer.aspx|issue=12 }}</ref> || {{w|Childhood amnesia}} "refers to the fact that most people cannot remember events that occurred before the age of 3 or 4".<ref>{{cite web |title=Childhood Amnesia |url=https://www.sciencedirect.com/topics/medicine-and-dentistry/childhood-amnesia |website=sciencedirect.com |accessdate=7 May 2020}}</ref>
|-
| 1906 || Social (conformity bias) || Concept development || The first known use of {{w|bandwagon effect}} occurs in this year.<ref>{{cite web |title=bandwagon effect |url=https://www.merriam-webster.com/dictionary/bandwagon%20effect |website=merriam-webster.com |accessdate=7 April 2020}}</ref> || "Bandwagon effect is when an idea or belief is being followed because everyone seems to be doing so."<ref>{{cite web |title=Bandwagon Effect - Biases & Heuristics |url=https://thedecisionlab.com/biases/bandwagon-effect/ |website=The Decision Lab |access-date=26 January 2021 |language=en-CA}}</ref>
|-
| 1906 || Social bias || Field development || American sociologist [[w:William Graham Sumner|William Sumner]] posits that humans are a species that join together in groups by their very nature. However, he also maintains that humans have an innate tendency to favor their own group over others, proclaiming how "each group nourishes its own pride and vanity, boasts itself superior, exalts its own divinities, and looks with contempt on outsiders".<ref>Sumner, William Graham. (1906). ''Folkways: A Study of the Social Importance of Usages, Manners, Customs, Mores, and Morals''. Boston, MA: Ginn.</ref> || {{w|In-group favoritism}} is "the tendency to favor members of one's own group over those in other groups".<ref>{{cite journal |last1=Everett |first1=Jim A. C. |last2=Faber |first2=Nadira S. |last3=Crockett |first3=Molly |title=Preferences and beliefs in ingroup favoritism |doi=10.3389/fnbeh.2015.00015 |pmid=25762906 |url=https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4327620/ |pmc=4327620}}</ref>
|-
| 1909 || Memory bias || Concept development || The first documented empirical studies on the {{w|testing effect}} are published by Edwina E. Abbott.<ref>{{cite journal|last1=Abbott|first1=Edwina|date=1909|title=On the analysis of the factors of recall in the learning process|url=https://insights.ovid.com/psychological-monographs-general-applied/pmga/1909/11/010/analysis-factor-recall-learning-process/5/00006828|journal=Psychological Monographs: General and Applied|volume=11|issue=1|pages=159–177|via=Ovid|doi=10.1037/h0093018}}</ref><ref>{{Cite book|last1=Larsen|first1=Douglas P.|last2=Butler|first2=Andrew C.|date=2013|editor-last=Walsh, K.|title=Test-enhanced learning|url=https://books.google.com/?id=KW2rAAAAQBAJ&pg=PA443&dq=Test-enhanced+learning#v=onepage&q=Test-enhanced%20learning&f=false|journal=In Oxford Textbook of Medical Education|volume=|issue=|pages=443–452}}</ref> || "{{w|Testing effect}} is the finding that long-term memory is often increased when some of the learning period is devoted to retrieving the to-be-remembered information."<ref>{{cite book |last1=Goldstein |first1=E. Bruce |title=Cognitive Psychology: Connecting Mind, Research and Everyday Experience |publisher=Cengage Learning |isbn=978-1-133-00912-2 |url=https://books.google.com.ar/books?id=9TUIAAAAQBAJ&pg=PA231&redir_esc=y |language=en}}</ref>
|-
| 1913 || || Concept development || The term "{{w|Monte Carlo fallacy}}" (also known as {{w|Gambler's fallacy}}) originates from the best-known [[w:Gambler's fallacy#Monte Carlo Casino|example]] of the phenomenon, which occurred at the {{w|Monte Carlo Casino}} in 1913.<ref name= "monte_carlo">{{Cite web|url=http://www.bbc.com/future/story/20150127-why-we-gamble-like-monkeys|title=Why we gamble like monkeys|work=BBC.com|date=2015-01-02}}</ref> || Gambler's fallacy "occurs when an individual erroneously believes that a certain random event is less likely or more likely, given a previous event or a series of events."<ref>{{cite web |title=Gambler's Fallacy |url=https://www.investopedia.com/terms/g/gamblersfallacy.asp |website=investopedia.com |accessdate=7 May 2020}}</ref>
|-
| 1914 || Memory bias || Concept development || The first research on the {{w|cross-race effect}} is published.<ref>{{cite journal | last1 = Feingold | first1 = CA | year = 1914 | title = The influence of environment on identification of persons and things | url = https://scholarlycommons.law.northwestern.edu/jclc/vol5/iss1/6| journal = Journal of Criminal Law and Police Science | volume = 5 | issue = 1| pages = 39–51 | doi=10.2307/1133283| jstor = 1133283 }}</ref> || {{w|Cross-race effect}} is "the tendency for eyewitnesses to be better at recognizing members of their own race/ethnicity than members of other races."<ref>{{cite journal |last1=Laub |first1=Cindy E. |last2=Meissner |first2=Christian A. |last3=Susa |first3=Kyle J. |title=The Cross-Race Effect: Resistant to Instructions |doi=10.1155/2013/745836 |url=https://www.hindawi.com/journals/jcrim/2013/745836/}}</ref>
|-
| 1920 || Social bias || Concept development || The {{w|halo effect}} is named by psychologist {{w|Edward Thorndike}}<ref>{{Cite book |title=The Advanced Dictionary of Marketing, Scott G. Dacko, 2008: Marketing |date=2008-06-18 |publisher=Oxford University Press |isbn=9780199286003 |location=Oxford |pages=248}}</ref> in reference to a person being perceived as having a [[w:Halo (religious iconography)|halo]]. He gives the phenomenon its name in his article ''A Constant Error in Psychological Ratings''.<ref name=":2">{{harvnb | Thorndike | 1920}}</ref> In "Constant Error", Thorndike sets out to replicate the study in hopes of pinning down the bias that he thought was present in these ratings. Subsequent researchers would study it in relation to {{w|attractiveness}} and its bearing on the judicial and educational systems.<ref name="BBdang">{{Cite journal|last=Sigall|first=Harold|last2=Ostrove|first2=Nancy|date=1975-03-01|title=Beautiful but Dangerous: Effects of Offender Attractiveness and Nature of the Crime on Juridic Judgment|url=https://www.researchgate.net/publication/232451231|journal=Journal of Personality and Social Psychology|volume=31|issue=3|pages=410–414|doi=10.1037/h0076472}}</ref> Thorndike originally coins the term referring only to people; however, its use would be greatly expanded, especially in the area of brand marketing.<ref name=":2" /> || {{w|Halo effect}} refers to an "error in reasoning in which an impression formed from a single trait or characteristic is allowed to influence multiple judgments or ratings of unrelated factors."<ref>{{cite web |title=Halo effect |url=https://www.britannica.com/science/halo-effect |website=britannica.com |accessdate=7 May 2020}}</ref>
|-
| 1922 || || Concept development || The term “stereotype” is first used in the modern psychological sense by American journalist Walter Lippmann in his work ''Public Opinion''.<ref name="Stereotypes Defined"/> || "Stereotype is most frequently now employed to refer to an often unfair and untrue belief that many people have about all people or things with a particular characteristic."<ref>{{cite web |title=Definition of STEREOTYPE |url=https://www.merriam-webster.com/dictionary/stereotype |website=www.merriam-webster.com |access-date=28 January 2021 |language=en}}</ref>
|-
| 1927 || Memory bias || Research || [[w:Lithuanians|Lithuanian]]-[[w:Soviet Union|Soviet]] {{w|psychologist}} {{w|Bluma Zeigarnik}} at the {{w|University of Berlin}} first describes the phenomenon that would later be known as the {{w|Zeigarnik effect}}.<ref>{{cite web |title=Bluma Wulfovna Zeigarnik |url=https://www.thescienceofpsychotherapy.com/bluma-wulfovna-zeigarnik/ |website=The Science of Psychotherapy |access-date=16 March 2021 |language=en-AU |date=31 March 2014}}</ref><ref>Zeigarnik 1927: "Das Behalten erledigter und unerledigter Handlungen". ''{{w|Psychologische Forschung}}'' 9, 1-85.</ref> || {{w|Zeigarnik effect}} is the "tendency to remember interrupted or incomplete tasks or events more easily than tasks that have been completed."<ref>{{cite web |title=Zeigarnik Effect |url=https://www.goodtherapy.org/blog/psychpedia/zeigarnik-effect |website=goodtherapy.org |accessdate=7 May 2020}}</ref>
|-
| 1928 || Belief, decision-making and behavioral || Literature || American economist {{w|Irving Fisher}} publishes ''The {{w|Money Illusion}}'', which develops the concept of the same name.<ref>{{Citation | title = The Money Illusion | last = Fisher | first = Irving | publisher = Adelphi Company | year = 1928 |location=New York }}</ref> || "Money illusion posits that people have a tendency to view their wealth and income in nominal dollar terms, rather than recognize its real value, adjusted for inflation."<ref>{{cite web |last1=Liberto |first1=Daniel |title=Money Illusion Definition |url=https://www.investopedia.com/terms/m/money_illusion.asp |website=Investopedia |access-date=26 January 2021 |language=en}}</ref>
|-
| 1930 || || Concept development || English epistemologist {{w|C. D. Broad}} further elaborates on the concept of the {{w|specious present}} and states that it may be considered as the temporal equivalent of a sensory datum.<ref name=andersen /> || "The specious present is a term applied to that short duration of time the human mind appears to be able to experience, a period which exists between past and future and which is longer than the singular moment of the actual present."<ref>{{cite web |title=The Specious Present: Andrew Beck, David Claerbout, Colin McCahon, Keith Tyson - Announcements - Art & Education |url=https://www.artandeducation.net/announcements/106498/the-specious-present-andrew-beck-david-claerbout-colin-mccahon-keith-tyson |website=www.artandeducation.net |access-date=27 January 2021}}</ref>
|-
| 1932 || Memory bias || Field development || Some of the earliest evidence for the {{w|Fading Affect Bias}} dates back to a study by Cason, who uses a retrospective procedure in which participants recall and rate past events and the emotions they evoked, and finds that recalled emotional intensity for positive events is generally stronger than that for negative events.<ref>{{Cite journal|last=Fleming|first=G. W. T. H.|date=January 1933|title=The Learning and Retention of Pleasant and Unpleasant Activities. (Arch. of Psychol., No. 134, 1932.) Cason, H.|journal=Journal of Mental Science|volume=79|issue=324|pages=187–188|doi=10.1192/bjp.79.324.187-c|issn=0368-315X}}</ref> || The {{w|Fading Affect Bias}} "indicates that the emotional response prompted by positive memories often tends to be stronger than the emotional response prompted by negative memories."<ref>{{cite journal |last1=Skowronski |first1=John J. |last2=Walker |first2=W. Richard |last3=Henderson |first3=Dawn X. |last4=Bond |first4=Gary D. |title=Chapter Three - The Fading Affect Bias: Its History, Its Implications, and Its Future |doi=10.1016/B978-0-12-800052-6.00003-2 |url=https://www.sciencedirect.com/science/article/pii/B9780128000526000032}}</ref>
|-
| 1933 || Memory bias || Concept development || The {{w|Von Restorff effect}} is first identified by German psychiatrist and pediatrician {{w|Hedwig von Restorff}}, who finds that when participants are presented with a list of categorically similar items containing one distinctive, isolated item, memory for that item is improved.<ref name="vonRestorff1933">{{cite journal|last1=von Restorff|first1=Hedwig|title=Über die Wirkung von Bereichsbildungen im Spurenfeld|journal=Psychologische Forschung [Psychological Research]|date=1933|volume=18|issue=1|pages=299–342|doi=10.1007/BF02409636|trans-title=The effects of field formation in the trace field|url=http://www.utsa.edu/mind/von_restorff_translation.htm|language=de}}</ref> || "It predicts that when multiple similar objects are present, the one that differs from the rest is most likely to be remembered."<ref>{{cite web |title=The Von Restorff effect |url=https://lawsofux.com/von-restorff-effect |website=lawsofux.com |accessdate=7 May 2020}}</ref>
|-
| 1942 || || Concept development || The {{w|Einstellung effect}} is first described by Dr. Abraham Luchins.<ref>{{cite web |title=The Einstellung Effect - Thinking Differently |url=https://exploringyourmind.com/the-einstellung-effect-thinking-differently/ |website=Exploring your mind |access-date=18 April 2021 |language=en |date=27 January 2020}}</ref> || "The Einstellung Effect is a type of mindset that causes humans to repeat the use of "tried and true" strategies for problem solving, even when a simpler solution strategy exists."<ref>{{cite web |title=Einstellung Effect definition {{!}} Psychology Glossary {{!}} alleydog.com |url=https://www.alleydog.com/glossary/definition.php?term=Einstellung+Effect |website=www.alleydog.com |access-date=17 May 2021}}</ref>
|-
| 1945 || Belief, decision-making and behavioral (anchoring bias) || Concept development || {{w|Karl Duncker}} defines {{w|functional fixedness}} as being a "mental block against using an object in a new way that is required to solve a problem".<ref name=Duncker1945>Duncker, K. (1945). "On problem solving". ''{{w|Psychological Monographs}}'', 58:5 (Whole No. 270).</ref> || {{w|Functional fixedness}} "is the inability to realize that something known to have a particular use may also be used to perform other functions."<ref>{{cite web |title=Functional fixedness |url=https://www.britannica.com/science/functional-fixedness |website=britannica.com |accessdate=7 May 2020}}</ref>
|-
| 1946 || Belief, decision-making and behavioral (logical fallacy) || Concept development || American statistician {{w|Joseph Berkson}} illustrates what would later be known as {{w|Berkson's paradox}} (also called Berkson's bias or fallacy), one of the most famous paradoxes in probability and statistics.<ref>{{cite journal |last1=Batsidis |first1=Apostolos |last2=Tzavelas |first2=George |last3=Alexopoulos |first3=Panagiotis |title=Berkson's paradox and weighted distributions: An application to Alzheimer's disease |url=https://onlinelibrary.wiley.com/doi/abs/10.1002/bimj.201900046}}</ref> || {{w|Berkson's paradox}} "is a type of selection bias{{snd}}a mathematical result found in the fields of conditional probability and statistics in which two variables can be negatively correlated even though they have the appearance of being positively correlated within the population."<ref>{{cite web |title=Berkson's Paradox (Berkson's Bias) |url=https://www.alleydog.com/glossary/definition.php?term=Berkson%27s+Paradox+%28Berkson%27s+Bias%29 |website=alleydog.com |accessdate=14 August 2020}}</ref>
|-
| 1947 || Belief, decision-making and behavioral ({{w|extension neglect}}) || Concept development || {{w|Joseph Stalin}} is credited by some with introducing the concept of {{w|compassion fade}} through his statement “the death of one man is a tragedy, the death of millions is a statistic”.<ref name=":4">Johnson, J. (2011). The arithmetic of compassion: rethinking the politics of photography. ''British Journal of Political Science, 41''(3), 621-643. doi: 10.1017/S0007123410000487.</ref> However, others consider the quotation misattributed.<ref>{{cite web |title=Joseph Stalin - Wikiquote |url=https://en.wikiquote.org/wiki/Joseph_Stalin#Misattributed |website=en.wikiquote.org |access-date=17 May 2021 |language=en}}</ref> || Compassion fade "refers to the decrease in the compassion one shows for the people in trouble as the number of the victims increase."<ref>{{cite web |title=Compassion fade |url=http://econowmics.com/compassion-fade/ |website=econowmics.com |access-date=15 January 2021}}</ref>
|-
| 1952 || Social (conformity bias) || Concept development || [[w:William H. Whyte|William H. Whyte Jr.]] derives the term ''{{w|groupthink}}'' from {{w|George Orwell}}'s ''{{w|Nineteen Eighty-Four}}'' and popularizes it in [[w:Fortune (magazine)|''Fortune'']] magazine:

{{quote|text=Groupthink being a coinage – and, admittedly, a loaded one – a working definition is in order. We are not talking about mere instinctive conformity – it is, after all, a perennial failing of mankind. What we are talking about is a ''rationalized'' conformity – an open, articulate philosophy which holds that group values are not only expedient but right and good as well.<ref>{{cite news |first=W. H., Jr. |last=Whyte |author-link=William H. Whyte |title=Groupthink |journal=[[w:Fortune (magazine)|Fortune]] |date=March 1952 |pages=114–117, 142, 146}}</ref><ref>{{cite web |last1=Safire |first1=William |title=THE WAY WE LIVE NOW: 8-8-04: ON LANGUAGE; Groupthink (Published 2004) |url=https://query.nytimes.com/gst/fullpage.html?res=9C01E2DD173CF93BA3575BC0A9629C8B63 |website=The New York Times |access-date=14 March 2021 |date=8 August 2004}}</ref>}}
|| "Groupthink is a psychological phenomenon in which people strive for consensus within a group."<ref>{{cite web |title=The Psychology Behind Why We Strive for Consensus |url=https://www.verywellmind.com/what-is-groupthink-2795213 |website=Verywell Mind |language=en}}</ref>
|-
| 1954 || Social bias || Concept development || The {{w|social comparison theory}} is initially proposed by {{w|social psychologist}} {{w|Leon Festinger}}. It centers on the belief that there is a drive within individuals to gain accurate self-evaluations.<ref name="Festinger1954">{{cite journal | author = Festinger L | year = 1954 | title = A theory of social comparison processes | url = | journal = Human Relations | volume = 7 | issue = 2| pages = 117–140 | doi=10.1177/001872675400700202}}</ref> || The {{w|social comparison theory}} refers to "the idea that individuals determine their own social and personal worth based on how they stack up against others".<ref>{{cite web |title=Social Comparison Theory |url=https://www.psychologytoday.com/intl/basics/social-comparison-theory |website=psychologytoday.com |accessdate=7 May 2020}}</ref>
|-
| 1956 || || Concept development || The term "{{w|Barnum effect}}" is coined by psychologist {{w|Paul Meehl}} in his essay ''Wanted – A Good Cookbook'', because he relates the vague personality descriptions used in certain "pseudo-successful" psychological tests to those given by showman {{w|P. T. Barnum}}.<ref name=Meehl1956>{{cite journal|last1=Meehl |first1=Paul E. |title=Wanted – A Good Cookbook |journal=American Psychologist |date=1956 |volume=11 |issue=6 |pages=263–272 |doi=10.1037/h0044164 |df= }}</ref><ref name="Dutton1988">{{cite journal|last1=Dutton|first1=D. L.|title=The cold reading technique|journal=Experientia|date=1988|volume=44|issue=4|pages=326–332|doi=10.1007/BF01961271|url=http://denisdutton.com/cold_reading.htm|language=en|pmid=3360083}}</ref> || {{w|Barnum effect}} is "the phenomenon that occurs when individuals believe that personality descriptions apply specifically to them (more so than to other people), despite the fact that the description is actually filled with information that applies to everyone."<ref>{{cite web |title=Barnum Effect |url=https://www.britannica.com/science/Barnum-Effect |website=britannica.com |accessdate=7 May 2020}}</ref>
|-
| 1957 || || Concept development || British naval historian {{w|C. Northcote Parkinson}} describes what is later called {{w|Parkinson's law of triviality}}, which argues that members of an organization give disproportionate weight to trivial issues.<ref name="parkinson">{{cite book |first=C. Northcote |last=Parkinson |title = Parkinson's Law, or the Pursuit of Progress |publisher=John Murray |isbn=0140091076|year=1958}}</ref> || {{w|Parkinson's law of triviality}} (also known as the bike-shed effect) "explains that people will give more energy and focus to trivial or unimportant items than to more important and complex ones."<ref>{{cite web |title=How to Handle Bikeshedding: Parkinson’s Law of Triviality |url=https://projectbliss.net/bikeshedding-parkinsons-law-of-triviality/ |website=projectbliss.net |accessdate=7 May 2020}}</ref>
|-
| 1960 || Belief, decision-making and behavioral || Concept development || English psychologist {{w|Peter Cathcart Wason}} first describes the {{w|confirmation bias}}.<ref>{{cite web |title=The Curious Case of Confirmation Bias |url=https://www.psychologytoday.com/us/blog/seeing-what-others-dont/201905/the-curious-case-confirmation-bias |website=psychologytoday.com |accessdate=7 April 2020}}</ref><ref>{{cite book |last1=Acks |first1=Alex |title=The Bubble of Confirmation Bias |url=https://books.google.com.ar/books?id=hPWCDwAAQBAJ&pg=PA9&dq=confirmation+bias%22+was+coined+by+English+psychologist+Peter+Wason&hl=en&sa=X&ved=0ahUKEwiMnaen1dboAhVAIrkGHX4TAwEQ6AEIMTAB#v=onepage&q=confirmation%20bias%22%20was%20coined%20by%20English%20psychologist%20Peter%20Wason&f=false}}</ref><ref>{{cite book |last1=Myers |first1=David G. |title=Psychology |url=https://books.google.com.ar/books?id=OqZZAAAAYAAJ&q=confirmation+bias%22+was+coined+by+English+psychologist+Peter+Wason&dq=confirmation+bias%22+was+coined+by+English+psychologist+Peter+Wason&hl=en&sa=X&ved=0ahUKEwiMnaen1dboAhVAIrkGHX4TAwEQ6AEISzAE}}</ref> || "{{w|Confirmation bias}} is the tendency of people to favor information that confirms their existing beliefs or hypotheses."<ref>{{cite web |title=Confirmation Bias |url=https://www.simplypsychology.org/confirmation-bias.html |website=simplypsychology.org |accessdate=14 August 2020}}</ref>
|-
| 1960 || Belief, decision-making and behavioral ({{w|confirmation bias}}) || Concept development || {{w|Peter Cathcart Wason}} discovers the classic example of subjects' {{w|congruence bias}}.<ref>{{cite web |title=The Curious Case of Confirmation Bias |url=https://www.psychologytoday.com/gb/blog/seeing-what-others-dont/201905/the-curious-case-confirmation-bias#:~:text=Confirmation%20bias%20was%20first%20described,their%20triple%20fit%20the%20rule. |website=psychologytoday.com |accessdate=14 August 2020}}</ref> || {{w|Congruence bias}} is "the tendency to test hypotheses exclusively through direct testing, instead of considering possible alternatives."<ref>{{cite web |title=Cognitive Bias in Decision Making |url=https://associationanalytics.com/2015/11/30/cognitive-bias-in-decision-making/ |website=associationanalytics.com |accessdate=7 May 2020}}</ref>
|-
| 1961 || Social bias || Research || The {{w|Milgram experiment}} is conducted. This classic experiment establishes the existence of {{w|authority bias}}.<ref>{{cite book |author=Ellis RM |title=Middle Way Philosophy: Omnibus Edition |year=2015 |publisher=[[w:Lulu (company)|Lulu Press]] | url=https://books.google.com/books?id=xG9rCgAAQBAJ&dq=Ellis+RM+Middle+Way+Philosophy%3A+Omnibus+Edition&q=authority#v=onepage&q=milgram&f=false|isbn=9781326351892 }}</ref> || "{{w|Authority bias}} is the human tendency to attribute greater authority and knowledge to persons of authority (fame, power, position, etc.) than they may actually possess."<ref>{{cite web |title=Authority Bias |url=https://www.alleydog.com/glossary/definition.php?term=Authority+Bias |website=alleydog.com |accessdate=14 August 2020}}</ref>
|-
| 1961 || {{w|Ambiguity effect}} || Concept development || The {{w|ambiguity effect}} is first described by American economist {{w|Daniel Ellsberg}}.<ref>{{cite book|last1=Borcherding|first1=Katrin|last2=Laričev|first2=Oleg Ivanovič|last3=Messick|first3=David M.|title=Contemporary Issues in Decision Making|url=https://books.google.com/books?id=W3l9AAAAMAAJ|year=1990|publisher=North-Holland|isbn=978-0-444-88618-7|page=50}}</ref> || "{{w|Ambiguity Effect}} occurs when people prefer options with known probabilities over those with unknown probabilities."<ref>{{cite web |title=Why we prefer options that are known to us |url=https://thedecisionlab.com/biases/ambiguity-effect/ |website=thedecisionlab.com |accessdate=14 August 2020}}</ref>
|-
| 1964 || Memory bias || Concept development || The original work on the {{w|telescoping effect}} is usually attributed to an article by Neter and Waksberg in the ''{{w|Journal of the American Statistical Association}}''.<ref name=Rubin>{{cite journal |last1=Rubin |first1=David C. |last2=Baddeley |first2=Alan D. |date=1989 |title=Telescoping is not time compression: A model |journal=Memory & Cognition |doi=10.3758/BF03202626 |pmid=2811662 |volume=17 |issue=6|pages=653–661}}</ref> The term telescoping comes from the idea that time seems to shrink toward the present in the way that the distance to objects seems to shrink when they are viewed through a telescope.<ref name=Rubin/> || "The telescoping effect refers to inaccurate perceptions regarding time, where people see recent events as more remote than they are (backward telescoping), and remote events as more recent (forward telescoping)."<ref>{{cite web |title=Telescoping effect - Biases & Heuristics |url=https://thedecisionlab.com/biases/telescoping-effect/ |website=The Decision Lab |access-date=26 January 2021 |language=en-CA}}</ref>
|-
| 1964 || Belief, decision-making and behavioral (anchoring bias) || Concept development || The first recorded statement of the concept of {{w|Law of the instrument}} is {{w|Abraham Kaplan}}'s: "I call it ''the law of the instrument,'' and it may be formulated as follows: Give a small boy a hammer, and he will find that everything he encounters needs pounding."<ref>{{cite book |title=The Conduct of Inquiry: Methodology for Behavioral Science |author=Abraham Kaplan |publisher=San Francisco: Chandler Publishing Co |year=1964 |page=28 |url=https://books.google.com/books?id=OYe6fsXSP3IC&pg=PA28 |isbn=9781412836296}}</ref> || "The law of the instrument principle states that when we acquire a specific tool/skill, we tend to see opportunities to use that tool/skill everywhere."<ref>{{cite web |title=Law of the instrument - Biases & Heuristics |url=https://thedecisionlab.com/biases/law-of-the-instrument/ |website=The Decision Lab |access-date=27 January 2021 |language=en-CA}}</ref>
|-
| 1966 || Social (egocentric bias) || Research || American social psychologist Elaine Walster hypothesizes that it can be frightening to believe that a misfortune could happen to anyone at random, and that attributing responsibility to the person(s) involved helps to manage this emotional reaction.<ref>{{cite journal |last1=Walster |first1=Elaine |title=Assignment of responsibility for an accident. |journal=Journal of Personality and Social Psychology |date=1966 |volume=3 |issue=1 |pages=73–79 |doi=10.1037/h0022733}}</ref> || "The {{w|defensive attribution hypothesis}} is a social psychology term that describes an attributional approach taken by some people - a set of beliefs that an individual uses to protect or "shield" themselves against fears of being the victim or cause of a major mishap."<ref>{{cite web |title=Defensive Attribution Hypothesis definition {{!}} Psychology Glossary {{!}} alleydog.com |url=https://www.alleydog.com/glossary/definition.php?term=Defensive+Attribution+Hypothesis |website=www.alleydog.com |access-date=29 January 2021}}</ref>
|-
| 1967 || Belief, decision-making and behavioral || Notable case || {{w|Risk compensation}}. Sweden experiences a drop in crashes and fatalities, following [[w:Dagen H|the change from driving on the left to driving on the right]]. This is linked to the increased apparent risk. The number of motor insurance claims goes down by 40%, returning to normal over the next six weeks.<ref>{{cite book|title=Risk and Freedom: Record of Road Safety Regulation|first=John |last=Adams|publisher=Brefi Press|year=1985|isbn=9780948537059}}</ref><ref>{{cite news|quote=On the day of the change, only 150 minor accidents were reported. Traffic accidents over the next few months went down. ... By 1969, however, accidents were back at normal levels|title=Dagen H: The day Sweden switched sides of the road|work=Washington Post|url=https://www.washingtonpost.com/blogs/blogpost/post/dagen-h-the-day-sweden-switched-sides-of-the-road-photo/2012/02/17/gIQAOwFVKR_blog.html|first=Elizabeth|last=Flock|date=2012-02-17}}</ref> Fatality levels would take two years to return to normal.<ref>"On September 4 there were 125 reported traffic accidents as opposed to 130-196 from the previous Mondays. No traffic fatalities were linked to the switch. In fact, fatalities dropped for two years, possibly because drivers were more vigilant after the switch." Sweden finally began driving on the right side of the road in 1967 ''The Examiner'' Sept 2, 2009</ref> || "{{w|Risk compensation}} postulates that humans have a built-in level of acceptable risk-taking and that our behaviour adjusts to this level in a homeostatic manner".<ref>{{cite journal |last1=Mok |first1=D |last2=Gore |first2=G |last3=Hagel |first3=B |last4=Mok |first4=E |last5=Magdalinos |first5=H |last6=Pless |first6=B |title=Risk compensation in children’s activities: A pilot study |doi=10.1093/pch/9.5.327 |pmid=19657519 |url=https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2721187/ |pmc=2721187}}</ref>
|-
| 1967 || Belief, decision-making and behavioral ({{w|apophenia}}) || Concept development || {{w|Illusory correlation}} is originally coined by Chapman and Chapman to describe people's tendencies to overestimate relationships between two groups when distinctive and unusual information is presented.<ref name="Chapman1967">{{cite journal|last1=Chapman|first1=L|title=Illusory correlation in observational report|journal=Journal of Verbal Learning and Verbal Behavior|volume=6|issue=1|year=1967|pages=151–155|doi=10.1016/S0022-5371(67)80066-5}}</ref> || An {{w|illusory correlation}} occurs when a person perceives a relationship between two variables that are not in fact correlated.<ref>{{cite web |title=Illusory Correlation |url=http://psychology.iresearchnet.com/social-psychology/decision-making/illusory-correlation/ |website=psychology.iresearchnet.com |accessdate=17 July 2020}}</ref>
|-
| 1967 || Social (attribution bias) || Research || American social psychologist {{w|Edward E. Jones}} and Victor Harris conduct a classic experiment<ref name="JonesHarris67">{{cite journal|last=Jones|first=E. E.|last2=Harris|first2=V. A.|year=1967|title=The attribution of attitudes|journal=Journal of Experimental Social Psychology|volume=3|issue=1|pages=1–24|doi=10.1016/0022-1031(67)90034-0}}</ref> that would later give rise to the phrase {{w|Fundamental attribution error}}, coined by {{w|Lee Ross}}.<ref>{{cite book|title=Advances in experimental social psychology|last=Ross|first=L.|publisher=Academic Press|year=1977|isbn=978-0-12-015210-0|editor-last=Berkowitz|editor-first=L.|volume=10|location=New York|pages=173–220|chapter=The intuitive psychologist and his shortcomings: Distortions in the attribution process}}</ref> || {{w|Fundamental attribution error}} "is the tendency for people to over-emphasize dispositional, or personality-based explanations for behaviors observed in others while under-emphasizing situational explanations".<ref>{{cite web |title=Fundamental Attribution Error |url=https://www.simplypsychology.org/fundamental-attribution.html |website=simplypsychology.org |accessdate=7 May 2020}}</ref>
|-
| 1968 || Belief, decision-making and behavioral (anchoring bias) || Concept development || American psychologist {{w|Ward Edwards}} discusses the concept of [[w:Conservatism (belief revision)|conservatism bias]].<ref name="edwards1">Edwards, Ward. "Conservatism in Human Information Processing (excerpted)". In Daniel Kahneman, Paul Slovic and Amos Tversky. (1982). ''Judgment under uncertainty: Heuristics and biases''. New York: Cambridge University Press. Original work published 1968.</ref> || "[[w:Conservatism (belief revision)|Conservatism bias]] is a mental process in which people maintain their past views or predictions at the cost of recognizing new information."<ref>{{cite web |title=Conservatism Bias |url=https://dwassetmgmt.com/conservatism-bias/|website=dwassetmgmt.com |accessdate=8 May 2020}}</ref>
|-
| 1968 || Social || Concept development || German-born American psychologist [[w:Robert Rosenthal (psychologist)|Robert Rosenthal]] and Lenore Jacobson first describe what would be called the {{w|Pygmalion Effect}} (also known as the Rosenthal effect).<ref>{{cite web |title=Statistics How To |url=https://www.statisticshowto.com/pygmalion-effect-rosenthal/ |website=statisticshowto.com |accessdate=7 April 2020}}</ref> || The {{w|Pygmalion Effect}} "refers to the phenomenon of people improving their performance when others have high expectations of them."<ref>{{cite web |title=Pygmalion Effect |url=https://www.alleydog.com/glossary/definition.php?term=Pygmalion+Effect |website=alleydog.com |accessdate=7 May 2020}}</ref>
|-
| 1969 || Social ({{w|cognitive dissonance}}) || Concept development || Researchers Jon Jecker and David Landy experimentally confirm the {{w|Ben Franklin effect}}.<ref>{{cite web |title=To Become Super-Likable, Practice “The Ben Franklin Effect” |url=https://medium.com/swlh/practice-the-ben-franklin-effect-to-become-super-likable-23f98bf1ecdb |website=medium.com |accessdate=13 March 2020}}</ref> || The {{w|Ben Franklin effect}} refers to "an altruistic reaction that makes a person more likely to do a favor for someone that they have already completed a favor for; more likely than they are to return a favor to someone who has completed a favor for them."<ref>{{cite web |title=Ben Franklin Effect |url=https://www.alleydog.com/glossary/definition.php?term=Ben+Franklin+Effect |website=alleydog.com |accessdate=7 May 2020}}</ref>
|-
| 1969 || Memory bias || Research || Crowder and Morton argue that the suffix effect is a reflection of the contribution of the auditory sensory memory or echoic memory to recall in the nonsuffix control condition.<ref>{{cite web |title=The suffix effect: How many positions are involved? |url=https://link.springer.com/content/pdf/10.3758/BF03197612.pdf |website=link.springer.com |accessdate=5 May 2020}}</ref> || "The suffix effect is the selective impairment in recall of the final items of a spoken list when the list is followed by a nominally irrelevant speech item, or suffix."<ref>{{cite web |title=Two-component theory of the suffix effect: Contrary evidence |url=https://link.springer.com/article/10.3758/BF03193586#:~:text=The%20suffix%20effect%20is%20the,irrelevant%20speech%20item%2C%20or%20suffix.&text=The%20entire%20suffix%20effect%20may,phenomenon%20arising%20from%20perceptual%20grouping. |website=link.springer.com |accessdate=16 July 2020}}</ref>
|-
| 1971 || Social bias || Concept development || The concept of {{w|actor–observer asymmetry}} (also actor–observer bias) is introduced by Jones and Nisbett. It explains the errors that one makes when forming attributions about the behavior of others.<ref>{{cite journal |last1=Malle |first1=BF |title=The actor-observer asymmetry in attribution: a (surprising) meta-analysis. |doi=10.1037/0033-2909.132.6.895 |pmid=17073526 |url=https://www.ncbi.nlm.nih.gov/pubmed/17073526}}</ref> || The {{w|actor–observer asymmetry}} "states that people tend to explain their own behavior with situation causes and other people's behavior with person causes".<ref>{{cite web |title=The actor-observer asymmetry in attribution: A (surprising) meta-analysis. |url=https://psycnet.apa.org/record/2006-20202-004 |website=psycnet.apa.org |accessdate=7 May 2020}}</ref>
|-
| 1972 || || Concept development || The concept of {{w|cognitive bias}} is introduced by researchers {{w|Amos Tversky}} and {{w|Daniel Kahneman}}.<ref>{{cite web |title=Cognitive Bias: How Your Mind Plays Tricks on You and How to Overcome That at Work |url=https://zapier.com/blog/cognitive-bias/ |website=zapier.com |access-date=15 January 2021}}</ref> || Cognitive bias refers to "people's systematic but purportedly flawed patterns of responses to judgment and decision problems."<ref>{{cite web |title=Cognitive Bias |url=https://www.sciencedirect.com/topics/neuroscience/cognitive-bias |website=sciencedirect.com |access-date=16 January 2021}}</ref>
|-
| 1973 || Memory bias || Concept development || American academic {{w|Baruch Fischhoff}} attends a seminar at which {{w|Paul E. Meehl}} observes that clinicians often overestimate their ability to have foreseen the outcome of a particular case, claiming to have known it all along.<ref name="Fischhoff 2007">{{cite journal | last1 = Fischhoff | first1 = B | year = 2007 | title = An early history of hindsight research | url = | journal = Social Cognition | volume = 25 | issue = | pages = 10–13 | doi = 10.1521/soco.2007.25.1.10}}</ref> || "{{w|Hindsight bias}}, the tendency, upon learning an outcome of an event—such as an experiment, a sporting event, a military decision, or a political election—to overestimate one's ability to have foreseen the outcome."<ref>{{cite web |title=Hindsight bias |url=https://www.britannica.com/topic/hindsight-bias |website=Encyclopedia Britannica |access-date=27 January 2021 |language=en}}</ref>
|-
| 1973 || Belief, decision-making and behavioral ({{w|egocentric bias}}) || Concept development || The {{w|illusion of validity}} bias is first described by {{w|Amos Tversky}} and {{w|Daniel Kahneman}} in their paper.<ref>{{cite web |title=Why are we overconfident in our predictions? |url=https://thedecisionlab.com/biases/illusion-of-validity/ |website=thedecisionlab.com |accessdate=10 April 2020}}</ref> || The {{w|illusion of validity}} occurs when an individual overestimates their ability to predict an outcome when analyzing a set of data, especially when the data appears to have a consistent pattern or appears to "tell a story".<ref>{{cite web |title=Illusion Of Validity |url=https://www.alleydog.com/glossary/definition.php?term=Illusion+Of+Validity |website=alleydog.com |accessdate=7 May 2020}}</ref>
|-
| 1973 || Memory bias || Concept development || The {{w|next-in-line effect}} is first studied experimentally by Malcolm Brenner. In his experiment, participants each in turn read a word aloud from an {{w|index card}}, and after 25 words were asked to [[w:Recall (memory)|recall]] as many of the read words as possible. The results show that words read aloud within approximately nine seconds before the subject's own turn are recalled worse than other words.<ref>{{Cite journal|last=Brenner|first=Malcolm|title=The next-in-line effect|journal=Journal of Verbal Learning and Verbal Behavior|language=en|volume=12|issue=3|pages=320–323|doi=10.1016/s0022-5371(73)80076-3|year=1973|url=https://deepblue.lib.umich.edu/bitstream/2027.42/33869/1/0000130.pdf}}</ref> || The {{w|next-in-line effect}} refers to "people not remembering what other people said because they were too busy rehearsing their own part."<ref>{{cite web |title=Memory Flashcards |url=https://quizlet.com/5788833/memory-flash-cards/ |website=Quizlet |access-date=27 January 2021 |language=en-gb}}</ref>
|-
| 1974 || Memory bias || Research || {{w|Elizabeth Loftus}} and John Palmer conduct a study to investigate the effects of language on the development of {{w|false memory}}.<ref name="Loftus1">{{cite journal |doi=10.1016/s0022-5371(74)80011-3 |title=Reconstruction of automobile destruction: An example of the interaction between language and memory |journal=Journal of Verbal Learning and Verbal Behavior |volume=13 |issue=5 |pages=585–589 |year=1974 |last1=Loftus |first1=Elizabeth F. |last2=Palmer |first2=John C. }}</ref> || "False memory refers to cases in which people remember events differently from the way they happened or, in the most dramatic case, remember events that never happened at all."<ref>{{cite web |title=False memory |url=http://www.scholarpedia.org/article/False_memory#:~:text=False%20memory%20refers%20to%20cases,memory%20in%20question%20is%20wrong. |website=scholarpedia.org |accessdate=14 August 2020}}</ref>
|-
| 1974 || Belief, decision-making and behavioral || Concept development || Anchoring is first described by Tversky and Kahneman.<ref name="One of the common">{{cite journal |last1=Ralph |first1=Kelcie |last2=Delbosc |first2=Alexa |title=I’m multimodal, aren’t you? How ego-centric anchoring biases experts’ perceptions of travel patterns |doi=10.1016/j.tra.2017.04.027 |url=https://www.sciencedirect.com/science/article/pii/S0965856417301751}}</ref> || "Anchoring bias occurs when people rely too much on pre-existing information or the first information they find when making decisions."<ref>{{cite web |title=Anchoring Bias - Definition, Overview and Examples |url=https://corporatefinanceinstitute.com/resources/knowledge/trading-investing/anchoring-bias/ |website=Corporate Finance Institute |access-date=27 January 2021}}</ref>
|-
| 1975 || Social ({{w|attribution bias}}) || Research || Miller and Ross conduct a study that is one of the earliest to assess not only {{w|self-serving bias}} but also the attributions for successes and failures within this theory.<ref>{{cite journal|last=Larson|first=James|author2=Rutger U |author3=Douglass Coll |title=Evidence for a self-serving bias in the attribution of causality|journal=Journal of Personality|volume=45|issue=3|pages=430–441|doi=10.1111/j.1467-6494.1977.tb00162.x |year=1977}}</ref> || {{w|Self-serving bias}} is "the common habit of a person taking credit for positive events or outcomes, but blaming outside factors for negative events."<ref>{{cite web |title=What Is a Self-Serving Bias and What Are Some Examples of It? |url=https://www.healthline.com/health/self-serving-bias |website=healthline.com |accessdate=7 May 2020}}</ref>
|-
| 1976 || Belief, decision-making and behavioral (logical fallacy) || Concept development || {{w|Escalation of commitment}} is first described by Barry M. Staw in his paper ''Knee deep in the big muddy: A study of escalating commitment to a chosen course of action''.<ref name=Staw1976>{{cite journal|last1=Staw|first1=Barry M.|title=Knee-deep in the big muddy: a study of escalating commitment to a chosen course of action|journal=Organizational Behavior and Human Performance|date=1976|volume=16|issue=1|pages=27–44|doi=10.1016/0030-5073(76)90005-2}}</ref> || {{w|Escalation of commitment}} "refers to the irrational behavior of investing additional resources in a failing project."<ref>{{cite web |title=Escalation of Commitment: Definition, Causes & Examples |url=https://bizfluent.com/13720599/escalation-of-commitment-definition-causes-examples |website=bizfluent.com |accessdate=7 May 2020}}</ref>
|-
| 1976 || Social ({{w|attribution bias}}) || Research || Prior to Pettigrew's formalization of the {{w|ultimate attribution error}}, Birt Duncan finds that [[w:White people|White]] participants view [[w:Black people|Black]] individuals as more violent than White individuals in an "ambiguous shove" situation, where a Black or White person accidentally shoves a White person.<ref name="Duncan 1976 75–93">{{cite journal|last=Duncan|first= B. L.|title= Differential social perception and attribution of intergroup violence: Testing the lower limits of stereotyping of Blacks|journal= {{w|Journal of Personality and Social Psychology}}|year= 1976|volume= 34|issue= 4|pages= 75–93|doi= 10.1037/0022-3514.34.4.590|url= https://semanticscholar.org/paper/be311d0db3ad5857f7ff9587cb65cf1c590baa5c}}</ref> || "The tendency for persons from one group (the ingroup) to determine that any bad acts by members of an outgroup—for example, a racial or ethnic minority group—are caused by internal attributes or traits rather than by outside circumstances or situations, while viewing their positive behaviors as merely exceptions to the rule or the result of luck."<ref>{{cite web |title=APA Dictionary of Psychology |url=https://dictionary.apa.org/ultimate-attribution-error |website=dictionary.apa.org |accessdate=7 May 2020}}</ref>
|-
| 1977 || Memory bias || Research || {{w|Misattribution of memory}}. Early research done by Brown and Kulik finds that flashbulb memories are similar to photographs because they can be described in accurate, vivid detail. In this study, participants describe their circumstances about the moment they learned of the assassination of President John F. Kennedy as well as other similar traumatic events. Participants are able to describe what they were doing, things around them, and other details.<ref>{{Cite journal|last=Brown, R., Kulik J.|date=1977|title=Flashbulb memories|url=|journal=Cognition|volume=5|pages=73–99|doi=10.1016/0010-0277(77)90018-X}}</ref> || {{w|Misattribution of memory}} occurs "when a memory is distorted because of the source, context, or our imagination."<ref>{{cite web |title=Misattribution Effect |url=https://sites.google.com/site/falsememory02/current-research/misattribution |website=sites.google.com |accessdate=7 May 2020}}</ref>
|-
| 1977 || Social (egocentric bias) || Concept development || A study conducted by {{w|Lee Ross}} and colleagues provides early evidence for a {{w|cognitive bias}} called the [[w:False-consensus effect|false consensus effect]], which is the tendency for people to overestimate the extent to which others share the same views.<ref>{{Cite journal|title = The "false consensus effect": An egocentric bias in social perception and attribution processes|journal = Journal of Experimental Social Psychology|pages = 279–301|volume = 13|issue = 3|doi = 10.1016/0022-1031(77)90049-x|first = Lee|last = Ross|first2 = David|last2 = Greene|first3 = Pamela|last3 = House|year = 1977}}</ref> || The {{w|false-consensus effect}} "refers to the tendency to overestimate consensus for one's attitudes and behaviors."<ref>{{cite journal |last1=Alicke |first1=Mark |last2=Largo |first2=Edward |title=The Role of Self in the False Consensus Effect |doi=10.1006/jesp.1995.1002 |url=https://www.sciencedirect.com/science/article/abs/pii/S0022103185710025}}</ref><ref>{{cite web |title=False Consensus Effect |url=http://psychology.iresearchnet.com/social-psychology/social-cognition/false-consensus-effect/ |website=psychology.iresearchnet.com |access-date=14 January 2021}}</ref> It is "the tendency to assume that one's own opinions, beliefs, attributes, or behaviors are more widely shared than is actually the case."<ref>{{cite web |title=APA Dictionary of Psychology |url=https://dictionary.apa.org/false-consensus-effect |website=dictionary.apa.org |access-date=29 January 2021 |language=en}}</ref>
|-
| 1977 || Belief, decision-making and behavioral (truthiness) || Concept development || The {{w|illusory truth effect}} is first identified in a study at {{w|Villanova University}} and {{w|Temple University}}.<ref name="Hasher1977">{{cite journal|last1=Hasher |first1=Lynn |last2=Goldstein |first2=David |last3=Toppino |first3=Thomas |title=Frequency and the conference of referential validity |journal=Journal of Verbal Learning and Verbal Behavior |date=1977 |volume=16 |issue=1 |pages=107–112 |doi=10.1016/S0022-5371(77)80012-1 |url=https://web.archive.org/web/20160515062305/http://www.psych.utoronto.ca/users/hasher/PDF/Frequency%20and%20the%20conference%20Hasher%20et%20al%201977.pdf}}</ref><ref name="PLOS ONE">{{cite journal|title=People with Easier to Pronounce Names Promote Truthiness of Claims|journal=PLOS ONE|volume=9|issue=2|pages=e88671|date=September 6, 2014 |doi=10.1371/journal.pone.0088671|pmid=24586368|pmc=3935838|last1=Newman|first1=Eryn J.|last2=Sanson|first2=Mevagh|last3=Miller|first3=Emily K.|last4=Quigley-Mcbride|first4=Adele|last5=Foster|first5=Jeffrey L.|last6=Bernstein|first6=Daniel M.|last7=Garry|first7=Maryanne}}</ref> || The {{w|illusory truth effect}} "occurs when repeating a statement increases the belief that it’s true even when the statement is actually false."<ref>{{cite web |title=Illusory Truth, Lies, and Political Propaganda: Part 1 |url=https://www.psychologytoday.com/us/blog/psych-unseen/202001/illusory-truth-lies-and-political-propaganda-part-1 |website=psychologytoday.com |accessdate=7 May 2020}}</ref>
|-
| 1977 || Memory bias || Research || T. B. Rogers and colleagues publish the first research on the {{w|self-reference effect}}.<ref>{{cite web |title=Self-Reference Effect |url=http://psychology.iresearchnet.com/social-psychology/self/self-reference-effect/ |website=psychology.iresearchnet.com |access-date=12 January 2021}}</ref><ref>{{cite journal |last1=Bentley |first1=Sarah V. |last2=Greenaway |first2=Katharine H. |last3=Haslam |first3=S. Alexander |title=An online paradigm for exploring the self-reference effect |doi=10.1371/journal.pone.0176611 |url=https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0176611}}</ref> || "The self-reference effect refers to people’s tendency to better remember information when that information has been linked to the self than when it has not been linked to the self."<ref>{{cite web |title=Self-Reference Effect - IResearchNet |url=http://psychology.iresearchnet.com/social-psychology/self/self-reference-effect/ |website=Psychology |access-date=10 May 2021 |date=12 January 2016}}</ref>
|-
| 1978 || Memory bias || Research || Loftus, Miller, and Burns conduct the original {{w|misinformation effect}} study.<ref>{{cite journal |last1=Zaragoza |first1=Maria S. |last2=Belli |first2=Robert F. |last3=Payment |first3=Kristie E. |title=Misinformation Effects and the Suggestibility of Eyewitness Memory}}</ref> || The {{w|misinformation effect}} "happens when a person's memory becomes less accurate due to information that happens after the event."<ref>{{cite web |title=What Is Misinformation Effect? |url=https://www.growthramp.io/articles/misinformation-effect |website=growthramp.io |accessdate=7 May 2020}}</ref>
|-
| 1979 || Social ({{w|attribution bias}}) || Research || Thomas Nagel identifies four kinds of {{w|moral luck}} in his essay ''Moral Luck''.<ref>{{cite journal |last1=Rudy Hiller |first1=Fernando |title=How to (dis)solve Nagel's paradox about moral luck and responsibility |doi=10.1590/0100-6045.2016.V39N1.FRH |url=http://www.scielo.br/scielo.php?script=sci_arttext&pid=S0100-60452016000100005}}</ref> || "{{w|Moral luck}} occurs when the features of action which generate a particular moral assessment lie significantly beyond the control of the agent who is so assessed."<ref>{{cite web |title=Moral Luck |url=https://philpapers.org/browse/moral-luck |website=philpapers.org |accessdate=7 May 2020}}</ref>
|-
| 1979 || Social bias || Concept development || The {{w|ultimate attribution error}} is first established by Thomas F. Pettigrew in his publication ''The Ultimate Attribution Error: Extending Allport's Cognitive Analysis of Prejudice''.<ref name="Pettigrew (T.F)">{{cite journal|last=Pettigrew|first=T. F.|title=The ultimate attribution error: Extending Allport's cognitive analysis of prejudice|journal={{w|Personality and Social Psychology Bulletin}}|year=1979|volume=5|issue=4|pages=461–476|doi=10.1177/014616727900500407}}</ref> ||  "{{w|Ultimate attribution error}} refers to the tendency of individuals to make less internal attributions of negative behaviors committed by ingroup members compared to outgroup members."<ref>{{cite journal |last1=Fraser Pettigrew |first1=Thomas |title=The Ultimate Attribution Error: Extending Allport's Cognitive Analysis of Prejudice |doi=10.1177/014616727900500407 |url=https://www.researchgate.net/publication/248047301_The_Ultimate_Attribution_Error_Extending_Allport's_Cognitive_Analysis_of_Prejudice}}</ref>
|-
| 1979 || Belief, decision-making and behavioral || Concept development || {{w|Daniel Kahneman}} and {{w|Amos Tversky}} originally coin the term {{w|loss aversion}} in their landmark paper on {{w|prospect theory}}.<ref>{{cite web |title=Loss aversion |url=https://www.behavioraleconomics.com/resources/mini-encyclopedia-of-be/loss-aversion/ |website=behavioraleconomics.com |accessdate=14 August 2020}}</ref> || "{{w|Loss aversion}} is a cognitive bias that suggests that for individuals the pain of losing is psychologically twice as powerful as the pleasure of gaining."<ref>{{cite web |title=Why is the pain of losing felt twice as powerfully compared to equivalent gains? |url=https://thedecisionlab.com/biases/loss-aversion/ |website=thedecisionlab.com |accessdate=14 August 2020}}</ref>
|-
| 1979 || Belief, decision-making and behavioral || Concept development || The {{w|planning fallacy}} is first proposed by {{w|Daniel Kahneman}} and {{w|Amos Tversky}}.<ref name="PezzoLitman2006">{{cite journal|last1=Pezzo|first1=Mark V.|last2=Litman|first2=Jordan A.|last3=Pezzo|first3=Stephanie P.|title=On the distinction between yuppies and hippies: Individual differences in prediction biases for planning future tasks |journal=Personality and Individual Differences|volume=41|issue=7|year=2006|pages=1359–1371|issn=0191-8869|doi=10.1016/j.paid.2006.03.029}}</ref><ref>{{cite journal|last1=Kahneman|first1=Daniel|last2=Tversky|first2=Amos|date=1977|title=Intuitive prediction: Biases and corrective procedures|url=http://www.dtic.mil/dtic/tr/fulltext/u2/a047747.pdf}} Decision Research Technical Report PTR-1042-77-6. In {{cite book|title=Judgment Under Uncertainty: Heuristics and Biases|journal=Science|volume=185|issue=4157|last1=Kahneman|first1=Daniel|last2=Tversky|first2=Amos|year=1982|isbn=978-0511809477|editor-last1=Kahneman|editor-first1=Daniel|pages=414–421|chapter=Intuitive prediction: Biases and corrective procedures|doi=10.1017/CBO9780511809477.031|pmid=17835457|editor-last2=Slovic|editor-first2=Paul|editor-last3=Tversky|editor-first3=Amos}}</ref> || "The {{w|planning fallacy}} refers to a prediction phenomenon, all too familiar to many, wherein people underestimate the time it will take to complete a future task, despite knowledge that previous tasks have generally taken longer than planned"<ref>{{cite journal |last1=Buehler |first1=Roger |last2=Griffin |first2=Dale |last3=Peetz |first3=Johanna |title=Chapter One - The Planning Fallacy: Cognitive, Motivational, and Social Origins |doi=10.1016/S0065-2601(10)43001-4 |url=https://www.sciencedirect.com/science/article/pii/S0065260110430014}}</ref>
|-
| 1980 || Memory bias || Concept development || The term "egocentric bias" is first coined by {{w|Anthony Greenwald}}, a psychologist at {{w|Ohio State University}}.<ref name=":1">{{Cite news|url=https://www.nytimes.com/1984/06/12/science/a-bias-puts-self-at-center-of-everything.html|title=A bias puts self at center of everything|last=Goleman|first=Daniel|date=1984-06-12|newspaper=The New York Times|access-date=2016-12-09}}</ref> || "The {{w|egocentric bias}} is a cognitive bias that causes people to rely too heavily on their own point of view when they examine events in their life or when they try to see things from other people’s perspective."<ref>{{cite web |title=The Egocentric Bias: Why It’s Hard to See Things from a Different Perspective |url=https://effectiviology.com/egocentric-bias/ |website=effectiviology.com |accessdate=16 July 2020}}</ref>
|-
| 1980 || Social bias || Concept development || Ruth Hamill, Richard E. Nisbett, and Timothy DeCamp Wilson become the first to study the first type of {{w|group attribution error}} in detail in their paper ''Insensitivity to Sample Bias: Generalizing From Atypical Cases.''<ref name=":04">{{cite journal|last1=Hamill|first1=Ruth|last2=Wilson|first2=Timothy D.|last3=Nisbett|first3=Richard E.|date=1980|title=Insensitivity to sample bias: Generalizing from atypical cases|journal=Journal of Personality and Social Psychology|volume=39|issue=4|pages=578–589|doi=10.1037/0022-3514.39.4.578|url=https://web.archive.org/web/20160511145714/https://deepblue.lib.umich.edu/bitstream/handle/2027.42/92179/InsensitivityToSampleBias.pdf|archivedate=2016-05-11}}</ref> || {{w|Group attribution error}} is "the tendency for perceivers to assume that a specific group member’s personal characteristics and preferences, including beliefs, attitudes, and decisions, are similar to those of the group to which he or she belongs."<ref>{{cite web |title=group attribution error |url=https://dictionary.apa.org/group-attribution-error |website=dictionary.apa.org |accessdate=14 August 2020}}</ref>
|-
| 1980 || Belief, decision-making and behavioral ({{w|truthiness}}) || Concept development || The term ''{{w|subjective validation}}'' first appears in the book ''{{w|The Psychology of the Psychic}}'' by {{w|David F. Marks}} and Richard Kammann.<ref>{{cite book|last1=Frazier|first1=Kendrick|title=Science Confronts the Paranormal|date=1986|publisher=Prometheus Books|isbn=|page=101}}</ref> || {{w|Subjective validation}} "causes an individual to consider a statement or another piece of information correct if it has any significance or personal meaning (validating their previous opinion) to them."<ref>{{cite web |title=Subjective Validation |url=https://www.alleydog.com/glossary/definition.php?term=Subjective+Validation |website=alleydog.com |accessdate=14 August 2020}}</ref>
|-
| 1980 || Belief, decision-making and behavioral || Concept development || The phenomenon of {{w|optimism bias}} is initially described by Weinstein, who finds that the majority of college students believe that their chances of developing a drinking problem or getting divorced are lower than their peers'.<ref>{{cite web |title=Understanding the Optimism Bias |url=https://www.verywellmind.com/what-is-the-optimism-bias-2795031 |website=verywellmind.com |access-date=15 January 2021}}</ref> || "Optimism Bias refers to the tendency for individuals to underestimate their probability of experiencing adverse effects despite the obvious."<ref>{{cite web |title=Optimism Bias - Biases & Heuristics |url=https://thedecisionlab.com/biases/optimism-bias/ |website=The Decision Lab |access-date=28 January 2021 |language=en-CA}}</ref>
|-
| 1981 || Social bias || Research || Tversky and Kahneman conduct a demonstration of the [[w:Framing effect (psychology)|framing effect]].<ref name="Framing">{{cite web |title=Framing Effect - an overview {{!}} ScienceDirect Topics |url=https://www.sciencedirect.com/topics/psychology/framing-effect |website=www.sciencedirect.com |access-date=29 January 2021}}</ref> || "The Framing effect is the principle that our choices are influenced by the way they are framed through different wordings, settings, and situations."<ref>{{cite web |title=Why do our decisions depend on how options are presented to us? |url=https://thedecisionlab.com/biases/framing-effect/ |website=thedecisionlab.com |access-date=16 January 2021}}</ref> 
|-
| 1981 || Belief, decision-making and behavioral ({{w|prospect theory}}) || Concept development || The {{w|pseudocertainty effect}} is illustrated by {{w|Amos Tversky}} and {{w|Daniel Kahneman}}.<ref>{{cite journal |last1=Tversky |first1=A |last2=Kahneman |first2=D |title=The framing of decisions and the psychology of choice |journal=Science |date=30 January 1981 |volume=211 |issue=4481 |pages=453–458 |doi=10.1126/SCIENCE.7455683}}</ref> || "{{w|Pseudocertainty effect}} refers to people's tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes."<ref>{{cite web |title=Pseudocertainty effect |url=https://www.wiwi.europa-uni.de/de/lehrstuhl/fine/mikro/bilder_und_pdf-dateien/WS0910/VLBehEconomics/Ausarbeitungen/PseudocertaintyEeffect.doc#:~:text=Pseudocertainty%20effect%20refers%20to%20people's,choices%20to%20avoid%20negative%20outcomes.&text=Expounding%20on%20theories%20of%20decision,effects%20of%20certainty%20and%20pseudocertainty. |website=wiwi.europa-uni.de |accessdate=14 August 2020}}</ref>
|-
| 1982 || Social (egocentric bias) || Research || {{w|Trait ascription bias}}. In a study involving fifty-six undergraduate psychology students from the University of Bielefeld, Kammer demonstrates that subjects rate their own variability on each of 20 trait terms to be considerably higher than their peers'.<ref name=kammer>{{cite journal |last=Kammer |first=D. |year=1982 |title=Differences in trait ascriptions to self and friend: Unconfounding intensity from variability |journal=Psychological Reports |volume=51 |issue=1 |pages=99–102 |doi=10.2466/pr0.1982.51.1.99 }}</ref> || "{{w|Trait ascription bias}} is the belief that other people's behavior and reactions are generally predictable while you yourself are more unpredictable."<ref>{{cite web |title=Trait Ascription Bias |url=https://www.alleydog.com/glossary/definition.php?term=Trait+Ascription+Bias |website=alleydog.com |accessdate=14 August 2020}}</ref>
|-
| 1982 || Belief, decision-making and behavioral ({{w|framing effect}}) || Research || The {{w|decoy effect}} is first demonstrated by Joel Huber and others at {{w|Duke University}}. The effect explains how when a customer is hesitating between two options, presenting them with a third “asymmetrically dominated” option that acts as a decoy will strongly influence which decision they make.<ref name="tactics.convertize.com">{{cite web |title=Decoy Effect definition |url=https://tactics.convertize.com/definitions/decoy-effect |website=tactics.convertize.com |access-date=14 January 2021}}</ref> || "The {{w|decoy effect}} is defined as the phenomenon whereby consumers change their preference between two options when presented with a third option."<ref>{{cite web |last1=Mortimer |first1=Gary |title=The decoy effect: how you are influenced to choose without really knowing it |url=https://theconversation.com/the-decoy-effect-how-you-are-influenced-to-choose-without-really-knowing-it-111259#:~:text=The%20decoy%20effect%20is%20defined,or%20%E2%80%9Casymmetric%20dominance%20effect%E2%80%9D. |website=The Conversation |access-date=29 January 2021 |language=en}}</ref>
|-
| 1983 || Social ({{w|egocentric bias}}) || Concept development || Sociologist W. Phillips Davison first articulates the {{w|third-person effect}} hypothesis.<ref>{{cite journal |title=Third-Person Effect |journal=Encyclopedia of Survey Research Methods |date=2008 |doi=10.4135/9781412963947.n582}}</ref><ref>{{cite web |last1=Conners |first1=Joan L. |title=Understanding the Third-Person Effect |url=http://cscc.scu.edu/trends/v24/v24_2.pdf}}</ref> || {{w|Third-person effect}} refers to "the commonly held belief that other people are more affected, due to personal prejudices, by mass media than you yourself are. This view, largely due to a personal conceit, is caused by the self-concept of being more astute and aware than others, or of being less vulnerable to persuasion than others."<ref>{{cite web |title=Third-Person Effect |url=https://www.alleydog.com/glossary/definition.php?term=Third-Person+Effect |website=alleydog.com |accessdate=7 May 2020}}</ref>
 +
|-
 +
| 1983 || Social (conformity bias) || Research || Jones reports the presence of {{w|courtesy bias}} in Asian cultures.<ref name="dssa">{{cite book |last1=Hakim |first1=Catherine |title=Models of the Family in Modern Societies: Ideals and Realities: Ideals and Realities |url=https://books.google.com.ar/books?id=e-pGDwAAQBAJ&dq=%22Courtesy+bias+is+a%22&source=gbs_navlinks_s}}</ref> || "{{w|Courtesy bias}} is the tendency that some individuals have of not fully stating their unhappiness with a service or product because of a desire not to offend the person or organization that they are responding to."<ref>{{cite web |title=Courtesy Bias |url=https://www.alleydog.com/glossary/definition.php?term=Courtesy+Bias#:~:text=Courtesy%20bias%20is%20the%20tendency,that%20they%20are%20responding%20to. |website=alleydog.com |accessdate=14 August 2020}}</ref>
 +
|-
 +
| 1985 || Belief, decision-making and behavioral (prospect theory) || Concept development || The {{w|disposition effect}} anomaly is identified and named by Hersh Shefrin and Meir Statman, who note that "people dislike incurring losses much more than they enjoy making gains, and people are willing to gamble in the domain of losses." Consequently, "investors will hold onto stocks that have lost value...and will be eager to sell stocks that have risen in value." The researchers coin the term "disposition effect" to describe this tendency of holding on to losing stocks too long and to sell off well-performing stocks too readily.<ref name="Behavioural Finance">{{cite web|title=Disposition Effect|website=Behavioural Finance|accessdate=11 January 2017|url=https://web.archive.org/web/20170324030730/http://disposition-effect.behaviouralfinance.net/}}</ref> || "The {{w|disposition effect}} refers to investors’ reluctance to sell assets that have lost value and greater likelihood of selling assets that have made gains."<ref>{{cite web |title=Disposition effect |url=https://www.behavioraleconomics.com/resources/mini-encyclopedia-of-be/disposition-effect/ |website=behavioraleconomics.com |accessdate=16 July 2020}}</ref>
 +
|-
 +
| 1985 || Belief, decision-making and behavioral (logical fallacy) || Concept development || The {{w|hot-hand fallacy}} is first described in a paper by {{w|Amos Tversky}}, {{w|Thomas Gilovich}}, and Robert Vallone.<ref>{{cite web |last1=Miller |first1=Joshua |last2=Sanjurjo |first2=Adam |title=Momentum Isn&rsquo;t Magic&mdash;Vindicating the Hot Hand with the Mathematics of Streaks |url=https://www.scientificamerican.com/article/momentum-isnt-magic-vindicating-the-hot-hand-with-the-mathematics-of-streaks/ |website=Scientific American |access-date=16 June 2021 |language=en}}</ref> || "The {{w|hot-hand fallacy}} effect refers to the tendency for people to expect streaks in sports performance to continue."<ref>{{cite web |title=Hot Hand Effect |url=http://psychology.iresearchnet.com/social-psychology/decision-making/hot-hand-effect/ |website=psychology.iresearchnet.com |accessdate=16 July 2020}}</ref>
 +
|-
 +
| 1986 || Memory bias || Research || McDaniel and Einstein describe the ''bizarreness effect'' as the finding that people have superior memory for bizarre sentences relative to common ones.<ref>{{cite web |last1=Geraci |first1=Lisa |last2=McDaniel |first2=Mark A. |last3=Miller |first3=Tyler M. |last4=Hughes |first4=Matthew L. |title=The bizarreness effect: evidence for the critical influence of retrieval processes |url=https://link.springer.com/article/10.3758/s13421-013-0335-4 |website=Memory & Cognition |pages=1228–1237 |language=en |doi=10.3758/s13421-013-0335-4 |date=2013-11-01}}</ref> However, the researchers argue in their paper that bizarreness does not intrinsically enhance memory.<ref>{{cite journal |last1=Iaccino |first1=J. F. |last2=Sowa |first2=S. J. |date=February 1989 |title=Bizarre imagery in paired-associate learning: an effective mnemonic aid with mixed context, delayed testing, and self-paced conditions |volume=68 |issue=1 |pages=307–16 |pmid=2928063 |doi=10.2466/pms.1989.68.1.307 |journal=Percept mot Skills}}</ref><ref>{{cite web |title=The imagery bizarreness effect as a function of sentence complexity and presentation time |url=https://link.springer.com/content/pdf/10.3758/BF03334758.pdf |website=link.springer.com |access-date=18 June 2021}}</ref> || "The {{w|bizarreness effect}} holds that items associated with bizarre sentences or phrases are more readily recalled than those associated with common sentences or phrases."<ref>{{cite web |title=Bizarreness effect |url=https://www.britannica.com/topic/bizarreness-effect |website=britannica.com |accessdate=16 July 2020}}</ref>
 +
|-
 +
| 1988 || Social || Concept development || The {{w|Reactive devaluation}} bias is proposed by {{w|Lee Ross}} and Constance Stillinger.<ref name=RossStillinger1988>Lee Ross, Constance A. Stillinger, "Psychological barriers to conflict resolution", Stanford Center on Conflict and Negotiation, Stanford University, 1988, [https://books.google.com/books?id=R2QrAQAAIAAJ&focus=searchwithinvolume&q=reactive p. 4]</ref> || "Reactive Devaluation is tendency to value the proposal of someone we recognized as an antagonist as being less interesting than if it was made by someone else."<ref>{{cite web |title=Why we often tend to devalue proposals made by people who we consider to be adversaries |url=https://thedecisionlab.com/biases/reactive-devaluation/ |website=thedecisionlab.com |accessdate=22 September 2020}}</ref>
 +
|-
 +
| 1988 || Belief, decision-making and behavioral ({{w|prospect theory}}) || Research || William Samuelson and {{w|Richard Zeckhauser}} demonstrate {{w|status quo bias}} using a questionnaire in which subjects faced a series of decision problems, which were alternately framed to be with and without a pre-existing status quo position. Subjects tended to remain with the status quo when such a position was offered to them.<ref name=Samuelson>{{cite journal | last1 = Samuelson | first1 = W. | last2 = Zeckhauser | first2 = R. | year = 1988 | title = Status quo bias in decision making | url = | journal = Journal of Risk and Uncertainty | volume = 1 | issue = | pages = 7–59 | doi=10.1007/bf00055564}}</ref> || "Status quo bias refers to the phenomenon of preferring that one's environment and situation remain as they already are."<ref>{{cite web |title=Status Quo Bias: What It Means and How It Affects Your Behavior |url=https://www.thoughtco.com/status-quo-bias-4172981 |website=thoughtco.com |accessdate=22 September 2020}}</ref>
 +
|-
 +
| 1989 || Belief, decision-making and behavioral || Concept development|| The term "{{w|curse of knowledge}}" is coined in a ''{{w|Journal of Political Economy}}'' article by economists {{w|Colin Camerer}}, {{w|George Loewenstein}}, and Martin Weber. || The curse of knowledge causes people to fail to account for the fact that others don't know the same things that they do.<ref>{{cite web |title=The Curse of Knowledge: What It Is and How to Account for It |url=https://effectiviology.com/curse-of-knowledge/ |website=effectiviology.com |accessdate=6 May 2020}}</ref>
 +
|-
 +
| 1990 || Belief, decision-making and behavioral ({{w|prospect theory}}) || Research || Kahneman, Knetsch and Thaler publish a paper containing the first experimental test of the {{w|Endowment Effect}}.<ref name="Atladóttir">{{cite journal |last1=Atladóttir |first1=Kristín |title=The Endowment Effect and other biases in creative goods transactions |url=https://skemman.is/bitstream/1946/8659/1/20.The_Endowment_Effect_Kristin.pdf |issn=1670-8288}}</ref> || The {{w|endowment effect}} refers to an emotional bias that causes individuals to value an owned object higher, often irrationally, than its market value.
 +
|-
 +
| 1990 || Belief, decision-making and behavioral ({{w|confirmation bias}}) || Concept development || The phenomenon known as “satisfaction of search” is first described, in which a radiologist fails to detect a second abnormality, apparently because of prematurely ceasing to search the images after detecting a “satisfying” finding.<ref>{{cite journal |last1=Bruno |first1=Michael A. |title=256 Shades of gray: uncertainty and diagnostic error in radiology |doi=10.1515/dx-2017-0006 |url=https://www.degruyter.com/view/journals/dx/4/3/article-p149.xml?language=en}}</ref> || "Satisfaction of search describes a situation in which the detection of one radiographic abnormality interferes with that of others."<ref>{{cite journal |last1=Ashman |first1=C. J. |last2=Yu |first2=J. S. |last3=Wolfman |first3=D. |title=Satisfaction of search in osteoradiology |journal=AJR. American journal of roentgenology |date=August 2000 |volume=175 |issue=2 |pages=541–544 |doi=10.2214/ajr.175.2.1750541 |url=https://pubmed.ncbi.nlm.nih.gov/10915712/ |access-date=27 January 2021 |issn=0361-803X}}</ref>
 +
|-
 +
| 1990 || || Literature || Jean-Paul Caverni, Jean-Marc Fabre and Michel Gonzalez publish ''Cognitive Biases''.<ref>{{cite web |title=Cognitive biases |url=https://catalog.library.vanderbilt.edu/discovery/fulldisplay/alma991024853679703276/01VAN_INST:vanui |website=catalog.library.vanderbilt.edu |access-date=25 July 2021 |language=en}}</ref> ||
 +
|-
 +
| 1991 || Social (egocentric bias) || Concept development || The term {{w|illusory superiority}} is first used by the researchers Van Yperen and Buunk.<ref>{{cite web |title=Self-Enhancement and Superiority Biases in Social Comparison |url=https://www.researchgate.net/publication/247505886_Self-Enhancement_and_Superiority_Biases_in_Social_Comparison |website=researchgate.net |accessdate=14 August 2020}}</ref> || {{w|Illusory superiority}} "indicates an individual who has a belief that they are somehow inherently superior to others".<ref>{{cite web |title=Illusory Superiority |url=https://www.alleydog.com/glossary/definition.php?term=Illusory+Superiority |website=alleydog.com |accessdate=7 May 2020}}</ref>
 +
|-
 +
| 1991 || Social (conformity bias) || Research || Marín and Marín report {{w|courtesy bias}} to be common in Hispanic cultures.<ref name="dssa"/> || The "{{w|Courtesy Bias}} is the reluctance of an individual to give negative feedback for fear of offending."<ref>{{cite web |title=The Courtesy Bias |url=https://smallbusinessforum.co/the-courtesy-bias-f0a016d82b09 |website=smallbusinessforum.co |accessdate=14 August 2020}}</ref>
 +
|-
 +
| 1994 || Belief, decision-making and behavioral || Concept development || The term "{{w|women are wonderful effect}}" is coined by researchers {{w|Alice Eagly}} and {{w|Antonio Mladinic}} in a paper in which they question the widely held view that there was prejudice against women.<ref>{{cite web |title=“Women Are Wonderful” Effect |url=https://www.scribd.com/document/274926319/Women-Are-Wonderful-Effect |website=scribd.com |accessdate=10 April 2020}}</ref> || "The {{w|women are wonderful effect}} is a phenomenon found in psychological research in which people associate more positive attributes with women as compared to men."<ref>{{cite web |title=“women are wonderful” effect |url=https://crazyfacts.com/the-women-are-wonderful-effect-is-a-phenomenon-found-in-psychological-research/ |website=crazyfacts.com |accessdate=18 July 2020}}</ref>
 +
|-
 +
| 1994 || Belief, decision-making and behavioral (logical fallacy) || Research || Research by Fox, Rogers, and Tversky provides evidence of the {{w|subadditivity effect}} in expert judgment, after having investigated 32 professional options traders.<ref name="Support theo">{{cite journal |last1=Tversky |first1=Amos |last2=Koehler |first2=Derek J. |title=Support theory: A nonextensional representation of subjective probability. |journal=Psychological Review |date=October 1994 |volume=101 |issue=4 |pages=547–567 |doi=10.1037/0033-295X.101.4.547}}</ref> || The {{w|subadditivity effect}} is "the tendency to judge probability of the whole to be less than the probabilities of the parts".<ref>{{cite web |title=Today's term from psychology is Subadditivity Effect. |url=https://steemit.com/life/@jevh/today-s-term-from-psychology-is-subadditivity-effect |website=steemit.com |accessdate=7 May 2020}}</ref>
 +
|-
 +
| 1995 || Social || Concept development || {{w|Implicit bias}} is first described in a publication by Tony Greenwald and {{w|Mahzarin Banaji}}.<ref>{{cite web |title=PROJECT IMPLICIT LECTURES AND WORKSHOPS |url=https://www.projectimplicit.net/lectures.html |website=projectimplicit.net |accessdate=12 March 2020}}</ref> || "Research on {{w|implicit bias}} suggests that people can act on the basis of prejudice and stereotypes without intending to do so."<ref>{{cite web |title=Implicit Bias |url=https://plato.stanford.edu/entries/implicit-bias/ |website=plato.stanford.edu |accessdate=8 May 2020}}</ref>
 +
|-
 +
| 1996 || || Research || {{w|Daniel Kahneman}} and {{w|Amos Tversky}} argue that cognitive biases have practical implications for areas including clinical judgment, entrepreneurship, finance, and management.<ref>{{cite journal|author1=Kahneman, D. |author2=Tversky, A.  |last-author-amp=yes |title=On the reality of cognitive illusions|journal=Psychological Review|year=1996|volume=103|issue=3|pages=582–591|doi=10.1037/0033-295X.103.3.582|pmid=8759048|url=http://psy.ucsd.edu/%7Emckenzie/KahnemanTversky1996PsychRev.pdf}}</ref><ref name="S.X. Zhang and J. Cueto 2015">{{cite journal |author1=S.X. Zhang |author2=J. Cueto |title=The Study of Bias in Entrepreneurship |journal= Entrepreneurship Theory and Practice |volume=41 |issue=3 |pages=419–454 |doi= 10.1111/etap.12212  |year=2015 }}</ref> ||
 +
|-
 +
| 1998 || Belief, decision-making and behavioral || Research || Gilbert et al. report on the presence of {{w|impact bias}} in registered voters.<ref>{{cite journal |last1=Medway |first1=Dominic |last2=Foos |first2=Adrienne |last3=Goatman |first3=Anna |title=Impact bias in student evaluations of higher education |journal=Studies in Higher Education |doi=10.1080/03075079.2015.1071345 |url=https://www.tandfonline.com/doi/full/10.1080/03075079.2015.1071345 |accessdate=7 May 2020}}</ref> || "{{w|Impact bias}} refers to a human tendency to overestimate emotional responses to events and experiences."<ref>{{cite journal |last1=Medway |first1=Dominic |last2=Foos |first2=Adrienne |last3=Goatman |first3=Anna |title=Impact bias in student evaluations of higher education |journal=Studies in Higher Education |doi=10.1080/03075079.2015.1071345 |url=https://www.tandfonline.com/doi/full/10.1080/03075079.2015.1071345 |accessdate=7 May 2020}}</ref>
 +
|-
 +
| 1998 || || Concept development || The {{w|implicit-association test}} is introduced in the scientific literature by {{w|Anthony Greenwald}}, Debbie McGhee, and Jordan Schwartz.<ref name = "Greenwald 1998">{{Citation | title = Measuring Individual Differences in Implicit Cognition: The Implicit Association Test | year = 1998 | journal = Journal of Personality and Social Psychology | pages = 1464–1480 | volume = 74 | issue = 6 | last1 = Greenwald| first1 =  Anthony G. | last2 =  McGhee | first2 =  Debbie E. | last3 =  Schwartz | first3 =  Jordan L.K. | doi=10.1037/0022-3514.74.6.1464 | pmid=9654756}}</ref> It is a research method that opens a range of new possibilities for researchers exploring attitudes and beliefs.<ref>{{cite web |title=The Implicit Association Test (IAT) - iMotions |url=https://imotions.com/blog/implicit-association-test/ |website=Imotions Publish |access-date=17 May 2021 |language=en |date=15 December 2020}}</ref> || "The {{w|implicit-association test}} is a flexible task designed to tap automatic associations between concepts (e.g., math and arts) and attributes (e.g., good or bad, male or female, self or other)."<ref>{{cite web |title=Implicit Association Test |url=https://www.projectimplicit.net/nosek/iat/#:~:text=The%20Implicit%20Association%20Test%20is,female%2C%20self%20or%20other). |website=www.projectimplicit.net |access-date=17 May 2021}}</ref>
 +
|-
 +
| 1998 || Belief, decision-making and behavioral ({{w|extension neglect}}) || Concept development || Hsee discovers a less-is-better effect in three contexts: "(1) a person giving a $45 scarf (from scarves ranging from $5-$50) as a gift was perceived to be more generous than one giving a $55 coat (from coats ranging from $50-$500); (2) an overfilled ice cream serving with 7 oz of ice cream was valued more than an underfilled serving with 8 oz of ice cream; (3) a dinnerware set with 24 intact pieces was judged more favourably than one with 31 intact pieces (including the same 24) plus a few broken ones."<ref name="hsee">{{cite journal|last=Hsee|first=Christopher K.|title=Less Is Better: When Low-value Options Are Valued More Highly than High-value Options|journal=Journal of Behavioral Decision Making|year=1998|volume=11|issue=2|pages=107–121|doi=10.1002/(SICI)1099-0771(199806)11:2<107::AID-BDM292>3.0.CO;2-Y |url=http://faculty.chicagobooth.edu/christopher.hsee/vita/papers/LessIsBetter.pdf}}</ref> || "The {{w|less-is-better effect}} is the tendency to prefer the smaller or the lesser alternative when choosing individually, but not when evaluating together."<ref>{{cite web |title=Why we prefer the smaller or the lesser alternative |url=https://thedecisionlab.com/biases/less-is-better-effect/ |website=thedecisionlab.com |accessdate=7 May 2020}}</ref>
 +
|-
 +
| 1999 || Belief, decision-making and behavioral || Concept development || The psychological phenomenon of illusory superiority known as {{w|Dunning–Kruger effect}} is identified as a form of cognitive bias in Kruger and Dunning's 1999 study, ''Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments''.<ref name="Kruger">{{cite journal |last=Kruger |first=Justin |last2=Dunning |first2=David |date=1999 |title=Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments |journal={{w|Journal of Personality and Social Psychology}} |volume=77 |issue=6 |pages=1121–1134|doi=10.1037/0022-3514.77.6.1121 |pmid=10626367}}</ref> || "The Dunning-Kruger effect is a cognitive bias in which people wrongly overestimate their knowledge or ability in a specific area."<ref>{{cite web |title=Dunning-Kruger Effect |url=https://www.psychologytoday.com/intl/basics/dunning-kruger-effect |website=psychologytoday.com |accessdate=14 August 2020}}</ref>
 +
|-
 +
| 1999 || Social ({{w|egocentric bias}}) || Concept development || The term "{{w|spotlight effect}}" is coined by {{w|Thomas Gilovich}} and Kenneth Savitsky.<ref name=":0">{{Cite journal |pmid = 10707330|year = 2000|last1 = Gilovich|first1 = T.|title = The spotlight effect in social judgment: An egocentric bias in estimates of the salience of one's own actions and appearance|journal = Journal of Personality and Social Psychology|volume = 78|issue = 2|pages = 211–222|last2 = Medvec|first2 = V. H.|last3 = Savitsky|first3 = K.|doi = 10.1037//0022-3514.78.2.211|url=https://web.archive.org/web/20131030215508/http://www.psych.cornell.edu/sites/default/files/Gilo.Medvec.Sav_.pdf}}</ref> The term first appears in the psychology literature in the journal ''{{w|Current Directions in Psychological Science}}''. || "The {{w|spotlight effect}} refers to the tendency to think that more people notice something about you than they do."<ref>{{cite web |title=The Spotlight Effect |url=https://www.psychologytoday.com/us/blog/the-big-questions/201111/the-spotlight-effect |website=psychologytoday.com |accessdate=14 August 2020}}</ref>
 +
|-
 +
| 1999 || Social ({{w|egocentric bias}}) || Concept development || Kruger and Gilovich publish study titled ''Naive cynicism in everyday theories of responsibility assessment: On biased assumptions of bias'', which formally introduces the concept of {{w|naïve cynicism}}.<ref name="Kruger 1999">{{cite journal|last1=Kruger|first1=Justin|last2=Gilovich|first2=Thomas|title='Naive cynicism' in everyday theories of responsibility assessment: On biased assumptions of bias.|journal=Journal of Personality and Social Psychology|date=1999|volume=76|issue=5|pages=743–753|doi=10.1037/0022-3514.76.5.743}}</ref> || {{w|Naïve cynicism}} is "the tendency of laypeople to expect other people’s judgments will have a motivational basis and therefore will be biased in the direction of their self-interest."<ref>{{cite web |title=Naive Cynicism |url=http://psychology.iresearchnet.com/social-psychology/decision-making/naive-cynicism/ |website=psychology.iresearchnet.com |accessdate=16 July 2020}}</ref>
 +
|-
 +
| 2001 || Belief, decision-making and behavioral ([[w:framing effect (psychology)|framing effect]]) || Research || Druckman shows that economic policies receive higher support when framed in terms of employment rates rather than unemployment rates.<ref name="Mediumsss">{{cite web |last1=Gearon |first1=Michael |title=Cognitive Biases — Framing effect |url=https://michaelgearon.medium.com/cognitive-biases-framing-effect-e37d45012dc5 |website=Medium |access-date=6 March 2021 |language=en |date=12 February 2019}}</ref> || "The {{w|Framing Effect}} is a cognitive bias that explains how we react differently to things depending on how they are presented to us."<ref>{{cite web |title=Definition |url=https://tactics.convertize.com/definitions/framing-effect |website=tactics.convertize.com |access-date=6 March 2021 |language=en}}</ref>
 +
|-
 +
| 2002 || Belief, decision-making and behavioral || Concept development || {{w|Daniel Kahneman}} and {{w|Shane Frederick}} propose the process of {{w|attribute substitution}}.<ref name="revisited">{{cite book |last= Kahneman |first=Daniel |first2=Shane |last2=Frederick  |title=Heuristics and Biases: The Psychology of Intuitive Judgment |editor=Thomas Gilovich |editor2=Dale Griffin |editor3=Daniel Kahneman |publisher =Cambridge University Press |location=Cambridge |year=2002 |pages=49–81 |chapter=Representativeness Revisited: Attribute Substitution in Intuitive Judgment |isbn=978-0-521-79679-8}}</ref> || "{{w|Attribute substitution}} occurs when an individual has to make a judgment (of a target attribute) that is computationally complex, and instead substitutes a more easily calculated heuristic attribute."<ref>{{cite web |title=Attribute substitution- a quick guide |url=https://biasandbelief.wordpress.com/2009/06/01/attribute-substitution/ |website=biasandbelief.wordpress.com |accessdate=7 May 2020}}</ref>
 +
|-
 +
| 2002 || Social ({{w|egocentric bias}}) || Concept development || Pronin et al. introduce the concept of "{{w|bias blind spot}}".<ref name=dfds>{{cite journal |last1=Pronin |first1=Emily |last2=Lin |first2=Daniel Y. |last3=Ross |first3=Lee |title=The Bias Blind Spot: Perceptions of Bias in Self Versus Others |doi=10.1177/0146167202286008 |url=https://www.researchgate.net/publication/241096502_The_Bias_Blind_Spot_Perceptions_of_Bias_in_Self_Versus_Others#:~:text=(2002)%20call%20the%20%22bias,biases%20in%20their%20own%20thinking.}}</ref> || Bias blind spot "refers to the tendency for people to be able to identify distortionary biases in others, while being ignorant of and susceptible to precisely these biases in their own thinking."<ref name=dfds/>
 +
|-
 +
| 2002 || Social || Research || Research on the {{w|bystander effect}} indicates that priming a social context may inhibit helping behavior. Imagining being around one other person or being around a group of people can affect a person's willingness to help.<ref>{{cite journal | last1 = Garcia | first1 = S.M. | last2 = Weaver | first2 = K. | last3 = Darley | first3 = J.M. | last4 = Moskowitz | first4 = G.B. | year = 2002 | title = Crowded minds: the implicit bystander effect | url = | journal = Journal of Personality and Social Psychology | volume = 83 | issue = 4| pages = 843–853 | doi=10.1037/0022-3514.83.4.843| pmid = 12374439 }}</ref> || "The bystander effect occurs when the presence of others discourages an individual from intervening in an emergency situation."<ref>{{cite web |title=Bystander Effect |url=https://www.psychologytoday.com/intl/basics/bystander-effect |website=psychologytoday.com |accessdate=7 May 2020}}</ref>
 +
|-
 +
| 2002 || Belief, decision-making and behavioral ({{w|prospect theory}}) || Recognition || {{w|Daniel Kahneman}} is awarded the {{w|Nobel Memorial Prize in Economic Sciences}} for his work on {{w|prospect theory}}. He is the first non-economist by profession to win the prize.<ref>{{cite web |title=Kahneman receives Nobel Prize at ceremony |url=https://www.princeton.edu/news/2002/12/10/kahneman-receives-nobel-prize-ceremony |website=Princeton University |access-date=16 June 2021 |language=en}}</ref><ref>{{cite web |title=Psychologist wins Nobel Prize |url=https://www.apa.org/monitor/dec02/nobel.html |website=www.apa.org |access-date=16 June 2021}}</ref> || "{{w|Prospect theory}} assumes that losses and gains are valued differently, and thus individuals make decisions based on perceived gains instead of perceived losses."<ref>{{cite web |last1=Chen |first1=James |title=Prospect Theory |url=https://www.investopedia.com/terms/p/prospecttheory.asp |website=Investopedia |access-date=16 June 2021 |language=en}}</ref>
 +
|-
 +
| 2003 || Belief, decision-making and behavioral || Concept development || The term ''{{w|projection bias}}'' is first introduced in the paper ''Projection Bias in Predicting Future Utility'' by Loewenstein, O'Donoghue and Rabin.<ref name=Frederick2011>{{cite book|last1=Frederick|first1=Shane|last2=Loewenstein|first2=George|last3=O'Donoghue|first3=Ted|editor1-last=Camerer|editor1-first=Colin F.|editor2-last=Loewenstein|editor2-first=George|editor3-last=Rabin|editor3-first=Matthew|title=Advances in Behavioral Economics|date=2011|publisher=Princeton University Press|isbn=978-1400829118|pages=187–188|chapter-url=https://books.google.com/books?id=sA4jJOjwCW4C&pg=PA187|language=en|chapter=Time Discounting and Time Preference: A Critical Review|ref=harv}}</ref> || {{w|Projection bias}} "refers to people’s assumption that their tastes or preferences will remain the same over time."<ref>{{cite web |title=Projection bias |url=https://www.behavioraleconomics.com/resources/mini-encyclopedia-of-be/projection-bias/ |website=behavioraleconomics.com |accessdate=7 May 2020}}</ref>
 +
|-
 +
| 2003 || Belief, decision-making and behavioral || Concept development || Lovallo and Kahneman propose an expanded definition of {{w|planning fallacy}} as the tendency to underestimate the time, costs, and risks of future actions and at the same time overestimate the benefits of the same actions. According to this definition, the planning fallacy results in not only time overruns, but also {{w|cost overruns}} and {{w|benefit shortfall}}s.<ref>{{cite journal |last1=Lovallo |first1=Dan |first2=Daniel |last2=Kahneman  |date=July 2003 |title=Delusions of Success: How Optimism Undermines Executives' Decisions |journal=Harvard Business Review |volume=81 |issue=7 |pages=56–63|pmid=12858711 |url=https://hbr.org/2003/07/delusions-of-success-how-optimism-undermines-executives-decisions}}</ref> || "{{w|Planning fallacy}} refers to a prediction phenomenon, all too familiar to many, wherein people underestimate the time it will take to complete a future task, despite knowledge that previous tasks have generally taken longer than planned."<ref>{{cite journal |last1=Buehler |first1=Roger |last2=Griffin |first2=Dale |last3=Peetz |first3=Johanna |title=The Planning Fallacy |journal=Advances in Experimental Social Psychology |date=2010 |volume=43 |pages=1–62 |doi=10.1016/S0065-2601(10)43001-4}}</ref>
 +
|-
 +
| 2003 || Belief, decision-making and behavioral ([[w:framing effect (psychology)|framing effect]]) || Research || Johnson and Goldstein report on the [[w:framing effect (psychology)|framing effect]] playing a key role in the rate of organ donation.<ref name="Framing"/> || "The term {{w|framing effect}} refers to a phenomenon whereby the choices people make are systematically altered by the language used in the formulation of options."<ref>{{cite journal |last1=Kim |first1=S. |last2=Goldstein |first2=D. |last3=Hasher |first3=L. |last4=Zacks |first4=R. T. |title=Framing Effects in Younger and Older Adults |journal=The Journals of Gerontology Series B: Psychological Sciences and Social Sciences |date=1 July 2005 |volume=60 |issue=4 |pages=P215–P218 |doi=10.1093/geronb/60.4.P215}}</ref>       
 +
|-
 +
| 2004 || Social bias || Literature || American journalist {{w|James Surowiecki}} publishes ''{{w|The Wisdom of Crowds}}'', which explores herd mentality and draws the conclusion that the decisions made by groups are often better and more accurate than those made by any individual member.<ref name=sdf/> || "Herd mentality (also known as mob mentality) describes a behavior in which people act the same way or adopt similar behaviors as the people around them{{snd}}often ignoring their own feelings in the process."<ref name=sdf>{{cite web |title=4 examples of herd mentality (and how to take advantage of it) |url=https://www.iwillteachyoutoberich.com/blog/herd-mentality/#:~:text=Herd%20mentality%20(also%20known%20as,what%20the%20herd%20is%20doing. |website=iwillteachyoutoberich.com |access-date=27 January 2021}}</ref>
 +
|-
 +
| 2004 || || Literature || Rüdiger F. Pohl publishes ''Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory'', which provides an overview of research in the area.<ref>{{cite book |last1=Pohl |first1=Rüdiger F. |title=Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory |date=2004 |publisher=Psychology Press |isbn=978-1-84169-351-4 |url=https://books.google.com.ar/books/about/Cognitive_Illusions.html?id=k5gTes7yyWEC&source=kp_book_description&redir_esc=y |language=en}}</ref> ||
 +
|-
 +
| 2004 || Belief, decision-making and behavioral ([[w:Framing effect (psychology)|framing effect]]) || Concept development || The concept of the {{w|distinction bias}} is advanced by Christopher K. Hsee and Jiao Zhang of the {{w|University of Chicago}} as an explanation for differences in evaluations of options between joint evaluation mode and separate evaluation mode.<ref>{{cite journal |last1=Hsee |first1=Christopher K. |last2=Zhang |first2=Jiao |title=General Evaluability Theory |doi=10.1177/1745691610374586 |url=https://journals.sagepub.com/doi/10.1177/1745691610374586}}</ref> || {{w|Distinction bias}} is "the tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately." This bias is similar to the {{w|less-is-better effect}}, which is "the tendency to prefer a smaller set to a larger set judged separately, but not jointly."<ref name="dsaaaa">{{cite web |title=List of cognitive biases |url=https://uxinlux.github.io/cognitive-biases/ |website=uxinlux.github.io |access-date=25 July 2021 |language=en}}</ref>
 +
|-
 +
| 2005 || Belief, decision-making and behavioral ([[w:framing effect (psychology)|framing effect]]) || Research || Haigh and List report on the [[w:framing effect (psychology)|framing effect]] playing a key role in stock market forecasting.<ref name="Framing"/> || "The framing effect is a type of cognitive bias that causes people to react to something in different ways depending on how the information is presented to them."<ref>{{cite web |last1=Marfice |first1=Christina |title=How to Use the Framing Effect to Sell More Products |url=https://www.plytix.com/blog/framing-effect |website=www.plytix.com |access-date=6 March 2021 |language=en-us}}</ref>
 +
|-
 +
| 2006 || || Organization || Overcoming Bias launches as a group blog on the "general theme of how to move our beliefs closer to reality, in the face of our natural biases such as overconfidence and wishful thinking, and our bias to believe we have corrected for such biases, when we have done no such thing."<ref>{{cite web |title=Overcoming Bias |url=http://www.overcomingbias.com/about |website=overcomingbias.com |accessdate=13 March 2020}}</ref> ||
|-
| 2006 || Belief, decision-making and behavioral || Concept development || The term ''{{w|ostrich effect}}'' is coined by Galai and Sade.<ref>{{cite journal |title=The "Ostrich Effect" and the Relationship between the Liquidity and the Yields of Financial Assets |journal=The Journal of Business |doi=10.2139/ssrn.431180}}</ref> || "The {{w|ostrich effect}} bias is a tendency to ignore dangerous or negative information by ignoring it or burying one's head in the sand."<ref>{{cite web |title=Ostrich Effect |url=https://www.thinkingcollaborative.com/stj/ostrich-effect/ |website=thinkingcollaborative.com |accessdate=8 May 2020}}</ref>
|-
| 2007 || Belief, decision-making and behavioral || Concept development || The term ''{{w|recency illusion}}'' is coined by {{w|Stanford University}} linguist {{w|Arnold Zwicky}}.<ref name="sssa">{{cite journal |last1=Rickford |first1=John R. |last2=Wasow |first2=Thomas |last3=Zwicky |first3=Arnold |date=2007 |title=Intensive and quotative ''all'': something new, something old |journal=American Speech |doi=10.1215/00031283-2007-001 |volume=82 |issue=1 |pages=3–31|doi-access=free }}</ref> || The {{w|recency illusion}} is "the belief or impression that a word or language usage is of recent origin when it is long-established."<ref name="sssa"/>
|-
| 2007 || Social (conformity bias) || Concept development || The concept of an “availability cascade” is defined by professors Timur Kuran and Cass Sunstein.<ref name="sddf">{{cite web |title=Climate Change 3: The Grand Narrative Availability Cascade is Making Us Stupid |url=https://www.americanexperiment.org/2016/11/the-grand-narrative-availability-cascade-is-making-us-stupid/ |website=americanexperiment.org |access-date=14 January 2021}}</ref> || Availability cascade refers to the "self-reinforcing process of collective belief formation by which an expressed perception triggers a chain reaction that gives the perception of increasing plausibility through its rising availability in public discourse."<ref name="sddf"/>
|-
| 2008 || Belief, decision-making and behavioral || Literature || Israeli-American author {{w|Dan Ariely}} publishes ''{{w|Predictably Irrational: The Hidden Forces That Shape Our Decisions}}'', which explores cognitive biases within the genre of {{w|behavioral economics}}.<ref>{{cite web |title=APA PsycNet |url=https://psycnet.apa.org/record/2008-04432-000 |website=psycnet.apa.org |access-date=28 July 2021 |language=en}}</ref>
|-
| 2008 || Social bias ({{w|association fallacy}}) || Concept development || The term {{w|cheerleader effect}} is coined by the character {{w|Barney Stinson}} in ''{{w|Not a Father's Day}}'', an episode of the television series ''{{w|How I Met Your Mother}}''. Barney points out to his friends a group of women who initially seem attractive, but who each appear unattractive when examined individually.<ref>{{cite web|url=https://www.theatlantic.com/business/archive/2013/11/cheerleader-effect-why-people-are-more-beautiful-in-groups/281119/|title=Cheerleader Effect: Why People Are More Beautiful in Groups|work={{w|The Atlantic}}|last=Hamblin|first=James|date=November 4, 2013|accessdate=December 5, 2015}}</ref> || "The {{w|cheerleader effect}} refers to the increase in attractiveness that an individual face experiences when seen in a group of other faces."<ref>{{cite journal |last1=Carragher |first1=Daniel J. |last2=Thomas |first2=Nicole A. |last3=Gwinn |first3=O. Scott |last4=Nicholls |first4=Mike E. R. |title=Limited evidence of hierarchical encoding in the cheerleader effect |url=https://www.nature.com/articles/s41598-019-45789-6}}</ref>
|-
| 2009 || Belief, decision-making and behavioral ({{w|framing effect}}) || Concept development || The concept of the {{w|denomination effect}} is proposed by Priya Raghubir, professor at the {{w|New York University Stern School of Business}}, and Joydeep Srivastava, professor at [[w:University of Maryland, College Park|University of Maryland]], in a paper published this year.<ref name="NPR">{{cite news|title=Why We Spend Coins Faster Than Bills|url=https://www.npr.org/templates/story/story.php?storyId=104063298|accessdate=7 April 2020|publisher=NPR|date=May 12, 2009}}</ref> || The {{w|denomination effect}} relates "to currency, whereby people are less likely to spend larger bills than their equivalent value in smaller bills."<ref>{{cite web |title=Denomination effect |url=http://nlpnotes.com/denomination-effect/ |website=nlpnotes.com |accessdate=7 May 2020}}</ref>
|-
| 2010 || Belief, decision-making and behavioral ({{w|confirmation bias}}) || Concept development || The term ''{{w|backfire effect}}'' is coined by American political scientists {{w|Brendan Nyhan}} and Jason Reifler.<ref>{{Cite web|url=http://www.dartmouth.edu/~nyhan/nyhan-reifler.pdf|title=Pdf.}}</ref> || "The backfire effect is a cognitive bias that causes people who encounter evidence that challenges their beliefs to reject that evidence, and to strengthen their support of their original stance."<ref>{{cite web |title=The Backfire Effect: Why Facts Don’t Always Change Minds – Effectiviology |url=https://effectiviology.com/backfire-effect-facts-dont-change-minds/ |website=effectiviology.com |access-date=27 January 2021}}</ref>
|-
| 2010 || Belief, decision-making and behavioral || Research || The ''Handbook of Social Psychology'' recognizes {{w|naïve realism}} as one of "four hard-won insights about [[w:Perception|human perception]], [[w:Thought|thinking]], {{w|motivation}} and {{w|behavior}} that... represent important, indeed foundational, contributions of {{w|social psychology}}."<ref>{{cite journal |last1=Ross |first1=Lee |last2=Lepper |first2=Mark |last3=Ward |first3=Andrew |title=History of Social Psychology: Insights, Challenges, and Contributions to Theory and Application |journal=Handbook of Social Psychology |date=30 June 2010 |pages=socpsy001001 |doi=10.1002/9780470561119.socpsy001001}}</ref> || "{{w|Naïve realism}} describes people’s tendency to believe that they perceive the social world “as it is”—as objective reality—rather than as a subjective construction and interpretation of reality."<ref>{{cite web |title=Naive Realism |url=http://psychology.iresearchnet.com/social-psychology/decision-making/naive-realism/ |website=psychology.iresearchnet.com |accessdate=17 July 2020}}</ref>
|-
| 2010 || Belief, decision-making and behavioral || Research || In a study looking at computer use and musculoskeletal symptoms, Chang et al. investigate information bias in the self-reporting of personal computer use. Over a period of three weeks, young adults report the duration of computer use each day, as well as musculoskeletal symptoms. Usage-monitor software installed on participants' computers provides the reference measure. Results show that the relationship between daily self-reported and software-recorded computer-use duration varies greatly across subjects, with [[w:Spearman's rank correlation coefficient|Spearman's correlations]] ranging from -0.22 to 0.8. Self-reports generally overestimate computer use when software-recorded durations are below 3.6 hours, and underestimate it when they are above 3.6 hours.<ref>{{cite journal |last1=Chang |first1=Che-hsu Joe |last2=Menéndez |first2=Cammie Chaumont |last3=Robertson |first3=Michelle M. |last4=Amick |first4=Benjamin C. |last5=Johnson |first5=Peter W. |last6=del Pino |first6=Rosa J. |last7=Dennerlein |first7=Jack T. |title=Daily self-reports resulted in information bias when assessing exposure duration to computer use |journal=American Journal of Industrial Medicine |date=November 2010 |volume=53 |issue=11 |pages=1142–1149 |doi=10.1002/ajim.20878}}</ref><ref>{{cite web |title=Information bias |url=https://catalogofbias.org/biases/information-bias/ |website=Catalog of Bias |access-date=25 July 2021 |language=en |date=13 November 2019}}</ref> || "[[w:Information bias (psychology)|Information bias]] is any systematic difference from the truth that arises in the collection, recall, recording and handling of information in a study, including how missing data is dealt with."<ref>{{cite web |title=Information Bias |url=https://catalogofbias.org/biases/information-bias/#:~:text=Information%20bias%20is%20any%20systematic,recall%20bias%20and%20reporting%20bias. |website=catalogofbias.org |accessdate=22 September 2020}}</ref>
|-
| 2010 || || Literature || Sebastian Serfas publishes ''Cognitive Biases in the Capital Investment Context: Theoretical Considerations and Empirical Experiments on Violations of Normative Rationality'', which shows how cognitive biases systematically affect and distort capital investment-related decision making and business judgements.<ref>{{cite book |last1=Serfas |first1=Sebastian |title=Cognitive Biases in the Capital Investment Context: Theoretical Considerations and Empirical Experiments on Violations of Normative Rationality |date=6 December 2010 |publisher=Springer Science & Business Media |isbn=978-3-8349-6485-4 |url=https://books.google.com.ar/books/about/Cognitive_Biases_in_the_Capital_Investme.html?id=i7OJWje1JgQC&source=kp_book_description&redir_esc=y |language=en}}</ref>
|-
| 2011 || Belief, decision-making and behavioral || Concept development || The {{w|IKEA effect}} is identified and named by {{w|Michael I. Norton}} of {{w|Harvard Business School}}, Daniel Mochon of {{w|Yale}}, and {{w|Dan Ariely}} of {{w|Duke University}}, who publish the results of three studies in this year.<ref>{{cite web |title=Cognitive Biases — The IKEA Effect |url=https://medium.com/@michaelgearon/cognitive-biases-the-ikea-effect-d994ea6a28ad |website=medium.com |accessdate=14 August 2020}}</ref> || "The [IKEA effect] is the cognitive phenomena where customers get more excited and place a higher value in the products they have partially created, modified or personalized."<ref>{{cite web |title=What is the Ikea Effect? |url=https://www.bloomreach.com/en/blog/2019/08/ikea-effect.html |website=bloomreach.com |accessdate=7 May 2020}}</ref>
|-
| 2011 || || Literature || {{w|Daniel Kahneman}} publishes ''{{w|Thinking, Fast and Slow}}'', which covers cognitive biases, in addition to his work in other fields.<ref>{{cite web |title=Thinking, Fast and Slow |url=https://www.goodreads.com/book/show/11468377-thinking-fast-and-slow |website=www.goodreads.com |access-date=16 June 2021}}</ref> ||
|-
| 2011 || Memory bias || Concept development || The {{w|Google effect}}, also known as “digital amnesia”, is first described by Betsy Sparrow from {{w|Columbia University}} and her colleagues. Their paper describes the results of several memory experiments involving technology.<ref name="thecustomer.net">{{cite web |title=Marketers Need To Be Aware Of Cognitive Bias |url=https://thecustomer.net/marketers-need-to-be-aware-of-cognitive-bias/?cn-reloaded=1 |website=thecustomer.net |accessdate=12 March 2020}}</ref><ref name="Columbia">{{cite web|title=Study Finds That Memory Works Differently in the Age of Google |publisher={{w|Columbia University}}|date=July 14, 2011|url=https://web.archive.org/web/20110717092619/http://news.columbia.edu/research/2490}}</ref> || The {{w|Google effect}} "represents people’s tendency to forget information that they can find online, particularly by using search engines such as {{w|Google}}."<ref>{{cite web |title=The Google Effect and Digital Amnesia: How We Use Machines to Remember |url=https://effectiviology.com/the-google-effect-and-digital-amnesia/#:~:text=Summary%20and%20conclusions-,The%20Google%20effect%20is%20a%20psychological%20phenomenon%20that%20represents%20people's,search%20engines%20such%20as%20Google. |website=effectiviology.com |accessdate=16 July 2020}}</ref>
|-
| 2011 || Belief, decision-making and behavioral || Notable case || The {{w|look-elsewhere effect}}, more generally known in statistics as the {{w|problem of multiple comparisons}}, gains some media attention in the context of the search for the {{w|Higgs boson}} at the {{w|Large Hadron Collider}}.<ref>{{cite web|url=http://blogs.telegraph.co.uk/news/tomchiversscience/100123873/an-unconfirmed-sighting-of-the-elusive-higgs-boson/|title=An unconfirmed sighting of the elusive Higgs boson|author=Tom Chivers|date=2011-12-13|publisher=Daily Telegraph}}</ref> || The {{w|look-elsewhere effect}} "occurs when a statistically significant observation is found but, actually, arose by chance and due to the size of the parameter space and sample observed."<ref>{{cite web |title=When a statistically significant observation should be overlooked. |url=https://thedecisionlab.com/biases/look-elsewhere-effect/ |website=thedecisionlab.com |accessdate=7 May 2020}}</ref>
|-
| 2011 || || Literature || American neuroscientist {{w|Dean Buonomano}} publishes ''Brain Bugs: How the Brain's Flaws Shape Our Lives'', which attempts to explain the brain’s inherent flaws.<ref>{{cite book |last1=Buonomano |first1=Dean |title=Brain Bugs: How the Brain's Flaws Shape Our Lives |date=11 July 2011 |publisher=W. W. Norton & Company |isbn=978-0-393-08195-4 |url=https://books.google.com.ar/books/about/Brain_Bugs_How_the_Brain_s_Flaws_Shape_O.html?id=eAKIcDmhBuEC&source=kp_book_description&redir_esc=y |language=en}}</ref> ||
|-
| 2013 (February 12) || || Literature || American psychologist {{w|Mahzarin Banaji}} publishes ''Blindspot: Hidden Biases of Good People'', which explains the science that shapes our likes and dislikes and our judgments about people’s character, abilities and potential. The book uses the {{w|implicit-association test}}, an assessment that measures attitudes and beliefs that people may be unwilling or unable to report.<ref>{{cite book |last1=Banaji |first1=Mahzarin R. |title=Blindspot: Hidden Biases of Good People |date=18 April 2014 |publisher=Penguin Books Limited |isbn=978-81-8475-930-3 |url=https://books.google.com.ar/books/about/Blindspot.html?id=r0A7_joFYewC&source=kp_book_description&redir_esc=y |language=en}}</ref> ||
|-
| 2013 || Belief, decision-making and behavioral || Concept development || The term “{{w|end-of-history illusion}}” originates in a journal article by psychologists Jordi Quoidbach, [[w:Daniel Gilbert (psychologist)|Daniel Gilbert]], and {{w|Timothy Wilson}} detailing their research on the phenomenon and leveraging the phrase coined by [[w:The End of History and the Last Man|Francis Fukuyama's 1992 book of the same name]].<ref name="Quoidbach2013">{{cite journal |last1= Quoidbach |first1= Jordi |last2= Gilbert |first2= Daniel T.|last3= Wilson |first3= Timothy D. |date= 2013-01-04 |title= The End of History Illusion |journal= [[w:Science (journal)|Science]] |volume= 339 |issue= 6115 |pages= 96–98 |doi= 10.1126/science.1229294 |pmid= 23288539|quote= Young people, middle-aged people, and older people all believed they had changed a lot in the past but would change relatively little in the future.|url= https://web.archive.org/web/20130113214951/http://www.wjh.harvard.edu/~dtg/Quoidbach%20et%20al%202013.pdf |archivedate= 2013-01-13}}</ref> || The {{w|end-of-history illusion}} occurs "when people tend to underestimate how much they will change in the future.”<ref>{{cite web |title=Why You Won’t Be the Person You Expect to Be |url=https://www.nytimes.com/2013/01/04/science/study-in-science-shows-end-of-history-illusion.html |website=nytimes.com |accessdate=7 May 2020}}</ref>
|-
| 2013 || || Literature || Swiss writer {{w|Rolf Dobelli}} publishes ''{{w|The Art of Thinking Clearly}}'', which describes the most common thinking errors, ranging from cognitive biases to envy and social distortions.<ref>{{cite web |title=The Art of Thinking Clearly |url=http://xqdoc.imedao.com/166eb7278f3556e3fe9dc3ef.pdf |website=xqdoc.imedao.com |access-date=28 July 2021}}</ref>
|-
| 2016 || || Literature || Adrian Nantchev publishes ''50 Cognitive Biases for an Unfair Advantage in Entrepreneurship''.<ref>{{cite book |last1=Nantchev |first1=Adrian |title=50 Cognitive Biases for an Unfair Advantage in Entrepreneurship |publisher=CreateSpace Independent Publishing Platform |isbn=978-1-5376-0327-8 |url=https://books.google.com.ar/books/about/50_Cognitive_Biases_for_an_Unfair_Advant.html?id=yY4pvgAACAAJ&source=kp_book_description&redir_esc=y |language=en}}</ref>
|-
| 2019 || || Literature || Henry Priest publishes ''Biases and Heuristics: The Complete Collection of Cognitive Biases and Heuristics That Impair Decisions in Banking, Finance and Everything Else''.<ref>{{cite book |last1=Priest |first1=Henry |title=BIASES and HEURISTICS: The Complete Collection of Cognitive Biases and Heuristics That Impair Decisions in Banking, Finance and Everything Else |publisher=Amazon Digital Services LLC - KDP Print US |isbn=978-1-0784-3231-3 |url=https://books.google.com.ar/books/about/BIASES_and_HEURISTICS.html?id=z4RWxwEACAAJ&source=kp_book_description&redir_esc=y |language=en}}</ref>
 
|}
===How the timeline was built===

The initial version of the timeline was written by [[User:Sebastian]].

{{funding info}} is available.

===What the timeline is still missing===

* Issa: This is probably going to take a whole bunch of work, but eventually it would be nice if the rows containing specific studies that were conducted could mention whether the study has been replicated or not.
* {{w|List of cognitive biases}}
* [https://onlinelibrary.wiley.com/doi/full/10.1002/9781119125563.evpsych241]

===Timeline update strategy===

==See also==

* [[Timeline of the rationality community]]

==External links==

Latest revision as of 20:41, 30 July 2021


Big picture

Time period | Development summary | More details
1972 and earlier | Pre-concept-development era | Multiple concepts later included within the category of cognitive biases are developed throughout time, starting from ancient Greek philosophers.
1972 onward | Modern period | The notion of cognitive bias is introduced by Amos Tversky and Daniel Kahneman, who in the following years would further elaborate on several different types of cognitive biases and related concepts.
21st century | Present time | As of 2020, there are approximately 188 recognized cognitive biases.[1]

Visual and numerical data

Mentions on Google Scholar

The following table summarizes per-year mentions on Google Scholar as of May 17, 2021.

Year Overconfidence Bias Self Serving Bias Herd Mentality Loss Aversion Framing Cognitive Bias Narrative Fallacy Anchoring Bias Confirmation Bias Hindsight Bias Representativeness Heuristic
1980 89 3,060 102 1,830 134 390 221 2,150 420 136
1985 144 3,570 137 2,500 311 557 320 2,560 583 226
1990 234 6,410 268 3,810 779 958 584 4,780 1,010 414
1995 428 10,600 502 5,040 1,610 1,560 1,100 7,070 1,660 539
2000 824 18,500 745 8,590 3,010 2,550 1,960 12,400 2,970 832
2002 1,090 20,700 1,020 11,200 3,850 2,390 2,560 12,400 3,430 898
2004 1,700 24,200 1,160 14,000 5,120 3,300 3,370 16,200 4,200 1,130
2006 2,050 27,300 1,220 16,900 6,470 3,570 4,090 20,500 4,660 1,500
2008 2,650 32,300 1,520 20,700 8,220 4,690 5,040 25,600 5,500 1,580
2010 3,350 36,700 1,810 25,500 10,700 5,320 6,220 31,300 6,280 2,270
2012 4,500 40,100 2,140 29,200 13,900 6,180 7,910 38,500 7,310 2,820
2014 5,300 42,400 2,260 31,800 17,800 8,890 9,230 43,800 8,070 3,440
2016 6,020 42,600 2,390 31,600 19,900 9,160 10,600 45,100 8,790 3,700
2017 6,760 41,600 2,210 31,000 21,900 9,570 11,300 40,300 9,010 4,090
2018 7,500 39,700 2,360 31,200 23,200 10,300 12,500 42,200 9,650 4,300
2019 8,290 33,800 2,330 29,700 24,000 10,200 13,200 35,400 7,990 4,490
2020 9,110 30,100 2,670 28,000 25,500 10,200 15,200 32,500 9,300 4,590
Cognitive biases.png

Google Trends

The chart below shows Google Trends data for cognitive biases (topic) from January 2004 to January 2021, when the screenshot was taken.[2]

Cognitive biases gtrends.jpeg

Google Ngram Viewer

The chart shows Google Ngram Viewer data for "cognitive bias", from 1972 (when the concept was created) to 2019.[3]

Cognitive bias ngram.png

Wikipedia Views

The chart below shows pageviews of the English Wikipedia article cognitive bias, from July 2015 to December 2020.[4]

Cognitive biases wv.jpeg

Full timeline

Year Bias type Event type Details Concept definition (when applicable)
c.180 CE Social bias Field development Many philosophers and social theorists observe and consider the phenomenon of belief in a just world, going back to at least as early as the Pyrrhonist philosopher Sextus Empiricus, writing circa 180 CE, who argues against this belief.[5] "The just-world hypothesis is the belief that people get what they deserve since life is fair."[6]
1747 Field development Scottish doctor James Lind conducts the first systematic clinical trial.[7] "Clinical trials are research studies performed in people that are aimed at evaluating a medical, surgical, or behavioral intervention."[8]
1753 Field development Anthropomorphism is first attested, originally in reference to the heresy of applying a human form to the Christian God.[9][10] Anthropomorphism is "the interpretation of nonhuman things or events in terms of human characteristics".[11]
1776–1799 Field development The declinism belief is traced back to Edward Gibbon's work The History of the Decline and Fall of the Roman Empire,[12] where Edward Gibbon argues that Rome collapsed due to the gradual loss of civic virtue among its citizens.[13] Declinism is "the tendency to believe that the worst is to come".[14]
1796 Literature French scholar Pierre-Simon Laplace describes in A Philosophical Essay on Probabilities the ways in which men calculate their probability of having sons: "I have seen men, ardently desirous of having a son, who could learn only with anxiety of the births of boys in the month when they expected to become fathers. Imagining that the ratio of these births to those of girls ought to be the same at the end of each month, they judged that the boys already born would render more probable the births next of girls." The expectant fathers feared that if more sons were born in the surrounding community, then they themselves would be more likely to have a daughter. This essay by Laplace is regarded as one of the earliest descriptions of the fallacy.[15] "The Gambler's Fallacy is the misconception that something that has not happened for a long time has become 'overdue', such a coin coming up heads after a series of tails."[16]
1847 Concept development Hungarian physician Ignaz Semmelweis discovers that hand washing and disinfecting at hospitals dramatically reduces infection and death in paients. His hand-washing suggestions are at the beginning rejected by his contemporaries, often for non-medical reasons. This would give birth to the concept of Semmelweis effect, which is a metaphor for the reflex-like tendency to reject new evidence or new knowledge because it contradicts established norms, beliefs, or paradigms.[17] Semmelweis effect "refers to the tendency to automatically reject new information or knowledge because it contradicts current thinking or beliefs."[18]
1848 Social (conformity bias) Concept development The phrase "jump on the bandwagon" first appears in American politics when enterteiner Dan Rice uses his bandwagon and its music to gain attention for his political campaign appearances. As his campaign becomes more successful, other politicians would strive for a seat on the bandwagon, hoping to be associated with his success. This preludes the emergence of the term bandwagon effect, which is later coined in the early 20th century.[19] Bandwagon effect "is a psychological phenomenon whereby people do something primarily because other people are doing it, regardless of their own beliefs, which they may ignore or override."[20]
1850 Concept development The first reference to “stereotype” appears as a noun that means “image perpetuated without change.”[21] Stereotype refers to "a widely held but fixed and oversimplified image or idea of a particular type of person or thing"[22]
1860 Concept development Both Weber's law and Fechner's law are published by Gustav Theodor Fechner in the work Elemente der Psychophysik (Elements of Psychophysics). This publication is the first work ever in this field, and where Fechner coins the term psychophysics to describe the interdisciplinary study of how humans perceive physical magnitudes.[23] Weber–Fechner law "states that the change in a stimulus that will be just noticeable is a constant ratio of the original stimulus."[24]
1866 Belief, decision-making and behavioral (apophenia) Concept development The German word pareidolie is used in German articles by Dr. Karl Ludwig Kahlbaum in his paper On Delusion of the Senses.[25] Pareidolia is "the tendency to perceive a specific, often meaningful image in a random or ambiguous visual pattern."[26]
1874 Memory bias Field development The first documented instance of cryptomnesia occurs with the medium Stainton Moses.[27][28] Cryptomnesia is "an implicit memory phenomenon in which people mistakenly believe that a current thought or idea is a product of their own creation when, in fact, they have encountered it previously and then forgotten it".[29]
1876 Memory bias Field development German experimental psychologist Gustav Fechner conducts the earliest known research on the mere-exposure effect.[30] Mere-exposure effect "means that people prefer things that they are most familiar with".[31] It is "the tendency to express undue liking for things merely because of familiarity with them."[32]
1882 Concept development The term specious present is first introduced by the philosopher E. R. Clay.[33][34] Specious present "is the time duration wherein a state of consciousness is experienced as being in the present".[35]
1885 Memory bias Concept development The phenomenon of spacing effect is first identified by Hermann Ebbinghaus, and his detailed study of it is published in his book Über das Gedächtnis. Untersuchungen zur experimentellen Psychologie (Memory: A Contribution to Experimental Psychology). "The spacing effect describes the robust finding that long-term learning is promoted when learning events are spaced out in time, rather than presented in immediate succession".[36]
1890 Memory bias Concept development The tip of the tongue phenomenon is first described as a psychological phenomenon in the text The Principles of Psychology by William James.[37] Tip of the tongue describes "a state in which one cannot quite recall a familiar word but can recall words of similar form and meaning".[38]
1893 Memory bias Concept development Childhood amnesia is first formally reported by psychologist Caroline Miles in her article A study of individual psychology by the American Journal of Psychology.[39] Childhood amnesia "refers to the fact that most people cannot remember events that occurred before the age of 3 or 4".[40]
1906 Social (conformity bias) Concept development The first known use of bandwagon effect occurs in this year.[41] "Bandwagon effect is when an idea or belief is being followed because everyone seems to be doing so."[42]
1906 Social bias Field development American sociologist William Sumner posits that humans are a species that join together in groups by their very nature. However, he also maintains that humans have an innate tendency to favor their own group over others, proclaiming how "each group nourishes its own pride and vanity, boasts itself superior, exists in its own divinities, and looks with contempt on outsiders".[43] In-group favoritism is "the tendency to favor members of one's own group over those in other groups".[44]
1909 Memory bias Concept development The first documented empirical studies on the testing effect are published by Edwina E. Abbott.[45][46] "Testing effect is the finding that long-term memory is often increased when some of the learning period is devoted to retrieving the to-be-remembered information."[47]
1913 Concept development The term "Monte Carlo fallacy" (also known as Gambler's fallacy) originates from the best known example of the phenomenon, which occurs in the Monte Carlo Casino.[48] Gambler's fallacy "occurs when an individual erroneously believes that a certain random event is less likely or more likely, given a previous event or a series of events."[49]
1914 Memory bias Concept development The first research on the cross-race effect is published.[50] Cross-race effect is "the tendency for eyewitnesses to be better at recognizing members of their own race/ethnicity than members of other races."[51]
1920 Social bias Concept development The halo effect is named by psychologist Edward Thorndike[52] in reference to a person being perceived as having a halo. He gives the phenomenon its name in his article A Constant Error in Psychological Ratings.[53] In "Constant Error", Thorndike sets out to replicate the study in hopes of pinning down the bias that he thought was present in these ratings. Subsequent researchers would study it in relation to attractiveness and its bearing on the judicial and educational systems.[54] Thorndike originally coins the term referring only to people; however, its use would be greatly expanded especially in the area of brand marketing.[53] Halo effect refers to an "error in reasoning in which an impression formed from a single trait or characteristic is allowed to influence multiple judgments or ratings of unrelated factors."[55]
1922 Concept development The term “stereotype” is first used in the modern psychological sense by American journalist Walter Lippmann in his work Public Opinion.[21] "Stereotype is most frequently now employed to refer to an often unfair and untrue belief that many people have about all people or things with a particular characteristic."[56]
1927 Memory bias Research Lithuanian-Soviet psychologist Bluma Zeigarnik at the University of Berlin first describes the phenomenon that would be later known as Zeigarnik effect.[57][58][59] Zeigarnik effect is the "tendency to remember interrupted or incomplete tasks or events more easily than tasks that have been completed."[60]
1928 Belief, decision-making and behavioral Literature American economist Irving Fisher publishes The Money Illusion, which develops the concept of the same name.[61] "Money illusion posits that people have a tendency to view their wealth and income in nominal dollar terms, rather than recognize its real value, adjusted for inflation."[62]
1930 Concept development English epistemologist C. D. Broad further elaborates on the concept of the specious present and states that it may be considered as the temporal equivalent of a sensory datum.[34] "The specious present is a term applied to that short duration of time the human mind appears to be able to experience, a period which exists between past and future and which is longer than the singular moment of the actual present."[63]
1932 Memory bias Field development Some of the earliest evidence for the fading affect bias dates back to a study by Cason, who uses a retrospective procedure in which participants recall and rate past events and emotions when prompted; he finds that recalled emotional intensity for positive events is generally stronger than that for negative events.[64] The fading affect bias "indicates that the emotional response prompted by positive memories often tends to be stronger than the emotional response prompted by negative memories."[65]
1933 Memory bias Concept development The Von Restorff effect is first identified by German psychiatrist and pediatrician Hedwig von Restorff, who, in her study, finds that when participants are presented with a list of categorically similar items containing one distinctive, isolated item, memory for that item is improved.[66] "It predicts that when multiple similar objects are present, the one that differs from the rest is most likely to be remembered."[67]
1942 Concept development The Einstellung effect is first described by American psychologist Abraham Luchins.[68] "The Einstellung Effect is a type of mindset that causes humans to repeat the use of "tried and true" strategies for problem solving, even when a simpler solution strategy exists."[69]
1945 Belief, decision-making and behavioral (anchoring bias) Concept development Karl Duncker defines functional fixedness as being a "mental block against using an object in a new way that is required to solve a problem".[70] Functional fixedness "is the inability to realize that something known to have a particular use may also be used to perform other functions."[71]
1946 Belief, decision-making and behavioral (logical fallacy) Concept development American statistician Joseph Berkson illustrates what would later be known as Berkson's paradox, one of the most famous paradoxes in probability and statistics.[72] It is also known as Berkson's bias or Berkson's fallacy. Berkson's paradox "is a type of selection bias – a mathematical result found in the fields of conditional probability and statistics in which two variables can be negatively correlated even though they have the appearance of being positively correlated within the population."[73]
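The selection effect behind Berkson's paradox can be made concrete with a short simulation (an illustrative sketch, not drawn from the timeline's sources; the two traits and the "observe only if at least one is present" selection rule follow the standard textbook hospital example):

```python
import random

random.seed(0)

# Two independent binary traits per individual, e.g. two unrelated
# diseases in Berkson's original hospital setting (illustrative framing).
population = [(random.random() < 0.5, random.random() < 0.5)
              for _ in range(100_000)]

def correlation(pairs):
    """Pearson correlation of two binary variables given as (a, b) pairs."""
    xs = [float(a) for a, _ in pairs]
    ys = [float(b) for _, b in pairs]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    sx = (sum((x - mx) ** 2 for x in xs) / len(xs)) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / len(ys)) ** 0.5
    return cov / (sx * sy)

# In the full population the traits are independent: correlation near 0.
print(round(correlation(population), 2))

# Selection: we only observe individuals with at least one trait
# (e.g. only hospitalized patients). Within this subsample the traits
# become negatively correlated -- Berkson's paradox.
selected = [(a, b) for a, b in population if a or b]
print(round(correlation(selected), 2))
```

With both trait probabilities at 0.5, the correlation inside the selected subsample converges to −0.5 even though the traits are unrelated in the population.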
1947 Belief, decision-making and behavioral (extension neglect) Concept development Joseph Stalin is credited by some for having introduced the concept of compassion fade with his statement “the death of one man is a tragedy, the death of millions is a statistic”.[74] However, this introduction is considered to be misattributed by others.[75] Compassion fade "refers to the decrease in the compassion one shows for the people in trouble as the number of the victims increase."[76]
1952 Social (conformity bias) Concept development William H. Whyte Jr. derives the term groupthink from George Orwell's Nineteen Eighty-Four and popularizes it in Fortune magazine:
Groupthink being a coinage – and, admittedly, a loaded one – a working definition is in order. We are not talking about mere instinctive conformity – it is, after all, a perennial failing of mankind. What we are talking about is a rationalized conformity – an open, articulate philosophy which holds that group values are not only expedient but right and good as well.[77][78]
"Groupthink is a psychological phenomenon in which people strive for consensus within a group."[79]
1954 Social bias Concept development The social comparison theory is initially proposed by social psychologist Leon Festinger. It centers on the belief that there is a drive within individuals to gain accurate self-evaluations.[80] The social comparison theory refers to "the idea that individuals determine their own social and personal worth based on how they stack up against others".[81]
1956 Concept development The term "Barnum effect" is coined by psychologist Paul Meehl in his essay Wanted – A Good Cookbook, because he relates the vague personality descriptions used in certain "pseudo-successful" psychological tests to those given by showman P. T. Barnum.[82][83] Barnum effect is "the phenomenon that occurs when individuals believe that personality descriptions apply specifically to them (more so than to other people), despite the fact that the description is actually filled with information that applies to everyone."[84]
1957 Concept development British naval historian C. Northcote Parkinson describes what is later called Parkinson's law of triviality, which argues that members of an organization give disproportionate weight to trivial issues.[85] Parkinson's law of triviality (also known as the bike-shed effect) "explains that people will give more energy and focus to trivial or unimportant items than to more important and complex ones."[86]
1960 Belief, decision-making and behavioral Concept development English psychologist Peter Cathcart Wason first describes the confirmation bias.[87][88][89] "Confirmation bias is the tendency of people to favor information that confirms their existing beliefs or hypotheses."[90]
1960 Belief, decision-making and behavioral (confirmation bias) Concept development Peter Cathcart Wason's rule-discovery experiment provides the classic demonstration of congruence bias in subjects.[91] Congruence bias is "the tendency to test hypotheses exclusively through direct testing, instead of considering possible alternatives."[92]
1961 Social bias Research The Milgram experiment is conducted. This classic experiment establishes the existence of authority bias.[93] "Authority bias is the human tendency to attribute greater authority and knowledge to persons of authority (fame, power, position, etc.) than they may actually possess."[94]
1961 Belief, decision-making and behavioral (ambiguity effect) Concept development The ambiguity effect is first described by American economist Daniel Ellsberg.[95] "Ambiguity Effect occurs when people prefer options with known probabilities over those with unknown probabilities."[96]
1964 Memory bias Concept development The original work on the telescoping effect is usually attributed to an article by Neter and Waksberg in the Journal of the American Statistical Association.[97] The term telescoping comes from the idea that time seems to shrink toward the present in the way that the distance to objects seems to shrink when they are viewed through a telescope.[97] "The telescoping effect refers to inaccurate perceptions regarding time, where people see recent events as more remote than they are (backward telescoping), and remote events as more recent (forward telescoping)."[98]
1964 Belief, decision-making and behavioral (anchoring bias) Concept development The first recorded statement of the concept of Law of the instrument is Abraham Kaplan's: "I call it the law of the instrument, and it may be formulated as follows: Give a small boy a hammer, and he will find that everything he encounters needs pounding."[99] "The law of the instrument principle states that when we acquire a specific tool/skill, we tend to see opportunities to use that tool/skill everywhere."[100]
1966 Social (egocentric bias) Research Walster hypothesizes that it can be frightening to believe that a misfortune could happen to anyone at random, and attributing responsibility to the person(s) involved helps to manage this emotional reaction.[101] "The defensive attribution hypothesis is a social psychology term that describes an attributional approach taken by some people - a set of beliefs that an individual uses to protect or "shield" themselves against fears of being the victim or cause of a major mishap."[102]
1967 Belief, decision-making and behavioral Notable case Risk compensation. Sweden experiences a drop in crashes and fatalities, following the change from driving on the left to driving on the right. This is linked to the increased apparent risk. The number of motor insurance claims goes down by 40%, returning to normal over the next six weeks.[103][104] Fatality levels would take two years to return to normal.[105] "Risk compensation postulates that humans have a built-in level of acceptable risk-taking and that our behaviour adjusts to this level in a homeostatic manner".[106]
1967 Belief, decision-making and behavioral (apophenia) Concept development Illusory correlation is originally coined by Chapman and Chapman to describe people's tendencies to overestimate relationships between two groups when distinctive and unusual information is presented.[107][108] An illusory correlation occurs when a person perceives a relationship between two variables that are not in fact correlated.[109]
1967 Social (attribution bias) Research American social psychologist Edward E. Jones and Victor Harris conduct a classic experiment[110] that would later give rise to the phrase Fundamental attribution error, coined by Lee Ross.[111] Fundamental attribution error "is the tendency for people to over-emphasize dispositional, or personality-based explanations for behaviors observed in others while under-emphasizing situational explanations".[112]
1968 Belief, decision-making and behavioral (anchoring bias) Concept development American psychologist Ward Edwards discusses the concept of conservatism (belief revision) bias.[113] "Conservatism bias is a mental process in which people maintain their past views or predictions at the cost of recognizing new information."[114]
1968 Social Concept development German-born American psychologist Robert Rosenthal and Lenore Jacobson first describe what would be called the Pygmalion effect (also known as the Rosenthal effect).[115] The Pygmalion effect "refers to the phenomenon of people improving their performance when others have high expectations of them."[116]
1969 Social (cognitive dissonance) Concept development Researchers Jon Jecker and David Landy experimentally confirm the Ben Franklin effect.[117] The Ben Franklin effect refers to "an altruistic reaction that makes a person more likely to do a favor for someone that they have already completed a favor for; more likely than they are to return a favor to someone who has completed a favor for them."[118]
1969 Memory bias Research Crowder and Morton argue that the suffix effect is a reflection of the contribution of the auditory sensory memory or echoic memory to recall in the nonsuffix control condition.[119] "The suffix effect is the selective impairment in recall of the final items of a spoken list when the list is followed by a nominally irrelevant speech item, or suffix."[120]
1971 Social bias Concept development The concept of actor–observer asymmetry (also actor–observer bias) is introduced by Jones and Nisbett. It explains the errors that one makes when forming attributions about the behavior of others.[121] The actor–observer asymmetry "states that people tend to explain their own behavior with situation causes and other people's behavior with person causes".[122]
1972 Concept development The concept of cognitive bias is introduced in this year through the work of researchers Amos Tversky and Daniel Kahneman.[123] Cognitive bias refers to "people's systematic but purportedly flawed patterns of responses to judgment and decision problems."[124]
1973 Memory bias Concept development American academic Baruch Fischhoff attends a seminar where Paul E. Meehl states an observation that clinicians often overestimate their ability to have foreseen the outcome of a particular case, as they claim to have known it all along.[125] "Hindsight bias, the tendency, upon learning an outcome of an event—such as an experiment, a sporting event, a military decision, or a political election—to overestimate one's ability to have foreseen the outcome."[126]
1973 Belief, decision-making and behavioral (egocentric bias) Concept development The illusion of validity bias is first described by Amos Tversky and Daniel Kahneman in their paper On the Psychology of Prediction.[127] The illusion of validity occurs when an individual overestimates their ability to predict an outcome when analyzing a set of data, especially when the data appear to have a consistent pattern or appear to 'tell a story'.[128]
1973 Memory bias Concept development The next-in-line effect is first studied experimentally by Malcolm Brenner. In his experiment, participants each in turn read a word aloud from an index card and, after 25 words, are asked to recall as many of the read words as possible. The results of the experiment show that words read aloud within approximately nine seconds before the subject's own turn are recalled worse than other words.[129] The next-in-line effect describes "people not remembering what other people said because they were too busy rehearsing their own part."[130]
1974 Memory bias Research Elizabeth Loftus and John Palmer conduct a study to investigate the effects of language on the development of false memory.[131] "False memory refers to cases in which people remember events differently from the way they happened or, in the most dramatic case, remember events that never happened at all."[132]
1974 Belief, decision-making and behavioral Concept development Anchoring is first described by Tversky and Kahneman.[133] "Anchoring bias occurs when people rely too much on pre-existing information or the first information they find when making decisions."[134]
1975 Social (attribution bias) Research Miller and Ross conduct a study that is one of the earliest to assess not only self-serving bias but also the attributions for successes and failures within this theory.[135] Self-serving bias "is the common habit of a person taking credit for positive events or outcomes, but blaming outside factors for negative events."[136]
1976 Belief, decision-making and behavioral (logical fallacy) Concept development Escalation of commitment is first described by Barry M. Staw in his paper Knee deep in the big muddy: A study of escalating commitment to a chosen course of action.[137] Escalation of commitment "refers to the irrational behavior of investing additional resources in a failing project."[138]
1976 Social (attribution bias) Research Prior to Pettigrew's formalization of the ultimate attribution error, Birt Duncan finds that White participants view Black individuals as more violent than White individuals in an "ambiguous shove" situation, where a Black or White person accidentally shoves a White person.[139] "The tendency for persons from one group (the ingroup) to determine that any bad acts by members of an outgroup—for example, a racial or ethnic minority group—are caused by internal attributes or traits rather than by outside circumstances or situations, while viewing their positive behaviors as merely exceptions to the rule or the result of luck."[140]
1977 Memory bias Research Misattribution of memory. Early research done by Brown and Kulik finds that flashbulb memories are similar to photographs because they can be described in accurate, vivid detail. In this study, participants describe their circumstances about the moment they learned of the assassination of President John F. Kennedy as well as other similar traumatic events. Participants are able to describe what they were doing, things around them, and other details.[141] Misattribution of memory occurs "when a memory is distorted because of the source, context, or our imagination."[142]
1977 Social (egocentric bias) Concept development A study conducted by Lee Ross and colleagues provides early evidence for a cognitive bias called the false consensus effect, which is the tendency for people to overestimate the extent to which others share the same views.[143] The false-consensus effect "refers to the tendency to overestimate consensus for one's attitudes and behaviors."[144][145] It is "the tendency to assume that one's own opinions, beliefs, attributes, or behaviors are more widely shared than is actually the case."[146]
1977 Belief, decision-making and behavioral (truthiness) Concept development The illusory truth effect is first identified in a study at Villanova University and Temple University.[147][148] The illusory truth effect "occurs when repeating a statement increases the belief that it’s true even when the statement is actually false."[149]
1977 Memory bias Research T. B. Rogers and colleagues publish the first research on the self-reference effect.[150][151] "The self-reference effect refers to people’s tendency to better remember information when that information has been linked to the self than when it has not been linked to the self."[152]
1978 Memory bias Research Loftus, Miller, and Burns conduct the original misinformation effect study.[153] The misinformation effect "happens when a person's memory becomes less accurate due to information that happens after the event."[154]
1979 Social (attribution bias) Research Thomas Nagel identifies four kinds of moral luck in his essay Moral Luck.[155] "Moral luck occurs when the features of action which generate a particular moral assessment lie significantly beyond the control of the agent who is so assessed."[156]
1979 Social bias Concept development The ultimate attribution error is first established by Thomas F. Pettigrew in his publication The Ultimate Attribution Error: Extending Allport's Cognitive Analysis of Prejudice.[157] "Ultimate attribution error refers to the tendency of individuals to make less internal attributions of negative behaviors committed by ingroup members compared to outgroup members."[158]
1979 Belief, decision-making and behavioral (prospect theory) Concept development Daniel Kahneman and Amos Tversky originally coin the term loss aversion in their landmark paper introducing prospect theory.[159] "Loss aversion is a cognitive bias that suggests that for individuals the pain of losing is psychologically twice as powerful as the pleasure of gaining."[160]
1979 Belief, decision-making and behavioral Concept development The planning fallacy is first proposed by Daniel Kahneman and Amos Tversky.[161][162] "The planning fallacy refers to a prediction phenomenon, all too familiar to many, wherein people underestimate the time it will take to complete a future task, despite knowledge that previous tasks have generally taken longer than planned."[163]
1980 Memory bias Concept development The term "egocentric bias" is first coined by Anthony Greenwald, a psychologist at Ohio State University.[164] "The egocentric bias is a cognitive bias that causes people to rely too heavily on their own point of view when they examine events in their life or when they try to see things from other people’s perspective."[165]
1980 Social bias Concept development Ruth Hamill, Richard E. Nisbett, and Timothy DeCamp Wilson become the first to study the first type of group attribution error in detail in their paper Insensitivity to Sample Bias: Generalizing From Atypical Cases.[166] Group attribution error is "the tendency for perceivers to assume that a specific group member’s personal characteristics and preferences, including beliefs, attitudes, and decisions, are similar to those of the group to which he or she belongs."[167]
1980 Belief, decision-making and behavioral (truthiness) Concept development The term subjective validation first appears in the book The Psychology of the Psychic by David F. Marks and Richard Kammann.[168] Subjective validation "causes an individual to consider a statement or another piece of information correct if it has any significance or personal meaning (validating their previous opinion) to them."[169]
1980 Belief, decision-making and behavioral Concept development The phenomenon of optimism bias is initially described by Weinstein, who finds that the majority of college students believe that their chances of developing a drinking problem or getting divorced are lower than their peers'.[170] "Optimism Bias refers to the tendency for individuals to underestimate their probability of experiencing adverse effects despite the obvious."[171]
1981 Social bias Research Tversky and Kahneman conduct a demonstration of the framing effect.[172] "The Framing effect is the principle that our choices are influenced by the way they are framed through different wordings, settings, and situations."[173]
1981 Belief, decision-making and behavioral (prospect theory) Concept development The pseudocertainty effect is illustrated by Daniel Kahneman.[174] "Pseudocertainty effect refers to people's tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes."[175]
1982 Social (egocentric bias) Research Trait ascription bias. In a study involving fifty-six undergraduate psychology students from the University of Bielefeld, Kammer et al. demonstrate that subjects rate their own variability on each of 20 trait terms to be considerably higher than their peers'.[176] "Trait ascription bias is the belief that other people's behavior and reactions are generally predictable while you yourself are more unpredictable."[177]
1982 Belief, decision-making and behavioral (framing effect) Research The decoy effect is first demonstrated by Joel Huber and others at Duke University. The effect explains how when a customer is hesitating between two options, presenting them with a third “asymmetrically dominated” option that acts as a decoy will strongly influence which decision they make.[178] "The decoy effect is defined as the phenomenon whereby consumers change their preference between two options when presented with a third option."[179]
1983 Social (egocentric bias) Concept development Sociologist W. Phillips Davison first articulates the third-person effect hypothesis.[180][181] Third-person effect refers to "the commonly held belief that other people are more affected, due to personal prejudices, by mass media than you yourself are. This view, largely due to a personal conceit, is caused by the self-concept of being more astute and aware than others, or of being less vulnerable to persuasion than others."[182]
1983 Social (conformity bias) Research Jones reports the presence of courtesy bias in Asian cultures.[183] "Courtesy bias is the tendency that some individuals have of not fully stating their unhappiness with a service or product because of a desire not to offend the person or organization that they are responding to."[184]
1985 Belief, decision-making and behavioral (prospect theory) Concept development The disposition effect anomaly is identified and named by Hersh Shefrin and Meir Statman, who note that "people dislike incurring losses much more than they enjoy making gains, and people are willing to gamble in the domain of losses." Consequently, "investors will hold onto stocks that have lost value...and will be eager to sell stocks that have risen in value." The researchers coin the term "disposition effect" to describe this tendency of holding on to losing stocks too long and to sell off well-performing stocks too readily.[185] "The disposition effect refers to investors’ reluctance to sell assets that have lost value and greater likelihood of selling assets that have made gains."[186]
1985 Belief, decision-making and behavioral (logical fallacy) Concept development The hot-hand fallacy is first described in a paper by Amos Tversky, Thomas Gilovich, and Robert Vallone.[187] "The hot-hand fallacy effect refers to the tendency for people to expect streaks in sports performance to continue."[188]
1986 Memory bias Research McDaniel and Einstein describe the bizarreness effect as the finding that people have superior memory for bizarre sentences relative to common ones.[189] However, the researchers argue that bizarreness intrinsically does not enhance memory in their paper.[190][191] "The bizarreness effect holds that items associated with bizarre sentences or phrases are more readily recalled than those associated with common sentences or phrases."[192]
1988 Social Concept development The reactive devaluation bias is proposed by Lee Ross and Constance Stillinger.[193] Reactive devaluation "is the tendency to value a proposal from someone we recognize as an antagonist as less interesting than if it were made by someone else."[194]
1988 Belief, decision-making and behavioral (prospect theory) Research Samuelson and Zeckhauser demonstrate status quo bias using a questionnaire in which subjects face a series of decision problems, alternately framed with and without a pre-existing status quo position. Subjects tend to remain with the status quo when such a position is offered to them.[195] "Status quo bias refers to the phenomenon of preferring that one's environment and situation remain as they already are."[196]
1989 Belief, decision-making and behavioral Concept development The term "curse of knowledge" is coined in a Journal of Political Economy article by economists Colin Camerer, George Loewenstein, and Martin Weber. The curse of knowledge causes people to fail to account for the fact that others don't know the same things that they do.[197]
1990 Belief, decision-making and behavioral (prospect theory) Research Kahneman, Knetsch and Thaler publish a paper containing the first experimental test of the Endowment Effect.[198] It refers to an emotional bias that causes individuals to value an owned object higher, often irrationally, than its market value.
1990 Belief, decision-making and behavioral (confirmation bias) Concept development The phenomenon known as “satisfaction of search” is first described, in which a radiologist fails to detect a second abnormality, apparently because of prematurely ceasing to search the images after detecting a “satisfying” finding.[199] "Satisfaction of search describes a situation in which the detection of one radiographic abnormality interferes with that of others."[200]
1990 Literature Jean-Paul Caverni, Jean-Marc Fabre and Michel Gonzalez publish Cognitive Biases.[201]
1991 Social (egocentric bias) Concept development The term illusory superiority is first used by the researchers Van Yperen and Buunk.[202] Illusory superiority "indicates an individual who has a belief that they are somehow inherently superior to others".[203]
1991 Social (conformity bias) Research Marín and Marín report courtesy bias to be common in Hispanic cultures.[183] The "Courtesy Bias is the reluctance of an individual to give negative feedback for fear of offending."[204]
1994 Belief, decision-making and behavioral Concept development The term women-are-wonderful effect is coined by researchers Alice Eagly and Antonio Mladinic in a paper questioning the widely held view that there is prejudice against women.[205] "The women are wonderful effect is a phenomenon found in psychological research in which people associate more positive attributes with women as compared to men."[206]
1994 Belief, decision-making and behavioral (logical fallacy) Research Research by Fox, Rogers, and Tversky provides evidence of the subadditivity effect in expert judgment, after having investigated 32 professional options traders.[207] The subadditivity effect is "the tendency to judge probability of the whole to be less than the probabilities of the parts".[208]
1995 Concept development Implicit bias is first described in a publication by Tony Greenwald and Mahzarin Banaji.[209] "Research on implicit bias suggests that people can act on the basis of prejudice and stereotypes without intending to do so."[210]
1996 Research Daniel Kahneman and Amos Tversky argue that cognitive biases have important practical implications for areas including clinical judgment, entrepreneurship, finance, and management.[211][212]
1998 Belief, decision-making and behavioral Research Gilbert et al. report on the presence of impact bias in registered voters.[213] "Impact bias refers to a human tendency to overestimate emotional responses to events and experiences."[214]
1998 Concept development The implicit-association test is introduced in the scientific literature by Anthony Greenwald, Debbie McGhee, and Jordan Schwartz.[215] It is a research method able to provide a range of new possibilities for those looking to conduct research exploring attitudes and beliefs.[216] "The implicit-association test is a flexible task designed to tap automatic associations between concepts (e.g., math and arts) and attributes (e.g., good or bad, male or female, self or other)."[217]
1998 Belief, decision-making and behavioral (extension neglect) Concept development Hsee discovers a less-is-better effect in three contexts: "(1) a person giving a $45 scarf (from scarves ranging from $5-$50) as a gift was perceived to be more generous than one giving a $55 coat (from coats ranging from $50-$500); (2) an overfilled ice cream serving with 7 oz of ice cream was valued more than an underfilled serving with 8 oz of ice cream; (3) a dinnerware set with 24 intact pieces was judged more favourably than one with 31 intact pieces (including the same 24) plus a few broken ones."[218] "The less-is-better effect is the tendency to prefer the smaller or the lesser alternative when choosing individually, but not when evaluating together."[219]
1999 Belief, decision-making and behavioral Concept development The psychological phenomenon of illusory superiority known as the Dunning–Kruger effect is identified as a form of cognitive bias in Kruger and Dunning's 1999 study, Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments.[220] "The Dunning-Kruger effect is a cognitive bias in which people wrongly overestimate their knowledge or ability in a specific area."[221]
1999 Memory bias Concept development The term "spotlight effect" is coined by Thomas Gilovich and Kenneth Savitsky.[222] The phenomenon first appears in the world of psychology in the journal Current Directions in Psychological Science. "The spotlight effect refers to the tendency to think that more people notice something about you than they do."[223]
1999 Social (egocentric bias) Concept development Kruger and Gilovich publish study titled Naive cynicism in everyday theories of responsibility assessment: On biased assumptions of bias, which formally introduces the concept of naïve cynicism.[224] Naïve cynicism is "the tendency of laypeople to expect other people’s judgments will have a motivational basis and therefore will be biased in the direction of their self-interest."[225]
2001 Belief, decision-making and behavioral (framing effect) Research Druckman shows that economic policies receive higher support when framed in terms of the employment rates rather than unemployment rates.[228] "The Framing Effect is a cognitive bias that explains how we react differently to things depending on how they are presented to us."[229]
2002 Belief, decision-making and behavioral Concept development Daniel Kahneman and Shane Frederick propose the process of attribute substitution.[226] "Attribute substitution occurs when an individual has to make a judgment (of a target attribute) that is computationally complex, and instead substitutes a more easily calculated heuristic attribute."[227]
2002 Social (egocentric bias) Concept development Pronin et al. introduce the concept of "bias blind spot".[230] Bias blind spot "refers to the tendency for people to be able to identify distortionary biases in others, while being ignorant of and susceptible to precisely these biases in their own thinking."[230]
2002 Research Bystander effect. Research indicates that priming a social context may inhibit helping behavior. Imagining being around one other person or being around a group of people can affect a person's willingness to help.[231] "The bystander effect occurs when the presence of others discourages an individual from intervening in an emergency situation."[232]
2002 Belief, decision-making and behavioral (prospect theory) Recognition Daniel Kahneman is awarded the Nobel Memorial Prize in Economic Sciences for his work on prospect theory. He is the first non-economist by profession to win the prize.[233][234] "Prospect theory assumes that losses and gains are valued differently, and thus individuals make decisions based on perceived gains instead of perceived losses."[235]
2003 Belief, decision-making and behavioral Concept development The term projection bias is first introduced in the paper Projection Bias in Predicting Future Utility by Loewenstein, O'Donoghue and Rabin.[236] Projection bias "refers to people’s assumption that their tastes or preferences will remain the same over time."[237]
2003 Concept development Lovallo and Kahneman propose an expanded definition of planning fallacy as the tendency to underestimate the time, costs, and risks of future actions and at the same time overestimate the benefits of the same actions. According to this definition, the planning fallacy results in not only time overruns, but also cost overruns and benefit shortfalls.[238] "Planning fallacy refers to a prediction phenomenon, all too familiar to many, wherein people underestimate the time it will take to complete a future task, despite knowledge that previous tasks have generally taken longer than planned."[239]
2003 Belief, decision-making and behavioral (framing effect) Research Johnson and Goldstein report on the framing effect playing a key role in the rate of organ donation.[172] "The term framing effect refers to a phenomenon whereby the choices people make are systematically altered by the language used in the formulation of options."[240]
2004 Social bias Literature American journalist James Surowiecki publishes The Wisdom of Crowds, which explores herd mentality and draws the conclusion that the decisions made by groups are often better and more accurate than those made by any individual member.[241] "Herd mentality (also known as mob mentality) describes a behavior in which people act the same way or adopt similar behaviors as the people around them – often ignoring their own feelings in the process."[241]
2004 Literature Rüdiger F. Pohl publishes Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory, which provides an overview of research in the area.[242]
2004 Belief, decision-making and behavioral (framing effect) Concept development The concept of the distinction bias is advanced by Christopher K. Hsee and Jiao Zhang of the University of Chicago as an explanation for differences in evaluations of options between joint evaluation mode and separate evaluation mode.[243] Distinction bias is "the tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately." This bias is similar to the less-is-better effect, which is "the tendency to prefer a smaller set to a larger set judged separately, but not jointly."[32]
2005 Research Haigh and List report on the framing effect playing a key role in stock market forecasting.[172] "The framing effect is a type of cognitive bias that causes people to react to something in different ways depending on how the information is presented to them."[244]
2006 Organization Overcoming Bias launches as a group blog on the "general theme of how to move our beliefs closer to reality, in the face of our natural biases such as overconfidence and wishful thinking, and our bias to believe we have corrected for such biases, when we have done no such thing."[245]
2006 Belief, decision-making and behavioral Concept development The term "ostrich effect" is coined by Galai and Sade.[246] "The ostrich effect bias is a tendency to ignore dangerous or negative information by ignoring it or burying one's head in the sand."[247]
2007 Belief, decision-making and behavioral Concept development The term recency illusion is coined by Stanford University linguist Arnold Zwicky.[248] The recency illusion is "the belief or impression that a word or language usage is of recent origin when it is long-established."[248]
2007 Social (conformity bias) Concept development The concept of an “availability cascade” is defined by professors Timur Kuran and Cass Sunstein.[249] Availability cascade refers to the "self-reinforcing process of collective belief formation by which an expressed perception triggers a chain reaction that gives the perception of increasing plausibility through its rising availability in public discourse."[249]
2008 Belief, decision-making and behavioral Literature Israeli-American author Dan Ariely publishes Predictably Irrational: The Hidden Forces That Shape Our Decisions, which explores cognitive biases within the genre of behavioral economics.[250]
2008 Social bias (association fallacy) Concept development The term cheerleader effect is coined by the character Barney Stinson in Not a Father's Day, an episode of the television series How I Met Your Mother. Barney points out to his friends a group of women who appear attractive as a group but unattractive when examined individually.[251] "The cheerleader effect refers to the increase in attractiveness that an individual face experiences when seen in a group of other faces."[252]
2009 Belief, decision-making and behavioral (framing effect) Concept development The concept of the denomination effect is proposed by Priya Raghubir, professor at the New York University Stern School of Business, and Joydeep Srivastava, professor at the University of Maryland.[253] The denomination effect relates "to currency, whereby people are less likely to spend larger bills than their equivalent value in smaller bills."[254]
2010 Belief, decision-making and behavioral (confirmation bias) Concept development The term "backfire effect" is coined by American political scientists Brendan Nyhan and Jason Reifler.[255] "The backfire effect is a cognitive bias that causes people who encounter evidence that challenges their beliefs to reject that evidence, and to strengthen their support of their original stance."[256]
2010 Belief, decision-making and behavioral Research The Handbook of Social Psychology recognizes naïve realism as one of "four hard-won insights about human perception, thinking, motivation and behavior that... represent important, indeed foundational, contributions of social psychology."[257] "Naïve realism describes people’s tendency to believe that they perceive the social world “as it is”—as objective reality—rather than as a subjective construction and interpretation of reality."[258]
2010 Belief, decision-making and behavioral Research In a study looking at computer use and musculoskeletal symptoms, Chang et al. investigate information bias in the self-reporting of personal computer use. Over a period of 3 weeks, young adults report the duration of computer use each day, as well as musculoskeletal symptoms. Usage-monitor software installed on participants' computers provides the reference measure. Results show that the relationship between daily self-reported and software-recorded computer-use duration varies greatly across subjects, with Spearman's correlations ranging from -0.22 to 0.8. Self-reports generally overestimate computer use when software-recorded durations are less than 3.6 hr, and underestimate it when they are above 3.6 hr.[259][260] "Information bias is any systematic difference from the truth that arises in the collection, recall, recording and handling of information in a study, including how missing data is dealt with."[261]
2010 Literature Sebastian Serfas publishes Cognitive Biases in the Capital Investment Context: Theoretical Considerations and Empirical Experiments on Violations of Normative Rationality, which shows how cognitive biases systematically affect and distort capital investment-related decision making and business judgements.[262]
2011 Belief, decision-making and behavioral Concept development The IKEA effect is identified and named by Michael I. Norton of Harvard Business School, Daniel Mochon of Yale, and Dan Ariely of Duke University, who publish the results of three studies in this year.[263] "The [IKEA effect] is the cognitive phenomena where customers get more excited and place a higher value in the products they have partially created, modified or personalized."[264]
2011 Literature Daniel Kahneman publishes Thinking, Fast and Slow, which covers cognitive biases, in addition to his work in other fields.[265]
2011 Memory bias Concept development The Google effect, also known as “digital amnesia”, is first described by Betsy Sparrow from Columbia University and her colleagues. Their paper describes the results of several memory experiments involving technology.[266][267] The Google effect "represents people’s tendency to forget information that they can find online, particularly by using search engines such as Google."[268]
2011 Belief, decision-making and behavioral Notable case The look-elsewhere effect, more generally known in statistics as the problem of multiple comparisons, gains some media attention in the context of the search for the Higgs boson at the Large Hadron Collider.[269] The look-elsewhere effect "occurs when a statistically significant observation is found but, actually, arose by chance and due to the size of the parameter space and sample observed."[270]
2011 Literature American neuroscientist Dean Buonomano publishes Brain Bugs: How the Brain's Flaws Shape Our Lives, which attempts to explain the brain’s inherent flaws.[271]
2013 (February 12) Literature American psychologist Mahzarin Banaji publishes Blindspot: Hidden Biases of Good People, which explains the science that shapes our likes and dislikes and our judgments about people’s character, abilities and potential. The book uses the implicit-association test, an assessment that measures attitudes and beliefs that people may be unwilling or unable to report.[272]
2013 Belief, decision-making and behavioral Concept development The term "end-of-history illusion" originates in a journal article by psychologists Jordi Quoidbach, Daniel Gilbert, and Timothy Wilson detailing their research on the phenomenon; the phrase echoes the title of Francis Fukuyama's 1992 book The End of History and the Last Man.[273] The end-of-history illusion occurs "when people tend to underestimate how much they will change in the future."[274]
2013 Literature Swiss writer Rolf Dobelli publishes The Art of Thinking Clearly, which describes the most common thinking errors, ranging from cognitive biases to envy and social distortions.[275]
2016 Literature Adrian Nantchev publishes 50 Cognitive Biases for an Unfair Advantage in Entrepreneurship.[276]
2019 Literature Henry Priest publishes Biases and Heuristics: The Complete Collection of Cognitive Biases and Heuristics That Impair Decisions in Banking, Finance and Everything Else.[277]

Meta information on the timeline

How the timeline was built

The initial version of the timeline was written by User:Sebastian.

Funding information for this timeline is available.

Feedback and comments

Feedback for the timeline can be provided at the following places:

  • FIXME

What the timeline is still missing

  • Issa: This is probably going to take a whole bunch of work, but eventually it would be nice if the rows containing specific studies that were conducted could mention whether the study has been replicated or not.

Timeline update strategy

See also

External links

References

  1. "Every Single Cognitive Bias in One Infographic". visualcapitalist.com. Retrieved 5 December 2020. 
  2. "Cognitive biases". trends.google.com. Retrieved 15 January 2021. 
  3. "Google Books Ngram Viewer". books.google.com. Retrieved 28 January 2021. 
  4. "Cognitive biases". wikipediaviews.org. Retrieved 19 January 2021. 
  5. Sextus Empiricus, "Outlines of Pyrrhonism", Book 1, Chapter 13, Section 32
  6. "Just-World Hypothesis". alleydog.com. Retrieved 7 May 2020. 
  7. Carlisle, Rodney (2004). Scientific American Inventions and Discoveries, John Wiley & Sons, Inc., New Jersey. p. 393.
  8. "What Are Clinical Trials and Studies?". National Institute on Aging. Retrieved 28 January 2021. 
  9. Chambers's Cyclopædia, Supplement, 1753 
  10. Oxford English Dictionary, 1st ed. "anthropomorphism, n." Oxford University Press (Oxford), 1885.
  11. "Anthropomorphism". britannica.com. Retrieved 7 May 2020. 
  12. Miller, Laura (2015-06-14). "Culture is dead — again". Salon. Retrieved 17 April 2018. 
  13. J.G.A. Pocock, "Between Machiavelli and Hume: Gibbon as Civic Humanist and Philosophical Historian," Daedalus 105:3 (1976), 153–169; and in Further reading: Pocock, EEG, 303–304; FDF, 304–306.
  14. "Why we feel the past is better compare to what the future holds". thedecisionlab.com. Retrieved 7 May 2020. 
  15. Barron, Greg; Leider, Stephen (13 October 2009). "The role of experience in the Gambler's Fallacy" (PDF). Journal of Behavioral Decision Making. 
  16. "The Gambler's Fallacy - Explained". thecalculatorsite.com. Retrieved 7 May 2020. 
  17. Mortell, Manfred; Balkhy, Hanan H.; Tannous, Elias B.; Jong, Mei Thiee (July 2013). "Physician 'defiance' towards hand hygiene compliance: Is there a theory–practice–ethics gap?". Journal of the Saudi Heart Association. 25 (3): 203–208. PMC 3809478. PMID 24174860. doi:10.1016/j.jsha.2013.04.003. 
  18. "Semmelweis Reflex (Semmelweis Effect)". alleydog.com. Retrieved 7 May 2020. 
  19. "Bandwagon Effect". Retrieved 2007-03-09. 
  20. "The Bandwagon Effect". psychologytoday.com. Retrieved 7 May 2020. 
  21. 21.0 21.1 "Stereotypes Defined". stereotypeliberia.wordpress.com. Retrieved 10 April 2020. 
  22. Oxford Languages
  23. Fechner, Gustav Theodor (1966) [First published 1860]. Howes, D H; Boring, E G, eds. Elements of psychophysics [Elemente der Psychophysik]. volume 1. Translated by Adler, H E. United States of America: Holt, Rinehart and Winston. 
  24. "Weber's law". britannica.com. Retrieved 7 May 2020. 
  25. Sibbald, M.D. "Report on the Progress of Psychological Medicine; German Psychological Literature", The Journal of Mental Science, Volume 13. 1867. p. 238
  26. "pareidolia". merriam-webster.com. Retrieved 7 May 2020. 
  27. Brian Righi. (2008). Chapter 4: Talking Boards and Ghostly Goo. In Ghosts, Apparitions and Poltergeists. Llewellyn Publications. "An early example of this occurred in 1874 with the medium William Stainton Moses, who communicated with the spirits of two brothers who had recently died in India. Upon investigation, it was discovered that one week prior to the séance, their obituary had appeared in the newspaper. This was of some importance because Moses's communications with the two spirits contained nothing that wasn't already printed in the newspaper. When the spirits were pressed for further information, they were unable to provide any. Researchers concluded that Moses had seen the obituary, forgotten it, and then resurfaced the memory during the séance."
  28. Robert Todd Carroll. (2014). "Cryptomnesia". The Skeptic's Dictionary. Retrieved 2014-07-12.
  29. "cryptomnesia". dictionary.apa.org. Retrieved 7 May 2020. 
  30. "Mere Exposure Effect" (PDF). wiwi.europa-uni.de. Retrieved 10 April 2020. 
  31. "6 Conversion Principles You Can Learn From The Mere-Exposure Effect". marketingland.com. Retrieved 7 May 2020. 
  32. 32.0 32.1 "List of cognitive biases". uxinlux.github.io. Retrieved 25 July 2021. 
  33. Anonymous (E. Robert Kelly, 1882) The Alternative: A Study in Psychology. London: Macmillan and Co. p. 168.
  34. 34.0 34.1 Andersen H, Grush R (2009). "A brief history of time-consciousness: historical precursors to James and Husserl" (PDF). Journal of the History of Philosophy. 47 (2): 277–307. doi:10.1353/hph.0.0118. 
  35. James W (1893). The principles of psychology. New York: H. Holt and Company. p. 609. 
  36. Vlach, Haley A.; Sandhofer, Catherine M. "Distributing Learning Over Time: The Spacing Effect in Children's Acquisition and Generalization of Science Concepts". PMC 3399982. PMID 22616822. doi:10.1111/j.1467-8624.2012.01781.x. 
  37. James, W. (1890). Principles of Psychology. Retrieved from http://psychclassics.yorku.ca/James/Principles/
  38. Brown, Roger; McNeill, David. "The "tip of the tongue" phenomenon". doi:10.1016/S0022-5371(66)80040-3. 
  39. Bauer, P (2004). "Oh where, oh where have those early memories gone? A developmental perspective on childhood amnesia". Psychological Science Agenda. 18 (12). 
  40. "Childhood Amnesia". sciencedirect.com. Retrieved 7 May 2020. 
  41. "bandwagon effect". merriam-webster.com. Retrieved 7 April 2020. 
  42. "Bandwagon Effect - Biases & Heuristics". The Decision Lab. Retrieved 26 January 2021. 
  43. Sumner, William Graham. (1906). Folkways: A Study of the Social Importance of Usages, Manners, Customs, Mores, and Morals. Boston, MA: Ginn.
  44. Everett, Jim A. C.; Faber, Nadira S.; Crockett, Molly. "Preferences and beliefs in ingroup favoritism". PMC 4327620. PMID 25762906. doi:10.3389/fnbeh.2015.00015. 
  45. Abbott, Edwina (1909). "On the analysis of the factors of recall in the learning process". Psychological Monographs: General and Applied. 11 (1): 159–177. doi:10.1037/h0093018 – via Ovid. 
  46. Larsen, Douglas P.; Butler, Andrew C. (2013). Walsh, K., ed. Test-enhanced learning. In Oxford Textbook of Medical Education. pp. 443–452. 
  47. Goldstein, E. Bruce. Cognitive Psychology: Connecting Mind, Research and Everyday Experience. Cengage Learning. ISBN 978-1-133-00912-2. 
  48. "Why we gamble like monkeys". BBC.com. 2015-01-02. 
  49. "Gambler's Fallacy". investopedia.com. Retrieved 7 May 2020. 
  50. Feingold, CA (1914). "The influence of environment on identification of persons and things". Journal of Criminal Law and Police Science. 5 (1): 39–51. JSTOR 1133283. doi:10.2307/1133283. 
  51. Laub, Cindy E.; Meissner, Christian A.; Susa, Kyle J. "The Cross-Race Effect: Resistant to Instructions". doi:10.1155/2013/745836. 
  52. The Advanced Dictionary of Marketing, Scott G. Dacko, 2008: Marketing. Oxford: Oxford University Press. 2008-06-18. p. 248. ISBN 9780199286003. 
  53. 53.0 53.1 Thorndike 1920
  54. Sigall, Harold; Ostrove, Nancy (1975-03-01). "Beautiful but Dangerous: Effects of Offender Attractiveness and Nature of the Crime on Juridic Judgment". Journal of Personality and Social Psychology. 31 (3): 410–414. doi:10.1037/h0076472. 
  55. "Halo effect". britannica.com. Retrieved 7 May 2020. 
  56. "Definition of STEREOTYPE". www.merriam-webster.com. Retrieved 28 January 2021. 
  57. "Bluma Wulfovna Zeigarnik". The Science of Psychotherapy. 31 March 2014. Retrieved 16 March 2021. 
  58. Zeigarnik 1927: "Das Behalten erledigter und unerledigter Handlungen". Psychologische Forschung 9, 1-85.
  59. Zeigarnik 1927: "Das Behalten erledigter und unerledigter Handlungen". Psychologische Forschung 9, 1-85.
  60. "Zeigarnik Effect". goodtherapy.org. Retrieved 7 May 2020. 
  61. Fisher, Irving (1928), The Money Illusion, New York: Adelphi Company 
  62. Liberto, Daniel. "Money Illusion Definition". Investopedia. Retrieved 26 January 2021. 
  63. "The Specious Present: Andrew Beck, David Claerbout, Colin McCahon, Keith Tyson - Announcements - Art & Education". www.artandeducation.net. Retrieved 27 January 2021. 
  64. Fleming, G. W. T. H. (January 1933). "The Learning and Retention of Pleasant and Unpleasant Activities. (Arch. of Psychol., No. 134, 1932.) Cason, H.". Journal of Mental Science. 79 (324): 187–188. ISSN 0368-315X. doi:10.1192/bjp.79.324.187-c. 
  65. Skowronski, John J.; Walker, W. Richard; Henderson, Dawn X.; Bond, Gary D. "Chapter Three - The Fading Affect Bias: Its History, Its Implications, and Its Future". doi:10.1016/B978-0-12-800052-6.00003-2. 
  66. von Restorff, Hedwig (1933). "Über die Wirkung von Bereichsbildungen im Spurenfeld" [The effects of field formation in the trace field]. Psychologische Forschung [Psychological Research] (in Deutsch). 18 (1): 299–342. doi:10.1007/BF02409636. 
  67. "The Von Restorff effect". lawsofux.com. Retrieved 7 May 2020. 
  68. "The Einstellung Effect - Thinking Differently". Exploring your mind. 27 January 2020. Retrieved 18 April 2021. 
  69. "Einstellung Effect definition | Psychology Glossary | alleydog.com". www.alleydog.com. Retrieved 17 May 2021. 
  70. Duncker, K. (1945). "On problem solving". Psychological Monographs, 58:5 (Whole No. 270).
  71. "Functional fixedness". britannica.com. Retrieved 7 May 2020. 
  72. Batsidis, Apostolos; Tzavelas, George; Alexopoulos, Panagiotis. "Berkson's paradox and weighted distributions: An application to Alzheimer's disease". 
  73. "Berkson's Paradox (Berkson's Bias)". alleydog.com. Retrieved 14 August 2020. 
  74. Johnson, J. (2011). The arithmetic of compassion: rethinking the politics of photography. British Journal of Political Science, 41(3), 621-643. doi: 10.1017/S0007123410000487.
  75. "Joseph Stalin - Wikiquote". en.wikiquote.org. Retrieved 17 May 2021. 
  76. "Compassion fade". econowmics.com. Retrieved 15 January 2021. 
  77. Whyte, W. H., Jr. (March 1952). "Groupthink". Fortune. pp. 114–117, 142, 146. 
  78. Safire, William (8 August 2004). "THE WAY WE LIVE NOW: 8-8-04: ON LANGUAGE; Groupthink (Published 2004)". The New York Times. Retrieved 14 March 2021. 
  79. "The Psychology Behind Why We Strive for Consensus". Verywell Mind. 
  80. Festinger L (1954). "A theory of social comparison processes". Human Relations. 7 (2): 117–140. doi:10.1177/001872675400700202. 
  81. "Social Comparison Theory". psychologytoday.com. Retrieved 7 May 2020. 
  82. Meehl, Paul E. (1956). "Wanted – A Good Cookbook". American Psychologist. 11 (6): 263–272. doi:10.1037/h0044164. 
  83. Dutton, D. L. (1988). "The cold reading technique". Experientia. 44 (4): 326–332. PMID 3360083. doi:10.1007/BF01961271. 
  84. "Barnum Effect". britannica.com. Retrieved 7 May 2020. 
  85. Parkinson, C. Northcote (1958). Parkinson's Law, or the Pursuit of Progress. John Murray. ISBN 0140091076. 
  86. "How to Handle Bikeshedding: Parkinson's Law of Triviality". projectbliss.net. Retrieved 7 May 2020. 
  87. "The Curious Case of Confirmation Bias". psychologytoday.com. Retrieved 7 April 2020. 
  88. Acks, Alex. The Bubble of Confirmation Bias. 
  89. Myers, David G. Psychology. 
  90. "Confirmation Bias". simplypsychology.org. Retrieved 14 August 2020. 
  91. "The Curious Case of Confirmation Bias". psychologytoday.com. Retrieved 14 August 2020. 
  92. "Cognitive Bias in Decision Making". associationanalytics.com. Retrieved 7 May 2020. 
  93. Ellis RM (2015). Middle Way Philosophy: Omnibus Edition. Lulu Press. ISBN 9781326351892. 
  94. "Authority Bias". alleydog.com. Retrieved 14 August 2020. 
  95. Borcherding, Katrin; Laričev, Oleg Ivanovič; Messick, David M. (1990). Contemporary Issues in Decision Making. North-Holland. p. 50. ISBN 978-0-444-88618-7. 
  96. "Why we prefer options that are known to us". thedecisionlab.com. Retrieved 14 August 2020. 
  97. 97.0 97.1 Rubin, David C.; Baddeley, Alan D. (1989). "Telescoping is not time compression: A model". Memory & Cognition. 17 (6): 653–661. PMID 2811662. doi:10.3758/BF03202626. 
  98. "Telescoping effect - Biases & Heuristics". The Decision Lab. Retrieved 26 January 2021. 
  99. Abraham Kaplan (1964). The Conduct of Inquiry: Methodology for Behavioral Science. San Francisco: Chandler Publishing Co. p. 28. ISBN 9781412836296. 
  100. "Law of the instrument - Biases & Heuristics". The Decision Lab. Retrieved 27 January 2021. 
  101. Walster, Elaine (1966). "Assignment of responsibility for an accident.". Journal of Personality and Social Psychology. 3 (1): 73–79. doi:10.1037/h0022733. 
  102. "Defensive Attribution Hypothesis definition | Psychology Glossary | alleydog.com". www.alleydog.com. Retrieved 29 January 2021. 
  103. Adams, John (1985). Risk and Freedom: Record of Road Safety Regulation. Brefi Press. ISBN 9780948537059. 
  104. Flock, Elizabeth (2012-02-17). "Dagen H: The day Sweden switched sides of the road". Washington Post. On the day of the change, only 150 minor accidents were reported. Traffic accidents over the next few months went down. ... By 1969, however, accidents were back at normal levels 
  105. "On September 4 there were 125 reported traffic accidents as opposed to 130-196 from the previous Mondays. No traffic fatalities were linked to the switch. In fact, fatalities dropped for two years, possibly because drivers were more vigilant after the switch." Sweden finally began driving on the right side of the road in 1967 The Examiner Sept 2, 2009
  106. Mok, D; Gore, G; Hagel, B; Mok, E; Magdalinos, H; Pless, B. "Risk compensation in children's activities: A pilot study". PMC 2721187. PMID 19657519. doi:10.1093/pch/9.5.327. 
  107. Chapman, L (1967). "Illusory correlation in observational report". Journal of Verbal Learning and Verbal Behavior. 6 (1): 151–155. doi:10.1016/S0022-5371(67)80066-5. 
  108. Chapman, L.J (1967). "Illusory correlation in observational report". Journal of Verbal Learning. 6: 151–155. doi:10.1016/s0022-5371(67)80066-5. 
  109. "Illusory Correlation". psychology.iresearchnet.com. Retrieved 17 July 2020. 
  110. Jones, E. E.; Harris, V. A. (1967). "The attribution of attitudes". Journal of Experimental Social Psychology. 3 (1): 1–24. doi:10.1016/0022-1031(67)90034-0. 
  111. Ross, L. (1977). "The intuitive psychologist and his shortcomings: Distortions in the attribution process". In Berkowitz, L. Advances in experimental social psychology. 10. New York: Academic Press. pp. 173–220. ISBN 978-0-12-015210-0. 
  112. "Fundamental Attribution Error". simplypsychology.org. Retrieved 7 May 2020. 
  113. Edwards, Ward. "Conservatism in Human Information Processing (excerpted)". In Daniel Kahneman, Paul Slovic and Amos Tversky. (1982). Judgment under uncertainty: Heuristics and biases. New York: Cambridge University Press. Original work published 1968.
  114. "Conservatism Bias". dwassetmgmt.com. Retrieved 8 May 2020. 
  115. "Statistics How To". statisticshowto.com. Retrieved 7 April 2020. 
  116. "Pygmalion Effect". alleydog.com. Retrieved 7 May 2020. 
  117. "To Become Super-Likable, Practice "The Ben Franklin Effect"". medium.com. Retrieved 13 March 2020. 
  118. "Ben Franklin Effect". alleydog.com. Retrieved 7 May 2020. 
  119. "The suffix effect: How many positions are involved?" (PDF). link.springer.com. Retrieved 5 May 2020. 
  120. "Two-component theory of the suffix effect: Contrary evidence". link.springer.com. Retrieved 16 July 2020. 
  121. Malle, BF. "The actor-observer asymmetry in attribution: a (surprising) meta-analysis.". PMID 17073526. doi:10.1037/0033-2909.132.6.895. 
  122. "The actor-observer asymmetry in attribution: A (surprising) meta-analysis.". psycnet.apa.org. Retrieved 7 May 2020. 
  123. "Cognitive Bias: How Your Mind Plays Tricks on You and How to Overcome That at Work". zapier.com. Retrieved 15 January 2021. 
  124. "Cognitive Bias". sciencedirect.com. Retrieved 16 January 2021. 
  125. Fischhoff, B (2007). "An early history of hindsight research". Social Cognition. 25: 10–13. doi:10.1521/soco.2007.25.1.10. 
  126. "Hindsight bias". Encyclopedia Britannica. Retrieved 27 January 2021. 
  127. "Why are we overconfident in our predictions?". thedecisionlab.com. Retrieved 10 April 2020. 
  128. "Illusion Of Validity". alleydog.com. Retrieved 7 May 2020. 
  129. Brenner, Malcolm (1973). "The next-in-line effect" (PDF). Journal of Verbal Learning and Verbal Behavior. 12 (3): 320–323. doi:10.1016/s0022-5371(73)80076-3. 
  130. "Memory Flashcards". Quizlet. Retrieved 27 January 2021. 
  131. Loftus, Elizabeth F.; Palmer, John C. (1974). "Reconstruction of automobile destruction: An example of the interaction between language and memory". Journal of Verbal Learning and Verbal Behavior. 13 (5): 585–589. doi:10.1016/s0022-5371(74)80011-3. 
  132. "False memory". scholarpedia.org. Retrieved 14 August 2020. 
  133. Ralph, Kelcie; Delbosc, Alexa. "I'm multimodal, aren't you? How ego-centric anchoring biases experts' perceptions of travel patterns". doi:10.1016/j.tra.2017.04.027. 
  134. "Anchoring Bias - Definition, Overview and Examples". Corporate Finance Institute. Retrieved 27 January 2021. 
  135. Larson, James; Rutger U; Douglass Coll (1977). "Evidence for a self-serving bias in the attribution of causality". Journal of Personality. 45 (3): 430–441. doi:10.1111/j.1467-6494.1977.tb00162.x. 
  136. "What Is a Self-Serving Bias and What Are Some Examples of It?". healthline.com. Retrieved 7 May 2020. 
  137. Staw, Barry M. (1976). "Knee-deep in the big muddy: a study of escalating commitment to a chosen course of action". Organizational Behavior and Human Performance. 16 (1): 27–44. doi:10.1016/0030-5073(76)90005-2. 
  138. "Escalation of Commitment: Definition, Causes & Examples". bizfluent.com. Retrieved 7 May 2020. 
  139. Duncan, B. L. (1976). "Differential social perception and attribution of intergroup violence: Testing the lower limits of stereotyping of Blacks". Journal of Personality and Social Psychology. 34 (4): 590–598. doi:10.1037/0022-3514.34.4.590. 
  140. "APA Dictionary of Psychology". dictionary.apa.org. Retrieved 7 May 2020. 
  141. Brown, R., Kulik J. (1977). "Flashbulb memories". Cognition. 5: 73–99. doi:10.1016/0010-0277(77)90018-X. 
  142. "Misattribution Effect". sites.google.com. Retrieved 7 May 2020. 
  143. Ross, Lee; Greene, David; House, Pamela (1977). "The "false consensus effect": An egocentric bias in social perception and attribution processes". Journal of Experimental Social Psychology. 13 (3): 279–301. doi:10.1016/0022-1031(77)90049-x. 
  144. Alicke, Mark; Largo, Edward. "The Role of Self in the False Consensus Effect". doi:10.1006/jesp.1995.1002. 
  145. "False Consensus Effect". psychology.iresearchnet.com. Retrieved 14 January 2021. 
  146. "APA Dictionary of Psychology". dictionary.apa.org. Retrieved 29 January 2021. 
  147. Hasher, Lynn; Goldstein, David; Toppino, Thomas (1977). "Frequency and the conference of referential validity" (PDF). Journal of Verbal Learning and Verbal Behavior. 16 (1): 107–112. doi:10.1016/S0022-5371(77)80012-1. 
  148. Newman, Eryn J.; Sanson, Mevagh; Miller, Emily K.; Quigley-Mcbride, Adele; Foster, Jeffrey L.; Bernstein, Daniel M.; Garry, Maryanne (September 6, 2014). "People with Easier to Pronounce Names Promote Truthiness of Claims". PLOS ONE. 9 (2): e88671. PMC 3935838. PMID 24586368. doi:10.1371/journal.pone.0088671. 
  149. "Illusory Truth, Lies, and Political Propaganda: Part 1". psychologytoday.com. Retrieved 7 May 2020. 
  150. "Self-Reference Effect". psychology.iresearchnet.com. Retrieved 12 January 2021. 
  151. Bentley, Sarah V.; Greenaway, Katharine H.; Haslam, S. Alexander. "An online paradigm for exploring the self-reference effect". doi:10.1371/journal.pone.0176611. 
  152. "Self-Reference Effect - IResearchNet". Psychology. 12 January 2016. Retrieved 10 May 2021. 
  153. Zaragoza, Maria S.; Belli, Robert F.; Payment, Kristie E. "Misinformation Effects and the Suggestibility of Eyewitness Memory". 
  154. "What Is Misinformation Effect?". growthramp.io. Retrieved 7 May 2020. 
  155. Rudy Hiller, Fernando. "How to (dis)solve Nagel's paradox about moral luck and responsibility". doi:10.1590/0100-6045.2016.V39N1.FRH. 
  156. "Moral Luck". philpapers.org. Retrieved 7 May 2020. 
  157. Pettigrew, T. F. (1979). "The ultimate attribution error: Extending Allport's cognitive analysis of prejudice". Personality and Social Psychology Bulletin. 5 (4): 461–476. doi:10.1177/014616727900500407. 
  158. Fraser Pettigrew, Thomas. "The Ultimate Attribution Error: Extending Allport's Cognitive Analysis of Prejudice". doi:10.1177/014616727900500407. 
  159. "Loss aversion". behavioraleconomics.com. Retrieved 14 August 2020. 
  160. "Why is the pain of losing felt twice as powerfully compared to equivalent gains?". thedecisionlab.com. Retrieved 14 August 2020. 
  161. Pezzo, Mark V.; Litman, Jordan A.; Pezzo, Stephanie P. (2006). "On the distinction between yuppies and hippies: Individual differences in prediction biases for planning future tasks". Personality and Individual Differences. 41 (7): 1359–1371. ISSN 0191-8869. doi:10.1016/j.paid.2006.03.029. 
  162. Kahneman, Daniel; Tversky, Amos (1977). "Intuitive prediction: Biases and corrective procedures" (PDF). Decision Research Technical Report PTR-1042-77-6. Reprinted in Kahneman, Daniel; Slovic, Paul; Tversky, Amos, eds. (1982). Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press. pp. 414–421. ISBN 978-0511809477. doi:10.1017/CBO9780511809477.031. 
  163. Buehler, Roger; Griffin, Dale; Peetz, Johanna. "Chapter One - The Planning Fallacy: Cognitive, Motivational, and Social Origins". doi:10.1016/S0065-2601(10)43001-4. 
  164. Goleman, Daniel (1984-06-12). "A bias puts self at center of everything". The New York Times. Retrieved 2016-12-09. 
  165. "The Egocentric Bias: Why It's Hard to See Things from a Different Perspective". effectiviology.com. Retrieved 16 July 2020. 
  166. Hamill, Ruth; Wilson, Timothy D.; Nisbett, Richard E. (1980). "Insensitivity to sample bias: Generalizing from atypical cases" (PDF). Journal of Personality and Social Psychology. 39 (4): 578–589. doi:10.1037/0022-3514.39.4.578. 
  167. "group attribution error". dictionary.apa.org. Retrieved 14 August 2020. 
  168. Frazier, Kendrick (1986). Science Confronts the Paranormal. Prometheus Books. p. 101. 
  169. "Subjective Validation". alleydog.com. Retrieved 14 August 2020. 
  170. "Understanding the Optimism Bias". verywellmind.com. Retrieved 15 January 2021. 
  171. "Optimism Bias - Biases & Heuristics". The Decision Lab. Retrieved 28 January 2021. 
  172. "Framing Effect - an overview | ScienceDirect Topics". www.sciencedirect.com. Retrieved 29 January 2021. 
  173. "Why do our decisions depend on how options are presented to us?". thedecisionlab.com. Retrieved 16 January 2021. 
  174. Tversky, A; Kahneman, D (30 January 1981). "The framing of decisions and the psychology of choice". Science. 211 (4481): 453–458. doi:10.1126/science.7455683. 
  175. "Pseudocertainty effect". wiwi.europa-uni.de. Retrieved 14 August 2020. 
  176. Kammer, D. (1982). "Differences in trait ascriptions to self and friend: Unconfounding intensity from variability". Psychological Reports. 51 (1): 99–102. doi:10.2466/pr0.1982.51.1.99. 
  177. "Trait Ascription Bias". alleydog.com. Retrieved 14 August 2020. 
  178. "Decoy Effect definition". tactics.convertize.com. Retrieved 14 January 2021. 
  179. Mortimer, Gary. "The decoy effect: how you are influenced to choose without really knowing it". The Conversation. Retrieved 29 January 2021. 
  180. "Third-Person Effect". Encyclopedia of Survey Research Methods. 2008. doi:10.4135/9781412963947.n582. 
  181. Conners, Joan L. "Understanding the Third-Person Effect" (PDF). 
  182. "Third-Person Effect". alleydog.com. Retrieved 7 May 2020. 
  183. Hakim, Catherine. Models of the Family in Modern Societies: Ideals and Realities. 
  184. "Courtesy Bias". alleydog.com. Retrieved 14 August 2020. 
  185. "Disposition Effect". Behavioural Finance. Retrieved 11 January 2017. 
  186. "Disposition effect". behavioraleconomics.com. Retrieved 16 July 2020. 
  187. Miller, Joshua; Sanjurjo, Adam (The Conversation US). "Momentum Isn't Magic—Vindicating the Hot Hand with the Mathematics of Streaks". Scientific American. Retrieved 16 June 2021. 
  188. "Hot Hand Effect". psychology.iresearchnet.com. Retrieved 16 July 2020. 
  189. Geraci, Lisa; McDaniel, Mark A.; Miller, Tyler M.; Hughes, Matthew L. (2013-11-01). "The bizarreness effect: evidence for the critical influence of retrieval processes". Memory & Cognition. pp. 1228–1237. doi:10.3758/s13421-013-0335-4. 
  190. Iaccino, J. F.; Sowa, S. J. (February 1989). "Bizarre imagery in paired-associate learning: an effective mnemonic aid with mixed context, delayed testing, and self-paced conditions". Percept mot Skills. 68 (1): 307–16. PMID 2928063. doi:10.2466/pms.1989.68.1.307. 
  191. "The imagery bizarreness effect as a function of sentence complexity and presentation time" (PDF). link.springer.com. Retrieved 18 June 2021. 
  192. "Bizarreness effect". britannica.com. Retrieved 16 July 2020. 
  193. Lee Ross, Constance A. Stillinger, "Psychological barriers to conflict resolution", Stanford Center on Conflict and Negotiation, Stanford University, 1988, p. 4
  194. "Why we often tend to devalue proposals made by people who we consider to be adversaries". thedecisionlab.com. Retrieved 22 September 2020. 
  195. Samuelson, W.; Zeckhauser, R. (1988). "Status quo bias in decision making". Journal of Risk and Uncertainty. 1: 7–59. doi:10.1007/bf00055564. 
  196. "Status Quo Bias: What It Means and How It Affects Your Behavior". thoughtco.com. Retrieved 22 September 2020. 
  197. "The Curse of Knowledge: What It Is and How to Account for It". effectiviology.com. Retrieved 6 May 2020. 
  198. Atladóttir, Kristín. "The Endowment Effect and other biases in creative goods transactions" (PDF). ISSN 1670-8288. 
  199. Bruno, Michael A. "256 Shades of gray: uncertainty and diagnostic error in radiology". doi:10.1515/dx-2017-0006. 
  200. Ashman, C. J.; Yu, J. S.; Wolfman, D. (August 2000). "Satisfaction of search in osteoradiology". AJR. American journal of roentgenology. 175 (2): 541–544. ISSN 0361-803X. doi:10.2214/ajr.175.2.1750541. Retrieved 27 January 2021. 
  201. "Cognitive biases". catalog.library.vanderbilt.edu. Retrieved 25 July 2021. 
  202. "Self-Enhancement and Superiority Biases in Social Comparison". researchgate.net. Retrieved 14 August 2020. 
  203. "Illusory Superiority". alleydog.com. Retrieved 7 May 2020. 
  204. "The Courtesy Bias". smallbusinessforum.co. Retrieved 14 August 2020. 
  205. ""Women Are Wonderful" Effect". scribd.com. Retrieved 10 April 2020. 
  206. ""women are wonderful" effect". crazyfacts.com. Retrieved 18 July 2020. 
  207. Tversky, Amos; Koehler, Derek J. (October 1994). "Support theory: A nonextensional representation of subjective probability.". Psychological Review. 101 (4): 547–567. doi:10.1037/0033-295X.101.4.547. 
  208. "Today's term from psychology is Subadditivity Effect.". steemit.com. Retrieved 7 May 2020. 
  209. "PROJECT IMPLICIT LECTURES AND WORKSHOPS". projectimplicit.net. Retrieved 12 March 2020. 
  210. "Implicit Bias". plato.stanford.edu. Retrieved 8 May 2020. 
  211. Kahneman, D. & Tversky, A. (1996). "On the reality of cognitive illusions" (PDF). Psychological Review. 103 (3): 582–591. PMID 8759048. doi:10.1037/0033-295X.103.3.582. 
  212. S.X. Zhang; J. Cueto (2015). "The Study of Bias in Entrepreneurship". Entrepreneurship Theory and Practice. 41 (3): 419–454. doi:10.1111/etap.12212. 
  213. Medway, Dominic; Foos, Adrienne; Goatman, Anna. "Impact bias in student evaluations of higher education". Studies in Higher Education. doi:10.1080/03075079.2015.1071345. Retrieved 7 May 2020. 
  214. Medway, Dominic; Foos, Adrienne; Goatman, Anna. "Impact bias in student evaluations of higher education". Studies in Higher Education. doi:10.1080/03075079.2015.1071345. Retrieved 7 May 2020. 
  215. Greenwald, Anthony G.; McGhee, Debbie E.; Schwartz, Jordan L.K. (1998), "Measuring Individual Differences in Implicit Cognition: The Implicit Association Test", Journal of Personality and Social Psychology, 74 (6): 1464–1480, PMID 9654756, doi:10.1037/0022-3514.74.6.1464 
  216. "The Implicit Association Test (IAT) - iMotions". Imotions Publish. 15 December 2020. Retrieved 17 May 2021. 
  217. "Implicit Association Test". www.projectimplicit.net. Retrieved 17 May 2021. 
  218. Hsee, Christopher K. (1998). "Less Is Better: When Low-value Options Are Valued More Highly than High-value Options" (PDF). Journal of Behavioral Decision Making. 11 (2): 107–121. doi:10.1002/(SICI)1099-0771(199806)11:2<107::AID-BDM292>3.0.CO;2-Y. 
  219. "Why we prefer the smaller or the lesser alternative". thedecisionlab.com. Retrieved 7 May 2020. 
  220. Kruger, Justin; Dunning, David (1999). "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments". Journal of Personality and Social Psychology. 77 (6): 1121–1134. PMID 10626367. doi:10.1037/0022-3514.77.6.1121. 
  221. "Dunning-Kruger Effect". psychologytoday.com. Retrieved 14 August 2020. 
  222. Gilovich, T.; Medvec, V. H.; Savitsky, K. (2000). "The spotlight effect in social judgment: An egocentric bias in estimates of the salience of one's own actions and appearance" (PDF). Journal of Personality and Social Psychology. 78 (2): 211–222. PMID 10707330. doi:10.1037//0022-3514.78.2.211. 
  223. "The Spotlight Effect". psychologytoday.com. Retrieved 14 August 2020. 
  224. Kruger, Justin; Gilovich, Thomas (1999). "'Naive cynicism' in everyday theories of responsibility assessment: On biased assumptions of bias.". Journal of Personality and Social Psychology. 76 (5): 743–753. doi:10.1037/0022-3514.76.5.743. 
  225. "Naive Cynicism". psychology.iresearchnet.com. Retrieved 16 July 2020. 
  226. Kahneman, Daniel; Frederick, Shane (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Thomas Gilovich; Dale Griffin; Daniel Kahneman. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 49–81. ISBN 978-0-521-79679-8. 
  227. "Attribute substitution- a quick guide". biasandbelief.wordpress.com. Retrieved 7 May 2020. 
  228. Gearon, Michael (12 February 2019). "Cognitive Biases — Framing effect". Medium. Retrieved 6 March 2021. 
  229. "Definition". tactics.convertize.com. Retrieved 6 March 2021. 
  230. Pronin, Emily; Lin, Daniel Y.; Ross, Lee. "The Bias Blind Spot: Perceptions of Bias in Self Versus Others". doi:10.1177/0146167202286008. 
  231. Garcia, S.M.; Weaver, K.; Darley, J.M.; Moskowitz, G.B. (2002). "Crowded minds: the implicit bystander effect". Journal of Personality and Social Psychology. 83 (4): 843–853. PMID 12374439. doi:10.1037/0022-3514.83.4.843. 
  232. "Bystander Effect". psychologytoday.com. Retrieved 7 May 2020. 
  233. "Kahneman receives Nobel Prize at ceremony". Princeton University. Retrieved 16 June 2021. 
  234. "Psychologist wins Nobel Prize". www.apa.org. Retrieved 16 June 2021. 
  235. Chen, James. "Prospect Theory". Investopedia. Retrieved 16 June 2021. 
  236. Frederick, Shane; Loewenstein, George; O'Donoghue, Ted (2011). "Time Discounting and Time Preference: A Critical Review". In Camerer, Colin F.; Loewenstein, George; Rabin, Matthew. Advances in Behavioral Economics. Princeton University Press. pp. 187–188. ISBN 978-1400829118. 
  237. "Projection bias". behavioraleconomics.com. Retrieved 7 May 2020. 
  238. Lovallo, Dan; Kahneman, Daniel (July 2003). "Delusions of Success: How Optimism Undermines Executives' Decisions". Harvard Business Review. 81 (7): 56–63. PMID 12858711. 
  239. Buehler, Roger; Griffin, Dale; Peetz, Johanna (2010). "The Planning Fallacy". Advances in Experimental Social Psychology. 43: 1–62. doi:10.1016/S0065-2601(10)43001-4. 
  240. Kim, S.; Goldstein, D.; Hasher, L.; Zacks, R. T. (1 July 2005). "Framing Effects in Younger and Older Adults". The Journals of Gerontology Series B: Psychological Sciences and Social Sciences. 60 (4): P215–P218. doi:10.1093/geronb/60.4.P215. 
  241. "4 examples of herd mentality (and how to take advantage of it)". iwillteachyoutoberich.com. Retrieved 27 January 2021. 
  242. Pohl, Rüdiger; Pohl, Rüdiger F. (2004). Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Psychology Press. ISBN 978-1-84169-351-4. 
  243. Hsee, Christopher K.; Zhang, Jiao. "General Evaluability Theory". doi:10.1177/1745691610374586. 
  244. Marfice, Christina. "How to Use the Framing Effect to Sell More Products". www.plytix.com. Retrieved 6 March 2021. 
  245. "Overcoming Bias". overcomingbias.com. Retrieved 13 March 2020. 
  246. "The "Ostrich Effect" and the Relationship between the Liquidity and the Yields of Financial Assets". The Journal of Business. doi:10.2139/ssrn.431180. 
  247. "Ostrich Effect". thinkingcollaborative.com. Retrieved 8 May 2020. 
  248. Rickford, John R.; Wasow, Thomas; Zwicky, Arnold (2007). "Intensive and quotative all: something new, something old". American Speech. 82 (1): 3–31. doi:10.1215/00031283-2007-001. 
  249. "Climate Change 3: The Grand Narrative Availability Cascade is Making Us Stupid". americanexperiment.org. Retrieved 14 January 2021. 
  250. "APA PsycNet". psycnet.apa.org. Retrieved 28 July 2021. 
  251. Hamblin, James (November 4, 2013). "Cheerleader Effect: Why People Are More Beautiful in Groups". The Atlantic. Retrieved December 5, 2015. 
  252. Carragher, Daniel J.; Thomas, Nicole A.; Gwinn, O. Scott; Nicholls, Mike E. R. "Limited evidence of hierarchical encoding in the cheerleader effect". 
  253. "Why We Spend Coins Faster Than Bills". NPR. May 12, 2009. Retrieved 7 April 2020. 
  254. "Denomination effect". nlpnotes.com. Retrieved 7 May 2020. 
  255. "Pdf." (PDF). 
  256. "The Backfire Effect: Why Facts Don't Always Change Minds – Effectiviology". effectiviology.com. Retrieved 27 January 2021. 
  257. Ross, Lee; Lepper, Mark; Ward, Andrew (30 June 2010). "History of Social Psychology: Insights, Challenges, and Contributions to Theory and Application". Handbook of Social Psychology: socpsy001001. doi:10.1002/9780470561119.socpsy001001. 
  258. "Naive Realism". psychology.iresearchnet.com. Retrieved 17 July 2020. 
  259. Chang, Che-hsu Joe; Menéndez, Cammie Chaumont; Robertson, Michelle M.; Amick, Benjamin C.; Johnson, Peter W.; del Pino, Rosa J.; Dennerlein, Jack T. (November 2010). "Daily self-reports resulted in information bias when assessing exposure duration to computer use". American Journal of Industrial Medicine. 53 (11): 1142–1149. doi:10.1002/ajim.20878. 
  260. "Information bias". Catalog of Bias. 13 November 2019. Retrieved 25 July 2021. 
  261. "Information Bias". catalogofbias.org. Retrieved 22 September 2020. 
  262. Serfas, Sebastian (6 December 2010). Cognitive Biases in the Capital Investment Context: Theoretical Considerations and Empirical Experiments on Violations of Normative Rationality. Springer Science & Business Media. ISBN 978-3-8349-6485-4. 
  263. "Cognitive Biases — The IKEA Effect". medium.com. Retrieved 14 August 2020. 
  264. "What is the Ikea Effect?". bloomreach.com. Retrieved 7 May 2020. 
  265. "Thinking, Fast and Slow". www.goodreads.com. Retrieved 16 June 2021. 
  266. "Marketers Need To Be Aware Of Cognitive Bias". thecustomer.net. Retrieved 12 March 2020. 
  267. "Study Finds That Memory Works Differently in the Age of Google". Columbia University. July 14, 2011. 
  268. "The Google Effect and Digital Amnesia: How We Use Machines to Remember". effectiviology.com. Retrieved 16 July 2020. 
  269. Tom Chivers (2011-12-13). "An unconfirmed sighting of the elusive Higgs boson". Daily Telegraph. 
  270. "When a statistically significant observation should be overlooked.". thedecisionlab.com. Retrieved 7 May 2020. 
  271. Buonomano, Dean (11 July 2011). Brain Bugs: How the Brain's Flaws Shape Our Lives. W. W. Norton & Company. ISBN 978-0-393-08195-4. 
  272. Banaji, Mahzarin R. (18 April 2014). Blindspot: Hidden Biases of Good People. Penguin Books Limited. ISBN 978-81-8475-930-3. 
  273. Quoidbach, Jordi; Gilbert, Daniel T.; Wilson, Timothy D. (2013-01-04). "The End of History Illusion" (PDF). Science. 339 (6115): 96–98. PMID 23288539. doi:10.1126/science.1229294. Young people, middle-aged people, and older people all believed they had changed a lot in the past but would change relatively little in the future. 
  274. "Why You Won't Be the Person You Expect to Be". nytimes.com. Retrieved 7 May 2020. 
  275. "The Art of Thinking Clearly" (PDF). xqdoc.imedao.com. Retrieved 28 July 2021. 
  276. Nantchev, Adrian. 50 Cognitive Biases for an Unfair Advantage in Entrepreneurship. CreateSpace Independent Publishing Platform. ISBN 978-1-5376-0327-8. 
  277. Priest, Henry. BIASES and HEURISTICS: The Complete Collection of Cognitive Biases and Heuristics That Impair Decisions in Banking, Finance and Everything Else. Amazon Digital Services LLC - KDP Print US. ISBN 978-1-0784-3231-3.