Timeline of cognitive biases

 
* What are the different types of cognitive bias described by the timeline?
** Sort the full timeline by "Bias type".
** You will mostly see three categories: "Social bias", "Memory bias", and "Belief, decision-making and behavioral bias".
* What are some notable cases in history involving a cognitive bias?
** Sort the full timeline by "Event type" and look for the group of rows with value "Notable case".
* What are some events describing the development of a concept within the field of cognitive biases?
** Sort the full timeline by "Event type" and look for the group of rows with value "Concept development".
** You will see mentions of concepts such as {{w|stereotype}}, the {{w|Semmelweis effect}}, and the {{w|bandwagon effect}}.
* What are some illustrative pieces of research related to the field?
** Sort the full timeline by "Event type" and look for the group of rows with value "Research".
* What are some books illustrating the literature on cognitive biases?
** Sort the full timeline by "Event type" and look for the group of rows with value "Literature".
** You will see a number of notable authors, such as {{w|Daniel Kahneman}} and {{w|Irving Fisher}}, among others.
  
 
==Big picture==
 
| 1972 backward || Pre-concept development era || Multiple concepts later included within the category of cognitive biases are developed throughout history, starting with the ancient Greek philosophers.
|-
| 1972 onward || Modern period || The notion of cognitive bias is introduced by Amos Tversky and Daniel Kahneman, who in the following years would further elaborate on several different types of cognitive biases and related concepts.
|-
| 21st century || Present time || As of 2020, there are approximately 188 recognized cognitive biases.<ref>{{cite web |title=Every Single Cognitive Bias in One Infographic |url=https://www.visualcapitalist.com/every-single-cognitive-bias/ |website=visualcapitalist.com |access-date=5 December 2020}}</ref>
|-
|}
 
== Visual data ==
 
 
=== Google Trends ===
 
 
The chart below shows Google Trends data for cognitive biases (topic), from January 2004 to January 2021, when the screenshot was taken.<ref>{{cite web |title=Cognitive biases |url=https://trends.google.com/trends/explore?date=all&q=Cognitive%20biases |website=trends.google.com |access-date=15 January 2021}}</ref>
 
 
[[File:Cognitive biases gtrends.jpeg|thumb|center|800px]]
 
 
=== Google Ngram Viewer ===
 
 
The chart below shows Google Ngram Viewer data for "cognitive bias", from 1972 (when the concept was introduced) to 2019.<ref>{{cite web |title=Google Books Ngram Viewer |url=https://books.google.com/ngrams/graph?content=cognitive+bias&year_start=1972&year_end=2019&corpus=26&smoothing=3 |website=books.google.com |access-date=28 January 2021 |language=en}}</ref>
 
 
[[File:Cognitive bias ngram.png|thumb|center|800px]]
 
 
=== Wikipedia Views ===
 
 
The chart below shows pageviews of the English Wikipedia article {{w|cognitive bias}}, from July 2015 to December 2020.<ref>{{cite web |title=Cognitive biases |url=https://wikipediaviews.org/displayviewsformultiplemonths.php?page=Cognitive+biases&allmonths=allmonths-api&language=en&drilldown=all |website=wikipediaviews.org |access-date=19 January 2021}}</ref>
 
 
[[File:Cognitive biases wv.jpeg|thumb|center|600px]]
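
For readers who want to reproduce the underlying numbers rather than rely on the screenshot, the short Python sketch below queries the public Wikimedia Pageviews REST API for monthly pageviews. The article title, date range, and user-agent string are illustrative assumptions chosen to match the chart's description, not part of the original data source.

<syntaxhighlight lang="python">
# Minimal sketch: fetch monthly pageviews for an English Wikipedia article
# from the public Wikimedia Pageviews REST API. Article title, date range,
# and User-Agent below are illustrative assumptions.
import requests

API = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
       "en.wikipedia/all-access/user/{article}/monthly/{start}/{end}")


def monthly_pageviews(article="Cognitive_bias", start="2015070100", end="2020123100"):
    """Return (YYYYMM, views) pairs for the given article and date range."""
    url = API.format(article=article, start=start, end=end)
    resp = requests.get(url, headers={"User-Agent": "timeline-pageviews-example/0.1"})
    resp.raise_for_status()
    return [(item["timestamp"][:6], item["views"]) for item in resp.json()["items"]]


for month, views in monthly_pageviews():
    print(month, views)
</syntaxhighlight>

The same endpoint also supports daily granularity and other language editions by changing the corresponding path segments.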
 
  
 
==Full timeline==

! Year !! Bias type !! Event type !! Details !! Concept definition (when applicable)
|-
| c.180 CE || Social bias || Concept development || Many philosophers and social theorists observe and consider the phenomenon of belief in a just world, going back to at least as early as the [[w:Pyrrhonism|Pyrrhonist]] philosopher {{w|Sextus Empiricus}}, writing ''circa'' 180 CE, who argues against this belief.<ref>Sextus Empiricus, "Outlines of Pyrrhonism", Book 1, Chapter 13, Section 32</ref> || "The {{w|just-world hypothesis}} is the belief that people get what they deserve since life is fair."<ref>{{cite web |title=Just-World Hypothesis |url=https://www.alleydog.com/glossary/definition.php?term=Just-World+Hypothesis |website=alleydog.com |accessdate=7 May 2020}}</ref>
 
|-
| 1747 || || Research || Scottish doctor {{w|James Lind}} conducts the first systematic [[w:Controlled experiment|clinical trial]].<ref>Carlisle, Rodney (2004). ''Scientific American Inventions and Discoveries'', John Wiley & Sons, Inc., New Jersey. p. 393.</ref> || "Clinical trials are research studies performed in people that are aimed at evaluating a medical, surgical, or behavioral intervention."<ref>{{cite web |title=What Are Clinical Trials and Studies? |url=https://www.nia.nih.gov/health/what-are-clinical-trials-and-studies |website=National Institute on Aging |access-date=28 January 2021 |language=en}}</ref>
 
|-
| 1753 || || Concept development || {{w|Anthropomorphism}} is first attested, originally in reference to the {{w|heresy}} of applying a human form to the [[w:Christianity|Christian]] [[w:God the Father|God]].<ref>{{citation |date=1753 |title=Chambers's Cyclopædia, Supplement }}</ref><ref name=oed>''Oxford English Dictionary'', 1st ed. "anthropomorphism, ''n.''" Oxford University Press (Oxford), 1885.</ref> || Anthropomorphism is "the interpretation of nonhuman things or events in terms of human characteristics".<ref>{{cite web |title=Anthropomorphism |url=https://www.britannica.com/topic/anthropomorphism |website=britannica.com |accessdate=7 May 2020}}</ref>
 
|-
| 1776–1799 || || Concept development || The {{w|declinism}} belief is traced back to {{w|Edward Gibbon}}'s work ''{{w|The History of the Decline and Fall of the Roman Empire}}'',<ref name="Salon1">{{cite web|last1=Miller|first1=Laura|title=Culture is dead — again|url=https://www.salon.com/2015/06/14/culture_is_dead_%E2%80%94_again_its_the_end_of_civilization_as_we_know_it_and_maybe_we_feel_fine/|website=Salon|accessdate=17 April 2018|date=2015-06-14}}</ref> where Gibbon argues that Rome collapsed due to the gradual loss of {{w|civic virtue}} among its citizens.<ref>J.G.A. Pocock, "Between Machiavelli and Hume: Gibbon as Civic Humanist and Philosophical Historian," ''Daedalus'' 105:3 (1976), 153–169.</ref> || Declinism is "the tendency to believe that the worst is to come".<ref>{{cite web |title=Why we feel the past is better compare to what the future holds |url=https://thedecisionlab.com/biases/declinism/ |website=thedecisionlab.com |accessdate=7 May 2020}}</ref>
 
|-
| 1796 || || Literature || French scholar {{w|Pierre-Simon Laplace}} describes in ''A Philosophical Essay on Probabilities'' the ways in which men calculate their probability of having sons: "I have seen men, ardently desirous of having a son, who could learn only with anxiety of the births of boys in the month when they expected to become fathers. Imagining that the ratio of these births to those of girls ought to be the same at the end of each month, they judged that the boys already born would render more probable the births next of girls." The expectant fathers feared that if more sons were born in the surrounding community, then they themselves would be more likely to have a daughter. This essay by Laplace is regarded as one of the earliest descriptions of the {{w|gambler's fallacy}}.<ref name="BarronLeider2010">{{cite journal|last1=Barron|first1=Greg|last2=Leider|first2=Stephen|title=The role of experience in the Gambler's Fallacy|journal=Journal of Behavioral Decision Making|url=http://www-personal.umich.edu/~leider/Papers/Gamblers_Fallacy.pdf|date=13 October 2009}}</ref> || "The Gambler's Fallacy is the misconception that something that has not happened for a long time has become 'overdue', such a coin coming up heads after a series of tails."<ref>{{cite web |title=The Gambler's Fallacy - Explained |url=https://www.thecalculatorsite.com/articles/finance/the-gamblers-fallacy.php |website=thecalculatorsite.com |accessdate=7 May 2020}}</ref>
 
|-
| 1847 || || Concept development || Hungarian physician {{w|Ignaz Semmelweis}} discovers that hand washing and disinfection in hospitals dramatically reduce infection and death in patients. His hand-washing suggestions are initially rejected by his contemporaries, often for non-medical reasons. This would give rise to the concept of the {{w|Semmelweis effect}}, a metaphor for the {{w|reflex}}-like tendency to reject new evidence or new knowledge because it contradicts established norms, beliefs, or {{w|paradigm}}s.<ref>{{cite journal|last1=Mortell|first1=Manfred|last2=Balkhy|first2=Hanan H.|last3=Tannous|first3=Elias B.|last4=Jong|first4=Mei Thiee|title=Physician ‘defiance’ towards hand hygiene compliance: Is there a theory–practice–ethics gap?|journal=Journal of the Saudi Heart Association|date=July 2013|volume=25|issue=3|pages=203–208|doi=10.1016/j.jsha.2013.04.003|pmc=3809478|pmid=24174860}}</ref> || Semmelweis effect "refers to the tendency to automatically reject new information or knowledge because it contradicts current thinking or beliefs."<ref>{{cite web |title=Semmelweis Reflex (Semmelweis Effect) |url=https://www.alleydog.com/glossary/definition.php?term=Semmelweis+Reflex+%28Semmelweis+Effect%29 |website=alleydog.com |accessdate=7 May 2020}}</ref>
 
|-
| 1848 || Social (conformity bias) || Concept development || The phrase "jump on the bandwagon" first appears in American politics when entertainer {{w|Dan Rice}} uses his bandwagon and its music to gain attention for his political campaign appearances. As his campaign becomes more successful, other politicians would strive for a seat on the bandwagon, hoping to be associated with his success. This prefigures the term {{w|bandwagon effect}}, which is coined in the early 20th century.<ref>{{cite web |url=http://www.wordwizard.com/phpbb3/viewtopic.php?f=7&t=6642 |title=Bandwagon Effect |accessdate=2007-03-09}}</ref> || {{w|Bandwagon effect}} "is a psychological phenomenon whereby people do something primarily because other people are doing it, regardless of their own beliefs, which they may ignore or override."<ref>{{cite web |title=The Bandwagon Effect |url=https://www.psychologytoday.com/us/blog/stronger-the-broken-places/201708/the-bandwagon-effect |website=psychologytoday.com |accessdate=7 May 2020}}</ref>
 
|-
| 1850 || || Concept development || The first reference to “stereotype” as a noun meaning “image perpetuated without change” appears.<ref name="Stereotypes Defined">{{cite web |title=Stereotypes Defined |url=https://stereotypeliberia.wordpress.com/about/stereeotypes-defined/ |website=stereotypeliberia.wordpress.com |accessdate=10 April 2020}}</ref> || Stereotype refers to "a widely held but fixed and oversimplified image or idea of a particular type of person or thing".<ref>Oxford Languages</ref>
 
|-
| 1860 || || Concept development || Both [[w:Weber–Fechner law|Weber's law and Fechner's law]] are published by [[w:Gustav Fechner|Gustav Theodor Fechner]] in the work ''Elemente der Psychophysik'' (''Elements of Psychophysics''). This publication is the first work ever in this field, and where Fechner coins the term {{w|psychophysics}} to describe the interdisciplinary study of how humans perceive physical magnitudes.<ref name="Fechner1">{{cite book
| 1866 || Belief, decision-making and behavioral ({{w|apophenia}}) || Concept development || The German word ''Pareidolie'' is used by [[w:Karl Ludwig Kahlbaum|Dr. Karl Ludwig Kahlbaum]] in his paper ''On Delusion of the Senses''.<ref>[https://books.google.com/books?id=IM06AQAAMAAJ&pg=PA238&dq=%22pareidolia%22&hl=en&sa=X&ved=0ahUKEwjPysqt0ejUAhWe14MKHbdkCdIQ6AEIXzAJ#v=onepage&q=%22pareidolia%22&f=false ] Sibbald, M.D. "Report on the Progress of Psychological Medicine; German Psychological Literature", ''The Journal of Mental Science'', Volume 13. 1867. p. 238</ref> || {{w|Pareidolia}} is "the tendency to perceive a specific, often meaningful image in a random or ambiguous visual pattern."<ref>{{cite web |title=pareidolia |url=https://www.merriam-webster.com/dictionary/pareidolia |website=merriam-webster.com |accessdate=7 May 2020}}</ref>
 
|-
| 1874 || Memory bias || Research || The first documented instance of {{w|cryptomnesia}} occurs with the medium {{w|Stainton Moses}}.<ref>Brian Righi. (2008). ''Chapter 4: Talking Boards and Ghostly Goo''. In ''Ghosts, Apparitions and Poltergeists''. Llewellyn Publications."An early example of this occurred in 1874 with he medium William Stanton Moses, who communicated with the spirits of two brothers who had recently died in India. Upon investigation, it was discovered that one week prior to the séance, their obituary had appeared in the newspaper. This was of some importance because Moses's communications with the two spirits contained nothing that wasn't already printed in the newspaper. When the spirits were pressed for further information, they were unable to provide any. Researchers concluded that Moses had seen the obituary, forgotten it, and then resurfaced the memory during the séance."</ref><ref>{{w|Robert Todd Carroll}}. (2014). [http://skepdic.com/cryptomn.html "Cryptomnesia"]. ''{{w|The Skeptic's Dictionary}}''. Retrieved 2014-07-12.</ref> || {{w|Cryptomnesia}} is "an implicit memory phenomenon in which people mistakenly believe that a current thought or idea is a product of their own creation when, in fact, they have encountered it previously and then forgotten it".<ref>{{cite web |title=cryptomnesia |url=https://dictionary.apa.org/cryptomnesia |website=dictionary.apa.org |accessdate=7 May 2020}}</ref>
 
|-
| 1876 || Memory bias || Research || German experimental psychologist {{w|Gustav Fechner}} conducts the earliest known research on the {{w|mere-exposure effect}}.<ref>{{cite web |title=Mere Exposure Effect |url=https://www.wiwi.europa-uni.de/de/lehrstuhl/fine/mikro/bilder_und_pdf-dateien/WS0910/VLBehEconomics/Ausarbeitungen/MereExposure.pdf |website=wiwi.europa-uni.de |accessdate=10 April 2020}}</ref> || {{w|Mere-exposure effect}} "means that people prefer things that they are most familiar with".<ref>{{cite web |title=6 Conversion Principles You Can Learn From The Mere-Exposure Effect |url=https://marketingland.com/6-conversion-principles-can-learn-mere-exposure-effect-140430 |website=marketingland.com |accessdate=7 May 2020}}</ref> It is "the tendency to express undue liking for things merely because of familiarity with them."<ref name="dsaaaa"/>
 
|-
| 1882 || || Concept development || The term ''specious present'' is first introduced by the philosopher E. R. Clay.<ref name="kelly">Anonymous (E. Robert Kelly, 1882) [https://archive.org/details/alternativeastu00claygoog/page/n5/mode/2up ''The Alternative: A Study in Psychology'']. London: Macmillan and Co. p. 168.</ref><ref name=andersen>{{cite journal | last1 = Andersen | first1 = Holly | last2 = Grush | first2 = Rick | name-list-format = vanc | title = A brief history of time-consciousness: historical precursors to James and Husserl | journal = Journal of the History of Philosophy | date = 2009 | volume = 47 | issue = 2 | pages = 277–307| doi = 10.1353/hph.0.0118 |url = https://web.archive.org/web/20080216100320/http://mind.ucsd.edu/papers/bhtc/Andersen%26Grush.pdf}}</ref> || {{w|Specious present}} "is the time duration wherein a state of {{w|consciousness}} is experienced as being in the {{w|present}}".<ref name=james>{{cite book | vauthors = James W | date = 1893 | url = https://archive.org/details/bub_gb_JLcAAAAAMAAJ | title = The principles of psychology | location = New York | publisher = H. Holt and Company. | page = [https://archive.org/details/bub_gb_JLcAAAAAMAAJ/page/n624 609] }}</ref>  
|-
| 1890 || Memory bias || Concept development || The {{w|tip of the tongue}} phenomenon is first described as a psychological phenomenon in the text ''{{w|The Principles of Psychology}}'' by {{w|William James}}.<ref name="James">James, W. (1890). ''Principles of Psychology''. Retrieved from http://psychclassics.yorku.ca/James/Principles/</ref> || {{w|Tip of the tongue}} describes "a state in which one cannot quite recall a familiar word but can recall words of similar form and meaning".<ref>{{cite journal |last1=Brown |first1=Roger |last2=McNeill |first2=David |title=The “tip of the tongue” phenomenon |doi=10.1016/S0022-5371(66)80040-3 |url=https://www.sciencedirect.com/science/article/abs/pii/S0022537166800403}}</ref>
 
|-
| 1893 || Memory bias || Concept development || {{w|Childhood amnesia}} is first formally reported by psychologist Caroline Miles in her article ''A study of individual psychology'', published in the ''American Journal of Psychology''.<ref name=WhereOhWhere>{{cite journal|last=Bauer|first=P|title=Oh where, oh where have those early memories gone? A developmental perspective on childhood amnesia|journal=Psychological Science Agenda|volume=18|year=2004|url=http://www.apa.org/science/about/psa/2004/12/bauer.aspx|issue=12 }}</ref> || {{w|Childhood amnesia}} "refers to the fact that most people cannot remember events that occurred before the age of 3 or 4".<ref>{{cite web |title=Childhood Amnesia |url=https://www.sciencedirect.com/topics/medicine-and-dentistry/childhood-amnesia |website=sciencedirect.com |accessdate=7 May 2020}}</ref>
 
|-
| 1906 || Social (conformity bias) || Concept development || The first known use of {{w|bandwagon effect}} occurs in this year.<ref>{{cite web |title=bandwagon effect |url=https://www.merriam-webster.com/dictionary/bandwagon%20effect |website=merriam-webster.com |accessdate=7 April 2020}}</ref> || "Bandwagon effect is when an idea or belief is being followed because everyone seems to be doing so."<ref>{{cite web |title=Bandwagon Effect - Biases & Heuristics |url=https://thedecisionlab.com/biases/bandwagon-effect/ |website=The Decision Lab |access-date=26 January 2021 |language=en-CA}}</ref>
 
|-
| 1906 || Social bias || Research || American sociologist [[w:William Graham Sumner|William Sumner]] posits that humans are a species that join together in groups by their very nature. However, he also maintains that humans have an innate tendency to favor their own group over others, proclaiming how "each group nourishes its own pride and vanity, boasts itself superior, exalts its own divinities, and looks with contempt on outsiders".<ref>Sumner, William Graham. (1906). ''Folkways: A Study of the Social Importance of Usages, Manners, Customs, Mores, and Morals''. Boston, MA: Ginn.</ref> || {{w|In-group favoritism}} is "the tendency to favor members of one's own group over those in other groups".<ref>{{cite journal |last1=Everett |first1=Jim A. C. |last2=Faber |first2=Nadira S. |last3=Crockett |first3=Molly |title=Preferences and beliefs in ingroup favoritism |doi=10.3389/fnbeh.2015.00015 |pmid=25762906 |url=https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4327620/ |pmc=4327620}}</ref>
 
|-
| 1909 || Memory bias || Concept development || The first documented empirical studies on the {{w|testing effect}} are published by Edwina E. Abbott.<ref>{{cite journal|last1=Abbott|first1=Edwina|date=1909|title=On the analysis of the factors of recall in the learning process|url=https://insights.ovid.com/psychological-monographs-general-applied/pmga/1909/11/010/analysis-factor-recall-learning-process/5/00006828|journal=Psychological Monographs: General and Applied|volume=11|issue=1|pages=159–177|via=Ovid|doi=10.1037/h0093018}}</ref><ref>{{Cite book|last1=Larsen|first1=Douglas P.|last2=Butler|first2=Andrew C.|date=2013|editor-last=Walsh, K.|title=Test-enhanced learning|url=https://books.google.com/?id=KW2rAAAAQBAJ&pg=PA443&dq=Test-enhanced+learning#v=onepage&q=Test-enhanced%20learning&f=false|journal=In Oxford Textbook of Medical Education|volume=|issue=|pages=443–452}}</ref> || "{{w|Testing effect}} is the finding that long-term memory is often increased when some of the learning period is devoted to retrieving the to-be-remembered information."<ref>{{cite book |last1=Goldstein |first1=E. Bruce |title=Cognitive Psychology: Connecting Mind, Research and Everyday Experience |publisher=Cengage Learning |isbn=978-1-133-00912-2 |url=https://books.google.com.ar/books?id=9TUIAAAAQBAJ&pg=PA231&redir_esc=y |language=en}}</ref>
 
|-
| 1913 || || Concept development || The term "{{w|Monte Carlo fallacy}}" (also known as {{w|Gambler's fallacy}}) originates from the best known [[w:Gambler's fallacy#Monte Carlo Casino|example]] of the phenomenon, which occurs in the {{w|Monte Carlo Casino}}.<ref name= "monte_carlo">{{Cite web|url=http://www.bbc.com/future/story/20150127-why-we-gamble-like-monkeys|title=Why we gamble like monkeys|work=BBC.com|date=2015-01-02}}</ref> || Gambler's fallacy "occurs when an individual erroneously believes that a certain random event is less likely or more likely, given a previous event or a series of events."<ref>{{cite web |title=Gambler's Fallacy |url=https://www.investopedia.com/terms/g/gamblersfallacy.asp |website=investopedia.com |accessdate=7 May 2020}}</ref>
 
|-
| 1914 || Memory bias || Concept development || The first research on the {{w|cross-race effect}} is published.<ref>{{cite journal | last1 = Feingold | first1 = CA | year = 1914 | title = The influence of environment on identification of persons and things | url = https://scholarlycommons.law.northwestern.edu/jclc/vol5/iss1/6| journal = Journal of Criminal Law and Police Science | volume = 5 | issue = 1| pages = 39–51 | doi=10.2307/1133283| jstor = 1133283 }}</ref> || {{w|Cross-race effect}} is "the tendency for eyewitnesses to be better at recognizing members of their own race/ethnicity than members of other races."<ref>{{cite journal |last1=Laub |first1=Cindy E. |last2=Meissner |first2=Christian A. |last3=Susa |first3=Kyle J. |title=The Cross-Race Effect: Resistant to Instructions |doi=10.1155/2013/745836 |url=https://www.hindawi.com/journals/jcrim/2013/745836/}}</ref>
|-
| 1920 || Social bias || Concept development || The {{w|halo effect}} is named by psychologist {{w|Edward Thorndike}}<ref>{{Cite book
|title=The Advanced Dictionary of Marketing, Scott G. Dacko, 2008: Marketing
|date=2008-06-18  |publisher=Oxford University Press  |isbn=9780199286003  |location=Oxford  |pages=248}}</ref> in reference to a person being perceived as having a [[w:Halo (religious iconography)|halo]]. He gives the phenomenon its name in his article ''A Constant Error in Psychological Ratings''.<ref name=":2">{{harvnb | Thorndike | 1920}}</ref> In "Constant Error", Thorndike sets out to replicate the study in hopes of pinning down the bias that he thought was present in these ratings. Subsequent researchers would study it in relation to {{w|attractiveness}} and its bearing on the judicial and educational systems.<ref name="BBdang">{{Cite journal|last=Sigall|first=Harold|last2=Ostrove|first2=Nancy|date=1975-03-01|title=Beautiful but Dangerous: Effects of Offender Attractiveness and Nature of the Crime on Juridic Judgment|url=https://www.researchgate.net/publication/232451231|journal=Journal of Personality and Social Psychology|volume=31|issue=3|pages=410–414|doi=10.1037/h0076472}}</ref> Thorndike originally coins the term referring only to people; however, its use would be greatly expanded especially in the area of brand marketing.<ref name=":2" /> || {{w|Halo effect}} refers to an "error in reasoning in which an impression formed from a single trait or characteristic is allowed to influence multiple judgments or ratings of unrelated factors."<ref>{{cite web |title=Halo effect |url=https://www.britannica.com/science/halo-effect |website=britannica.com |accessdate=7 May 2020}}</ref>
 
|-
| 1922 || || Concept development || The term “stereotype” is first used in the modern psychological sense by American journalist Walter Lippmann in his work ''Public Opinion''.<ref name="Stereotypes Defined"/> || "Stereotype is most frequently now employed to refer to an often unfair and untrue belief that many people have about all people or things with a particular characteristic."<ref>{{cite web |title=Definition of STEREOTYPE |url=https://www.merriam-webster.com/dictionary/stereotype |website=www.merriam-webster.com |access-date=28 January 2021 |language=en}}</ref>
 
|-
| 1927 || Memory bias || Research || [[w:Lithuanians|Lithuanian]]-[[w:Soviet Union|Soviet]] {{w|psychologist}} {{w|Bluma Zeigarnik}}, working at the {{w|University of Berlin}}, first describes the phenomenon that would later be known as the {{w|Zeigarnik effect}}.<ref>{{cite web |title=Bluma Wulfovna Zeigarnik |url=https://www.thescienceofpsychotherapy.com/bluma-wulfovna-zeigarnik/ |website=The Science of Psychotherapy |access-date=16 March 2021 |language=en-AU |date=31 March 2014}}</ref><ref>Zeigarnik 1927: "Das Behalten erledigter und unerledigter Handlungen". ''{{w|Psychologische Forschung}}'' 9, 1-85.</ref> || {{w|Zeigarnik effect}} is the "tendency to remember interrupted or incomplete tasks or events more easily than tasks that have been completed."<ref>{{cite web |title=Zeigarnik Effect |url=https://www.goodtherapy.org/blog/psychpedia/zeigarnik-effect |website=goodtherapy.org |accessdate=7 May 2020}}</ref>
 
|-
| 1928 || Belief, decision-making and behavioral || Literature || American economist {{w|Irving Fisher}} publishes ''The {{w|Money Illusion}}'', which develops the concept of the same name.<ref>{{Citation | title = The Money Illusion | last = Fisher | first = Irving | publisher = Adelphi Company | year = 1928 |location=New York }}</ref> || "Money illusion posits that people have a tendency to view their wealth and income in nominal dollar terms, rather than recognize its real value, adjusted for inflation."<ref>{{cite web |last1=Liberto |first1=Daniel |title=Money Illusion Definition |url=https://www.investopedia.com/terms/m/money_illusion.asp |website=Investopedia |access-date=26 January 2021 |language=en}}</ref>   
|-
| 1930 || || Concept development || English epistemologist {{w|C. D. Broad}} further elaborates on the concept of the {{w|specious present}} and states that it may be considered as the temporal equivalent of a sensory datum.<ref name=andersen /> || "The specious present is a term applied to that short duration of time the human mind appears to be able to experience, a period which exists between past and future and which is longer than the singular moment of the actual present."<ref>{{cite web |title=The Specious Present: Andrew Beck, David Claerbout, Colin McCahon, Keith Tyson - Announcements - Art & Education |url=https://www.artandeducation.net/announcements/106498/the-specious-present-andrew-beck-david-claerbout-colin-mccahon-keith-tyson |website=www.artandeducation.net |access-date=27 January 2021}}</ref>
 
|-
| 1932 || Memory bias || Research || Some of the earliest evidence for the {{w|Fading Affect Bias}} dates back to a study by Cason, who uses a retrospective procedure in which participants recall and rate past events and emotions when prompted, and finds that recalled emotional intensity for positive events is generally stronger than that for negative events.<ref>{{Cite journal|last=Fleming|first=G. W. T. H.|date=January 1933|title=The Learning and Retention of Pleasant and Unpleasant Activities. (Arch. of Psychol., No. 134, 1932.) Cason, H.|journal=Journal of Mental Science|volume=79|issue=324|pages=187–188|doi=10.1192/bjp.79.324.187-c|issn=0368-315X}}</ref> || The {{w|Fading Affect Bias}} "indicates that the emotional response prompted by positive memories often tends to be stronger than the emotional response prompted by negative memories."<ref>{{cite journal |last1=Skowronski |first1=John J. |last2=Walker |first2=W. Richard |last3=Henderson |first3=Dawn X. |last4=Bond |first4=Gary D. |title=Chapter Three - The Fading Affect Bias: Its History, Its Implications, and Its Future |doi=10.1016/B978-0-12-800052-6.00003-2 |url=https://www.sciencedirect.com/science/article/pii/B9780128000526000032}}</ref>
 
|-
| 1933 || Memory bias || Concept development || The {{w|Von Restorff effect}} is identified by German psychiatrist and pediatrician {{w|Hedwig von Restorff}}, who, in her study, finds that when participants are presented with a list of categorically similar items with one distinctive, isolated item on the list, memory for the item is improved.<ref name="vonRestorff1933">{{cite journal|last1=von Restorff|first1=Hedwig|title=Über die Wirkung von Bereichsbildungen im Spurenfeld|journal=Psychologische Forschung [Psychological Research]|date=1933|volume=18|issue=1|pages=299–342|doi=10.1007/BF02409636|trans-title=The effects of field formation in the trace field|url=http://www.utsa.edu/mind/von_restorff_translation.htm|language=de}}</ref> || The {{w|Von Restorff effect}} "predicts that when multiple similar objects are present, the one that differs from the rest is most likely to be remembered."<ref>{{cite web |title=The Von Restorff effect |url=https://lawsofux.com/von-restorff-effect |website=lawsofux.com |accessdate=7 May 2020}}</ref>
 +
|-
 +
| 1942 || || Concept development || The {{w|Einstellung effect}} is first described by Dr. Abraham Luchins.<ref>{{cite web |title=The Einstellung Effect - Thinking Differently |url=https://exploringyourmind.com/the-einstellung-effect-thinking-differently/ |website=Exploring your mind |access-date=18 April 2021 |language=en |date=27 January 2020}}</ref> || "The Einstellung Effect is a type of mindset that causes humans to repeat the use of "tried and true" strategies for problem solving, even when a simpler solution strategy exists."<ref>{{cite web |title=Einstellung Effect definition {{!}} Psychology Glossary {{!}} alleydog.com |url=https://www.alleydog.com/glossary/definition.php?term=Einstellung+Effect |website=www.alleydog.com |access-date=17 May 2021}}</ref>
|-
| 1945 || Belief, decision-making and behavioral (anchoring bias) || Concept development || {{w|Karl Duncker}} defines {{w|functional fixedness}} as being a "mental block against using an object in a new way that is required to solve a problem".<ref name=Duncker1945>Duncker, K. (1945). "On problem solving". ''{{w|Psychological Monographs}}'', 58:5 (Whole No. 270).</ref> || {{w|Functional fixedness}} "is the inability to realize that something known to have a particular use may also be used to perform other functions."<ref>{{cite web |title=Functional fixedness |url=https://www.britannica.com/science/functional-fixedness |website=britannica.com |accessdate=7 May 2020}}</ref>
|-
| 1946 || Belief, decision-making and behavioral (logical fallacy) || Concept development || American statistician {{w|Joseph Berkson}} illustrates what would be later known as {{w|Berkson's paradox}}, one of the most famous paradoxes in probability and statistics.<ref>{{cite journal |last1=Batsidis |first1=Apostolos |last2=Tzavelas |first2=George |last3=Alexopoulos |first3=Panagiotis |title=Berkson's paradox and weighted distributions: An application to Alzheimer's disease |url=https://onlinelibrary.wiley.com/doi/abs/10.1002/bimj.201900046}}</ref> Berkson's bias, or fallacy, is a type of selection bias. || {{w|Berkson's paradox}} "is a type of selection bias{{snd}}a mathematical result found in the fields of conditional probability and statistics in which two variables can be negatively correlated even though they have the appearance of being positively correlated within the population."<ref>{{cite web |title=Berkson's Paradox (Berkson's Bias) |url=https://www.alleydog.com/glossary/definition.php?term=Berkson%27s+Paradox+%28Berkson%27s+Bias%29 |website=alleydog.com |accessdate=14 August 2020}}</ref>
|-
| 1947 || Belief, decision-making and behavioral ({{w|extension neglect}}) || Concept development || {{w|Joseph Stalin}} is credited by some with introducing the concept of {{w|compassion fade}} through his statement “the death of one man is a tragedy, the death of millions is a statistic”.<ref name=":4">Johnson, J. (2011). The arithmetic of compassion: rethinking the politics of photography. ''British Journal of Political Science, 41''(3), 621-643. doi: 10.1017/S0007123410000487.</ref> However, others consider the quote to be misattributed to him.<ref>{{cite web |title=Joseph Stalin - Wikiquote |url=https://en.wikiquote.org/wiki/Joseph_Stalin#Misattributed |website=en.wikiquote.org |access-date=17 May 2021 |language=en}}</ref> || Compassion fade "refers to the decrease in the compassion one shows for the people in trouble as the number of the victims increase."<ref>{{cite web |title=Compassion fade |url=http://econowmics.com/compassion-fade/ |website=econowmics.com |access-date=15 January 2021}}</ref>
|-
| 1952 || Social (conformity bias) || Concept development || [[w:William H. Whyte|William H. Whyte Jr.]] derives the term ''{{w|groupthink}}'' from {{w|George Orwell}}'s ''{{w|Nineteen Eighty-Four}}'' and popularizes it in [[w:Fortune (magazine)|''Fortune'']] magazine:
{{quote|text=Groupthink being a coinage – and, admittedly, a loaded one – a working definition is in order. We are not talking about mere instinctive conformity – it is, after all, a perennial failing of mankind. What we are talking about is a ''rationalized'' conformity – an open, articulate philosophy which holds that group values are not only expedient but right and good as well.<ref>
{{cite news |first=W. H., Jr. |last=Whyte  |author-link=William H. Whyte |title=Groupthink |journal=[[w:Fortune (magazine)|Fortune]] |date=March 1952|pages = 114–117, 142, 146}}
</ref><ref>{{cite web |last1=Safire |first1=William |title=THE WAY WE LIVE NOW: 8-8-04: ON LANGUAGE; Groupthink (Published 2004) |url=https://query.nytimes.com/gst/fullpage.html?res=9C01E2DD173CF93BA3575BC0A9629C8B63 |website=The New York Times |access-date=14 March 2021 |date=8 August 2004}}</ref>
}}
|| "Groupthink is a psychological phenomenon in which people strive for consensus within a group."<ref>{{cite web |title=The Psychology Behind Why We Strive for Consensus |url=https://www.verywellmind.com/what-is-groupthink-2795213 |website=Verywell Mind |language=en}}</ref>  
|-
| 1956 || || Concept development || The term "{{w|Barnum effect}}" is coined by psychologist {{w|Paul Meehl}} in his essay ''Wanted – A Good Cookbook'', because he relates the vague personality descriptions used in certain "pseudo-successful" psychological tests to those given by showman {{w|P. T. Barnum}}.<ref name=Meehl1956>{{cite journal|last1=Meehl |first1=Paul E. |title=Wanted – A Good Cookbook |journal=American Psychologist |date=1956 |volume=11 |issue=6 |pages=263–272 |doi=10.1037/h0044164 |df= }}</ref><ref name="Dutton1988">{{cite journal|last1=Dutton|first1=D. L.|title=The cold reading technique|journal=Experientia|date=1988|volume=44|issue=4|pages=326–332|doi=10.1007/BF01961271|url=http://denisdutton.com/cold_reading.htm|language=en|pmid=3360083}}</ref> || {{w|Barnum effect}} is "the phenomenon that occurs when individuals believe that personality descriptions apply specifically to them (more so than to other people), despite the fact that the description is actually filled with information that applies to everyone."<ref>{{cite web |title=Barnum Effect |url=https://www.britannica.com/science/Barnum-Effect |website=britannica.com |accessdate=7 May 2020}}</ref>
|-
| 1957 || || Concept development || British naval historian {{w|C. Northcote Parkinson}} describes what is later called {{w|Parkinson's law of triviality}}, which argues that members of an organization give disproportionate weight to trivial issues.<ref name="parkinson">{{cite book |first=C. Northcote |last=Parkinson |title = Parkinson's Law, or the Pursuit of Progress |publisher=John Murray |isbn=0140091076|year=1958}}</ref> || {{w|Parkinson's law of triviality}} (also known as the bike-shed effect) "explains that people will give more energy and focus to trivial or unimportant items than to more important and complex ones."<ref>{{cite web |title=How to Handle Bikeshedding: Parkinson’s Law of Triviality |url=https://projectbliss.net/bikeshedding-parkinsons-law-of-triviality/ |website=projectbliss.net |accessdate=7 May 2020}}</ref>
|-
| 1960 || Belief, decision-making and behavioral || Concept development || English psychologist {{w|Peter Cathcart Wason}} first describes the {{w|confirmation bias}}.<ref>{{cite web |title=The Curious Case of Confirmation Bias |url=https://www.psychologytoday.com/us/blog/seeing-what-others-dont/201905/the-curious-case-confirmation-bias |website=psychologytoday.com |accessdate=7 April 2020}}</ref><ref>{{cite book |last1=Acks |first1=Alex |title=The Bubble of Confirmation Bias |url=https://books.google.com.ar/books?id=hPWCDwAAQBAJ&pg=PA9&dq=confirmation+bias%22+was+coined+by+English+psychologist+Peter+Wason&hl=en&sa=X&ved=0ahUKEwiMnaen1dboAhVAIrkGHX4TAwEQ6AEIMTAB#v=onepage&q=confirmation%20bias%22%20was%20coined%20by%20English%20psychologist%20Peter%20Wason&f=false}}</ref><ref>{{cite book |last1=Myers |first1=David G. |title=Psychology |url=https://books.google.com.ar/books?id=OqZZAAAAYAAJ&q=confirmation+bias%22+was+coined+by+English+psychologist+Peter+Wason&dq=confirmation+bias%22+was+coined+by+English+psychologist+Peter+Wason&hl=en&sa=X&ved=0ahUKEwiMnaen1dboAhVAIrkGHX4TAwEQ6AEISzAE}}</ref> || "{{w|Confirmation bias}} is the tendency of people to favor information that confirms their existing beliefs or hypotheses."<ref>{{cite web |title=Confirmation Bias |url=https://www.simplypsychology.org/confirmation-bias.html |website=simplypsychology.org |accessdate=14 August 2020}}</ref>
|-
| 1960 || Belief, decision-making and behavioral ({{w|confirmation bias}}) || Concept development || {{w|Peter Cathcart Wason}} discovers the classic example of subjects' {{w|congruence bias}}.<ref>{{cite web |title=The Curious Case of Confirmation Bias |url=https://www.psychologytoday.com/gb/blog/seeing-what-others-dont/201905/the-curious-case-confirmation-bias#:~:text=Confirmation%20bias%20was%20first%20described,their%20triple%20fit%20the%20rule. |website=psychologytoday.com |accessdate=14 August 2020}}</ref> || {{w|Congruence bias}} is "the tendency to test hypotheses exclusively through direct testing, instead of considering possible alternatives."<ref>{{cite web |title=Cognitive Bias in Decision Making |url=https://associationanalytics.com/2015/11/30/cognitive-bias-in-decision-making/ |website=associationanalytics.com |accessdate=7 May 2020}}</ref>
|-
| 1961 || Social bias || Research || The {{w|Milgram experiment}} is conducted. This classic experiment establishes the existence of {{w|authority bias}}.<ref>{{cite book |author=Ellis RM |title=Middle Way Philosophy: Omnibus Edition |year=2015 |publisher=[[w:Lulu (company)|Lulu Press]] | url=https://books.google.com/books?id=xG9rCgAAQBAJ&dq=Ellis+RM+Middle+Way+Philosophy%3A+Omnibus+Edition&q=authority#v=onepage&q=milgram&f=false|isbn=9781326351892 }}</ref> || "{{w|Authority bias}} is the human tendency to attribute greater authority and knowledge to persons of authority (fame, power, position, etc.) than they may actually possess."<ref>{{cite web |title=Authority Bias |url=https://www.alleydog.com/glossary/definition.php?term=Authority+Bias |website=alleydog.com |accessdate=14 August 2020}}</ref>
|-
| 1961 || {{w|Ambiguity effect}} || Concept development || The {{w|ambiguity effect}} is first described by American economist {{w|Daniel Ellsberg}}.<ref>{{cite book|last1=Borcherding|first1=Katrin|last2=Laričev|first2=Oleg Ivanovič|last3=Messick|first3=David M.|title=Contemporary Issues in Decision Making|url=https://books.google.com/books?id=W3l9AAAAMAAJ|year=1990|publisher=North-Holland|isbn=978-0-444-88618-7|page=50}}</ref> || "{{w|Ambiguity Effect}} occurs when people prefer options with known probabilities over those with unknown probabilities."<ref>{{cite web |title=Why we prefer options that are known to us |url=https://thedecisionlab.com/biases/ambiguity-effect/ |website=thedecisionlab.com |accessdate=14 August 2020}}</ref>
 
  | url = https://books.google.com/books?id=OYe6fsXSP3IC&pg=PA28| isbn = 9781412836296
  }}</ref> || "The law of the instrument principle states that when we acquire a specific tool/skill, we tend to see opportunities to use that tool/skill everywhere."<ref>{{cite web |title=Law of the instrument - Biases & Heuristics |url=https://thedecisionlab.com/biases/law-of-the-instrument/ |website=The Decision Lab |access-date=27 January 2021 |language=en-CA}}</ref>
|-
 
| 1966 || || Research || An experiment shows that people remember a group of words better if they are within the same theme category. Such words that generate recall by association are known as ''semantic cues''.<ref name="Tulving1966">{{cite journal|last1=Tulving|first1=Endel|last2=Pearlstone|first2=Zena|title=Availability versus accessibility of information in memory for words|journal=Journal of Verbal Learning and Verbal Behavior|date=1966|volume=5|issue=4|pages=381–391|doi=10.1016/S0022-5371(66)80048-8}}</ref> || "A semantic cue is a prompt that provides semantic information about a target word to facilitate its retrieval."<ref>{{cite journal |last1=Boyle |first1=Mary |title=Semantic Cue |journal=Encyclopedia of Clinical Neuropsychology |date=2018 |pages=3119–3120 |doi=10.1007/978-3-319-57111-9_921}}</ref>
|-
| 1966 || Social (egocentric bias) || Research || Walster hypothesizes that it can be frightening to believe that a misfortune could happen to anyone at random, and attributing responsibility to the person(s) involved helps to manage this emotional reaction.<ref>{{cite journal |last1=Walster |first1=Elaine |title=Assignment of responsibility for an accident. |journal=Journal of Personality and Social Psychology |date=1966 |volume=3 |issue=1 |pages=73–79 |doi=10.1037/h0022733}}</ref> || "The {{w|defensive attribution hypothesis}} is a social psychology term that describes an attributional approach taken by some people - a set of beliefs that an individual uses to protect or "shield" themselves against fears of being the victim or cause of a major mishap."<ref>{{cite web |title=Defensive Attribution Hypothesis definition {{!}} Psychology Glossary {{!}} alleydog.com |url=https://www.alleydog.com/glossary/definition.php?term=Defensive+Attribution+Hypothesis |website=www.alleydog.com |access-date=29 January 2021}}</ref>
|-
| 1967 || Belief, decision-making and behavioral ({{w|apophenia}}) || Concept development || The term {{w|illusory correlation}} is coined by Chapman and Chapman to describe people's tendencies to overestimate relationships between two groups when distinctive and unusual information is presented.<ref name="Chapman1967">{{cite journal|last1=Chapman|first1=L|title=Illusory correlation in observational report|journal=Journal of Verbal Learning and Verbal Behavior|volume=6|issue=1|year=1967|pages=151–155|doi=10.1016/S0022-5371(67)80066-5}}</ref> || An {{w|illusory correlation}} occurs when a person perceives a relationship between two variables that are not in fact correlated.<ref>{{cite web |title=Illusory Correlation |url=http://psychology.iresearchnet.com/social-psychology/decision-making/illusory-correlation/ |website=psychology.iresearchnet.com |accessdate=17 July 2020}}</ref>
|-
| 1967 || Social (attribution bias) || Research || American social psychologist {{w|Edward E. Jones}} and Victor Harris conduct a classic experiment<ref name="JonesHarris67">{{cite journal|last=Jones|first=E. E.|last2=Harris|first2=V. A.|year=1967|title=The attribution of attitudes|journal=Journal of Experimental Social Psychology|volume=3|issue=1|pages=1–24|doi=10.1016/0022-1031(67)90034-0}}</ref> that would later give rise to the phrase {{w|Fundamental attribution error}}, coined by {{w|Lee Ross}}.<ref>{{cite book|title=Advances in experimental social psychology|last=Ross|first=L.|publisher=Academic Press|year=1977|isbn=978-0-12-015210-0|editor-last=Berkowitz|editor-first=L.|volume=10|location=New York|pages=173–220|chapter=The intuitive psychologist and his shortcomings: Distortions in the attribution process}}</ref> || {{w|Fundamental attribution error}} "is the tendency for people to over-emphasize dispositional, or personality-based explanations for behaviors observed in others while under-emphasizing situational explanations".<ref>{{cite web |title=Fundamental Attribution Error |url=https://www.simplypsychology.org/fundamental-attribution.html |website=simplypsychology.org |accessdate=7 May 2020}}</ref>
|-
| 1968 || Belief, decision-making and behavioral (anchoring bias) || Concept development ||  American psychologist  {{w|Ward Edwards}} discusses the concept of {{w|conservatism (belief revision)}} bias.<ref name="edwards1">Edwards, Ward. "Conservatism in Human Information Processing (excerpted)". In Daniel Kahneman, Paul Slovic and Amos Tversky. (1982). ''Judgment under uncertainty: Heuristics and biases''. New York: Cambridge University Press. Original work published 1968.</ref> || "[[w:Conservatism (belief revision)|Conservatism bias]] is a mental process in which people maintain their past views or predictions at the cost of recognizing new information."<ref>{{cite web |title=Conservatism Bias |url=https://dwassetmgmt.com/conservatism-bias/|website=dwassetmgmt.com |accessdate=8 May 2020}}</ref>  
|-
| 1977 || Memory bias || Research || {{w|Misattribution of memory}}. Early research done by Brown and Kulik finds that flashbulb memories are similar to photographs because they can be described in accurate, vivid detail. In this study, participants describe their circumstances about the moment they learned of the assassination of President John F. Kennedy as well as other similar traumatic events. Participants are able to describe what they were doing, things around them, and other details.<ref>{{Cite journal|last=Brown, R., Kulik J.|date=1977|title=Flashbulb memories|url=|journal=Cognition|volume=5|pages=73–99|doi=10.1016/0010-0277(77)90018-X}}</ref> || {{w|Misattribution of memory}} occurs "when a memory is distorted because of the source, context, or our imagination."<ref>{{cite web |title=Misattribution Effect |url=https://sites.google.com/site/falsememory02/current-research/misattribution |website=sites.google.com |accessdate=7 May 2020}}</ref>
|-
| 1977 || Social (egocentric bias) || Concept development || A study conducted by {{w|Lee Ross}} and colleagues provides early evidence for a {{w|cognitive bias}} called the [[w:False-consensus effect|false consensus effect]], which is the tendency for people to overestimate the extent to which others share the same views.<ref>{{Cite journal|title = The "false consensus effect": An egocentric bias in social perception and attribution processes|journal = Journal of Experimental Social Psychology|pages = 279–301|volume = 13|issue = 3|doi = 10.1016/0022-1031(77)90049-x|first = Lee|last = Ross|first2 = David|last2 = Greene|first3 = Pamela|last3 = House|year = 1977}}</ref> || The {{w|false-consensus effect}} "refers to the tendency to overestimate consensus for one′s attitudes and behaviors."<ref>{{cite journal |last1=Alicke |first1=Mark |last2=Largo |first2=Edward |title=The Role of Self in the False Consensus Effect |doi=10.1006/jesp.1995.1002 |url=https://www.sciencedirect.com/science/article/abs/pii/S0022103185710025}}</ref><ref>{{cite web |title=False Consensus Effect |url=http://psychology.iresearchnet.com/social-psychology/social-cognition/false-consensus-effect/ |website=psychology.iresearchnet.com |access-date=14 January 2021}}</ref> It is "the tendency to assume that one’s own opinions, beliefs, attributes, or behaviors are more widely shared than is actually the case."<ref>{{cite web |title=APA Dictionary of Psychology |url=https://dictionary.apa.org/false-consensus-effect |website=dictionary.apa.org |access-date=29 January 2021 |language=en}}</ref>
|-
| 1977 || Belief, decision-making and behavioral (truthiness) || Concept development|| The {{w|illusory truth effect}} is first identified in a study at {{w|Villanova University}} and {{w|Temple University}}.<ref name="Hasher1977">{{cite journal|last1=Hasher |first1=Lynn |last2=Goldstein |first2=David |last3=Toppino |first3=Thomas |title=Frequency and the conference of referential validity |journal=Journal of Verbal Learning and Verbal Behavior |date=1977 |volume=16 |issue=1 |pages=107–112 |doi=10.1016/S0022-5371(77)80012-1 | |url=https://web.archive.org/web/20160515062305/http://www.psych.utoronto.ca/users/hasher/PDF/Frequency%20and%20the%20conference%20Hasher%20et%20al%201977.pdf}}</ref><ref name="PLOS ONE">{{cite journal|title=People with Easier to Pronounce Names Promote Truthiness of Claims|journal=PLOS ONE|volume=9|issue=2|pages=e88671|date=September 6, 2014 |doi=10.1371/journal.pone.0088671|pmid=24586368|pmc=3935838|last1=Newman|first1=Eryn J.|last2=Sanson|first2=Mevagh|last3=Miller|first3=Emily K.|last4=Quigley-Mcbride|first4=Adele|last5=Foster|first5=Jeffrey L.|last6=Bernstein|first6=Daniel M.|last7=Garry|first7=Maryanne}}</ref> || The {{w|illusory truth effect}} "occurs when repeating a statement increases the belief that it’s true even when the statement is actually false."<ref>{{cite web |title=Illusory Truth, Lies, and Political Propaganda: Part 1 |url=https://www.psychologytoday.com/us/blog/psych-unseen/202001/illusory-truth-lies-and-political-propaganda-part-1 |website=psychologytoday.com |accessdate=7 May 2020}}</ref>
|-
| 1977 || Memory bias || Research || T. B. Rogers and colleagues publish the first research on the {{w|self-reference effect}}.<ref>{{cite web |title=Self-Reference Effect |url=http://psychology.iresearchnet.com/social-psychology/self/self-reference-effect/ |website=psychology.iresearchnet.com |access-date=12 January 2021}}</ref><ref>{{cite journal |last1=Bentley |first1=Sarah V. |last2=Greenaway |first2=Katharine H. |last3=Haslam |first3=S. Alexander |title=An online paradigm for exploring the self-reference effect |doi=10.1371/journal.pone.0176611 |url=https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0176611}}</ref> || "The self-reference effect refers to people’s tendency to better remember information when that information has been linked to the self than when it has not been linked to the self."<ref>{{cite web |title=Self-Reference Effect - IResearchNet |url=http://psychology.iresearchnet.com/social-psychology/self/self-reference-effect/ |website=Psychology |access-date=10 May 2021 |date=12 January 2016}}</ref>
|-
| 1978 || Memory bias || Research || Loftus, Miller, and Burns conduct the original {{w|misinformation effect}} study.<ref>{{cite journal |last1=Zaragoza |first1=Maria S. |last2=Belli |first2=Robert F. |last3=Payment |first3=Kristie E. |title=Misinformation Effects and the Suggestibility of Eyewitness Memory}}</ref> || The {{w|misinformation effect}} "happens when a person's memory becomes less accurate due to information that happens after the event."<ref>{{cite web |title=What Is Misinformation Effect? |url=https://www.growthramp.io/articles/misinformation-effect |website=growthramp.io |accessdate=7 May 2020}}</ref>
|-
| 1979 || Social bias || Concept development || {{w|Daniel Kahneman}} and {{w|Amos Tversky}} originally coin the term {{w|loss aversion}} in a landmark paper on subjective probability.<ref>{{cite web |title=Loss aversion |url=https://www.behavioraleconomics.com/resources/mini-encyclopedia-of-be/loss-aversion/ |website=behavioraleconomics.com |accessdate=14 August 2020}}</ref> || "{{w|Loss aversion}} is a cognitive bias that suggests that for individuals the pain of losing is psychologically twice as powerful as the pleasure of gaining."<ref>{{cite web |title=Why is the pain of losing felt twice as powerfully compared to equivalent gains? |url=https://thedecisionlab.com/biases/loss-aversion/ |website=thedecisionlab.com |accessdate=14 August 2020}}</ref>
|-
| 1979 || Belief, decision-making and behavioral || Concept development || The {{w|planning fallacy}} is first proposed by {{w|Daniel Kahneman}} and {{w|Amos Tversky}}.<ref name="PezzoLitman2006">{{cite journal|last1=Pezzo|first1=Mark V.|last2=Litman|first2=Jordan A.|last3=Pezzo|first3=Stephanie P.|title=On the distinction between yuppies and hippies: Individual differences in prediction biases for planning future tasks |journal=Personality and Individual Differences|volume=41|issue=7|year=2006|pages=1359–1371|issn=0191-8869|doi=10.1016/j.paid.2006.03.029}}</ref><ref>{{cite journal|last1=Kahneman|first1=Daniel|last2=Tversky|first2=Amos|date=1977|title=Intuitive prediction: Biases and corrective procedures|url=http://www.dtic.mil/dtic/tr/fulltext/u2/a047747.pdf}} Decision Research Technical Report PTR-1042-77-6. In {{cite book|title=Judgment Under Uncertainty: Heuristics and Biases|journal=Science|volume=185|issue=4157|last1=Kahneman|first1=Daniel|last2=Tversky|first2=Amos|year=1982|isbn=978-0511809477|editor-last1=Kahneman|editor-first1=Daniel|pages=414–421|chapter=Intuitive prediction: Biases and corrective procedures|doi=10.1017/CBO9780511809477.031|pmid=17835457|editor-last2=Slovic|editor-first2=Paul|editor-last3=Tversky|editor-first3=Amos}}</ref> || "The {{w|planning fallacy}} refers to a prediction phenomenon, all too familiar to many, wherein people underestimate the time it will take to complete a future task, despite knowledge that previous tasks have generally taken longer than planned"<ref>{{cite journal |last1=Buehler |first1=Roger |last2=Griffin |first2=Dale |last3=Peetz |first3=Johanna |title=Chapter One - The Planning Fallacy: Cognitive, Motivational, and Social Origins |doi=10.1016/S0065-2601(10)43001-4 |url=https://www.sciencedirect.com/science/article/pii/S0065260110430014}}</ref>
|-
| 1980 || Memory bias || Concept development || The term "egocentric bias" is first coined by {{w|Anthony Greenwald}}, a psychologist at {{w|Ohio State University}}.<ref name=":1">{{Cite news|url=https://www.nytimes.com/1984/06/12/science/a-bias-puts-self-at-center-of-everything.html|title=A bias puts self at center of everything|last=Goleman|first=Daniel|date=1984-06-12|newspaper=The New York Times|access-date=2016-12-09}}</ref> || "The {{w|egocentric bias}} is a cognitive bias that causes people to rely too heavily on their own point of view when they examine events in their life or when they try to see things from other people’s perspective."<ref>{{cite web |title=The Egocentric Bias: Why It’s Hard to See Things from a Different Perspective |url=https://effectiviology.com/egocentric-bias/ |website=effectiviology.com |accessdate=16 July 2020}}</ref>
|-
| 1980 || Belief, decision-making and behavioral ({{w|truthiness}}) || Concept development || The term ''{{w|subjective validation}}'' first appears in the book ''{{w|The Psychology of the Psychic}}'' by {{w|David F. Marks}} and Richard Kammann.<ref>{{cite book|last1=Frazier|first1=Kendrick|title=Science Confronts the Paranormal|date=1986|publisher=Prometheus Books|isbn=|page=101}}</ref> || {{w|Subjective validation}} "causes an individual to consider a statement or another piece of information correct if it has any significance or personal meaning (validating their previous opinion) to them."<ref>{{cite web |title=Subjective Validation |url=https://www.alleydog.com/glossary/definition.php?term=Subjective+Validation |website=alleydog.com |accessdate=14 August 2020}}</ref>
|-
| 1980 || Belief, decision-making and behavioral || Concept development || The phenomenon of {{w|optimism bias}} is initially described by Weinstein, who finds that the majority of college students believe that their chances of developing a drinking problem or getting divorced are lower than their peers'.<ref>{{cite web |title=Understanding the Optimism Bias |url=https://www.verywellmind.com/what-is-the-optimism-bias-2795031 |website=verywellmind.com |access-date=15 January 2021}}</ref> || "Optimism Bias refers to the tendency for individuals to underestimate their probability of experiencing adverse effects despite the obvious."<ref>{{cite web |title=Optimism Bias - Biases & Heuristics |url=https://thedecisionlab.com/biases/optimism-bias/ |website=The Decision Lab |access-date=28 January 2021 |language=en-CA}}</ref>
|-
| 1981 || Social bias || Research || Tversky and Kahneman conduct a demonstration of the [[w:Framing effect (psychology)|framing effect]].<ref name="Framing">{{cite web |title=Framing Effect - an overview {{!}} ScienceDirect Topics |url=https://www.sciencedirect.com/topics/psychology/framing-effect |website=www.sciencedirect.com |access-date=29 January 2021}}</ref> || "The Framing effect is the principle that our choices are influenced by the way they are framed through different wordings, settings, and situations."<ref>{{cite web |title=Why do our decisions depend on how options are presented to us? |url=https://thedecisionlab.com/biases/framing-effect/ |website=thedecisionlab.com |access-date=16 January 2021}}</ref>
|-
| 1981 || Belief, decision-making and behavioral ({{w|prospect theory}}) || Concept development || The {{w|pseudocertainty effect}} is illustrated by {{w|Daniel Kahneman}}.<ref>{{cite journal |last1=Tversky |first1=A |last2=Kahneman |first2=D |title=The framing of decisions and the psychology of choice |journal=Science |date=30 January 1981 |volume=211 |issue=4481 |pages=453–458 |doi=10.1126/SCIENCE.7455683}}</ref> || "{{w|Pseudocertainty effect}} refers to people's tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes."<ref>{{cite web |title=Pseudocertainty effect |url=https://www.wiwi.europa-uni.de/de/lehrstuhl/fine/mikro/bilder_und_pdf-dateien/WS0910/VLBehEconomics/Ausarbeitungen/PseudocertaintyEeffect.doc#:~:text=Pseudocertainty%20effect%20refers%20to%20people's,choices%20to%20avoid%20negative%20outcomes.&text=Expounding%20on%20theories%20of%20decision,effects%20of%20certainty%20and%20pseudocertainty. |website=wiwi.europa-uni.de |accessdate=14 August 2020}}</ref>
|-
| 1982 || Social (egocentric bias) || Research || {{w|Trait ascription bias}}. In a study involving fifty-six undergraduate psychology students from the University of Bielefeld, Kammer et al. demonstrate that subjects rate their own variability on each of 20 trait terms to be considerably higher than their peers'.<ref name=kammer>{{cite journal |last=Kammer |first=D. |year=1982 |title=Differences in trait ascriptions to self and friend: Unconfounding intensity from variability |journal=Psychological Reports |volume=51 |issue=1 |pages=99–102 |doi=10.2466/pr0.1982.51.1.99 }}</ref> || "{{w|Trait ascription bias}} is the belief that other people's behavior and reactions are generally predictable while you yourself are more unpredictable."<ref>{{cite web |title=Trait Ascription Bias |url=https://www.alleydog.com/glossary/definition.php?term=Trait+Ascription+Bias |website=alleydog.com |accessdate=14 August 2020}}</ref>
 
|-
| 1982 || Belief, decision-making and behavioral ({{w|framing effect}}) || Research || The {{w|decoy effect}} is first demonstrated by Joel Huber and others at {{w|Duke University}}. The effect explains how when a customer is hesitating between two options, presenting them with a third “asymmetrically dominated” option that acts as a decoy will strongly influence which decision they make.<ref name="tactics.convertize.com">{{cite web |title=Decoy Effect definition |url=https://tactics.convertize.com/definitions/decoy-effect |website=tactics.convertize.com |access-date=14 January 2021}}</ref> || "The {{w|decoy effect}} is defined as the phenomenon whereby consumers change their preference between two options when presented with a third option."<ref>{{cite web |last1=Mortimer |first1=Gary |title=The decoy effect: how you are influenced to choose without really knowing it |url=https://theconversation.com/the-decoy-effect-how-you-are-influenced-to-choose-without-really-knowing-it-111259#:~:text=The%20decoy%20effect%20is%20defined,or%20%E2%80%9Casymmetric%20dominance%20effect%E2%80%9D. |website=The Conversation |access-date=29 January 2021 |language=en}}</ref>  
 
|-
| 1983 || Social ({{w|egocentric bias}}) || Concept development || Sociologist W. Phillips Davison first articulates the {{w|third-person effect}} hypothesis.<ref>{{cite journal |title=Third-Person Effect |journal=Encyclopedia of Survey Research Methods |date=2008 |doi=10.4135/9781412963947.n582}}</ref><ref>{{cite web |last1=Conners |first1=Joan L. |title=Understanding the Third-Person Effect |url=http://cscc.scu.edu/trends/v24/v24_2.pdf}}</ref> || {{w|Third-person effect}} refers to "the commonly held belief that other people are more affected, due to personal prejudices, by mass media than you yourself are. This view, largely due to a personal conceit, is caused by the self-concept of being more astute and aware than others, or of being less vulnerable to persuasion than others."<ref>{{cite web |title=Third-Person Effect |url=https://www.alleydog.com/glossary/definition.php?term=Third-Person+Effect |website=alleydog.com |accessdate=7 May 2020}}</ref>
|-
| 1985 || Belief, decision-making and behavioral (prospect theory) || Concept development || The {{w|disposition effect}} anomaly is identified and named by Hersh Shefrin and Meir Statman, who note that "people dislike incurring losses much more than they enjoy making gains, and people are willing to gamble in the domain of losses." Consequently, "investors will hold onto stocks that have lost value...and will be eager to sell stocks that have risen in value." The researchers coin the term "disposition effect" to describe this tendency of holding on to losing stocks too long and to sell off well-performing stocks too readily.<ref name="Behavioural Finance">{{cite web|title=Disposition Effect|website=Behavioural Finance|accessdate=11 January 2017|url=https://web.archive.org/web/20170324030730/http://disposition-effect.behaviouralfinance.net/}}</ref> || "The {{w|disposition effect}} refers to investors’ reluctance to sell assets that have lost value and greater likelihood of selling assets that have made gains."<ref>{{cite web |title=Disposition effect |url=https://www.behavioraleconomics.com/resources/mini-encyclopedia-of-be/disposition-effect/ |website=behavioraleconomics.com |accessdate=16 July 2020}}</ref>
 
|-
| 1985 || Belief, decision-making and behavioral (logical fallacy) || Concept development || The {{w|hot-hand fallacy}} is first described in a paper by {{w|Amos Tversky}}, {{w|Thomas Gilovich}}, and Robert Vallone.<ref>{{cite web |last1=Miller |first1=Joshua |last2=Sanjurjo |first2=Adam |title=Momentum Isn&rsquo;t Magic&mdash;Vindicating the Hot Hand with the Mathematics of Streaks |url=https://www.scientificamerican.com/article/momentum-isnt-magic-vindicating-the-hot-hand-with-the-mathematics-of-streaks/ |website=Scientific American |access-date=16 June 2021 |language=en}}</ref> || "The {{w|hot-hand fallacy}} effect refers to the tendency for people to expect streaks in sports performance to continue."<ref>{{cite web |title=Hot Hand Effect |url=http://psychology.iresearchnet.com/social-psychology/decision-making/hot-hand-effect/ |website=psychology.iresearchnet.com |accessdate=16 July 2020}}</ref>
 
|-
| 1986 || Memory bias || Research || McDaniel and Einstein describe the ''bizarreness effect'' as the finding that people have superior memory for bizarre sentences relative to common ones.<ref>{{cite web |last1=Geraci |first1=Lisa |last2=McDaniel |first2=Mark A. |last3=Miller |first3=Tyler M. |last4=Hughes |first4=Matthew L. |title=The bizarreness effect: evidence for the critical influence of retrieval processes |url=https://link.springer.com/article/10.3758/s13421-013-0335-4 |website=Memory & Cognition |pages=1228–1237 |language=en |doi=10.3758/s13421-013-0335-4 |date=2013-11-01}}</ref> In their paper, however, the researchers argue that bizarreness does not intrinsically enhance memory.<ref>{{cite journal |last1=Iaccino |first1=J. F. |last2=Sowa |first2=S. J. |date=February 1989 |title=Bizarre imagery in paired-associate learning: an effective mnemonic aid with mixed context, delayed testing, and self-paced conditions |volume=68 |issue=1 |pages=307–16 |pmid=2928063 |doi=10.2466/pms.1989.68.1.307 |journal=Percept mot Skills}}</ref><ref>{{cite web |title=The imagery bizarreness effect as a function of sentence complexity and presentation time |url=https://link.springer.com/content/pdf/10.3758/BF03334758.pdf |website=link.springer.com |access-date=18 June 2021}}</ref> || "The {{w|bizarreness effect}} holds that items associated with bizarre sentences or phrases are more readily recalled than those associated with common sentences or phrases."<ref>{{cite web |title=Bizarreness effect |url=https://www.britannica.com/topic/bizarreness-effect |website=britannica.com |accessdate=16 July 2020}}</ref>
|-
 
| 1988 || Belief, decision-making and behavioral || Research || In an experiment on [[w:Information bias (psychology)|information bias]] by Baron, Beattie, and Hershey, subjects consider a diagnostic problem involving fictitious diseases.<ref name="Baron2006">{{cite book|last=Baron|first=Jonathan|title=Thinking and Deciding|url=https://books.google.com/books?id=Fc5fQgAACAAJ|edition=4th|year=2006|publisher=Cambridge University Press|isbn=978-0-521-68043-1|page=177|chapter=Information bias and the value of information}}</ref> || "[[w:Information bias (psychology)|Information bias]] is any systematic difference from the truth that arises in the collection, recall, recording and handling of information in a study, including how missing data is dealt with."<ref>{{cite web |title=Information Bias |url=https://catalogofbias.org/biases/information-bias/#:~:text=Information%20bias%20is%20any%20systematic,recall%20bias%20and%20reporting%20bias. |website=catalogofbias.org |accessdate=22 September 2020}}</ref>
|-
| 1988 || Social || Concept development || The {{w|Reactive devaluation}} bias is proposed by {{w|Lee Ross}} and Constance Stillinger.<ref name=RossStillinger1988>Lee Ross, Constance A. Stillinger, "Psychological barriers to conflict resolution", Stanford Center on Conflict and Negotiation, Stanford University, 1988, [https://books.google.com/books?id=R2QrAQAAIAAJ&focus=searchwithinvolume&q=reactive p. 4]</ref> || "Reactive Devaluation is tendency to value the proposal of someone we recognized as an antagonist as being less interesting than if it was made by someone else."<ref>{{cite web |title=Why we often tend to devalue proposals made by people who we consider to be adversaries |url=https://thedecisionlab.com/biases/reactive-devaluation/ |website=thedecisionlab.com |accessdate=22 September 2020}}</ref>
|-
| 1990 || Belief, decision-making and behavioral ({{w|confirmation bias}}) || Concept development || The phenomenon known as “satisfaction of search” is first described, in which a radiologist fails to detect a second abnormality, apparently because of prematurely ceasing to search the images after detecting a “satisfying” finding.<ref>{{cite journal |last1=Bruno |first1=Michael A. |title=256 Shades of gray: uncertainty and diagnostic error in radiology |doi=10.1515/dx-2017-0006 |url=https://www.degruyter.com/view/journals/dx/4/3/article-p149.xml?language=en}}</ref> || "Satisfaction of search describes a situation in which the detection of one radiographic abnormality interferes with that of others."<ref>{{cite journal |last1=Ashman |first1=C. J. |last2=Yu |first2=J. S. |last3=Wolfman |first3=D. |title=Satisfaction of search in osteoradiology |journal=AJR. American journal of roentgenology |date=August 2000 |volume=175 |issue=2 |pages=541–544 |doi=10.2214/ajr.175.2.1750541 |url=https://pubmed.ncbi.nlm.nih.gov/10915712/ |access-date=27 January 2021 |issn=0361-803X}}</ref>
|-
| 1990 || || Literature || Jean-Paul Caverni, Jean-Marc Fabre and Michel Gonzalez publish ''Cognitive Biases''.<ref>{{cite web |title=Cognitive biases |url=https://catalog.library.vanderbilt.edu/discovery/fulldisplay/alma991024853679703276/01VAN_INST:vanui |website=catalog.library.vanderbilt.edu |access-date=25 July 2021 |language=en}}</ref> ||
 
|-
| 1991 || Social (egocentric bias) || Concept development || The term {{w|illusory superiority}} is first used by the researchers Van Yperen and Buunk.<ref>{{cite web |title=Self-Enhancement and Superiority Biases in Social Comparison |url=https://www.researchgate.net/publication/247505886_Self-Enhancement_and_Superiority_Biases_in_Social_Comparison |website=researchgate.net |accessdate=14 August 2020}}</ref> || {{w|Illusory superiority}} "indicates an individual who has a belief that they are somehow inherently superior to others".<ref>{{cite web |title=Illusory Superiority |url=https://www.alleydog.com/glossary/definition.php?term=Illusory+Superiority |website=alleydog.com |accessdate=7 May 2020}}</ref>
|-
| 1994 || Belief, decision-making and behavioral || Concept development || The term {{w|women are wonderful effect}} is coined by researchers {{w|Alice Eagly}} and {{w|Antonio Mladinic}} in a paper in which they question the widely held view that there is prejudice against women.<ref>{{cite web |title=“Women Are Wonderful” Effect |url=https://www.scribd.com/document/274926319/Women-Are-Wonderful-Effect |website=scribd.com |accessdate=10 April 2020}}</ref> || "The {{w|women are wonderful effect}} is a phenomenon found in psychological research in which people associate more positive attributes with women as compared to men."<ref>{{cite web |title=“women are wonderful” effect |url=https://crazyfacts.com/the-women-are-wonderful-effect-is-a-phenomenon-found-in-psychological-research/ |website=crazyfacts.com |accessdate=18 July 2020}}</ref>
 
|-
| 1994 || Belief, decision-making and behavioral (logical fallacy) || Research || Research by Fox, Rogers, and Tversky, based on a study of 32 professional options traders, provides evidence of the {{w|subadditivity effect}} in expert judgment.<ref name="Support theo">{{cite journal |last1=Tversky |first1=Amos |last2=Koehler |first2=Derek J. |title=Support theory: A nonextensional representation of subjective probability. |journal=Psychological Review |date=October 1994 |volume=101 |issue=4 |pages=547–567 |doi=10.1037/0033-295X.101.4.547}}</ref> || The {{w|subadditivity effect}} is "the tendency to judge probability of the whole to be less than the probabilities of the parts".<ref>{{cite web |title=Today's term from psychology is Subadditivity Effect. |url=https://steemit.com/life/@jevh/today-s-term-from-psychology-is-subadditivity-effect |website=steemit.com |accessdate=7 May 2020}}</ref>
 
|-
| 1995 || || Concept development || The concept of {{w|implicit bias}} is first described in a publication by Tony Greenwald and {{w|Mahzarin Banaji}}.<ref>{{cite web |title=PROJECT IMPLICIT LECTURES AND WORKSHOPS |url=https://www.projectimplicit.net/lectures.html |website=projectimplicit.net |accessdate=12 March 2020}}</ref> || "Research on {{w|implicit bias}} suggests that people can act on the basis of prejudice and stereotypes without intending to do so."<ref>{{cite web |title=Implicit Bias |url=https://plato.stanford.edu/entries/implicit-bias/ |website=plato.stanford.edu |accessdate=8 May 2020}}</ref>
 
|-
| 1996 || || Research || {{w|Daniel Kahneman}} and {{w|Amos Tversky}} argue that cognitive biases have practical implications for areas including clinical judgment, entrepreneurship, finance, and management.<ref>{{cite journal|author1=Kahneman, D. |author2=Tversky, A.  |last-author-amp=yes |title=On the reality of cognitive illusions|journal=Psychological Review|year=1996|volume=103|issue=3|pages=582–591|doi=10.1037/0033-295X.103.3.582|pmid=8759048|url=http://psy.ucsd.edu/%7Emckenzie/KahnemanTversky1996PsychRev.pdf}}</ref><ref name="S.X. Zhang and J. Cueto 2015">{{cite journal |author1=S.X. Zhang |author2=J. Cueto |title=The Study of Bias in Entrepreneurship |journal= Entrepreneurship Theory and Practice |volume=41 |issue=3 |pages=419–454 |doi= 10.1111/etap.12212  |year=2015 }}</ref> ||
 
|-
| 1998 || Belief, decision-making and behavioral || Research || Gilbert et al. report on the presence of {{w|impact bias}} in registered voters.<ref>{{cite journal |last1=Medway |first1=Dominic |last2=Foos |first2=Adrienne |last3=Goatman |first3=Anna |title=Impact bias in student evaluations of higher education |journal=Studies in Higher Education |doi=10.1080/03075079.2015.1071345 |url=https://www.tandfonline.com/doi/full/10.1080/03075079.2015.1071345 |accessdate=7 May 2020}}</ref> || "{{w|Impact bias}} refers to a human tendency to overestimate emotional responses to events and experiences."<ref>{{cite journal |last1=Medway |first1=Dominic |last2=Foos |first2=Adrienne |last3=Goatman |first3=Anna |title=Impact bias in student evaluations of higher education |journal=Studies in Higher Education |doi=10.1080/03075079.2015.1071345 |url=https://www.tandfonline.com/doi/full/10.1080/03075079.2015.1071345 |accessdate=7 May 2020}}</ref>
|-
| 1998 || || Concept development || The {{w|implicit-association test}} is introduced in the scientific literature by {{w|Anthony Greenwald}}, Debbie McGhee, and Jordan Schwartz.<ref name = "Greenwald 1998">{{Citation | title = Measuring Individual Differences in Implicit Cognition: The Implicit Association Test | year = 1998 | journal = Journal of Personality and Social Psychology | pages = 1464–1480 | volume = 74 | issue = 6 | last1 = Greenwald| first1 =  Anthony G. | last2 =  McGhee | first2 =  Debbie E. | last3 =  Schwartz | first3 =  Jordan L.K. | doi=10.1037/0022-3514.74.6.1464 | pmid=9654756}}</ref> It is a research method able to provide a range of new possibilities for those looking to conduct research exploring attitudes and beliefs.<ref>{{cite web |title=The Implicit Association Test (IAT) - iMotions |url=https://imotions.com/blog/implicit-association-test/ |website=Imotions Publish |access-date=17 May 2021 |language=en |date=15 December 2020}}</ref> || "The {{w|implicit-association test}} is a flexible task designed to tap automatic associations between concepts (e.g., math and arts) and attributes (e.g., good or bad, male or female, self or other)."<ref>{{cite web |title=Implicit Association Test |url=https://www.projectimplicit.net/nosek/iat/#:~:text=The%20Implicit%20Association%20Test%20is,female%2C%20self%20or%20other). |website=www.projectimplicit.net |access-date=17 May 2021}}</ref>
 
|-
| 1998 || Belief, decision-making and behavioral ({{w|extension neglect}}) || Concept development || Hsee discovers a less-is-better effect in three contexts: "(1) a person giving a $45 scarf (from scarves ranging from $5-$50) as a gift was perceived to be more generous than one giving a $55 coat (from coats ranging from $50-$500); (2) an overfilled ice cream serving with 7 oz of ice cream was valued more than an underfilled serving with 8 oz of ice cream; (3) a dinnerware set with 24 intact pieces was judged more favourably than one with 31 intact pieces (including the same 24) plus a few broken ones."<ref name="hsee">{{cite journal|last=Hsee|first=Christopher K.|title=Less Is Better: When Low-value Options Are Valued More Highly than High-value Options|journal=Journal of Behavioral Decision Making|year=1998|volume=11|issue=2|pages=107–121|doi=10.1002/(SICI)1099-0771(199806)11:2<107::AID-BDM292>3.0.CO;2-Y |url=http://faculty.chicagobooth.edu/christopher.hsee/vita/papers/LessIsBetter.pdf}}</ref> || "The {{w|less-is-better effect}} is the tendency to prefer the smaller or the lesser alternative when choosing individually, but not when evaluating together."<ref>{{cite web |title=Why we prefer the smaller or the lesser alternative |url=https://thedecisionlab.com/biases/less-is-better-effect/ |website=thedecisionlab.com |accessdate=7 May 2020}}</ref>
|-
| 2002 || || Research || {{w|Bystander effect}}. Research indicates that priming a social context may inhibit helping behavior. Imagining being around one other person or being around a group of people can affect a person's willingness to help.<ref>{{cite journal | last1 = Garcia | first1 = S.M. | last2 = Weaver | first2 = K. | last3 = Darley | first3 = J.M. | last4 = Moskowitz | first4 = G.B. | year = 2002 | title = Crowded minds: the implicit bystander effect | url = | journal = Journal of Personality and Social Psychology | volume = 83 | issue = 4| pages = 843–853 | doi=10.1037/0022-3514.83.4.843| pmid = 12374439 }}</ref> || "The bystander effect occurs when the presence of others discourages an individual from intervening in an emergency situation."<ref>{{cite web |title=Bystander Effect |url=https://www.psychologytoday.com/intl/basics/bystander-effect |website=psychologytoday.com |accessdate=7 May 2020}}</ref>
 
|-
| 2002 || Belief, decision-making and behavioral ({{w|prospect theory}}) || Recognition || {{w|Daniel Kahneman}} is awarded the {{w|Nobel Memorial Prize in Economic Sciences}} for his work on {{w|prospect theory}}. He is the first non-economist by profession to win the prize.<ref>{{cite web |title=Kahneman receives Nobel Prize at ceremony |url=https://www.princeton.edu/news/2002/12/10/kahneman-receives-nobel-prize-ceremony |website=Princeton University |access-date=16 June 2021 |language=en}}</ref><ref>{{cite web |title=Psychologist wins Nobel Prize |url=https://www.apa.org/monitor/dec02/nobel.html |website=www.apa.org |access-date=16 June 2021}}</ref> || "{{w|Prospect theory}} assumes that losses and gains are valued differently, and thus individuals make decisions based on perceived gains instead of perceived losses."<ref>{{cite web |last1=Chen |first1=James |title=Prospect Theory |url=https://www.investopedia.com/terms/p/prospecttheory.asp |website=Investopedia |access-date=16 June 2021 |language=en}}</ref>
 
|-
| 2003 || Belief, decision-making and behavioral || Concept development || The term ''{{w|projection bias}}'' is first introduced in the paper ''Projection Bias in Predicting Future Utility'' by Loewenstein, O'Donoghue and Rabin.<ref name=Frederick2011>{{cite book|last1=Frederick|first1=Shane|last2=Loewenstein|first2=George|last3=O'Donoghue|first3=Ted|editor1-last=Camerer|editor1-first=Colin F.|editor2-last=Loewenstein|editor2-first=George|editor3-last=Rabin|editor3-first=Matthew|title=Advances in Behavioral Economics|date=2011|publisher=Princeton University Press|isbn=978-1400829118|pages=187–188|chapter-url=https://books.google.com/books?id=sA4jJOjwCW4C&pg=PA187|language=en|chapter=Time Discounting and Time Preference: A Critical Review|ref=harv}}</ref> || {{w|Projection bias}} "refers to people’s assumption that their tastes or preferences will remain the same over time."<ref>{{cite web |title=Projection bias |url=https://www.behavioraleconomics.com/resources/mini-encyclopedia-of-be/projection-bias/ |website=behavioraleconomics.com |accessdate=7 May 2020}}</ref>
|-
| 2003 || || Concept development || Lovallo and Kahneman propose an expanded definition of {{w|planning fallacy}} as the tendency to underestimate the time, costs, and risks of future actions and at the same time overestimate the benefits of the same actions. According to this definition, the planning fallacy results in not only time overruns, but also {{w|cost overruns}} and {{w|benefit shortfall}}s.<ref>{{cite journal |last1=Lovallo |first1=Dan |first2=Daniel |last2=Kahneman  |date=July 2003 |title=Delusions of Success: How Optimism Undermines Executives' Decisions |journal=Harvard Business Review |volume=81 |issue=7 |pages=56–63|pmid=12858711 |url=https://hbr.org/2003/07/delusions-of-success-how-optimism-undermines-executives-decisions}}</ref> || "{{w|Planning fallacy}} refers to a prediction phenomenon, all too familiar to many, wherein people underestimate the time it will take to complete a future task, despite knowledge that previous tasks have generally taken longer than planned."<ref>{{cite journal |last1=Buehler |first1=Roger |last2=Griffin |first2=Dale |last3=Peetz |first3=Johanna |title=The Planning Fallacy |journal=Advances in Experimental Social Psychology |date=2010 |volume=43 |pages=1–62 |doi=10.1016/S0065-2601(10)43001-4}}</ref>
 
|-
| 2003 || Belief, decision-making and behavioral ([[w:framing effect (psychology)|framing effect]]) || Research || Johnson and Goldstein report on the [[w:framing effect (psychology)|framing effect]] playing a key role in the rate of organ donation.<ref name="Framing"/> || "The term {{w|framing effect}} refers to a phenomenon whereby the choices people make are systematically altered by the language used in the formulation of options."<ref>{{cite journal |last1=Kim |first1=S. |last2=Goldstein |first2=D. |last3=Hasher |first3=L. |last4=Zacks |first4=R. T. |title=Framing Effects in Younger and Older Adults |journal=The Journals of Gerontology Series B: Psychological Sciences and Social Sciences |date=1 July 2005 |volume=60 |issue=4 |pages=P215–P218 |doi=10.1093/geronb/60.4.P215}}</ref>         
 
|-
| 2004 || Social bias || Literature || American journalist {{w|James Surowiecki}} publishes ''{{w|The Wisdom of Crowds}}'', which explores herd mentality and draws the conclusion that the decisions made by groups are often better and more accurate than those made by any individual member.<ref name=sdf/> || "Herd mentality (also known as mob mentality) describes a behavior in which people act the same way or adopt similar behaviors as the people around them{{snd}}often ignoring their own feelings in the process."<ref name=sdf>{{cite web |title=4 examples of herd mentality (and how to take advantage of it) |url=https://www.iwillteachyoutoberich.com/blog/herd-mentality/#:~:text=Herd%20mentality%20(also%20known%20as,what%20the%20herd%20is%20doing. |website=iwillteachyoutoberich.com |access-date=27 January 2021}}</ref>
 
|-
| 2004 || || Literature || Rüdiger F. Pohl publishes ''Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory'', which provides an overview of research in the area.<ref>{{cite book |last=Pohl |first=Rüdiger F. |title=Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory |date=2004 |publisher=Psychology Press |isbn=978-1-84169-351-4 |url=https://books.google.com.ar/books/about/Cognitive_Illusions.html?id=k5gTes7yyWEC&source=kp_book_description&redir_esc=y |language=en}}</ref> ||
|-
| 2004 || Belief, decision-making and behavioral ([[w:Framing effect (psychology)|framing effect]]) || Concept development || The concept of the {{w|distinction bias}} is advanced by Christopher K. Hsee and Jiao Zhang of the {{w|University of Chicago}} as an explanation for differences in evaluations of options between joint evaluation mode and separate evaluation mode.<ref>{{cite journal |last1=Hsee |first1=Christopher K. |last2=Zhang |first2=Jiao |title=General Evaluability Theory |doi=10.1177/1745691610374586 |url=https://journals.sagepub.com/doi/10.1177/1745691610374586}}</ref> || {{w|Distinction bias}} is "the tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately." This bias is similar to the {{w|less-is-better effect}}, which is "the tendency to prefer a smaller set to a larger set judged separately, but not jointly."<ref name="dsaaaa">{{cite web |title=List of cognitive biases |url=https://uxinlux.github.io/cognitive-biases/ |website=uxinlux.github.io |access-date=25 July 2021 |language=en}}</ref>
 
|-
| 2005 || || Research || Haigh and List report on the [[w:framing effect (psychology)|framing effect]] playing a key role in stock market forecasting.<ref name="Framing"/> || "The framing effect is a type of cognitive bias that causes people to react to something in different ways depending on how the information is presented to them."<ref>{{cite web |last1=Marfice |first1=Christina |title=How to Use the Framing Effect to Sell More Products |url=https://www.plytix.com/blog/framing-effect |website=www.plytix.com |access-date=6 March 2021 |language=en-us}}</ref>
|-
| 2006 || || Organization || Overcoming Bias launches as a group blog on the "general theme of how to move our beliefs closer to reality, in the face of our natural biases such as overconfidence and wishful thinking, and our bias to believe we have corrected for such biases, when we have done no such thing."<ref>{{cite web |title=Overcoming Bias |url=http://www.overcomingbias.com/about |website=overcomingbias.com |accessdate=13 March 2020}}</ref> ||
 
|-
| 2006 || Belief, decision-making and behavioral || Concept development || The term ''{{w|ostrich effect}}'' is coined by Galai and Sade.<ref>{{cite journal |title=The "Ostrich Effect" and the Relationship between the Liquidity and the Yields of Financial Assets |journal=The Journal of Business |doi=10.2139/ssrn.431180}}</ref> || "The {{w|ostrich effect}} bias is a tendency to ignore dangerous or negative information by ignoring it or burying one's head in the sand."<ref>{{cite web |title=Ostrich Effect |url=https://www.thinkingcollaborative.com/stj/ostrich-effect/ |website=thinkingcollaborative.com |accessdate=8 May 2020}}</ref>
 
|-
| 2007 || Belief, decision-making and behavioral || Concept development || The term ''{{w|recency illusion}}'' is coined by {{w|Stanford University}} linguist {{w|Arnold Zwicky}}.<ref name="sssa">{{cite journal |last1=Rickford |first1=John R. |last2=Wasow |first2=Thomas |last3=Zwicky |first3=Arnold |date=2007 |title=Intensive and quotative ''all'': something new, something old |journal=American Speech |doi=10.1215/00031283-2007-001 |volume=82 |issue=1 |pages=3–31|doi-access=free }}</ref> || The {{w|recency illusion}} is "the belief or impression that a word or language usage is of recent origin when it is long-established."<ref name="sssa"/>
 
|-
| 2007 || Social (conformity bias) || Concept development || The concept of an “availability cascade” is defined by professors Timur Kuran and Cass Sunstein.<ref name="sddf">{{cite web |title=Climate Change 3: The Grand Narrative Availability Cascade is Making Us Stupid |url=https://www.americanexperiment.org/2016/11/the-grand-narrative-availability-cascade-is-making-us-stupid/ |website=americanexperiment.org |access-date=14 January 2021}}</ref> || Availability cascade refers to the "self-reinforcing process of collective belief formation by which an expressed perception triggers a chain reaction that gives the perception of increasing plausibility through its rising availability in public discourse."<ref name="sddf"/>
 
|-
| 2008 || Belief, decision-making and behavioral || Literature || Israeli-American author {{w|Dan Ariely}} publishes ''{{w|Predictably Irrational: The Hidden Forces That Shape Our Decisions}}'', which explores cognitive biases within the genre of {{w|behavioral economics}}.<ref>{{cite web |title=APA PsycNet |url=https://psycnet.apa.org/record/2008-04432-000 |website=psycnet.apa.org |access-date=28 July 2021 |language=en}}</ref> ||
|-
| 2008 || Social bias ({{w|association fallacy}}) || Concept development || The term {{w|cheerleader effect}} is coined by the character {{w|Barney Stinson}} in ''{{w|Not a Father's Day}}'', an episode of the television series ''{{w|How I Met Your Mother}}''. Barney points out to his friends a group of women that initially seem attractive, but who all seem to be very ugly when examined individually.<ref>{{cite web|url=https://www.theatlantic.com/business/archive/2013/11/cheerleader-effect-why-people-are-more-beautiful-in-groups/281119/|title=Cheerleader Effect: Why People Are More Beautiful in Groups|work={{w|The Atlantic}}|last=Hamblin|first=James|date=November 4, 2013|accessdate=December 5, 2015}}</ref> || "The {{w|cheerleader effect}} refers to the increase in attractiveness that an individual face experiences when seen in a group of other faces."<ref>{{cite journal |last1=Carragher |first1=Daniel J. |last2=Thomas |first2=Nicole A. |last3=Gwinn |first3=O. Scott |last4=Nicholls |first4=Mike E. R. |title=Limited evidence of hierarchical encoding in the cheerleader effect |url=https://www.nature.com/articles/s41598-019-45789-6}}</ref>
 
|-
| 2009 || Belief, decision-making and behavioral ({{w|framing effect}}) || Concept development || The concept of {{w|denomination effect}} is proposed by Priya Raghubir, professor at the {{w|New York University Stern School of Business}}, and Joydeep Srivastava, professor at [[w:University of Maryland, College Park|University of Maryland]], in their paper.<ref name="NPR">{{cite news|title=Why We Spend Coins Faster Than Bills|url=https://www.npr.org/templates/story/story.php?storyId=104063298|accessdate=7 April 2020|publisher=NPR|date=May 12, 2009}}</ref> || {{w|Denomination effect}} relates "to currency, whereby people are less likely to spend larger bills than their equivalent value in smaller bills."<ref>{{cite web |title=Denomination effect |url=http://nlpnotes.com/denomination-effect/ |website=nlpnotes.com |accessdate=7 May 2020}}</ref>
 
|-
| 2010 || Belief, decision-making and behavioral ({{w|confirmation bias}}) || Concept development || The term ''{{w|backfire effect}}'' is coined by American political scientists {{w|Brendan Nyhan}} and Jason Reifler.<ref>{{Cite web |last1=Nyhan |first1=Brendan |last2=Reifler |first2=Jason |title=When Corrections Fail: The Persistence of Political Misperceptions |url=http://www.dartmouth.edu/~nyhan/nyhan-reifler.pdf}}</ref> || "The backfire effect is a cognitive bias that causes people who encounter evidence that challenges their beliefs to reject that evidence, and to strengthen their support of their original stance."<ref>{{cite web |title=The Backfire Effect: Why Facts Don’t Always Change Minds – Effectiviology |url=https://effectiviology.com/backfire-effect-facts-dont-change-minds/ |website=effectiviology.com |access-date=27 January 2021}}</ref>
 
|-
| 2010 || Belief, decision-making and behavioral || Research || The ''Handbook of Social Psychology'' recognizes {{w|naïve realism}} as one of "four hard-won insights about [[w:Perception|human perception]], [[w:Thought|thinking]], {{w|motivation}} and {{w|behavior}} that... represent important, indeed foundational, contributions of {{w|social psychology}}."<ref>{{cite journal |last1=Ross |first1=Lee |last2=Lepper |first2=Mark |last3=Ward |first3=Andrew |title=History of Social Psychology: Insights, Challenges, and Contributions to Theory and Application |journal=Handbook of Social Psychology |date=30 June 2010 |pages=socpsy001001 |doi=10.1002/9780470561119.socpsy001001}}</ref> || "{{w|Naïve realism}} describes people’s tendency to believe that they perceive the social world “as it is”—as objective reality—rather than as a subjective construction and interpretation of reality."<ref>{{cite web |title=Naive Realism |url=http://psychology.iresearchnet.com/social-psychology/decision-making/naive-realism/ |website=psychology.iresearchnet.com |accessdate=17 July 2020}}</ref>
 
|-
| 2010 || Belief, decision-making and behavioral || Research || In a study looking at computer use and musculoskeletal symptoms, Chang et al. investigate information bias in the self-reporting of personal computer use. Over a period of three weeks, young adults report the duration of computer use each day, as well as musculoskeletal symptoms. Usage-monitoring software installed on participants' computers provides the reference measure. Results show that the relationship between daily self-reported and software-recorded computer-use duration varies greatly across subjects, with [[w:Spearman's rank correlation coefficient|Spearman's correlations]] ranging from -0.22 to 0.8. Self-reports generally overestimate computer use when software-recorded durations are below 3.6 hours, and underestimate it when they are above 3.6 hours.<ref>{{cite journal |last1=Chang |first1=Che-hsu Joe |last2=Menéndez |first2=Cammie Chaumont |last3=Robertson |first3=Michelle M. |last4=Amick |first4=Benjamin C. |last5=Johnson |first5=Peter W. |last6=del Pino |first6=Rosa J. |last7=Dennerlein |first7=Jack T. |title=Daily self-reports resulted in information bias when assessing exposure duration to computer use |journal=American Journal of Industrial Medicine |date=November 2010 |volume=53 |issue=11 |pages=1142–1149 |doi=10.1002/ajim.20878}}</ref><ref>{{cite web |title=Information bias |url=https://catalogofbias.org/biases/information-bias/ |website=Catalog of Bias |access-date=25 July 2021 |language=en |date=13 November 2019}}</ref> || "[[w:Information bias (psychology)|Information bias]] is any systematic difference from the truth that arises in the collection, recall, recording and handling of information in a study, including how missing data is dealt with."<ref>{{cite web |title=Information Bias |url=https://catalogofbias.org/biases/information-bias/#:~:text=Information%20bias%20is%20any%20systematic,recall%20bias%20and%20reporting%20bias. |website=catalogofbias.org |accessdate=22 September 2020}}</ref>
 
|-
| 2010 || || Literature || Sebastian Serfas publishes ''Cognitive Biases in the Capital Investment Context: Theoretical Considerations and Empirical Experiments on Violations of Normative Rationality'', which shows how cognitive biases systematically affect and distort capital investment-related decision making and business judgements.<ref>{{cite book |last1=Serfas |first1=Sebastian |title=Cognitive Biases in the Capital Investment Context: Theoretical Considerations and Empirical Experiments on Violations of Normative Rationality |date=6 December 2010 |publisher=Springer Science & Business Media |isbn=978-3-8349-6485-4 |url=https://books.google.com.ar/books/about/Cognitive_Biases_in_the_Capital_Investme.html?id=i7OJWje1JgQC&source=kp_book_description&redir_esc=y |language=en}}</ref>
|-
| 2011 || Belief, decision-making and behavioral || Concept development || The {{w|IKEA effect}} is identified and named by {{w|Michael I. Norton}} of {{w|Harvard Business School}}, Daniel Mochon of {{w|Yale}}, and {{w|Dan Ariely}} of {{w|Duke University}}, who publish the results of three studies in this year.<ref>{{cite web |title=Cognitive Biases — The IKEA Effect |url=https://medium.com/@michaelgearon/cognitive-biases-the-ikea-effect-d994ea6a28ad |website=medium.com |accessdate=14 August 2020}}</ref> || "The [IKEA effect] is the cognitive phenomena where customers get more excited and place a higher value in the products they have partially created, modified or personalized."<ref>{{cite web |title=What is the Ikea Effect? |url=https://www.bloomreach.com/en/blog/2019/08/ikea-effect.html |website=bloomreach.com |accessdate=7 May 2020}}</ref>
|-
| 2011 || || Literature || {{w|Daniel Kahneman}} publishes ''{{w|Thinking, Fast and Slow}}'', which covers cognitive biases, in addition to his work in other fields.<ref>{{cite web |title=Thinking, Fast and Slow |url=https://www.goodreads.com/book/show/11468377-thinking-fast-and-slow |website=www.goodreads.com |access-date=16 June 2021}}</ref> ||
|-
| 2011 || Memory bias || Concept development || The {{w|Google effect}}, also known as “digital amnesia”, is first described by Betsy Sparrow from {{w|Columbia University}} and her colleagues. Their paper describes the results of several memory experiments involving technology.<ref name="thecustomer.net">{{cite web |title=Marketers Need To Be Aware Of Cognitive Bias |url=https://thecustomer.net/marketers-need-to-be-aware-of-cognitive-bias/?cn-reloaded=1 |website=thecustomer.net |accessdate=12 March 2020}}</ref><ref name="Columbia">{{cite web|title=Study Finds That Memory Works Differently in the Age of Google |publisher={{w|Columbia University}}|date=July 14, 2011|url=https://web.archive.org/web/20110717092619/http://news.columbia.edu/research/2490}}</ref> || The {{w|Google effect}} "represents people’s tendency to forget information that they can find online, particularly by using search engines such as {{w|Google}}."<ref>{{cite web |title=The Google Effect and Digital Amnesia: How We Use Machines to Remember |url=https://effectiviology.com/the-google-effect-and-digital-amnesia/#:~:text=Summary%20and%20conclusions-,The%20Google%20effect%20is%20a%20psychological%20phenomenon%20that%20represents%20people's,search%20engines%20such%20as%20Google. |website=effectiviology.com |accessdate=16 July 2020}}</ref>
 
|-
| 2011 || Belief, decision-making and behavioral || Notable case || The {{w|look-elsewhere effect}}, more generally known in statistics as the {{w|problem of multiple comparisons}}, gains some media attention in the context of the search for the {{w|Higgs boson}} at the {{w|Large Hadron Collider}}.<ref>{{cite web|url=http://blogs.telegraph.co.uk/news/tomchiversscience/100123873/an-unconfirmed-sighting-of-the-elusive-higgs-boson/|title=An unconfirmed sighting of the elusive Higgs boson|author=Tom Chivers|date=2011-12-13|publisher=Daily Telegraph}}</ref> An illustrative simulation of the underlying multiple-comparisons problem appears below the table. || The {{w|look-elsewhere effect}} "occurs when a statistically significant observation is found but, actually, arose by chance and due to the size of the parameter space and sample observed."<ref>{{cite web |title=When a statistically significant observation should be overlooked. |url=https://thedecisionlab.com/biases/look-elsewhere-effect/ |website=thedecisionlab.com |accessdate=7 May 2020}}</ref>
 
|-
| 2011 || || Literature || American neuroscientist {{w|Dean Buonomano}} publishes ''Brain Bugs: How the Brain's Flaws Shape Our Lives'', which examines how the brain's inherent flaws shape everyday life.<ref>{{cite book |last1=Buonomano |first1=Dean |title=Brain Bugs: How the Brain's Flaws Shape Our Lives |date=11 July 2011 |publisher=W. W. Norton & Company |isbn=978-0-393-08195-4 |url=https://books.google.com.ar/books/about/Brain_Bugs_How_the_Brain_s_Flaws_Shape_O.html?id=eAKIcDmhBuEC&source=kp_book_description&redir_esc=y |language=en}}</ref> ||
|-
| 2012 || Belief, decision-making and behavioral (logical fallacy) || Research || An article in ''{{w|Psychological Bulletin}}'' suggests that the {{w|subadditivity effect}} can be explained by an {{w|information-theoretic}} generative mechanism that assumes a noisy conversion of objective evidence (observation) into subjective estimates (judgment).<ref name="HilbertPsychBull">{{cite journal|last1=Hilbert|first1=Martin|title=Toward a synthesis of cognitive biases: How noisy information processing can bias human decision making|journal=Psychological Bulletin|date=2012|volume=138|issue=2|pages=211–237|doi=10.1037/a0025940|pmid=22122235|url=https://web.archive.org/web/20160304023236/http://www.martinhilbert.net/HilbertPsychBull.pdf}}</ref> || The {{w|subadditivity effect}} is "the tendency to judge probability of the whole to be less than the probabilities of the parts".<ref>{{cite web |title=Today's term from psychology is Subadditivity Effect. |url=https://steemit.com/life/@jevh/today-s-term-from-psychology-is-subadditivity-effect |website=steemit.com |accessdate=7 May 2020}}</ref>
|-
| 2013 (February 12) || || Literature || American psychologist {{w|Mahzarin Banaji}} publishes ''Blindspot: Hidden Biases of Good People'', which explains the science that shapes our likes and dislikes and our judgments about people’s character, abilities and potential. The book uses the {{w|implicit-association test}}, an assessment that measures attitudes and beliefs that people may be unwilling or unable to report.<ref>{{cite book |last1=Banaji |first1=Mahzarin R. |title=Blindspot: Hidden Biases of Good People |date=18 April 2014 |publisher=Penguin Books Limited |isbn=978-81-8475-930-3 |url=https://books.google.com.ar/books/about/Blindspot.html?id=r0A7_joFYewC&source=kp_book_description&redir_esc=y |language=en}}</ref> ||
 
|-
| 2013 || Belief, decision-making and behavioral || Concept development || The term “{{w|end-of-history illusion}}” originates in a journal article by psychologists Jordi Quoidbach, [[w:Daniel Gilbert (psychologist)|Daniel Gilbert]], and {{w|Timothy Wilson}} detailing their research on the phenomenon and leveraging the phrase coined by [[w:The End of History and the Last Man|Francis Fukuyama's 1992 book of the same name]].<ref name="Quoidbach2013">{{cite journal |last1= Quoidbach |first1= Jordi |last2= Gilbert |first2= Daniel T.|last3= Wilson |first3= Timothy D. |date= 2013-01-04 |title= The End of History Illusion |journal= [[w:Science (journal)|Science]] |volume= 339 |issue= 6115 |pages= 96–98 |doi= 10.1126/science.1229294 |pmid= 23288539|quote= Young people, middle-aged people, and older people all believed they had changed a lot in the past but would change relatively little in the future.|url= https://web.archive.org/web/20130113214951/http://www.wjh.harvard.edu/~dtg/Quoidbach%20et%20al%202013.pdf |archivedate= 2013-01-13}}</ref> || The {{w|end-of-history illusion}} occurs "when people tend to underestimate how much they will change in the future.”<ref>{{cite web |title=Why You Won’t Be the Person You Expect to Be |url=https://www.nytimes.com/2013/01/04/science/study-in-science-shows-end-of-history-illusion.html |website=nytimes.com |accessdate=7 May 2020}}</ref>
 
|-
| 2013 || || Literature || Swiss writer {{w|Rolf Dobelli}} publishes ''{{w|The Art of Thinking Clearly}}'', which describes the most common thinking errors, ranging from cognitive biases to envy and social distortions.<ref>{{cite web |title=The Art of Thinking Clearly |url=http://xqdoc.imedao.com/166eb7278f3556e3fe9dc3ef.pdf |website=xqdoc.imedao.com |access-date=28 July 2021}}</ref>
|-
| 2016 || || Literature || Adrian Nantchev publishes ''50 Cognitive Biases for an Unfair Advantage in Entrepreneurship''.<ref>{{cite book |last1=Nantchev |first1=Adrian |title=50 Cognitive Biases for an Unfair Advantage in Entrepreneurship |publisher=CreateSpace Independent Publishing Platform |isbn=978-1-5376-0327-8 |url=https://books.google.com.ar/books/about/50_Cognitive_Biases_for_an_Unfair_Advant.html?id=yY4pvgAACAAJ&source=kp_book_description&redir_esc=y |language=en}}</ref>
|-
| 2019 || || Literature || Henry Priest publishes ''Biases and Heuristics: The Complete Collection of Cognitive Biases and Heuristics That Impair Decisions in Banking, Finance and Everything Else''.<ref>{{cite book |last1=Priest |first1=Henry |title=BIASES and HEURISTICS: The Complete Collection of Cognitive Biases and Heuristics That Impair Decisions in Banking, Finance and Everything Else |publisher=Amazon Digital Services LLC - KDP Print US |isbn=978-1-0784-3231-3 |url=https://books.google.com.ar/books/about/BIASES_and_HEURISTICS.html?id=z4RWxwEACAAJ&source=kp_book_description&redir_esc=y |language=en}}</ref>
 
|-
|}
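
As an illustrative aside, the minimal Python simulation below sketches the {{w|problem of multiple comparisons}} behind the look-elsewhere effect in the 2011 entry above: when many independent null hypotheses are tested at the 5% level, at least one of them will almost certainly look "significant" by chance. The threshold, number of tests, and number of simulated experiments are arbitrary values chosen only for illustration.

<syntaxhighlight lang="python">
# Minimal simulation of the multiple-comparisons problem behind the
# look-elsewhere effect: when many independent null hypotheses are tested,
# at least one of them will almost certainly appear "significant" by chance.
import random

random.seed(0)

ALPHA = 0.05       # per-test significance threshold (illustrative choice)
N_TESTS = 100      # number of independent places we "look" (illustrative choice)
N_RUNS = 10_000    # number of simulated experiments

runs_with_false_alarm = 0
for _ in range(N_RUNS):
    # Under the null hypothesis, each p-value is uniformly distributed on [0, 1).
    p_values = [random.random() for _ in range(N_TESTS)]
    if min(p_values) < ALPHA:
        runs_with_false_alarm += 1

print(f"Simulated P(at least one p < {ALPHA} across {N_TESTS} null tests): "
      f"{runs_with_false_alarm / N_RUNS:.3f}")
print(f"Analytical value: {1 - (1 - ALPHA) ** N_TESTS:.3f}")  # about 0.994
</syntaxhighlight>

This inflation of chance "discoveries" is why searches such as the Higgs boson analyses apply a trials-factor ("look-elsewhere") correction before claiming a detection.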
== Visual and numerical data ==

=== Mentions on Google Scholar ===

The following table summarizes per-year mentions on Google Scholar as of May 17, 2021.

{| class="sortable wikitable"
! Year
! Overconfidence Bias
! Self Serving Bias
! Herd Mentality
! Loss Aversion
! Framing Cognitive Bias
! Narrative Fallacy
! Anchoring Bias
! Confirmation Bias
! Hindsight Bias
! Representativeness Heuristic
|-
| 1980 || 89 || 3,060 || 102 || 1,830 || 134 || 390 || 221 || 2,150 || 420 || 136
|-
| 1985 || 144 || 3,570 || 137 || 2,500 || 311 || 557 || 320 || 2,560 || 583 || 226
|-
| 1990 || 234 || 6,410 || 268 || 3,810 || 779 || 958 || 584 || 4,780 || 1,010 || 414
|-
| 1995 || 428 || 10,600 || 502 || 5,040 || 1,610 || 1,560 || 1,100 || 7,070 || 1,660 || 539
|-
| 2000 || 824 || 18,500 || 745 || 8,590 || 3,010 || 2,550 || 1,960 || 12,400 || 2,970 || 832
|-
| 2002 || 1,090 || 20,700 || 1,020 || 11,200 || 3,850 || 2,390 || 2,560 || 12,400 || 3,430 || 898
|-
| 2004 || 1,700 || 24,200 || 1,160 || 14,000 || 5,120 || 3,300 || 3,370 || 16,200 || 4,200 || 1,130
|-
| 2006 || 2,050 || 27,300 || 1,220 || 16,900 || 6,470 || 3,570 || 4,090 || 20,500 || 4,660 || 1,500
|-
| 2008|| 2,650 || 32,300 || 1,520 || 20,700 || 8,220 || 4,690 || 5,040 || 25,600 || 5,500 || 1,580
|-
| 2010 || 3,350 || 36,700 || 1,810 || 25,500 || 10,700 || 5,320 || 6,220 || 31,300 || 6,280 || 2,270
|-
| 2012|| 4,500 || 40,100 || 2,140 || 29,200 || 13,900 || 6,180 || 7,910 || 38,500 || 7,310 || 2,820 
|-
| 2014 || 5,300 || 42,400 || 2,260 || 31,800 || 17,800 || 8,890 || 9,230 || 43,800 || 8,070 || 3,440 
|-
| 2016 || 6,020 || 42,600 || 2,390 || 31,600 || 19,900 || 9,160 || 10,600 || 45,100 || 8,790 || 3,700 
|-
| 2017 || 6,760 || 41,600 || 2,210 || 31,000 || 21,900 || 9,570 || 11,300 || 40,300 || 9,010 || 4,090 
|-
| 2018 || 7,500 || 39,700 || 2,360 || 31,200 || 23,200 || 10,300 || 12,500 || 42,200 || 9,650 || 4,300 
|-
| 2019 || 8,290 || 33,800 || 2,330 || 29,700 || 24,000 || 10,200 || 13,200 || 35,400 || 7,990 || 4,490 
|-
| 2020 || 9,110 || 30,100 || 2,670 || 28,000 || 25,500 || 10,200 || 15,200 || 32,500 || 9,300 || 4,590   
|-
|}
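
As a rough illustration, the short Python sketch below plots two of the series from the table above using matplotlib; the numbers are copied directly from the table, and the output filename is a hypothetical choice.

<syntaxhighlight lang="python">
# Plot two of the per-year Google Scholar mention counts from the table above.
import matplotlib.pyplot as plt

years = [1980, 1985, 1990, 1995, 2000, 2002, 2004, 2006, 2008,
         2010, 2012, 2014, 2016, 2017, 2018, 2019, 2020]
confirmation_bias = [2150, 2560, 4780, 7070, 12400, 12400, 16200, 20500, 25600,
                     31300, 38500, 43800, 45100, 40300, 42200, 35400, 32500]
loss_aversion = [1830, 2500, 3810, 5040, 8590, 11200, 14000, 16900, 20700,
                 25500, 29200, 31800, 31600, 31000, 31200, 29700, 28000]

plt.plot(years, confirmation_bias, marker="o", label="Confirmation bias")
plt.plot(years, loss_aversion, marker="o", label="Loss aversion")
plt.xlabel("Year")
plt.ylabel("Mentions on Google Scholar")
plt.title("Google Scholar mentions per year (as of May 17, 2021)")
plt.legend()
plt.savefig("cognitive_bias_mentions.png")  # hypothetical output filename
</syntaxhighlight>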
[[File:Cognitive biases.png|thumb|center|700px]]

=== Google Trends ===

The chart below shows Google Trends data for cognitive biases (topic) from January 2004 to January 2021, when the screenshot was taken.<ref>{{cite web |title=Cognitive biases |url=https://trends.google.com/trends/explore?date=all&q=Cognitive%20biases |website=trends.google.com |access-date=15 January 2021}}</ref>

[[File:Cognitive biases gtrends.jpeg|thumb|center|700px]]
=== Google Ngram Viewer ===

The chart below shows Google Ngram Viewer data for "cognitive bias", from 1972 (when the concept was introduced) to 2019.<ref>{{cite web |title=Google Books Ngram Viewer |url=https://books.google.com/ngrams/graph?content=cognitive+bias&year_start=1972&year_end=2019&corpus=26&smoothing=3 |website=books.google.com |access-date=28 January 2021 |language=en}}</ref>

[[File:Cognitive bias ngram.png|thumb|center|700px]]
=== Wikipedia Views ===

The chart below shows pageviews of the English Wikipedia article {{w|cognitive bias}}, from July 2015 to December 2020.<ref>{{cite web |title=Cognitive biases |url=https://wikipediaviews.org/displayviewsformultiplemonths.php?page=Cognitive+biases&allmonths=allmonths-api&language=en&drilldown=all |website=wikipediaviews.org |access-date=19 January 2021}}</ref>

[[File:Cognitive biases wv.jpeg|thumb|center|450px]]
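
For readers who want the underlying numbers rather than the screenshot, the sketch below shows one way the monthly pageview counts could be retrieved, assuming the Wikimedia Pageviews REST API endpoint and its documented response format; the User-Agent string is a placeholder.

<syntaxhighlight lang="python">
# Fetch monthly pageviews of the English Wikipedia article "Cognitive bias"
# for July 2015 through December 2020 from the Wikimedia Pageviews REST API.
import json
import urllib.request

URL = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
       "en.wikipedia/all-access/all-agents/Cognitive_bias/monthly/"
       "2015070100/2020123100")

request = urllib.request.Request(URL, headers={"User-Agent": "timeline-example/0.1"})
with urllib.request.urlopen(request) as response:
    items = json.load(response)["items"]

for item in items:
    # Timestamps are returned as YYYYMMDDHH strings, e.g. "2015070100".
    print(item["timestamp"][:6], item["views"])
</syntaxhighlight>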
  
 
==Meta information on the timeline==

===What the timeline is still missing===
  
* [http://cognitivebiasoftheday.com/list]
* Issa: This is probably going to take a whole bunch of work, but eventually it would be nice if the rows containing specific studies that were conducted could mention whether the study has been replicated or not.
* [https://onlinelibrary.wiley.com/doi/full/10.1002/9781119125563.evpsych241]
 
  
 
===Timeline update strategy===
  
 
==See also==
* [[Timeline of the rationality community]]
  
 
==External links==

Full timeline

Year Bias type Event type Details Concept definition (when applicable)
c.180 CE Social bias Concept development Many philosophers and social theorists observe and consider the phenomenon of belief in a just world, going back to at least as early as the Pyrrhonist philosopher Sextus Empiricus, writing circa 180 CE, who argues against this belief.[2] "The just-world hypothesis is the belief that people get what they deserve since life is fair."[3]
1747 Research Scottish doctor James Lind conducts the first systematic clinical trial.[4] "Clinical trials are research studies performed in people that are aimed at evaluating a medical, surgical, or behavioral intervention."[5]
1753 Concept development Anthropomorphism is first attested, originally in reference to the heresy of applying a human form to the Christian God.[6][7] Anthropomorphism is "the interpretation of nonhuman things or events in terms of human characteristics".[8]
1776–1799 Concept development The declinism belief is traced back to Edward Gibbon's work The History of the Decline and Fall of the Roman Empire,[9] where Edward Gibbon argues that Rome collapsed due to the gradual loss of civic virtue among its citizens.[10] Declinism is "the tendency to believe that the worst is to come".[11]
1796 Literature French scholar Pierre-Simon Laplace describes in A Philosophical Essay on Probabilities the ways in which men calculate their probability of having sons: "I have seen men, ardently desirous of having a son, who could learn only with anxiety of the births of boys in the month when they expected to become fathers. Imagining that the ratio of these births to those of girls ought to be the same at the end of each month, they judged that the boys already born would render more probable the births next of girls." The expectant fathers feared that if more sons were born in the surrounding community, then they themselves would be more likely to have a daughter. This essay by Laplace is regarded as one of the earliest descriptions of the fallacy.[12] "The Gambler's Fallacy is the misconception that something that has not happened for a long time has become 'overdue', such a coin coming up heads after a series of tails."[13]
1847 Concept development Hungarian physician Ignaz Semmelweis discovers that hand washing and disinfecting at hospitals dramatically reduces infection and death in paients. His hand-washing suggestions are at the beginning rejected by his contemporaries, often for non-medical reasons. This would give birth to the concept of Semmelweis effect, which is a metaphor for the reflex-like tendency to reject new evidence or new knowledge because it contradicts established norms, beliefs, or paradigms.[14] Semmelweis effect "refers to the tendency to automatically reject new information or knowledge because it contradicts current thinking or beliefs."[15]
1848 Social (conformity bias) Concept development The phrase "jump on the bandwagon" first appears in American politics when enterteiner Dan Rice uses his bandwagon and its music to gain attention for his political campaign appearances. As his campaign becomes more successful, other politicians would strive for a seat on the bandwagon, hoping to be associated with his success. This preludes the emergence of the term bandwagon effect, which is later coined in the early 20th century.[16] Bandwagon effect "is a psychological phenomenon whereby people do something primarily because other people are doing it, regardless of their own beliefs, which they may ignore or override."[17]
1850 Concept development The first reference to “stereotype” appears as a noun that means “image perpetuated without change.”[18] Stereotype refers to "a widely held but fixed and oversimplified image or idea of a particular type of person or thing"[19]
1860 Concept development Both Weber's law and Fechner's law are published by Gustav Theodor Fechner in the work Elemente der Psychophysik (Elements of Psychophysics). This publication is the first work ever in this field, and where Fechner coins the term psychophysics to describe the interdisciplinary study of how humans perceive physical magnitudes.[20] Weber–Fechner law "states that the change in a stimulus that will be just noticeable is a constant ratio of the original stimulus."[21]
1866 Belief, decision-making and behavioral (apophenia) Concept development The German word pareidolie is used in German articles by Dr. Karl Ludwig Kahlbaum in his paper On Delusion of the Senses.[22] Pareidolia is "the tendency to perceive a specific, often meaningful image in a random or ambiguous visual pattern."[23]
1874 Memory bias Research The first documented instance of cryptomnesia occurs with the medium Stainton Moses.[24][25] Cryptomnesia is "an implicit memory phenomenon in which people mistakenly believe that a current thought or idea is a product of their own creation when, in fact, they have encountered it previously and then forgotten it".[26]
1876 Memory bias Research German experimental psychologist Gustav Fechner conducts the earliest known research on the mere-exposure effect.[27] Mere-exposure effect "means that people prefer things that they are most familiar with".[28] It is "the tendency to express undue liking for things merely because of familiarity with them."[29]
1882 Concept development The term specious present is first introduced by the philosopher E. R. Clay.[30][31] Specious present "is the time duration wherein a state of consciousness is experienced as being in the present".[32]
1885 Memory bias Concept development The phenomenon of spacing effect is first identified by Hermann Ebbinghaus, and his detailed study of it is published in his book Über das Gedächtnis. Untersuchungen zur experimentellen Psychologie (Memory: A Contribution to Experimental Psychology). "The spacing effect describes the robust finding that long-term learning is promoted when learning events are spaced out in time, rather than presented in immediate succession".[33]
1890 Memory bias Concept development The tip of the tongue phenomenon is first described as a psychological phenomenon in the text The Principles of Psychology by William James.[34] Tip of the tongue describes "a state in which one cannot quite recall a familiar word but can recall words of similar form and meaning".[35]
1893 Memory bias Concept development Childhood amnesia is first formally reported by psychologist Caroline Miles in her article A study of individual psychology by the American Journal of Psychology.[36] Childhood amnesia "refers to the fact that most people cannot remember events that occurred before the age of 3 or 4".[37]
1906 Social (conformity bias) Concept development The first known use of bandwagon effect occurs in this year.[38] "Bandwagon effect is when an idea or belief is being followed because everyone seems to be doing so."[39]
1906 Social bias Research American sociologist William Sumner posits that humans are a species that join together in groups by their very nature. However, he also maintains that humans have an innate tendency to favor their own group over others, proclaiming how "each group nourishes its own pride and vanity, boasts itself superior, exists in its own divinities, and looks with contempt on outsiders".[40] In-group favoritism is "the tendency to favor members of one's own group over those in other groups".[41]
1909 Memory bias Concept development The first documented empirical studies on the testing effect are published by Edwina E. Abbott.[42][43] "Testing effect is the finding that long-term memory is often increased when some of the learning period is devoted to retrieving the to-be-remembered information."[44]
1913 Concept development The term "Monte Carlo fallacy" (also known as Gambler's fallacy) originates from the best known example of the phenomenon, which occurs in the Monte Carlo Casino.[45] Gambler's fallacy "occurs when an individual erroneously believes that a certain random event is less likely or more likely, given a previous event or a series of events."[46]
1914 Memory bias Concept development The first research on the cross-race effect is published.[47] Cross-race effect is "the tendency for eyewitnesses to be better at recognizing members of their own race/ethnicity than members of other races."[48]
1920 Social bias Concept development The halo effect is named by psychologist Edward Thorndike[49] in reference to a person being perceived as having a halo. He gives the phenomenon its name in his article A Constant Error in Psychological Ratings.[50] In "Constant Error", Thorndike sets out to replicate the study in hopes of pinning down the bias that he thought was present in these ratings. Subsequent researchers would study it in relation to attractiveness and its bearing on the judicial and educational systems.[51] Thorndike originally coins the term referring only to people; however, its use would be greatly expanded especially in the area of brand marketing.[50] Halo effect refers to an "error in reasoning in which an impression formed from a single trait or characteristic is allowed to influence multiple judgments or ratings of unrelated factors."[52]
1922 Concept development The term “stereotype” is first used in the modern psychological sense by American journalist Walter Lippmann in his work Public Opinion.[18] "Stereotype is most frequently now employed to refer to an often unfair and untrue belief that many people have about all people or things with a particular characteristic."[53]
1927 Memory bias Research Lithuanian-Soviet psychologist Bluma Zeigarnik at the University of Berlin first describes the phenomenon that would be later known as Zeigarnik effect.[54][55][56] Zeigarnik effect is the "tendency to remember interrupted or incomplete tasks or events more easily than tasks that have been completed."[57]
1928 Belief, decision-making and behavioral Literature American economist Irving Fisher publishes The Money Illusion, which develops the concept of the same name.[58] "Money illusion posits that people have a tendency to view their wealth and income in nominal dollar terms, rather than recognize its real value, adjusted for inflation."[59]
1930 Concept development English epistemologist C. D. Broad further elaborates on the concept of the specious present and states that it may be considered as the temporal equivalent of a sensory datum.[31] "The specious present is a term applied to that short duration of time the human mind appears to be able to experience, a period which exists between past and future and which is longer than the singular moment of the actual present."[60]
1932 Memory bias Research Some of the earliest evidence for the Fading Affect Bias dates back to a study by Cason, who conducts a study using a retrospective procedure where participants recall and rate past events and emotion when prompted finds that recalled emotional intensity for positive events is generally stronger than that of negative events.[61] The Fading Affect Bias "indicates that the emotional response prompted by positive memories often tends to be stronger than the emotional response prompted by negative memories."[62]
1933 Memory bias Concept development The Von Restorff effect theory is coined by German psychiatrist and pediatrician Hedwig von Restorff, who, in her study, finds that when participants are presented with a list of categorically similar items with one distinctive, isolated item on the list, memory for the item is improved.[63] "It predicts that when multiple similar objects are present, the one that differs from the rest is most likely to be remembered."[64]
1942 Concept development The Einstellung effect is first described by Dr. Abraham Luchins.[65] "The Einstellung Effect is a type of mindset that causes humans to repeat the use of "tried and true" strategies for problem solving, even when a simpler solution strategy exists."[66]
1945 Belief, decision-making and behavioral (anchoring bias) Concept development Karl Duncker defines functional fixedness as being a "mental block against using an object in a new way that is required to solve a problem".[67] Functional fixedness "is the inability to realize that something known to have a particular use may also be used to perform other functions."[68]
1946 Belief, decision-making and behavioral (logical fallacy) Concept development American statistician Joseph Berkson illustrates what would be later known as Berkson's paradox, one of the most famous paradoxes in probability and statistics.[69] Berkson's bias or fallacy, is a type of selection bias. Berkson's paradox "is a type of selection bias – a mathematical result found in the fields of conditional probability and statistics in which two variables can be negatively correlated even though they have the appearance of being positively correlated within the population."[70]
1947 Belief, decision-making and behavioral (extension neglect) Concept development Joseph Stalin is credited by some for having introduced the concept of compassion fade with his statement “the death of one man is a tragedy, the death of millions is a statistic”.[71] However, this introduction is considered to be misattributed by others.[72] Compassion fade "refers to the decrease in the compassion one shows for the people in trouble as the number of the victims increase."[73]
1952 Social (conformity bias) Concept development William H. Whyte Jr. derives the term groupthink from George Orwell's Nineteen Eighty-Four and popularizes it in Fortune magazine:
Groupthink being a coinage – and, admittedly, a loaded one – a working definition is in order. We are not talking about mere instinctive conformity – it is, after all, a perennial failing of mankind. What we are talking about is a rationalized conformity – an open, articulate philosophy which holds that group values are not only expedient but right and good as well.[74][75]
"Groupthink is a psychological phenomenon in which people strive for consensus within a group."[76]
1954 Social bias Concept development The social comparison theory is initially proposed by social psychologist Leon Festinger. It centers on the belief that there is a drive within individuals to gain accurate self-evaluations.[77] The social comparison theory refers to "the idea that individuals determine their own social and personal worth based on how they stack up against others".[78]
1956 Concept development The term "Barnum effect" is coined by psychologist Paul Meehl in his essay Wanted – A Good Cookbook, because he relates the vague personality descriptions used in certain "pseudo-successful" psychological tests to those given by showman P. T. Barnum.[79][80] Barnum effect is "the phenomenon that occurs when individuals believe that personality descriptions apply specifically to them (more so than to other people), despite the fact that the description is actually filled with information that applies to everyone."[81]
1957 Concept development British naval historian C. Northcote Parkinson describes what is later called Parkinson's law of triviality, which argues that members of an organization give disproportionate weight to trivial issues.[82] Parkinson's law of triviality (also known as the bike-shed effect) "explains that people will give more energy and focus to trivial or unimportant items than to more important and complex ones."[83]
1960 Belief, decision-making and behavioral Concept development English psychologist Peter Cathcart Wason first describes the confirmation bias.[84][85][86] "Confirmation bias is the tendency of people to favor information that confirms their existing beliefs or hypotheses."[87]
1960 Belief, decision-making and behavioral (confirmation bias) Concept development Peter Cathcart Wason discovers the classic example of subjects' congruence bias.[88] Congruence bias is "the tendency to test hypotheses exclusively through direct testing, instead of considering possible alternatives."[89]
1961 Social bias Research The Milgram experiment is conducted. This classic experiment establishes the existence of authority bias.[90] "Authority bias is the human tendency to attribute greater authority and knowledge to persons of authority (fame, power, position, etc.) than they may actually possess."[91]
1961 Ambiguity effect Concept development The ambiguity effect is first described by American economist Daniel Ellsberg.[92] "Ambiguity Effect occurs when people prefer options with known probabilities over those with unknown probabilities."[93]
1964 Memory bias Concept development The original work on the telescoping effect is usually attributed to an article by Neter and Waksberg in the Journal of the American Statistical Association.[94] The term telescoping comes from the idea that time seems to shrink toward the present in the way that the distance to objects seems to shrink when they are viewed through a telescope.[94] "The telescoping effect refers to inaccurate perceptions regarding time, where people see recent events as more remote than they are (backward telescoping), and remote events as more recent (forward telescoping).[95]
1964 Belief, decision-making and behavioral (anchoring bias) Concept development The first recorded statement of the concept of Law of the instrument is Abraham Kaplan's: "I call it the law of the instrument, and it may be formulated as follows: Give a small boy a hammer, and he will find that everything he encounters needs pounding."[96] "The law of the instrument principle states that when we acquire a specific tool/skill, we tend to be to see opportunities to use that tool/skill everywhere."[97]
1966 Social (egocentric bias) Research Walster hypothesizes that it can be frightening to believe that a misfortune could happen to anyone at random, and attributing responsibility to the person(s) involved helps to manage this emotional reaction.[98] "The defensive attribution hypothesis is a social psychology term that describes an attributional approach taken by some people - a set of beliefs that an individual uses to protect or "shield" themselves against fears of being the victim or cause of a major mishap."[99]
1967 Belief, decision-making and behavioral Notable case Risk compensation. Sweden experiences a drop in crashes and fatalities, following the change from driving on the left to driving on the right. This is linked to the increased apparent risk. The number of motor insurance claims goes down by 40%, returning to normal over the next six weeks.[100][101] Fatality levels would take two years to return to normal.[102] "Risk compensation postulates that humans have a built-in level of acceptable risk-taking and that our behaviour adjusts to this level in a homeostatic manner".[103]
1967 Belief, decision-making and behavioral (apophenia) Concept development Illusory correlation is originally coined by Chapman and Chapman to describe people's tendencies to overestimate relationships between two groups when distinctive and unusual information is presented.[104]"[105] An illusory correlation occurs when a person perceives a relationship between two variables that are not in fact correlated.[106]
1967 Social (attribution bias) Research American social psychologist Edward E. Jones and Victor Harris conduct a classic experiment[107] that would later give rise to the phrase Fundamental attribution error, coined by Lee Ross.[108] Fundamental attribution error "is the tendency for people to over-emphasize dispositional, or personality-based explanations for behaviors observed in others while under-emphasizing situational explanations".[109]
1968 Belief, decision-making and behavioral (anchoring bias) Concept development American psychologist Ward Edwards discusses the concept of conservatism (belief revision) bias.[110] "Conservatism bias is a mental process in which people maintain their past views or predictions at the cost of recognizing new information."[111]
1968 Social Concept development German-born American psychologist Robert Rosenthal and Lenore Jacobsen first describe what would be called Pygmalion Effect (also called the Galatea effect).[112] Pygmalion Effect "refers to the phenomenon of people improving their performance when others have high expectations of them."[113]
1969 Social (cognitive dissonance) Concept development Researchers confirm the Ben Franklin effect.[114] The Ben Franklin effect refers to "an altruistic reaction that makes a person more likely to do a favor for someone that they have already completed a favor for; more likely than they are to return a favor to someone who has completed a favor for them."[115]
1969 Memory bias Research Crowder and Morton argue that the suffix effect is a reflection of the contribution of the auditory sensory memory or echoic memory to recall in the nonsuffix control condition.[116] "The suffix effect is the selective impairment in recall of the final items of a spoken list when the list is followed by a nominally irrelevant speech item, or suffix."[117]
1971 Social bias Concept development The concept of actor–observer asymmetry (also actor–observer bias) is introduced by Jones and Nisbett. It explains the errors that one makes when forming attributions about the behavior of others.[118] The actor–observer asymmetry "states that people tend to explain their own behavior with situation causes and other people's behavior with person causes".[119]
1972 Concept development The concept of cognitive bias is introduced in this year through the work of researchers Amos Tversky and Daniel Kahneman.[120] Cognitive bias refers to "people's systematic but purportedly flawed patterns of responses to judgment and decision problems."[121]
1973 Memory bias Concept development American academic Baruch Fischhoff attends a seminar where Paul E. Meehl states an observation that clinicians often overestimate their ability to have foreseen the outcome of a particular case, as they claim to have known it all along.[122] "Hindsight bias, the tendency, upon learning an outcome of an event—such as an experiment, a sporting event, a military decision, or a political election—to overestimate one's ability to have foreseen the outcome."[123]
1973 Belief, decision-making and behavioral (egocentric bias) Concept development The illusion of validity bias is first described by Amos Tversky and Daniel Kahneman in their paper.[124] The illusion of validity occurs when an individual overestimates their ability to predict an outcome when analyzing a set of data - especially when the data appears to have a consistent pattern or appears to 'tell a story".[125]
1973 Memory bias Concept development The next-in-line effect is first studied experimentally by Malcolm Brenner. In his experiment the participants were each in turn reading a word aloud from an index card, and after 25 words were asked to recall as many of all the read words as possible. The results of the experiment show that words read aloud within approximately nine seconds before the subject's own turn are recalled worse than other words.[126] "Next-in-line effect. people not remembering what other people said because they were too busy rehearsing their own part."[127]
1974 Memory bias Research Elizabeth Loftus and John Palmer conduct a study to investigate the effects of language on the development of false memory.[128] "False memory refers to cases in which people remember events differently from the way they happened or, in the most dramatic case, remember events that never happened at all."[129]
1974 Belief, decision-making and behavioral Concept development Anchoring is first described by Tversky and Kahneman.[130] "Anchoring bias occurs when people rely too much on pre-existing information or the first information they find when making decisions."[131]
1975 Social (attribution bias) Research Miller and Ross conduct a study that is one of the earliest to assess not only self-serving bias but also the attributions for successes and failures within this theory.[132] Self-serving bias is the common habit of a person taking credit for positive events or outcomes, but blaming outside factors for negative events."[133]
1976 Belief, decision-making and behavioral (logical fallacy) Concept development Escalation of commitment is first described by Barry M. Staw in his paper Knee deep in the big muddy: A study of escalating commitment to a chosen course of action.[134] Escalation of commitment "refers to the irrational behavior of investing additional resources in a failing project."[135]
1976 Social (attribution bias) Research Prior to Pettigrew's formalization of the ultimate attribution error, Birt Duncan finds that White participants view Black individuals as more violent than White individuals in an "ambiguous shove" situation, where a Black or White person accidentally shoves a White person.[136] "The tendency for persons from one group (the ingroup) to determine that any bad acts by members of an outgroup—for example, a racial or ethnic minority group—are caused by internal attributes or traits rather than by outside circumstances or situations, while viewing their positive behaviors as merely exceptions to the rule or the result of luck."[137]
1977 Memory bias Research Misattribution of memory. Early research done by Brown and Kulik finds that flashbulb memories are similar to photographs because they can be described in accurate, vivid detail. In this study, participants describe their circumstances about the moment they learned of the assassination of President John F. Kennedy as well as other similar traumatic events. Participants are able to describe what they were doing, things around them, and other details.[138] Misattribution of memory occurs "when a memory is distorted because of the source, context, or our imagination."[139]
1977 Social (egocentric bias) Concept development A study conducted by Lee Ross and colleagues provides early evidence for a cognitive bias called the false consensus effect, which is the tendency for people to overestimate the extent to which others share the same views.[140] The false-consensus effect "refers to the tendency to overestimate consensus for one′s attitudes and behaviors."[141][142] It is "the tendency to assume that one’s own opinions, beliefs, attributes, or behaviors are more widely shared than is actually the case."[143]
1977 Belief, decision-making and behavioral (truthiness) Concept development The illusory truth effect is first identified in a study at Villanova University and Temple University.[144][145] The illusory truth effect "occurs when repeating a statement increases the belief that it’s true even when the statement is actually false."[146]
1977 Memory bias Research T. B. Rogers and colleagues publish the first research on the self-reference effect.[147][148] "The self-reference effect refers to people’s tendency to better remember information when that information has been linked to the self than when it has not been linked to the self."[149]
1978 Memory bias Research Loftus, Miller, and Burns conduct the original misinformation effect study.[150] The misinformation effect "happens when a person's memory becomes less accurate due to information that happens after the event."[151]
1979 Social (attribution bias) Research Thomas Nagel identifies four kinds of moral luck in his essay.[152] "Moral luck occurs when the features of action which generate a particular moral assessment lie significantly beyond the control of the agent who is so assessed."[153]
1979 Social bias Concept development The ultimate attribution error is first established by Thomas F. Pettigrew in his publication The Ultimate Attribution Error: Extending Allport's Cognitive Analysis of Prejudice.[154] "Ultimate attribution error refers to the tendency of individuals to make less internal attributions of negative behaviors committed by ingroup members compared to outgroup members."[155]
1979 Social bias Concept development David Kahneman and Amos Tversky originally coin the term loss aversion in a landmark paper on subjective probability.[156] "Loss aversion is a cognitive bias that suggests that for individuals the pain of losing is psychologically twice as powerful as the pleasure of gaining."[157]
1979 Belief, decision-making and behavioral Concept development The planning fallacy is first proposed by Daniel Kahneman and Amos Tversky.[158][159] "The planning fallacy refers to a prediction phenomenon, all too familiar to many, wherein people underestimate the time it will take to complete a future task, despite knowledge that previous tasks have generally taken longer than planned"[160]
1980 Memory bias Concept development The term "egocentric bias" is first coined by Anthony Greenwald, a psychologist at Ohio State University.[161] "The egocentric bias is a cognitive bias that causes people to rely too heavily on their own point of view when they examine events in their life or when they try to see things from other people’s perspective."[162]
1980 Social bias Concept development Ruth Hamill, Richard E. Nisbett, and Timothy DeCamp Wilson become the first to study the first type of group attribution error in detail in their paper Insensitivity to Sample Bias: Generalizing From Atypical Cases.[163] Group attribution error is "the tendency for perceivers to assume that a specific group member’s personal characteristics and preferences, including beliefs, attitudes, and decisions, are similar to those of the group to which he or she belongs."[164]
1980 Belief, decision-making and behavioral (truthiness) Concept development The term subjective validation first appears in the book The Psychology of the Psychic by David F. Marks and Richard Kammann.[165] Subjective validation "causes an individual to consider a statement or another piece of information correct if it has any significance or personal meaning (validating their previous opinion) to them."[166]
1980 Belief, decision-making and behavioral Concept development The phenomenon of optimism bias is initially described by Weinstein, who finds that the majority of college students believe that their chances of developing a drinking problem or getting divorced are lower than their peers'.[167] "Optimism Bias refers to the tendency for individuals to underestimate their probability of experiencing adverse effects despite the obvious."[168]
1981 Social bias Research Tversky and Kahneman conduct a demonstration of the framing effect.[169] "The Framing effect is the principle that our choices are influenced by the way they are framed through different wordings, settings, and situations."[170]
1981 Belief, decision-making and behavioral (prospect theory) Concept development The pseudocertainty effect is illustrated by Daniel Kahneman.[171] "Pseudocertainty effect refers to people's tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes."[172]
1982 Social (egocentric bias) Research Trait ascription bias. In a study involving fifty-six undergraduate psychology students from the University of Bielefeld, Kammer et al. demonstrate that subjects rate their own variability on each of 20 trait terms to be considerably higher than their peers'.[173] "Trait ascription bias is the belief that other people's behavior and reactions are generally predictable while you yourself are more unpredictable."[174]
1982 Belief, decision-making and behavioral (framing effect) Research The decoy effect is first demonstrated by Joel Huber and others at Duke University. The effect explains how when a customer is hesitating between two options, presenting them with a third “asymmetrically dominated” option that acts as a decoy will strongly influence which decision they make.[175] "The decoy effect is defined as the phenomenon whereby consumers change their preference between two options when presented with a third option."[176]
1983 Social (egocentric bias) Concept development Sociologist W. Phillips Davison first articulates the third-person effect hypothesis.[177][178] Third-person effect refers to "the commonly held belief that other people are more affected, due to personal prejudices, by mass media than you yourself are. This view, largely due to a personal conceit, is caused by the self-concept of being more astute and aware than others, or of being less vulnerable to persuasion than others."[179]
1983 Social (conformity bias) Research Jones reports the presence of courtesy bias in Asian cultures.[180] "Courtesy bias is the tendency that some individuals have of not fully stating their unhappiness with a service or product because of a desire not to offend the person or organization that they are responding to."[181]
1985 Belief, decision-making and behavioral (prospect theory) Concept development The disposition effect anomaly is identified and named by Hersh Shefrin and Meir Statman, who note that "people dislike incurring losses much more than they enjoy making gains, and people are willing to gamble in the domain of losses." Consequently, "investors will hold onto stocks that have lost value...and will be eager to sell stocks that have risen in value." The researchers coin the term "disposition effect" to describe this tendency of holding on to losing stocks too long and to sell off well-performing stocks too readily.[182] "The disposition effect refers to investors’ reluctance to sell assets that have lost value and greater likelihood of selling assets that have made gains."[183]
1985 Belief, decision-making and behavioral (logical fallacy) Concept development The hot-hand fallacy is first described in a paper by Amos Tversky, Thomas Gilovich, and Robert Vallone.[184] "The hot-hand fallacy effect refers to the tendency for people to expect streaks in sports performance to continue."[185]
1986 Memory bias Research McDaniel and Einstein describe the bizarreness effect as the finding that people have superior memory for bizarre sentences relative to common ones.[186] However, in their paper the researchers argue that bizarreness does not intrinsically enhance memory.[187][188] "The bizarreness effect holds that items associated with bizarre sentences or phrases are more readily recalled than those associated with common sentences or phrases."[189]
1988 Social Concept development The Reactive devaluation bias is proposed by Lee Ross and Constance Stillinger.[190] "Reactive Devaluation is tendency to value the proposal of someone we recognized as an antagonist as being less interesting than if it was made by someone else."[191]
1988 Belief, decision-making and behavioral (prospect theory) Research Samuelson and Zeckhauser demonstrate status quo bias using a questionnaire in which subjects faced a series of decision problems, which were alternately framed to be with and without a pre-existing status quo position. Subjects tended to remain with the status quo when such a position was offered to them.[192] "Status quo bias refers to the phenomenon of preferring that one's environment and situation remain as they already are."[193]
1989 Belief, decision-making and behavioral Concept development The term "curse of knowledge" is coined in a Journal of Political Economy article by economists Colin Camerer, George Loewenstein, and Martin Weber. The curse of knowledge causes people to fail to account for the fact that others don't know the same things that they do.[194]
1990 Belief, decision-making and behavioral (prospect theory) Research Kahneman, Knetsch and Thaler publish a paper containing the first experimental test of the Endowment Effect.[195] It refers to an emotional bias that causes individuals to value an owned object higher, often irrationally, than its market value.
1990 Belief, decision-making and behavioral (confirmation bias) Concept development The phenomenon known as “satisfaction of search” is first described, in which a radiologist fails to detect a second abnormality, apparently because of prematurely ceasing to search the images after detecting a “satisfying” finding.[196] "Satisfaction of search describes a situation in which the detection of one radiographic abnormality interferes with that of others."[197]
1990 Literature Jean-Paul Caverni, Jean-Marc Fabre and Michel Gonzalez publish Cognitive Biases.[198]
1991 Social (egocentric bias) Concept development The term illusory superiority is first used by the researchers Van Yperen and Buunk.[199] Illusory superiority "indicates an individual who has a belief that they are somehow inherently superior to others".[200]
1991 Social (conformity bias) Research Marín and Marín report courtesy bias to be common in Hispanic cultures.[180] The "Courtesy Bias is the reluctance of an individual to give negative feedback for fear of offending."[201]
1994 Belief, decision-making and behavioral Concept development The term "women are wonderful effect" is coined by researchers Alice Eagly and Antonio Mladinic in a paper questioning the widely held view that there was prejudice against women.[202] "The women are wonderful effect is a phenomenon found in psychological research in which people associate more positive attributes with women as compared to men."[203]
1994 Belief, decision-making and behavioral (logical fallacy) Research Research by Fox, Rogers, and Tversky provides evidence of the subadditivity effect in expert judgment, after having investigated 32 professional options traders.[204] The subadditivity effect is "the tendency to judge probability of the whole to be less than the probabilities of the parts".[205]
1995 Concept development Implicit bias is first described in a publication by Anthony Greenwald and Mahzarin Banaji.[206] "Research on implicit bias suggests that people can act on the basis of prejudice and stereotypes without intending to do so."[207]
1996 Research Daniel Kahneman and Amos Tversky argue that cognitive biases have significant practical implications for areas including clinical judgment, entrepreneurship, finance, and management.[208][209]
1998 Belief, decision-making and behavioral Research Gilbert et al. report on the presence of impact bias in registered voters.[210] "Impact bias refers to a human tendency to overestimate emotional responses to events and experiences."[211]
1998 Concept development The implicit-association test is introduced in the scientific literature by Anthony Greenwald, Debbie McGhee, and Jordan Schwartz.[212] The method opens new possibilities for researchers exploring attitudes and beliefs, particularly those that respondents may be unwilling or unable to report (a simplified scoring sketch appears after the timeline).[213] "The implicit-association test is a flexible task designed to tap automatic associations between concepts (e.g., math and arts) and attributes (e.g., good or bad, male or female, self or other)."[214]
1998 Belief, decision-making and behavioral (extension neglect) Concept development Hsee discovers a less-is-better effect in three contexts: "(1) a person giving a $45 scarf (from scarves ranging from $5-$50) as a gift was perceived to be more generous than one giving a $55 coat (from coats ranging from $50-$500); (2) an overfilled ice cream serving with 7 oz of ice cream was valued more than an underfilled serving with 8 oz of ice cream; (3) a dinnerware set with 24 intact pieces was judged more favourably than one with 31 intact pieces (including the same 24) plus a few broken ones."[215] "The less-is-better effect is the tendency to prefer the smaller or the lesser alternative when choosing individually, but not when evaluating together."[216]
1999 Belief, decision-making and behavioral Concept development The psychological phenomenon of illusory superiority known as the Dunning–Kruger effect is identified as a form of cognitive bias in Kruger and Dunning's 1999 study, Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments.[217] "The Dunning-Kruger effect is a cognitive bias in which people wrongly overestimate their knowledge or ability in a specific area."[218]
1999 Memory bias Concept development The term "spotlight effect" is coined by Thomas Gilovich and Kenneth Savitsky; it first appears in the psychological literature in the journal Current Directions in Psychological Science.[219] "The spotlight effect refers to the tendency to think that more people notice something about you than they do."[220]
1999 Social (egocentric bias) Concept development Kruger and Gilovich publish study titled Naive cynicism in everyday theories of responsibility assessment: On biased assumptions of bias, which formally introduces the concept of naïve cynicism.[221] Naïve cynicism is "the tendency of laypeople to expect other people’s judgments will have a motivational basis and therefore will be biased in the direction of their self-interest."[222]
2001 Belief, decision-making and behavioral (framing effect) Research Druckman shows that economic policies receive higher support when framed in terms of employment rates rather than unemployment rates.[225] "The Framing Effect is a cognitive bias that explains how we react differently to things depending on how they are presented to us."[226]
2002 Belief, decision-making and behavioral Concept development Daniel Kahneman and Shane Frederick propose the process of attribute substitution.[223] "Attribute substitution occurs when an individual has to make a judgment (of a target attribute) that is computationally complex, and instead substitutes a more easily calculated heuristic attribute."[224]
2002 Social (egocentric bias) Concept development Pronin et al. introduce the concept of "bias blind spot".[227] Bias blind spot "refers to the tendency for people to be able to identify distortionary biases in others, while being ignorant of and susceptible to precisely these biases in their own thinking."[227]
2002 Research Bystander effect: research indicates that priming a social context may inhibit helping behavior; imagining being around one other person or around a group of people can affect a person's willingness to help.[228] "The bystander effect occurs when the presence of others discourages an individual from intervening in an emergency situation."[229]
2002 Belief, decision-making and behavioral (prospect theory) Recognition Daniel Kahneman is awarded the Nobel Memorial Prize in Economic Sciences for his work on prospect theory, developed with Amos Tversky. He is the first non-economist by profession to win the prize.[230][231] "Prospect theory assumes that losses and gains are valued differently, and thus individuals make decisions based on perceived gains instead of perceived losses."[232] (An illustrative sketch of the prospect-theory value function appears after the timeline.)
2003 Belief, decision-making and behavioral Concept development The term projection bias is first introduced in the paper Projection Bias in Predicting Future Utility by Loewenstein, O'Donoghue and Rabin.[233] Projection bias "refers to people’s assumption that their tastes or preferences will remain the same over time."[234]
2003 Concept development Lovallo and Kahneman propose an expanded definition of planning fallacy as the tendency to underestimate the time, costs, and risks of future actions and at the same time overestimate the benefits of the same actions. According to this definition, the planning fallacy results in not only time overruns, but also cost overruns and benefit shortfalls.[235] "Planning fallacy refers to a prediction phenomenon, all too familiar to many, wherein people underestimate the time it will take to complete a future task, despite knowledge that previous tasks have generally taken longer than planned."[236]
2003 Belief, decision-making and behavioral (framing effect) Research Johnson and Goldstein report on the framing effect playing a key role in the rate of organ donation.[169] "The term framing effect refers to a phenomenon whereby the choices people make are systematically altered by the language used in the formulation of options."[237]
2004 Social bias Literature American journalist James Surowiecki publishes The Wisdom of Crowds, which explores herd mentality and draws the conclusion that the decisions made by groups are often better and more accurate than those made by any individual member.[238] "Herd mentality (also known as mob mentality) describes a behavior in which people act the same way or adopt similar behaviors as the people around them – often ignoring their own feelings in the process."[238]
2004 Literature Rüdiger F. Pohl publishes Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory, which provides an overview of research in the area.[239]
2004 Belief, decision-making and behavioral (framing effect) Concept development The concept of the distinction bias is advanced by Christopher K. Hsee and Jiao Zhang of the University of Chicago as an explanation for differences in evaluations of options between joint evaluation mode and separate evaluation mode.[240] Distinction bias is "the tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately." This bias is similar to the less-is-better effect, which is "the tendency to prefer a smaller set to a larger set judged separately, but not jointly."[29]
2005 Research Haigh and List report on the framing effect playing a key role in stock market forecasting.[169] "The framing effect is a type of cognitive bias that causes people to react to something in different ways depending on how the information is presented to them."[241]
2006 Organization Overcoming Bias launches as a group blog on the "general theme of how to move our beliefs closer to reality, in the face of our natural biases such as overconfidence and wishful thinking, and our bias to believe we have corrected for such biases, when we have done no such thing."[242]
2006 Belief, decision-making and behavioral Concept development The term "ostrich effect" is coined by Dan Galai and Orly Sade.[243] "The ostrich effect bias is a tendency to ignore dangerous or negative information by ignoring it or burying one's head in the sand."[244]
2007 Belief, decision-making and behavioral Concept development The term recency illusion is coined by Stanford University linguist Arnold Zwicky.[245] The recency illusion is "the belief or impression that a word or language usage is of recent origin when it is long-established."[245]
2007 Social (conformity bias) Concept development The concept of an “availability cascade” is defined by professors Timur Kuran and Cass Sunstein.[246] Availability cascade refers to the "self-reinforcing process of collective belief formation by which an expressed perception triggers a chain reaction that gives the perception of increasing plausibility through its rising availability in public discourse."[246]
2008 Belief, decision-making and behavioral Literature Israeli-American author Dan Ariely publishes Predictably Irrational: The Hidden Forces That Shape Our Decisions, which explores cognitive biases within the genre of behavioral economics.[247]
2008 Social bias (association fallacy) Concept development The term cheerleader effect is coined by the character Barney Stinson in Not a Father's Day, an episode of the television series How I Met Your Mother, in which Barney points out a group of women who seem attractive as a group but far less so when examined individually.[248] "The cheerleader effect refers to the increase in attractiveness that an individual face experiences when seen in a group of other faces."[249]
2009 Belief, decision-making and behavioral (framing effect) Concept development The concept of denomination effect is proposed by Priya Raghubir, professor at the New York University Stern School of Business, and Joydeep Srivastava, professor at University of Maryland, in their paper.[250] Denomination effect relates "to currency, whereby people are less likely to spend larger bills than their equivalent value in smaller bills."[251]
2010 Belief, decision-making and behavioral (confirmation bias) Concept development The term "backfire effect" is coined by American political scientists Brendan Nyhan and Jason Reifler.[252] "The backfire effect is a cognitive bias that causes people who encounter evidence that challenges their beliefs to reject that evidence, and to strengthen their support of their original stance."[253]
2010 Belief, decision-making and behavioral Research The Handbook of Social Psychology recognizes naïve realism as one of "four hard-won insights about human perception, thinking, motivation and behavior that... represent important, indeed foundational, contributions of social psychology."[254] "Naïve realism describes people’s tendency to believe that they perceive the social world “as it is”—as objective reality—rather than as a subjective construction and interpretation of reality."[255]
2010 Belief, decision-making and behavioral Research In a study looking at computer use and musculoskeletal symptoms, Chang et al. investigate information bias in the self-reporting of personal computer use. Over a period of three weeks, young adults report their daily duration of computer use as well as musculoskeletal symptoms, while usage-monitoring software installed on participants’ computers provides the reference measure. The relationship between daily self-reported and software-recorded computer-use durations varies greatly across subjects, with Spearman's correlations ranging from -0.22 to 0.8; self-reports generally overestimate computer use when software-recorded durations are below 3.6 hours and underestimate it when they are above 3.6 hours (an illustrative sketch appears after the timeline).[256][257] "Information bias is any systematic difference from the truth that arises in the collection, recall, recording and handling of information in a study, including how missing data is dealt with."[258]
2010 Literature Sebastian Serfas publishes Cognitive Biases in the Capital Investment Context: Theoretical Considerations and Empirical Experiments on Violations of Normative Rationality, which shows how cognitive biases systematically affect and distort capital investment-related decision making and business judgements.[259]
2011 Belief, decision-making and behavioral Concept development The IKEA effect is identified and named by Michael I. Norton of Harvard Business School, Daniel Mochon of Yale, and Dan Ariely of Duke University, who publish the results of three studies in this year.[260] "The [IKEA effect] is the cognitive phenomena where customers get more excited and place a higher value in the products they have partially created, modified or personalized."[261]
2011 Literature Daniel Kahneman publishes Thinking, Fast and Slow, which covers cognitive biases, in addition to his work in other fields.[262]
2011 Memory bias Concept development The Google effect, also known as “digital amnesia”, is first described by Betsy Sparrow from Columbia University and her colleagues. Their paper describes the results of several memory experiments involving technology.[263][264] The Google effect "represents people’s tendency to forget information that they can find online, particularly by using search engines such as Google."[265]
2011 Belief, decision-making and behavioral Notable case The look-elsewhere effect, known more generally in statistics as the problem of multiple comparisons, gains media attention in the context of the search for the Higgs boson at the Large Hadron Collider (an illustrative simulation appears after the timeline).[266] The look-elsewhere effect "occurs when a statistically significant observation is found but, actually, arose by chance and due to the size of the parameter space and sample observed."[267]
2011 Literature American neuroscientist Dean Buonomano publishes Brain Bugs: How the Brain's Flaws Shape Our Lives, which attempts to explain the brain’s inherent flaws.[268]
2013 (February 12) Literature American psychologists Mahzarin Banaji and Anthony Greenwald publish Blindspot: Hidden Biases of Good People, which explains the science that shapes our likes and dislikes and our judgments about people’s character, abilities and potential. The book draws on the implicit-association test, an assessment that measures attitudes and beliefs that people may be unwilling or unable to report.[269]
2013 Belief, decision-making and behavioral Concept development The term “end-of-history illusion” originates in a journal article by psychologists Jordi Quoidbach, Daniel Gilbert, and Timothy Wilson detailing their research on the phenomenon; the phrase echoes the title of Francis Fukuyama's 1992 book The End of History and the Last Man.[270] The end-of-history illusion occurs "when people tend to underestimate how much they will change in the future."[271]
2013 Literature Swiss writer Rolf Dobelli publishes The Art of Thinking Clearly, which describes the most common thinking errors, ranging from cognitive biases to envy and social distortions.[272]
2016 Literature Adrian Nantchev publishes 50 Cognitive Biases for an Unfair Advantage in Entrepreneurship.[273]
2019 Literature Henry Priest publishes Biases and Heuristics: The Complete Collection of Cognitive Biases and Heuristics That Impair Decisions in Banking, Finance and Everything Else.[274]
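
The sketches below expand on a few of the more quantitative entries above. To make the 1982 decoy-effect entry concrete, this minimal sketch checks whether a third option is "asymmetrically dominated": worse than the target option on every attribute, but not worse than the competing option on every attribute. The option names and the price/quality numbers are invented for illustration and are not taken from Huber's experiments.

<syntaxhighlight lang="python">
# Sketch of the "asymmetric dominance" structure behind the decoy effect.
# Options are hypothetical (price in dollars, quality score 0-100):
# lower price is better, higher quality is better.

def dominates(a, b):
    """Return True if option a is at least as good as option b on both
    attributes and strictly better on at least one."""
    at_least_as_good = a["price"] <= b["price"] and a["quality"] >= b["quality"]
    strictly_better = a["price"] < b["price"] or a["quality"] > b["quality"]
    return at_least_as_good and strictly_better

def is_asymmetric_decoy(decoy, target, competitor):
    """A decoy is asymmetrically dominated when the target dominates it
    but the competitor does not."""
    return dominates(target, decoy) and not dominates(competitor, decoy)

target = {"price": 25, "quality": 70}       # option the seller hopes to promote
competitor = {"price": 15, "quality": 50}   # cheaper but lower-quality rival
decoy = {"price": 27, "quality": 65}        # worse than the target on both attributes

print(is_asymmetric_decoy(decoy, target, competitor))  # True
</syntaxhighlight>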
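
The 1998 implicit-association test entry describes a reaction-time method. The sketch below illustrates only the basic scoring intuition: responses in the "incompatible" pairing block are compared with those in the "compatible" block, scaled by the pooled standard deviation. The latencies are invented, and the published D-score procedure includes further steps (error penalties, trial filtering) that are omitted here.

<syntaxhighlight lang="python">
# Simplified sketch of the scoring idea behind the implicit-association test:
# slower responses in the "incompatible" pairing block, relative to the
# "compatible" block and scaled by pooled variability, indicate a stronger
# automatic association. Latencies (milliseconds) are invented.
from statistics import mean, stdev

compatible = [612, 650, 587, 701, 640, 655, 630, 598]     # e.g. flower + pleasant
incompatible = [748, 802, 766, 731, 790, 815, 777, 760]   # e.g. flower + unpleasant

pooled_sd = stdev(compatible + incompatible)
d_like_score = (mean(incompatible) - mean(compatible)) / pooled_sd

print(f"mean compatible latency:   {mean(compatible):.0f} ms")
print(f"mean incompatible latency: {mean(incompatible):.0f} ms")
print(f"D-like score:              {d_like_score:.2f}")  # positive = stronger association
</syntaxhighlight>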
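
For the 2002 prospect-theory entry, the sketch below evaluates the standard prospect-theory value function, under which losses loom larger than equal-sized gains. The parameter values (alpha = beta = 0.88, lambda = 2.25) are commonly cited estimates from Tversky and Kahneman's later work on cumulative prospect theory and are used here purely for illustration.

<syntaxhighlight lang="python">
# Prospect-theory value function: v(x) = x**alpha for gains and
# v(x) = -lam * (-x)**beta for losses. With lam > 1, a loss hurts more
# than an equal-sized gain feels good (loss aversion).

ALPHA = 0.88  # diminishing sensitivity for gains (illustrative estimate)
BETA = 0.88   # diminishing sensitivity for losses (illustrative estimate)
LAM = 2.25    # loss-aversion coefficient (illustrative estimate)

def value(x: float) -> float:
    if x >= 0:
        return x ** ALPHA
    return -LAM * ((-x) ** BETA)

for outcome in (100, -100, 50, -50):
    print(f"outcome {outcome:+d} -> subjective value {value(outcome):+.1f}")

# A $100 loss is weighted about 2.25 times as heavily as a $100 gain:
print(abs(value(-100)) / value(100))
</syntaxhighlight>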
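
The 2010 information-bias entry compares self-reported with software-recorded computer use via Spearman correlations. The sketch below reproduces that kind of comparison on a small invented data set using scipy; the numbers are not taken from Chang et al.'s study.

<syntaxhighlight lang="python">
# Comparing self-reported with software-recorded daily computer use (hours),
# in the spirit of the Chang et al. (2010) design. Data are invented.
from scipy.stats import spearmanr

self_reported = [2.0, 3.5, 5.0, 4.0, 6.5, 1.5, 3.0]   # what the participant reports
software_log = [1.4, 3.0, 5.6, 4.4, 7.2, 2.1, 2.5]    # what the usage monitor records

rho, p_value = spearmanr(self_reported, software_log)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")

# Direction of each day's reporting error:
for said, logged in zip(self_reported, software_log):
    direction = "over" if said > logged else "under"
    print(f"logged {logged:.1f} h, reported {said:.1f} h -> {direction}estimated")
</syntaxhighlight>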
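
The 2011 look-elsewhere entry is, at bottom, about multiple comparisons: scan enough independent places and a "locally significant" fluctuation will eventually appear by chance. The simulation below is a rough sketch with invented settings (100 search bins, a 3-sigma threshold) that estimates how often background noise alone produces at least one such excess.

<syntaxhighlight lang="python">
# Look-elsewhere effect / multiple comparisons: with many search bins,
# pure noise frequently produces at least one "locally significant" excess.
import random

N_BINS = 100           # number of places we "look" (invented for illustration)
THRESHOLD = 3.0        # local significance threshold, in standard deviations
N_EXPERIMENTS = 10_000

random.seed(0)
false_alarms = 0
for _ in range(N_EXPERIMENTS):
    # Background-only data: each bin is an independent standard-normal fluctuation.
    bins = [random.gauss(0.0, 1.0) for _ in range(N_BINS)]
    if max(bins) >= THRESHOLD:
        false_alarms += 1

print(f"P(at least one bin >= {THRESHOLD} sigma | noise only) "
      f"= {false_alarms / N_EXPERIMENTS:.2%}")
# A single bin exceeds 3 sigma only ~0.13% of the time, but across 100
# independent bins the chance of at least one such excess is roughly 12%.
</syntaxhighlight>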

== Visual and numerical data ==

=== Mentions on Google Scholar ===

The following table summarizes per-year mentions on Google Scholar as of May 17, 2021.

Year Overconfidence Bias Self Serving Bias Herd Mentality Loss Aversion Framing Cognitive Bias Narrative Fallacy Anchoring Bias Confirmation Bias Hindsight Bias Representativeness Heuristic
1980 89 3,060 102 1,830 134 390 221 2,150 420 136
1985 144 3,570 137 2,500 311 557 320 2,560 583 226
1990 234 6,410 268 3,810 779 958 584 4,780 1,010 414
1995 428 10,600 502 5,040 1,610 1,560 1,100 7,070 1,660 539
2000 824 18,500 745 8,590 3,010 2,550 1,960 12,400 2,970 832
2002 1,090 20,700 1,020 11,200 3,850 2,390 2,560 12,400 3,430 898
2004 1,700 24,200 1,160 14,000 5,120 3,300 3,370 16,200 4,200 1,130
2006 2,050 27,300 1,220 16,900 6,470 3,570 4,090 20,500 4,660 1,500
2008 2,650 32,300 1,520 20,700 8,220 4,690 5,040 25,600 5,500 1,580
2010 3,350 36,700 1,810 25,500 10,700 5,320 6,220 31,300 6,280 2,270
2012 4,500 40,100 2,140 29,200 13,900 6,180 7,910 38,500 7,310 2,820
2014 5,300 42,400 2,260 31,800 17,800 8,890 9,230 43,800 8,070 3,440
2016 6,020 42,600 2,390 31,600 19,900 9,160 10,600 45,100 8,790 3,700
2017 6,760 41,600 2,210 31,000 21,900 9,570 11,300 40,300 9,010 4,090
2018 7,500 39,700 2,360 31,200 23,200 10,300 12,500 42,200 9,650 4,300
2019 8,290 33,800 2,330 29,700 24,000 10,200 13,200 35,400 7,990 4,490
2020 9,110 30,100 2,670 28,000 25,500 10,200 15,200 32,500 9,300 4,590
Cognitive biases.png

=== Google Trends ===

The chart below shows Google Trends data for cognitive biases (topic) from January 2004 to January 2021, when the screenshot was taken.[275]

Cognitive biases gtrends.jpeg

=== Google Ngram Viewer ===

The chart shows Google Ngram Viewer data for "cognitive bias", from 1972 (when the concept was introduced) to 2019.[276]

Cognitive bias ngram.png

=== Wikipedia Views ===

The chart below shows pageviews of the English Wikipedia article cognitive bias, from July 2015 to December 2020.[277]

Cognitive biases wv.jpeg

== Meta information on the timeline ==

=== How the timeline was built ===

The initial version of the timeline was written by User:Sebastian.

Funding information for this timeline is available.

=== Feedback and comments ===

Feedback for the timeline can be provided at the following places:

* FIXME

=== What the timeline is still missing ===

* Issa: This is probably going to take a whole bunch of work, but eventually it would be nice if the rows containing specific studies that were conducted could mention whether the study has been replicated or not.

=== Timeline update strategy ===

== See also ==

== External links ==

== References ==

  1. "Every Single Cognitive Bias in One Infographic". visualcapitalist.com. Retrieved 5 December 2020. 
  2. Sextus Empiricus, "Outlines of Pyrrhonism", Book 1, Chapter 13, Section 32
  3. "Just-World Hypothesis". alleydog.com. Retrieved 7 May 2020. 
  4. Carlisle, Rodney (2004). Scientific American Inventions and Discoveries, John Wiley & Songs, Inc., New Jersey. p. 393.
  5. "What Are Clinical Trials and Studies?". National Institute on Aging. Retrieved 28 January 2021. 
  6. Chambers's Cyclopædia, Supplement, 1753 
  7. Oxford English Dictionary, 1st ed. "anthropomorphism, n." Oxford University Press (Oxford), 1885.
  8. "Anthropomorphism". britannica.com. Retrieved 7 May 2020. 
  9. Miller, Laura (2015-06-14). "Culture is dead — again". Salon. Retrieved 17 April 2018. 
  10. J.G.A. Pocock, "Between Machiavelli and Hume: Gibbon as Civic Humanist and Philosophical Historian," Daedalus 105:3 (1976), 153–169; and in Further reading: Pocock, EEG, 303–304; FDF, 304–306.
  11. "Why we feel the past is better compare to what the future holds". thedecisionlab.com. Retrieved 7 May 2020. 
  12. Barron, Greg; Leider, Stephen (13 October 2009). "The role of experience in the Gambler's Fallacy" (PDF). Journal of Behavioral Decision Making. 
  13. "The Gambler's Fallacy - Explained". thecalculatorsite.com. Retrieved 7 May 2020. 
  14. Mortell, Manfred; Balkhy, Hanan H.; Tannous, Elias B.; Jong, Mei Thiee (July 2013). "Physician 'defiance' towards hand hygiene compliance: Is there a theory–practice–ethics gap?". Journal of the Saudi Heart Association. 25 (3): 203–208. PMC 3809478Freely accessible. PMID 24174860. doi:10.1016/j.jsha.2013.04.003. 
  15. "Semmelweis Reflex (Semmelweis Effect)". alleydog.com. Retrieved 7 May 2020. 
  16. "Bandwagon Effect". Retrieved 2007-03-09. 
  17. "The Bandwagon Effect". psychologytoday.com. Retrieved 7 May 2020. 
  18. 18.0 18.1 "Stereotypes Defined". stereotypeliberia.wordpress.com. Retrieved 10 April 2020. 
  19. Oxford Languages
  20. Fechner, Gustav Theodor (1966) [First published 1860]. Howes, D H; Boring, E G, eds. Elements of psychophysics [Elemente der Psychophysik]. Volume 1. Translated by Adler, H E. United States of America: Holt, Rinehart and Winston. 
  21. "Weber's law". britannica.com. Retrieved 7 May 2020. 
  22. [1] Sibbald, M.D. "Report on the Progress of Psychological Medicine; German Psychological Literature", The Journal of Mental Science, Volume 13. 1867. p. 238
  23. "pareidolia". merriam-webster.com. Retrieved 7 May 2020. 
  24. Brian Righi. (2008). Chapter 4: Talking Boards and Ghostly Goo. In Ghosts, Apparitions and Poltergeists. Llewellyn Publications."An early example of this occurred in 1874 with he medium William Stanton Moses, who communicated with the spirits of two brothers who had recently died in India. Upon investigation, it was discovered that one week prior to the séance, their obituary had appeared in the newspaper. This was of some importance because Moses's communications with the two spirits contained nothing that wasn't already printed in the newspaper. When the spirits were pressed for further information, they were unable to provide any. Researchers concluded that Moses had seen the obituary, forgotten it, and then resurfaced the memory during the séance."
  25. Robert Todd Carroll. (2014). "Cryptomnesia". The Skeptic's Dictionary. Retrieved 2014-07-12.
  26. "cryptomnesia". dictionary.apa.org. Retrieved 7 May 2020. 
  27. "Mere Exposure Effect" (PDF). wiwi.europa-uni.de. Retrieved 10 April 2020. 
  28. "6 Conversion Principles You Can Learn From The Mere-Exposure Effect". marketingland.com. Retrieved 7 May 2020. 
  29. 29.0 29.1 "List of cognitive biases". uxinlux.github.io. Retrieved 25 July 2021. 
  30. Anonymous (E. Robert Kelly, 1882) The Alternative: A Study in Psychology. London: Macmillan and Co. p. 168.
  31. 31.0 31.1 Andersen H, Grush R (2009). "A brief history of time-consciousness: historical precursors to James and Husserl" (PDF). Journal of the History of Philosophy. 47 (2): 277–307. doi:10.1353/hph.0.0118. 
  32. James W (1893). The principles of psychology. New York: H. Holt and Company. p. 609. 
  33. Vlach, Haley A.; Sandhofer, Catherine M. "Distributing Learning Over Time: The Spacing Effect in Children's Acquisition and Generalization of Science Concepts". PMC 3399982Freely accessible. PMID 22616822. doi:10.1111/j.1467-8624.2012.01781.x. 
  34. James, W. (1890). Principles of Psychology. Retrieved from http://psychclassics.yorku.ca/James/Principles/
  35. Brown, Roger; McNeill, David. "The "tip of the tongue" phenomenon". doi:10.1016/S0022-5371(66)80040-3. 
  36. Bauer, P (2004). "Oh where, oh where have those early memories gone? A developmental perspective on childhood amnesia". Psychological Science Agenda. 18 (12). 
  37. "Childhood Amnesia". sciencedirect.com. Retrieved 7 May 2020. 
  38. "bandwagon effect". merriam-webster.com. Retrieved 7 April 2020. 
  39. "Bandwagon Effect - Biases & Heuristics". The Decision Lab. Retrieved 26 January 2021. 
  40. Sumner, William Graham. (1906). Folkways: A Study of the Social Importance of Usages, Manners, Customs, Mores, and Morals. Boston, MA: Ginn.
  41. Everett, Jim A. C.; Faber, Nadira S.; Crockett, Molly. "Preferences and beliefs in ingroup favoritism". PMC 4327620Freely accessible. PMID 25762906. doi:10.3389/fnbeh.2015.00015. 
  42. Abbott, Edwina (1909). "On the analysis of the factors of recall in the learning process". Psychological Monographs: General and Applied. 11 (1): 159–177. doi:10.1037/h0093018 – via Ovid. 
  43. Larsen, Douglas P.; Butler, Andrew C. (2013). Walsh, K., ed. Test-enhanced learning. In Oxford Textbook of Medical Education. pp. 443–452. 
  44. Goldstein, E. Bruce. Cognitive Psychology: Connecting Mind, Research and Everyday Experience. Cengage Learning. ISBN 978-1-133-00912-2. 
  45. "Why we gamble like monkeys". BBC.com. 2015-01-02. 
  46. "Gambler's Fallacy". investopedia.com. Retrieved 7 May 2020. 
  47. Feingold, CA (1914). "The influence of environment on identification of persons and things". Journal of Criminal Law and Police Science. 5 (1): 39–51. JSTOR 1133283. doi:10.2307/1133283. 
  48. Laub, Cindy E.; Meissner, Christian A.; Susa, Kyle J. "The Cross-Race Effect: Resistant to Instructions". doi:10.1155/2013/745836. 
  49. The Advanced Dictionary of Marketing, Scott G. Dacko, 2008: Marketing. Oxford: Oxford University Press. 2008-06-18. p. 248. ISBN 9780199286003. 
  50. 50.0 50.1 Thorndike 1920
  51. Sigall, Harold; Ostrove, Nancy (1975-03-01). "Beautiful but Dangerous: Effects of Offender Attractiveness and Nature of the Crime on Juridic Judgment". Journal of Personality and Social Psychology. 31 (3): 410–414. doi:10.1037/h0076472. 
  52. "Halo effect". britannica.com. Retrieved 7 May 2020. 
  53. "Definition of STEREOTYPE". www.merriam-webster.com. Retrieved 28 January 2021. 
  54. "Bluma Wulfovna Zeigarnik". The Science of Psychotherapy. 31 March 2014. Retrieved 16 March 2021. 
  55. Zeigarnik 1927: "Das Behalten erledigter und unerledigter Handlungen". Psychologische Forschung 9, 1-85.
  56. Zeigarnik 1927: "Das Behalten erledigter und unerledigter Handlungen". Psychologische Forschung 9, 1-85.
  57. "Zeigarnik Effect". goodtherapy.org. Retrieved 7 May 2020. 
  58. Fisher, Irving (1928), The Money Illusion, New York: Adelphi Company 
  59. Liberto, Daniel. "Money Illusion Definition". Investopedia. Retrieved 26 January 2021. 
  60. "The Specious Present: Andrew Beck, David Claerbout, Colin McCahon, Keith Tyson - Announcements - Art & Education". www.artandeducation.net. Retrieved 27 January 2021. 
  61. Fleming, G. W. T. H. (January 1933). "The Learning and Retention of Pleasant and Unpleasant Activities. (Arch. of Psychol., No. 134, 1932.) Cason, H.". Journal of Mental Science. 79 (324): 187–188. ISSN 0368-315X. doi:10.1192/bjp.79.324.187-c. 
  62. Skowronski, John J.; Walker, W. Richard; Henderson, Dawn X.; Bond, Gary D. "Chapter Three - The Fading Affect Bias: Its History, Its Implications, and Its Future". doi:10.1016/B978-0-12-800052-6.00003-2. 
  63. von Restorff, Hedwig (1933). "Über die Wirkung von Bereichsbildungen im Spurenfeld" [The effects of field formation in the trace field]. Psychologische Forschung [Psychological Research] (in Deutsch). 18 (1): 299–342. doi:10.1007/BF02409636. 
  64. "The Von Restorff effect". lawsofux.com. Retrieved 7 May 2020. 
  65. "The Einstellung Effect - Thinking Differently". Exploring your mind. 27 January 2020. Retrieved 18 April 2021. 
  66. "Einstellung Effect definition | Psychology Glossary | alleydog.com". www.alleydog.com. Retrieved 17 May 2021. 
  67. Duncker, K. (1945). "On problem solving". Psychological Monographs, 58:5 (Whole No. 270).
  68. "Functional fixedness". britannica.com. Retrieved 7 May 2020. 
  69. Batsidis, Apostolos; Tzavelas, George; Alexopoulos, Panagiotis. "Berkson's paradox and weighted distributions: An application to Alzheimer's disease". 
  70. "Berkson's Paradox (Berkson's Bias)". alleydog.com. Retrieved 14 August 2020. 
  71. Johnson, J. (2011). The arithmetic of compassion: rethinking the politics of photography. British Journal of Political Science, 41(3), 621-643. doi: 10.1017/S0007123410000487.
  72. "Joseph Stalin - Wikiquote". en.wikiquote.org. Retrieved 17 May 2021. 
  73. "Compassion fade". econowmics.com. Retrieved 15 January 2021. 
  74. Whyte, W. H., Jr. (March 1952). "Groupthink". Fortune. pp. 114–117, 142, 146. 
  75. Safire, William (8 August 2004). "THE WAY WE LIVE NOW: 8-8-04: ON LANGUAGE; Groupthink (Published 2004)". The New York Times. Retrieved 14 March 2021. 
  76. "The Psychology Behind Why We Strive for Consensus". Verywell Mind. 
  77. Festinger L (1954). "A theory of social comparison processes". Human Relations. 7 (2): 117–140. doi:10.1177/001872675400700202. 
  78. "Social Comparison Theory". psychologytoday.com. Retrieved 7 May 2020. 
  79. Meehl, Paul E. (1956). "Wanted – A Good Cookbook". American Psychologist. 11 (6): 263–272. doi:10.1037/h0044164. 
  80. Dutton, D. L. (1988). "The cold reading technique". Experientia. 44 (4): 326–332. PMID 3360083. doi:10.1007/BF01961271. 
  81. "Barnum Effect". britannica.com. Retrieved 7 May 2020. 
  82. Parkinson, C. Northcote (1958). Parkinson's Law, or the Pursuit of Progress. John Murray. ISBN 0140091076. 
  83. "How to Handle Bikeshedding: Parkinson's Law of Triviality". projectbliss.net. Retrieved 7 May 2020. 
  84. "The Curious Case of Confirmation Bias". psychologytoday.com. Retrieved 7 April 2020. 
  85. Acks, Alex. The Bubble of Confirmation Bias. 
  86. Myers, David G. Psychology. 
  87. "Confirmation Bias". simplypsychology.org. Retrieved 14 August 2020. 
  88. "The Curious Case of Confirmation Bias". psychologytoday.com. Retrieved 14 August 2020. 
  89. "Cognitive Bias in Decision Making". associationanalytics.com. Retrieved 7 May 2020. 
  90. Ellis RM (2015). Middle Way Philosophy: Omnibus Edition. Lulu Press. ISBN 9781326351892. 
  91. "Authority Bias". alleydog.com. Retrieved 14 August 2020. 
  92. Borcherding, Katrin; Laričev, Oleg Ivanovič; Messick, David M. (1990). Contemporary Issues in Decision Making. North-Holland. p. 50. ISBN 978-0-444-88618-7. 
  93. "Why we prefer options that are known to us". thedecisionlab.com. Retrieved 14 August 2020. 
  94. 94.0 94.1 Rubin, David C.; Baddeley, Alan D. (1989). "Telescoping is not time compression: A model". Memory & Cognition. 17 (6): 653–661. PMID 2811662. doi:10.3758/BF03202626. 
  95. "Telescoping effect - Biases & Heuristics". The Decision Lab. Retrieved 26 January 2021. 
  96. Abraham Kaplan (1964). The Conduct of Inquiry: Methodology for Behavioral Science. San Francisco: Chandler Publishing Co. p. 28. ISBN 9781412836296. 
  97. "Law of the instrument - Biases & Heuristics". The Decision Lab. Retrieved 27 January 2021. 
  98. Walster, Elaine (1966). "Assignment of responsibility for an accident.". Journal of Personality and Social Psychology. 3 (1): 73–79. doi:10.1037/h0022733. 
  99. "Defensive Attribution Hypothesis definition | Psychology Glossary | alleydog.com". www.alleydog.com. Retrieved 29 January 2021. 
  100. Adams, John (1985). Risk and Freedom: Record of Road Safety Regulation. Brefi Press. ISBN 9780948537059. 
  101. Flock, Elizabeth (2012-02-17). "Dagen H: The day Sweden switched sides of the road". Washington Post. On the day of the change, only 150 minor accidents were reported. Traffic accidents over the next few months went down. ... By 1969, however, accidents were back at normal levels 
  102. "On September 4 there were 125 reported traffic accidents as opposed to 130-196 from the previous Mondays. No traffic fatalities were linked to the switch. In fact, fatalities dropped for two years, possibly because drivers were more vigilant after the switch." Sweden finally began driving on the right side of the road in 1967 The Examiner Sept 2, 2009
  103. Mok, D; Gore, G; Hagel, B; Mok, E; Magdalinos, H; Pless, B. "Risk compensation in children's activities: A pilot study". PMC 2721187Freely accessible. PMID 19657519. doi:10.1093/pch/9.5.327. 
  104. Chapman, L (1967). "Illusory correlation in observational report". Journal of Verbal Learning and Verbal Behavior. 6 (1): 151–155. doi:10.1016/S0022-5371(67)80066-5. 
  105. Chapman, L.J (1967). "Illusory correlation in observational report". Journal of Verbal Learning. 6: 151–155. doi:10.1016/s0022-5371(67)80066-5. 
  106. "Illusory Correlation". psychology.iresearchnet.com. Retrieved 17 July 2020. 
  107. Jones, E. E.; Harris, V. A. (1967). "The attribution of attitudes". Journal of Experimental Social Psychology. 3 (1): 1–24. doi:10.1016/0022-1031(67)90034-0. 
  108. Ross, L. (1977). "The intuitive psychologist and his shortcomings: Distortions in the attribution process". In Berkowitz, L. Advances in experimental social psychology. 10. New York: Academic Press. pp. 173–220. ISBN 978-0-12-015210-0. 
  109. "Fundamental Attribution Error". simplypsychology.org. Retrieved 7 May 2020. 
  110. Edwards, Ward. "Conservatism in Human Information Processing (excerpted)". In Daniel Kahneman, Paul Slovic and Amos Tversky. (1982). Judgment under uncertainty: Heuristics and biases. New York: Cambridge University Press. Original work published 1968.
  111. "Conservatism Bias". dwassetmgmt.com. Retrieved 8 May 2020. 
  112. "Statistics How To". statisticshowto.com. Retrieved 7 April 2020. 
  113. "Pygmalion Effect". alleydog.com. Retrieved 7 May 2020. 
  114. "To Become Super-Likable, Practice "The Ben Franklin Effect"". medium.com. Retrieved 13 March 2020. 
  115. "Ben Franklin Effect". alleydog.com. Retrieved 7 May 2020. 
  116. "The suffix effect: How many positions are involved?" (PDF). link.springer.com. Retrieved 5 May 2020. 
  117. "Two-component theory of the suffix effect: Contrary evidence". link.springer.com. Retrieved 16 July 2020. 
  118. Malle, BF. "The actor-observer asymmetry in attribution: a (surprising) meta-analysis.". PMID 17073526. doi:10.1037/0033-2909.132.6.895. 
  119. "The actor-observer asymmetry in attribution: A (surprising) meta-analysis.". psycnet.apa.org. Retrieved 7 May 2020. 
  120. "Cognitive Bias: How Your Mind Plays Tricks on You and How to Overcome That at Work". zapier.com. Retrieved 15 January 2021. 
  121. "Cognitive Bias". sciencedirect.com. Retrieved 16 January 2021. 
  122. Fischhoff, B (2007). "An early history of hindsight research". Social Cognition. 25: 10–13. doi:10.1521/soco.2007.25.1.10. 
  123. "Hindsight bias". Encyclopedia Britannica. Retrieved 27 January 2021. 
  124. "Why are we overconfident in our predictions?". thedecisionlab.com. Retrieved 10 April 2020. 
  125. "Illusion Of Validity". alleydog.com. Retrieved 7 May 2020. 
  126. Brenner, Malcolm (1973). "The next-in-line effect" (PDF). Journal of Verbal Learning and Verbal Behavior. 12 (3): 320–323. doi:10.1016/s0022-5371(73)80076-3. 
  127. "Memory Flashcards". Quizlet. Retrieved 27 January 2021. 
  128. Loftus, Elizabeth F.; Palmer, John C. (1974). "Reconstruction of automobile destruction: An example of the interaction between language and memory". Journal of Verbal Learning and Verbal Behavior. 13 (5): 585–589. doi:10.1016/s0022-5371(74)80011-3. 
  129. "False memory". scholarpedia.org. Retrieved 14 August 2020. 
  130. Ralph, Kelcie; Delbosc, Alexa. "I'm multimodal, aren't you? How ego-centric anchoring biases experts' perceptions of travel patterns". doi:10.1016/j.tra.2017.04.027. 
  131. "Anchoring Bias - Definition, Overview and Examples". Corporate Finance Institute. Retrieved 27 January 2021. 
  132. Larson, James; Rutger U; Douglass Coll (1977). "Evidence for a self-serving bias in the attribution of causality". Journal of Personality. 45 (3): 430–441. doi:10.1111/j.1467-6494.1977.tb00162.x. 
  133. "What Is a Self-Serving Bias and What Are Some Examples of It?". healthline.com. Retrieved 7 May 2020. 
  134. Staw, Barry M. (1976). "Knee-deep in the big muddy: a study of escalating commitment to a chosen course of action". Organizational Behavior and Human Performance. 16 (1): 27–44. doi:10.1016/0030-5073(76)90005-2. 
  135. "Escalation of Commitment: Definition, Causes & Examples". bizfluent.com. Retrieved 7 May 2020. 
  136. Duncan, B. L. (1976). "Differential social perception and attribution of intergroup violence: Testing the lower limits of stereotyping of Blacks". Journal of Personality and Social Psychology. 34 (4): 75–93. doi:10.1037/0022-3514.34.4.590. 
  137. "APA Dictionary of Psychology". dictionary.apa.org. Retrieved 7 May 2020. 
  138. Brown, R., Kulik J. (1977). "Flashbulb memories". Cognition. 5: 73–99. doi:10.1016/0010-0277(77)90018-X. 
  139. "Misattribution Effect". sites.google.com. Retrieved 7 May 2020. 
  140. Ross, Lee; Greene, David; House, Pamela (1977). "The "false consensus effect": An egocentric bias in social perception and attribution processes". Journal of Experimental Social Psychology. 13 (3): 279–301. doi:10.1016/0022-1031(77)90049-x. 
  141. Alicke, Mark; Largo, Edward. "The Role of Self in the False Consensus Effect". doi:10.1006/jesp.1995.1002. 
  142. "False Consensus Effect". psychology.iresearchnet.com. Retrieved 14 January 2021. 
  143. "APA Dictionary of Psychology". dictionary.apa.org. Retrieved 29 January 2021. 
  144. Hasher, Lynn; Goldstein, David; Toppino, Thomas (1977). "Frequency and the conference of referential validity" (PDF). Journal of Verbal Learning and Verbal Behavior. 16 (1): 107–112. doi:10.1016/S0022-5371(77)80012-1. 
  145. Newman, Eryn J.; Sanson, Mevagh; Miller, Emily K.; Quigley-Mcbride, Adele; Foster, Jeffrey L.; Bernstein, Daniel M.; Garry, Maryanne (September 6, 2014). "People with Easier to Pronounce Names Promote Truthiness of Claims". PLOS ONE. 9 (2): e88671. PMC 3935838Freely accessible. PMID 24586368. doi:10.1371/journal.pone.0088671. 
  146. "Illusory Truth, Lies, and Political Propaganda: Part 1". psychologytoday.com. Retrieved 7 May 2020. 
  147. "Self-Reference Effect". psychology.iresearchnet.com. Retrieved 12 January 2021. 
  148. Bentley, Sarah V.; Greenaway, Katharine H.; Haslam, S. Alexander. "An online paradigm for exploring the self-reference effect". doi:10.1371/journal.pone.0176611. 
  149. "Self-Reference Effect - IResearchNet". Psychology. 12 January 2016. Retrieved 10 May 2021. 
  150. Zaragoza, Maria S.; Belli, Robert F.; Payment, Kristie E. "Misinformation Effects and the Suggestibility of Eyewitness Memory". 
  151. "What Is Misinformation Effect?". growthramp.io. Retrieved 7 May 2020. 
  152. Rudy Hiller, Fernando. "How to (dis)solve Nagel's paradox about moral luck and responsibility". doi:10.1590/0100-6045.2016.V39N1.FRH. 
  153. "Moral Luck". philpapers.org. Retrieved 7 May 2020. 
  154. Pettigrew, T. F. (1979). "The ultimate attribution error: Extending Allport's cognitive analysis of prejudice". Personality and Social Psychology Bulletin. 5 (4): 461–476. doi:10.1177/014616727900500407. 
  155. Fraser Pettigrew, Thomas. "The Ultimate Attribution Error: Extending Allport's Cognitive Analysis of Prejudice". doi:10.1177/014616727900500407. 
  156. "Loss aversion". behavioraleconomics.com. Retrieved 14 August 2020. 
  157. "Why is the pain of losing felt twice as powerfully compared to equivalent gains?". thedecisionlab.com. Retrieved 14 August 2020. 
  158. Pezzo, Mark V.; Litman, Jordan A.; Pezzo, Stephanie P. (2006). "On the distinction between yuppies and hippies: Individual differences in prediction biases for planning future tasks". Personality and Individual Differences. 41 (7): 1359–1371. ISSN 0191-8869. doi:10.1016/j.paid.2006.03.029. 
  159. Kahneman, Daniel; Tversky, Amos (1977). "Intuitive prediction: Biases and corrective procedures" (PDF).  Decision Research Technical Report PTR-1042-77-6. In Kahneman, Daniel; Tversky, Amos (1982). "Intuitive prediction: Biases and corrective procedures". In Kahneman, Daniel; Slovic, Paul; Tversky, Amos. Judgment Under Uncertainty: Heuristics and Biases. Science. 185. pp. 414–421. ISBN 978-0511809477. PMID 17835457. doi:10.1017/CBO9780511809477.031. 
  160. Buehler, Roger; Griffin, Dale; Peetz, Johanna. "Chapter One - The Planning Fallacy: Cognitive, Motivational, and Social Origins". doi:10.1016/S0065-2601(10)43001-4. 
  161. Goleman, Daniel (1984-06-12). "A bias puts self at center of everything". The New York Times. Retrieved 2016-12-09. 
  162. "The Egocentric Bias: Why It's Hard to See Things from a Different Perspective". effectiviology.com. Retrieved 16 July 2020. 
  163. Hamill, Ruth; Wilson, Timothy D.; Nisbett, Richard E. (1980). "Insensitivity to sample bias: Generalizing from atypical cases" (PDF). Journal of Personality and Social Psychology. 39 (4): 578–589. doi:10.1037/0022-3514.39.4.578. 
  164. "group attribution error". dictionary.apa.org. Retrieved 14 August 2020. 
  165. Frazier, Kendrick (1986). Science Confronts the Paranormal. Prometheus Books. p. 101. 
  166. "Subjective Validation". alleydog.com. Retrieved 14 August 2020. 
  167. "Understanding the Optimism Bias". verywellmind.com. Retrieved 15 January 2021. 
  168. "Optimism Bias - Biases & Heuristics". The Decision Lab. Retrieved 28 January 2021. 
  169. 169.0 169.1 169.2 "Framing Effect - an overview | ScienceDirect Topics". www.sciencedirect.com. Retrieved 29 January 2021. 
  170. "Why do our decisions depend on how options are presented to us?". thedecisionlab.com. Retrieved 16 January 2021. 
  171. Tversky, A; Kahneman, D (30 January 1981). "The framing of decisions and the psychology of choice". Science. 211 (4481): 453–458. doi:10.1126/SCIENCE.7455683. 
  172. "Pseudocertainty effect". wiwi.europa-uni.de. Retrieved 14 August 2020. 
  173. Kammer, D. (1982). "Differences in trait ascriptions to self and friend: Unconfounding intensity from variability". Psychological Reports. 51 (1): 99–102. doi:10.2466/pr0.1982.51.1.99. 
  174. "Trait Ascription Bias". alleydog.com. Retrieved 14 August 2020. 
  175. "Decoy Effect definition". tactics.convertize.com. Retrieved 14 January 2021. 
  176. Mortimer, Gary. "The decoy effect: how you are influenced to choose without really knowing it". The Conversation. Retrieved 29 January 2021. 
  177. "Third-Person Effect". Encyclopedia of Survey Research Methods. 2008. doi:10.4135/9781412963947.n582. 
  178. Conners, Joan L. "Understanding the Third-Person Effect" (PDF). 
  179. "Third-Person Effect". alleydog.com. Retrieved 7 May 2020. 
  180. 180.0 180.1 Hakim, Catherine. Models of the Family in Modern Societies: Ideals and Realities: Ideals and Realities. 
  181. "Courtesy Bias". alleydog.com. Retrieved 14 August 2020. 
  182. "Disposition Effect". Behavioural Finance. Retrieved 11 January 2017. 
  183. "Disposition effect". behavioraleconomics.com. Retrieved 16 July 2020. 
  184. Miller, Joshua; Sanjurjo, Adam (The Conversation US). "Momentum Isn’t Magic—Vindicating the Hot Hand with the Mathematics of Streaks". Scientific American. Retrieved 16 June 2021. 
  185. "Hot Hand Effect". psychology.iresearchnet.com. Retrieved 16 July 2020. 
  186. Geraci, Lisa; McDaniel, Mark A.; Miller, Tyler M.; Hughes, Matthew L. (2013-11-01). "The bizarreness effect: evidence for the critical influence of retrieval processes". Memory & Cognition. pp. 1228–1237. doi:10.3758/s13421-013-0335-4. 
  187. Iaccino, J. F.; Sowa, S. J. (February 1989). "Bizarre imagery in paired-associate learning: an effective mnemonic aid with mixed context, delayed testing, and self-paced conditions". Percept mot Skills. 68 (1): 307–16. PMID 2928063. doi:10.2466/pms.1989.68.1.307. 
  188. "The imagery bizarreness effect as a function of sentence complexity and presentation time" (PDF). link.springer.com. Retrieved 18 June 2021. 
  189. "Bizarreness effect". britannica.com. Retrieved 16 July 2020. 
  190. Lee Ross, Constance A. Stillinger, "Psychological barriers to conflict resolution", Stanford Center on Conflict and Negotiation, Stanford University, 1988, p. 4
  191. "Why we often tend to devalue proposals made by people who we consider to be adversaries". thedecisionlab.com. Retrieved 22 September 2020. 
  192. Samuelson, W.; Zeckhauser, R. (1988). "Status quo bias in decision making". Journal of Risk and Uncertainty. 1: 7–59. doi:10.1007/bf00055564. 
  193. "Status Quo Bias: What It Means and How It Affects Your Behavior". thoughtco.com. Retrieved 22 September 2020. 
  194. "The Curse of Knowledge: What It Is and How to Account for It". effectiviology.com. Retrieved 6 May 2020. 
  195. Atladóttir, Kristín. "The Endowment Effect and other biases in creative goods transactions" (PDF). ISSN 1670-8288. 
  196. Bruno, Michael A. "256 Shades of gray: uncertainty and diagnostic error in radiology". doi:10.1515/dx-2017-0006. 
  197. Ashman, C. J.; Yu, J. S.; Wolfman, D. (August 2000). "Satisfaction of search in osteoradiology". AJR. American journal of roentgenology. 175 (2): 541–544. ISSN 0361-803X. doi:10.2214/ajr.175.2.1750541. Retrieved 27 January 2021. 
  198. "Cognitive biases". catalog.library.vanderbilt.edu. Retrieved 25 July 2021. 
  199. "Self-Enhancement and Superiority Biases in Social Comparison". researchgate.net. Retrieved 14 August 2020. 
  200. "Illusory Superiority". alleydog.com. Retrieved 7 May 2020. 
  201. "The Courtesy Bias". smallbusinessforum.co. Retrieved 14 August 2020. 
  202. ""Women Are Wonderful" Effect". scribd.com. Retrieved 10 April 2020. 
  203. ""women are wonderful" effect". crazyfacts.com. Retrieved 18 July 2020. 
  204. Tversky, Amos; Koehler, Derek J. (October 1994). "Support theory: A nonextensional representation of subjective probability.". Psychological Review. 101 (4): 547–567. doi:10.1037/0033-295X.101.4.547. 
  205. "Today's term from psychology is Subadditivity Effect.". steemit.com. Retrieved 7 May 2020. 
  206. "PROJECT IMPLICIT LECTURES AND WORKSHOPS". projectimplicit.net. Retrieved 12 March 2020. 
  207. "Implicit Bias". plato.stanford.edu. Retrieved 8 May 2020. 
  208. Kahneman, D. & Tversky, A. (1996). "On the reality of cognitive illusions" (PDF). Psychological Review. 103 (3): 582–591. PMID 8759048. doi:10.1037/0033-295X.103.3.582. 
  209. S.X. Zhang; J. Cueto (2015). "The Study of Bias in Entrepreneurship". Entrepreneurship Theory and Practice. 41 (3): 419–454. doi:10.1111/etap.12212. 
  210. Medway, Dominic; Foos, Adrienne; Goatman, Anna. "Impact bias in student evaluations of higher education". Studies in Higher Education. doi:10.1080/03075079.2015.1071345. Retrieved 7 May 2020. 
  211. Medway, Dominic; Foos, Adrienne; Goatman, Anna. "Impact bias in student evaluations of higher education". Studies in Higher Education. doi:10.1080/03075079.2015.1071345. Retrieved 7 May 2020. 
  212. Greenwald, Anthony G.; McGhee, Debbie E.; Schwartz, Jordan L.K. (1998), "Measuring Individual Differences in Implicit Cognition: The Implicit Association Test", Journal of Personality and Social Psychology, 74 (6): 1464–1480, PMID 9654756, doi:10.1037/0022-3514.74.6.1464 
  213. "The Implicit Association Test (IAT) - iMotions". Imotions Publish. 15 December 2020. Retrieved 17 May 2021. 
  214. "Implicit Association Test". www.projectimplicit.net. Retrieved 17 May 2021. 
  215. Hsee, Christopher K. (1998). "Less Is Better: When Low-value Options Are Valued More Highly than High-value Options" (PDF). Journal of Behavioral Decision Making. 11 (2): 107–121. doi:10.1002/(SICI)1099-0771(199806)11:2<107::AID-BDM292>3.0.CO;2-Y. 
  216. "Why we prefer the smaller or the lesser alternative". thedecisionlab.com. Retrieved 7 May 2020. 
  217. Kruger, Justin; Dunning, David (1999). "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments". Journal of Personality and Social Psychology. 77 (6): 1121–1134. PMID 10626367. doi:10.1037/0022-3514.77.6.1121. 
  218. "Dunning-Kruger Effect". psychologytoday.com. Retrieved 14 August 2020. 
  219. Gilovich, T.; Medvec, V. H.; Savitsky, K. (2000). "The spotlight effect in social judgment: An egocentric bias in estimates of the salience of one's own actions and appearance" (PDF). Journal of Personality and Social Psychology. 78 (2): 211–222. PMID 10707330. doi:10.1037//0022-3514.78.2.211. 
  220. "The Spotlight Effect". psychologytoday.com. Retrieved 14 August 2020. 
  221. Kruger, Justin; Gilovich, Thomas (1999). "'Naive cynicism' in everyday theories of responsibility assessment: On biased assumptions of bias.". Journal of Personality and Social Psychology. 76 (5): 743–753. doi:10.1037/0022-3514.76.5.743. 
  222. "Naive Cynicism". psychology.iresearchnet.com. Retrieved 16 July 2020. 
  223. Kahneman, Daniel; Frederick, Shane (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Thomas Gilovich; Dale Griffin; Daniel Kahneman. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 49–81. ISBN 978-0-521-79679-8. 
  224. "Attribute substitution- a quick guide". biasandbelief.wordpress.com. Retrieved 7 May 2020. 
  225. Gearon, Michael (12 February 2019). "Cognitive Biases — Framing effect". Medium. Retrieved 6 March 2021. 
  226. "Definition". tactics.convertize.com. Retrieved 6 March 2021. 
  227. 227.0 227.1 Pronin, Emily; Lin, Daniel Y.; Ross, Lee. "The Bias Blind Spot: Perceptions of Bias in Self Versus Others". doi:10.1177/0146167202286008. 
  228. Garcia, S.M.; Weaver, K.; Darley, J.M.; Moskowitz, G.B. (2002). "Crowded minds: the implicit bystander effect". Journal of Personality and Social Psychology. 83 (4): 843–853. PMID 12374439. doi:10.1037/0022-3514.83.4.843. 
  229. "Bystander Effect". psychologytoday.com. Retrieved 7 May 2020. 
  230. "Kahneman receives Nobel Prize at ceremony". Princeton University. Retrieved 16 June 2021. 
  231. "Psychologist wins Nobel Prize". www.apa.org. Retrieved 16 June 2021. 
  232. Chen, James. "Prospect Theory". Investopedia. Retrieved 16 June 2021. 
  233. Frederick, Shane; Loewenstein, George; O'Donoghue, Ted (2011). "Time Discounting and Time Preference: A Critical Review". In Camerer, Colin F.; Loewenstein, George; Rabin, Matthew. Advances in Behavioral Economics. Princeton University Press. pp. 187–188. ISBN 978-1400829118. 
  234. "Projection bias". behavioraleconomics.com. Retrieved 7 May 2020. 
  235. Lovallo, Dan; Kahneman, Daniel (July 2003). "Delusions of Success: How Optimism Undermines Executives' Decisions". Harvard Business Review. 81 (7): 56–63. PMID 12858711. 
  236. Buehler, Roger; Griffin, Dale; Peetz, Johanna (2010). "The Planning Fallacy". Advances in Experimental Social Psychology. 43: 1–62. doi:10.1016/S0065-2601(10)43001-4. 
  237. Kim, S.; Goldstein, D.; Hasher, L.; Zacks, R. T. (1 July 2005). "Framing Effects in Younger and Older Adults". The Journals of Gerontology Series B: Psychological Sciences and Social Sciences. 60 (4): P215–P218. doi:10.1093/geronb/60.4.P215. 
  238. 238.0 238.1 "4 examples of herd mentality (and how to take advantage of it)". iwillteachyoutoberich.com. Retrieved 27 January 2021. 
  239. Pohl, Rüdiger; Pohl, Rüdiger F. (2004). Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Psychology Press. ISBN 978-1-84169-351-4. 
  240. Hsee, Christopher K.; Zhang, Jiao. "General Evaluability Theory". doi:10.1177/1745691610374586. 
  241. Marfice, Christina. "How to Use the Framing Effect to Sell More Products". www.plytix.com. Retrieved 6 March 2021. 
  242. "Overcoming Bias". overcomingbias.com. Retrieved 13 March 2020. 
  243. "The "Ostrich Effect" and the Relationship between the Liquidity and the Yields of Financial Assets". The Journal of Business. doi:10.2139/ssrn.431180. 
  244. "Ostrich Effect". thinkingcollaborative.com. Retrieved 8 May 2020. 
  245. 245.0 245.1 Rickford, John R.; Wasow, Thomas; Zwicky, Arnold (2007). "Intensive and quotative all: something new, something old". American Speech. 82 (1): 3–31. doi:10.1215/00031283-2007-001Freely accessible. 
  246. 246.0 246.1 "Climate Change 3: The Grand Narrative Availability Cascade is Making Us Stupid". americanexperiment.org. Retrieved 14 January 2021. 
  247. "APA PsycNet". psycnet.apa.org. Retrieved 28 July 2021. 
  248. Hamblin, James (November 4, 2013). "Cheerleader Effect: Why People Are More Beautiful in Groups". The Atlantic. Retrieved December 5, 2015. 
  249. Carragher, Daniel J.; Thomas, Nicole A.; Gwinn, O. Scott; Nicholls, Mike E. R. "Limited evidence of hierarchical encoding in the cheerleader effect". 
  250. "Why We Spend Coins Faster Than Bills". NPR. May 12, 2009. Retrieved 7 April 2020. 
  251. "Denomination effect". nlpnotes.com. Retrieved 7 May 2020. 
  252. "Pdf." (PDF). 
  253. "The Backfire Effect: Why Facts Don't Always Change Minds – Effectiviology". effectiviology.com. Retrieved 27 January 2021. 
  254. Ross, Lee; Lepper, Mark; Ward, Andrew (30 June 2010). "History of Social Psychology: Insights, Challenges, and Contributions to Theory and Application". Handbook of Social Psychology. doi:10.1002/9780470561119.socpsy001001. 
  255. "Naive Realism". psychology.iresearchnet.com. Retrieved 17 July 2020. 
  256. Chang, Che-hsu Joe; Menéndez, Cammie Chaumont; Robertson, Michelle M.; Amick, Benjamin C.; Johnson, Peter W.; del Pino, Rosa J.; Dennerlein, Jack T. (November 2010). "Daily self-reports resulted in information bias when assessing exposure duration to computer use". American Journal of Industrial Medicine. 53 (11): 1142–1149. doi:10.1002/ajim.20878. 
  257. "Information bias". Catalog of Bias. 13 November 2019. Retrieved 25 July 2021. 
  258. "Information Bias". catalogofbias.org. Retrieved 22 September 2020. 
  259. Serfas, Sebastian (6 December 2010). Cognitive Biases in the Capital Investment Context: Theoretical Considerations and Empirical Experiments on Violations of Normative Rationality. Springer Science & Business Media. ISBN 978-3-8349-6485-4. 
  260. "Cognitive Biases — The IKEA Effect". medium.com. Retrieved 14 August 2020. 
  261. "What is the Ikea Effect?". bloomreach.com. Retrieved 7 May 2020. 
  262. "Thinking, Fast and Slow". www.goodreads.com. Retrieved 16 June 2021. 
  263. "Marketers Need To Be Aware Of Cognitive Bias". thecustomer.net. Retrieved 12 March 2020. 
  264. "Study Finds That Memory Works Differently in the Age of Google". Columbia University. July 14, 2011. 
  265. "The Google Effect and Digital Amnesia: How We Use Machines to Remember". effectiviology.com. Retrieved 16 July 2020. 
  266. Chivers, Tom (13 December 2011). "An unconfirmed sighting of the elusive Higgs boson". Daily Telegraph. 
  267. "When a statistically significant observation should be overlooked.". thedecisionlab.com. Retrieved 7 May 2020. 
  268. Buonomano, Dean (11 July 2011). Brain Bugs: How the Brain's Flaws Shape Our Lives. W. W. Norton & Company. ISBN 978-0-393-08195-4. 
  269. Banaji, Mahzarin R. (18 April 2014). Blindspot: Hidden Biases of Good People. Penguin Books Limited. ISBN 978-81-8475-930-3. 
  270. Quoidbach, Jordi; Gilbert, Daniel T.; Wilson, Timothy D. (2013-01-04). "The End of History Illusion" (PDF). Science. 339 (6115): 96–98. PMID 23288539. doi:10.1126/science.1229294. Young people, middle-aged people, and older people all believed they had changed a lot in the past but would change relatively little in the future. 
  271. "Why You Won't Be the Person You Expect to Be". nytimes.com. Retrieved 7 May 2020. 
  272. "The Art of Thinking Clearly" (PDF). xqdoc.imedao.com. Retrieved 28 July 2021. 
  273. Nantchev, Adrian. 50 Cognitive Biases for an Unfair Advantage in Entrepreneurship. CreateSpace Independent Publishing Platform. ISBN 978-1-5376-0327-8. 
  274. Priest, Henry. BIASES and HEURISTICS: The Complete Collection of Cognitive Biases and Heuristics That Impair Decisions in Banking, Finance and Everything Else. Amazon Digital Services LLC - KDP Print US. ISBN 978-1-0784-3231-3. 
  275. "Cognitive biases". trends.google.com. Retrieved 15 January 2021. 
  276. "Google Books Ngram Viewer". books.google.com. Retrieved 28 January 2021. 
  277. "Cognitive biases". wikipediaviews.org. Retrieved 19 January 2021.