Difference between revisions of "Timeline of cognitive biases"

 
| 1998 || || Concept development || The {{w|implicit-association test}} is introduced in the scientific literature by {{w|Anthony Greenwald}}, Debbie McGhee, and Jordan Schwartz.<ref name = "Greenwald 1998">{{Citation | title = Measuring Individual Differences in Implicit Cognition: The Implicit Association Test | year = 1998 | journal = Journal of Personality and Social Psychology | pages = 1464–1480 | volume = 74 | issue = 6 | last1 = Greenwald| first1 =  Anthony G. | last2 =  McGhee | first2 =  Debbie E. | last3 =  Schwartz | first3 =  Jordan L.K. | doi=10.1037/0022-3514.74.6.1464 | pmid=9654756}}</ref> || The {{w|implicit-association test}} is "a reaction time based categorization task that measures the differential associative strength between bipolar targets and evaluative attribute concepts as an approach to indexing implicit beliefs or biases."<ref>{{cite journal |last1=Healy |first1=Graham F. |last2=Boran |first2=Lorraine |last3=Smeaton |first3=Alan F. |title=Neural Patterns of the Implicit Association Test |doi=10.3389/fnhum.2015.00605 |pmid=26635570 |url=https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4656831/ |pmc=4656831}}</ref>
 
 
|-
 
| 1998 || Belief, decision-making and behavioral ({{w|extension neglect}}) || Concept development || Hsee discovers a less-is-better effect in three contexts: "(1) a person giving a $45 scarf (from scarves ranging from $5-$50) as a gift was perceived to be more generous than one giving a $55 coat (from coats ranging from $50-$500); (2) an overfilled ice cream serving with 7 oz of ice cream was valued more than an underfilled serving with 8 oz of ice cream; (3) a dinnerware set with 24 intact pieces was judged more favourably than one with 31 intact pieces (including the same 24) plus a few broken ones."<ref name="hsee">{{cite journal|last=Hsee|first=Christopher K.|title=Less Is Better: When Low-value Options Are Valued More Highly than High-value Options|journal=Journal of Behavioral Decision Making|year=1998|volume=11|issue=2|pages=107–121|doi=10.1002/(SICI)1099-0771(199806)11:2<107::AID-BDM292>3.0.CO;2-Y |url=http://faculty.chicagobooth.edu/christopher.hsee/vita/papers/LessIsBetter.pdf}}</ref> || "The {{w|less-is-better effect}} is the tendency to prefer the smaller or the lesser alternative when choosing individually, but not when evaluating together."<ref>{{cite web |title=Why we prefer the smaller or the lesser alternative |url=https://thedecisionlab.com/biases/less-is-better-effect/ |website=thedecisionlab.com |accessdate=7 May 2020}}</ref>
 
|-
 
| 1999 || Belief, decision-making and behavioral || Concept development || The psychological phenomenon of illusory superiority known as the {{w|Dunning–Kruger effect}} is identified as a form of cognitive bias in Kruger and Dunning's 1999 study, ''Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments''.<ref name="Kruger">{{cite journal |last=Kruger |first=Justin |last2=Dunning |first2=David |date=1999 |title=Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments |journal={{w|Journal of Personality and Social Psychology}} |volume=77 |issue=6 |pages=1121–1134|doi=10.1037/0022-3514.77.6.1121 |pmid=10626367}}</ref> || "The Dunning-Kruger effect is a cognitive bias in which people wrongly overestimate their knowledge or ability in a specific area."<ref>{{cite web |title=Dunning-Kruger Effect |url=https://www.psychologytoday.com/intl/basics/dunning-kruger-effect |website=psychologytoday.com |accessdate=14 August 2020}}</ref>
 
|-
 
| 1999 || Memory bias || Concept development || The term "{{w|spotlight effect}}" is coined by {{w|Thomas Gilovich}} and Kenneth Savitsky.<ref name=":0">{{Cite journal |pmid = 10707330|year = 2000|last1 = Gilovich|first1 = T.|title = The spotlight effect in social judgment: An egocentric bias in estimates of the salience of one's own actions and appearance|journal = Journal of Personality and Social Psychology|volume = 78|issue = 2|pages = 211–222|last2 = Medvec|first2 = V. H.|last3 = Savitsky|first3 = K.|doi = 10.1037//0022-3514.78.2.211|url=https://web.archive.org/web/20131030215508/http://www.psych.cornell.edu/sites/default/files/Gilo.Medvec.Sav_.pdf}}</ref> The phenomenon first appears in the psychological literature in the journal ''{{w|Current Directions in Psychological Science}}''. || "The {{w|spotlight effect}} refers to the tendency to think that more people notice something about you than they do."<ref>{{cite web |title=The Spotlight Effect |url=https://www.psychologytoday.com/us/blog/the-big-questions/201111/the-spotlight-effect |website=psychologytoday.com |accessdate=14 August 2020}}</ref>
 
|-
 
| 1999 || Social ({{w|egocentric bias}}) || Concept development || Kruger and Gilovich publish a study titled ''Naive cynicism in everyday theories of responsibility assessment: On biased assumptions of bias'', which formally introduces the concept of {{w|naïve cynicism}}.<ref name="Kruger 1999">{{cite journal|last1=Kruger|first1=Justin|last2=Gilovich|first2=Thomas|title='Naive cynicism' in everyday theories of responsibility assessment: On biased assumptions of bias.|journal=Journal of Personality and Social Psychology|date=1999|volume=76|issue=5|pages=743–753|doi=10.1037/0022-3514.76.5.743}}</ref> || {{w|Naïve cynicism}} is "the tendency of laypeople to expect other people’s judgments will have a motivational basis and therefore will be biased in the direction of their self-interest."<ref>{{cite web |title=Naive Cynicism |url=http://psychology.iresearchnet.com/social-psychology/decision-making/naive-cynicism/ |website=psychology.iresearchnet.com |accessdate=16 July 2020}}</ref>
 
|-
 
| 2002 || Belief, decision-making and behavioral || Concept development || {{w|Daniel Kahneman}} and {{w|Shane Frederick}} propose the process of {{w|attribute substitution}}.<ref name="revisited">{{cite book |last= Kahneman |first=Daniel |first2=Shane |last2=Frederick |title=Heuristics and Biases: The Psychology of Intuitive Judgment |editor=Thomas Gilovich |editor2=Dale Griffin |editor3=Daniel Kahneman |publisher=Cambridge University Press |location=Cambridge |year=2002 |pages=49–81 |chapter=Representativeness Revisited: Attribute Substitution in Intuitive Judgment |isbn=978-0-521-79679-8}}</ref> || "{{w|Attribute substitution}} occurs when an individual has to make a judgment (of a target attribute) that is computationally complex, and instead substitutes a more easily calculated heuristic attribute."<ref>{{cite web |title=Attribute substitution- a quick guide |url=https://biasandbelief.wordpress.com/2009/06/01/attribute-substitution/ |website=biasandbelief.wordpress.com |accessdate=7 May 2020}}</ref>
 
|-
 
| 2002 || Social ({{w|egocentric bias}}) || Concept development || Pronin et al. introduce the concept of the "{{w|bias blind spot}}".<ref name=dfds>{{cite journal |last1=Pronin |first1=Emily |last2=Lin |first2=Daniel Y. |last3=Ross |first3=Lee |title=The Bias Blind Spot: Perceptions of Bias in Self Versus Others |doi=10.1177/0146167202286008 |url=https://www.researchgate.net/publication/241096502_The_Bias_Blind_Spot_Perceptions_of_Bias_in_Self_Versus_Others#:~:text=(2002)%20call%20the%20%22bias,biases%20in%20their%20own%20thinking.}}</ref> || The bias blind spot "refers to the tendency for people to be able to identify distortionary biases in others, while being ignorant of and susceptible to precisely these biases in their own thinking."<ref name=dfds/>
 
|-
 
| 2002 || || Research || {{w|Bystander effect}}. Research indicates that priming a social context may inhibit helping behavior. Imagining being around one other person or being around a group of people can affect a person's willingness to help.<ref>{{cite journal | last1 = Garcia | first1 = S.M. | last2 = Weaver | first2 = K. | last3 = Darley | first3 = J.M. | last4 = Moskowitz | first4 = G.B. | year = 2002 | title = Crowded minds: the implicit bystander effect | journal = Journal of Personality and Social Psychology | volume = 83 | issue = 4| pages = 843–853 | doi=10.1037/0022-3514.83.4.843| pmid = 12374439 }}</ref> || "The bystander effect occurs when the presence of others discourages an individual from intervening in an emergency situation."<ref>{{cite web |title=Bystander Effect |url=https://www.psychologytoday.com/intl/basics/bystander-effect |website=psychologytoday.com |accessdate=7 May 2020}}</ref>
 
|-
 
| 2003 || Belief, decision-making and behavioral || Concept development || The term ''{{w|projection bias}}'' is first introduced in the paper ''Projection Bias in Predicting Future Utility'' by Loewenstein, O'Donoghue and Rabin.<ref name=Frederick2011>{{cite book|last1=Frederick|first1=Shane|last2=Loewenstein|first2=George|last3=O'Donoghue|first3=Ted|editor1-last=Camerer|editor1-first=Colin F.|editor2-last=Loewenstein|editor2-first=George|editor3-last=Rabin|editor3-first=Matthew|title=Advances in Behavioral Economics|date=2011|publisher=Princeton University Press|isbn=978-1400829118|pages=187–188|chapter-url=https://books.google.com/books?id=sA4jJOjwCW4C&pg=PA187|language=en|chapter=Time Discounting and Time Preference: A Critical Review|ref=harv}}</ref> || "It refers to people’s assumption that their tastes or preferences will remain the same over time"<ref>{{cite web |title=Projection bias |url=https://www.behavioraleconomics.com/resources/mini-encyclopedia-of-be/projection-bias/ |website=behavioraleconomics.com |accessdate=7 May 2020}}</ref>
 
|-
 
| 2003 || || Concept development || Lovallo and Kahneman propose an expanded definition of the {{w|planning fallacy}} as the tendency to underestimate the time, costs, and risks of future actions while at the same time overestimating the benefits of the same actions. Under this definition, the planning fallacy results not only in time overruns, but also in {{w|cost overruns}} and {{w|benefit shortfall}}s.<ref>{{cite journal |last1=Lovallo |first1=Dan |first2=Daniel |last2=Kahneman |date=July 2003 |title=Delusions of Success: How Optimism Undermines Executives' Decisions |journal=Harvard Business Review |volume=81 |issue=7 |pages=56–63|pmid=12858711 |url=https://hbr.org/2003/07/delusions-of-success-how-optimism-undermines-executives-decisions}}</ref> || {{w|Planning fallacy}}
 
|-
 
| 2004 || Social bias || Literature || American journalist {{w|James Surowiecki}} publishes ''{{w|The Wisdom of Crowds}}'', which explores herd mentality and draws the conclusion that the decisions made by groups are often better and more accurate than those made by any individual member.<ref name=sdf/> || "Herd mentality (also known as mob mentality) describes a behavior in which people act the same way or adopt similar behaviors as the people around them — often ignoring their own feelings in the process."<ref name=sdf>{{cite web |title=4 examples of herd mentality (and how to take advantage of it) |url=https://www.iwillteachyoutoberich.com/blog/herd-mentality/#:~:text=Herd%20mentality%20(also%20known%20as,what%20the%20herd%20is%20doing. |website=iwillteachyoutoberich.com |access-date=27 January 2021}}</ref>
 
| 2006 || || Organization || Overcoming Bias launches as a group blog on the "general theme of how to move our beliefs closer to reality, in the face of our natural biases such as overconfidence and wishful thinking, and our bias to believe we have corrected for such biases, when we have done no such thing."<ref>{{cite web |title=Overcoming Bias |url=http://www.overcomingbias.com/about |website=overcomingbias.com |accessdate=13 March 2020}}</ref> ||
 
 
|-
 
| 2006 || Belief, decision-making and behavioral || Concept development || The term ''{{w|ostrich effect}}'' is coined by Galai and Sade.<ref>{{cite journal |title=The "Ostrich Effect" and the Relationship between the Liquidity and the Yields of Financial Assets |journal=The Journal of Business |doi=10.2139/ssrn.431180}}</ref> || "The {{w|ostrich effect}} bias is a tendency to ignore dangerous or negative information by ignoring it or burying one's head in the sand"<ref>{{cite web |title=Ostrich Effect |url=https://www.thinkingcollaborative.com/stj/ostrich-effect/ |website=thinkingcollaborative.com |accessdate=8 May 2020}}</ref>
 
|-
 
| 2007 || Belief, decision-making and behavioral || Concept development || The term ''{{w|recency illusion}}'' is coined by {{w|Stanford University}} linguist {{w|Arnold Zwicky}}.<ref name="sssa">{{cite journal |authorlink1= John R. Rickford |last1=Rickford |first1=John R. |last2=Wasow |first2=Thomas |last3=Zwicky |first3=Arnold |date=2007 |title=Intensive and quotative ''all'': something new, something old |journal=American Speech |doi=10.1215/00031283-2007-001 |volume=82 |issue=1 |pages=3–31|doi-access=free }}</ref> || The {{w|recency illusion}} is "the belief or impression that a word or language usage is of recent origin when it is long-established."<ref name="sssa"/>
 
|-
 
| 2007 || Social (conformity bias) || Concept development || The concept of an “availability cascade” is defined by professors Timur Kuran and Cass Sunstein.<ref name="sdf">{{cite web |title=Climate Change 3: The Grand Narrative Availability Cascade is Making Us Stupid |url=https://www.americanexperiment.org/2016/11/the-grand-narrative-availability-cascade-is-making-us-stupid/ |website=americanexperiment.org |access-date=14 January 2021}}</ref> || An availability cascade refers to the "self-reinforcing process of collective belief formation by which an expressed perception triggers a chain reaction that gives the perception of increasing plausibility through its rising availability in public discourse."<ref name="sdf"/>
 
|-
 
| 2008 || Social bias ({{w|association fallacy}}) || Concept development || {{w|Cheerleader effect}}. "The phrase was coined by the character {{w|Barney Stinson}} in "{{w|Not a Father's Day}}", an episode of the television series ''{{w|How I Met Your Mother}}'', first aired in November 2008. Barney points out to his friends a group of women that initially seem attractive, but who all seem to be very ugly when examined individually. This point is made again by [[w:Ted Mosby|Ted]] and [[w:Robin Scherbatsky|Robin]] later in the episode, who note that some of Barney's friends also only seem attractive in a group."<ref>{{cite web|url=https://www.theatlantic.com/business/archive/2013/11/cheerleader-effect-why-people-are-more-beautiful-in-groups/281119/|title=Cheerleader Effect: Why People Are More Beautiful in Groups|work={{w|The Atlantic}}|last=Hamblin|first=James|date=November 4, 2013|accessdate=December 5, 2015}}</ref> || "The {{w|cheerleader effect}} refers to the increase in attractiveness that an individual face experiences when seen in a group of other faces."<ref>{{cite journal |last1=Carragher |first1=Daniel J. |last2=Thomas |first2=Nicole A. |last3=Gwinn |first3=O. Scott |last4=Nicholls |first4=Mike E. R. |title=Limited evidence of hierarchical encoding in the cheerleader effect |url=https://www.nature.com/articles/s41598-019-45789-6}}</ref>
 
|-
 
| 2009 || Belief, decision-making and behavioral ({{w|framing effect}}) || Concept development || The concept of the {{w|denomination effect}} is proposed by Priya Raghubir, professor at the {{w|New York University Stern School of Business}}, and Joydeep Srivastava, professor at the [[w:University of Maryland, College Park|University of Maryland]], in their paper.<ref name="NPR">{{cite news|title=Why We Spend Coins Faster Than Bills|url=https://www.npr.org/templates/story/story.php?storyId=104063298|accessdate=7 April 2020|publisher=NPR|date=May 12, 2009}}</ref> || The {{w|denomination effect}} relates "to currency, whereby people are less likely to spend larger bills than their equivalent value in smaller bills."<ref>{{cite web |title=Denomination effect |url=http://nlpnotes.com/denomination-effect/ |website=nlpnotes.com |accessdate=7 May 2020}}</ref>
 
|-
| 2010 || Belief, decision-making and behavioral ({{w|confirmation bias}}) || Concept development ||  The phrase ''{{w|backfire effect}}'' is first coined by American political scientist {{w|Brendan Nyhan}} and Jason Reifler.<ref>{{Cite web|url=http://www.dartmouth.edu/~nyhan/nyhan-reifler.pdf|title=Pdf.}}</ref> || "The backfire effect is a cognitive bias that causes people who encounter evidence that challenges their beliefs to reject that evidence, and to strengthen their support of their original stance."<ref>{{cite web |title=The Backfire Effect: Why Facts Don’t Always Change Minds – Effectiviology |url=https://effectiviology.com/backfire-effect-facts-dont-change-minds/ |website=effectiviology.com |access-date=27 January 2021}}</ref>
 
|-
 
| 2010 || Belief, decision-making and behavioral ({{w|egocentric bias}}) || Research || The ''Handbook of Social Psychology'' recognizes {{w|naïve realism}} as one of "four hard-won insights about [[w:Perception|human perception]], [[w:Thought|thinking]], {{w|motivation}} and {{w|behavior}} that... represent important, indeed foundational, contributions of {{w|social psychology}}."<ref>{{cite journal |last1=Ross |first1=Lee |last2=Lepper |first2=Mark |last3=Ward |first3=Andrew |title=History of Social Psychology: Insights, Challenges, and Contributions to Theory and Application |journal=Handbook of Social Psychology |date=30 June 2010 |pages=socpsy001001 |doi=10.1002/9780470561119.socpsy001001}}</ref> || "{{w|Naïve realism}} describes people’s tendency to believe that they perceive the social world “as it is”—as objective reality—rather than as a subjective construction and interpretation of reality."<ref>{{cite web |title=Naive Realism |url=http://psychology.iresearchnet.com/social-psychology/decision-making/naive-realism/ |website=psychology.iresearchnet.com |accessdate=17 July 2020}}</ref>
 
|-
| 2011 || Belief, decision-making and behavioral || Concept development || The {{w|IKEA effect}} is identified and named by {{w|Michael I. Norton}} of {{w|Harvard Business School}}, Daniel Mochon of {{w|Yale}}, and {{w|Dan Ariely}} of {{w|Duke University}}, who publish the results of three studies in this year.<ref>{{cite web |title=Cognitive Biases — The IKEA Effect |url=https://medium.com/@michaelgearon/cognitive-biases-the-ikea-effect-d994ea6a28ad |website=medium.com |accessdate=14 August 2020}}</ref> || "The Ikea Effect is the cognitive phenomena where customers get more excited and place a higher value in the products they have partially created, modified or personalized."<ref>{{cite web |title=What is the Ikea Effect? |url=https://www.bloomreach.com/en/blog/2019/08/ikea-effect.html |website=bloomreach.com |accessdate=7 May 2020}}</ref>  
 
|-
| 2011 || Memory bias || Concept development || The {{w|Google effect}}, also known as “digital amnesia”, is first described by Betsy Sparrow from {{w|Columbia University}} and her colleagues. Their paper describes the results of several memory experiments involving technology.<ref name="thecustomer.net">{{cite web |title=Marketers Need To Be Aware Of Cognitive Bias |url=https://thecustomer.net/marketers-need-to-be-aware-of-cognitive-bias/?cn-reloaded=1 |website=thecustomer.net |accessdate=12 March 2020}}</ref><ref name="Columbia">{{cite web|title=Study Finds That Memory Works Differently in the Age of Google |publisher={{w|Columbia University}}|date=July 14, 2011|url=https://web.archive.org/web/20110717092619/http://news.columbia.edu/research/2490}}</ref> || The {{w|Google effect}} "represents people’s tendency to forget information that they can find online, particularly by using search engines such as {{w|Google}}."<ref>{{cite web |title=The Google Effect and Digital Amnesia: How We Use Machines to Remember |url=https://effectiviology.com/the-google-effect-and-digital-amnesia/#:~:text=Summary%20and%20conclusions-,The%20Google%20effect%20is%20a%20psychological%20phenomenon%20that%20represents%20people's,search%20engines%20such%20as%20Google. |website=effectiviology.com |accessdate=16 July 2020}}</ref>
 
|-
 
| 2011 || Belief, decision-making and behavioral || Notable case || The {{w|look-elsewhere effect}}, more generally known in statistics as the {{w|problem of multiple comparisons}}, gains some media attention in the context of the search for the {{w|Higgs boson}} at the {{w|Large Hadron Collider}}.<ref>{{cite web|url=http://blogs.telegraph.co.uk/news/tomchiversscience/100123873/an-unconfirmed-sighting-of-the-elusive-higgs-boson/|title=An unconfirmed sighting of the elusive Higgs boson|author=Tom Chivers|date=2011-12-13|publisher=Daily Telegraph}}</ref> || The {{w|look-elsewhere effect}} "occurs when a statistically significant observation is found but, actually, arose by chance and due to the size of the parameter space and sample observed."<ref>{{cite web |title=When a statistically significant observation should be overlooked. |url=https://thedecisionlab.com/biases/look-elsewhere-effect/ |website=thedecisionlab.com |accessdate=7 May 2020}}</ref>
 
 
| 2012 || Belief, decision-making and behavioral (logical fallacy) || Research || In an article in ''{{w|Psychological Bulletin}}'' it is suggested the {{w|subadditivity effect}} can be explained by an {{w|information-theoretic}} generative mechanism that assumes a noisy conversion of objective evidence (observation) into subjective estimates (judgment).<ref name="HilbertPsychBull">{{cite journal|last1=Hilbert|first1=Martin|title=Toward a synthesis of cognitive biases: How noisy information processing can bias human decision making|journal=Psychological Bulletin|date=2012|volume=138|issue=2|pages=211–237|doi=10.1037/a0025940|pmid=22122235|url=https://web.archive.org/web/20160304023236/http://www.martinhilbert.net/HilbertPsychBull.pdf}}</ref> || The {{w|subadditivity effect}} is "the tendency to judge probability of the whole to be less than the probabilities of the parts".<ref>{{cite web |title=Today's term from psychology is Subadditivity Effect. |url=https://steemit.com/life/@jevh/today-s-term-from-psychology-is-subadditivity-effect |website=steemit.com |accessdate=7 May 2020}}</ref>
 
 
|-
| 2013 || Belief, decision-making and behavioral || Concept development || The term “{{w|end-of-history illusion}}” originates in a journal article by psychologists Jordi Quoidbach, [[w:Daniel Gilbert (psychologist)|Daniel Gilbert]], and {{w|Timothy Wilson}} detailing their research on the phenomenon and leveraging the phrase coined by [[w:The End of History and the Last Man|Francis Fukuyama's 1992 book of the same name]].<ref name="Quoidbach2013">{{cite journal |last1= Quoidbach |first1= Jordi |last2= Gilbert |first2= Daniel T.|last3= Wilson |first3= Timothy D. |date= 2013-01-04 |title= The End of History Illusion |journal= [[w:Science (journal)|Science]] |volume= 339 |issue= 6115 |pages= 96–98 |doi= 10.1126/science.1229294 |pmid= 23288539|quote= Young people, middle-aged people, and older people all believed they had changed a lot in the past but would change relatively little in the future.|url= https://web.archive.org/web/20130113214951/http://www.wjh.harvard.edu/~dtg/Quoidbach%20et%20al%202013.pdf |archivedate= 2013-01-13}}</ref> || The {{w|end-of-history illusion}} occurs "when people tend to underestimate how much they will change in the future.”<ref>{{cite web |title=Why You Won’t Be the Person You Expect to Be |url=https://www.nytimes.com/2013/01/04/science/study-in-science-shows-end-of-history-illusion.html |website=nytimes.com |accessdate=7 May 2020}}</ref>
 
|-
 
|}

Revision as of 21:42, 28 January 2021

This is a timeline of cognitive biases.

Sample questions

The following are some interesting questions that can be answered by reading this timeline:

  • What are the different types of cognitive bias described by the timeline?

..

    • Sort the full timeline by "Event type" and look for the group of rows with value "Notable case".
    • Sort the full timeline by "Event type" and look for the group of rows with value "Concept development".

Big picture

Time period Development summary More details
Before 1972 Pre-concept-development era Multiple concepts later included within the category of cognitive biases are developed throughout history, starting with the ancient Greek philosophers.
1972 onward Modern period The notion of cognitive biases is introduced by Amos Tversky and Daniel Kahneman.
21st century Present time As of 2020, there are approximately 188 recognized cognitive biases.[1]

Visual data

Google Trends

The chart below shows Google Trends data for cognitive biases (topic) from January 2004 to January 2021, when the screenshot was taken.[2]

Cognitive biases gtrends.jpeg

Google Ngram Viewer

The chart shows Google Ngram Viewer data for "cognitive bias", from 1972 (when the concept was created) to 2019.[3]

Cognitive bias ngram.png

Wikipedia Views

The chart below shows pageviews of the English Wikipedia article cognitive bias, from July 2015 to December 2020.[4]

Cognitive biases wv.jpeg

Full timeline

Year Bias type Event type Details Concept definition (when applicable)
c.180 CE Social bias Field development Many philosophers and social theorists observe and consider the phenomenon of belief in a just world, going back to at least as early as the Pyrrhonist philosopher Sextus Empiricus, writing circa 180 CE, who argues against this belief.[5] "The just-world hypothesis is the belief that people get what they deserve since life is fair."[6]
1747 Field development Scottish doctor James Lind conducts the first systematic clinical trial.[7] "Clinical trials are research studies performed in people that are aimed at evaluating a medical, surgical, or behavioral intervention."[8]
1753 Availability bias Field development Anthropomorphism is first attested, originally in reference to the heresy of applying a human form to the Christian God.[9][10] Anthropomorphism is "the interpretation of nonhuman things or events in terms of human characteristics".[11]
1776–1799 Field development The declinism belief is traced back to Edward Gibbon's work,[12] The History of the Decline and Fall of the Roman Empire, in which Gibbon argues that Rome collapsed due to the gradual loss of civic virtue among its citizens.[13] Declinism is "the tendency to believe that the worst is to come".[14]
1796 Literature French scholar Pierre-Simon Laplace describes in A Philosophical Essay on Probabilities the ways in which men calculate their probability of having sons: "I have seen men, ardently desirous of having a son, who could learn only with anxiety of the births of boys in the month when they expected to become fathers. Imagining that the ratio of these births to those of girls ought to be the same at the end of each month, they judged that the boys already born would render more probable the births next of girls." The expectant fathers feared that if more sons were born in the surrounding community, then they themselves would be more likely to have a daughter. This essay by Laplace is regarded as one of the earliest descriptions of the fallacy.[15] "The Gambler's Fallacy is the misconception that something that has not happened for a long time has become 'overdue', such as a coin coming up heads after a series of tails."[16]
1847 Concept development The term Semmelweis effect derives from the name of a Hungarian physician, Ignaz Semmelweis, who discovered in 1847 that childbed fever mortality rates fell ten-fold when doctors disinfected their hands with a chlorine solution before moving from one patient to another, or, most particularly, after an autopsy. The Semmelweis effect is a metaphor for the reflex-like tendency to reject new evidence or new knowledge because it contradicts established norms, beliefs, or paradigms.[17] Semmelweis effect "refers to the tendency to automatically reject new information or knowledge because it contradicts current thinking or beliefs."[18]
1848 Social (conformity bias) Concept development The phrase "jump on the bandwagon" first appears in American politics when entertainer Dan Rice uses his bandwagon and its music to gain attention for his political campaign appearances. As his campaign becomes more successful, other politicians strive for a seat on the bandwagon, hoping to be associated with his success.[19] The bandwagon effect "is a psychological phenomenon whereby people do something primarily because other people are doing it, regardless of their own beliefs, which they may ignore or override."[20]
1850 Concept development The first reference to “stereotype” appears as a noun that means “image perpetuated without change.”[21]
1860 Concept development Both Weber's law and Fechner's law are published by Gustav Theodor Fechner in the work Elemente der Psychophysik (Elements of Psychophysics). This publication is the first work ever in this field, and where Fechner coins the term psychophysics to describe the interdisciplinary study of how humans perceive physical magnitudes.[22] Weber–Fechner law "states that the change in a stimulus that will be just noticeable is a constant ratio of the original stimulus."[23]
1866 Belief, decision-making and behavioral (apophenia) Concept development The German word pareidolie is used by the German psychiatrist Karl Ludwig Kahlbaum in his paper On Delusion of the Senses.[24] Pareidolia is "the tendency to perceive a specific, often meaningful image in a random or ambiguous visual pattern."[25]
1874 Memory bias Field development The first documented instance of cryptomnesia occurs with the medium Stainton Moses.[26][27] Cryptomnesia is "an implicit memory phenomenon in which people mistakenly believe that a current thought or idea is a product of their own creation when, in fact, they have encountered it previously and then forgotten it".[28]
1876 Memory bias Field development German experimental psychologist Gustav Fechner conducts the earliest known research on the mere-exposure effect.[29] Mere-exposure effect "means that people prefer things that they are most familiar with".[30]
1882 Concept development The term specious present is first introduced by the philosopher E. R. Clay.[31][32] Specious present "is the time duration wherein a state of consciousness is experienced as being in the present".[33]
1885 Memory bias Concept development The phenomenon of spacing effect is first identified by Hermann Ebbinghaus, and his detailed study of it is published in his book Über das Gedächtnis. Untersuchungen zur experimentellen Psychologie (Memory: A Contribution to Experimental Psychology). "The spacing effect describes the robust finding that long-term learning is promoted when learning events are spaced out in time, rather than presented in immediate succession".[34]
1890 Memory bias Concept development The tip of the tongue phenomenon is first described as a psychological phenomenon in the text The Principles of Psychology by William James.[35] Tip of the tongue describes "a state in which one cannot quite recall a familiar word but can recall words of similar form and meaning".[36]
1893 Memory bias Concept development Childhood amnesia is first formally reported by psychologist Caroline Miles in her article A Study of Individual Psychology, published in the American Journal of Psychology.[37] Childhood amnesia "refers to the fact that most people cannot remember events that occurred before the age of 3 or 4"[38]
1906 Social (conformity bias) Concept development The first known use of bandwagon effect occurs in this year.[39] "Bandwagon effect is when an idea or belief is being followed because everyone seems to be doing so."[40]
1906 Social bias Field development American sociologist William Sumner posits that humans are a species that join together in groups by their very nature. However, he also maintains that humans have an innate tendency to favor their own group over others, proclaiming how "each group nourishes its own pride and vanity, boasts itself superior, exists in its own divinities, and looks with contempt on outsiders".[41] In-group favoritism is "the tendency to favor members of one's own group over those in other groups"[42]
1909 Memory bias Concept development The first documented empirical studies on the testing effect are published by Edwina E. Abbott.[43][44] "Testing effect is the finding that long-term memory is often increased when some of the learning period is devoted to retrieving the to-be-remembered information."[45]
1913 Concept development The term "Monte Carlo fallacy" originates from the best known example of the phenomenon, which occurs in the Monte Carlo Casino.[46] Monte Carlo fallacy "occurs when an individual erroneously believes that a certain random event is less likely or more likely, given a previous event or a series of events."[47]
1914 Memory bias Concept development The first research on the cross-race effect is published.[48] Cross-race effect is "the tendency for eyewitnesses to be better at recognizing members of their own race/ethnicity than members of other races."[49]
1920 Social bias Concept development The halo effect is named by psychologist Edward Thorndike[50] in reference to a person being perceived as having a halo. He gives the phenomenon its name in his article A Constant Error in Psychological Ratings.[51] In "Constant Error", Thorndike sets out to replicate the study in hopes of pinning down the bias that he thought was present in these ratings. Subsequent researchers would study it in relation to attractiveness and its bearing on the judicial and educational systems.[52] Thorndike originally coins the term referring only to people; however, its use would be greatly expanded especially in the area of brand marketing.[51] "First coined back in 1920, the halo effect describes how our impression of a person forms a halo around our conception of their character." "The term was coined by psychologist Edwin Thorndike in 1920."[53][54] Halo effect refers to an "error in reasoning in which an impression formed from a single trait or characteristic is allowed to influence multiple judgments or ratings of unrelated factors."[55]
1922 Concept development The term “stereotype” is first used in the modern psychological sense by American journalist Walter Lippmann in his work Public Opinion.[21] "Stereotype is most frequently now employed to refer to an often unfair and untrue belief that many people have about all people or things with a particular characteristic."[56]
1927 Memory bias Research Russian psychologist Bluma Zeigarnik publishes in the journal Psychologische Forschung a report on a series of experiments uncovering the processes underlying the phenomenon later called Zeigarnik effect.[57] Russian psychologist Bluma Zeigarnik first studies the phenomenon after her professor and Gestalt psychologist Kurt Lewin noticed that a waiter had better recollections of still unpaid orders. However, after the completion of the task – after everyone had paid – he was unable to remember any more details of the orders. Zeigarnik then designed a series of experiments to uncover the processes underlying this phenomenon. Her research report was published in 1927, in the journal Psychologische Forschung.[58] Zeigarnik effect is the "tendency to remember interrupted or incomplete tasks or events more easily than tasks that have been completed."[59]
1928 Belief, decision-making and behavioral Literature American economist Irving Fisher publishes The Money Illusion, which develops the concept of the same name.[60] "Money illusion posits that people have a tendency to view their wealth and income in nominal dollar terms, rather than recognize its real value, adjusted for inflation."[61]
1930 Concept development English epistemologist C. D. Broad further elaborates on the concept of the specious present and states that it may be considered as the temporal equivalent of a sensory datum.[32] "The specious present is a term applied to that short duration of time the human mind appears to be able to experience, a period which exists between past and future and which is longer than the singular moment of the actual present."[62]
1932 Memory bias Field development Some of the earliest evidence for the fading affect bias dates back to a study by Cason, who uses a retrospective procedure in which participants recall and rate past events and emotions when prompted; he finds that recalled emotional intensity for positive events is generally stronger than that for negative events.[63] The Fading Affect Bias "indicates that the emotional response prompted by positive memories often tends to be stronger than the emotional response prompted by negative memories."[64]
1933 Memory bias Concept development The Von Restorff effect theory is coined by German psychiatrist and pediatrician Hedwig von Restorff, who, in her study, finds that when participants are presented with a list of categorically similar items with one distinctive, isolated item on the list, memory for the item is improved.[65] "It predicts that when multiple similar objects are present, the one that differs from the rest is most likely to be remembered."[66]
1945 Belief, decision-making and behavioral (anchoring bias) Concept development Karl Duncker defines functional fixedness as being a "mental block against using an object in a new way that is required to solve a problem".[67] Functional fixedness "is the inability to realize that something known to have a particular use may also be used to perform other functions."[68]
1946 Belief, decision-making and behavioral (logical fallacy) Concept development American statistician Joseph Berkson illustrates what would later be known as Berkson's paradox, one of the most famous paradoxes in probability and statistics.[69] Berkson's paradox, also known as Berkson's bias or fallacy, "is a type of selection bias - a mathematical result found in the fields of conditional probability and statistics in which two variables can be negatively correlated even though they have the appearance of being positively correlated within the population."[70]
1947 Belief, decision-making and behavioral (extension neglect) Concept development Joseph Stalin introduces the concept of compassion fade with his statement “the death of one man is a tragedy, the death of millions is a statistic”.[71] Compassion fade "refers to the decrease in the compassion one shows for the people in trouble as the number of the victims increase."[72]
1952 Social (conformity bias) Concept development William H. Whyte Jr. derives the term groupthink from George Orwell's Nineteen Eighty-Four and popularizes it in Fortune magazine:
Groupthink being a coinage – and, admittedly, a loaded one – a working definition is in order. We are not talking about mere instinctive conformity – it is, after all, a perennial failing of mankind. What we are talking about is a rationalized conformity – an open, articulate philosophy which holds that group values are not only expedient but right and good as well.[73][74]
"Groupthink is a psychological phenomenon in which people strive for consensus within a group."[75]
1954 Social bias Concept development The social comparison theory is initially proposed by social psychologist Leon Festinger. It centers on the belief that there is a drive within individuals to gain accurate self-evaluations.[76] The social comparison theory refers to "the idea that individuals determine their own social and personal worth based on how they stack up against others".[77]
1956 Concept development The term "Barnum effect" is coined by psychologist Paul Meehl in his essay Wanted – A Good Cookbook, because he relates the vague personality descriptions used in certain "pseudo-successful" psychological tests to those given by showman P. T. Barnum.[78][79] Barnum effect is "the phenomenon that occurs when individuals believe that personality descriptions apply specifically to them (more so than to other people), despite the fact that the description is actually filled with information that applies to everyone."[80]
1957 Concept development British naval historian C. Northcote Parkinson describes what is later called Parkinson's law of triviality, which argues that members of an organization give disproportionate weight to trivial issues.[81] Parkinson's law of triviality "explains that people will give more energy and focus to trivial or unimportant items than to more important and complex ones."[82]
1960 Belief, decision-making and behavioral Concept development English psychologist Peter Cathcart Wason first describes the confirmation bias.[83][84][85] "Confirmation bias is the tendency of people to favor information that confirms their existing beliefs or hypotheses."[86]
1960 Belief, decision-making and behavioral (confirmation bias) Concept development Peter Cathcart Wason discovers the classic example of subjects' congruence bias.[87] Congruence bias is "the tendency to test hypotheses exclusively through direct testing, instead of considering possible alternatives."[88]
1961 Social bias Study The Milgram experiment is conducted. This classic experiment establishes the existence of authority bias.[89] "Authority bias is the human tendency to attribute greater authority and knowledge to persons of authority (fame, power, position, etc.) than they may actually possess."[90]
1961 Belief, decision-making and behavioral (ambiguity effect) Concept development The ambiguity effect is first described by American economist Daniel Ellsberg.[91] "Ambiguity Effect occurs when people prefer options with known probabilities over those with unknown probabilities."[92]
1964 Memory bias Concept development The original work on the telescoping effect is usually attributed to an article by Neter and Waksberg in the Journal of the American Statistical Association.[93] The term telescoping comes from the idea that time seems to shrink toward the present in the way that the distance to objects seems to shrink when they are viewed through a telescope.[93] "The telescoping effect refers to inaccurate perceptions regarding time, where people see recent events as more remote than they are (backward telescoping), and remote events as more recent (forward telescoping)."[94]
1964 Belief, decision-making and behavioral (anchoring bias) Concept development The first recorded statement of the concept of Law of the instrument is Abraham Kaplan's: "I call it the law of the instrument, and it may be formulated as follows: Give a small boy a hammer, and he will find that everything he encounters needs pounding."[95] "The law of the instrument principle states that when we acquire a specific tool/skill, we tend to see opportunities to use that tool/skill everywhere."[96]
1966 Study An experiment shows that people remember a group of words better if they are within the same theme category. Such words that generate recall by association are known as semantic cues.[97]
1966 Social (egocentric bias) Research Walster hypothesizes that it can be frightening to believe that a misfortune could happen to anyone at random, and that attributing responsibility to the person(s) involved helps to manage this emotional reaction.[98] This idea becomes known as the defensive attribution hypothesis.
1967 Belief, decision-making and behavioral Notable case Risk compensation. Sweden experiences a drop in crashes and fatalities, following the change from driving on the left to driving on the right. This is linked to the increased apparent risk. The number of motor insurance claims goes down by 40%, returning to normal over the next six weeks.[99][100] Fatality levels would take two years to return to normal.[101] "Risk compensation postulates that humans have a built-in level of acceptable risk-taking and that our behaviour adjusts to this level in a homeostatic manner".[102]
1967 Belief, decision-making and behavioral (apophenia) Concept development Illusory correlation is originally coined by Chapman and Chapman to describe people's tendencies to overestimate relationships between two groups when distinctive and unusual information is presented.[103][104] An illusory correlation occurs when a person perceives a relationship between two variables that are not in fact correlated.[105]
1967 Social (attribution bias) Research American social psychologist Edward E. Jones and Victor Harris conduct a classic experiment[106] that would later give rise to the phrase fundamental attribution error, coined by Lee Ross.[107] The fundamental attribution error "is the tendency for people to over-emphasize dispositional, or personality-based explanations for behaviors observed in others while under-emphasizing situational explanations".[108]
1968 Belief, decision-making and behavioral (anchoring bias) Concept development American psychologist Ward Edwards discusses the concept of conservatism (belief revision) bias.[109] "Conservatism bias is a mental process in which people maintain their past views or predictions at the cost of recognizing new information."[110]
1968 Social Concept development German-born American psychologist Robert Rosenthal and Lenore Jacobsen first describe what would be called Pygmalion Effect (also called the Galatea effect).[111] Pygmalion Effect "refers to the phenomenon of people improving their performance when others have high expectations of them."[112]
1969 Social (cognitive dissonance) Concept development Researchers confirm the Ben Franklin effect.[113] The Ben Franklin effect refers to "an altruistic reaction that makes a person more likely to do a favor for someone that they have already completed a favor for; more likely than they are to return a favor to someone who has completed a favor for them."[114]
1969 Memory bias Research Crowder and Morton argue that the suffix effect is a reflection of the contribution of the auditory sensory memory or echoic memory to recall in the nonsuffix control condition.[115] "The suffix effect is the selective impairment in recall of the final items of a spoken list when the list is followed by a nominally irrelevant speech item, or suffix."[116]
1971 Social bias Concept development The concept of actor–observer asymmetry (also actor–observer bias) is introduced by Jones and Nisbett. It explains the errors that one makes when forming attributions about the behavior of others.[117] The actor–observer asymmetry "states that people tend to explain their own behavior with situation causes and other people's behavior with person causes".[118]
1972 Concept development The concept of cognitive bias is introduced in this year through the work of researchers Amos Tversky and Daniel Kahneman.[119] Cognitive bias refers to "people's systematic but purportedly flawed patterns of responses to judgment and decision problems."[120]
1973 Memory bias Concept development American academic Baruch Fischhoff attends a seminar where Paul E. Meehl states an observation that clinicians often overestimate their ability to have foreseen the outcome of a particular case, as they claim to have known it all along.[121] "Hindsight bias, the tendency, upon learning an outcome of an event—such as an experiment, a sporting event, a military decision, or a political election—to overestimate one's ability to have foreseen the outcome."[122]
1973 Belief, decision-making and behavioral (egocentric bias) Concept development The illusion of validity bias is first described by Amos Tversky and Daniel Kahneman in their paper.[123] The illusion of validity occurs when an individual overestimates their ability to predict an outcome when analyzing a set of data, especially when the data appears to have a consistent pattern or to "tell a story".[124]
1973 Memory bias Concept development The next-in-line effect is first studied experimentally by Malcolm Brenner. In his experiment, participants each in turn read a word aloud from an index card, and after 25 words were asked to recall as many of the read words as possible. The results show that words read aloud within approximately nine seconds before the subject's own turn are recalled worse than other words.[125] The next-in-line effect describes "people not remembering what other people said because they were too busy rehearsing their own part."[126]
1974 Memory bias Research Elizabeth Loftus and John Palmer conduct a study to investigate the effects of language on the development of false memory.[127] "False memory refers to cases in which people remember events differently from the way they happened or, in the most dramatic case, remember events that never happened at all."[128]
1974 Belief, decision-making and behavioral Concept development Anchoring is first described by Tversky and Kahneman.[129] "Anchoring bias occurs when people rely too much on pre-existing information or the first information they find when making decisions."[130]
1975 Social (attribution bias) Research Miller and Ross conduct one of the earliest studies to assess not only self-serving bias but also the attributions made for successes and failures.[131] "Self-serving bias is the common habit of a person taking credit for positive events or outcomes, but blaming outside factors for negative events."[132]
1976 Belief, decision-making and behavioral (logical fallacy) Concept development Escalation of commitment is first described by Barry M. Staw in his paper Knee deep in the big muddy: A study of escalating commitment to a chosen course of action.[133] Escalation of commitment "refers to the irrational behavior of investing additional resources in a failing project."[134]
1976 Social (attribution bias) Research Prior to Pettigrew's formalization of the ultimate attribution error, Birt Duncan finds that White participants view Black individuals as more violent than White individuals in an "ambiguous shove" situation, where a Black or White person accidentally shoves a White person.[135] The ultimate attribution error is "the tendency for persons from one group (the ingroup) to determine that any bad acts by members of an outgroup—for example, a racial or ethnic minority group—are caused by internal attributes or traits rather than by outside circumstances or situations, while viewing their positive behaviors as merely exceptions to the rule or the result of luck."[136]
1977 Memory bias Research Misattribution of memory. Early research done by Brown and Kulik finds that flashbulb memories are similar to photographs because they can be described in accurate, vivid detail. In this study, participants describe their circumstances about the moment they learned of the assassination of President John F. Kennedy as well as other similar traumatic events. Participants are able to describe what they were doing, things around them, and other details.[137] Misattribution of memory occurs "when a memory is distorted because of the source, context, or our imagination."[138]
1977 Social (egocentric bias) Concept development The false consensus effect is first defined by Ross, Green and House.[139]
1977 Belief, decision-making and behavioral (truthiness) Concept development The illusory truth effect is first identified in a study at Villanova University and Temple University.[140][141] The illusory truth effect "occurs when repeating a statement increases the belief that it’s true even when the statement is actually false."[142]
1977 Social bias Concept development A study conducted by Lee Ross and colleagues provides early evidence for a cognitive bias called the false consensus effect, which is the tendency for people to overestimate the extent to which others share the same views.[143] The false-consensus effect "refers to the tendency to overestimate consensus for one′s attitudes and behaviors."[144]
1977 Memory bias Research T. B. Rogers and colleagues publish the first research on the self-reference effect.[145][146]
1978 Memory bias Research Loftus, Miller, and Burns conduct the original misinformation effect study.[147] The misinformation effect "happens when a person's memory becomes less accurate due to information that happens after the event."[148]
1979 Research Professor Charles G. Lord writes seeking answers as to whether we might overcome the Bacon principle, or whether humans are always held hostage to their initial beliefs even in the face of compelling and contradictory evidence.[149]
1979 Social (attribution bias) Research Thomas Nagel identifies four kinds of moral luck in his essay.[150] "Moral luck occurs when the features of action which generate a particular moral assessment lie significantly beyond the control of the agent who is so assessed."[151]
1979 Social bias Concept development The ultimate attribution error is first established by Thomas F. Pettigrew in his publication The Ultimate Attribution Error: Extending Allport's Cognitive Analysis of Prejudice.[152] "Ultimate attribution error refers to the tendency of individuals to make less internal attributions of negative behaviors committed by ingroup members compared to outgroup members."[153]
1979 Social bias Concept development Daniel Kahneman and Amos Tversky originally coin the term loss aversion in a landmark paper on subjective probability.[154] "Loss aversion is a cognitive bias that suggests that for individuals the pain of losing is psychologically twice as powerful as the pleasure of gaining."[155]
1979 Concept development The planning fallacy is first proposed by Daniel Kahneman and Amos Tversky.[156][157] "The planning fallacy refers to a prediction phenomenon, all too familiar to many, wherein people underestimate the time it will take to complete a future task, despite knowledge that previous tasks have generally taken longer than planned."[158]
1980 Memory bias Concept development The term "egocentric bias" is first coined by Anthony Greenwald, a psychologist at Ohio State University.[159] "The egocentric bias is a cognitive bias that causes people to rely too heavily on their own point of view when they examine events in their life or when they try to see things from other people’s perspective."[160]
1980 Social bias Concept development Ruth Hamill, Richard E. Nisbett, and Timothy DeCamp Wilson become the first to study the first type of group attribution error in detail in their paper Insensitivity to Sample Bias: Generalizing From Atypical Cases.[161] Group attribution error is "the tendency for perceivers to assume that a specific group member’s personal characteristics and preferences, including beliefs, attitudes, and decisions, are similar to those of the group to which he or she belongs."[162]
1980 Belief, decision-making and behavioral (truthiness) Concept development The term subjective validation first appears in the book The Psychology of the Psychic by David F. Marks and Richard Kammann.[163] Subjective validation "causes an individual to consider a statement or another piece of information correct if it has any significance or personal meaning (validating their previous opinion) to them."[164]
1980 Belief, decision-making and behavioral Concept development The phenomenon of optimism bias is initially described by Weinstein, who finds that the majority of college students believe that their chances of developing a drinking problem or getting divorced are lower than those of their peers.[165] "Optimism Bias refers to the tendency for individuals to underestimate their probability of experiencing adverse effects despite the obvious."[166]
1981 Social bias Study Framing effect. "The Framing effect is the principle that our choices are influenced by the way they are framed through different wordings, settings, and situations."[167]
1981 Belief, decision-making and behavioral (prospect theory) Concept development The pseudocertainty effect is illustrated by Daniel Kahneman.[168] "Pseudocertainty effect refers to people's tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes."[169]
1982 Social (egocentric bias) Research Trait ascription bias. In a study involving fifty-six undergraduate psychology students from the University of Bielefeld, Kammer et al. demonstrate that subjects rate their own variability on each of 20 trait terms to be considerably higher than their peers.[170] "Trait ascription bias is the belief that other people's behavior and reactions are generally predictable while you yourself are more unpredictable."[171]
1982 Belief, decision-making and behavioral (framing effect) Research The decoy effect is first demonstrated by Joel Huber and others at Duke University. It describes how, when a customer is hesitating between two options, presenting a third "asymmetrically dominated" option that acts as a decoy strongly influences which decision they make.[172] "The decoy effect is defined as the phenomenon whereby consumers change their preference between two options when presented with a third option."[173]
1983 Social (egocentric bias) Concept development Sociologist W. Phillips Davison first articulates the third-person effect hypothesis, which "is the commonly held belief that other people are more affected, due to personal prejudices, by mass media than you yourself are. This view, largely due to a personal conceit, is caused by the self-concept of being more astute and aware than others, or of being less vulnerable to persuasion than others."[174]
1983 Social (conformity bias) Research Jones reports the presence of courtesy bias in Asian cultures.[175] "Courtesy bias is the tendency that some individuals have of not fully stating their unhappiness with a service or product because of a desire not to offend the person or organization that they are responding to."[176]
1985 Social bias Concept development Scott T. Allison and David Messick first report on a second form of group attribution error. "This form describes people's tendency to assume incorrectly that group decisions reflect group members' attitudes."[177]
1985 Belief, decision-making and behavioral (prospect theory) Concept development The disposition effect anomaly is identified and named by Hersh Shefrin and Meir Statman, who note that "people dislike incurring losses much more than they enjoy making gains, and people are willing to gamble in the domain of losses." Consequently, "investors will hold onto stocks that have lost value...and will be eager to sell stocks that have risen in value." The researchers coin the term "disposition effect" to describe this tendency of holding on to losing stocks too long and to sell off well-performing stocks too readily.[178] "The disposition effect refers to investors’ reluctance to sell assets that have lost value and greater likelihood of selling assets that have made gains."[179]
1985 Belief, decision-making and behavioral (logical fallacy) Concept development The hot-hand fallacy is first described in a paper by Amos Tversky, Thomas Gilovich, and Robert Vallone. "The hot-hand fallacy effect refers to the tendency for people to expect streaks in sports performance to continue."[180]
1986 Memory bias Research McDaniel and Einstein argue that bizarreness intrinsically does not enhance memory in their paper.[181] "The bizarreness effect holds that items associated with bizarre sentences or phrases are more readily recalled than those associated with common sentences or phrases."[182]
1988 Belief, decision-making and behavioral Research In an experiment by Baron, Beattie and Hershey, subjects consider a diagnostic problem involving fictitious diseases.[183] "Information bias is any systematic difference from the truth that arises in the collection, recall, recording and handling of information in a study, including how missing data is dealt with."[184]
1988 Social Concept development The reactive devaluation bias is proposed by Lee Ross and Constance Stillinger.[185] "Reactive Devaluation is tendency to value the proposal of someone we recognized as an antagonist as being less interesting than if it was made by someone else."[186]
1988 Belief, decision-making and behavioral (prospect theory) Research Samuelson and Zeckhauser demonstrate status quo bias using a questionnaire in which subjects faced a series of decision problems, which were alternately framed to be with and without a pre-existing status quo position. Subjects tended to remain with the status quo when such a position was offered to them.[187] "Status quo bias refers to the phenomenon of preferring that one's environment and situation remain as they already are."[188]
1989 Belief, decision-making and behavioral Concept development The term "curse of knowledge" is coined in a Journal of Political Economy article by economists Colin Camerer, George Loewenstein, and Martin Weber. The curse of knowledge causes people to fail to account for the fact that others don't know the same things that they do.[189]
1990 Belief, decision-making and behavioral (prospect theory) Research Kahneman, Knetsch and Thaler publish a paper containing the first experimental test of the Endowment Effect.[190] It refers to an emotional bias that causes individuals to value an owned object higher, often irrationally, than its market value.
1990 Belief, decision-making and behavioral (confirmation bias) Concept development "There is also a well-described phenomenon known as “satisfaction of search”, first described in 1990, in which a radiologist fails to detect a second abnormality, apparently because of prematurely ceasing to search the images after detecting a “satisfying” finding, perhaps one that explains the patient’s clinical symptoms or is “satisfying” to the radiologist in some other way."[191] "Satisfaction of search describes a situation in which the detection of one radiographic abnormality interferes with that of others."[192]
1991 Social (egocentric bias) Concept development The term illusory superiority is first used by the researchers Van Yperen and Buunk.[193] Illusory superiority "indicates an individual who has a belief that they are somehow inherently superior to others".[194]
1991 Social (conformity bias) Research Marín and Marín report courtesy bias to be common in Hispanic cultures.[175] The "Courtesy Bias is the reluctance of an individual to give negative feedback for fear of offending."[195]
1994 Belief, decision-making and behavioral Concept development The term women-are-wonderful effect is coined by researchers Alice Eagly and Antonio Mladinic in a paper in which they question the widely held view that there is prejudice against women.[196] "The women are wonderful effect is a phenomenon found in psychological research in which people associate more positive attributes with women as compared to men."[197]
1995 Concept development The implicit bias is first described in a publication by Tony Greenwald and Mahzarin Banaji.[198] "Research on implicit bias suggests that people can act on the basis of prejudice and stereotypes without intending to do so."[199]
1996 Research Daniel Kahneman and Amos Tversky argue that cognitive biases have practical implications for areas including clinical judgment, entrepreneurship, finance, and management.[200][201]
1998 Belief, decision-making and behavioral Research Gilbert et al. report on the presence of impact bias in registered voters.[202] "Impact bias refers to a human tendency to overestimate emotional responses to events and experiences."[203]
1998 Concept development The implicit-association test is introduced in the scientific literature by Anthony Greenwald, Debbie McGhee, and Jordan Schwartz.[204] The implicit-association test is "a reaction time based categorization task that measures the differential associative strength between bipolar targets and evaluative attribute concepts as an approach to indexing implicit beliefs or biases."[205]
1998 Belief, decision-making and behavioral (extension neglect) Concept development Hsee discovers a less-is-better effect in three contexts: "(1) a person giving a $45 scarf (from scarves ranging from $5-$50) as a gift was perceived to be more generous than one giving a $55 coat (from coats ranging from $50-$500); (2) an overfilled ice cream serving with 7 oz of ice cream was valued more than an underfilled serving with 8 oz of ice cream; (3) a dinnerware set with 24 intact pieces was judged more favourably than one with 31 intact pieces (including the same 24) plus a few broken ones."[206] "The less-is-better effect is the tendency to prefer the smaller or the lesser alternative when choosing individually, but not when evaluating together."[207]
1999 Belief, decision-making and behavioral Concept development The psychological phenomenon of illusory superiority known as Dunning–Kruger effect is identified as a form of cognitive bias in Kruger and Dunning's 1999 study, Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments.[208] "The Dunning-Kruger effect is a cognitive bias in which people wrongly overestimate their knowledge or ability in a specific area."[209]
1999 Memory bias Concept development The term "spotlight effect" is coined by Thomas Gilovich and Kenneth Savitsky.[210] The phenomenon first appears in the world of psychology in the journal Current Directions in Psychological Science. "The spotlight effect refers to the tendency to think that more people notice something about you than they do."[211]
1999 Social (egocentric bias) Concept development Kruger and Gilovich publish a study titled Naive cynicism in everyday theories of responsibility assessment: On biased assumptions of bias, which formally introduces the concept of naïve cynicism.[212] Naïve cynicism is "the tendency of laypeople to expect other people’s judgments will have a motivational basis and therefore will be biased in the direction of their self-interest."[213]
2002 Belief, decision-making and behavioral Concept development Daniel Kahneman and Shane Frederick propose the process of attribute substitution.[214] "Attribute substitution occurs when an individual has to make a judgment (of a target attribute) that is computationally complex, and instead substitutes a more easily calculated heuristic attribute."[215]
2002 Social (egocentric bias) Concept development Pronin et al. introduce the concept of "bias blind spot".[216] Bias blind spot "refers to the tendency for people to be able to identify distortionary biases in others, while being ignorant of and susceptible to precisely these biases in their own thinking."[216]
2002 Research Bystander effect. Research indicates that priming a social context may inhibit helping behavior. Imagining being around one other person or being around a group of people can affect a person's willingness to help.[217] "The bystander effect occurs when the presence of others discourages an individual from intervening in an emergency situation."[218]
2003 Belief, decision-making and behavioral Concept development The term projection bias is first introduced in the paper Projection Bias in Predicting Future Utility by Loewenstein, O'Donoghue and Rabin.[219] "It refers to people’s assumption that their tastes or preferences will remain the same over time"[220]
2003 Concept development Lovallo and Kahneman propose an expanded definition of the planning fallacy: the tendency to underestimate the time, costs, and risks of future actions while at the same time overestimating the benefits of those same actions. According to this definition, the planning fallacy results not only in time overruns, but also in cost overruns and benefit shortfalls.[221]
2004 Social bias Literature American journalist James Surowiecki publishes The Wisdom of Crowds, which explores herd mentality and draws the conclusion that the decisions made by groups are often better and more accurate than those made by any individual member.[222] "Herd mentality (also known as mob mentality) describes a behavior in which people act the same way or adopt similar behaviors as the people around them — often ignoring their own feelings in the process."[222]
2004 Belief, decision-making and behavioral (framing effect) Concept development The concept of the distinction bias is advanced by Christopher K. Hsee and Jiao Zhang of the University of Chicago as an explanation for differences in evaluations of options between joint evaluation mode and separate evaluation mode.[223] Distinction bias is "an explanation for why people evaluate objects differently when evaluating them jointly, as opposed to separately."[224]
2006 Organization Overcoming Bias launches as a group blog on the "general theme of how to move our beliefs closer to reality, in the face of our natural biases such as overconfidence and wishful thinking, and our bias to believe we have corrected for such biases, when we have done no such thing."[225]
2006 Belief, decision-making and behavioral Concept development The term ostrich effect is coined by Galai and Sade.[226] "The ostrich effect bias is a tendency to ignore dangerous or negative information by ignoring it or burying one's head in the sand."[227]
2007 Belief, decision-making and behavioral Concept development The term recency illusion is coined by Stanford University linguist Arnold Zwicky.[228] The recency illusion is "the belief or impression that a word or language usage is of recent origin when it is long-established."[228]
2007 Social (conformity bias) Concept development The concept of an “availability cascade” is defined by professors Timur Kuran and Cass Sunstein.[222] Availability cascade refers to the "self-reinforcing process of collective belief formation by which an expressed perception triggers a chain reaction that gives the perception of increasing plausibility through its rising availability in public discourse."[222]
2008 Social bias (association fallacy) Concept development Cheerleader effect. "The phrase was coined by the character Barney Stinson in "Not a Father's Day", an episode of the television series How I Met Your Mother, first aired in November 2008. Barney points out to his friends a group of women that initially seem attractive, but who all seem to be very ugly when examined individually. This point is made again by Ted and Robin later in the episode, who note that some of Barney's friends also only seem attractive in a group."[229] "The cheerleader effect refers to the increase in attractiveness that an individual face experiences when seen in a group of other faces."[230]
2009 Belief, decision-making and behavioral (framing effect) Concept development The concept of denomination effect is proposed by Priya Raghubir, professor at the New York University Stern School of Business, and Joydeep Srivastava, professor at University of Maryland, in their paper.[231] Denomination effect relates "to currency, whereby people are less likely to spend larger bills than their equivalent value in smaller bills."[232]
2010 Belief, decision-making and behavioral (confirmation bias) Concept development The phrase backfire effect is first coined by American political scientist Brendan Nyhan and Jason Reifler.[233] "The backfire effect is a cognitive bias that causes people who encounter evidence that challenges their beliefs to reject that evidence, and to strengthen their support of their original stance."[234]
2010 Belief, decision-making and behavioral (egocentric bias) Research The Handbook of Social Psychology recognizes naïve realism as one of "four hard-won insights about human perception, thinking, motivation and behavior that... represent important, indeed foundational, contributions of social psychology."[235] "Naïve realism describes people’s tendency to believe that they perceive the social world “as it is”—as objective reality—rather than as a subjective construction and interpretation of reality."[236]
2011 Belief, decision-making and behavioral Concept development The IKEA effect is identified and named by Michael I. Norton of Harvard Business School, Daniel Mochon of Yale, and Dan Ariely of Duke University, who publish the results of three studies in this year.[237] "The Ikea Effect is the cognitive phenomena where customers get more excited and place a higher value in the products they have partially created, modified or personalized."[238]
2011 Memory bias Concept development The Google effect, also known as “digital amnesia”, is first described by Betsy Sparrow from Columbia University and her colleagues. Their paper describes the results of several memory experiments involving technology.[239][240] The Google effect "represents people’s tendency to forget information that they can find online, particularly by using search engines such as Google."[241]
2011 Belief, decision-making and behavioral Notable case The look-elsewhere effect, more generally known in statistics as the problem of multiple comparisons, gains some media attention in the context of the search for the Higgs boson at the Large Hadron Collider.[242] The look-elsewhere effect "occurs when a statistically significant observation is found but, actually, arose by chance and due to the size of the parameter space and sample observed."[243]
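The statistical problem behind the look-elsewhere effect can be sketched with a short simulation (illustrative only, not drawn from the cited sources): when many independent tests are run on pure noise, roughly a fraction alpha of them will cross the significance threshold by chance, so scanning a large parameter space almost guarantees some "significant" finding.

```python
import random

def false_positive_rate(num_tests=1000, n=30, z_crit=1.96, seed=42):
    """Run many significance tests on pure noise and count chance 'discoveries'.

    Each test draws n standard-normal samples (true mean 0) and flags the
    result as 'significant' when |z| > z_crit (two-sided alpha of about 0.05).
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(num_tests):
        samples = [rng.gauss(0, 1) for _ in range(n)]
        mean = sum(samples) / n
        z = mean * n ** 0.5  # known sigma = 1, so z = mean / (1 / sqrt(n))
        if abs(z) > z_crit:
            hits += 1
    return hits / num_tests

# No real effect exists anywhere, yet about 5% of tests look significant.
print(false_positive_rate())
```

This is why corrections for multiple comparisons (or a much stricter threshold, such as the five-sigma convention in particle physics) are applied before a chance fluctuation is claimed as a discovery.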
2012 Belief, decision-making and behavioral (logical fallacy) Research An article in Psychological Bulletin suggests that the subadditivity effect can be explained by an information-theoretic generative mechanism that assumes a noisy conversion of objective evidence (observation) into subjective estimates (judgment).[244] The subadditivity effect is "the tendency to judge probability of the whole to be less than the probabilities of the parts".[245]
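The definition can be made concrete with invented numbers (hypothetical, not from the cited study): a judge rates the probability of a whole category below the summed ratings of its own subcategories.

```python
# Hypothetical judged probabilities illustrating subadditivity.
whole = 0.60  # judged P("death from a natural cause")
parts = {"heart disease": 0.25, "cancer": 0.30, "other natural causes": 0.25}

total_of_parts = round(sum(parts.values()), 2)
print(total_of_parts)          # → 0.8
print(total_of_parts > whole)  # → True: the parts outweigh the whole
```

A coherent probability judge would give the whole at least the sum of its disjoint parts; the subadditivity effect is the systematic violation of that constraint.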
2013 Belief, decision-making and behavioral Concept development The term “end-of-history illusion” originates in a journal article by psychologists Jordi Quoidbach, Daniel Gilbert, and Timothy Wilson detailing their research on the phenomenon; the phrase echoes the title of Francis Fukuyama's 1992 book The End of History and the Last Man.[246] The end-of-history illusion occurs "when people tend to underestimate how much they will change in the future."[247]

Meta information on the timeline

How the timeline was built

The initial version of the timeline was written by User:Sebastian.

Funding information for this timeline is available.

Feedback and comments

Feedback for the timeline can be provided at the following places:

  • FIXME

What the timeline is still missing

Timeline update strategy

See also

External links

References

  1. "Every Single Cognitive Bias in One Infographic". visualcapitalist.com. Retrieved 5 December 2020. 
  2. "Cognitive biases". trends.google.com. Retrieved 15 January 2021. 
  3. "Google Books Ngram Viewer". books.google.com. Retrieved 28 January 2021. 
  4. "Cognitive biases". wikipediaviews.org. Retrieved 19 January 2021. 
  5. Sextus Empiricus, "Outlines of Pyrrhonism", Book 1, Chapter 13, Section 32
  6. "Just-World Hypothesis". alleydog.com. Retrieved 7 May 2020. 
  7. Carlisle, Rodney (2004). Scientific American Inventions and Discoveries, John Wiley & Songs, Inc., New Jersey. p. 393.
  8. "What Are Clinical Trials and Studies?". National Institute on Aging. Retrieved 28 January 2021. 
  9. Chambers's Cyclopædia, Supplement, 1753 
  10. Oxford English Dictionary, 1st ed. "anthropomorphism, n." Oxford University Press (Oxford), 1885.
  11. "Anthropomorphism". britannica.com. Retrieved 7 May 2020. 
  12. Miller, Laura (2015-06-14). "Culture is dead — again". Salon. Retrieved 17 April 2018. 
  13. J.G.A. Pocock, "Between Machiavelli and Hume: Gibbon as Civic Humanist and Philosophical Historian," Daedalus 105:3 (1976), 153–169; and in Further reading: Pocock, EEG, 303–304; FDF, 304–306.
  14. "Why we feel the past is better compare to what the future holds". thedecisionlab.com. Retrieved 7 May 2020. 
  15. Barron, Greg; Leider, Stephen (13 October 2009). "The role of experience in the Gambler's Fallacy" (PDF). Journal of Behavioral Decision Making. 
  16. "The Gambler's Fallacy - Explained". thecalculatorsite.com. Retrieved 7 May 2020. 
  17. Mortell, Manfred; Balkhy, Hanan H.; Tannous, Elias B.; Jong, Mei Thiee (July 2013). "Physician 'defiance' towards hand hygiene compliance: Is there a theory–practice–ethics gap?". Journal of the Saudi Heart Association. 25 (3): 203–208. PMC 3809478. PMID 24174860. doi:10.1016/j.jsha.2013.04.003. 
  18. "Semmelweis Reflex (Semmelweis Effect)". alleydog.com. Retrieved 7 May 2020. 
  19. "Bandwagon Effect". Retrieved 2007-03-09. 
  20. "The Bandwagon Effect". psychologytoday.com. Retrieved 7 May 2020. 
  21. "Stereotypes Defined". stereotypeliberia.wordpress.com. Retrieved 10 April 2020. 
  22. Fechner, Gustav Theodor (1966) [First published 1860]. Howes, D H; Boring, E G, eds. Elements of psychophysics [Elemente der Psychophysik]. volume 1. Translated by Adler, H E. United States of America: Holt, Rinehart and Winston. 
  23. "Weber's law". britannica.com. Retrieved 7 May 2020. 
  24. [1] Sibbald, M.D. "Report on the Progress of Psychological Medicine; German Psychological Literature", The Journal of Mental Science, Volume 13. 1867. p. 238
  25. "pareidolia". merriam-webster.com. Retrieved 7 May 2020. 
  26. Brian Righi. (2008). Chapter 4: Talking Boards and Ghostly Goo. In Ghosts, Apparitions and Poltergeists. Llewellyn Publications. "An early example of this occurred in 1874 with the medium William Stanton Moses, who communicated with the spirits of two brothers who had recently died in India. Upon investigation, it was discovered that one week prior to the séance, their obituary had appeared in the newspaper. This was of some importance because Moses's communications with the two spirits contained nothing that wasn't already printed in the newspaper. When the spirits were pressed for further information, they were unable to provide any. Researchers concluded that Moses had seen the obituary, forgotten it, and then resurfaced the memory during the séance."
  27. Robert Todd Carroll. (2014). "Cryptomnesia". The Skeptic's Dictionary. Retrieved 2014-07-12.
  28. "cryptomnesia". dictionary.apa.org. Retrieved 7 May 2020. 
  29. "Mere Exposure Effect" (PDF). wiwi.europa-uni.de. Retrieved 10 April 2020. 
  30. "6 Conversion Principles You Can Learn From The Mere-Exposure Effect". marketingland.com. Retrieved 7 May 2020. 
  31. Anonymous (E. Robert Kelly, 1882) The Alternative: A Study in Psychology. London: Macmillan and Co. p. 168.
  32. Andersen H, Grush R (2009). "A brief history of time-consciousness: historical precursors to James and Husserl" (PDF). Journal of the History of Philosophy. 47 (2): 277–307. doi:10.1353/hph.0.0118. 
  33. James W (1893). The principles of psychology. New York: H. Holt and Company. p. 609. 
  34. Vlach, Haley A.; Sandhofer, Catherine M. "Distributing Learning Over Time: The Spacing Effect in Children's Acquisition and Generalization of Science Concepts". PMC 3399982. PMID 22616822. doi:10.1111/j.1467-8624.2012.01781.x. 
  35. James, W. (1890). Principles of Psychology. Retrieved from http://psychclassics.yorku.ca/James/Principles/
  36. Brown, Roger; McNeill, David. "The "tip of the tongue" phenomenon". doi:10.1016/S0022-5371(66)80040-3. 
  37. Bauer, P (2004). "Oh where, oh where have those early memories gone? A developmental perspective on childhood amnesia". Psychological Science Agenda. 18 (12). 
  38. "Childhood Amnesia". sciencedirect.com. Retrieved 7 May 2020. 
  39. "bandwagon effect". merriam-webster.com. Retrieved 7 April 2020. 
  40. "Bandwagon Effect - Biases & Heuristics". The Decision Lab. Retrieved 26 January 2021. 
  41. Sumner, William Graham. (1906). Folkways: A Study of the Social Importance of Usages, Manners, Customs, Mores, and Morals. Boston, MA: Ginn.
  42. Everett, Jim A. C.; Faber, Nadira S.; Crockett, Molly. "Preferences and beliefs in ingroup favoritism". PMC 4327620. PMID 25762906. doi:10.3389/fnbeh.2015.00015. 
  43. Abbott, Edwina (1909). "On the analysis of the factors of recall in the learning process". Psychological Monographs: General and Applied. 11 (1): 159–177. doi:10.1037/h0093018 – via Ovid. 
  44. Larsen, Douglas P.; Butler, Andrew C. (2013). Walsh, K., ed. Test-enhanced learning. In Oxford Textbook of Medical Education. pp. 443–452. 
  45. Goldstein, E. Bruce. Cognitive Psychology: Connecting Mind, Research and Everyday Experience. Cengage Learning. ISBN 978-1-133-00912-2. 
  46. "Why we gamble like monkeys". BBC.com. 2015-01-02. 
  47. "Gambler's Fallacy". investopedia.com. Retrieved 7 May 2020. 
  48. Feingold, CA (1914). "The influence of environment on identification of persons and things". Journal of Criminal Law and Police Science. 5 (1): 39–51. JSTOR 1133283. doi:10.2307/1133283. 
  49. Laub, Cindy E.; Meissner, Christian A.; Susa, Kyle J. "The Cross-Race Effect: Resistant to Instructions". doi:10.1155/2013/745836. 
  50. The Advanced Dictionary of Marketing, Scott G. Dacko, 2008: Marketing. Oxford: Oxford University Press. 2008-06-18. p. 248. ISBN 9780199286003. 
  51. Thorndike 1920
  52. Sigall, Harold; Ostrove, Nancy (1975-03-01). "Beautiful but Dangerous: Effects of Offender Attractiveness and Nature of the Crime on Juridic Judgment". Journal of Personality and Social Psychology. 31 (3): 410–414. doi:10.1037/h0076472. 
  53. "This Cognitive Bias Explains Why Pretty People Make 12% More Money Than Everybody Else". businessinsider.com. Retrieved 6 April 2020. 
  54. "What Is the Halo Effect?". psychologytoday.com. Retrieved 6 April 2020. 
  55. "Halo effect". britannica.com. Retrieved 7 May 2020. 
  56. "Definition of STEREOTYPE". www.merriam-webster.com. Retrieved 28 January 2021. 
  57. Zeigarnik 1927: "Das Behalten erledigter und unerledigter Handlungen". Psychologische Forschung 9, 1-85.
  58. Zeigarnik 1927: "Das Behalten erledigter und unerledigter Handlungen". Psychologische Forschung 9, 1-85.
  59. "Zeigarnik Effect". goodtherapy.org. Retrieved 7 May 2020. 
  60. Fisher, Irving (1928), The Money Illusion, New York: Adelphi Company 
  61. Liberto, Daniel. "Money Illusion Definition". Investopedia. Retrieved 26 January 2021. 
  62. "The Specious Present: Andrew Beck, David Claerbout, Colin McCahon, Keith Tyson - Announcements - Art & Education". www.artandeducation.net. Retrieved 27 January 2021. 
  63. Fleming, G. W. T. H. (January 1933). "The Learning and Retention of Pleasant and Unpleasant Activities. (Arch. of Psychol., No. 134, 1932.) Cason, H.". Journal of Mental Science. 79 (324): 187–188. ISSN 0368-315X. doi:10.1192/bjp.79.324.187-c. 
  64. Skowronski, John J.; Walker, W. Richard; Henderson, Dawn X.; Bond, Gary D. "Chapter Three - The Fading Affect Bias: Its History, Its Implications, and Its Future". doi:10.1016/B978-0-12-800052-6.00003-2. 
  65. von Restorff, Hedwig (1933). "Über die Wirkung von Bereichsbildungen im Spurenfeld" [The effects of field formation in the trace field]. Psychologische Forschung [Psychological Research] (in Deutsch). 18 (1): 299–342. doi:10.1007/BF02409636. 
  66. "The Von Restorff effect". lawsofux.com. Retrieved 7 May 2020. 
  67. Duncker, K. (1945). "On problem solving". Psychological Monographs, 58:5 (Whole No. 270).
  68. "Functional fixedness". britannica.com. Retrieved 7 May 2020. 
  69. Batsidis, Apostolos; Tzavelas, George; Alexopoulos, Panagiotis. "Berkson's paradox and weighted distributions: An application to Alzheimer's disease". 
  70. "Berkson's Paradox (Berkson's Bias)". alleydog.com. Retrieved 14 August 2020. 
  71. Johnson, J. (2011). The arithmetic of compassion: rethinking the politics of photography. British Journal of Political Science, 41(3), 621-643. doi: 10.1017/S0007123410000487.
  72. "Compassion fade". econowmics.com. Retrieved 15 January 2021. 
  73. Whyte, W. H., Jr. (March 1952). "Groupthink". Fortune. pp. 114–117, 142, 146. 
  74. Safire, W. (August 8, 2004). "Groupthink". The New York Times. Retrieved February 2, 2012. If the committee's other conclusions are as outdated as its etymology, we're all in trouble. 'Groupthink' (one word, no hyphen) was the title of an article in Fortune magazine in March 1952 by William H. Whyte Jr. ... Whyte derided the notion he argued was held by a trained elite of Washington's 'social engineers.' 
  75. "The Psychology Behind Why We Strive for Consensus". Verywell Mind. 
  76. Festinger L (1954). "A theory of social comparison processes". Human Relations. 7 (2): 117–140. doi:10.1177/001872675400700202. 
  77. "Social Comparison Theory". psychologytoday.com. Retrieved 7 May 2020. 
  78. Meehl, Paul E. (1956). "Wanted – A Good Cookbook". American Psychologist. 11 (6): 263–272. doi:10.1037/h0044164. 
  79. Dutton, D. L. (1988). "The cold reading technique". Experientia. 44 (4): 326–332. PMID 3360083. doi:10.1007/BF01961271. 
  80. "Barnum Effect". britannica.com. Retrieved 7 May 2020. 
  81. Parkinson, C. Northcote (1958). Parkinson's Law, or the Pursuit of Progress. John Murray. ISBN 0140091076. 
  82. "How to Handle Bikeshedding: Parkinson's Law of Triviality". projectbliss.net. Retrieved 7 May 2020. 
  83. "The Curious Case of Confirmation Bias". psychologytoday.com. Retrieved 7 April 2020. 
  84. Acks, Alex. The Bubble of Confirmation Bias. 
  85. Myers, David G. Psychology. 
  86. "Confirmation Bias". simplypsychology.org. Retrieved 14 August 2020. 
  87. "The Curious Case of Confirmation Bias". psychologytoday.com. Retrieved 14 August 2020. 
  88. "Cognitive Bias in Decision Making". associationanalytics.com. Retrieved 7 May 2020. 
  89. Ellis RM (2015). Middle Way Philosophy: Omnibus Edition. Lulu Press. ISBN 9781326351892. 
  90. "Authority Bias". alleydog.com. Retrieved 14 August 2020. 
  91. Borcherding, Katrin; Laričev, Oleg Ivanovič; Messick, David M. (1990). Contemporary Issues in Decision Making. North-Holland. p. 50. ISBN 978-0-444-88618-7. 
  92. "Why we prefer options that are known to us". thedecisionlab.com. Retrieved 14 August 2020. 
  93. Rubin, David C.; Baddeley, Alan D. (1989). "Telescoping is not time compression: A model". Memory & Cognition. 17 (6): 653–661. PMID 2811662. doi:10.3758/BF03202626. 
  94. "Telescoping effect - Biases & Heuristics". The Decision Lab. Retrieved 26 January 2021. 
  95. Abraham Kaplan (1964). The Conduct of Inquiry: Methodology for Behavioral Science. San Francisco: Chandler Publishing Co. p. 28. ISBN 9781412836296. 
  96. "Law of the instrument - Biases & Heuristics". The Decision Lab. Retrieved 27 January 2021. 
  97. Tulving, Endel; Pearlstone, Zena (1966). "Availability versus accessibility of information in memory for words". Journal of Verbal Learning and Verbal Behavior. 5 (4): 381–391. doi:10.1016/S0022-5371(66)80048-8. 
  98. Walster, Elaine (1966). "Assignment of responsibility for an accident.". Journal of Personality and Social Psychology. 3 (1): 73–79. doi:10.1037/h0022733. 
  99. Adams, John (1985). Risk and Freedom: Record of Road Safety Regulation. Brefi Press. ISBN 9780948537059. 
  100. Flock, Elizabeth (2012-02-17). "Dagen H: The day Sweden switched sides of the road". Washington Post. On the day of the change, only 150 minor accidents were reported. Traffic accidents over the next few months went down. ... By 1969, however, accidents were back at normal levels 
  101. "On September 4 there were 125 reported traffic accidents as opposed to 130-196 from the previous Mondays. No traffic fatalities were linked to the switch. In fact, fatalities dropped for two years, possibly because drivers were more vigilant after the switch." Sweden finally began driving on the right side of the road in 1967 The Examiner Sept 2, 2009
  102. Mok, D; Gore, G; Hagel, B; Mok, E; Magdalinos, H; Pless, B. "Risk compensation in children's activities: A pilot study". PMC 2721187. PMID 19657519. doi:10.1093/pch/9.5.327. 
  103. Chapman, L (1967). "Illusory correlation in observational report". Journal of Verbal Learning and Verbal Behavior. 6 (1): 151–155. doi:10.1016/S0022-5371(67)80066-5. 
  104. Chapman, L.J (1967). "Illusory correlation in observational report". Journal of Verbal Learning. 6: 151–155. doi:10.1016/s0022-5371(67)80066-5. 
  105. "Illusory Correlation". psychology.iresearchnet.com. Retrieved 17 July 2020. 
  106. Jones, E. E.; Harris, V. A. (1967). "The attribution of attitudes". Journal of Experimental Social Psychology. 3 (1): 1–24. doi:10.1016/0022-1031(67)90034-0. 
  107. Ross, L. (1977). "The intuitive psychologist and his shortcomings: Distortions in the attribution process". In Berkowitz, L. Advances in experimental social psychology. 10. New York: Academic Press. pp. 173–220. ISBN 978-0-12-015210-0. 
  108. "Fundamental Attribution Error". simplypsychology.org. Retrieved 7 May 2020. 
  109. Edwards, Ward. "Conservatism in Human Information Processing (excerpted)". In Daniel Kahneman, Paul Slovic and Amos Tversky. (1982). Judgment under uncertainty: Heuristics and biases. New York: Cambridge University Press. Original work published 1968.
  110. "Conservatism Bias". dwassetmgmt.com. Retrieved 8 May 2020. 
  111. "Statistics How To". statisticshowto.com. Retrieved 7 April 2020. 
  112. "Pygmalion Effect". alleydog.com. Retrieved 7 May 2020. 
  113. "To Become Super-Likable, Practice "The Ben Franklin Effect"". medium.com. Retrieved 13 March 2020. 
  114. "Ben Franklin Effect". alleydog.com. Retrieved 7 May 2020. 
  115. "The suffix effect: How many positions are involved?" (PDF). link.springer.com. Retrieved 5 May 2020. 
  116. "Two-component theory of the suffix effect: Contrary evidence". link.springer.com. Retrieved 16 July 2020. 
  117. Malle, BF. "The actor-observer asymmetry in attribution: a (surprising) meta-analysis.". PMID 17073526. doi:10.1037/0033-2909.132.6.895. 
  118. "The actor-observer asymmetry in attribution: A (surprising) meta-analysis.". psycnet.apa.org. Retrieved 7 May 2020. 
  119. "Cognitive Bias: How Your Mind Plays Tricks on You and How to Overcome That at Work". zapier.com. Retrieved 15 January 2021. 
  120. "Cognitive Bias". sciencedirect.com. Retrieved 16 January 2021. 
  121. Fischhoff, B (2007). "An early history of hindsight research". Social Cognition. 25: 10–13. doi:10.1521/soco.2007.25.1.10. 
  122. "Hindsight bias". Encyclopedia Britannica. Retrieved 27 January 2021. 
  123. "Why are we overconfident in our predictions?". thedecisionlab.com. Retrieved 10 April 2020. 
  124. "Illusion Of Validity". alleydog.com. Retrieved 7 May 2020. 
  125. Brenner, Malcolm (1973). "The next-in-line effect" (PDF). Journal of Verbal Learning and Verbal Behavior. 12 (3): 320–323. doi:10.1016/s0022-5371(73)80076-3. 
  126. "Memory Flashcards". Quizlet. Retrieved 27 January 2021. 
  127. Loftus, Elizabeth F.; Palmer, John C. (1974). "Reconstruction of automobile destruction: An example of the interaction between language and memory". Journal of Verbal Learning and Verbal Behavior. 13 (5): 585–589. doi:10.1016/s0022-5371(74)80011-3. 
  128. "False memory". scholarpedia.org. Retrieved 14 August 2020. 
  129. Ralph, Kelcie; Delbosc, Alexa. "I'm multimodal, aren't you? How ego-centric anchoring biases experts' perceptions of travel patterns". doi:10.1016/j.tra.2017.04.027. 
  130. "Anchoring Bias - Definition, Overview and Examples". Corporate Finance Institute. Retrieved 27 January 2021. 
  131. Larson, James; Rutger U; Douglass Coll (1977). "Evidence for a self-serving bias in the attribution of causality". Journal of Personality. 45 (3): 430–441. doi:10.1111/j.1467-6494.1977.tb00162.x. 
  132. "What Is a Self-Serving Bias and What Are Some Examples of It?". healthline.com. Retrieved 7 May 2020. 
  133. Staw, Barry M. (1976). "Knee-deep in the big muddy: a study of escalating commitment to a chosen course of action". Organizational Behavior and Human Performance. 16 (1): 27–44. doi:10.1016/0030-5073(76)90005-2. 
  134. "Escalation of Commitment: Definition, Causes & Examples". bizfluent.com. Retrieved 7 May 2020. 
  135. Duncan, B. L. (1976). "Differential social perception and attribution of intergroup violence: Testing the lower limits of stereotyping of Blacks". Journal of Personality and Social Psychology. 34 (4): 75–93. doi:10.1037/0022-3514.34.4.590. 
  136. "APA Dictionary of Psychology". dictionary.apa.org. Retrieved 7 May 2020. 
  137. Brown, R., Kulik J. (1977). "Flashbulb memories". Cognition. 5: 73–99. doi:10.1016/0010-0277(77)90018-X. 
  138. "Misattribution Effect". sites.google.com. Retrieved 7 May 2020. 
  139. "False Consensus Effect". psychology.iresearchnet.com. Retrieved 14 January 2021. 
  140. Hasher, Lynn; Goldstein, David; Toppino, Thomas (1977). "Frequency and the conference of referential validity" (PDF). Journal of Verbal Learning and Verbal Behavior. 16 (1): 107–112. doi:10.1016/S0022-5371(77)80012-1. 
  141. Newman, Eryn J.; Sanson, Mevagh; Miller, Emily K.; Quigley-Mcbride, Adele; Foster, Jeffrey L.; Bernstein, Daniel M.; Garry, Maryanne (September 6, 2014). "People with Easier to Pronounce Names Promote Truthiness of Claims". PLOS ONE. 9 (2): e88671. PMC 3935838. PMID 24586368. doi:10.1371/journal.pone.0088671. 
  142. "Illusory Truth, Lies, and Political Propaganda: Part 1". psychologytoday.com. Retrieved 7 May 2020. 
  143. Ross, Lee; Greene, David; House, Pamela (1977). "The "false consensus effect": An egocentric bias in social perception and attribution processes". Journal of Experimental Social Psychology. 13 (3): 279–301. doi:10.1016/0022-1031(77)90049-x. 
  144. Alicke, Mark; Largo, Edward. "The Role of Self in the False Consensus Effect". doi:10.1006/jesp.1995.1002. 
  145. "Self-Reference Effect". psychology.iresearchnet.com. Retrieved 12 January 2021. 
  146. Bentley, Sarah V.; Greenaway, Katharine H.; Haslam, S. Alexander. "An online paradigm for exploring the self-reference effect". doi:10.1371/journal.pone.0176611. 
  147. Zaragoza, Maria S.; Belli, Robert F.; Payment, Kristie E. "Misinformation Effects and the Suggestibility of Eyewitness Memory". 
  148. "What Is Misinformation Effect?". growthramp.io. Retrieved 7 May 2020. 
  149. "PART TWO: THE BACKFIRE EFFECT AND HOW TO CHANGE MINDS". instituteforpr.org. Retrieved 14 August 2020. 
  150. Rudy Hiller, Fernando. "How to (dis)solve Nagel's paradox about moral luck and responsibility". doi:10.1590/0100-6045.2016.V39N1.FRH. 
  151. "Moral Luck". philpapers.org. Retrieved 7 May 2020. 
  152. Pettigrew, T. F. (1979). "The ultimate attribution error: Extending Allport's cognitive analysis of prejudice". Personality and Social Psychology Bulletin. 5 (4): 461–476. doi:10.1177/014616727900500407. 
  153. Fraser Pettigrew, Thomas. "The Ultimate Attribution Error: Extending Allport's Cognitive Analysis of Prejudice". doi:10.1177/014616727900500407. 
  154. "Loss aversion". behavioraleconomics.com. Retrieved 14 August 2020. 
  155. "Why is the pain of losing felt twice as powerfully compared to equivalent gains?". thedecisionlab.com. Retrieved 14 August 2020. 
  156. Pezzo, Mark V.; Litman, Jordan A.; Pezzo, Stephanie P. (2006). "On the distinction between yuppies and hippies: Individual differences in prediction biases for planning future tasks". Personality and Individual Differences. 41 (7): 1359–1371. ISSN 0191-8869. doi:10.1016/j.paid.2006.03.029. 
  157. Kahneman, Daniel; Tversky, Amos (1977). "Intuitive prediction: Biases and corrective procedures" (PDF).  Decision Research Technical Report PTR-1042-77-6. In Kahneman, Daniel; Tversky, Amos (1982). "Intuitive prediction: Biases and corrective procedures". In Kahneman, Daniel; Slovic, Paul; Tversky, Amos. Judgment Under Uncertainty: Heuristics and Biases. Science. 185. pp. 414–421. ISBN 978-0511809477. PMID 17835457. doi:10.1017/CBO9780511809477.031. 
  158. Buehler, Roger; Griffin, Dale; Peetz, Johanna. "Chapter One - The Planning Fallacy: Cognitive, Motivational, and Social Origins". doi:10.1016/S0065-2601(10)43001-4. 
  159. Goleman, Daniel (1984-06-12). "A bias puts self at center of everything". The New York Times. Retrieved 2016-12-09. 
  160. "The Egocentric Bias: Why It's Hard to See Things from a Different Perspective". effectiviology.com. Retrieved 16 July 2020. 
  161. Hamill, Ruth; Wilson, Timothy D.; Nisbett, Richard E. (1980). "Insensitivity to sample bias: Generalizing from atypical cases" (PDF). Journal of Personality and Social Psychology. 39 (4): 578–589. doi:10.1037/0022-3514.39.4.578. 
  162. "group attribution error". dictionary.apa.org. Retrieved 14 August 2020. 
  163. Frazier, Kendrick (1986). Science Confronts the Paranormal. Prometheus Books. p. 101. 
  164. "Subjective Validation". alleydog.com. Retrieved 14 August 2020. 
  165. "Understanding the Optimism Bias". verywellmind.com. Retrieved 15 January 2021. 
  166. "Optimism Bias - Biases & Heuristics". The Decision Lab. Retrieved 28 January 2021. 
  167. "Why do our decisions depend on how options are presented to us?". thedecisionlab.com. Retrieved 16 January 2021. 
  168. Tversky, A; Kahneman, D (30 January 1981). "The framing of decisions and the psychology of choice". Science. 211 (4481): 453–458. doi:10.1126/SCIENCE.7455683. 
  169. "Pseudocertainty effect". wiwi.europa-uni.de. Retrieved 14 August 2020. 
  170. Kammer, D. (1982). "Differences in trait ascriptions to self and friend: Unconfounding intensity from variability". Psychological Reports. 51 (1): 99–102. doi:10.2466/pr0.1982.51.1.99. 
  171. "Trait Ascription Bias". alleydog.com. Retrieved 14 August 2020. 
  172. "Decoy Effect definition". tactics.convertize.com. Retrieved 14 January 2021. 
  173. Mortimer, Gary. "The decoy effect: how you are influenced to choose without really knowing it". The Conversation. Retrieved 29 January 2021. 
  174. "Third-Person Effect". alleydog.com. Retrieved 7 May 2020. 
  175. Hakim, Catherine. Models of the Family in Modern Societies: Ideals and Realities. 
  176. "Courtesy Bias". alleydog.com. Retrieved 14 August 2020. 
  177. "Group attribution error". Wikipedia. 26 October 2020. Retrieved 27 January 2021. 
  178. "Disposition Effect". Behavioural Finance. Retrieved 11 January 2017. 
  179. "Disposition effect". behavioraleconomics.com. Retrieved 16 July 2020. 
  180. "Hot Hand Effect". psychology.iresearchnet.com. Retrieved 16 July 2020. 
  181. Iaccino, J. F.; Sowa, S. J. (February 1989). "Bizarre imagery in paired-associate learning: an effective mnemonic aid with mixed context, delayed testing, and self-paced conditions". Percept mot Skills. 68 (1): 307–16. PMID 2928063. doi:10.2466/pms.1989.68.1.307. 
  182. "Bizarreness effect". britannica.com. Retrieved 16 July 2020. 
  183. Baron, Jonathan (2006). "Information bias and the value of information". Thinking and Deciding (4th ed.). Cambridge University Press. p. 177. ISBN 978-0-521-68043-1. 
  184. "Information Bias". catalogofbias.org. Retrieved 22 September 2020. 
  185. Lee Ross, Constance A. Stillinger, "Psychological barriers to conflict resolution", Stanford Center on Conflict and Negotiation, Stanford University, 1988, p. 4
  186. "Why we often tend to devalue proposals made by people who we consider to be adversaries". thedecisionlab.com. Retrieved 22 September 2020. 
  187. Samuelson, W.; Zeckhauser, R. (1988). "Status quo bias in decision making". Journal of Risk and Uncertainty. 1: 7–59. doi:10.1007/bf00055564. 
  188. "Status Quo Bias: What It Means and How It Affects Your Behavior". thoughtco.com. Retrieved 22 September 2020. 
  189. "The Curse of Knowledge: What It Is and How to Account for It". effectiviology.com. Retrieved 6 May 2020. 
  190. Atladóttir, Kristín. "The Endowment Effect and other biases in creative goods transactions" (PDF). ISSN 1670-8288. 
  191. Bruno, Michael A. "256 Shades of gray: uncertainty and diagnostic error in radiology". doi:10.1515/dx-2017-0006. 
  192. Ashman, C. J.; Yu, J. S.; Wolfman, D. (August 2000). "Satisfaction of search in osteoradiology". AJR. American journal of roentgenology. 175 (2): 541–544. ISSN 0361-803X. doi:10.2214/ajr.175.2.1750541. Retrieved 27 January 2021. 
  193. "Self-Enhancement and Superiority Biases in Social Comparison". researchgate.net. Retrieved 14 August 2020. 
  194. "Illusory Superiority". alleydog.com. Retrieved 7 May 2020. 
  195. "The Courtesy Bias". smallbusinessforum.co. Retrieved 14 August 2020. 
  196. ""Women Are Wonderful" Effect". scribd.com. Retrieved 10 April 2020. 
  197. ""women are wonderful" effect". crazyfacts.com. Retrieved 18 July 2020. 
  198. "PROJECT IMPLICIT LECTURES AND WORKSHOPS". projectimplicit.net. Retrieved 12 March 2020. 
  199. "Implicit Bias". plato.stanford.edu. Retrieved 8 May 2020. 
  200. Kahneman, D. & Tversky, A. (1996). "On the reality of cognitive illusions" (PDF). Psychological Review. 103 (3): 582–591. CiteSeerX 10.1.1.174.5117. PMID 8759048. doi:10.1037/0033-295X.103.3.582. 
  201. S.X. Zhang; J. Cueto (2015). "The Study of Bias in Entrepreneurship". Entrepreneurship Theory and Practice. 41 (3): 419–454. doi:10.1111/etap.12212. 
  202. Medway, Dominic; Foos, Adrienne; Goatman, Anna. "Impact bias in student evaluations of higher education". Studies in Higher Education. doi:10.1080/03075079.2015.1071345. Retrieved 7 May 2020. 
  203. Medway, Dominic; Foos, Adrienne; Goatman, Anna. "Impact bias in student evaluations of higher education". Studies in Higher Education. doi:10.1080/03075079.2015.1071345. Retrieved 7 May 2020. 
  204. Greenwald, Anthony G.; McGhee, Debbie E.; Schwartz, Jordan L.K. (1998), "Measuring Individual Differences in Implicit Cognition: The Implicit Association Test", Journal of Personality and Social Psychology, 74 (6): 1464–1480, PMID 9654756, doi:10.1037/0022-3514.74.6.1464 
  205. Healy, Graham F.; Boran, Lorraine; Smeaton, Alan F. "Neural Patterns of the Implicit Association Test". PMC 4656831. PMID 26635570. doi:10.3389/fnhum.2015.00605. 
  206. Hsee, Christopher K. (1998). "Less Is Better: When Low-value Options Are Valued More Highly than High-value Options" (PDF). Journal of Behavioral Decision Making. 11 (2): 107–121. doi:10.1002/(SICI)1099-0771(199806)11:2<107::AID-BDM292>3.0.CO;2-Y. 
  207. "Why we prefer the smaller or the lesser alternative". thedecisionlab.com. Retrieved 7 May 2020. 
  208. Kruger, Justin; Dunning, David (1999). "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments". Journal of Personality and Social Psychology. 77 (6): 1121–1134. PMID 10626367. doi:10.1037/0022-3514.77.6.1121. 
  209. "Dunning-Kruger Effect". psychologytoday.com. Retrieved 14 August 2020. 
  210. Gilovich, T.; Medvec, V. H.; Savitsky, K. (2000). "The spotlight effect in social judgment: An egocentric bias in estimates of the salience of one's own actions and appearance" (PDF). Journal of Personality and Social Psychology. 78 (2): 211–222. PMID 10707330. doi:10.1037//0022-3514.78.2.211. 
  211. "The Spotlight Effect". psychologytoday.com. Retrieved 14 August 2020. 
  212. Kruger, Justin; Gilovich, Thomas (1999). "'Naive cynicism' in everyday theories of responsibility assessment: On biased assumptions of bias.". Journal of Personality and Social Psychology. 76 (5): 743–753. doi:10.1037/0022-3514.76.5.743. 
  213. "Naive Cynicism". psychology.iresearchnet.com. Retrieved 16 July 2020. 
  214. Kahneman, Daniel; Frederick, Shane (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Thomas Gilovich; Dale Griffin; Daniel Kahneman. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 49–81. ISBN 978-0-521-79679-8. 
  215. "Attribute substitution- a quick guide". biasandbelief.wordpress.com. Retrieved 7 May 2020. 
  216. Pronin, Emily; Lin, Daniel Y.; Ross, Lee. "The Bias Blind Spot: Perceptions of Bias in Self Versus Others". doi:10.1177/0146167202286008. 
  217. Garcia, S.M.; Weaver, K.; Darley, J.M.; Moskowitz, G.B. (2002). "Crowded minds: the implicit bystander effect". Journal of Personality and Social Psychology. 83 (4): 843–853. PMID 12374439. doi:10.1037/0022-3514.83.4.843. 
  218. "Bystander Effect". psychologytoday.com. Retrieved 7 May 2020. 
  219. Frederick, Shane; Loewenstein, George; O'Donoghue, Ted (2011). "Time Discounting and Time Preference: A Critical Review". In Camerer, Colin F.; Loewenstein, George; Rabin, Matthew. Advances in Behavioral Economics. Princeton University Press. pp. 187–188. ISBN 978-1400829118. 
  220. "Projection bias". behavioraleconomics.com. Retrieved 7 May 2020. 
  221. Lovallo, Dan; Kahneman, Daniel (July 2003). "Delusions of Success: How Optimism Undermines Executives' Decisions". Harvard Business Review. 81 (7): 56–63. PMID 12858711. 
  222. "4 examples of herd mentality (and how to take advantage of it)". iwillteachyoutoberich.com. Retrieved 27 January 2021. 
  223. Hsee, Christopher K.; Zhang, Jiao. "General Evaluability Theory". doi:10.1177/1745691610374586. 
  224. "Why we tend to view two options as more distinctive when evaluating them simultaneously then separately.". thedecisionlab.com. Retrieved 16 July 2020. 
  225. "Overcoming Bias". overcomingbias.com. Retrieved 13 March 2020. 
  226. "The "Ostrich Effect" and the Relationship between the Liquidity and the Yields of Financial Assets". The Journal of Business. doi:10.2139/ssrn.431180. 
  227. "Ostrich Effect". thinkingcollaborative.com. Retrieved 8 May 2020. 
  228. 228.0 228.1 Rickford, John R.; Wasow, Thomas; Zwicky, Arnold (2007). "Intensive and quotative all: something new, something old". American Speech. 82 (1): 3–31. doi:10.1215/00031283-2007-001Freely accessible. 
  229. Hamblin, James (November 4, 2013). "Cheerleader Effect: Why People Are More Beautiful in Groups". The Atlantic. Retrieved December 5, 2015. 
  230. Carragher, Daniel J.; Thomas, Nicole A.; Gwinn, O. Scott; Nicholls, Mike E. R. "Limited evidence of hierarchical encoding in the cheerleader effect". 
  231. "Why We Spend Coins Faster Than Bills". NPR. May 12, 2009. Retrieved 7 April 2020. 
  232. "Denomination effect". nlpnotes.com. Retrieved 7 May 2020. 
  233. "Pdf." (PDF). 
  234. "The Backfire Effect: Why Facts Don't Always Change Minds – Effectiviology". effectiviology.com. Retrieved 27 January 2021. 
  235. Ross, Lee; Lepper, Mark; Ward, Andrew (30 June 2010). "History of Social Psychology: Insights, Challenges, and Contributions to Theory and Application". Handbook of Social Psychology: socpsy001001. doi:10.1002/9780470561119.socpsy001001. 
  236. "Naive Realism". psychology.iresearchnet.com. Retrieved 17 July 2020. 
  237. "Cognitive Biases — The IKEA Effect". medium.com. Retrieved 14 August 2020. 
  238. "What is the Ikea Effect?". bloomreach.com. Retrieved 7 May 2020. 
  239. "Marketers Need To Be Aware Of Cognitive Bias". thecustomer.net. Retrieved 12 March 2020. 
  240. "Study Finds That Memory Works Differently in the Age of Google". Columbia University. July 14, 2011. 
  241. "The Google Effect and Digital Amnesia: How We Use Machines to Remember". effectiviology.com. Retrieved 16 July 2020. 
  242. Tom Chivers (2011-12-13). "An unconfirmed sighting of the elusive Higgs boson". Daily Telegraph. 
  243. "When a statistically significant observation should be overlooked.". thedecisionlab.com. Retrieved 7 May 2020. 
  244. Hilbert, Martin (2012). "Toward a synthesis of cognitive biases: How noisy information processing can bias human decision making" (PDF). Psychological Bulletin. 138 (2): 211–237. PMID 22122235. doi:10.1037/a0025940. 
  245. "Today's term from psychology is Subadditivity Effect.". steemit.com. Retrieved 7 May 2020. 
  246. Quoidbach, Jordi; Gilbert, Daniel T.; Wilson, Timothy D. (2013-01-04). "The End of History Illusion" (PDF). Science. 339 (6115): 96–98. PMID 23288539. doi:10.1126/science.1229294. Young people, middle-aged people, and older people all believed they had changed a lot in the past but would change relatively little in the future. 
  247. "Why You Won't Be the Person You Expect to Be". nytimes.com. Retrieved 7 May 2020.