Tuesday, August 5, 2025

Curse of knowledge

From Wikipedia, the free encyclopedia

The curse of knowledge, also called the curse of expertise or expert's curse, is a cognitive bias that occurs when a person who has specialized knowledge assumes that others share in that knowledge.

For example, in a classroom setting, teachers may have difficulty if they cannot put themselves in the position of the student. A knowledgeable professor might no longer remember the difficulties that a young student encounters when learning a new subject for the first time. This curse of knowledge also explains the danger behind thinking about student learning based on what appears best to faculty members, as opposed to what has been verified with students.

History of concept

The term "curse of knowledge" was coined in a 1989 Journal of Political Economy article by economists Colin Camerer, George Loewenstein, and Martin Weber. The aim of their research was to counter the "conventional assumptions in such (economic) analyses of asymmetric information in that better-informed agents can accurately anticipate the judgement of less-informed agents".

Such research drew from Baruch Fischhoff's work in 1975 surrounding hindsight bias, a cognitive bias that knowing the outcome of a certain event makes it seem more predictable than may actually be true. Research conducted by Fischhoff revealed that participants did not know that their outcome knowledge affected their responses, and, if they did know, they could still not ignore or defeat the effects of the bias. Study participants could not accurately reconstruct their previous, less knowledgeable states of mind, which directly relates to the curse of knowledge. This poor reconstruction was theorized by Fischhoff to be because the participant was "anchored in the hindsightful state of mind created by receipt of knowledge". This receipt of knowledge returns to the idea of the curse proposed by Camerer, Loewenstein, and Weber: a knowledgeable person cannot accurately reconstruct what a person, be it themselves or someone else, without the knowledge would think, or how they would act. In his paper, Fischhoff questions the failure to empathize with ourselves in less knowledgeable states, and notes that how well people manage to reconstruct perceptions of lesser informed others is a crucial question for historians and "all human understanding".

This research led the economists Camerer, Loewenstein, and Weber to focus on the economic implications of the concept and question whether the curse harms the allocation of resources in an economic setting. The idea that better-informed parties may suffer losses in a deal or exchange was seen as something important to bring to the sphere of economic theory. Most theoretical analyses of situations where one party knew less than the other focused on how the lesser-informed party attempted to learn more information to minimize information asymmetry. However, in these analyses, there is an assumption that better-informed parties can optimally exploit their information asymmetry when they, in fact, cannot. People cannot utilize their additional, better information, even when they should in a bargaining situation.

For example, two people are bargaining over dividing money or provisions. One party may know the size of the amount being divided while the other does not. However, to fully exploit their advantage, the informed party should make the same offer regardless of the amount of material to be divided. But informed parties actually offer more when the amount to be divided is larger. Informed parties are unable to ignore their better information, even when they should.
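A minimal numeric sketch of this bargaining logic (the pie sizes and the offer fraction below are assumed purely for illustration, not taken from the original experiments) shows why an offer that varies with the pie size gives the game away:

    # Hypothetical ultimatum-style setup: the pie is either small or large,
    # and only the proposer knows which. All numbers are invented.
    PIES = [10, 20]                  # possible amounts to divide
    OFFER_FRACTION = 0.4             # assumed share offered to the responder

    def cursed_offer(pie):
        """A "cursed" proposer scales the offer with the true pie size."""
        return OFFER_FRACTION * pie

    def uninformative_offer():
        """An offer held constant across pie sizes reveals nothing; here it
        is pegged to the average pie so it stays acceptable in expectation."""
        return OFFER_FRACTION * sum(PIES) / len(PIES)

    for pie in PIES:
        print(f"pie={pie:>2}: cursed offer={cursed_offer(pie):.1f}, "
              f"constant offer={uninformative_offer():.1f}")

    # pie=10: cursed offer=4.0, constant offer=6.0
    # pie=20: cursed offer=8.0, constant offer=6.0
    # The cursed offers differ by pie size, so the responder can infer the
    # amount being divided; the constant offer leaks no information.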

Experimental evidence

A 1990 experiment by Elizabeth Newton, then a graduate student at Stanford University, illustrated the curse of knowledge in the results of a simple task. One group of subjects "tapped" out well-known songs with their fingers, while another group tried to name the melodies. When the "tappers" were asked to predict how many of the "tapped" songs would be recognized by listeners, they consistently overestimated. The curse of knowledge is demonstrated here: the "tappers" were so familiar with what they were tapping that they assumed listeners would easily recognize the tune.

A study by Susan Birch and Paul Bloom involving Yale University undergraduate students used the curse of knowledge concept to explain the idea that the ability of people to reason about another person's actions is compromised by the knowledge of the outcome of an event. The perception the participant had of the plausibility of an event also mediated the extent of the bias. If the event was less plausible, knowledge was not as much of a "curse" as when there was a potential explanation for the way the other person could act. However, a replication study conducted in 2014 found that this finding was not reliably reproducible across seven experiments with large sample sizes, and the true effect size of this phenomenon was less than half of that reported in the original findings. Therefore, it is suggested that "the influence of plausibility on the curse of knowledge in adults appears to be small enough that its impact on real-life perspective-taking may need to be reevaluated."

Other researchers have linked the curse of knowledge bias with false-belief reasoning in both children and adults, as well as theory of mind development difficulties in children.

Related to this finding is the phenomenon experienced by players of charades: the actor may find it frustratingly hard to believe that their teammates keep failing to guess the secret phrase, known only to the actor, conveyed by pantomime.

Implications

In the Camerer, Loewenstein, and Weber article, it is mentioned that the setting closest in structure to the market experiments done would be underwriting, a task in which well-informed experts price goods that are sold to a less-informed public.

Investment bankers value securities, experts taste cheese, store buyers observe jewelry being modeled, and theater owners see movies before they are released. They then sell those goods to a less-informed public. If they suffer from the curse of knowledge, high-quality goods will be overpriced and low-quality goods underpriced relative to optimal, profit-maximizing prices; prices will reflect characteristics (e.g., quality) that are unobservable to uninformed buyers.
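A small worked example of this pricing distortion (the quality values and the buyers' prior are assumed for illustration) makes the over/underpricing concrete:

    # Toy pricing model with invented numbers. Buyers cannot observe quality,
    # so they will pay at most what an *average* good is worth to them; a
    # profit-maximizing seller should price against that expectation, not
    # against the true quality only the seller can see.
    QUALITY_VALUE = {"low": 50.0, "high": 100.0}  # buyer's value if quality were known
    P_HIGH = 0.5                                  # buyers' prior that a good is high quality

    expected_value = P_HIGH * QUALITY_VALUE["high"] + (1 - P_HIGH) * QUALITY_VALUE["low"]

    for quality, value in QUALITY_VALUE.items():
        cursed_price = value            # cursed seller prices the quality they see
        optimal_price = expected_value  # uninformed buyers will only pay this
        print(f"{quality:>4} quality: cursed price={cursed_price:.0f}, "
              f"optimal price={optimal_price:.0f}")

    #  low quality: cursed price=50,  optimal price=75  -> underpriced
    # high quality: cursed price=100, optimal price=75  -> overpriced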

The curse of knowledge has a paradoxical effect in these settings. By making better-informed agents think that their knowledge is shared by others, the curse helps alleviate the inefficiencies that result from information asymmetries (a better informed party having an advantage in a bargaining situation), bringing outcomes closer to complete information. In such settings, the curse on individuals may actually improve social welfare ("you get what you pay for").

Applications

Marketing

Economists Camerer, Loewenstein, and Weber first applied the curse of knowledge phenomenon to economics, in order to explain why and how the assumption that better-informed agents can accurately anticipate the judgments of lesser-informed agents is not inherently true. They also sought to support the finding that sales agents who are better informed about their products may, in fact, be at a disadvantage against other, less-informed agents when selling their products. The reason is said to be that better-informed agents fail to ignore the privileged knowledge that they possess and are thus "cursed" and unable to sell their products at a value that more naïve agents would deem acceptable.

Education

It has also been suggested that the curse of knowledge could contribute to the difficulty of teaching. Because of the curse, it may be ineffective, or even harmful, to reason about how students view and learn material from the teacher's own perspective rather than from what has been verified with students. The teacher already has the knowledge they are trying to impart, but the way that knowledge is conveyed may not be the best for those who do not already possess it.

The curse of expertise may be counterproductive for learners acquiring new skills. This is important because the predictions of experts can influence educational equity and training as well as the personal development of young people, not to mention the allocation of time and resources to scientific research and crucial design decisions. Effective teachers must predict the issues and misconceptions that people will face when learning a complex new skill or understanding an unfamiliar concept. This should also encompass the teachers’ recognizing their own or each other's bias blind spots.

Quality assurance (QA) is a way of circumventing the curse of expertise by applying comprehensive quality management techniques. Professionals, by definition, are paid for technically well-defined work, so quality control procedures may be required that cover the processes employed, the training of the expert, and the ethos of the expert's trade or profession. Some experts (lawyers, physicians, etc.) require a licence, which may include a requirement to undertake ongoing professional development (i.e., obtain OPD credits issued by collegiate universities or professional associations – see also normative safety).

Decoding the Disciplines is another way of coping with the curse of knowledge in educational settings. It intends to increase student learning by narrowing the gap between expert and novice thinking resulting from the curse of knowledge. The process seeks to make explicit the tacit knowledge of experts and to help students master the mental actions they need for success in particular disciplines.

Academics

Academics are usually employed in research and development activities that are less well understood than those of professionals, and therefore submit themselves to peer review assessment by other appropriately qualified individuals.

Computer programming

The curse can also show up in computer programming, where a programmer fails to produce understandable code (for example, failing to comment it) because its purpose seems obvious at the time of writing. A few months later, even the original author may have no idea why the code exists. The design of user interfaces is another example from the software industry: software engineers, who have a deep understanding of the domain the software is written for, create user interfaces that they themselves can understand and use, but end users, who do not possess the same level of knowledge, find those interfaces difficult to use and navigate. This problem has become so widespread in software design that the mantra "You are not the user" has become ubiquitous in the user experience industry, reminding practitioners that their knowledge and intuitions do not always match those of the end users they are designing for.
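A hypothetical before-and-after snippet (invented for illustration; the names and the tax rate are not from any real codebase) shows how code that seems self-evident to its author becomes opaque without context:

    # What the author writes, obvious to them at the time of writing:
    def adj(v, f):
        return v * 1.08 if f else v

    # What a reader (including the author, months later) needs instead:
    SALES_TAX_RATE = 0.08  # hypothetical rate, assumed for illustration

    def apply_sales_tax(price: float, taxable: bool) -> float:
        """Return the final price, adding sales tax when the item is taxable."""
        return price * (1 + SALES_TAX_RATE) if taxable else price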

To-do lists

Another example is writing a to-do list and viewing it at a future time, only to find that the entries no longer make sense because the context that was obvious at the time of writing has been lost.

The difficulty that experienced people may encounter is exemplified fictionally by Dr. Watson in his discourses with the insightful detective Sherlock Holmes. The xkcd comic "Average Familiarity" features two geochemists discussing the phenomenon.

Murphy's law


From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Murphy%27s_law

Murphy's law is an adage or epigram that is typically stated as: "Anything that can go wrong will go wrong."

Though similar statements and concepts have been made over the course of history, the law itself was coined by, and named after, American aerospace engineer Edward A. Murphy Jr.; its exact origins are debated, but it is generally agreed it originated from Murphy and his team following a mishap during rocket sled tests some time between 1948 and 1949, and was finalized and first popularized by testing project head John Stapp during a later press conference. Murphy's original quote was the precautionary design advice that "If there are two or more ways to do something and one of those results in a catastrophe, then someone will do it that way."

The law entered wider public knowledge in the late 1970s with the publication of Arthur Bloch's 1977 book Murphy's Law, and Other Reasons Why Things Go WRONG, which included other variations and corollaries of the law. Since then, Murphy's law has remained a popular (and occasionally misused) adage, though its accuracy has been disputed by academics.

Similar "laws" include Sod's law, Finagle's law, and Yhprum's law, among others.

History

British mathematician Augustus De Morgan (pictured circa 1860) wrote in 1866 that "whatever can happen will happen".
British stage magician Nevil Maskelyne wrote in 1908 that, during special occasions, "everything that can go wrong will go wrong".

The perceived perversity of the universe has long been a subject of comment, and precursors to the modern version of Murphy's law are abundant. According to Robert A. J. Matthews in a 1997 article in Scientific American, the name "Murphy's law" originated in 1949, but the concept itself had already long since been known. As quoted by Richard Rhodes, Matthews said, "The familiar version of Murphy's law is not quite 50 years old, but the essential idea behind it has been around for centuries. […] The modern version of Murphy's Law has its roots in U.S. Air Force studies performed in 1949 on the effects of rapid deceleration on pilots." Matthews goes on to explain how Edward A. Murphy Jr. was the eponym, but only because his original thought was modified subsequently into the now established form that is not exactly what he himself had said. Research into the origin of Murphy's law has been conducted by members of the American Dialect Society (ADS).

Mathematician Augustus De Morgan wrote on June 23, 1866: "The first experiment already illustrates a truth of the theory, well confirmed by practice, what-ever can happen will happen if we make trials enough." In later publications, "whatever can happen will happen" is occasionally termed "Murphy's law", which raises the possibility that "Murphy" is simply "De Morgan" misremembered.

ADS member Stephen Goranson found a version of the law, not yet generalized or bearing that name, in a report by Alfred Holt at an 1877 meeting of an engineering society.

It is found that anything that can go wrong at sea generally does go wrong sooner or later, so it is not to be wondered that owners prefer the safe to the scientific … Sufficient stress can hardly be laid on the advantages of simplicity. The human factor cannot be safely neglected in planning machinery. If attention is to be obtained, the engine must be such that the engineer will be disposed to attend to it.

ADS member Bill Mullins found a slightly broader version of the aphorism in reference to stage magic. The British stage magician Nevil Maskelyne wrote in 1908:

It is an experience common to all men to find that, on any special occasion, such as the production of a magical effect for the first time in public, everything that can go wrong will go wrong. Whether we must attribute this to the malignity of matter or to the total depravity of inanimate things, whether the exciting cause is hurry, worry, or what not, the fact remains.

In astronomy, "Spode's Law" refers to the phenomenon that the skies are always cloudy at the wrong moment; the law was popularized by amateur astronomer Patrick Moore but dates from the 1930s.

In 1948, humorist Paul Jennings coined the term resistentialism, a jocular play on resistance and existentialism, to describe "seemingly spiteful behavior manifested by inanimate objects", where objects that cause problems (like lost keys or a runaway bouncy ball) are said to exhibit a high degree of malice toward humans.

In 1952, as an epigraph to the mountaineering book The Butcher: The Ascent of Yerupaja, John Sack described the same principle, "Anything that can possibly go wrong, does", as an "ancient mountaineering adage".

Association with Murphy

John Stapp riding a rocket sled at Muroc Army Air Field (pictured circa the late 1940s or early 1950s). Murphy's law most likely originated during similar tests in 1948 and 1949.

Differing recollections years later by various participants make it impossible to pinpoint who first coined the saying Murphy's law. The law's name supposedly stems from an attempt to use new measurement devices developed by Edward A. Murphy, a United States Air Force (USAF) captain and aeronautical engineer. The phrase was coined in an adverse reaction to something Murphy said when his devices failed to perform and was eventually cast into its present form prior to a press conference some months later – the first ever (of many) given by John Stapp, a USAF colonel and flight surgeon in the 1950s.

From 1948 to 1949, Stapp headed research project MX981 at Muroc Army Air Field (later renamed Edwards Air Force Base) for the purpose of testing the human tolerance for g-forces during rapid deceleration. The tests used a rocket sled mounted on a railroad track with a series of hydraulic brakes at the end. Initial tests used a humanoid crash test dummy strapped to a seat on the sled, but subsequent tests were performed by Stapp, at that time a USAF captain. During the tests, questions were raised about the accuracy of the instrumentation used to measure the g-forces Captain Stapp was experiencing. Edward Murphy proposed using electronic strain gauges attached to the restraining clamps of Stapp's harness to measure the force exerted on them by his rapid deceleration. Murphy was engaged in supporting similar research using high speed centrifuges to generate g-forces.

During a trial run of this method using a chimpanzee, supposedly around June 1949, Murphy's assistant wired the harness and the rocket sled was launched. The sensors provided a zero reading; however, it became apparent that they had been installed incorrectly, with some sensors wired backwards. It was at this point that a frustrated Murphy made his pronouncement, despite being offered the time and chance to calibrate and test the sensor installation prior to the test proper, which he had declined somewhat irritably, getting off on the wrong foot with the MX981 team. George E. Nichols, an engineer and quality assurance manager with the Jet Propulsion Laboratory who was present at the time, recalled in an interview that Murphy blamed the failure on his assistant after the failed test, saying, "If that guy has any way of making a mistake, he will."

Nichols' account is that "Murphy's law" came about through conversation among the other members of the team; it was condensed to "If it can happen, it will happen", and named for Murphy in mockery of what Nichols perceived as arrogance on Murphy's part. Others, including Edward Murphy's surviving son Robert Murphy, deny Nichols' account and claim that the phrase did originate with Edward Murphy. According to Robert Murphy's account, his father's statement was along the lines of "If there's more than one way to do a job, and one of those ways will result in disaster, then he will do it that way."

The phrase first received public attention during a press conference in which Stapp was asked how it was that nobody had been severely injured during the rocket sled tests. Stapp replied that it was because they always took Murphy's law under consideration; he then summarized the law and said that in general, it meant that it was important to consider all the possibilities (possible things that could go wrong) before doing a test and act to counter them. Thus Stapp's usage and Murphy's alleged usage are very different in outlook and attitude. One is sour, the other an affirmation of the predictable being surmountable, usually by sufficient planning and redundancy. Nichols believes Murphy was unwilling to take the responsibility for the device's initial failure (by itself a blip of no large significance) and is to be doubly damned for not allowing the MX981 team time to validate the sensor's operability and for trying to blame an underling in the embarrassing aftermath.

The name "Murphy's law" was not immediately secure. A story by Lee Correy in the February 1955 issue of Astounding Science Fiction referred to "Reilly's law", which states that "in any scientific or engineering endeavor, anything that can go wrong will go wrong". Atomic Energy Commission Chairman Lewis Strauss was quoted in the Chicago Daily Tribune on February 12, 1955, saying "I hope it will be known as Strauss' law. It could be stated about like this: If anything bad can happen, it probably will."

Martin Caidin, a pilot with the Federal Aviation Agency, lists "Murphy's Three Laws of Physics" in chapter 13 of his book Operation Nuke (1973): (1) Whatever can go wrong, will go wrong. (2) Whatever's wrong is bound to get worse. (3) When the first two laws have passed, and you're still around, panic.

Arthur Bloch, in the first volume (1977) of his Murphy's Law, and Other Reasons Why Things Go WRONG series, prints a letter that he received from Nichols, who recalled an event that occurred in 1949 at Edwards Air Force Base that, according to him, is the origination of Murphy's law, and first publicly recounted by Stapp. An excerpt from the letter reads:

The law's namesake was Capt. Ed Murphy, a development engineer from Wright Field Aircraft Lab. Frustration with a strap transducer which was malfunctioning due to an error in wiring the strain gage bridges caused him to remark – "If there is any way to do it wrong, he will" – referring to the technician who had wired the bridges at the Lab. I assigned Murphy's law to the statement and the associated variations.

Disputed origins

The association with the Muroc incident is by no means secure. Despite extensive research, no trace of documentation of the saying as "Murphy's law" has been found before 1951. The next citations are not found until 1955, when the May–June issue of Aviation Mechanics Bulletin included the line "Murphy's law: If an aircraft part can be installed incorrectly, someone will install it that way", and Lloyd Mallan's book Men, Rockets and Space Rats referred to "Colonel Stapp's favorite takeoff on sober scientific laws—Murphy's law, Stapp calls it—'Everything that can possibly go wrong will go wrong'." In 1962, the Mercury Seven attributed Murphy's law to United States Navy training films.

Fred R. Shapiro, the editor of the Yale Book of Quotations, has shown that in 1952 the adage was called "Murphy's law" in a book by Anne Roe, quoting an unnamed physicist:

he described [it] as "Murphy's law or the fourth law of thermodynamics" (actually there were only three last I heard) which states: "If anything can go wrong, it will."

In May 1951, Anne Roe gave a transcript of an interview (part of a thematic apperception test, asking impressions on a drawing) with said physicist: "As for himself he realized that this was the inexorable working of the second law of the thermodynamics which stated Murphy's law 'If anything can go wrong it will'. I always liked 'Murphy's law'. I was told that by an architect." ADS member Stephen Goranson, investigating this in 2008 and 2009, found that Anne Roe's papers, held in the American Philosophical Society's archives in Philadelphia, identified the interviewed physicist as Howard Percy "Bob" Robertson (1903–1961). Robertson's papers at the Caltech archives include a letter in which Robertson offers Roe an interview within the first three months of 1949, making this apparently predate the Muroc incident said to have occurred in or after June 1949.

John Paul Stapp, Edward A. Murphy Jr., and George Nichols were jointly awarded an Ig Nobel Prize in engineering in 2003 "for (probably) giving birth to the name". Murphy's law was also the theme of the 2024 Ig Nobel Prize ceremony.

Academic and scientific views

According to Richard Dawkins, so-called laws like Murphy's law and Sod's law are nonsense because they require inanimate objects to have desires of their own, or else to react according to one's own desires. Dawkins points out that a certain class of events may occur all the time, but are only noticed when they become a nuisance. He gives an example of aircraft noise pollution interfering with filming: there are always aircraft in the sky at any given time, but they are only taken note of when they cause a problem. This is a form of confirmation bias, whereby the investigator seeks out evidence to confirm their already-formed ideas, but does not look for evidence that contradicts them.

Similarly, David Hand, emeritus professor of mathematics and senior research investigator at Imperial College London, points out that the law of truly large numbers should lead one to expect the kind of events predicted by Murphy's law to occur occasionally. Selection bias will ensure that those ones are remembered and the many times Murphy's law was not true are forgotten.
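A back-of-the-envelope calculation illustrates Hand's point (the per-occasion probability and trial counts below are assumed for illustration): events that are individually rare become near-certain given enough opportunities.

    # If a "Murphy moment" has probability p on any one occasion, the chance
    # of seeing at least one in n independent occasions is 1 - (1 - p)^n.
    # p and n are invented for illustration.
    p = 0.001          # a 1-in-1000 mishap per occasion
    for n in (100, 1_000, 10_000):
        at_least_once = 1 - (1 - p) ** n
        print(f"n={n:>6}: P(at least one mishap) = {at_least_once:.3f}")

    # n=   100: P(at least one mishap) = 0.095
    # n=  1000: P(at least one mishap) = 0.632
    # n= 10000: P(at least one mishap) = 1.000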

There have been persistent references associating Murphy's law with the laws of thermodynamics from early on (see the quotation from Anne Roe's book above). In particular, Murphy's law is often cited as a form of the second law of thermodynamics (the law of entropy) because both predict a tendency toward a more disorganized state. Atanu Chatterjee investigated this idea by formally stating Murphy's law in mathematical terms and found that, so stated, it could be disproved using the principle of least action.

Variations (corollaries) of the law

From its initial public announcement, Murphy's law quickly spread to various technical cultures connected to aerospace engineering. Before long, variations of the law applied to different topics and subjects had passed into the public imagination, changing over time. Arthur Bloch compiled a number of books of corollaries to Murphy's law and variations thereof, the first being Murphy's Law, and Other Reasons Why Things Go WRONG, which received several follow-ups and reprints.

Yhprum's law is an optimistic reversal of Murphy's law, stating that "anything that can go right will go right". Its name directly references this, being "Murphy" in reverse.

Management consultant Peter Drucker formulated "Drucker's law" in dealing with complexity of management: "If one thing goes wrong, everything else will, and at the same time."

"Mrs. Murphy's law" is a corollary of Murphy's law, which states that "Anything that can go wrong will go wrong while Mr. Murphy is out of town."

The term is sometimes used to describe concise, ironic, humorous rules of thumb that often do not share a relation to the original law or Edward Murphy himself, but still posit him as a relevant expert in the law's subject. Examples of these "Murphy's laws" include those for military tactics, technology, romance, social relations, research, and business.

In the 2014 film Interstellar, a character's interpretation of the law is that it "doesn't mean that something bad will happen. It means that whatever can happen, will happen."
