Gun politics in the United States tends to be polarized between advocates of gun rights, often conservative or libertarian, and those who support stricter gun control, often liberal.
The gun culture
of the United States is unique among developed countries in the large
number of firearms owned by civilians, generally permissive regulations,
and high levels of gun violence compared to other developed countries.
History
American militia culture
American attitudes on gun ownership date back to the American Revolutionary War, and also arise from traditions of hunting, militias, and frontier living.
Justifying the unique attitude toward gun ownership in the United States, James Madison wrote in Federalist No. 46, in 1788:
Those who are best acquainted with
the last successful resistance of this country against the British arms,
will be most inclined to deny the possibility of it. Besides the
advantage of being armed, which the Americans possess over the people of
almost every other nation, the existence of subordinate governments, to
which the people are attached, and by which the militia officers are
appointed, forms a barrier against the enterprises of ambition, more
insurmountable than any which a simple government of any form can admit
of. Notwithstanding the military establishments in the several kingdoms
of Europe, which are carried as far as the public resources will bear,
the governments are afraid to trust the people with arms. And it is not
certain, that with this aid alone they would not be able to shake off
their yokes. But were the people to possess the additional advantages of
local governments chosen by themselves, who could collect the national
will and direct the national force, and of officers appointed out of the
militia, by these governments, and attached both to them and to the
militia, it may be affirmed with the greatest assurance, that the throne
of every tyranny in Europe would be speedily overturned in spite of the
legions which surround it.
The American hunting and sporting passion dates from a time when the United States was an agrarian, subsistence nation where hunting was a profession for some, an auxiliary source of food for some settlers, and also a deterrent to animal predators. A connection between shooting skill and survival among rural American men was in many cases a necessity and a rite of passage into manhood. Hunting endures as a sentimental component of American gun culture and as a means of controlling animal populations across the country, despite modern trends away from subsistence hunting and rural living.
The militia spirit derives from early Americans' dependence on arms to protect themselves from foreign armies and hostile Native Americans. Survival depended upon everyone being capable of using a weapon. Before the American Revolution there was neither the budget, the manpower, nor the government desire to maintain a full-time army. Therefore, the armed citizen-soldier carried the responsibility. Service in the militia, including providing one's own ammunition and weapons, was mandatory for all men. Yet, as early as the 1790s, mandatory universal militia duty gave way to voluntary militia units and a reliance on a regular army. Throughout the 19th century, the institution of the civilian militia declined.
Closely related to the militia tradition was the frontier tradition, with the need for a means of self-protection associated with nineteenth-century westward expansion and the American frontier. In popular literature, frontier adventure was most famously told by James Fenimore Cooper, who is credited by Petri Liukkonen with creating the archetype of an 18th-century frontiersman through such novels as The Last of the Mohicans (1826) and The Deerslayer (1841).
The American Scots-Irish settlers arguably best epitomized this frontier spirit. Emigrating from what had historically been an economically poor and violent region of Britain, these immigrants brought with them an intense pride, individualism, and love of guns that would shape their descendants' views and help form the origins of American gun culture. Settling in Appalachia, the Scots-Irish would lead the push westward and eventually populate a band stretching from Appalachia to Texas and Oklahoma and, particularly after the Dust Bowl, into Southern California.
African American gun culture
A distinct and growing subculture of American gun culture has been developed and promoted by African Americans since at least the end of the American Civil War. From Frederick Douglass, W. E. B. Du Bois, Ida B. Wells, and Marcus Garvey to the American civil rights movement and the Pan-African movement, an array of African American gun cultures and philosophies of violence and self-defense have proliferated in American life.
Ownership levels
The U.S. gun homicide rate is roughly 18 times the average of other developed countries, and the U.S. gun ownership rate exceeds one gun per person.
Annual
gun production in the U.S. has increased substantially in the 21st
century, after having remained fairly level over preceding decades. By 2023, a majority of U.S. states allowed adults to carry concealed guns in public.
U.S. gun sales have risen in the 21st century, peaking during the COVID-19 pandemic. "NICS" is the FBI's National Instant Criminal Background Check System.
Almost every major gunmaker produces its own version of the AR-15, with ~16 million Americans owning at least one.
"Americans made up 4 percent of the world's population but owned
about 46 percent of the entire global stock of 857 million civilian
firearms."
In 2018, it was estimated that U.S. civilians own 393 million firearms, more "than those held by civilians in the other top 25 countries combined," and that 40% to 42% of the households in the country have at least one gun. Record gun sales followed in subsequent years. The U.S. has by far the highest estimated number of guns per capita in the world, at 120.5 guns for every 100 people.
Although historically there have been significant differences in gun ownership between races and sexes, those gaps may be closing: women and ethnic minorities saw the sharpest rise in private gun ownership in the United States in 2020, and the trend shows no sign of abating. Gun ownership rose sharply in 2020 and 2021 amid the riots and the pandemic of that period.[22][23] Nearly half of the buyers appeared to be first-time owners.[23] Over 2 million firearms were purchased during the pandemic alone.
According to Gallup, in 2020, 32% of U.S. adults said they
personally own a gun, while a larger percentage, 44%, report living in a
gun household.
Popular culture
In the late 19th century, cowboy and "Wild West" imagery entered the collective imagination. The first American female superstar, Annie Oakley, was a sharpshooter who toured the country starting in 1885, performing in Buffalo Bill's Wild West show. The cowboy archetype of the individualist hero was established largely by Owen Wister in stories and novels, most notably The Virginian (1902), following close on the heels of Theodore Roosevelt's The Winning of the West (1889–1895), a history of the early frontier. Cowboys were also popularized in turn of the 20th century cinema, notably through such early classics as The Great Train Robbery (1903) and A California Hold Up (1906)—the most commercially successful film of the pre-nickelodeon era.
Gangster films
started in 1910, but became popular only with the advent of sound in
film in the 1930s. The genre was boosted by the events of the prohibition era, such as bootlegging and the St. Valentine's Day Massacre of 1929, the existence of real-life gangsters such as Al Capone and the rise of contemporary organized crime
and escalation of urban violence. These movies flaunted the archetypal
exploits of "swaggering, cruel, wily, tough, and law-defying bootleggers
and urban gangsters".[29]
During World War II, Hollywood produced many morale-boosting movies, patriotic rallying
cries that affirmed a sense of national purpose. The image of the lone
cowboy was replaced in these combat films by stories emphasizing group
efforts and the value of individual sacrifices for a larger cause, often
featuring a group of men from diverse ethnic backgrounds who were
thrown together, tested on the battlefield, and molded into a dedicated
fighting unit.
Guns frequently accompanied famous heroes and villains in late 20th-century American films, from the outlaws of Bonnie and Clyde (1967) and The Godfather (1972), to the fictitious law and order avengers like Dirty Harry (1971) and RoboCop (1987). In the 1970s, films portrayed fictitious and exaggerated characters, madmen ostensibly produced by the Vietnam War in films like Taxi Driver (1976) and Apocalypse Now
(1979), while other films told stories of fictitious veterans who were
supposedly victims of the war and in need of rehabilitation (Coming Home and The Deer Hunter, both 1978).
Many action films continue to celebrate the gun-toting hero in fantastical settings. At the same time, the negative role of the gun in fictionalized modern urban violence has been explored in films like Boyz n the Hood (1991) and Menace II Society (1993).
Political and cultural theories
Gun culture and its effects have been at the center of major debates in the US's public sphere for decades. In his 1970 article "America as a Gun Culture," historian Richard Hofstadter
used the phrase "gun culture" to characterize America as having a
long-held affection for guns, embracing and celebrating the association
of guns and an overall heritage relating to guns. He also noted that the
US "is the only industrial nation in which the possession of rifles,
shotguns, and handguns is lawfully prevalent among large numbers of its
population". In 1995, political scientist Robert Spitzer
said that the modern American gun culture is founded on three factors:
the proliferation of firearms since the earliest days of the nation, the
connection between personal ownership of weapons and the country's
revolutionary and frontier history, and the cultural mythology regarding
the gun in the frontier and in modern life.
In 2008, the US Supreme Court affirmed that the right of individuals
to possess firearms is guaranteed by the Second Amendment.
Terms applied to opponents
Terms used by gun rights and gun control advocates to refer to opponents are part of the larger topic of gun politics.
The term gun nut refers to firearms enthusiasts who are deeply involved with the gun culture. It is regarded as a pejorative stereotype cast upon gun owners by gun control advocates as a means of implying that they are fanatical, exhibit abnormal behavior, or are a threat to the safety of others. Some gun owners embrace the term affectionately.
The term hoplophobia refers to an "irrational aversion to firearms"; retired US Marine Jeff Cooper claimed to have coined the term in the 1960s.
Foreign perspective
The
US attitude to guns generally perplexes those in other developed
countries, many of whom do not understand the unusual permissiveness of
American gun laws, and believe that the American public should push for
harsher gun control measures due to mass shootings.
Critics contrast the strength of the US response to terrorism, which causes relatively few deaths, with the country's high death rate from non-terror-related gun crime.
Doomsday argument
The doomsday argument (DA), or Carter catastrophe, is a probabilistic argument that claims to predict the future population of the human species based on an estimation of the number of humans born to date. The doomsday argument was originally proposed by the astrophysicist Brandon Carter in 1983, leading to the initial name of the Carter catastrophe. The argument was subsequently championed by the philosopher John A. Leslie and has since been independently conceived by J. Richard Gott and Holger Bech Nielsen. Similar principles of eschatology were proposed earlier by Heinz von Foerster, among others. A more general form was given earlier in the Lindy effect,
which proposes that for certain phenomena, the future life expectancy
is proportional to (though not necessarily equal to) the current age and
is based on a decreasing mortality rate over time.
Summary
The
premise of the argument is as follows: suppose that the total number of
human beings that will ever exist is fixed. If so, the likelihood of a
randomly selected person existing at a particular time in history would
be proportional to the total population at that time. Given this, the
argument posits that a person alive today should adjust their
expectations about the future of the human race because their existence
provides information about the total number of humans that will ever
live.
If the total number of humans who were born or will ever be born is denoted by N, then the Copernican principle suggests that any one human is equally likely (along with the other N − 1 humans) to find themselves at any position n of the total population, so humans assume that our fractional position f = n/N is uniformly distributed on the interval (0, 1] before learning our absolute position.
f is uniformly distributed on (0, 1] even after learning the absolute position n. For example, there is a 95% chance that f is in the interval (0.05, 1], that is, f > 0.05. In other words, one can assume with 95% certainty that any individual human would be within the last 95% of all the humans ever to be born. If the absolute position n is known, this argument implies a 95% confidence upper bound for N, obtained by rearranging n/N > 0.05 to give N < 20n.
If Leslie's figure is used, then approximately 60 billion humans have been born so far, so it can be estimated that there is a 95% chance that the total number of humans will be less than 20 × 60 billion = 1.2 trillion. Assuming that the world population stabilizes at 10 billion with a life expectancy of 80 years, it can be estimated that the remaining 1,140 billion humans will be born within 9,120 years. Depending on the projection of the world population in the forthcoming centuries, estimates may vary, but the argument states that it is unlikely that more than 1.2 trillion humans will ever live.
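The arithmetic above is simple enough to check directly. Below is a minimal Python sketch, assuming the article's illustrative figures (60 billion births to date, a stable population of 10 billion, an 80-year life expectancy); the function name is ours.

```python
# Sketch of the Summary's arithmetic under the stated assumptions.

def doomsday_upper_bound(n_born, confidence=0.95):
    """Upper bound on total births N, from requiring n/N > 1 - confidence."""
    return n_born / (1 - confidence)

n = 60e9                                   # humans born so far (Leslie's figure)
N_max = doomsday_upper_bound(n)            # 1.2 trillion total births
remaining = N_max - n                      # 1,140 billion births still to come

births_per_year = 10e9 / 80                # stable population / life expectancy
print(N_max)                               # 1.2e+12
print(remaining / births_per_year)         # 9120.0 years
```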
Aspects
Assume, for simplicity, that the total number of humans who will ever be born is either 60 billion (N1) or 6,000 billion (N2). If there is no prior knowledge of the position that a currently living individual, X, has in the history of humanity, one may instead count how many humans were born before X, arriving at, say, 59,854,795,447, which places X among the first 60 billion humans who have ever lived. Under N1, the probability of X falling among the first 60 billion births is 100%, while under N2 it is only 1%, so X's early position is strong evidence for the smaller total.
It is possible to sum the probabilities for each value of N and, therefore, to compute a statistical 'confidence limit' on N. For example, taking the numbers above, it is 99% certain that N is smaller than 6,000 billion.
Note that, as remarked above, this argument assumes that the prior probability for N is flat, or 50% for N1 and 50% for N2, in the absence of any information about X. On the other hand, it is possible to conclude, given X, that N2 is more likely than N1 if a different prior is used for N. More precisely, Bayes' theorem tells us that P(N|X) = P(X|N)P(N)/P(X), and the conservative application of the Copernican principle tells us only how to calculate P(X|N). Taking P(X) to be flat, we still have to assume the prior probability P(N) that the total number of humans is N. If we conclude that N2 is much more likely than N1 (for example, because producing a larger population takes more time, increasing the chance that a low-probability but cataclysmic natural event will take place in that time), then the posterior P(N|X) can become more heavily weighted towards the bigger value of N. A further, more detailed discussion, as well as relevant distributions P(N), is given below in the Rebuttals section.
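As a concrete illustration of the flat-prior case just described, the following sketch (variable names ours) carries out the two-hypothesis Bayesian update:

```python
# Two-hypothesis Bayesian update with the flat 50/50 prior assumed above.

N1, N2 = 60e9, 6000e9        # candidate totals for all humans ever born
n = 60e9                     # X is observed among the first 60 billion births

# Likelihood of finding X within the first n births under each hypothesis
like = {N1: min(n / N1, 1.0), N2: min(n / N2, 1.0)}   # 1.0 and 0.01

prior = {N1: 0.5, N2: 0.5}   # flat prior, as the text assumes
evidence = sum(prior[N] * like[N] for N in (N1, N2))
posterior = {N: prior[N] * like[N] / evidence for N in (N1, N2)}

print(posterior[N1])         # ~0.990: "99% certain that N is smaller than 6,000 billion"
```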
The doomsday argument does not say that humanity cannot or
will not exist indefinitely. It does not put any upper limit on the
number of humans that will ever exist nor provide a date for when
humanity will become extinct. An abbreviated form of the argument does
make these claims, by confusing probability with certainty. However,
the actual conclusion for the version used above is that there is a 95% chance
of extinction within 9,120 years and a 5% chance that some humans will
still be alive at the end of that period. (The precise numbers vary
among specific doomsday arguments.)
Variations
This
argument has generated a philosophical debate, and no consensus has yet
emerged on its solution. The variants described below produce the DA by
separate derivations.
Gott's formulation: "vague prior" total population
Gott specifically proposes the functional form for the prior distribution of the number of people who will ever be born (N). Gott's DA used the vague prior distribution:
P(N) = k/N,
where P(N) is the probability of N prior to discovering n, the number of humans who have been born to date, and k is a constant chosen to normalize the sum of P(N). The value chosen is not important here, just the functional form (this is an improper prior, so no value of k gives a valid distribution, but Bayesian inference is still possible using it).
Since Gott specifies the prior distribution of total humans, P(N), Bayes' theorem and the principle of indifference alone give us P(N|n), the probability of N humans being born if n is a random draw from N:
P(N|n) = P(n|N) P(N) / P(n).
The principle of indifference gives P(n|N) = 1/N for n ≤ N. The unconditioned n distribution of the current population is identical to the vague prior N probability density function, so
P(n) = k/n,
giving P(N|n) for each specific N (through a substitution into the posterior probability equation):
P(N|n) = n/N², for N ≥ n.
The easiest way to produce the doomsday estimate with a given confidence (say 95%) is to pretend that N is a continuous variable (since it is very large) and integrate over the probability density from N = n to N = Z. (This will give a function for the probability that N ≤ Z):
P(N ≤ Z) = ∫ from N = n to Z of (n/N²) dN = 1 − n/Z.
Defining Z = 20n gives:
P(N ≤ 20n) = 1 − 1/20 = 95%.
This is the simplest Bayesian derivation of the doomsday argument:
The chance that the total number of humans that will ever be born (N) is greater than twenty times the total that have been born is below 5%.
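A quick numerical check of this derivation (the setup is ours; SciPy is used for the integration):

```python
# Integrate Gott's posterior P(N|n) = n / N**2 from N = n to N = Z,
# which should recover P(N <= Z) = 1 - n/Z.

from scipy.integrate import quad

def posterior(N, n):
    return n / N**2        # valid only for N >= n

n = 1.0                    # units cancel; only the ratio Z/n matters
Z = 20 * n
prob, _ = quad(posterior, n, Z, args=(n,))
print(prob)                # 0.95, so P(N > 20n) is the remaining 5%
```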
The use of a vague prior distribution seems well-motivated as it assumes as little knowledge as possible about N,
given that some particular function must be chosen. It is equivalent to
the assumption that the probability density of one's fractional
position remains uniformly distributed even after learning of one's
absolute position (n).
Gott's "reference class" in his original 1993 paper was not the
number of births, but the number of years "humans" had existed as a
species, which he put at 200,000. Also, Gott tried to give a 95% confidence interval between a minimum
survival time and a maximum. Because of the 2.5% chance that he gives
to underestimating the minimum, he has only a 2.5% chance of
overestimating the maximum. This equates to 97.5% confidence that
extinction occurs before the upper boundary of his confidence interval,
which can be used in the integral above with Z = 40n and n = 200,000 years:
P(N ≤ 40n) = 1 − 1/40 = 97.5%.
This is how Gott produces a 97.5% confidence of extinction within N ≤ 8,000,000 years. The number he quoted was the likely time remaining, N − n = 7.8 million years. This was much higher than the temporal confidence bound produced by counting births, because it applied the principle of indifference to time. (Producing different estimates by sampling different parameters in the same hypothesis is Bertrand's paradox.)
Similarly, there is a 97.5% chance that the present lies in the first 97.5% of human history, so there is a 97.5% chance that the total lifespan of humanity will be at least
N ≥ n/0.975 ≈ 205,100 years;
in other words, Gott's argument gives a 95% confidence that humans will go extinct between 5,100 and 7.8 million years in the future.
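Both endpoints follow from the same rearrangement; a short sketch of the arithmetic, using the figures from the text:

```python
# Gott's temporal bounds from n = 200,000 years of human existence,
# with 2.5% in each tail of the 95% interval.

n = 200_000

# Upper tail: P(N <= Z) = 1 - n/Z = 0.975  =>  Z = 40n
Z = n / (1 - 0.975)
print(Z, Z - n)            # 8,000,000 years total; 7,800,000 remaining

# Lower tail: present within the first 97.5% of history  =>  N >= n / 0.975
N_min = n / 0.975
print(N_min, N_min - n)    # ~205,128 years total; ~5,128 (roughly 5,100) remaining
```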
Gott has also tested this formulation against the Berlin Wall and Broadway and off-Broadway plays.
Leslie's argument differs from Gott's version in that he does not assume a vague prior probability distribution for N.
Instead, he argues that the force of the doomsday argument resides
purely in the increased probability of an early doomsday once you take
into account your birth position, regardless of your prior probability
distribution for N. He calls this the probability shift.
Heinz von Foerster
argued that humanity's abilities to construct societies, civilizations
and technologies do not result in self-inhibition. Rather, societies'
success varies directly with population size. Von Foerster found that
this model fits some 25 data points from the birth of Jesus to 1958, with only 7% of the variance left unexplained. Several follow-up letters (1961, 1962, ...) were published in Science
showing that von Foerster's equation was still on track. The data
continued to fit up until 1973. The most remarkable thing about von
Foerster's model was that it predicted the human population would reach infinity, a mathematical singularity, on Friday, November 13, 2026. In fact, von Foerster did not imply that the world population on that day could actually become infinite. The real implication was that the world population growth pattern followed for many centuries prior to 1960 was about to come to an end and be transformed into a radically different pattern. Note that this prediction began to be fulfilled just a few years after the "doomsday" argument was published.
Reference classes
The reference class from which n is drawn, and of which N is the ultimate size, is a crucial point of contention in the doomsday argument. The "standard" doomsday argument hypothesis skips over this point entirely, merely stating that the reference class is the number of "people". Given that you are human, the Copernican principle might be used to determine if you were born exceptionally early; however, the term "human" has been heavily contested on practical and philosophical grounds. According to Nick Bostrom, consciousness is (part of) the discriminator between what is in and what is out of the reference class, and therefore extraterrestrial intelligence might have a significant impact on the calculation.
The following sub-sections relate to different suggested
reference classes, each of which has had the standard doomsday argument
applied to it.
SSSA: Sampling from observer-moments
Nick Bostrom, considering observation selection effects, has produced a Self-Sampling Assumption
(SSA): "that you should think of yourself as if you were a random
observer from a suitable reference class". If the "reference class" is
the set of humans to ever be born, this gives N < 20n with 95% confidence (the standard doomsday argument). However, he has refined this idea to apply to observer-moments rather than just observers. He has formalized this as:
The strong self-sampling assumption (SSSA): Each observer-moment
should reason as if it were randomly selected from the class of all
observer-moments in its reference class.
An application of the principle underlying SSSA (though this
application is nowhere expressly articulated by Bostrom), is: If the
minute in which you read this article is randomly selected from every
minute in every human's lifespan, then (with 95% confidence) this event
has occurred after the first 5% of human observer-moments. If the mean
lifespan in the future is twice the historic mean lifespan, this implies
95% confidence that N < 10n (the average future human
will account for twice the observer-moments of the average historic
human). Therefore, the 95th percentile extinction-time estimate in this
version is 4,560 years.
Rebuttals
We are in the earliest 5%, a priori
One
counterargument to the doomsday argument agrees with its statistical
methods but disagrees with its extinction-time estimate. This position
requires justifying why the observer cannot be assumed to be randomly
selected from the set of all humans ever to be born, which implies that
this set is not an appropriate reference class. To disagree with the doomsday argument's conclusion is thus to hold that the observer lies within the first 5% of humans to be born.
By analogy, if one is a member of 50,000 people in a
collaborative project, the reasoning of the doomsday argument implies
that there will never be more than a million members of that project,
within a 95% confidence interval. However, if one's characteristics are
typical of an early adopter,
rather than typical of an average member over the project's lifespan,
then it may not be reasonable to assume one has joined the project at a
random point in its life. For instance, the mainstream of potential
users will prefer to be involved when the project is nearly complete.
However, if one enjoys the project's incompleteness, one is already known to be unusual, even before discovering one's early involvement.
If one has measurable attributes that set one apart from the
typical long-run user, the project doomsday argument can be refuted
based on the fact that one could expect to be within the first 5% of
members, a priori. The analogy to the total-human-population form of the argument is that confidence in a prediction of the distribution
of human characteristics that places modern and historic humans outside
the mainstream implies that it is already known, before examining n, that it is likely to be very early in N. This is an argument for changing the reference class.
For example, if one is certain that 99% of humans who will ever live will be cyborgs,
but that only a negligible fraction of humans who have been born to
date are cyborgs, one could be equally certain that at least one hundred
times as many people remain to be born as have been.
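The cyborg example reduces to one line of arithmetic (the 1% fraction is the example's stipulation):

```python
# If at most 1% of all humans ever born are non-cyborgs, and essentially
# all n born so far are non-cyborgs, then n <= 0.01 * N, i.e. N >= 100n.

n = 60e9                      # births to date, assumed non-cyborg
N_min = n / 0.01              # smallest total consistent with the stipulation
print(N_min / n)              # 100.0: at least a hundred times n in total
```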
Robin Hanson's paper sums up these criticisms of the doomsday argument:
All else is not equal; we have good reasons for thinking we are not randomly selected humans from all who will ever live.
Human extinction is distant, a posteriori
The a posteriori observation that extinction-level events are rare could be offered as evidence that the doomsday argument's predictions are implausible; typically, extinctions of dominant species happen less often than once in a million years. Therefore, it is argued that human extinction is unlikely within the next ten millennia. (This is another probabilistic argument, drawing a different conclusion than the doomsday argument.)
In Bayesian terms, this response to the doomsday argument says
that our knowledge of history (or ability to prevent disaster) produces a
prior marginal for N with a minimum value in the trillions. If N is distributed uniformly from 10^12 to 10^13, for example, then the probability of N < 1,200 billion inferred from n = 60 billion will be extremely small. This is an equally impeccable Bayesian calculation, rejecting the Copernican principle
because we must be 'special observers' since there is no likely
mechanism for humanity to go extinct within the next hundred thousand
years.
This response is accused of overlooking the technological threats to humanity's survival, to which earlier life was not subject, and is specifically rejected by most academic critics of the doomsday argument (arguably excepting Robin Hanson).
The prior N distribution may make n very uninformative
Suppose, for example, that the prior for N is exponential in a uniform draw: N = c·e^U(0,q], where c and q are constants. If q is large, then our 95% confidence upper bound is on the uniform draw, not on the exponential value of N.
The simplest way to compare this with Gott's Bayesian argument is to flatten the distribution from the vague prior by having the probability fall off more slowly with N (than inverse proportionally). This corresponds to the idea that humanity's growth may be exponential in time, with doomsday having a vague prior probability density function in time. This would mean that N, the last birth, would have a distribution looking like the following:
Pr(N) = k/N^α, for some 0 < α ≤ 1.
This prior N distribution is all that is required (with the principle of indifference) to produce the inference of N from n, and this is done in an identical way to the standard case, as described by Gott (equivalent to α = 1 in this distribution):
Pr(n) = ∫ from N = n to ∞ of Pr(n|N) Pr(N) dN = k/(α n^α).
Substituting into the posterior probability equation:
Pr(N|n) = α n^α / N^(α+1).
Integrating the probability of any N above xn:
Pr(N > xn) = ∫ from N = xn to ∞ of Pr(N|n) dN = 1/x^α.
For example, if x = 20 and α = 0.5, this becomes:
Pr(N > 20n) = 1/√20 ≈ 22%.
Therefore, with this prior, the chance of a trillion births is well over 20%, rather than the 5% chance given by the standard DA. If α is reduced further by assuming a flatter prior N distribution, then the limits on N given by n become weaker. An α of one reproduces Gott's calculation with a birth reference class, and an α of around 0.5 could approximate his temporal confidence interval calculation (if the population were expanding exponentially). As α gets smaller, n becomes less and less informative about N. In the limit this distribution approaches an (unbounded) uniform distribution, where all values of N are equally likely. This is Page et al.'s "Assumption 3", which they find few reasons to reject a priori. (Although all distributions with α ≤ 1 are improper priors, this applies to Gott's vague-prior distribution also, and they can all be converted to produce proper integrals by postulating a finite upper population limit.)
Since the probability of reaching a population of size 2N is usually thought of as the chance of reaching N multiplied by the survival probability from N to 2N, it follows that Pr(N) must be a monotonically decreasing function of N, but this does not necessarily require an inverse proportionality.
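The tail formula Pr(N > xn) = x^(−α) makes the weakening limits easy to tabulate (a one-line sketch, ours):

```python
# Tail probability Pr(N > x*n) = x**(-alpha) under the prior Pr(N) = k/N**alpha.
# alpha = 1 recovers Gott's 5% at x = 20; smaller alpha weakens the bound.

x = 20
for alpha in (1.0, 0.5, 0.25):
    print(alpha, round(x ** -alpha, 3))    # 0.05, 0.224, 0.473
```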
Infinite expectation
Another objection to the doomsday argument is that the expected total human population is actually infinite. The calculation is as follows:
The total human population N = n/f, where n is the human population to date and f is our fractional position in the total.
We assume that f is uniformly distributed on (0,1].
The expectation of N is E(N) = ∫ from 0 to 1 of (n/f) df = ∞; the integral diverges because 1/f is unbounded as f → 0.
For a similar example of counterintuitive infinite expectations, see the St. Petersburg paradox.
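A Monte Carlo sketch (ours) shows the divergence in practice: the sample mean of n/f never settles as the number of draws grows.

```python
# E[N] = integral of n/f over f in (0,1] diverges; the running sample
# mean of n/f drifts upward instead of converging.

import random

n = 60e9
for trials in (10**3, 10**5, 10**7):
    total = sum(n / (1.0 - random.random())   # 1 - U[0,1) lies in (0,1]
                for _ in range(trials))
    print(trials, total / trials)             # mean keeps growing with trials
```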
Self-indication assumption: The possibility of not existing at all
One objection is that the possibility of a human existing at all depends on how many humans will ever exist (N).
If this is a high number, then the possibility of their existing is
higher than if only a few humans will ever exist. Since they do indeed
exist, this is evidence that the number of humans that will ever exist
is high.
This objection, originally by Dennis Dieks (1992), is now known by Nick Bostrom's name for it: the "Self-Indication Assumption objection". It can be shown that some SIAs prevent any inference of N from n (the current population).
Caves gives a number of examples to argue that Gott's rule is
implausible. For instance, he says, imagine stumbling into a birthday
party, about which you know nothing:
Your friendly enquiry about the age of the celebrant elicits the reply that she is celebrating her (t_p =) 50th birthday. According to Gott, you can predict with 95% confidence
that the woman will survive between [50]/39 = 1.28 years and 39[×50] =
1,950 years into the future. Since the wide range encompasses reasonable
expectations regarding the woman's survival, it might not seem so bad,
till one realizes that [Gott's rule] predicts that with probability 1/2
the woman will survive beyond 100 years old and with probability 1/3
beyond 150. Few of us would want to bet on the woman's survival using
Gott's rule. (See Caves' online paper below.)
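Caves' quoted numbers follow from Gott's rule with fractional position f ~ U(0,1), so that future/past = (1 − f)/f and P(future > x·past) = 1/(1 + x); a quick check (ours):

```python
# Verifying the birthday-party numbers implied by Gott's rule.

t_p = 50                          # the celebrant's age
print(t_p / 39, 39 * t_p)         # 1.28..., 1950: the 95% survival interval
print(1 / (1 + 1))                # 0.5:   P(she lives past 100), future > past
print(1 / (1 + 2))                # 0.333: P(she lives past 150), future > 2*past
```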
Although this example exposes a weakness in J. Richard Gott's
"Copernicus method" DA (that he does not specify when the "Copernicus
method" can be applied) it is not precisely analogous with the modern DA; epistemological refinements of Gott's argument by philosophers such as Nick Bostrom specify that:
Knowing the absolute birth rank (n) must give no information on the total population (N).
Careful DA variants specified with this rule aren't shown implausible
by Caves' "Old Lady" example above, because the woman's age is given
prior to the estimate of her lifespan. Since human age gives an estimate
of survival time (via actuarial tables) Caves' Birthday party age-estimate could not fall into the class of DA problems defined with this proviso.
To produce a comparable "Birthday Party Example" of the carefully
specified Bayesian DA, we would need to completely exclude all prior
knowledge of likely human life spans; in principle this could be done
(e.g.: hypothetical Amnesia chamber). However, this would remove the
modified example from everyday experience. To keep it in the everyday
realm the lady's age must be hidden prior to the survival estimate being made. (Although this is no longer exactly the DA, it is much more comparable to it.)
Without knowing the lady's age, the DA reasoning produces a rule to convert the birthday (n) into a maximum lifespan with 50% confidence (N). Gott's Copernicus method rule is simply: Prob(N < 2n) = 50%. How accurate would this estimate turn out to be? Western demographics are now fairly uniform across ages, so a random birthday (n) could be (very roughly) approximated by a U(0, M] draw, where M is the maximum lifespan in the census. In this "flat" model, everyone shares the same lifespan, so N = M. If n happens to be less than M/2, then Gott's 2n estimate of N will be under M, its true figure. The other half of the time, 2n overestimates M, and in this case (the one Caves highlights in his example) the subject will die before the 2n estimate is reached. In this "flat demographics" model, Gott's 50% confidence figure is proven right 50% of the time.
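This "flat demographics" claim is easy to simulate (a sketch under the stated model, with an arbitrary M):

```python
# Everyone lives exactly M years; a random visit age n ~ U(0, M].
# Gott's rule claims P(N < 2n) = 50%, and here the true N is always M.

import random

M = 100.0
trials = 100_000
hits = sum(M < 2 * random.uniform(0, M) for _ in range(trials))
print(hits / trials)     # ~0.5: the rule is right half the time, as claimed
```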
Some philosophers have suggested that only people who have
contemplated the doomsday argument (DA) belong in the reference class "human". If that is the appropriate reference class, Carter defied his own prediction when he first described the argument (to the Royal Society). An attendant could have argued thus:
Presently, only one person in the world understands the
Doomsday argument, so by its own logic there is a 95% chance that it is a
minor problem which will only ever interest twenty people, and I should
ignore it.
Jeff Dewynne and Professor Peter Landsberg suggested that this line of reasoning would create a paradox for the doomsday argument:
If a member of the Royal Society did pass such a comment, it
would indicate that they understood the DA sufficiently well that in
fact 2 people could be considered to understand it, and thus there would
be a 5% chance that 40 or more people would actually be interested.
Also, of course, ignoring something because you only expect a small
number of people to be interested in it is extremely short sighted—if
this approach were to be taken, nothing new would ever be explored, if
we assume no a priori knowledge of the nature of interest and attentional mechanisms.
Conflation of future duration with total duration
Various
authors have argued that the doomsday argument rests on an incorrect
conflation of future duration with total duration. This occurs in the
specification of the two time periods as "doom soon" and "doom deferred"
which means that both periods are selected to occur after the observed value of the birth order. A rebuttal in Pisaturo (2009) argues that the doomsday argument relies on the equivalent of this equation:
P(HFS|Dp·X) / P(HFL|Dp·X) = [P(HFS|X) / P(HFL|X)] × [P(Dp|HTS·X) / P(Dp|HTL·X)],
where:
X = the prior information;
Dp = the data that the past duration is t_p;
HFS = the hypothesis that the future duration of the phenomenon will be short;
HFL = the hypothesis that the future duration of the phenomenon will be long;
HTS = the hypothesis that the total duration of the phenomenon will be short, i.e., that t_T, the phenomenon's total longevity, = t_TS;
HTL = the hypothesis that the total duration of the phenomenon will be long, i.e., that t_T, the phenomenon's total longevity, = t_TL, with t_TL > t_TS.
Pisaturo then observes:
Clearly, this is an invalid application of Bayes' theorem, as it conflates future duration and total duration.
Pisaturo takes numerical examples based on two possible corrections
to this equation: considering only future durations and considering only
total durations. In both cases, he concludes that the doomsday
argument's claim, that there is a "Bayesian shift" in favor of the
shorter future duration, is fallacious.
This argument is also echoed in O'Neill (2014).
In this work O'Neill argues that a unidirectional "Bayesian Shift" is
an impossibility within the standard formulation of probability theory
and is contradictory to the rules of probability. As with Pisaturo, he
argues that the doomsday argument conflates future duration with total
duration by specification of doom times that occur after the observed
birth order. According to O'Neill:
The reason for the hostility to the doomsday argument and its
assertion of a "Bayesian shift" is that many people who are familiar
with probability theory are implicitly aware of the absurdity of the
claim that one can have an automatic unidirectional shift in beliefs
regardless of the actual outcome that is observed. This is an example of
the "reasoning to a foregone conclusion" that arises in certain kinds
of failures of an underlying inferential mechanism. An examination of
the inference problem used in the argument shows that this suspicion is
indeed correct, and the doomsday argument is invalid. (pp. 216-217)
Confusion over the meaning of confidence intervals
Gelman and Robert assert that the doomsday argument confuses frequentist confidence intervals with Bayesian credible intervals. Suppose that every individual knows their number n and uses it to estimate an upper bound on N. Every individual has a different estimate, and these estimates are constructed so that 95% of them contain the true value of N
and the other 5% do not. This, say Gelman and Robert, is the defining
property of a frequentist lower-tailed 95% confidence interval. But,
they say, "this does not mean that there is a 95% chance that any
particular interval will contain the true value." That is, while 95% of
the confidence intervals will contain the true value of N, this is not the same as N
being contained in the confidence interval with 95% probability. The
latter is a different property and is the defining characteristic of a
Bayesian credible interval. Gelman and Robert conclude:
the Doomsday argument is the
ultimate triumph of the idea, beloved among Bayesian educators, that our
students and clients do not really understand Neyman–Pearson confidence
intervals and inevitably give them the intuitive Bayesian
interpretation.
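Their distinction can be made concrete with a small simulation (ours): give every individual n the upper bound 20n and count how many of those intervals contain the true N.

```python
# Frequentist coverage of the bound N <= 20n: about 95% of individuals'
# intervals contain the true N, yet any one interval either does or doesn't.

N = 1_000_000                                  # true total, unknown to individuals
covered = sum(20 * n >= N for n in range(1, N + 1))
print(covered / N)                             # 0.950001: ~95% coverage
```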