Saturday, November 11, 2017

Half-life

From Wikipedia, the free encyclopedia
Number of half-lives elapsed | Fraction remaining | Percentage remaining
0    | 1/1    | 100
1    | 1/2    | 50
2    | 1/4    | 25
3    | 1/8    | 12.5
4    | 1/16   | 6.25
5    | 1/32   | 3.125
6    | 1/64   | 1.563
7    | 1/128  | 0.781
...  | ...    | ...
n    | 1/2^n  | 100/(2^n)
Half-life (symbol t1⁄2) is the time required for a quantity to reduce to half its initial value. The term is commonly used in nuclear physics to describe how quickly unstable atoms undergo, or how long stable atoms survive, radioactive decay. The term is also used more generally to characterize any type of exponential or non-exponential decay. For example, the medical sciences refer to the biological half-life of drugs and other chemicals in the human body. The converse of half-life is doubling time.
The original term, half-life period, dating to Ernest Rutherford's discovery of the principle in 1907, was shortened to half-life in the early 1950s.[1] Rutherford applied the principle of a radioactive element's half-life to studies of age determination of rocks by measuring the decay period of radium to lead-206.
Half-life is constant over the lifetime of an exponentially decaying quantity, and it is a characteristic unit for the exponential decay equation. The accompanying table shows the reduction of a quantity as a function of the number of half-lives elapsed.
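As a quick illustration, the rows of the table follow directly from the 1/2^n rule in its last row; a minimal Python sketch (added here for illustration, not part of the article):

```python
# Fraction and percentage remaining after n half-lives (the 1/2^n rule in the last table row).
# Prints exact values; the table above rounds the percentages to three decimals.
for n in range(8):
    fraction = 1 / 2 ** n
    print(n, f"1/{2 ** n}", f"{100 * fraction:g}%")
```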

Probabilistic nature

Simulation of many identical atoms undergoing radioactive decay, starting with either 4 atoms per box (left) or 400 (right). The number at the top is how many half-lives have elapsed. Note the consequence of the law of large numbers: with more atoms, the overall decay is more regular and more predictable.

A half-life usually describes the decay of discrete entities, such as radioactive atoms. In that case, it does not work to use the definition that states "half-life is the time required for exactly half of the entities to decay". For example, if there is just one radioactive atom, and its half-life is one second, there will not be "half of an atom" left after one second.

Instead, the half-life is defined in terms of probability: "Half-life is the time required for exactly half of the entities to decay on average". In other words, the probability of a radioactive atom decaying within its half-life is 50%.

For example, the image on the right is a simulation of many identical atoms undergoing radioactive decay. Note that after one half-life there are not exactly one-half of the atoms remaining, only approximately, because of the random variation in the process. Nevertheless, when there are many identical atoms decaying (right boxes), the law of large numbers suggests that it is a very good approximation to say that half of the atoms remain after one half-life.

There are various simple exercises that demonstrate probabilistic decay, for example involving flipping coins or running a statistical computer program.[2][3][4]
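For instance, a minimal Python sketch in that coin-flip spirit (the function name and parameters are illustrative, not taken from any cited program): each surviving atom independently has a 50% chance of decaying during every half-life that elapses.

```python
import random

def simulate_decay(n_atoms, n_half_lives, seed=1):
    """Each surviving atom 'flips a coin' (50% chance of decaying) during every
    half-life that elapses; return the count remaining after each step."""
    rng = random.Random(seed)
    remaining = n_atoms
    history = [remaining]
    for _ in range(n_half_lives):
        remaining = sum(1 for _ in range(remaining) if rng.random() >= 0.5)
        history.append(remaining)
    return history

# Few atoms: noisy, far from exact halving.  Many atoms: close to halving each step.
print(simulate_decay(4, 4))    # something like [4, 1, 1, 0, 0]
print(simulate_decay(400, 4))  # something like [400, 196, 103, 48, 27]
```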

Formulas for half-life in exponential decay

An exponential decay can be described by any of the following three equivalent formulas:
\begin{aligned}
N(t) &= N_0 \left(\frac{1}{2}\right)^{t/t_{1/2}} \\
N(t) &= N_0 e^{-t/\tau} \\
N(t) &= N_0 e^{-\lambda t}
\end{aligned}
where
  • N0 is the initial quantity of the substance that will decay (this quantity may be measured in grams, moles, number of atoms, etc.),
  • N(t) is the quantity that still remains and has not yet decayed after a time t,
  • t1⁄2 is the half-life of the decaying quantity,
  • τ is a positive number called the mean lifetime of the decaying quantity,
  • λ is a positive number called the decay constant of the decaying quantity.
The three parameters t1⁄2, τ, and λ are all directly related in the following way:
t_{1/2} = \frac{\ln(2)}{\lambda} = \tau \ln(2)
where ln(2) is the natural logarithm of 2 (approximately 0.693).
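This relation follows directly from the third formula above by setting N(t) equal to half the initial quantity:
\begin{aligned}
\tfrac{1}{2}N_0 = N\!\left(t_{1/2}\right) &= N_0 e^{-\lambda t_{1/2}} \\
e^{-\lambda t_{1/2}} &= \tfrac{1}{2} \\
\lambda t_{1/2} &= \ln(2) \\
t_{1/2} &= \frac{\ln(2)}{\lambda} = \tau\ln(2) \qquad (\text{using } \tau = 1/\lambda)
\end{aligned}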

By plugging in and manipulating these relationships, we get all of the following equivalent descriptions of exponential decay, in terms of the half-life:
\begin{aligned}
N(t) &= N_0 \left(\frac{1}{2}\right)^{t/t_{1/2}} = N_0\,2^{-t/t_{1/2}} = N_0 e^{-t\ln(2)/t_{1/2}} \\
t_{1/2} &= \frac{t}{\log_2(N_0/N(t))} = \frac{t}{\log_2(N_0)-\log_2(N(t))} = \frac{1}{\log_{2^t}(N_0)-\log_{2^t}(N(t))} = \frac{t\ln(2)}{\ln(N_0)-\ln(N(t))}
\end{aligned}
Regardless of how it's written, we can plug into the formula to get
  • N(0) = N_0, as expected (this is the definition of "initial quantity");
  • N(t_{1/2}) = N_0/2, as expected (this is the definition of half-life);
  • \lim_{t\to\infty} N(t) = 0; i.e., the amount approaches zero as t approaches infinity, as expected (the longer we wait, the less remains).
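A small Python sketch of these relationships (the function names are illustrative); it uses the carbon-14 half-life of 5,730 years quoted later in the article as a test value:

```python
import math

def remaining(n0, t, half_life):
    """N(t) = N0 * (1/2)^(t / t_half) for an exponential decay."""
    return n0 * 0.5 ** (t / half_life)

def decay_constant(half_life):
    """lambda = ln(2) / t_half"""
    return math.log(2) / half_life

def mean_lifetime(half_life):
    """tau = t_half / ln(2)"""
    return half_life / math.log(2)

# Carbon-14 (half-life 5,730 years):
print(remaining(1.0, 5730, 5730))    # 0.5
print(remaining(1.0, 11460, 5730))   # 0.25
print(decay_constant(5730))          # ~1.21e-4 per year
```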

Decay by two or more processes

Some quantities decay by two exponential-decay processes simultaneously. In this case, the actual half-life T1⁄2 can be related to the half-lives t1 and t2 that the quantity would have if each of the decay processes acted in isolation:
\frac{1}{T_{1/2}} = \frac{1}{t_1} + \frac{1}{t_2}
For three or more processes, the analogous formula is:
\frac{1}{T_{1/2}} = \frac{1}{t_1} + \frac{1}{t_2} + \frac{1}{t_3} + \cdots
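A one-function Python sketch of this rule (the example half-lives are made up for illustration):

```python
def combined_half_life(*half_lives):
    """Effective half-life when independent decay processes act together:
    1/T = 1/t1 + 1/t2 + ..."""
    return 1.0 / sum(1.0 / t for t in half_lives)

# Made-up example: processes with half-lives of 3 and 6 time units combine to 2.
print(combined_half_life(3, 6))   # 2.0
```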

Examples

Half-life demonstrated using dice in a classroom experiment

There is a half-life describing any exponential-decay process. For example:
  • The current flowing through an RC circuit or RL circuit decays with a half-life of RC ln(2) or ln(2) L/R, respectively. For this example, the term half time might be used instead of "half-life", but they mean the same thing. (A short numerical sketch follows this list.)
  • In a first-order chemical reaction, the half-life of the reactant is ln(2)/λ, where λ is the reaction rate constant.
  • In radioactive decay, the half-life is the length of time after which there is a 50% chance that an atom will have undergone nuclear decay. It varies depending on the atom type and isotope, and is usually determined experimentally. See List of nuclides.
The half-life of a chemical species is the time it takes for its concentration to fall to half of its initial value.
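As a numerical illustration of the first bullet above (the component values are chosen for the example, not taken from the article):

```python
import math

# Illustrative RC circuit: R = 1 kOhm, C = 100 uF, so RC = 0.1 s.
R = 1_000      # ohms
C = 100e-6     # farads
print(math.log(2) * R * C)   # half-life of the decaying current, ~0.069 s
```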

In non-exponential decay

The decay of many physical quantities is not exponential—for example, the evaporation of water from a puddle, or (often) the chemical reaction of a molecule. In such cases, the half-life is defined the same way as before: as the time elapsed before half of the original quantity has decayed. However, unlike in an exponential decay, the half-life depends on the initial quantity, and the prospective half-life will change over time as the quantity decays.

As an example, the radioactive decay of carbon-14 is exponential with a half-life of 5,730 years. A quantity of carbon-14 will decay to half of its original amount (on average) after 5,730 years, regardless of how big or small the original quantity was. After another 5,730 years, one-quarter of the original will remain. On the other hand, the time it will take a puddle to half-evaporate depends on how deep the puddle is. Perhaps a puddle of a certain size will evaporate down to half its original volume in one day. But on the second day, there is no reason to expect that one-quarter of the puddle will remain; in fact, it will probably be much less than that. This is an example where the half-life reduces as time goes on. (In other non-exponential decays, it can increase instead.)

The decay of a mixture of two or more materials which each decay exponentially, but with different half-lives, is not exponential. Mathematically, the sum of two exponential functions is not a single exponential function. A common example of such a situation is the waste of nuclear power stations, which is a mix of substances with vastly different half-lives. Consider a mixture of a rapidly decaying element A, with a half-life of 1 second, and a slowly decaying element B, with a half-life of 1 year. In a couple of minutes, almost all atoms of element A will have decayed after repeated halving of the initial number of atoms, but very few of the atoms of element B will have done so as only a tiny fraction of its half-life has elapsed. Thus, the mixture taken as a whole will not decay by halves.
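A short Python sketch of the element A / element B scenario (the starting amounts are arbitrary illustrative values):

```python
# Mixture of element A (half-life 1 s) and element B (half-life 1 year),
# starting with 1e6 atoms of each.
year_s = 365.25 * 24 * 3600

def mixture(t):
    return 1e6 * 0.5 ** (t / 1.0) + 1e6 * 0.5 ** (t / year_s)

print(mixture(120) / mixture(0))   # ~0.5 after two minutes: A is gone, B has barely decayed
```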

In biology and pharmacology

A biological half-life or elimination half-life is the time it takes for a substance (drug, radioactive nuclide, or other) to lose one-half of its pharmacologic, physiologic, or radiological activity. In a medical context, the half-life may also describe the time that it takes for the concentration of a substance in blood plasma to reach one-half of its steady-state value (the "plasma half-life"). The relationship between the biological and plasma half-lives of a substance can be complex, due to factors including accumulation in tissues, active metabolites, and receptor interactions.[5]

While a radioactive isotope decays almost perfectly according to so-called "first order kinetics" where the rate constant is a fixed number, the elimination of a substance from a living organism usually follows more complex chemical kinetics.

For example, the biological half-life of water in a human being is about 9 to 10 days,[citation needed] though this can be altered by behavior and various other conditions. The biological half-life of cesium in human beings is between one and four months.

Bandwagon effect

A literal "bandwagon", from which the metaphor is derived.

The bandwagon effect is a phenomenon whereby the rate of uptake of beliefs, ideas, fads and trends increases the more that they have already been adopted by others. In other words, the bandwagon effect is characterized by the probability of individual adoption increasing with respect to the proportion who have already done so.[1] As more people come to believe in something, others also "hop on the bandwagon" regardless of the underlying evidence.

The tendency to follow the actions or beliefs of others can occur because individuals directly prefer to conform, or because individuals derive information from others. Both explanations have been used for evidence of conformity in psychological experiments. For example, social pressure has been used to explain Asch's conformity experiments,[2] and information has been used to explain Sherif's autokinetic experiment.[3]

According to this concept, the increasing popularity of a product or phenomenon encourages more people to "get on the bandwagon", too. The bandwagon effect explains why there are fashion trends.[4]

When individuals make rational choices based on the information they receive from others, economists have proposed that information cascades can quickly form in which people decide to ignore their personal information signals and follow the behavior of others.[5] Cascades explain why behavior is fragile: people understand that such cascades rest on very limited information. As a result, fads form easily but are also easily dislodged. Such informational effects have been used to explain political bandwagons.[6]

Origin

The definition of a bandwagon is a wagon which carries a band during the course of a parade, circus or other entertainment event.[7] The phrase "jump on the bandwagon" first appeared in American politics in 1848 when Dan Rice, a famous and popular circus clown of the time, used his bandwagon and its music to gain attention for his political campaign appearances. As his campaign became more successful, other politicians strove for a seat on the bandwagon, hoping to be associated with his success. Later, during the time of William Jennings Bryan's 1900 presidential campaign, bandwagons had become standard in campaigns,[8] and the phrase "jump on the bandwagon" was used as a derogatory term, implying that people were associating themselves with success without considering that with which they associated themselves.

In politics

The bandwagon effect occurs in voting:[9] some people vote for those candidates or parties who are likely to succeed (or are proclaimed as such by the media), hoping to be on the "winner's side" in the end.[citation needed] The bandwagon effect has been applied to situations involving majority opinion, such as political outcomes, where people alter their opinions to the majority view.[10] Such a shift in opinion can occur because individuals draw inferences from the decisions of others, as in an informational cascade.[citation needed]

Because of time zones, election results are broadcast in the eastern parts of the United States while polls are still open in the west. This difference has led to research on how the behavior of voters in western United States is influenced by news about the decisions of voters in other time zones. In 1980, NBC News declared Ronald Reagan to be the winner of the presidential race on the basis of the exit polls several hours before the voting booths closed in the west.

It is also said to be important in the American presidential primary elections. States all vote at different times, spread over several months, rather than all on one day. Some states (Iowa, New Hampshire) have special precedence to go early, while others choose to wait until a certain date. This is often said to give undue influence to these states: a win in one of these early states is said to give a candidate the "Big Mo" (momentum), and it has propelled many candidates to win the nomination. Because of this, other states often try front-loading (going as early as possible) to make their say as influential as they can. In the 2008 presidential primaries, two states had all or some of their delegates banned from the convention by the central party organizations for voting too early.[11][12]

Several studies have tested this theory of the bandwagon effect in political decision making. In a 1994 study by Robert K. Goidel and Todd G. Shields in The Journal of Politics, 180 students at the University of Kentucky were randomly assigned to nine groups and were asked questions about the same set of election scenarios. About 70% of subjects received information about the expected winner.[13] Independents, those who do not vote based on the endorsement of any party and are ultimately neutral, were influenced strongly in favor of the person expected to win.[14] Expectations played a significant role throughout the study: independents were twice as likely to vote for the Republican candidate when the Republican was expected to win. The results also showed that when the Democrat was expected to win, independent Republicans and weak Republicans were more likely to vote for the Democratic candidate.[15]

A study by Albert Mehrabian, reported in the Journal of Applied Social Psychology (1998), tested the relative importance of the bandwagon (rally around the winner) effect versus the underdog (empathic support for those trailing) effect. Bogus poll results presented to voters prior to the 1996 Republican primary clearly showed the bandwagon effect to predominate on balance. Indeed, approximately 6% of the variance in the vote was explained in terms of the bogus polls, showing that poll results (whether accurate or inaccurate) can significantly influence election results in closely contested elections. In particular, assuming that one candidate "is an initial favorite by a slim margin, reports of polls showing that candidate as the leader in the race will increase his or her favorable margin".[16] Thus, as poll results are repeatedly reported, the bandwagon effect will tend to snowball and become a powerful aid to leading candidates.

During the 1992 U.S. presidential election, Vicki G. Morwitz and Carol Pluzinski conducted a study, which was published in The Journal of Consumer Research (1996). At a large northeastern university, some of 214 volunteer business students were given the results of student and national polls indicating that Bill Clinton was in the lead. Others were not exposed to the results of the polls. Several students who had intended to vote for Bush changed their minds after seeing the poll results.[17]

Additionally, British polls have shown increasing public exposure to poll results. Sixty-eight percent of voters had heard the results of opinion polls during the 1979 general election campaign; by 1987, the share of voters aware of the results had increased to 74%.[18] According to British studies, there is a consistent pattern of apparent bandwagon effects for the leading party.

In microeconomics

In microeconomics, the bandwagon effect describes interactions of demand and preference.[19] The bandwagon effect arises when people's preference for a commodity increases as the number of people buying it increases. This interaction potentially disturbs the normal results of the theory of supply and demand, which assumes that consumers make buying decisions solely based on price and their own personal preference.
Gary Becker has even argued that the bandwagon effect could be so strong as to make the demand curve slope upward.[20]

In medicine

Medical bandwagons have been identified as “the overwhelming acceptance of unproved but popular ideas”. They have led to inappropriate therapies for large numbers of patients and have impeded the development of more appropriate treatment.

Lawrence Cohen and Henry Rothschild's 1979 exposition The Bandwagons of Medicine describes several of these therapeutic misadventures, some of which persisted for centuries before they were abandoned, supplanted by another bandwagon, or replaced by a scientifically valid alternative.[21] The ancient serpent cult of Aesculapius, in which sacred snakes licked the afflicted as treatment for their diseases, is an example of a bandwagon gathering momentum based on a strong personality, in this case a Roman god.[22]

In sport

Stephen Curry, two-time NBA MVP (2014/15 - 2015/16)

One who supports a particular sports team, despite having shown no interest in that team until it started gaining success, can be considered a "bandwagon fan". One recent example in the United States is the Golden State Warriors, who rose to prominence by winning the 2015 NBA Finals and then posting a record-breaking 73–9 regular season the following year.[23] The bandwagon effect can be seen in the sales statistics for point guard Stephen Curry's jersey. Curry merchandise sales in the first two weeks of the 2015–2016 season were 453% higher than in the first two weeks of the 2014–2015 season, including a 581% increase in sales of his jersey; his merchandise was a top-seller in 38 of the 50 U.S. states, and the Warriors' merchandise became the best-selling of any NBA team.[24]
