Effective altruism

From Wikipedia, the free encyclopedia

Effective altruism (EA) is a 21st-century philosophical and social movement that advocates impartially calculating benefits and prioritizing causes to provide the greatest good. It is motivated by "using evidence and reason to figure out how to benefit others as much as possible, and taking action on that basis". People who pursue the goals of effective altruism, who are sometimes called effective altruists, follow a variety of approaches proposed by the movement, such as donating to selected charities and choosing careers, with the goal of maximising positive impact. The movement has spurred the creation of research centers, advisory organizations, and charities, which collectively have donated several hundred million dollars.

A defining feature of effective altruism is impartiality, specifically the global equal consideration of interests when choosing beneficiaries. Popular cause priorities within effective altruism include global health and development, social and economic inequality, animal welfare, and risks to the survival or flourishing of humanity over the long-term future. Only a small portion of all charities are affiliated with effective altruism, except in niche areas such as farmed-animal welfare, AI safety, and biosecurity.

The movement developed during the 2000s, and the name effective altruism was coined in 2011. Philosophers influential to the movement include Peter Singer, Toby Ord, and William MacAskill. Effective altruism is most popular within the anglosphere, with concentrations at elite universities in the United States and United Kingdom, as well as in and around the technology industry in the San Francisco Bay Area.

The movement received mainstream attention and criticism with the bankruptcy of the cryptocurrency exchange FTX, as its founder Sam Bankman-Fried had been a major funder of effective altruism causes prior to late 2022.

History

Peter Singer and William MacAskill are among several philosophers who have helped popularize effective altruism.

Beginning in the latter half of the 2000s, several communities centered on altruist, rationalist, and futurological concerns began to converge.

In 2011, Giving What We Can and 80,000 Hours decided to incorporate into an umbrella organization and held a vote for their new name; the "Centre for Effective Altruism" was selected. The Effective Altruism Global conference has been held since 2013. As the movement formed, it attracted individuals who were not part of a specific community, but who had been following the Australian moral philosopher Peter Singer's work on applied ethics, particularly "Famine, Affluence, and Morality" (1972), Animal Liberation (1975), and The Life You Can Save (2009). Singer himself used the term in 2013, in a TED talk titled "The Why and How of Effective Altruism".

Notable philanthropists

An estimated $416 million was donated to effective charities identified by the movement in 2019, representing a 37% annual growth rate since 2015. Two of the largest donors in the effective altruism community, Dustin Moskovitz, who became wealthy by co-founding Facebook, and his wife Cari Tuna, hope to donate most of their net worth of over $11 billion to effective altruist causes through the private foundation Good Ventures, and are major funders of Open Philanthropy. Others influenced by effective altruism include Sam Bankman-Fried and the professional poker players Dan Smith and Liv Boeree. Jaan Tallinn, the Estonian billionaire founder of Skype, is known for donating to some effective altruist causes. Bankman-Fried launched a philanthropic organization called the FTX Foundation in February 2021; it made contributions to a number of effective altruist organizations but was shut down in November 2022 when FTX collapsed.

Notable publications and media

A number of books and articles related to effective altruism have been published that have codified, criticized, and brought more attention to the movement. In 2015, philosopher Peter Singer published The Most Good You Can Do: How Effective Altruism Is Changing Ideas About Living Ethically. The same year, the Scottish philosopher and ethicist William MacAskill published Doing Good Better: How Effective Altruism Can Help You Make a Difference.

In 2018, American news website Vox launched its Future Perfect section, led by journalist Dylan Matthews, which publishes articles and podcasts on "finding the best ways to do good".

In 2019, Oxford University Press published the volume Effective Altruism: Philosophical Issues, edited by Hilary Greaves and Theron Pummer.

More recent books have emphasized concerns for future generations. In 2020, the Australian moral philosopher Toby Ord published The Precipice: Existential Risk and the Future of Humanity, while MacAskill published What We Owe the Future in 2022.

In 2023, Oxford University Press published the volume The Good it Promises, The Harm it Does: Critical Essays on Effective Altruism, edited by Carol J. Adams, Alice Crary, and Lori Gruen.

Philosophy

Effective altruists focus on the many philosophical questions related to the most effective ways to benefit others. Such philosophical questions shift the starting point of reasoning from "what to do" to "why" and "how". There is no consensus on the answers, and there are also differences between effective altruists who believe that they should do the most good they possibly can with all of their resources and those who only try to do the most good they can within a defined budget.

According to MacAskill, the view of effective altruism as doing the most good one can within a defined budget can be compatible with a wide variety of views on morality and meta-ethics, as well as traditional religious teachings on altruism such as in Christianity. Effective altruism can also be in tension with religion where religion emphasizes spending resources on worship and evangelism instead of causes that do the most good.

Other than Peter Singer and William MacAskill, philosophers associated with effective altruism include Nick Bostrom, Toby Ord, Hilary Greaves, and Derek Parfit. Economist Yew-Kwang Ng conducted similar research in welfare economics and moral philosophy.

The Centre for Effective Altruism lists the following four principles that unite effective altruism: prioritization, impartial altruism, open truthseeking, and a collaborative spirit. To support people's ability to act altruistically on the basis of impartial reasoning, the effective altruism movement promotes values and actions such as a collaborative spirit, honesty, transparency, and publicly pledging to donate a certain percentage of income or other resources.

Impartiality

Effective altruism emphasizes impartial reasoning, in which everyone's well-being counts equally. Singer, in his 1972 essay "Famine, Affluence, and Morality", wrote:

It makes no moral difference whether the person I can help is a neighbor's child ten yards away from me or a Bengali whose name I shall never know, ten thousand miles away ... The moral point of view requires us to look beyond the interests of our own society.

Impartiality combined with seeking to do the most good leads to prioritizing benefits to those who are in a worse state, because anyone who happens to be worse off will benefit more from an improvement in their state, all other things being equal.
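The reasoning above rests on diminishing marginal utility: the same resources produce a larger welfare gain for someone who starts with less. A minimal sketch, using log utility and purely hypothetical incomes and transfer size:

```python
import math

def utility_gain(income, transfer=1000):
    """Welfare gain from a fixed transfer under log utility (illustrative only)."""
    return math.log(income + transfer) - math.log(income)

# The same $1,000 transfer helps the poorer person far more:
print(utility_gain(2_000) > utility_gain(50_000))  # True
```

Any concave utility function yields the same ordering; log utility is just a common, simple choice.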

Scope of moral consideration

One issue related to moral impartiality is the question of which beings are deserving of moral consideration. Some effective altruists consider the well-being of non-human animals in addition to humans, and advocate for animal welfare issues such as ending factory farming. Those who subscribe to longtermism include future generations as possible beneficiaries and try to improve the moral value of the long-term future by, for example, reducing existential risks.

Criticism of impartiality

The drowning child analogy in Singer's essay provoked philosophical debate. In response to a version of Singer's drowning child analogy, philosopher Kwame Anthony Appiah in 2006 asked whether the most effective action of a man in an expensive suit, confronted with a drowning child, would not be to save the child and ruin his suit—but rather, sell the suit and donate the proceeds to charity. Appiah believed that he "should save the drowning child and ruin my suit". In a 2015 debate, when presented with a similar scenario of either saving a child from a burning building or saving a Picasso painting to sell and donate the proceeds to charity, MacAskill responded that the effective altruist should save and sell the Picasso. Psychologist Alan Jern called MacAskill's choice "unnatural, even distasteful, to many people", although Jern concluded that effective altruism raises questions "worth asking". MacAskill later endorsed a "qualified definition of effective altruism" in which effective altruists try to do the most good "without violating constraints" such as any obligations that someone might have to help those nearby.

William Schambra has criticized the impartial logic of effective altruism, arguing that benevolence arising from reciprocity and face-to-face interactions is stronger and more prevalent than charity based on impartial, detached altruism. Such community-based charitable giving, he wrote, is foundational to civil society and, in turn, democracy. Larissa MacFarquhar said that people have diverse moral emotions, and she suggested that some effective altruists are not unemotional and detached but feel as much empathy for distant strangers as for people nearby. Richard Pettigrew concurred that many effective altruists "feel more profound dismay at the suffering of people unknown to them than many people feel", and he argued that impartiality in EA need not be dispassionate and "is not obviously in tension with much in care ethics" as some philosophers have argued. Ross Douthat of The New York Times criticized the movement's "'telescopic philanthropy' aimed at distant populations" and envisioned "effective altruists sitting around in a San Francisco skyscraper calculating how to relieve suffering halfway around the world while the city decays beneath them", while he also praised the movement for providing "useful rebukes to the solipsism and anti-human pessimism that haunts the developed world today".

Cause prioritization

A key component of effective altruism is "cause prioritization". Cause prioritization is based on the principle of cause neutrality, the idea that resources should be distributed to causes based on what will do the most good, irrespective of the identity of the beneficiary and the way in which they are helped. By contrast, many non-profits emphasize effectiveness and evidence with respect to a single cause such as education or climate change.

One tool that EA-based organizations may use to prioritize cause areas is the importance, tractability, and neglectedness framework. Importance is the amount of value that would be created if a problem were solved, tractability is the fraction of a problem that would be solved if additional resources were devoted to it, and neglectedness is the quantity of resources already committed to a cause.
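The three factors are typically combined multiplicatively, so a cause scores highly only if it is valuable, solvable, and under-resourced at once. A minimal sketch of this heuristic, with hypothetical cause names and made-up numbers:

```python
def itn_score(importance, tractability, neglectedness):
    """Multiplicative ITN heuristic: rough marginal value of extra resources."""
    return importance * tractability * neglectedness

causes = {
    # (value if solved, fraction solved by added resources, 1 / resources already committed)
    "cause_A": (100, 0.10, 1 / 50),   # valuable and tractable, but crowded
    "cause_B": (500, 0.01, 1 / 5),    # less tractable, but far more neglected
}

ranked = sorted(causes, key=lambda name: itn_score(*causes[name]), reverse=True)
print(ranked)
```

Here the neglected cause wins despite lower tractability; in practice each factor is itself an uncertain estimate, not a known number.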

The information required for cause prioritization may involve data analysis, comparing possible outcomes with what would have happened under other conditions (counterfactual reasoning), and identifying uncertainty. The difficulty of these tasks has led to the creation of organizations that specialize in researching the relative prioritization of causes.

Criticism of cause prioritization

This practice of "weighing causes and beneficiaries against one another" was criticized by Ken Berger and Robert Penna of Charity Navigator for being "moralistic, in the worst sense of the word" and "elitist". William MacAskill responded to Berger and Penna, defending the rationale for comparing one beneficiary's interests against another and concluding that such comparison is difficult and sometimes impossible but often necessary. MacAskill argued that the more pernicious form of elitism was that of donating to art galleries (and like institutions) instead of charity. Ian David Moss suggested that the criticism of cause prioritization could be resolved by what he called "domain-specific effective altruism", which would encourage "that principles of effective altruism be followed within an area of philanthropic focus, such as a specific cause or geography" and could resolve the conflict between local and global perspectives for some donors.

Cost-effectiveness

Some charities are considered to be far more effective than others, as charities may spend different amounts of money to achieve the same goal, and some charities may not achieve the goal at all. Effective altruists seek to identify interventions that are highly cost-effective in expectation. Many interventions have uncertain benefits, and the expected value of one intervention can be higher than that of another if its benefits are larger, even if it has a smaller chance of succeeding. One metric effective altruists use to choose between health interventions is the estimated number of quality-adjusted life years (QALY) added per dollar.
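The expected-value comparison described above can be made concrete: multiply the probability of success by the benefit, then divide by cost. A minimal sketch with entirely hypothetical interventions and numbers:

```python
def expected_qalys_per_dollar(p_success, qalys_if_successful, cost):
    """Expected cost-effectiveness of an uncertain intervention (illustrative)."""
    return p_success * qalys_if_successful / cost

interventions = {
    # (probability of success, QALYs added if successful, cost in dollars)
    "certain_small": (0.95, 1_000, 100_000),
    "risky_large":   (0.10, 50_000, 100_000),
}

for name, params in interventions.items():
    print(name, expected_qalys_per_dollar(*params))
```

Despite only a 10% chance of success, the risky intervention has the higher expected value (0.05 vs. 0.0095 QALYs per dollar), which is the trade-off the paragraph describes.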

Some effective altruist organizations prefer randomized controlled trials as a primary form of evidence, as they are commonly considered the highest level of evidence in healthcare research. Others have argued that requiring this stringent level of evidence unnecessarily narrows the focus to issues where the evidence can be developed. Kelsey Piper argues that uncertainty is not a good reason for effective altruists to avoid acting on their best understanding of the world, because most interventions have mixed evidence regarding their effectiveness.

Pascal-Emmanuel Gobry and others have warned about the "measurement problem": issues such as medical research or government reform are worked on "one grinding step at a time", and their results are hard to measure with controlled experiments. Gobry also argues that such interventions risk being undervalued by the effective altruism movement. As effective altruism emphasizes a data-centric approach, critics say principles which do not lend themselves to quantification—justice, fairness, equality—get left on the sidelines.

Counterfactual reasoning

Counterfactual reasoning involves considering the possible outcomes of alternative choices. It has been employed by effective altruists in a number of contexts, including career choice. Many people assume that the best way to help others is through direct methods, such as working for a charity or providing social services. However, since there is a high supply of candidates for such positions, it makes sense to compare the amount of good one candidate does to how much good the next-best candidate would do. According to this reasoning, the marginal impact of a career is likely to be smaller than the gross impact.
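The marginal-versus-gross distinction reduces to simple subtraction: the counterfactual impact of taking a role is the good done minus the good the next-best candidate would have done. A minimal sketch with hypothetical impact units:

```python
# Hypothetical "impact units" for one charity role (illustrative only).
gross_impact = 100           # good done by the person who takes the job
next_best_candidate = 90     # good the runner-up would have done instead

# Counterfactual (marginal) impact of taking the job rather than stepping aside:
marginal_impact = gross_impact - next_best_candidate
print(marginal_impact)       # far smaller than the gross impact
```

With a deep pool of similar candidates, the runner-up's impact approaches one's own, which is why the marginal impact of such careers can be small even when the gross impact is large.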

Differences from utilitarianism

Although EA, like utilitarianism, aims at maximizing the good, it differs from utilitarianism in a few ways; for example, EA does not claim that people should always maximize the good regardless of the means, and EA does not claim that the good is the sum total of well-being. Toby Ord has described utilitarians as "number-crunching", compared with most effective altruists whom he called "guided by conventional wisdom tempered by an eye to the numbers". Other philosophers have argued that EA still retains some core ethical commitments that are essential and distinctive to utilitarianism, such as the principle of impartiality, welfarism, and good-maximization.

MacAskill has argued that one shouldn't be absolutely certain about which ethical view is correct, and that "when we are morally uncertain, we should act in a way that serves as a best compromise between different moral views". He also wrote that even from a purely consequentialist perspective, "naive calculations that justify some harmful action because it has good consequences are, in practice, almost never correct".

Differences from effective accelerationism

Effective accelerationism (abbreviated e/acc) is influenced by ideas of accelerationism. Its proponents advocate unrestricted technological progress in the hope that artificial general intelligence will solve major challenges and maximize overall good, arguing that technological deceleration and stagnation pose a greater risk than any posed by AI. Effective altruists are generally more cautious about AI, considering that going too fast could increase existential risks.

Cause priorities

The principles and goals of effective altruism are wide enough to support furthering any cause that allows people to do the most good, while taking into account cause neutrality. Many people in the effective altruism movement have prioritized global health and development, animal welfare, and mitigating risks that threaten the future of humanity.

Global health and development

The alleviation of global poverty and neglected tropical diseases has been a focus of some of the earliest and most prominent organizations associated with effective altruism. Charity evaluator GiveWell was founded by Holden Karnofsky and Elie Hassenfeld in 2007 to address poverty, where they believe additional donations to be the most impactful. GiveWell's leading recommendations include: malaria prevention charities Against Malaria Foundation and Malaria Consortium, deworming charities Schistosomiasis Control Initiative and Deworm the World Initiative, and GiveDirectly for direct cash transfers to beneficiaries. The organization The Life You Can Save, which originated from Singer's book The Life You Can Save, works to alleviate global poverty by promoting evidence-backed charities, conducting philanthropy education, and changing the culture of giving in affluent countries.

Animal welfare

Improving animal welfare has been a focus of many effective altruists. Singer and Animal Charity Evaluators (ACE) have argued that effective altruists should prioritize changes to factory farming over pet welfare. Over 80 billion land animals are slaughtered and between 1 and 2.7 trillion individual fish are killed each year for human consumption.

A number of non-profit organizations have been established that adopt an effective altruist approach toward animal welfare. ACE evaluates animal charities based on their cost-effectiveness and transparency, particularly those tackling factory farming. Faunalytics focuses on animal welfare research. Other animal initiatives affiliated with effective altruism include Animal Ethics' and Wild Animal Initiative's work on wild animal suffering, addressing farm animal suffering with cultured meat, and increasing concern for all kinds of animals. The Sentience Institute is a think tank founded to expand the moral circle to other sentient beings.

Long-term future and global catastrophic risks

The ethical stance of longtermism, emphasizing the importance of positively influencing the long-term future, developed closely in relation to effective altruism. Longtermism argues that "distance in time is like distance in space", suggesting that the welfare of future individuals matters as much as the welfare of currently existing individuals. Given the potentially extremely high number of individuals that could exist in the future, longtermists seek to decrease the probability that an existential catastrophe irreversibly ruins it. Toby Ord has stated that "the people of the future may be even more powerless to protect themselves from the risks we impose than the dispossessed of our own time".

Existential risks, such as dangers associated with biotechnology and advanced artificial intelligence, are often highlighted and the subject of active research. Existential risks have such huge impacts that achieving a very small change in such a risk—say a 0.0001-percent reduction—"might be worth more than saving a billion people today", reported Gideon Lewis-Kraus in 2022, but he added that nobody in the EA community openly endorses such an extreme conclusion.

Organizations that work actively on research and advocacy for improving the long-term future, and have connections with the effective altruism community, are the Future of Humanity Institute at the University of Oxford, the Centre for the Study of Existential Risk at the University of Cambridge, and the Future of Life Institute. In addition, the Machine Intelligence Research Institute is focused on the more narrow mission of managing advanced artificial intelligence.

S-risks

Some effective altruists focus on reducing risks of astronomical suffering (s-risks). S-risks are a particularly severe type of existential risk due to their potential scope and severity, surpassing even human extinction in negative impact. Efforts to mitigate these risks include research and advocacy by organizations like the Center on Long-Term Risk, which explores strategies to avoid large-scale suffering. S-risks could arise from a long-term neglect for the welfare of some types of sentient beings. Another suggested scenario involves repressive totalitarian regimes that would become irreversibly stable due to advanced technology.

Approaches

Effective altruists pursue different approaches to doing good, such as donating to effective charitable organizations, using their career to make more money for donations or directly contributing their labor, and starting new non-profit or for-profit ventures.

Donation

Financial donation

Many effective altruists engage in charitable donation. Some believe it is a moral duty to alleviate suffering through donations if other possible uses of those funds do not offer comparable benefits to oneself. Some lead a frugal lifestyle in order to donate more.

Giving What We Can (GWWC) is an organization whose members pledge to donate at least 10% of their future income to the causes that they believe are the most effective. GWWC was founded in 2009 by Toby Ord, who lives on £18,000 ($27,000) per year and donates the balance of his income. In 2020, Ord said that people had donated over $100 million to date through the GWWC pledge.

Founders Pledge is a similar initiative, founded out of the non-profit Founders Forum for Good, whereby entrepreneurs make a legally binding commitment to donate a percentage of their personal proceeds to charity in the event that they sell their business. As of April 2024, nearly 1,900 entrepreneurs had pledged around $10 billion and nearly $1.1 billion had been donated.

Organ donation

EA has been used to argue that humans should donate organs, whilst alive or after death, and some effective altruists do.

Career choice

Effective altruists often consider using their career to do good, both by direct service and indirectly through their consumption, investment, and donation decisions. 80,000 Hours is an organization that conducts research and gives advice on which careers have the largest positive impact.

Earning to give

Earning to give involves deliberately pursuing a high-earning career for the purpose of donating a significant portion of earned income, typically because of a desire to do effective altruism. Advocates of earning to give contend that maximizing the amount one can donate to charity is an important consideration for individuals when deciding what career to pursue.

Founding effective organizations

Some effective altruists start non-profit or for-profit organizations to implement cost-effective ways of doing good. On the non-profit side, for example, Michael Kremer and Rachel Glennerster conducted randomized controlled trials in Kenya to find out the best way to improve students' test scores. They tried new textbooks and flip charts, as well as smaller class sizes, but found that the only intervention that raised school attendance was treating intestinal worms in children. Based on their findings, they started the Deworm the World Initiative. From 2013 to August 2022, GiveWell designated Deworm the World (now run by nonprofit Evidence Action) as a top charity based on their assessment that mass deworming is "generally highly cost-effective"; however, there is substantial uncertainty about the benefits of mass deworming programs, with some studies finding long-term effects and others not. The Happier Lives Institute conducts research on the effectiveness of cognitive behavioral therapy (CBT) in developing countries; Canopie develops an app that provides cognitive behavioural therapy to women who are expecting or postpartum; Giving Green analyzes and ranks climate interventions for effectiveness; the Fish Welfare Initiative works on improving animal welfare in fishing and aquaculture; and the Lead Exposure Elimination Project works on reducing lead poisoning in developing countries.

Incremental versus systemic change

While much of the initial focus of effective altruism was on direct strategies such as health interventions and cash transfers, more systematic social, economic, and political reforms have also attracted attention. Mathew Snow in Jacobin wrote that effective altruism "implores individuals to use their money to procure necessities for those who desperately need them, but says nothing about the system that determines how those necessities are produced and distributed in the first place". Philosopher Amia Srinivasan criticized William MacAskill's Doing Good Better for a perceived lack of coverage of global inequality and oppression, while noting that effective altruism is in principle open to whichever means of doing good is most effective, including political advocacy aimed at systemic change. Srinivasan said, "Effective altruism has so far been a rather homogeneous movement of middle-class white men fighting poverty through largely conventional means, but it is at least in theory a broad church." Judith Lichtenberg in The New Republic said that effective altruists "neglect the kind of structural and political change that is ultimately necessary". An article in The Ecologist published in 2016 argued that effective altruism is an apolitical attempt to solve political problems, describing the concept as "pseudo-scientific". The Ethiopian-American AI scientist Timnit Gebru has condemned effective altruists "for acting as though their concerns are above structural issues as racism and colonialism", as Gideon Lewis-Kraus summarized her views in 2022.

Philosophers such as Susan Dwyer, Joshua Stein, and Olúfẹ́mi O. Táíwò have criticized effective altruism for furthering the disproportionate influence of wealthy individuals in domains that should be the responsibility of democratic governments and organizations.

Arguments have been made that movements focused on systemic or institutional change, for example democratization, are compatible with effective altruism. Philosopher Elizabeth Ashford posits that people are obligated to both donate to effective aid charities and to reform the structures that are responsible for poverty. Open Philanthropy has given grants for progressive advocacy work in areas such as criminal justice, economic stabilization, and housing reform, despite pegging the success of political reform as being "highly uncertain".

Psychological research

Researchers in psychology and related fields have identified psychological barriers to effective altruism that can cause people to choose less effective options when they engage in altruistic activities such as charitable giving.

Other criticism and controversies

Although the movement's original leaders were associated with frugal lifestyles, the arrival of big donors, including Bankman-Fried, led to more spending and opulence, which seemed incongruous with the movement's espoused values. In 2022, Effective Ventures Foundation purchased the estate of Wytham Abbey for the purpose of running workshops, but put it up for sale in 2024.

Timnit Gebru claimed that effective altruism has acted to overrule any other concerns regarding AI ethics (e.g. deepfake porn, algorithmic bias), in the name of either preventing or controlling artificial general intelligence. She and Émile P. Torres further assert that the movement belongs to a network of interconnected movements they've termed TESCREAL, which they contend serves as intellectual justification for wealthy donors to shape humanity's future.

Sam Bankman-Fried

Sam Bankman-Fried, the eventual founder of the cryptocurrency exchange FTX, had lunch with philosopher William MacAskill in 2012, while an undergraduate at MIT, in which MacAskill encouraged him to go earn money and donate it rather than volunteering his time for causes. Bankman-Fried went on to a career in investing and around 2019 became more publicly associated with the effective altruism movement, announcing that his goal was to "donate as much as [he] can". Bankman-Fried founded the FTX Future Fund, which brought on MacAskill as one of its advisers, and which made a $13.9 million grant to the Centre for Effective Altruism, where MacAskill holds a board role.

After the collapse of FTX in late 2022, the movement underwent additional public scrutiny. Bankman-Fried's relationship with effective altruism damaged the movement's reputation. Some journalists asked whether the effective altruist movement was complicit in FTX's collapse, because it was convenient for leaders to overlook specific warnings about Bankman-Fried's behavior or questionable ethics at the trading firm Alameda. Fortune's crypto editor Jeff John Roberts said that "Bankman-Fried and his cronies professed devotion to 'EA,' but all their high-minded words turned out to be flimflam to justify robbing people".

MacAskill condemned Bankman-Fried's actions, saying that effective altruism emphasizes integrity.

Philosopher Leif Wenar argued that Bankman-Fried's conduct typified much of the movement by focusing on positive impacts and expected value without adequately weighing risk and harm from philanthropy. He argued that the FTX case is not separable, as some in the EA community maintained, from the assumptions and reasoning that molded effective altruism as a philosophy in the first place and that Wenar considered to be simplistic.

Sexual misconduct accusations

Critiques also arose concerning issues of exclusion and sexual harassment. A 2023 Bloomberg article featured some members of the effective altruism community who alleged that the philosophy masked a culture of predatory behavior. In a 2023 Time magazine article, seven women reported misconduct and controversy in the effective altruism movement. They accused men within the movement, typically in the Bay Area, of using their power to groom younger women for polyamorous sexual relationships. The accusers argued that the majority male demographic and the polyamorous subculture combined to create an environment where sexual misconduct was tolerated, excused or rationalized away. In response to the accusations, the Centre for Effective Altruism told Time that some of the alleged perpetrators had already been banned from the organization and said it would investigate new claims. The organization also argued that it is challenging to discern to what extent sexual misconduct issues were specific to the effective altruism community or reflective of broader societal misogyny.

Other prominent people

Businessman Elon Musk spoke at an effective altruism conference in 2015. He described MacAskill's 2022 book What We Owe the Future as "a close match for my philosophy", but has not officially joined the movement. An article in The Chronicle of Philanthropy argued that the record of Musk's substantive alignment with effective altruism was "choppy", and Bloomberg News noted that his 2021 charitable contributions showed "few obvious signs that effective altruism... impacted Musk's giving."

Actor Joseph Gordon-Levitt has publicly stated he would like to bring the ideas of effective altruism to a broader audience.

Sam Altman, the CEO of OpenAI, has called effective altruism an "incredibly flawed movement" that shows "very weird emergent behavior". Effective altruist concerns about AI risk were present among the OpenAI board members who fired Altman in November 2023; he was later reinstated as CEO, and the board's membership has since changed.

Quantum cognition


Quantum cognition uses the mathematical formalism of quantum probability theory to model psychological phenomena that classical probability theory fails to capture. The field focuses on modeling phenomena in cognitive science that have resisted traditional techniques or where traditional models seem to have reached a barrier (e.g., human memory), and on modeling preferences in decision theory that seem paradoxical from a traditional rational point of view (e.g., preference reversals). Since the quantum-theoretic framework is used for modeling purposes, the identification of quantum structures in cognitive phenomena does not presuppose the existence of microscopic quantum processes in the human brain.

Quantum cognition can be applied to model cognitive phenomena such as information processing by the human brain, language, decision making, human memory, concepts and conceptual reasoning, human judgment, and perception.

Challenges for classical probability theory

Classical probability theory is a rational approach to inference, but it does not easily explain some observations of human inference in psychology. Cases where quantum probability theory has advantages include the conjunction fallacy, the disjunction fallacy, failures of the sure-thing principle, and question-order bias in judgment.

Conjunction fallacy

If participants in a psychology experiment are told about "Linda", described as looking like a feminist but not like a bank teller, and are then asked to rank the probability that Linda is a feminist, a bank teller, or a feminist and a bank teller, they respond with values indicating that "feminist and bank teller" is more probable than "bank teller" alone. Rational classical probability theory makes the opposite, incorrect prediction: it expects humans to rank the conjunction less probable than the bank teller option. Many variations of this experiment demonstrate that the fallacy reflects human cognition in this case and is not an artifact of one particular presentation.

Quantum cognition models this probability-estimation scenario with quantum probability theory, which always ranks the sequential probability, P(feminist, then bank teller), greater than the direct probability, P(bank teller). The idea is that a person's understanding of "bank teller" is affected by the context of the question involving "feminist". The two questions are "incompatible": to treat them with classical theory would require separate reasoning steps.
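The projection account above can be illustrated with a small numerical sketch. In a two-dimensional Hilbert space, the "bank teller" and "feminist" questions correspond to projections onto different axes; the angles below are illustrative choices, not values fitted to experimental data:

```python
import numpy as np

# A minimal 2-D Hilbert-space sketch of the Linda problem (illustrative angles).
def basis_vector(angle_rad):
    return np.array([np.cos(angle_rad), np.sin(angle_rad)])

psi = basis_vector(np.deg2rad(85))   # Linda: almost orthogonal to "bank teller"
f   = basis_vector(np.deg2rad(40))   # "feminist" axis, incompatible with "bank teller"
b   = basis_vector(0.0)              # "bank teller" axis (standard basis)

p_direct     = np.dot(b, psi) ** 2                          # P(bank teller)
p_sequential = (np.dot(f, psi) ** 2) * (np.dot(b, f) ** 2)  # P(feminist, then bank teller)

print(f"P(bank teller)             = {p_direct:.3f}")
print(f"P(feminist -> bank teller) = {p_sequential:.3f}")
assert p_sequential > p_direct  # the model reproduces the conjunction ranking
```

Because the "feminist" axis lies between the state and the "bank teller" axis, answering the feminist question first moves the state closer to "bank teller", so the sequential probability exceeds the direct one.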

Main subjects of research

Quantum-like models of information processing

The quantum cognition concept is based on the observation that various cognitive phenomena are more adequately described by quantum probability theory than by the classical probability theory (see examples below). Thus, the quantum formalism is considered an operational formalism that describes non-classical processing of probabilistic data.

Here, contextuality is the key word (see the monograph of Khrennikov for detailed representation of this viewpoint). Quantum mechanics is fundamentally contextual. Quantum systems do not have objective properties which can be defined independently of measurement context. As has been pointed out by Niels Bohr, the whole experimental arrangement must be taken into account. Contextuality implies existence of incompatible mental variables, violation of the classical law of total probability, and constructive or destructive interference effects. Thus, the quantum cognition approach can be considered an attempt to formalize contextuality of mental processes, by using the mathematical apparatus of quantum mechanics.

Decision making

Suppose a person is given an opportunity to play two rounds of the following gamble: a coin toss will determine whether the subject wins $200 or loses $100. Suppose the subject has decided to play the first round, and does so. Some subjects are then given the result (win or lose) of the first round, while other subjects are not yet given any information about the results. The experimenter then asks whether the subject wishes to play the second round. Performing this experiment with real subjects gives the following results:

  1. When subjects believe they won the first round, the majority of subjects choose to play again on the second round.
  2. When subjects believe they lost the first round, the majority of subjects choose to play again on the second round.

Given these two separate choices, according to the sure thing principle of rational decision theory, they should also play the second round even if they don't know or think about the outcome of the first round. But, experimentally, when subjects are not told the results of the first round, the majority of them decline to play a second round. This finding violates the law of total probability, yet it can be explained as a quantum interference effect in a manner similar to the explanation for the results from double-slit experiment in quantum physics. Similar violations of the sure-thing principle are seen in empirical studies of the Prisoner's Dilemma and have likewise been modeled in terms of quantum interference.
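The interference account can be sketched numerically: in such a model, choice probabilities for the known outcomes come from squared amplitudes, while the unknown-outcome condition adds amplitudes before squaring. The amplitudes and phase below are illustrative assumptions, not fitted values:

```python
import numpy as np

# Hypothetical amplitudes for choosing "play" after a won or lost first round.
a_win, a_lose = 0.8, 0.75          # |a|^2 gives the choice probability when the outcome is known
phase = 2.5                        # relative phase between the win and lose branches (assumed)

p_play_given_win  = a_win ** 2     # 0.64   -> majority plays again
p_play_given_lose = a_lose ** 2    # 0.5625 -> majority plays again

# Classical law of total probability with a fair coin:
p_classical = 0.5 * (p_play_given_win + p_play_given_lose)

# Quantum model: the unknown outcome is a superposition, so amplitudes add first.
amplitude = (a_win + a_lose * np.exp(1j * phase)) / np.sqrt(2)
p_quantum = abs(amplitude) ** 2    # = p_classical + a_win * a_lose * cos(phase)

print(f"classical prediction: {p_classical:.3f}")
print(f"with interference:    {p_quantum:.3f}")
assert p_quantum < p_play_given_win and p_quantum < p_play_given_lose
```

With a sufficiently negative interference term, the model reproduces the observed pattern: a majority plays after either known outcome, yet a minority plays when the outcome is unknown.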

The above deviations from classical rational expectations in agents' decisions under uncertainty produce well-known paradoxes in behavioral economics: the Allais, Ellsberg and Machina paradoxes. These deviations can be explained if one assumes that the overall conceptual landscape influences the subject's choice in a way that is neither predictable nor controllable. A decision process is thus an intrinsically contextual process; hence it cannot be modeled in a single Kolmogorovian probability space, which justifies the employment of quantum probability models in decision theory. More explicitly, the paradoxical situations above can be represented in a unified Hilbert space formalism where human behavior under uncertainty is explained in terms of genuine quantum aspects: superposition, interference, contextuality and incompatibility.

For automated decision making, quantum decision trees have a different structure from classical decision trees, and data can be analyzed to see whether a quantum decision tree model fits the data better.

Human probability judgments

Quantum probability provides a new way to explain human probability judgment errors, including the conjunction and disjunction errors. A conjunction error occurs when a person judges the probability of the conjunction of a likely event L and an unlikely event U to be greater than the probability of U alone; a disjunction error occurs when a person judges the probability of L to be greater than the probability of the disjunction of L and U. Quantum probability theory is a generalization of Bayesian probability theory because it is based on a set of von Neumann axioms that relax some of the classic Kolmogorov axioms. The quantum model introduces a new fundamental concept to cognition—the compatibility versus incompatibility of questions and the effect this can have on the sequential order of judgments. Quantum probability provides a simple account of conjunction and disjunction errors, as well as many other findings such as order effects on probability judgments.
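The role of incompatibility can be made concrete with non-commuting projectors: when two questions are represented by projections onto different axes, the probability of answering "yes" to both depends on the order in which they are asked. The state and angles below are illustrative assumptions:

```python
import numpy as np

def projector(angle_rad):
    """Projector onto the 1-D 'yes' subspace at the given angle."""
    v = np.array([np.cos(angle_rad), np.sin(angle_rad)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])     # initial belief state (assumed)
A = projector(np.deg2rad(30))  # question A: "yes" subspace
B = projector(np.deg2rad(70))  # question B, incompatible with A (projectors don't commute)

p_A_then_B = np.linalg.norm(B @ A @ psi) ** 2  # answer "yes" to A, then "yes" to B
p_B_then_A = np.linalg.norm(A @ B @ psi) ** 2  # same questions, reversed order

print(f"P(A then B) = {p_A_then_B:.3f}")
print(f"P(B then A) = {p_B_then_A:.3f}")
assert not np.isclose(p_A_then_B, p_B_then_A)  # incompatible questions -> order effect
```

For compatible questions (commuting projectors) the two orders would give the same probability; the order effect arises only from incompatibility.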

The liar paradox - The contextual influence of a human subject on the truth behavior of a cognitive entity is explicitly exhibited by the so-called liar paradox, that is, the truth value of a sentence like "this sentence is false". One can show that the true-false state of this paradox is represented in a complex Hilbert space, while the typical oscillations between true and false are dynamically described by the Schrödinger equation.
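A minimal sketch of this dynamical picture represents "true" and "false" as basis states of a two-level system and lets a Hamiltonian that flips between them (here the Pauli matrix σx, an illustrative choice) drive the oscillation:

```python
import numpy as np

sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)  # flips |true> <-> |false>

def evolve(t):
    # e^{-i*sigma_x*t} = cos(t) I - i sin(t) sigma_x  (closed form for a Pauli matrix)
    return np.cos(t) * np.eye(2) - 1j * np.sin(t) * sigma_x

true_state = np.array([1.0, 0.0], dtype=complex)
for t in [0.0, np.pi / 4, np.pi / 2]:
    psi = evolve(t) @ true_state
    p_true = abs(psi[0]) ** 2  # oscillates as cos^2(t): true -> undecided -> false
    print(f"t = {t:.2f}: P(true) = {p_true:.2f}")
```

The probability of judging the sentence true oscillates as cos²(t), passing through an equal superposition on its way between "true" and "false".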

Knowledge representation

Concepts are basic cognitive phenomena, which provide the content for inference, explanation, and language understanding. Cognitive psychology has researched different approaches for understanding concepts, including exemplars, prototypes, and neural networks; different fundamental problems have been identified, such as the experimentally tested non-classical behavior of the conjunction and disjunction of concepts, more specifically the Pet-Fish problem or guppy effect, and the overextension and underextension of typicality and membership weight for conjunction and disjunction. By and large, quantum cognition has drawn on quantum theory in three ways to model concepts.

  1. Exploit the contextuality of quantum theory to account for the contextuality of concepts in cognition and language and the phenomenon of emergent properties when concepts combine
  2. Use quantum entanglement to model the semantics of concept combinations in a non-decompositional way, and to account for the emergent properties/associates/inferences in relation to concept combinations
  3. Use quantum superposition to account for the emergence of a new concept when concepts are combined, and as a consequence put forward an explanatory model for the Pet-Fish problem situation, and the overextension and underextension of membership weights for the conjunction and disjunction of concepts.

The large amount of data collected by Hampton on the combination of two concepts can be modeled in a specific quantum-theoretic framework in Fock space, where the observed deviations from classical (fuzzy) set theory, the above-mentioned over- and under-extension of membership weights, are explained in terms of contextual interactions, superposition, interference, entanglement and emergence. Moreover, a cognitive test on a specific concept combination has directly revealed, through the violation of Bell's inequalities, quantum entanglement between the component concepts.

Semantic analysis and information retrieval

This line of research had a deep impact on the understanding and initial development of a formalism for obtaining semantic information when dealing with concepts, their combinations and variable contexts in a corpus of unstructured documents. This conundrum of natural language processing (NLP) and information retrieval (IR) on the web – and databases in general – can be addressed using the mathematical formalism of quantum theory. As basic steps, (a) K. Van Rijsbergen introduced a quantum structure approach to IR, (b) Widdows and Peters utilised a quantum logical negation for a concrete search system, and (c) Aerts and Czachor identified quantum structure in semantic space theories, such as latent semantic analysis. Since then, the employment of techniques and procedures induced from the mathematical formalisms of quantum theory – Hilbert space, quantum logic and probability, non-commutative algebras, etc. – in fields such as IR and NLP has produced significant results.
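The quantum logical negation used by Widdows and Peters can be sketched concretely: the query "a NOT b" is modeled by projecting the vector for a onto the subspace orthogonal to b. The toy three-dimensional term vectors below are illustrative; real systems use high-dimensional spaces built from corpus statistics:

```python
import numpy as np

def quantum_not(a, b):
    """Project term vector a onto the subspace orthogonal to b ('a NOT b')."""
    b_hat = b / np.linalg.norm(b)
    return a - np.dot(a, b_hat) * b_hat

# Toy term vectors (hypothetical components, not corpus-derived).
suit_clothing = np.array([0.9, 0.1, 0.0])   # "suit" in its clothing sense
suit_legal    = np.array([0.1, 0.0, 0.9])   # "suit" in its lawsuit sense
query         = suit_clothing + suit_legal  # ambiguous query vector for "suit"

disambiguated = quantum_not(query, suit_legal)
print(np.dot(disambiguated, suit_legal))    # ~0: the legal sense is projected out
assert abs(np.dot(disambiguated, suit_legal)) < 1e-9
```

Unlike Boolean NOT, which removes whole documents containing the unwanted term, this vector negation removes an entire direction of meaning while preserving the query's remaining senses.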

History

Ideas for applying the formalisms of quantum theory to cognition first appeared in the 1990s, advanced by Diederik Aerts and his collaborators Jan Broekaert, Sonja Smets and Liane Gabora, as well as by Harald Atmanspacher, Robert Bordley, and Andrei Khrennikov. A special issue on Quantum Cognition and Decision appeared in the Journal of Mathematical Psychology (2009, vol. 53), which planted a flag for the field. A few books related to quantum cognition have been published, including those by Khrennikov (2004, 2010), Ivancivic and Ivancivic (2010), Busemeyer and Bruza (2012), and E. Conte (2012). The first Quantum Interaction workshop was held at Stanford in 2007, organized by Peter Bruza, William Lawless, C. J. van Rijsbergen, and Don Sofge as part of the 2007 AAAI Spring Symposium Series. It was followed by workshops at Oxford in 2008, Saarbrücken in 2009, the 2010 AAAI Fall Symposium Series in Washington, D.C., Aberdeen in 2011, Paris in 2012, and Leicester in 2013. Tutorials were also presented annually from 2007 until 2013 at the annual meeting of the Cognitive Science Society. A special issue on quantum models of cognition appeared in 2013 in the journal Topics in Cognitive Science.

Abiogenesis

Stages in the origin of life range from the well understood, such as the habitable Earth and the abiotic synthesis of simple molecules, to the largely unknown, like the derivation of the last universal common ancestor (LUCA) with its complex molecular functionalities.

Abiogenesis, at times referred to as biopoesis, is the natural process by which life arises from non-living matter, such as simple organic compounds. The prevailing scientific hypothesis is that the transition from non-living to living entities on Earth was not a single event, but a process of increasing complexity involving the formation of a habitable planet, the prebiotic synthesis of organic molecules, molecular self-replication, self-assembly, autocatalysis, and the emergence of cell membranes. The transition from non-life to life has not been observed experimentally, but many proposals have been made for different stages of the process.

The study of abiogenesis aims to determine how pre-life chemical reactions gave rise to life under conditions strikingly different from those on Earth today. It uses tools from biology and chemistry, attempting a synthesis of many sciences. Life functions through the chemistry of carbon and water, and builds on four chemical families: lipids for cell membranes, carbohydrates such as sugars, amino acids for protein metabolism, and the nucleic acids DNA and RNA for heredity. A theory of abiogenesis must explain the origins and interactions of these classes of molecules.

Many approaches investigate how self-replicating molecules came into existence. Researchers think that life descends from an RNA world, although other self-replicating and self-catalyzing molecules may have preceded RNA. Other approaches ("metabolism-first" hypotheses) focus on how catalysis on the early Earth might have provided the precursor molecules for self-replication. The 1952 Miller–Urey experiment demonstrated that amino acids can be synthesized from inorganic compounds under conditions like early Earth's. Subsequently, amino acids have been found in meteorites, comets, asteroids, and star-forming regions of space.

While the last universal common ancestor of all modern organisms (LUCA) existed millions of years after the origin of life, its study can guide research into early universal characteristics. A genomics approach has sought to characterize LUCA by identifying the genes shared by Archaea and Bacteria, major branches of life. It appears there are 60 proteins common to all life and 355 prokaryotic genes that trace to LUCA; their functions imply that LUCA was anaerobic with the Wood–Ljungdahl pathway, deriving energy by chemiosmosis, and used DNA, the genetic code, and ribosomes. Earlier cells might have had a leaky membrane and been powered by a naturally occurring proton gradient near a deep-sea white smoker hydrothermal vent; or, life may have originated inside the continental crust or in water at Earth's surface.

Earth is the only place known to harbor life, and geochemical and fossil evidence from it informs most studies of abiogenesis. The Earth formed 4.54 Gya, and the earliest evidence of life on Earth dates from at least 3.5 Gya, found in Western Australia. Fossil micro-organisms may have lived in hydrothermal vent precipitates from Quebec, soon after ocean formation during the Hadean.

Overview

NASA's 2015 strategy for astrobiology aimed to solve the puzzle of the origin of life – how a fully functioning living system could emerge from non-living components – through research on the prebiotic origin of life's chemicals, both in space and on planets, as well as the functioning of early biomolecules to catalyse reactions and support inheritance.

Life consists of reproduction with (heritable) variations. NASA defines life as "a self-sustaining chemical system capable of Darwinian evolution." Such a system is complex; the last universal common ancestor (LUCA), presumably a single-celled organism which lived some 4 billion years ago, already had hundreds of genes encoded in the DNA genetic code that is universal today. That in turn implies a suite of cellular machinery including messenger RNA, transfer RNA, and ribosomes to translate the code into proteins. Those proteins included enzymes to operate its anaerobic respiration via the Wood–Ljungdahl metabolic pathway, and a DNA polymerase to replicate its genetic material.

The challenge for abiogenesis (origin of life) researchers is to explain how such a complex and tightly interlinked system could develop by evolutionary steps, as at first sight all its parts are necessary to enable it to function. For example, a cell, whether the LUCA or in a modern organism, copies its DNA with the DNA polymerase enzyme, which is itself produced by translating the DNA polymerase gene in the DNA. Neither the enzyme nor the DNA can be produced without the other. The evolutionary process could have started with molecular self-replication, self-assembly such as of cell membranes, and autocatalysis via RNA ribozymes in an RNA world environment. The transition of non-life to life has not been observed experimentally.

The preconditions to the development of a living cell like the LUCA are known, though disputed in detail: a habitable world is formed with a supply of minerals and liquid water. Prebiotic synthesis creates a range of simple organic compounds, which are assembled into polymers such as proteins and RNA. On the other side, the process after the LUCA is readily understood: biological evolution caused the development of a wide range of species with varied forms and biochemical capabilities. However, the derivation of the LUCA from simple components is far from understood.

Although Earth remains the only place where life is known, the science of astrobiology seeks evidence of life on other planets. The 2015 NASA strategy on the origin of life aimed to solve the puzzle by identifying interactions, intermediary structures and functions, energy sources, and environmental factors that contributed to evolvable macromolecular systems, and mapping the chemical landscape of potential primordial informational polymers. The advent of such polymers was most likely a critical step in prebiotic chemical evolution. Those polymers derived, in turn, from simple organic compounds such as nucleobases, amino acids, and sugars, likely formed by reactions in the environment. A successful theory of the origin of life must explain how all these chemicals came into being.

Pre-1960s conceptual history

The Miller–Urey experiment was a synthesis of small organic molecules in a mixture of simple gases in a thermal gradient created by heating (right) and cooling (left) the mixture at the same time, with electrical discharges.

Spontaneous generation

One ancient view of the origin of life, from Aristotle until the 19th century, was of spontaneous generation. This held that "lower" animals such as insects were generated by decaying organic substances, and that life arose by chance. This was questioned from the 17th century, in works like Thomas Browne's Pseudodoxia Epidemica. In 1665, Robert Hooke published the first drawings of a microorganism. In 1676, Antonie van Leeuwenhoek drew and described microorganisms, probably protozoa and bacteria. Van Leeuwenhoek disagreed with spontaneous generation, and by the 1680s convinced himself, using experiments ranging from sealed and open meat incubation and the close study of insect reproduction, that the theory was incorrect. In 1668 Francesco Redi showed that no maggots appeared in meat when flies were prevented from laying eggs. By the middle of the 19th century, spontaneous generation was considered disproven.

Panspermia

Dating back to Anaxagoras in the 5th century BC, panspermia is the idea that life originated elsewhere in the universe and came to Earth. The modern version of panspermia holds that life may have been distributed to Earth by meteoroids, asteroids, comets or planetoids. This shifts the origin of life to another heavenly body. The advantage is that life is not required to have formed on each planet it occurs on, but in a more limited set of locations, and then spread about the galaxy to other star systems. There is some interest in the possibility that life originated on Mars and later transferred to Earth.

"A warm little pond": primordial soup

The idea that life originated from non-living matter in slow stages appeared in Herbert Spencer's 1864–1867 book Principles of Biology, and in William Turner Thiselton-Dyer's 1879 paper "On spontaneous generation and evolution". On 1 February 1871 Charles Darwin wrote about these publications to Joseph Hooker, and set out his own speculation that the original spark of life may have been in a "warm little pond, with all sorts of ammonia and phosphoric salts,—light, heat, electricity &c present, that a protein compound was chemically formed". Darwin explained that "at the present day such matter would be instantly devoured or absorbed, which would not have been the case before living creatures were formed."

Alexander Oparin in 1924 and J. B. S. Haldane in 1929 proposed that the earliest cells slowly self-organized from a primordial soup, the Oparin–Haldane hypothesis. Haldane suggested that the Earth's prebiotic oceans consisted of a "hot dilute soup" in which organic compounds could have formed. J. D. Bernal showed that such mechanisms could form most of the necessary molecules for life from inorganic precursors. In 1967, he suggested three "stages": the origin of biological monomers; the origin of biological polymers; and the evolution from molecules to cells.

Miller–Urey experiment

In 1952, Stanley Miller and Harold Urey carried out a chemical experiment to demonstrate how organic molecules could have formed spontaneously from inorganic precursors under prebiotic conditions like those posited by the Oparin–Haldane hypothesis. It used a highly reducing (lacking oxygen) mixture of gases—methane, ammonia, and hydrogen, with water vapor—to form organic monomers such as amino acids. Bernal said of the Miller–Urey experiment that "it is not enough to explain the formation of such molecules, what is necessary, is a physical-chemical explanation of the origins of these molecules that suggests the presence of suitable sources and sinks for free energy." However, current scientific consensus describes the primitive atmosphere as weakly reducing or neutral, diminishing the amount and variety of amino acids that could be produced. The addition of iron and carbonate minerals, present in early oceans, produces a diverse array of amino acids. Later work has focused on two other potential reducing environments: outer space and deep-sea hydrothermal vents.

Producing a habitable Earth

[Timeline graphic: evolutionary history of the universe, from −13 Gya to the present]

Early universe with first stars

Soon after the Big Bang, roughly 14 Gya, the only chemical elements present in the universe were hydrogen, helium, and lithium, the three lightest atoms in the periodic table. These elements gradually accreted and began orbiting in disks of gas and dust. Gravitational accretion of material at the hot and dense centers of these protoplanetary disks formed stars by the fusion of hydrogen. Early stars were massive and short-lived, producing heavier elements by stellar nucleosynthesis; this process proceeds up to iron-56, the most stable nucleus it can form. Still heavier elements were formed during supernovae at the end of a star's lifecycle. Carbon, currently the fourth most abundant element in the universe, was formed mainly in white dwarf stars. As these stars reached the end of their lifecycles, they ejected heavier elements, including carbon and oxygen, throughout the universe. These allowed for the formation of rocky planets. According to the nebular hypothesis, the Solar System began to form 4.6 Gya with the gravitational collapse of part of a giant molecular cloud. Most of the collapsing mass collected in the center, forming the Sun, while the rest flattened into a protoplanetary disk out of which the planets formed.

Emergence of Earth

The age of the Earth is 4.54 Gya, as found by radiometric dating of calcium-aluminium-rich inclusions in carbonaceous chondrite meteorites, the oldest material in the Solar System. Earth, during the Hadean eon (from its formation until 4.031 Gya), was at first inhospitable to life. During its formation, the Earth lost much of its initial mass, and so lacked the gravity to hold molecular hydrogen and the bulk of the original inert gases. Soon after initial accretion of Earth at 4.48 Gya, its collision with Theia, a hypothesised impactor, is thought to have created the ejected debris that eventually formed the Moon. This impact removed the Earth's primary atmosphere, leaving behind clouds of viscous silicates and carbon dioxide. This unstable atmosphere was short-lived, soon condensing to form the bulk silicate Earth, leaving behind an atmosphere largely consisting of water vapor, nitrogen, and carbon dioxide, with smaller amounts of carbon monoxide, hydrogen, and sulfur compounds. The solution of carbon dioxide in water is thought to have made the seas slightly acidic, with a pH of about 5.5.

Condensation to form liquid oceans is theorised to have occurred as early as the Moon-forming impact. This scenario is supported by the dating of 4.404 Gya zircon crystals with high δ18O values from metamorphosed quartzite of Mount Narryer in Western Australia. The Hadean atmosphere has been characterized as a "gigantic, productive outdoor chemical laboratory," similar to volcanic gases today which still support some abiotic chemistry. Despite the likely increased volcanism from early plate tectonics, the Earth may have been a predominantly water world between 4.4 and 4.3 Gya. It is debated whether crust was exposed above this ocean. Immediately after the Moon-forming impact, Earth likely had little if any continental crust, a turbulent atmosphere, and a hydrosphere subject to intense ultraviolet light from a T Tauri stage Sun. It was also affected by cosmic radiation, and continued asteroid and comet impacts.

The Late Heavy Bombardment hypothesis posits a period of intense impacts at 4.1 to 3.8 Gya, during the Hadean and early Archean eons. Originally, the Late Heavy Bombardment was thought to be a single cataclysmic impact event at 3.9 Gya; this would have had the potential to sterilise Earth by volatilising liquid oceans and blocking the sunlight needed for photosynthesis, delaying the earliest possible emergence of life. More recent research has questioned the intensity of the Late Heavy Bombardment and its potential for sterilisation. If it was not one giant impact but a period of raised impact rate, it would have had much less destructive power. The 3.9 Gya date arose from dating of Apollo mission sample returns collected mostly near the Imbrium Basin, biasing the age of recorded impacts. Impact modelling of the lunar surface suggests that rather than a cataclysmic event at 3.9 Gya, multiple small-scale, short-lived periods of bombardment likely occurred. Terrestrial data supports this idea, showing multiple periods of ejecta in the rock record both before and after the 3.9 Gya marker and suggesting that the early Earth was subject to continuous impacts that would have had much less potential to cause extinctions.

If life evolved in the ocean at depths of more than ten meters, it would have been shielded both from late impacts and the then high levels of ultraviolet radiation from the sun. Geothermically heated oceanic crust could have yielded far more organic compounds through deep hydrothermal vents than the Miller–Urey experiments indicated. The available energy is maximized at 100–150 °C, the temperatures at which hyperthermophilic bacteria and thermoacidophilic archaea live.

Earliest evidence of life

If banded iron formation rocks of Archaean age (like these from Australia) are fossilized stromatolites, they would be among the earliest life-forms.
Modern stromatolites in Shark Bay, created by photosynthetic cyanobacteria

Based on evidence from the geologic record, life most likely emerged on Earth between 3.48 and 4.32 Gya. In 2017, the earliest physical evidence of life was reported to consist of microbialites in the Nuvvuagittuq Greenstone Belt of Northern Quebec, in banded iron formation rocks at least 3.77 and possibly as old as 4.32 Gya. The micro-organisms could have lived within hydrothermal vent precipitates, soon after the 4.4 Gya formation of oceans during the Hadean. The microbes resemble modern hydrothermal vent bacteria, supporting the view that abiogenesis began in such an environment. Later research disputed this interpretation of the data, stating that the observations may be better explained by abiotic processes in silica-rich waters, "chemical gardens," circulating hydrothermal fluids, or volcanic ejecta.

Biogenic graphite has been found in 3.7 Gya metasedimentary rocks from southwestern Greenland and in microbial mat fossils from 3.49 Gya cherts in the Pilbara region of Western Australia. Evidence of early life in rocks from Akilia Island, near the Isua supracrustal belt in southwestern Greenland, dating to 3.7 Gya, has shown biogenic carbon isotopes. In other parts of the Isua supracrustal belt, graphite inclusions trapped within garnet crystals are connected to the other elements of life: oxygen, nitrogen, and possibly phosphorus in the form of phosphate, providing further evidence for life 3.7 Gya. In the Pilbara region of Western Australia, compelling evidence of early life was found in pyrite-bearing sandstone from a fossilized beach, with rounded tubular cells that oxidized sulfur by photosynthesis in the absence of oxygen. Carbon isotope ratios on graphite inclusions from the Jack Hills zircons suggest that life could have existed on Earth from 4.1 Gya. A 2024 study inferred LUCA's age as around 4.2 Gya (4.09–4.33 Gya) by analysing pre-LUCA gene duplicates, with calibration from fossil micro-organisms, placing LUCA much sooner after the origin of life than previously thought.

The Pilbara region of Western Australia contains the Dresser Formation with rocks 3.48 Gya, including layered structures called stromatolites. Their modern counterparts are created by photosynthetic micro-organisms including cyanobacteria. These lie within undeformed hydrothermal-sedimentary strata; their texture indicates a biogenic origin. Parts of the Dresser formation preserve hot springs on land, but other regions seem to have been shallow seas. A molecular clock analysis suggests the LUCA emerged prior to 3.9 Gya.

Producing molecules: prebiotic synthesis

All chemical elements derive from stellar nucleosynthesis except for hydrogen and some helium and lithium. Basic chemical ingredients of life – the carbon-hydrogen molecule (CH), the carbon-hydrogen positive ion (CH+) and the carbon ion (C+) – can be produced by ultraviolet light from stars. Complex molecules, including organic molecules, form naturally both in space and on planets. Organic molecules on the early Earth could have had either terrestrial origins, with organic molecule synthesis driven by impact shocks or by other energy sources, such as ultraviolet light, redox coupling, or electrical discharges; or extraterrestrial origins (pseudo-panspermia), with organic molecules formed in interstellar dust clouds raining down on to the planet.

Observed extraterrestrial organic molecules

An organic compound is a chemical whose molecules contain carbon. Carbon is abundant in the Sun, stars, comets, and in the atmospheres of most planets of the Solar System. Organic compounds are relatively common in space, formed by "factories of complex molecular synthesis" which occur in molecular clouds and circumstellar envelopes, and chemically evolve after reactions are initiated mostly by ionizing radiation. Purine and pyrimidine nucleobases including guanine, adenine, cytosine, uracil, and thymine, as well as sugars, have been found in meteorites. These could have provided the materials for DNA and RNA to form on the early Earth. The amino acid glycine was found in material ejected from comet Wild 2; it had earlier been detected in meteorites. Comets are encrusted with dark material, thought to be a tar-like organic substance formed from simple carbon compounds under ionizing radiation. A rain of material from comets could have brought such complex organic molecules to Earth. During the Late Heavy Bombardment, meteorites may have delivered up to five million tons of organic prebiotic elements to Earth per year. Currently 40,000 tons of cosmic dust falls to Earth each year.

Polycyclic aromatic hydrocarbons

The Cat's Paw Nebula is inside the Milky Way Galaxy, in the constellation Scorpius.
Green areas show regions where radiation from hot stars collided with large molecules and small dust grains called "polycyclic aromatic hydrocarbons" (PAHs), causing them to fluoresce. Spitzer Space Telescope, 2018.

Polycyclic aromatic hydrocarbons (PAHs) are the most common and abundant polyatomic molecules in the observable universe, and are a major store of carbon. They seem to have formed shortly after the Big Bang, and are associated with new stars and exoplanets. They are a likely constituent of Earth's primordial sea. PAHs have been detected in nebulae, in the interstellar medium, in comets, and in meteorites.

A star, HH 46-IR, resembling the Sun early in its life, is surrounded by a disk of material which contains molecules including cyanide compounds, hydrocarbons, and carbon monoxide. PAHs in the interstellar medium can be transformed through hydrogenation, oxygenation, and hydroxylation into more complex organic compounds used in living cells.

Nucleobases and nucleotides

Organic compounds introduced on Earth by interstellar dust particles can help to form complex molecules, thanks to their peculiar surface-catalytic activities. The RNA component uracil and related molecules, including xanthine, in the Murchison meteorite were likely formed extraterrestrially, as suggested by studies of ¹²C/¹³C isotopic ratios. NASA studies of meteorites suggest that all four DNA nucleobases (adenine, guanine and related organic molecules) have been formed in outer space. The cosmic dust permeating the universe contains complex organics ("amorphous organic solids with a mixed aromatic–aliphatic structure") that could be created rapidly by stars. Glycolaldehyde, a sugar molecule and RNA precursor, has been detected in regions of space including around protostars and on meteorites.

Laboratory synthesis

As early as the 1860s, experiments demonstrated that biologically relevant molecules can be produced from interaction of simple carbon sources with abundant inorganic catalysts. The spontaneous formation of complex polymers from abiotically generated monomers under the conditions posited by the "soup" theory is not straightforward. Besides the necessary basic organic monomers, compounds that would have prohibited the formation of polymers were also formed in high concentration during the Miller–Urey experiment and Joan Oró experiments. Biology uses essentially 20 amino acids for its coded protein enzymes, representing a very small subset of the structurally possible products. Since life tends to use whatever is available, an explanation is needed for why the set used is so small. Formamide is attractive as a medium that potentially provided a source of amino acid derivatives from simple aldehyde and nitrile feedstocks.

Sugars

The Breslow catalytic cycle for formaldehyde dimerization and C2-C6 sugar formation

Alexander Butlerov showed in 1861 that the formose reaction created sugars including tetroses, pentoses, and hexoses when formaldehyde is heated under basic conditions with divalent metal ions like calcium. R. Breslow proposed that the reaction was autocatalytic in 1959.

Nucleobases

Nucleobases, such as guanine and adenine, can be synthesized from simple carbon and nitrogen sources, such as hydrogen cyanide (HCN) and ammonia. On early Earth, HCN has been shown in modelling experiments to have likely been supplied via photochemical production in transient, highly reducing atmospheres (see Prebiotic atmosphere) following major impacts. Formamide, produced by the reaction of water and HCN, is ubiquitous and produces all four ribonucleotides when warmed with terrestrial minerals. It can be concentrated by the evaporation of water. HCN is poisonous only to aerobic organisms, which did not exist during the earliest phases of life's origin. It can contribute to chemical processes such as the synthesis of the amino acid glycine.

DNA and RNA components including uracil, cytosine and thymine can be synthesized under outer space conditions, using starting chemicals such as pyrimidine found in meteorites. Pyrimidine may have been formed in red giant stars, in interstellar dust and gas clouds, or may have been synthesized on Earth via precursors such as cyanoacetylene and other intermediates made available following early asteroid impacts. All four RNA bases may be synthesized from formamide in high-energy density events like extraterrestrial impacts. Several ribonucleotides for RNA formation have been synthesized in a laboratory environment which replicates prebiotic conditions via the autocatalytic formose reaction.

Other pathways for synthesizing bases from inorganic materials have been reported. Freezing temperatures assist the synthesis of purines, by concentrating key precursors such as HCN. However, while adenine and guanine require freezing conditions, cytosine and uracil may require boiling temperatures. Seven amino acids and eleven types of nucleobases formed in ice when ammonia and cyanide were left in a freezer for 25 years. S-triazines (alternative nucleobases), pyrimidines including cytosine and uracil, and adenine can be synthesized by subjecting a urea solution to freeze-thaw cycles under a reductive atmosphere with spark discharges. The unusual speed of these low-temperature reactions is due to eutectic freezing, which crowds impurities in microscopic pockets of liquid within the ice.

Peptides

Prebiotic peptide synthesis could have occurred by several routes. Some center on high temperature/concentration conditions in which condensation becomes energetically favorable, while others use plausible prebiotic condensing agents.

Experimental evidence for the formation of peptides in uniquely concentrated environments is bolstered by work suggesting that wet-dry cycles and the presence of specific salts can greatly increase spontaneous condensation of glycine into poly-glycine chains. Other work suggests that while mineral surfaces, such as those of pyrite, calcite, and rutile catalyze peptide condensation, they also catalyze their hydrolysis. The authors suggest that additional chemical activation or coupling would be necessary to produce peptides at sufficient concentrations. Thus, mineral surface catalysis, while important, is not sufficient alone for peptide synthesis.

Many prebiotically plausible condensing/activating agents have been identified, including the following: cyanamide, dicyanamide, dicyandiamide, diaminomaleonitrile, urea, trimetaphosphate, NaCl, CuCl2, (Ni,Fe)S, CO, carbonyl sulfide (COS), carbon disulfide (CS2), SO2, and diammonium phosphate (DAP).

A 2024 experiment used a sapphire substrate with a web of thin cracks under a heat flow, mimicking deep-ocean vents, to concentrate prebiotically-relevant building blocks from a dilute mixture by up to three orders of magnitude. This could help to create biopolymers such as peptides. A similar role has been suggested for clays.

The prebiotic synthesis of peptides from simpler molecules such as CO, NH3 and C, skipping the step of amino acid formation, is also very efficient.

Producing protocells

The three main structures composed of phospholipids form spontaneously by self-assembly in solution: the liposome (a closed bilayer), the micelle and the bilayer.

The largest unanswered question in evolution is how simple protocells first arose and differed in reproductive contribution to the following generation, thus initiating evolution. The lipid world theory postulates that the first self-replicating object was lipid-like. Phospholipids form lipid bilayers (as in cell membranes) in water while under agitation. These molecules were not present on early Earth, but other membrane-forming amphiphilic long-chain molecules were. These bodies may expand by insertion of additional lipids, and may spontaneously split into two offspring of similar size and composition. Lipid bodies may have provided sheltering envelopes for information storage, allowing the evolution of information-storing polymers like RNA. Only one or two types of vesicle-forming amphiphiles have been studied. There is an enormous number of possible arrangements of lipid bilayer membranes, and those with the best reproductive characteristics would have converged toward a hypercycle reaction, a positive feedback composed of two mutual catalysts represented by a membrane site and a specific compound trapped in the vesicle. Such site/compound pairs are transmissible to the daughter vesicles, leading to the emergence of distinct lineages of vesicles, subject to natural selection.

A protocell is a self-organized, self-ordered, spherical collection of lipids proposed as a stepping-stone to life. A functional protocell has (as of 2014) not yet been achieved in a laboratory setting. Self-assembled vesicles are essential components of primitive cells. The theory of classical irreversible thermodynamics treats self-assembly under a generalized chemical potential within the framework of dissipative systems. The second law of thermodynamics requires that overall entropy increases, yet life is distinguished by its great degree of organization. Therefore, a boundary is needed to separate ordered life processes from chaotic non-living matter.

Irene Chen and Jack W. Szostak suggest that elementary protocells can give rise to cellular behaviors including primitive forms of differential reproduction, competition, and energy storage. Competition for membrane molecules would favor stabilized membranes, suggesting a selective advantage for cross-linked fatty acids and even modern phospholipids. Such micro-encapsulation would allow for metabolism within the membrane and the exchange of small molecules, while retaining large biomolecules inside. Such a membrane is needed for a cell to create its own electrochemical gradient. Fatty acid vesicles in conditions relevant to alkaline hydrothermal vents can be stabilized by isoprenoids which are synthesized by the formose reaction; the advantages and disadvantages of isoprenoids incorporated within the lipid bilayer in different microenvironments might have led to the divergence of the membranes of archaea and bacteria.

Vesicles can undergo an evolutionary process under pressure cycling conditions. Simulating the systemic environment in tectonic fault zones within the Earth's crust, pressure cycling leads to the periodic formation of vesicles. Under the same conditions, random peptide chains are formed and selected for their ability to integrate into the vesicle membrane. A further selection of the vesicles for stability potentially leads to functional peptide structures, increasing the survival rate of the vesicles.

Producing biology

Energy and entropy

Life requires a loss of entropy, or disorder, as molecules organize themselves into living matter. At the same time, the emergence of life is associated with the formation of structures beyond a certain threshold of complexity. The emergence of life with increasing order and complexity does not contradict the second law of thermodynamics, which states that overall entropy never decreases, since a living organism creates order in some places (e.g. its living body) at the expense of an increase of entropy elsewhere (e.g. heat and waste production).
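This bookkeeping can be made explicit. For an organism exchanging heat with its environment, the second law constrains only the total entropy:

$$\Delta S_{\text{total}} = \Delta S_{\text{organism}} + \Delta S_{\text{surroundings}} \geq 0$$

so a local decrease in entropy ($\Delta S_{\text{organism}} < 0$) is permitted whenever the entropy exported to the surroundings, for instance as heat $Q$ released at temperature $T$ ($\Delta S_{\text{surroundings}} = Q/T$), is at least as large in magnitude.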

Multiple sources of energy were available for chemical reactions on the early Earth. Heat from geothermal processes is a standard energy source for chemistry. Other examples include sunlight, lightning, atmospheric entries of micro-meteorites, and implosion of bubbles in sea and ocean waves. This has been confirmed by experiments and simulations. Unfavorable reactions can be driven by highly favorable ones, as in the case of iron-sulfur chemistry. For example, this was probably important for carbon fixation. Carbon fixation by reaction of CO2 with H2S via iron-sulfur chemistry is favorable, and occurs at neutral pH and 100 °C. Iron-sulfur surfaces, which are abundant near hydrothermal vents, can drive the production of small amounts of amino acids and other biomolecules.

Chemiosmosis

ATP synthase uses the chemiosmotic proton gradient to power ATP synthesis through oxidative phosphorylation.

In 1961, Peter Mitchell proposed chemiosmosis as a cell's primary system of energy conversion. The mechanism, now ubiquitous in living cells, powers energy conversion in micro-organisms and in the mitochondria of eukaryotes, making it a likely candidate for early life. Mitochondria produce adenosine triphosphate (ATP), the energy currency of the cell used to drive cellular processes such as chemical syntheses. The mechanism of ATP synthesis involves a closed membrane in which the ATP synthase enzyme is embedded. The energy required to release strongly bound ATP has its origin in protons that move across the membrane. In modern cells, those proton movements are caused by the pumping of ions across the membrane, maintaining an electrochemical gradient. In the first organisms, the gradient could have been provided by the difference in chemical composition between the flow from a hydrothermal vent and the surrounding seawater, or perhaps meteoric quinones that were conducive to the development of chemiosmotic energy across lipid membranes if at a terrestrial origin.

Chemiosmotic coupling in the membranes of a mitochondrion

PAH world hypothesis

The PAH world hypothesis speculates that polycyclic aromatic hydrocarbons (PAHs), known to be abundant in the universe, including in comets, and assumed to be abundant in the primordial soup of the early Earth, played a major role in the origin of life by mediating the synthesis of RNA molecules, leading into the RNA world. As yet, however, the hypothesis is untested.

The RNA world

The RNA world hypothesis proposes that undirected polymerisation led to the emergence of ribozymes, and in turn to an RNA replicase.

The RNA world hypothesis describes an early Earth with self-replicating and catalytic RNA but no DNA or proteins. The concept was proposed in 1962 by Alexander Rich; the term was coined by Walter Gilbert in 1986. Many researchers concur that an RNA world must have preceded modern DNA-based life, though it may not have been the first form of life to exist. Some have proposed a timeline of more than 30 chemical events, involving only RNA, between the pre-RNA world and the period shortly before LUCA.

RNA is central to the translation process. Small RNAs can catalyze all the chemical groups and information transfers required for life. RNA both expresses and maintains genetic information in modern organisms; its components are easily synthesized under early Earth conditions. The structure of the ribosome has been called the "smoking gun", with a central core of RNA and no amino acid side chains within 18 Å of the active site that catalyzes peptide bond formation.

RNA replicase can function as both code and catalyst for further RNA replication, i.e. it can be autocatalytic. Some catalytic RNAs can link smaller RNA sequences together, enabling self-replication. Natural selection would then favor the proliferation of such autocatalytic sets, to which further functionalities could be added. Self-assembly of RNA may occur spontaneously in hydrothermal vents. A preliminary form of tRNA could have assembled into a replicator molecule. When this began to replicate, it may have been capable of the three mechanisms of Darwinian selection: heritability, variation of type, and differential reproductive output. Its fitness would likely have been a function of its intrinsic adaptive capabilities determined by its nucleotide sequence, and the availability of resources.

Possible precursors to protein synthesis include the synthesis of short peptide cofactors or the self-catalysing duplication of RNA. It is likely that the ancestral ribosome was composed entirely of RNA, although some roles have since been taken over by proteins. Major remaining questions on this topic include identifying the selective force for the evolution of the ribosome and determining how the genetic code arose.

From RNA to directed protein synthesis

In line with the RNA world hypothesis, much of modern biology's templated protein biosynthesis is done by RNA molecules—namely tRNAs and the ribosome (consisting of both protein and rRNA). The most central reaction of peptide bond synthesis is carried out by base catalysis by the 23S rRNA domain V. Di- and tripeptides can be synthesized with a system consisting of only aminoacyl phosphate adaptors and RNA guides. Aminoacylation ribozymes that can charge tRNAs with their cognate amino acids have been selected in in vitro experimentation.

Early functional peptides

The first proteins had to arise without a fully-fledged system of protein biosynthesis. Random sequence peptides would not have had biological function. Thus, significant study has gone into exploring how early functional proteins could have arisen from random sequences. Evidence on hydrolysis rates shows that abiotically plausible peptides likely contained significant "nearest-neighbor" biases. This could have had some effect on early protein sequence diversity. A search found that approximately 1 in 10¹¹ random sequences had ATP-binding function.

Phylogeny and LUCA

Starting with the work of Carl Woese from 1977, genomics studies have placed the last universal common ancestor (LUCA) of all modern life-forms between Bacteria and a clade formed by Archaea and Eukaryota in the phylogenetic tree of life. It lived over 4 Gya. A minority of studies have placed the LUCA in Bacteria, proposing that Archaea and Eukaryota are evolutionarily derived from within Eubacteria; Thomas Cavalier-Smith suggested in 2006 that the phenotypically diverse bacterial phylum Chloroflexota contained the LUCA.

In 2016, a set of 355 genes likely present in the LUCA was identified. A total of 6.1 million prokaryotic genes from Bacteria and Archaea were sequenced, identifying 355 protein clusters from among 286,514 that were probably common to the LUCA. The results suggest that the LUCA was anaerobic, with a Wood–Ljungdahl (reductive acetyl-CoA) pathway, nitrogen- and carbon-fixing, and thermophilic. Its cofactors suggest dependence upon an environment rich in hydrogen, carbon dioxide, iron, and transition metals. Its genetic material was probably DNA, requiring the 4-nucleotide genetic code, messenger RNA, transfer RNA, and ribosomes to translate the code into proteins such as enzymes. LUCA likely inhabited an anaerobic hydrothermal vent setting in a geochemically active environment. It was evidently already a complex organism, and must have had precursors; it was not the first living thing. The physiology of LUCA has been in dispute. Previous research identified 60 proteins common to all life. Metabolic reactions inferred in LUCA are the incomplete reverse Krebs cycle, gluconeogenesis, the pentose phosphate pathway, glycolysis, reductive amination, and transamination.

Suitable geological environments

A variety of geologic and environmental settings have been proposed for an origin of life. These theories are often in competition with one another as there are many views of prebiotic compound availability, geophysical setting, and early life characteristics. The first organism on Earth likely differed from LUCA. Between the first appearance of life and where all modern phylogenies began branching, an unknown amount of time passed, with unknown gene transfers, extinctions, and adaptation to environmental niches. Modern phylogenies provide more genetic evidence about LUCA than about its precursors.

Deep sea hydrothermal vents

Hot fluids

The earliest known life forms are putative fossilized microorganisms found in white smoker hydrothermal vent precipitates. They may have lived as early as 4.28 Gya (billion years ago), relatively soon after the formation of the oceans 4.41 Gya, and not long after the formation of the Earth 4.54 Gya.

Early micro-fossils may have come from a hot world of gases such as methane, ammonia, carbon dioxide, and hydrogen sulfide, toxic to much current life. Analysis of the tree of life places thermophilic and hyperthermophilic bacteria and archaea closest to the root, suggesting that life may have evolved in a hot environment. The deep sea or alkaline hydrothermal vent theory posits that life began at submarine hydrothermal vents. William Martin and Michael Russell have suggested that this could have been in metal-sulphide-walled compartments acting as precursors for cell walls.

These vents form where hydrogen-rich fluids emerge from below the sea floor as a result of the serpentinization of ultramafic olivine with seawater, and a pH interface with carbon dioxide-rich ocean water. The vents form a sustained chemical energy source derived from redox reactions, in which electron donors (molecular hydrogen) react with electron acceptors (carbon dioxide); see iron–sulfur world theory. These are exothermic reactions.

Chemiosmotic gradient

Proposed model of an early cell powered by external proton gradient near a deep-sea hydrothermal vent. As long as the membrane (or passive ion channels within it) is permeable to protons, the mechanism can function without ion pumps.

Russell demonstrated that alkaline vents create an abiogenic chemiosmotic proton motive force gradient, ideal for abiogenesis. Their microscopic compartments, composed of iron-sulfur minerals such as mackinawite, "provide a natural means of concentrating organic molecules" and endow these mineral cells with the catalytic properties envisaged by Günter Wächtershäuser. This movement of ions across the membrane depends on two factors:

  1. Diffusion force caused by concentration gradient—all particles including ions diffuse from higher concentration to lower.
  2. Electrostatic force caused by electrical potential gradient—cations like protons H+ diffuse down the electrical potential, anions in the opposite direction.

These two gradients together can be expressed as an electrochemical gradient, providing energy for abiogenic synthesis. The proton motive force measures the potential energy stored as proton and voltage gradients across a membrane (differences in proton concentration and electrical potential).
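In the standard textbook formulation, the two contributions combine into the proton motive force $\Delta p$ across the membrane:

$$\Delta p = \Delta\psi - \frac{2.303\,RT}{F}\,\Delta\mathrm{pH}$$

where $\Delta\psi$ is the transmembrane electrical potential, $\Delta\mathrm{pH}$ the pH difference across the membrane, $R$ the gas constant, $T$ the absolute temperature, and $F$ the Faraday constant; at 25 °C the chemical term contributes about 59 mV per unit of $\Delta\mathrm{pH}$.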

The surfaces of mineral particles inside deep-ocean hydrothermal vents have catalytic properties similar to those of enzymes, and can create simple organic molecules, such as methanol (CH3OH) and formic, acetic, and pyruvic acids out of the dissolved CO2 in the water, if driven by an applied voltage or by reaction with H2 or H2S.

Starting in 1985, researchers proposed that life arose at hydrothermal vents, that spontaneous chemistry in the Earth's crust driven by rock–water interactions at disequilibrium thermodynamically underpinned life's origin, and that the founding lineages of the archaea and bacteria were H2-dependent autotrophs that used CO2 as their terminal acceptor in energy metabolism. In 2016, Martin suggested that the LUCA "may have depended heavily on the geothermal energy of the vent to survive". That same year, RNA was produced in synthetic alkaline hydrothermal chimneys in a laboratory setting that replicated deep-sea vents. Pores at deep sea hydrothermal vents are suggested to have been occupied by membrane-bound compartments which promoted biochemical reactions. Intermediates of the Krebs cycle, gluconeogenesis, amino acid biosynthetic pathways, glycolysis, and the pentose phosphate pathway, as well as sugars like ribose and lipid precursors, can form non-enzymatically at conditions relevant to deep-sea alkaline hydrothermal vents.

If the deep marine hydrothermal setting was the site, then abiogenesis could have happened as early as 4.0–4.2 Gya. If life evolved in the ocean at depths of more than ten meters, it would have been shielded both from impacts and the then high levels of solar ultraviolet radiation. The available energy in hydrothermal vents is maximized at 100–150 °C, the temperatures at which hyperthermophilic bacteria and thermoacidophilic archaea live.

Arguments against a vent setting

Arguments against a hydrothermal origin of life state that hyperthermophily was a result of convergent evolution in bacteria and archaea, and that a mesophilic environment is more likely.

Production of prebiotic organic compounds at hydrothermal vents is estimated to be 10⁸ kg/yr. While a large amount of key prebiotic compounds, such as methane, are found at vents, they are in far lower concentrations than in a Miller–Urey experiment environment: for methane, the rate is 2–4 orders of magnitude lower.

Other counter-arguments include the inability to concentrate prebiotic materials, due to strong dilution by seawater. This open system cycles compounds through vent minerals, leaving little residence time to accumulate. All modern cells rely on phosphates and potassium for nucleotide backbone and protein formation respectively, making it likely that the first life forms shared these functions. These elements were not available in high quantities in the Archaean oceans, as both primarily come from the weathering of continental rocks on land, far from vents. Submarine hydrothermal vents are not conducive to condensation reactions needed for polymerisation of macromolecules.

An older argument was that key polymers were encapsulated in vesicles after condensation, which supposedly would not happen in saltwater. However, while salinity inhibits vesicle formation from low-diversity mixtures of fatty acids, vesicle formation from a broader, more realistic mix of fatty-acid and 1-alkanol species is more resilient.

Surface bodies of water

Surface bodies of water provide environments that dry out and rewet. Wet-dry cycles concentrate prebiotic compounds and enable the condensation reactions needed to polymerise macromolecules. Moreover, lakes and ponds receive detrital input from the weathering of continental apatite-containing rocks, the most common source of phosphates. The amount of exposed continental crust in the Hadean is unknown, but models of early ocean depths and rates of ocean island and continental crust growth make it plausible that there was exposed land. Another line of evidence for a surface start to life is the requirement of ultraviolet (UV) radiation for organism function. UV is necessary for the formation of the U+C nucleotide base pair by partial hydrolysis and nucleobase loss. Simultaneously, UV can be harmful and sterilising to life, especially for simple early lifeforms with little ability to repair radiation damage. Radiation levels from the young Sun were likely greater, and, with no ozone layer, harmful shortwave UV rays would have reached the surface of the Earth. For life to begin, a shielded environment with an influx from UV-exposed sources is necessary to both benefit from and be protected from UV. Shielding under ice, liquid water, mineral surfaces (e.g. clay), or regolith is possible in a range of surface water settings.

Hot springs

Most branching phylogenies are thermophilic or hyperthermophilic, making it possible that LUCA and preceding lifeforms were similarly thermophilic. Hot springs are formed from the heating of groundwater by geothermal activity. This intersection allows for influxes of material from deep-penetrating waters and from surface runoff that transports eroded continental sediments. Interconnected groundwater systems create a mechanism for the distribution of life to a wider area.

Mulkidjanian and co-authors argue that marine environments did not provide the ionic balance and composition universally found in cells, or the ions required by essential proteins and ribozymes, especially with respect to high K+/Na+ ratio, Mn2+, Zn2+ and phosphate concentrations. They argue that the only environments that do this are hot springs similar to ones at Kamchatka. Mineral deposits in these environments under an anoxic atmosphere would have suitable pH, contain precipitates of photocatalytic sulfide minerals that absorb harmful ultraviolet radiation, and have wet-dry cycles that concentrate substrate solutions enough for spontaneous formation of biopolymers created both by chemical reactions in the hydrothermal environment, and by exposure to UV light during transport from vents to adjacent pools. The hypothesized pre-biotic environments are similar to hydrothermal vents, with additional components that help explain peculiarities of the LUCA.

A phylogenomic and geochemical analysis of proteins plausibly traced to the LUCA shows that the ionic composition of its intracellular fluid was identical to that of hot springs. The LUCA was likely dependent upon synthesized organic matter for its growth. Experiments show that RNA-like polymers can be synthesized under wet-dry cycling and UV light exposure. These polymers were encapsulated in vesicles after condensation. Potential sources of organics at hot springs might have been transport by interplanetary dust particles, extraterrestrial projectiles, or atmospheric or geochemical synthesis. Hot springs could have been abundant in volcanic landmasses during the Hadean.

Temperate surface bodies of water

The hypothesis of a mesophilic start in surface bodies of water evolved from Darwin's concept of a 'warm little pond' and the Oparin-Haldane hypothesis. Freshwater bodies under temperate climates can accumulate prebiotic materials while providing suitable environmental conditions conducive to simple life forms. The Archaean climate is uncertain, but atmospheric reconstructions from geochemical proxies and models suggest that sufficient greenhouse gases were present to maintain surface temperatures between 0 and 40 °C. If so, temperatures were suitable for life to begin.

Evidence for mesophily from biomolecular studies includes Galtier's G+C nucleotide thermometer. G and C are more abundant in thermophiles due to the added stability of an additional hydrogen bond not present between A and T nucleotides. rRNA sequencing of modern lifeforms shows that LUCA's reconstructed G+C content was likely representative of moderate temperatures.
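As a simple illustration of the quantity behind the G+C thermometer (a sketch for exposition, not code from Galtier's analysis), the G+C fraction of a sequence can be computed directly:

```python
def gc_content(seq: str) -> float:
    """Return the fraction of G and C bases in a DNA/RNA sequence."""
    seq = seq.upper()
    # Each G-C pair carries three hydrogen bonds versus two for A-T (or A-U),
    # so GC-rich duplexes remain stable at higher temperatures.
    return sum(base in "GC" for base in seq) / len(seq)

# A GC-rich stretch, as would be expected in thermophile rRNA:
print(gc_content("GGCGCCAT"))  # 6 of 8 bases are G or C -> 0.75
```

Comparing such fractions across reconstructed ancestral rRNA sequences is, in essence, how moderate temperatures are inferred for LUCA.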

The diversity of thermophiles today could be a product of convergent evolution and horizontal gene transfer rather than an inherited trait from LUCA. The topoisomerase reverse gyrase, which coils DNA to stabilize it at high temperatures, is found exclusively in thermophiles and hyperthermophiles. This enzyme requires the complex molecule ATP to function. If the origin of life involved a simple organism that had not yet evolved a membrane, let alone ATP, the existence of reverse gyrase at that stage would be improbable. Moreover, phylogenetic studies show that reverse gyrase originated in archaea and was transferred to bacteria by horizontal gene transfer, implying it was not present in the LUCA.

Icy surface bodies of water

Cold-start theories presuppose large ice-covered regions. Stellar evolution models predict that the young Sun's luminosity was ≈25% weaker than it is today. Feulner states that although this significant decrease in solar energy would have formed an icy planet, there is strong evidence that liquid water was present, possibly sustained by a greenhouse effect. This would mean an early Earth with both liquid oceans and icy poles.

Meltwater pools from ice sheets and glaciers form another niche capable of wet-dry cycles. While surface pools would be exposed to intense UV radiation, bodies of water within and under ice would be shielded while remaining connected to exposed areas through cracks in the ice. Impact melting would allow freshwater and meteoritic input, creating prebiotic components. Near-seawater levels of sodium chloride destabilize fatty acid membrane self-assembly, making freshwater settings appealing for early membranous life.

Icy environments would trade the faster reaction rates of warm environments for increased stability and accumulation of larger polymers. Experiments simulating Europa-like conditions at ≈ −20 °C have synthesised amino acids and adenine, showing that Miller–Urey-type syntheses can occur at low temperatures. In an RNA world, the ribozyme would have had even more functions than in a later DNA–RNA–protein world. For RNA to function, it must be able to fold, a process hindered by temperatures above 30 °C. While RNA folding in psychrophilic organisms is slower, so is hydrolysis, so folding is more successful. Shorter nucleotides would not suffer from higher temperatures.

Inside the continental crust

An alternative geological environment has been proposed by the geologist Ulrich Schreiber and the physical chemist Christian Mayer: the continental crust. Tectonic fault zones could present a stable and well-protected environment for long-term prebiotic evolution. Inside these systems of cracks and cavities, water and carbon dioxide are the bulk solvents. Their phase state can vary between liquid, gaseous, and supercritical, depending on pressure and temperature. When two separate phases form (e.g., liquid water and supercritical carbon dioxide at depths of little more than 1 km), the system provides optimal conditions for phase-transfer reactions. Concurrently, the tectonic fault zones are supplied with a multitude of inorganic reactants (e.g., carbon monoxide, hydrogen, ammonia, hydrogen cyanide, nitrogen, and even phosphate from dissolved apatite) and simple organic molecules formed by hydrothermal chemistry (e.g., amino acids, long-chain amines, fatty acids, long-chain aldehydes).

At depths of around 1,000 m, parts of these fault zones reach temperature and pressure conditions near the transition point between the supercritical and gaseous states of carbon dioxide. Lipophilic organic molecules dissolve well in supercritical CO2 but poorly in the gas, so they accumulate in the supercritical phase and precipitate locally when it transitions. Periodic pressure variations, such as those caused by geysers or tidal influences, result in periodic phase transitions, keeping the local reaction environment in a constant non-equilibrium state. In the presence of amphiphilic compounds (such as the long-chain amines and fatty acids), subsequent generations of vesicles are formed that are constantly selected for their stability.
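As a rough illustration of why ~1,000 m is a special depth for CO2, one can compare conditions there against CO2's critical point (≈31 °C, ≈74 bar). The surface temperature and geothermal gradient below are generic assumed values for continental crust, not figures from Schreiber and Mayer's work:

```python
# Sketch: conditions near 1 km depth sit just above the CO2 critical point.
# Assumptions (illustrative, not from the source): hydrostatic pressure in a
# water-filled fault (~1 bar per 10 m plus 1 bar at the surface), 15 degC
# surface temperature, and a 25 degC/km geothermal gradient.

CO2_CRITICAL_T = 31.1   # critical temperature, degrees C
CO2_CRITICAL_P = 73.8   # critical pressure, bar

def conditions_at_depth(depth_m, surface_t=15.0, geotherm=25.0):
    """Return (temperature in degC, pressure in bar) at a given depth."""
    t = surface_t + geotherm * depth_m / 1000.0  # linear geothermal gradient
    p = 1.0 + depth_m / 10.0                     # hydrostatic water column
    return t, p

t, p = conditions_at_depth(1000)
supercritical = t > CO2_CRITICAL_T and p > CO2_CRITICAL_P
print(f"{t:.0f} degC, {p:.0f} bar, CO2 supercritical: {supercritical}")
# → 40 degC, 101 bar, CO2 supercritical: True
```

Because these conditions are only slightly above the critical point, modest pressure drops (e.g., from geyser activity) can push the CO2 across the phase boundary, which is the non-equilibrium cycling described above.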

Homochirality

Many biomolecules, such as L-glutamic acid, are asymmetric and occur in living systems in only one of the two possible forms, in the case of amino acids the left-handed form. Prebiotic chemistry would produce both forms, creating a puzzle for abiogenesis researchers.

Homochirality is the uniformity of materials composed of chiral (non-mirror-symmetric) units. Living organisms use molecules with the same chirality: with almost no exceptions, amino acids are left-handed while nucleotides and sugars are right-handed. Chiral molecules can be synthesized, but in the absence of a chiral source or a chiral catalyst, they are formed in a 50/50 (racemic) mixture of both forms. Non-racemic mixtures can arise from racemic materials by asymmetric physical laws such as the electroweak interaction or asymmetric environments such as circularly polarized light.

Once established, chirality would be selected for. A small bias in the population can be amplified by asymmetric autocatalysis, as in the Soai reaction, where a chiral molecule catalyzes its own production.

Homochirality may have started in outer space: on the Murchison meteorite, the left-handed amino acid L-alanine is more than twice as frequent as its right-handed D form, and L-glutamic acid is more than three times as abundant as its D counterpart. Amino acids from meteorites show a left-handed bias, whereas sugars show a right-handed bias, as in living organisms.
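The quoted ratios translate into enantiomeric excess, the standard measure of chiral imbalance; treating "more than twice" and "more than three times" as exactly 2:1 and 3:1 for illustration:

```python
def enantiomeric_excess(l, d):
    """ee = (L - D) / (L + D); 0 for a racemic mix, 1 for pure L."""
    return (l - d) / (l + d)

# Illustrative 2:1 and 3:1 ratios from the meteorite figures quoted above
print(f"alanine  (2:1): ee = {enantiomeric_excess(2, 1):.1%}")  # → 33.3%
print(f"glutamic (3:1): ee = {enantiomeric_excess(3, 1):.1%}")  # → 50.0%
```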
