
Friday, August 15, 2014

Atheism

From Wikipedia, the free encyclopedia
 
Atheism is, in a broad sense, the rejection of belief in the existence of deities.[1][2] In a narrower sense, atheism is specifically the position that there are no deities.[3][4][5] Most inclusively, atheism is the absence of belief that any deities exist.[4][5][6][7] Atheism is contrasted with theism,[8][9] which in its most general form is the belief that at least one deity exists.[9][10]

The term atheism originated from the Greek ἄθεος (atheos), meaning "without god(s)", used as a pejorative term applied to those thought to reject the gods worshiped by the larger society. With the spread of freethought, skeptical inquiry, and subsequent increase in criticism of religion, application of the term narrowed in scope. The first individuals to identify themselves using the word "atheist" lived in the 18th century.[11] Some ancient and modern religions are referred to as atheistic, as they either have no concepts of deities or deny a creator deity, yet still revere other god-like entities.

Arguments for atheism range from the philosophical to social and historical approaches. Rationales for not believing in any supernatural deity include the lack of empirical evidence,[12][13] the problem of evil, the argument from inconsistent revelations, rejection of concepts which cannot be falsified, and the argument from nonbelief.[12][14] Although some atheists have adopted secular philosophies,[15][16] there is no one ideology or set of behaviors to which all atheists adhere.[17] Many atheists hold that atheism is a more parsimonious worldview than theism, and therefore the burden of proof lies not on the atheist to disprove the existence of God, but on the theist to provide a rationale for theism.[18]

Since conceptions of atheism vary, determining how many atheists exist in the world today is difficult.[19] According to one 2007 estimate, atheists make up about 2.3% of the world's population, while a further 11.9% are nonreligious.[20] According to a 2012 global poll conducted by WIN/GIA, 13% of participants said they were atheists.[21] According to other studies, rates of atheism are among the highest in Western nations, though to varying degrees: the United States (4%)[22] and Canada (28%).[23] A 2010 Eurobarometer survey in the European Union (EU) reported that 20% of the EU population do not believe in "any sort of spirit, God or life force".[24]

Definitions and distinctions

A diagram showing the relationship between the definitions of weak/strong and implicit/explicit atheism.
Explicit strong/positive/hard atheists (in purple on the right) assert that "at least one deity exists" is a false statement.
Explicit weak/negative/soft atheists (in blue on the right) reject or eschew belief that any deities exist without actually asserting that "at least one deity exists" is a false statement.
Implicit weak/negative atheists (in blue on the left) would include people (such as young children and some agnostics) who do not believe in a deity, but have not explicitly rejected such belief.
(Sizes in the diagram are not meant to indicate relative sizes within a population.)

Writers disagree on how best to define and classify atheism,[25] contesting what supernatural entities it applies to, whether it is an assertion in its own right or merely the absence of one, and whether it requires a conscious, explicit rejection. Atheism has been regarded as compatible with agnosticism,[26][27][28][29][30][31][32] and has also been contrasted with it.[33][34][35] A variety of categories have been used to distinguish the different forms of atheism.

Range

Some of the ambiguity and controversy involved in defining atheism arises from the difficulty of reaching a consensus on the definitions of words like deity and god. The plurality of wildly different conceptions of God and deities leads to differing ideas regarding atheism's applicability. The ancient Romans accused Christians of being atheists for not worshiping the pagan deities. Gradually, this view fell into disfavor as theism came to be understood as encompassing belief in any divinity.[36]

With respect to the range of phenomena being rejected, atheism may counter anything from the existence of a deity, to the existence of any spiritual, supernatural, or transcendental concepts, such as those of Buddhism, Hinduism, Jainism and Taoism.[37]

Implicit vs. explicit

Definitions of atheism also vary in the degree of consideration a person must give to the idea of gods to be considered an atheist. Atheism has sometimes been defined to include the simple absence of belief that any deities exist. This broad definition would include newborns and other people who have not been exposed to theistic ideas. As far back as 1772, Baron d'Holbach said that "All children are born Atheists; they have no idea of God."[38] Similarly, George H. Smith (1979) suggested that: "The man who is unacquainted with theism is an atheist because he does not believe in a god. This category would also include the child with the conceptual capacity to grasp the issues involved, but who is still unaware of those issues. The fact that this child does not believe in god qualifies him as an atheist."[39] Smith coined the term implicit atheism to refer to "the absence of theistic belief without a conscious rejection of it" and explicit atheism to refer to the more common definition of conscious disbelief. Ernest Nagel disputes Smith's definition of atheism as merely "absence of theism", acknowledging only explicit atheism as true "atheism".[40]

Positive vs. negative

Philosophers such as Antony Flew[41] and Michael Martin[36] have contrasted positive (strong/hard) atheism with negative (weak/soft) atheism. Positive atheism is the explicit affirmation that gods do not exist. Negative atheism includes all other forms of non-theism. According to this categorization, anyone who is not a theist is either a negative or a positive atheist. The terms weak and strong are relatively recent, while the terms negative and positive atheism are of older origin, having been used (in slightly different ways) in the philosophical literature[41] and in Catholic apologetics.[42] Under this demarcation of atheism, most agnostics qualify as negative atheists.

While Martin, for example, asserts that agnosticism entails negative atheism,[29] many agnostics see their view as distinct from atheism,[43][44] which they may consider no more justified than theism, or as requiring an equal conviction.[43] The assertion of the unattainability of knowledge for or against the existence of gods is sometimes seen as an indication that atheism requires a leap of faith.[45][46] Common atheist responses to this argument include that unproven religious propositions deserve as much disbelief as all other unproven propositions,[47] and that the unprovability of a god's existence does not imply equal probability of either possibility.[48] Scottish philosopher J. J. C. Smart even argues that "sometimes a person who is really an atheist may describe herself, even passionately, as an agnostic because of unreasonable generalised philosophical skepticism which would preclude us from saying that we know anything whatever, except perhaps the truths of mathematics and formal logic."[49] Consequently, some atheist authors such as Richard Dawkins prefer distinguishing theist, agnostic and atheist positions along a spectrum of theistic probability—the likelihood that each assigns to the statement "God exists".[50]

Definition as impossible or impermanent

Before the 18th century, the existence of God was so universally accepted in the western world that even the possibility of true atheism was questioned. This is called theistic innatism—the notion that all people believe in God from birth; within this view was the connotation that atheists are simply in denial.[51]

There is also a position claiming that atheists are quick to believe in God in times of crisis, that atheists make deathbed conversions, or that "there are no atheists in foxholes".[52] There have however been examples to the contrary, among them examples of literal "atheists in foxholes".[53]

Some atheists have doubted the very need for the term "atheism". In his book Letter to a Christian Nation, Sam Harris wrote:
In fact, "atheism" is a term that should not even exist. No one ever needs to identify himself as a "non-astrologer" or a "non-alchemist". We do not have words for people who doubt that Elvis is still alive or that aliens have traversed the galaxy only to molest ranchers and their cattle. Atheism is nothing more than the noises reasonable people make in the presence of unjustified religious beliefs.[54]

Concepts

Paul Henri Thiry, Baron d'Holbach, an 18th-century advocate of atheism.
The source of man's unhappiness is his ignorance of Nature. The pertinacity with which he clings to blind opinions imbibed in his infancy, which interweave themselves with his existence, the consequent prejudice that warps his mind, that prevents its expansion, that renders him the slave of fiction, appears to doom him to continual error.
—d'Holbach, The System of Nature[55]
The broadest demarcation of atheistic rationale is between practical and theoretical atheism.

Practical atheism

In practical or pragmatic atheism, also known as apatheism, individuals live as if there are no gods and explain natural phenomena without reference to any deities. The existence of gods is not rejected, but may be designated unnecessary or useless; gods neither provide purpose to life, nor influence everyday life, according to this view.[56] A form of practical atheism with implications for the scientific community is methodological naturalism—the "tacit adoption or assumption of philosophical naturalism within scientific method with or without fully accepting or believing it."[57]

Practical atheism can take various forms:
  • Absence of religious motivation—belief in gods does not motivate moral action, religious action, or any other form of action;
  • Active exclusion of the problem of gods and religion from intellectual pursuit and practical action;
  • Indifference—the absence of any interest in the problems of gods and religion; or
  • Unawareness of the concept of a deity.[58]

Theoretical atheism

Ontological arguments

Theoretical (or theoric) atheism explicitly posits arguments against the existence of gods, responding to common theistic arguments such as the argument from design or Pascal's Wager. Theoretical atheism is mainly an ontological position, more precisely one of physical ontology.

Epistemological arguments

Epistemological atheism argues that people cannot know a God or determine the existence of a God. The foundation of epistemological atheism is agnosticism, which takes a variety of forms. In the philosophy of immanence, divinity is inseparable from the world itself, including a person's mind, and each person's consciousness is locked in the subject. According to this form of agnosticism, this limitation in perspective prevents any objective inference from belief in a god to assertions of its existence. The rationalistic agnosticism of Kant and the Enlightenment only accepts knowledge deduced with human rationality; this form of atheism holds that gods are not discernible as a matter of principle, and therefore cannot be known to exist. Skepticism, based on the ideas of Hume, asserts that certainty about anything is impossible, so one can never know for sure whether or not a god exists. Hume, however, held that such unobservable metaphysical concepts should be rejected as "sophistry and illusion".[59] The allocation of agnosticism to atheism is disputed; it can also be regarded as an independent, basic worldview.[56]

Other arguments for atheism that can be classified as epistemological or ontological, including logical positivism and ignosticism, assert the meaninglessness or unintelligibility of basic terms such as "God" and statements such as "God is all-powerful." Theological noncognitivism holds that the statement "God exists" does not express a proposition, but is nonsensical or cognitively meaningless. It has been argued both ways as to whether such individuals can be classified into some form of atheism or agnosticism. Philosophers A. J. Ayer and Theodore M. Drange reject both categories, stating that both camps accept "God exists" as a proposition; they instead place noncognitivism in its own category.[60][61]

Metaphysical arguments

One author writes:
"Metaphysical atheism ... includes all doctrines that hold to metaphysical monism (the homogeneity of reality). Metaphysical atheism may be either: a) absolute — an explicit denial of God's existence associated with materialistic monism (all materialistic trends, both in ancient and modern times); b) relative — the implicit denial of God in all philosophies that, while they accept the existence of an absolute, conceive of the absolute as not possessing any of the attributes proper to God: transcendence, a personal character or unity. Relative atheism is associated with idealistic monism (pantheism, panentheism, deism)."[62]

Epicurus is credited with first expounding the problem of evil. David Hume in his Dialogues concerning Natural Religion (1779) cited Epicurus in stating the argument as a series of questions:[63] "Is God willing to prevent evil, but not able? Then he is impotent. Is he able, but not willing? Then he is malevolent. Is he both able and willing? Then whence cometh evil? Is he neither able nor willing? Then why call him God?"

Logical arguments

Logical atheism holds that the various conceptions of gods, such as the personal god of Christianity, are ascribed logically inconsistent qualities. Such atheists present deductive arguments against the existence of God, which assert the incompatibility between certain traits, such as perfection, creator-status, immutability, omniscience, omnipresence, omnipotence, omnibenevolence, transcendence, personhood (a personal being), nonphysicality, justice, and mercy.[12]

Theodicean atheists believe that the world as they experience it cannot be reconciled with the qualities commonly ascribed to God and gods by theologians. They argue that an omniscient, omnipotent, and omnibenevolent God is not compatible with a world where there is evil and suffering, and where divine love is hidden from many people.[14] A similar argument is attributed to Siddhartha Gautama, the founder of Buddhism.[64]

Reductionary accounts of religion

Philosopher Ludwig Feuerbach[65] and psychoanalyst Sigmund Freud have argued that God and other religious beliefs are human inventions, created to fulfill various psychological and emotional wants or needs. This is also a view of many Buddhists.[66] Karl Marx and Friedrich Engels, influenced by the work of Feuerbach, argued that belief in God and religion are social functions, used by those in power to oppress the working class. According to Mikhail Bakunin, "the idea of God implies the abdication of human reason and justice; it is the most decisive negation of human liberty, and necessarily ends in the enslavement of mankind, in theory and practice." He reversed Voltaire's famous aphorism that if God did not exist, it would be necessary to invent him, writing instead that "if God really existed, it would be necessary to abolish him."[67]

Atheism within religions

Atheism is acceptable within some religious and spiritual belief systems, including Hinduism, Jainism, Buddhism, Raelism,[68] and Neopagan movements[69] such as Wicca.[70] Āstika schools in Hinduism hold atheism to be a valid path to moksha, but an extremely difficult one, for the atheist cannot expect any help from the divine on their journey.[71] Jainism holds that the universe is eternal and has no need for a creator deity; however, Tirthankaras are revered beings who can transcend space and time[72] and have more power than the god Indra.[73] Secular Buddhism does not advocate belief in gods. Early Buddhism was atheistic in that Gautama Buddha's path involved no mention of gods. Later conceptions of Buddhism consider Buddha himself a god, suggest adherents can attain godhood, and revere Bodhisattvas[74] and Eternal Buddhas.

Atheist philosophies

Axiological, or constructive, atheism rejects the existence of gods in favor of a "higher absolute", such as humanity. This form of atheism favors humanity as the absolute source of ethics and values, and permits individuals to resolve moral problems without resorting to God. Marx and Freud used this argument to convey messages of liberation, full-development, and unfettered happiness.[56] One of the most common criticisms of atheism has been to the contrary—that denying the existence of a god leads to moral relativism, leaving one with no moral or ethical foundation,[75] or renders life meaningless and miserable.[76] Blaise Pascal argued this view in his Pensées.[77]

French philosopher Jean-Paul Sartre identified himself as a representative of an "atheist existentialism"[78] concerned less with denying the existence of God than with establishing that "man needs ... to find himself again and to understand that nothing can save him from himself, not even a valid proof of the existence of God."[79] Sartre said a corollary of his atheism was that "if God does not exist, there is at least one being in whom existence precedes essence, a being who exists before he can be defined by any concept, and ... this being is man."[78] The practical consequence of this atheism was described by Sartre as meaning that there are no a priori rules or absolute values that can be invoked to govern human conduct, and that humans are "condemned" to invent these for themselves, making "man" absolutely "responsible for everything he does".[80]

Atheism, religion, and morality

Association with world views and social behaviors

Sociologist Phil Zuckerman analyzed previous social science research on secularity and non-belief, and concluded that societal well-being is positively correlated with irreligion. He found that there are much lower concentrations of atheism and secularity in poorer, less developed nations (particularly in Africa and South America) than in the richer industrialized democracies.[81][82] His findings relating specifically to atheism in the US were that compared to religious people in the US, "atheists and secular people" are less nationalistic, prejudiced, antisemitic, racist, dogmatic, ethnocentric, closed-minded, and authoritarian, and in US states with the highest percentages of atheists, the murder rate is lower than average. In the most religious states, the murder rate is higher than average.[83][84]

Atheism and irreligion

Buddhism is sometimes described as nontheistic because of the absence of a creator god, but that can be too simplistic a view.[85]

People who self-identify as atheists are often assumed to be irreligious, but some sects within major religions reject the existence of a personal, creator deity.[86] In recent years, certain religious denominations have accumulated a number of openly atheistic followers, such as atheistic or humanistic Judaism[87][88] and Christian atheists.[89][90][91]

The strictest sense of positive atheism does not entail any specific beliefs outside of disbelief in any deity; as such, atheists can hold any number of spiritual beliefs. For the same reason, atheists can hold a wide variety of ethical beliefs, ranging from the moral universalism of humanism, which holds that a moral code should be applied consistently to all humans, to moral nihilism, which holds that morality is meaningless.[92]

Philosophers such as Georges Bataille, Slavoj Žižek,[93] Alain de Botton,[94] and Alexander Bard and Jan Söderqvist[95] have all argued that atheists should reclaim religion as an act of defiance against theism, so as not to leave religion as an unwarranted monopoly to theists.

Divine command vs. ethics

According to Plato's Euthyphro dilemma, the role of the gods in determining right from wrong is either unnecessary or arbitrary. The argument that morality must be derived from God, and cannot exist without a wise creator, has been a persistent feature of political if not so much philosophical debate.[96][97][98] Moral precepts such as "murder is wrong" are seen as divine laws, requiring a divine lawmaker and judge. However, many atheists argue that treating morality legalistically involves a false analogy, and that morality does not depend on a lawmaker in the same way that laws do.[99] Friedrich Nietzsche believed in a morality independent of theistic belief, and stated that morality based upon God "has truth only if God is truth—it stands or falls with faith in God."[100][101][102]

Normative ethical systems exist that do not require their principles and rules to be given by a deity; some include virtue ethics, social contract theory, Kantian ethics, utilitarianism, and Objectivism. Sam Harris has proposed that moral prescription (ethical rule making) is not just an issue to be explored by philosophy, but that we can meaningfully practice a science of morality. Any such scientific system must, nevertheless, respond to the criticism embodied in the naturalistic fallacy.[103]

Philosophers Susan Neiman[104] and Julian Baggini[105] (among others) assert that behaving ethically only because of divine mandate is not true ethical behavior but merely blind obedience. Baggini argues that atheism is a superior basis for ethics, claiming that a moral basis external to religious imperatives is necessary to evaluate the morality of the imperatives themselves—to be able to discern, for example, that "thou shalt steal" is immoral even if one's religion instructs it—and that atheists, therefore, have the advantage of being more inclined to make such evaluations.[106] The contemporary British political philosopher Martin Cohen has offered the more historically telling example of Biblical injunctions in favour of torture and slavery as evidence of how religious injunctions follow political and social customs, rather than vice versa, but also noted that the same tendency seems to be true of supposedly dispassionate and objective philosophers.[107] Cohen extends this argument in more detail in Political Philosophy from Plato to Mao, where he argues that the Qur'an played a role in perpetuating social codes from the early 7th century despite changes in secular society.[108]

Dangers of religions

Some prominent atheists—most recently Christopher Hitchens, Daniel Dennett, Sam Harris and Richard Dawkins, and following such thinkers as Bertrand Russell, Robert G. Ingersoll, Voltaire, and novelist José Saramago—have criticized religions, citing harmful aspects of religious practices and doctrines.[109]

The 19th-century German political theorist and sociologist Karl Marx criticized religion as "the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions. It is the opium of the people". He goes on to say, "The abolition of religion as the illusory happiness of the people is the demand for their real happiness. To call on them to give up their illusions about their condition is to call on them to give up a condition that requires illusions. The criticism of religion is, therefore, in embryo, the criticism of that vale of tears of which religion is the halo."[110] Lenin said that "every religious idea and every idea of God is unutterable vileness ... of the most dangerous kind, 'contagion' of the most abominable kind. Millions of sins, filthy deeds, acts of violence and physical contagions ... are far less dangerous than the subtle, spiritual idea of God decked out in the smartest ideological costumes ..."[111]

Sam Harris criticizes Western religion's reliance on divine authority as lending itself to authoritarianism and dogmatism.[112] There is a correlation between religious fundamentalism and extrinsic religion (when religion is held because it serves ulterior interests)[113] and authoritarianism, dogmatism, and prejudice.[114] These arguments—combined with historical events that are argued to demonstrate the dangers of religion, such as the Crusades, inquisitions, witch trials, and terrorist attacks—have been used in response to claims of beneficial effects of belief in religion.[115] Believers counter-argue that some regimes that espouse atheism, such as in Soviet Russia, have also been guilty of mass murder.[116][117] In response to those claims, atheists such as Sam Harris and Richard Dawkins have stated that Stalin's atrocities were influenced not by atheism but by dogmatic Marxism, and that while Stalin and Mao happened to be atheists, they did not do their deeds in the name of atheism.[118][119]

Etymology

The Greek word αθεοι (atheoi), as it appears in the Epistle to the Ephesians (2:12) on the early 3rd-century Papyrus 46. It is usually translated into English as "[those who are] without God".[120]

In early ancient Greek, the adjective átheos (ἄθεος, from the privative ἀ- + θεός "god") meant "godless". It was first used as a term of censure roughly meaning "ungodly" or "impious". In the 5th century BCE, the word began to indicate more deliberate and active godlessness in the sense of "severing relations with the gods" or "denying the gods". The term ἀσεβής (asebēs) then came to be applied to those who impiously denied or disrespected the local gods, even if they believed in other gods. Modern translations of classical texts sometimes render átheos as "atheistic". As an abstract noun, there was also ἀθεότης (atheotēs), "atheism". Cicero transliterated the Greek word into the Latin atheos. The term found frequent use in the debate between early Christians and Hellenists, with each side attributing it, in the pejorative sense, to the other.[121]

The term atheist (from Fr. athée), in the sense of "one who ... denies the existence of God or gods",[122] predates atheism in English, being first found as early as 1566,[123] and again in 1571.[124] Atheist as a label of practical godlessness was used at least as early as 1577.[125] The term atheism was derived from the French athéisme,[126] and appears in English about 1587.[127] An earlier work, from about 1534, used the term atheonism.[128][129] Related words emerged later: deist in 1621,[130] theist in 1662,[131] deism in 1675,[132] and theism in 1678.[133] At that time "deist" and "deism" already carried their modern meaning. The term theism came to be contrasted with deism.

Karen Armstrong writes that "During the sixteenth and seventeenth centuries, the word 'atheist' was still reserved exclusively for polemic ... The term 'atheist' was an insult. Nobody would have dreamed of calling himself an atheist."[11]

Atheism was first used to describe a self-avowed belief in late 18th-century Europe, specifically denoting disbelief in the monotheistic Abrahamic god.[134][135] In the 20th century, globalization contributed to the expansion of the term to refer to disbelief in all deities, though it remains common in Western society to describe atheism as simply "disbelief in God".[36]

History

While the earliest-found usage of the term atheism is in 16th-century France,[126][127] ideas that would be recognized today as atheistic are documented from the Vedic period and classical antiquity.

Early Indic religion

Atheistic schools are found in early Indian thought and have existed from the times of the historical Vedic religion.[136] Among the six orthodox schools of Hindu philosophy, Samkhya, the oldest philosophical school of thought, does not accept God, and the early Mimamsa also rejected the notion of God.[137] The thoroughly materialistic and anti-theistic philosophical Cārvāka (also called Nāstika or Lokāyata) school that originated in India around the 6th century BCE is probably the most explicitly atheistic school of philosophy in India, similar to the Greek Cyrenaic school. This branch of Indian philosophy is classified as heterodox due to its rejection of the authority of the Vedas and hence is not considered part of the six orthodox schools of Hinduism, but it is noteworthy as evidence of a materialistic movement within Hinduism.[138] Chatterjee and Datta explain that our understanding of Cārvāka philosophy is fragmentary, based largely on criticism of its ideas by other schools, and that it is not a living tradition:
"Though materialism in some form or other has always been present in India, and occasional references are found in the Vedas, the Buddhistic literature, the Epics, as well as in the later philosophical works we do not find any systematic work on materialism, nor any organized school of followers as the other philosophical schools possess. But almost every work of the other schools states, for refutation, the materialistic views. Our knowledge of Indian materialism is chiefly based on these."[139]

Other Indian philosophies generally regarded as atheistic include Classical Samkhya and Purva Mimamsa. The rejection of a personal creator God is also seen in Jainism and Buddhism in India.[140]

Classical antiquity

In Plato's Apology, Socrates (pictured) was accused by Meletus of not believing in the gods.

Western atheism has its roots in pre-Socratic Greek philosophy, but did not emerge as a distinct world-view until the late Enlightenment.[141] The 5th-century BCE Greek philosopher Diagoras is known as the "first atheist",[142] and is cited as such by Cicero in his De Natura Deorum.[143] Atomists such as Democritus attempted to explain the world in a purely materialistic way, without reference to the spiritual or mystical. Critias viewed religion as a human invention used to frighten people into following moral order,[144] and Prodicus also appears to have made clear atheistic statements in his work. Philodemus reports that Prodicus believed that "the gods of popular belief do not exist nor do they know, but primitive man, [out of admiration, deified] the fruits of the earth and virtually everything that contributed to his existence". Protagoras has sometimes been taken to be an atheist, but he rather espoused agnostic views, commenting that "Concerning the gods I am unable to discover whether they exist or not, or what they are like in form; for there are many hindrances to knowledge, the obscurity of the subject and the brevity of human life."[145] In the 3rd century BCE, the Greek philosophers Theodorus Cyrenaicus[143][146] and Strato of Lampsacus[147] did not believe that gods exist.
Socrates (c. 470–399 BCE) was associated in the Athenian public mind with the trends in pre-Socratic philosophy towards naturalistic inquiry and the rejection of divine explanations for phenomena. Although such an interpretation misrepresents his thought, he was portrayed in such a way in Aristophanes' comic play Clouds, and he was later tried and executed for impiety and corrupting the young. At his trial Socrates is reported as vehemently denying that he was an atheist, and contemporary scholarship provides little reason to doubt this claim.[148][149]

Euhemerus (c. 300 BCE) published his view that the gods were only the deified rulers, conquerors and founders of the past, and that their cults and religions were in essence the continuation of vanished kingdoms and earlier political structures.[150] Although not strictly an atheist, Euhemerus was later criticized for having "spread atheism over the whole inhabited earth by obliterating the gods".[151]

Also important in the history of atheism was Epicurus (c. 300 BCE). Drawing on the ideas of Democritus and the Atomists, he espoused a materialistic philosophy according to which the universe was governed by the laws of chance without the need for divine intervention (see scientific determinism). Although he stated that deities existed, he believed that they were uninterested in human existence. The aim of the Epicureans was to attain peace of mind and one important way of doing this was by exposing fear of divine wrath as irrational. The Epicureans also denied the existence of an afterlife and the need to fear divine punishment after death.[152]

The Roman philosopher Sextus Empiricus held that one should suspend judgment about virtually all beliefs (a form of skepticism known as Pyrrhonism), that nothing was inherently evil, and that ataraxia ("peace of mind") is attainable by withholding one's judgment. His relatively large volume of surviving works had a lasting influence on later philosophers.[153]

The meaning of "atheist" changed over the course of classical antiquity. The early Christians were labeled atheists by non-Christians because of their disbelief in pagan gods.[154] During the Roman Empire, Christians were executed for their rejection of the Roman gods in general and Emperor-worship in particular. When Christianity became the state religion of Rome under Theodosius I in 381, heresy became a punishable offense.[155]

Early Middle Ages to the Renaissance

During the Early Middle Ages, the Islamic world underwent a Golden Age. With the associated advances in science and philosophy, Arab and Persian lands produced outspoken rationalists and atheists, including Muhammad al Warraq (fl. 9th century), Ibn al-Rawandi (827–911), Al-Razi (854–925) and Al-Maʿarri (973–1058). Al-Maʿarri wrote and taught that religion itself was a "fable invented by the ancients"[156] and that humans were "of two sorts: those with brains, but no religion, and those with religion, but no brains."[157] Although these authors were relatively prolific writers, almost none of their writing survives to the modern day; most of what little remains is preserved through quotations and excerpts in later works by Muslim apologists attempting to refute them.[158] Other prominent Golden Age scholars have been associated with rationalist thought and atheism as well, although the current intellectual atmosphere in the Islamic world, and the scant evidence that survives from the era, make this point a contentious one today.

In Europe, the espousal of atheistic views was rare during the Early Middle Ages and Middle Ages (see Medieval Inquisition); metaphysics and theology were the dominant interests pertaining to religion.[159] There were, however, movements within this period that furthered heterodox conceptions of the Christian god, including differing views of the nature, transcendence, and knowability of God. Individuals and groups such as Johannes Scotus Eriugena, David of Dinant, Amalric of Bena, and the Brethren of the Free Spirit maintained Christian viewpoints with pantheistic tendencies. Nicholas of Cusa held to a form of fideism he called docta ignorantia ("learned ignorance"), asserting that God is beyond human categorization, and thus our knowledge of him is limited to conjecture. William of Ockham inspired anti-metaphysical tendencies with his nominalistic limitation of human knowledge to singular objects, and asserted that the divine essence could not be intuitively or rationally apprehended by human intellect. Followers of Ockham, such as John of Mirecourt and Nicholas of Autrecourt, furthered this view. The resulting division between faith and reason influenced later radical and reformist theologians such as John Wycliffe, Jan Hus, and Martin Luther.[159]

The Renaissance did much to expand the scope of free thought and skeptical inquiry. Individuals such as Leonardo da Vinci sought experimentation as a means of explanation, and opposed arguments from religious authority. Other critics of religion and the Church during this time included Niccolò Machiavelli, Bonaventure des Périers, and François Rabelais.[153]

Early modern period

Historian Geoffrey Blainey wrote that the Reformation had paved the way for atheists by attacking the authority of the Catholic Church, which in turn "quietly inspired other thinkers to attack the authority of the new Protestant churches".[160] Deism gained influence in France, Prussia, and England. The Dutch philosopher Baruch Spinoza was "probably the first well known 'semi-atheist' to announce himself in a Christian land in the modern era", according to Blainey. Spinoza believed that natural laws explained the workings of the universe. In 1661 he published his Short Treatise on God.[161]

Criticism of Christianity became increasingly frequent in the 17th and 18th centuries, especially in France and England, where there appears to have been a religious malaise, according to contemporary sources. Some Protestant thinkers, such as Thomas Hobbes, espoused a materialist philosophy and skepticism toward supernatural occurrences, while the Jewish-Dutch philosopher Spinoza rejected divine providence in favour of a panentheistic naturalism. By the late 17th century, deism came to be openly espoused by intellectuals such as John Toland who coined the term "pantheist".[162]

The first known explicit atheist was the German critic of religion Matthias Knutzen, in his three writings of 1674.[163] He was followed by two other explicit atheist writers: the Polish ex-Jesuit philosopher Kazimierz Łyszczyński and, in the 1720s, the French priest Jean Meslier.[164] In the course of the 18th century, other openly atheistic thinkers followed, such as Baron d'Holbach, Jacques-André Naigeon, and other French materialists.[165] John Locke, in contrast, though an advocate of tolerance, urged authorities not to tolerate atheism, believing that the denial of God's existence would undermine the social order and lead to chaos.[166]

The philosopher David Hume developed a skeptical epistemology grounded in empiricism, and Immanuel Kant's philosophy strongly questioned the very possibility of metaphysical knowledge. Both philosophers undermined the metaphysical basis of natural theology and criticized classical arguments for the existence of God.
Ludwig Feuerbach's The Essence of Christianity (1841) would greatly influence philosophers such as Engels, Marx, David Strauss, Nietzsche, and Max Stirner. He considered God to be a human invention and religious activities to be wish-fulfillment. For this he is considered the founding father of modern anthropology of religion.

Blainey notes that, although Voltaire is widely considered to have strongly contributed to atheistic thinking during the Revolution, he also considered fear of God to have discouraged further disorder, having said "If God did not exist, it would be necessary to invent him."[167] In Reflections on the Revolution in France (1790), the philosopher Edmund Burke denounced atheism, writing of a "literary cabal" who had "some years ago formed something like a regular plan for the destruction of the Christian religion. This object they pursued with a degree of zeal which hitherto had been discovered only in the propagators of some system of piety ... These atheistical fathers have a bigotry of their own ...". But, Burke asserted, "man is by his constitution a religious animal" and "atheism is against, not only our reason, but our instincts; and ... it cannot prevail long".[168]

Baron d'Holbach was a prominent figure in the French Enlightenment who is best known for his atheism and for his voluminous writings against religion, the most famous of them being The System of Nature (1770) but also Christianity Unveiled. One goal of the French Revolution was a restructuring and subordination of the clergy with respect to the state through the Civil Constitution of the Clergy. Attempts to enforce it led to anti-clerical violence and the expulsion of many clergy from France, lasting until the Thermidorian Reaction. The radical Jacobins seized power in 1793, ushering in the Reign of Terror. The Jacobins were deists and introduced the Cult of the Supreme Being as a new French state religion. Some atheists surrounding Jacques Hébert instead sought to establish a Cult of Reason, a form of atheistic pseudo-religion with a goddess personifying reason.
The Napoleonic era further institutionalized the secularization of French society, and exported the revolution to northern Italy, in the hopes of creating pliable republics.

In the latter half of the 19th century, atheism rose to prominence under the influence of rationalistic and freethinking philosophers. Many prominent German philosophers of this era denied the existence of deities and were critical of religion, including Ludwig Feuerbach, Arthur Schopenhauer, Max Stirner, Karl Marx, and Friedrich Nietzsche.[169]

Since 1900

 
Vladimir Lenin, the first leader of the Soviet Union. Marxist‒Leninist atheism was influential in the 20th century.

Atheism in the 20th century, particularly in the form of practical atheism, advanced in many societies. Atheistic thought found recognition in a wide variety of other, broader philosophies, such as existentialism, objectivism, secular humanism, nihilism, anarchism, logical positivism, Marxism, feminism,[170] and the general scientific and rationalist movement. In addition, state atheism emerged in Eastern Europe and Asia during that period, particularly in the Soviet Union under Vladimir Lenin and Joseph Stalin, and in Communist China under Mao Zedong. Atheist and anti-religious policies in the Soviet Union included numerous legislative acts, the outlawing of religious instruction in the schools, and the emergence of the League of Militant Atheists.[171][172]
1929 cover of the USSR League of Militant Atheists magazine, showing the gods of the Abrahamic religions being crushed by the Communist 5-year plan

Blainey wrote that during the twentieth century, atheists in Western societies became more active and even militant, though they often "relied essentially on arguments used by numerous radical Christians since at least the eighteenth century". They rejected the idea of an interventionist God, and said that Christianity promoted war and violence, though "the most ruthless leaders in the Second World War were atheists and secularists who were intensely hostile to both Judaism and Christianity" and "Later massive atrocities were committed in the East by those ardent atheists, Pol Pot and Mao Zedong". Some scientists were meanwhile articulating a view that, as the world became more educated, religion would be superseded.[173]

Logical positivism and scientism paved the way for neopositivism, analytical philosophy, structuralism, and naturalism. Neopositivism and analytical philosophy discarded classical rationalism and metaphysics in favor of strict empiricism and epistemological nominalism.
Proponents such as Bertrand Russell emphatically rejected belief in God. In his early work, Ludwig Wittgenstein attempted to separate metaphysical and supernatural language from rational discourse. A. J. Ayer asserted the unverifiability and meaninglessness of religious statements, citing his adherence to the empirical sciences. Relatedly, the applied structuralism of Lévi-Strauss traced religious language to the human subconscious, denying its transcendental meaning. J. N. Findlay and J. J. C. Smart argued that the existence of God is not logically necessary. Naturalists and materialistic monists such as John Dewey considered the natural world to be the basis of everything, denying the existence of God or immortality.[49][174]

Other developments

The British philosopher Bertrand Russell

In India, E. V. Ramasami Naicker (Periyar), a prominent atheist leader, fought against Hinduism and the Brahmins for discriminating against and dividing people in the name of caste and religion.[175] This was highlighted in 1956 when he arranged for the erection of a statue depicting a Hindu god in a humble representation and made antitheistic statements.[176]

Atheist Vashti McCollum was the plaintiff in a landmark 1948 Supreme Court case that struck down religious education in US public schools.[177] Madalyn Murray O'Hair was one of the most influential American atheists; she brought forth the 1963 Supreme Court case Murray v. Curlett, which banned compulsory prayer in public schools.[178] In 1966, Time magazine asked "Is God Dead?"[179] in response to the Death of God theological movement, citing the estimation that nearly half of all people in the world lived under an anti-religious power, and millions more in Africa, Asia, and South America seemed to lack knowledge of the Christian view of theology.[180] The Freedom From Religion Foundation was co-founded by Anne Nicol Gaylor and her daughter, Annie Laurie Gaylor, in 1976 in the United States, and incorporated nationally in 1978. It promotes the separation of church and state.[181][182]

Since the fall of the Berlin Wall, the number of actively anti-religious regimes has declined considerably. In 2006, Timothy Shah of the Pew Forum noted "a worldwide trend across all major religious groups, in which God-based and faith-based movements in general are experiencing increasing confidence and influence vis-à-vis secular movements and ideologies."[183] However, Gregory S. Paul and Phil Zuckerman consider this a myth and suggest that the actual situation is much more complex and nuanced.[184]

A 2010 survey found that those identifying themselves as atheists or agnostics are on average more knowledgeable about religion than followers of major faiths. Nonbelievers scored better on questions about tenets central to Protestant and Catholic faiths. Only Mormon and Jewish faithful scored as well as atheists and agnostics.[185]

In 2012, the first "Women in Secularism" conference was held in Arlington, Virginia.[186] Secular Woman was organized in 2012 as a national organization focused on nonreligious women.[187] The atheist feminist movement has also become increasingly focused on fighting sexism and sexual harassment within the atheist movement itself.[188] In August 2012, Jennifer McCreight (the organizer of Boobquake) founded a movement within atheism known as Atheism Plus, or A+, that "applies skepticism to everything, including social issues like sexism, racism, politics, poverty, and crime".[189][190][191]

In 2013 the first atheist monument on American government property was unveiled at the Bradford County Courthouse in Florida: a 1,500-pound granite bench and plinth inscribed with quotes by Thomas Jefferson, Benjamin Franklin, and Madalyn Murray O'Hair.[192][193]

New Atheism

New Atheism is the name given to a movement among some early-21st-century atheist writers who have advocated the view that "religion should not simply be tolerated but should be countered, criticized, and exposed by rational argument wherever its influence arises."[194] The movement is commonly associated with Sam Harris, Daniel C. Dennett, Richard Dawkins, Victor J. Stenger and Christopher Hitchens.[195][196] Several best-selling books by these authors, published between 2004 and 2007, form the basis for much of the discussion of New Atheism.[196]

These atheists generally seek to disassociate themselves from the mass political atheism that gained ascendancy in various nations in the 20th century. In best-selling books, the religiously motivated terrorist events of 9/11 and the partially successful attempts of the Discovery Institute to change the American science curriculum to include creationist ideas, together with support for those ideas from George W. Bush in 2005, have been cited by authors such as Harris, Dennett, Dawkins, Stenger, and Hitchens as evidence of a need to move society towards atheism.[197]

Demographics

 
Percentage of people in various European countries who said: "I don't believe there is any sort of spirit, God or life force." (2005)[198]

It is difficult to quantify the number of atheists in the world. Respondents to religious-belief polls may define "atheism" differently or draw different distinctions between atheism, non-religious beliefs, and non-theistic religious and spiritual beliefs.[199] A Hindu atheist, for example, may declare himself a Hindu while also being an atheist.[200] A 2010 survey published in Encyclopædia Britannica found that the non-religious made up about 9.6% of the world's population, and atheists about 2.0%, with a very large majority based in Asia. This figure did not include those who follow atheistic religions, such as some Buddhists.[201] The average annual change for atheism from 2000 to 2010 was −0.17%.[201] A broad figure estimates the number of atheists and agnostics on Earth at 1.1 billion.[202]

The 2012 Gallup Global Index of Religiosity and Atheism asked respondents whether they viewed themselves as "a religious person, not a religious person, or a convinced atheist"; 13% reported being "convinced atheists".[203] The countries with the highest proportions of people who viewed themselves as "convinced atheists" were China (47%), Japan (31%), the Czech Republic (30%), France (29%), South Korea (15%), Germany (15%), the Netherlands (14%), Austria (10%), Iceland (10%), Australia (10%), and the Republic of Ireland (10%). In contrast, the top ten countries with people who described themselves as "a religious person" were Ghana (96%), Nigeria (93%), Armenia (92%), Fiji (92%), Macedonia (90%), Romania (89%), Iraq (88%), Kenya (88%), Peru (86%), and Brazil (85%).[204]

According to the 2010 Eurobarometer Poll, the percentage of the population that agreed with the statement "You don't believe there is any sort of spirit, God or life force" ranged from France (40%), the Czech Republic (37%), Sweden (34%), the Netherlands (30%), and Estonia (29%), down to Poland (5%), Greece (4%), Cyprus (3%), Malta (2%), and Romania (1%), with the European Union as a whole at 20%.[24] According to the Australian Bureau of Statistics, 22% of Australians have "no religion", a category that includes atheists.[205]

In the United States, self-reported atheism rose from 1% to 5% between 2005 and 2012, while self-identification as "religious" saw a larger drop of 13 percentage points, from 73% to 60%.[206]
According to a 2012 report by the Pew Research Center, 2.4% of the US adult population identify as atheist, up from 1.6% in 2007, and within the religiously unaffiliated (or "no religion") demographic, atheists made up 12%.[207]
Proportion of atheists and agnostics around the world.

A study noted positive correlations between levels of education and secularity, including atheism, in America.[83]

According to evolutionary psychologist Nigel Barber, atheism blossoms in places where most people feel economically secure, particularly in the social democracies of Europe, as there is less uncertainty about the future with extensive social safety nets and better health care resulting in a greater quality of life and higher life expectancy. By contrast, in underdeveloped countries, there are virtually no atheists.[208]

A letter published in Nature in 1998 reported a survey suggesting that belief in a personal god or afterlife was at an all-time low among members of the U.S. National Academy of Sciences, 7.0% of whom believed in a personal god, compared with more than 85% of the general U.S. population.[209] This study has been criticized by Rodney Stark and Roger Finke for its definition of belief in God, which was "I believe in a God to whom one may pray in the expectation of receiving an answer".[210]

An article published by The University of Chicago Chronicle that discussed the above study, stated that 76% of physicians in the United States believe in God, more than the 7% of scientists above, but still less than the 85% of the general population.[211] Another study assessing religiosity among scientists who are members of the American Association for the Advancement of Science found that "just over half of scientists (51%) believe in some form of deity or higher power; specifically, 33% of scientists say they believe in God, while 18% believe in a universal spirit or higher power."[212]
Frank Sulloway of the Massachusetts Institute of Technology and Michael Shermer of California State University conducted a study of "credentialed" U.S. adults (12% held Ph.D.s and 62% were college graduates) and found that 64% believed in God, with a correlation indicating that religious conviction diminished with education level.[213]

In 1958, Professor Michael Argyle of the University of Oxford analyzed seven research studies that had investigated the correlation between attitudes toward religion and measured intelligence among school and college students from the U.S. Although a clear negative correlation was found, the analysis did not identify causality but noted that factors such as authoritarian family background and social class may also have played a part.[214] Sociologist Philip Schwadel found that higher levels of education are associated with increased religious participation and religious practice in daily life, but also correlate with greater tolerance for atheists' public opposition to religion and greater skepticism of "exclusivist religious viewpoints and biblical literalism".[215] Other studies have also examined the relationship between religiosity and intelligence; in a meta-analysis, 53 of 63 studies found that analytical intelligence correlated negatively with religiosity, with 35 of the 53 reaching statistical significance, while 10 studies found a positive correlation, 2 of which reached significance.[216]

Humans Are to Blame for Rapidly Melting Glaciers


The steady melt of glacial ice around the world is largely due to man-made factors, such as greenhouse-gas emissions and aerosols, a new study finds.

Humans have caused roughly a quarter of the globe's glacial loss between 1851 and 2010, and about 69 percent of glacial melting between 1991 and 2010, the study suggests.

"In a sense, we got a confirmation that by now, it is really mostly humans that are responsible for the melting glaciers," said lead researcher Ben Marzeion, an associate professor of meteorology and geophysics at the University of Innsbruck in Austria. [Images of Melt: See Earth's Vanishing Ice]

Vanishing glaciers are often associated with global warming, and other studies have estimated past ice loss and made projections of future melt. But until now, researchers were unsure how much glacial loss was tied to human factors.

"So far, it has been unclear how much of the observed mass losses are caused by humans rather than natural climate variations," Regine Hock, a professor of geophysics at the University of Alaska Fairbanks who was not involved in the study, wrote in an email to Live Science.

The researchers used "state-of-the-art modeling techniques" in their work, Hock said.

The research team relied on 12 climate models, most of them from the latest reports of the Intergovernmental Panel on Climate Change, an international group of climate-change experts convened by the United Nations. Combining these models with data from the Randolph Glacier Inventory (a catalog of nearly 200,000 glaciers), the researchers created one computer model that included only natural contributions to glacier melt, such as volcanic eruptions and solar variability, and another that included both human and natural factors.

Using data from 1851 to 2010, the researchers compared the two models with real measurements of glaciers to determine which one better represented reality. The study did not include glaciers in Antarctica, because not enough data on the region was available during the 159 years covered by the study.

The model with the man-made influences was a better fit, they found.
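The comparison described above, running a natural-only simulation and a natural-plus-human simulation and checking which tracks observations better, can be illustrated with a toy sketch. All numbers below are hypothetical, chosen only to show the logic; they are not values from the study, which used full coupled climate models rather than simple series.

```python
import math

# Hypothetical glacier mass-loss series (arbitrary units); NOT study data.
observed      = [2.0, 2.4, 2.9, 3.5, 4.1]  # "measured" mass loss
natural_only  = [2.0, 2.1, 2.2, 2.3, 2.4]  # model: natural forcings only
natural_human = [2.1, 2.5, 2.8, 3.4, 4.0]  # model: natural + human forcings

def rmse(model, obs):
    """Root-mean-square error between a model series and observations."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

# The model with the smaller error is the better fit to reality.
print(rmse(natural_only, observed))   # ~0.99
print(rmse(natural_human, observed))  # 0.10: the better fit

# Attribute to humans whatever loss the natural-only run cannot explain,
# as a fraction of the total simulated loss.
total_loss   = natural_human[-1] - natural_human[0]  # 1.9
natural_loss = natural_only[-1] - natural_only[0]    # 0.4
human_share  = (total_loss - natural_loss) / total_loss
print(f"human share of simulated loss: {human_share:.0%}")  # 79%
```

In the actual study, this kind of residual attribution over 1991–2010 is what yields the roughly 69 percent human share quoted above.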

"Glaciers thin and retreat around the world as a result of rising air temperature, but the glaciers don't care whether or not the increase in temperature is due to natural or human causes," Hock said. "Over the last 150 years, most of the mass loss was due to natural climate variability, caused, for example, by volcanic eruptions or changes in solar activity.

"However, during the last 20 years, almost 70 percent of the glacier mass changes were caused by climate change due to humans," she wrote.      

Interestingly, the study found that glaciers, which are slow to react to climate change, are still recovering from the end of the Little Ice Age that lasted from the 14th to the 19th centuries. During the Little Ice Age, temperatures were about 1.8 degrees Fahrenheit (1 degree Celsius) colder than they are today.

Warmer temperatures after the Little Ice Age affected the glaciers. "Essentially, what we find is that glaciers would be melting without any human influence," Marzeion told Live Science.

The melt, however, would not be happening as quickly as it is today if it weren't for man-made contributions, such as aerosols from wood or coal fires, he said. Aerosols are particles suspended in the atmosphere that absorb and scatter the sun's radiation.
Even if climate change from both man-made and natural causes stopped today, the glaciers would continue to melt and are projected to raise ocean levels by 2.7 inches (7 centimeters) during this century, Marzeion said.

As global temperatures continue to rise, the glaciers will continue to disappear. The melt may provide more water for irrigation and other needs, but it won't be sustainable because the glaciers may eventually vanish, Marzeion said. In the meantime, people can try to reduce man-made contributions to global warming and adapt to the changing planet, he said.
The study was published online today (Aug. 14) in the journal Science.

Original article on Live Science.

Copyright 2014 LiveScience, a TechMediaNetwork company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

Terraforming


From Wikipedia, the free encyclopedia
 
An artist's conception shows a terraformed Mars in four stages of development.

Terraforming (literally, "Earth-shaping") of a planet, moon, or other body is the theoretical process of deliberately modifying its atmosphere, temperature, surface topography or ecology to be similar to the biosphere of Earth to make it habitable by Earth-like life.

The term "terraforming" is sometimes used more generally as a synonym for planetary engineering, although some consider this more general usage an error.[citation needed] The concept of terraforming developed from both science fiction and actual science. The term was coined by Jack Williamson in a science-fiction story (Collision Orbit) published during 1942 in Astounding Science Fiction,[1] but the concept may pre-date this work.

Based on experiences with Earth, the environment of a planet can be altered deliberately; however, the feasibility of creating an unconstrained planetary biosphere that mimics Earth on another planet has yet to be verified. Mars is usually considered to be the most likely candidate for terraforming.
Much study has been done concerning the possibility of heating the planet and altering its atmosphere, and NASA has even hosted debates on the subject. Several potential methods of altering the climate of Mars may fall within humanity's technological capabilities, but at present the economic resources required are far beyond what any government or society is willing to allocate. The long timescales and practicality of terraforming are the subject of debate. Other unanswered questions relate to the ethics, logistics, economics, politics, and methodology of altering the environment of an extraterrestrial world.

History of scholarly study

Carl Sagan, an astronomer, proposed the planetary engineering of Venus in an article published in the journal Science in 1961.[2] Sagan imagined seeding the atmosphere of Venus with algae, which would convert water, nitrogen and carbon dioxide into organic compounds. As this process removed carbon dioxide from the atmosphere, the greenhouse effect would be reduced until surface temperatures dropped to "comfortable" levels. The resulting carbon, Sagan supposed, would be incinerated by the high surface temperatures of Venus, and thus be sequestered in the form of "graphite or some involatile form of carbon" on the planet's surface.[3] However, later discoveries about the conditions on Venus made this particular approach impossible. One problem is that the clouds of Venus are composed of a highly concentrated sulfuric acid solution. Even if atmospheric algae could thrive in the hostile environment of Venus' upper atmosphere, an even greater problem is that its atmosphere is simply far too thick: converting that much carbon dioxide would result in an "atmosphere of nearly pure molecular oxygen" and cause the planet's surface to be thickly covered in fine graphite powder.[3] This volatile combination could not be sustained through time. Any carbon that was fixed in organic form would be liberated as carbon dioxide again through combustion, "short-circuiting" the terraforming process.[3]

Sagan also visualized making Mars habitable for human life in "Planetary Engineering on Mars" (1973), an article published in the journal Icarus.[4] Three years later, NASA addressed the issue of planetary engineering officially in a study, but used the term "planetary ecosynthesis" instead.[5] The study concluded that it was possible for Mars to support life and be made into a habitable planet. The first conference session on terraforming, then referred to as "Planetary Modeling", was organized that same year.

In March 1979, NASA engineer and author James Oberg organized the First Terraforming Colloquium, a special session at the Lunar and Planetary Science Conference in Houston. Oberg popularized the terraforming concepts discussed at the colloquium to the general public in his book New Earths (1981).[6] Not until 1982 was the word terraforming used in the title of a published journal article. Planetologist Christopher McKay wrote "Terraforming Mars", a paper for the Journal of the British Interplanetary Society.[7] The paper discussed the prospects of a self-regulating Martian biosphere, and McKay's use of the word has since become the preferred term. In 1984, James Lovelock and Michael Allaby published The Greening of Mars.[8] Lovelock's book was one of the first to describe a novel method of warming Mars, where chlorofluorocarbons (CFCs) are added to the atmosphere.

Motivated by Lovelock's book, biophysicist Robert Haynes worked behind the scenes to promote terraforming, and contributed the neologism Ecopoiesis, forming the word from the Greek οἶκος, oikos, "house",[9] and ποίησις, poiesis, "production".[10] Ecopoiesis refers to the origin of an ecosystem. In the context of space exploration, Haynes describes ecopoiesis as the "fabrication of a sustainable ecosystem on a currently lifeless, sterile planet". Ecopoiesis is a type of planetary engineering and is one of the first stages of terraformation. This primary stage of ecosystem creation is usually restricted to the initial seeding of microbial life.[11] As conditions approach those of Earth, plant life could be brought in, accelerating the production of oxygen and theoretically making the planet eventually able to support animal life.

Aspects and definitions

Beginning in 1985, Martyn J. Fogg began publishing several articles on terraforming. He also served as editor for a full issue on terraforming for the Journal of the British Interplanetary Society in 1991. In his book Terraforming: Engineering Planetary Environments (1995), Fogg proposed the following definitions for different aspects related to terraforming:[12]
  • Planetary engineering: the application of technology for the purpose of influencing the global properties of a planet.
  • Geoengineering: planetary engineering applied specifically to the Earth. It includes only those macroengineering concepts that deal with the alteration of some global parameter, such as the greenhouse effect, atmospheric composition, insolation or impact flux.
  • Terraforming: a process of planetary engineering, specifically directed at enhancing the capacity of an extraterrestrial planetary environment to support life as we know it. The ultimate achievement in terraforming would be to create an open planetary biosphere emulating all the functions of the biosphere of the Earth, one that would be fully habitable for human beings.
  • Astrophysical engineering: taken to represent proposed activities, relating to future habitation, that are envisaged to occur on a scale greater than that of "conventional" planetary engineering.
Fogg also devised definitions for candidate planets of varying degrees of human compatibility:[13]
  • Habitable Planet (HP): A world with an environment sufficiently similar to the Earth as to allow comfortable and free human habitation.
  • Biocompatible Planet (BP): A planet possessing the necessary physical parameters for life to flourish on its surface. If initially lifeless, then such a world could host a biosphere of considerable complexity without the need for terraforming.
  • Easily Terraformable Planet (ETP): A planet that might be rendered biocompatible, or possibly habitable, and maintained so by modest planetary engineering techniques and with the limited resources of a starship or robot precursor mission.
Fogg suggests that Mars was a biologically compatible planet in its youth, but does not now fall into any of these three categories, since terraforming it would require considerably greater effort.[citation needed] Mars Society founder Robert Zubrin produced a plan for a Mars return mission called Mars Direct that would set up a permanent human presence on Mars and steer efforts toward eventual terraformation.[14]

Requirements for sustaining terrestrial life

An absolute requirement for life is an energy source, but the notion of planetary habitability implies that many other geophysical, geochemical, and astrophysical criteria must be met before the surface of an astronomical body is able to support life. Of particular interest is the set of factors that has sustained complex, multicellular animals in addition to simpler organisms on this planet. Research and theory in this regard are components of planetary science and the emerging discipline of astrobiology.

In its astrobiology roadmap, NASA has defined the principal habitability criteria as "extended regions of liquid water, conditions favorable for the assembly of complex organic molecules, and energy sources to sustain metabolism."[15]

Preliminary stages

Once conditions become more suitable for the introduced species, the importation of microbial life could begin.[12] As conditions approach those of Earth, plant life could also be brought in. This would accelerate the production of oxygen, which theoretically would make the planet eventually able to support animal life.

Prospective planets

Artist's conception of a terraformed Mars

Mars

In many respects, Mars is the most Earth-like of all the other planets in our Solar System.[16] Indeed, Mars is thought to have had a more Earth-like environment early in its history, with a thicker atmosphere and abundant water that was lost over the course of hundreds of millions of years.[17]

The exact mechanism of this loss is still unclear, though three mechanisms in particular seem likely. First, whenever surface water is present, carbon dioxide reacts with rocks to form carbonates, drawing down the atmosphere and binding it to the planetary surface. On Earth, this process is counteracted by plate tectonics, which drives volcanic eruptions that vent carbon dioxide back into the atmosphere. On Mars, the lack of such tectonic activity prevented the recycling of gases locked up in sediments.[18]

Second, the lack of a magnetosphere surrounding the entire surface of Mars may have allowed the solar wind to gradually erode the atmosphere.[19] Convection within the core of Mars, which is made mostly of iron,[20] originally generated a magnetic field. However, the dynamo ceased to function long ago,[21] and the magnetic field of Mars has largely disappeared, probably due to "... loss of core heat, solidification of most of the core, and/or changes in the mantle convection regime."[22] Mars does still retain a limited magnetosphere that covers approximately 40% of its surface. Rather than uniformly covering and protecting the atmosphere from the solar wind, however, the magnetic field takes the form of a collection of smaller, umbrella-shaped fields, mainly clustered around the planet's southern hemisphere.[23] It is within these regions that chunks of atmosphere are violently "blown away", as astronomer David Brain explains:
The joined fields wrapped themselves around a packet of gas at the top of the Martian atmosphere, forming a magnetic capsule a thousand kilometres wide with ionised air trapped inside... Solar wind pressure caused the capsule to 'pinch off' and it blew away, taking its cargo of air with it.[23]
Finally, between approximately 4.1 and 3.8 billion years ago, asteroid impacts during the Late Heavy Bombardment caused significant changes to the surface environment of objects in our Solar System. The low gravity of Mars suggests that these impacts could have ejected much of the Martian atmosphere into deep space.[24]

Terraforming Mars would entail two major interlaced changes: building the atmosphere and heating it.[25] A thicker atmosphere of greenhouse gases such as carbon dioxide would trap incoming solar radiation. Because the raised temperature would release additional greenhouse gases into the atmosphere, the two processes would reinforce each other.[26]
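The mutual reinforcement described here is a positive feedback loop. As a purely illustrative sketch (the gain value f below is an arbitrary assumption, not a Mars climate parameter), the total effect of a converging feedback can be summed as a geometric series:

```python
# Toy illustration of a mutually reinforcing (positive) feedback loop:
# an initial forcing dT0 is amplified because each increment of warming
# releases more greenhouse gas, which adds a further fraction f of warming.
def total_warming(dT0, f, steps=1000):
    """Sum the geometric series dT0 * (1 + f + f^2 + ...)."""
    total = 0.0
    increment = dT0
    for _ in range(steps):
        total += increment
        increment *= f          # each round of warming feeds back a fraction f
    return total

# Arbitrary illustrative numbers: 1 degree of initial forcing, gain 0.5.
dT0, f = 1.0, 0.5
print(total_warming(dT0, f))    # approaches dT0 / (1 - f) = 2.0
```

With f < 1 the loop converges to dT0 / (1 − f); with f ≥ 1 it would run away rather than settle at a new equilibrium.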
Artist's conception of a terraformed Venus

Venus

Terraforming Venus requires two major changes: removing most of the planet's dense 9 MPa carbon dioxide atmosphere and reducing the planet's 450 °C (723.15 K) surface temperature. These goals are closely interrelated, since Venus' extreme temperature is thought to be due to the greenhouse effect caused by its dense atmosphere. Sequestering the atmospheric carbon would likely solve the temperature problem as well.
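For scale, the figures quoted for Venus translate into more familiar units as follows (a simple conversion sketch; the standard atmosphere is taken as 101,325 Pa):

```python
# Convert the quoted Venus figures into more familiar units.
surface_pressure_pa = 9e6          # 9 MPa, as quoted above
surface_temp_c = 450.0             # quoted surface temperature

ATM_PA = 101_325                   # one standard atmosphere in pascals

pressure_atm = surface_pressure_pa / ATM_PA
temp_k = surface_temp_c + 273.15

print(f"{pressure_atm:.0f} atm")   # roughly 89 times Earth's sea-level pressure
print(f"{temp_k:.2f} K")           # 723.15 K
```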

Europa (moon)

Europa, a moon of Jupiter, is a potential candidate for terraforming.[citation needed] One advantage of Europa is the presence of liquid water, which could be extremely helpful for the introduction of any form of life.[27][not in citation given] The difficulties are numerous: Europa is near a huge radiation belt around Jupiter,[28] which would require the building of radiation deflectors, currently impractical. Additionally, this satellite is covered in ice and would have to be heated, and there would need to be a supply of oxygen,[29] though this could, at sufficient energy cost, be manufactured locally by electrolysis of the copious water available.
Artist's conception of what the Moon might look like terraformed

Other bodies in the Solar System

Other possible candidates for terraforming (possibly only partial terraforming or paraterraforming) include Titan, Callisto, Ganymede, the Moon, and even Mercury, Saturn's moon Enceladus, and the dwarf planet Ceres. Most, however, have too little mass and gravity to hold an atmosphere indefinitely (though it is possible, albeit uncertain, that an atmosphere could remain for tens of thousands of years or be replenished as needed). In addition, aside from the Moon and Mercury, most of these worlds are so far from the Sun that adding sufficient heat would be much more difficult than it would be for Mars. Terraforming Mercury would present different challenges, but in certain respects would be easier than terraforming Venus. Though not widely discussed, the possibility of terraforming Mercury's poles has been proposed. Saturn's moon Titan offers several unique advantages, such as an atmospheric pressure similar to Earth's and an abundance of nitrogen and frozen water. Jupiter's moons Europa, Ganymede, and Callisto also have abundant water ice.

Paraterraforming

Also known as the "worldhouse" concept, or domes in smaller versions, paraterraforming involves the construction of a habitable enclosure on a planet which eventually grows to encompass most of the planet's usable area.[30] The enclosure would consist of a transparent roof held one or more kilometers above the surface, pressurized with a breathable atmosphere, and anchored with tension towers and cables at regular intervals. Proponents claim worldhouses can be constructed with technology known since the 1960s. The Biosphere 2 project built a dome on Earth that contained a habitable environment. The project encountered difficulties in operation, including unexpected population explosions of some plants and animals,[31][32] and a lower than anticipated production of oxygen by plants, requiring extra oxygen to be pumped in.[33]

Paraterraforming has several advantages over the traditional approach to terraforming. For example, it provides an immediate payback to investors (assuming a capitalistic financing model). Although it starts out in a small area (a domed city, for example), it quickly provides habitable space. It also allows for modular construction that can be tailored to the needs of the planet's population, growing only as fast and only in those areas where it is required. Finally, paraterraforming greatly reduces the amount of atmosphere that one would need to add to planets like Mars to provide Earth-like atmospheric pressures. By using a solid envelope in this manner, even bodies which would otherwise be unable to retain an atmosphere at all (such as asteroids) could be given a habitable environment. The environment under an artificial worldhouse roof would also likely be more amenable to artificial manipulation.[citation needed] Paraterraforming is also less likely to harm any native lifeforms that may hypothetically inhabit the planet, as the parts of the planet outside the enclosure are not normally affected, unlike terraforming, which affects the entire planet.[citation needed]

It has the disadvantage of requiring massive amounts of construction and maintenance activity. It also would likely lack a completely independent water cycle: although rainfall might develop under a high enough roof, it would probably not be efficient enough for agriculture or a full water cycle.[citation needed] The extra cost might be offset somewhat by automated manufacturing and repair mechanisms.[citation needed] A worldhouse might also be more susceptible to catastrophic failure if a major breach occurred, though this risk might be reduced by compartmentalization and other active safety precautions.[citation needed] Meteor strikes are a particular concern, because without an external atmosphere they would reach the surface before burning up.[citation needed]

Ethical issues

There is a philosophical debate within biology and ecology as to whether terraforming other worlds is an ethical endeavor. From the point of view of a cosmocentric ethic, this involves balancing the need for the preservation of human life against the intrinsic value of existing planetary ecologies.[34]

On the pro-terraforming side of the argument, there are those like Robert Zubrin, Martyn J. Fogg, Richard L. S. Taylor and the late Carl Sagan who believe that it is humanity's moral obligation to make other worlds suitable for life, as a continuation of the history of life transforming the environments around it on Earth.[35][36] They also point out that Earth would eventually be destroyed if nature takes its course, so that humanity faces a very long-term choice between terraforming other worlds and allowing all terrestrial life to become extinct. Terraforming totally barren planets, it is asserted, is not morally wrong, as it does not affect any other life.

The opposing argument posits that terraforming would be an unethical interference in nature, and that given humanity's past treatment of the Earth, other planets may be better off without human interference. Still others strike a middle ground, such as Christopher McKay, who argues that terraforming is ethically sound only once it has been conclusively established that an alien planet does not harbor life of its own; if it does, we should not try to reshape the planet to our own use, but should instead engineer the planet's environment to artificially nurture the alien life and help it thrive and co-evolve, or even co-exist, with humans.[37] Even this would be seen as a type of terraforming by the strictest of ecocentrists, who would say that all life has the right, in its home biosphere, to evolve without outside interference.

Economic issues

The initial cost of such projects as planetary terraforming would be gargantuan, and the infrastructure of such an enterprise would have to be built from scratch. Such technology is not yet developed, let alone financially feasible at the moment. John Hickman has pointed out that almost none of the current schemes for terraforming incorporate economic strategies, and most of their models and expectations seem highly optimistic.[38] Access to the vast resources of space may make such projects more economically feasible, though the initial investment required to enable easy access to space will likely be tremendous (see Asteroid mining, solar power satellites, In-Situ Resource Utilization, bootstrapping, space elevator).

Political issues

National pride, rivalries between nations, and the politics of public relations have in the past been the primary motivations for shaping space projects.[39][40] It is reasonable to assume that these factors would also be present in planetary terraforming efforts.

In popular culture

Terraforming is a common concept in science fiction, ranging from television, movies and novels to video games.

The concept of changing a planet for habitation predates the word 'terraforming': H. G. Wells described a reverse terraforming in The War of the Worlds, in which aliens alter Earth for their own benefit. Olaf Stapledon's Last and First Men (1930) provides the first example in fiction in which Venus is modified, after a long and destructive war with the original inhabitants, who naturally object to the process. The word itself was coined in fiction by Jack Williamson, but the idea features in many other stories of the 1950s and '60s, such as Poul Anderson's The Big Rain and James Blish's "Pantropy" stories. Recent works involving the terraforming of Mars include the Mars trilogy by Kim Stanley Robinson and The Platform by James Garvey. In Isaac Asimov's Robot series, fifty planets have been colonized and terraformed by the powerful race of humans called Spacers, and when Earth is allowed to attempt colonization once more, the Settlers begin terraforming their new worlds immediately. Twenty thousand years in the future, all the habitable planets in the galaxy have been terraformed and form the basis of the Galactic Empire in Asimov's Foundation series. In the Star Wars series, the planet Manaan uses a paraterraforming-like infrastructure, with all buildings built above the water, as the planet has no natural land. In the Star Wars Expanded Universe, the planet Taris is restored to its former state after a Sith bombardment through aggressive terraforming.

Terraforming has also been explored on television and in feature films, including the "Genesis device", developed to quickly terraform barren planets, in the movie Star Trek II: The Wrath of Khan. A similar device exists in the animated feature film Titan A.E., which depicts the eponymous ship Titan, capable of creating a planet. The word 'terraforming' was used in James Cameron's Aliens to describe the act of processing a planet's atmosphere through nuclear reactors over several decades in order to make it habitable. The 2000 movie Red Planet also uses the motif: after humanity faces heavy overpopulation and pollution on Earth, uncrewed space probes loaded with algae are sent to Mars with the aim of terraforming and creating a breathable atmosphere. The television series Firefly and its cinematic sequel Serenity (circa 2517) are set in a solar system with about seventy terraformed planets and moons. In the 2008 video game Spore, the player is able to terraform any planet by using either terraforming rays or a "Staff of Life" that completely terraforms the planet and fills it with creatures. The Doctor Who episode "The Doctor's Daughter" also references terraforming: a glass orb is broken to release gases that terraform the planet the characters are on at the time. In Ridley Scott's 2012 Prometheus, one crew member bets another that the purpose of their visit is terraforming.

In the video game Halo (2001), the main setting is an ancient ring-shaped structure whose radius is nearly that of Earth; the structure is terraformed to support an Earth-like ecosystem. The rings are created using Forerunner technology, and terraformed during their construction by an extra-galactic construct known as The Ark or Installation 00. Various works of fiction based on Halo also mention the terraforming of planets.[41]

John Christopher's "Tripods" trilogy has a twist on terraforming. Aliens have conquered the Earth. They live in three domed cities located in Germany, China, and Panama, where they breathe an atmosphere poisonous to Earth life (probably containing chlorine). As the plot unfolds, the protagonist determines that the aliens are awaiting the arrival of another ship from their home star carrying the equipment for them to terraform (or alienscape) the Earth. If this occurs, all Earth life will be wiped out by the poisoned atmosphere. In M. Night Shyamalan's After Earth, the planet Nova Prime has been terraformed to support human life, as Earth has become unsuitable for humanity (e.g. violent thermal shifts).

In Zack Snyder's Man of Steel, General Zod attempts to use terraforming to revive the environment of planet Krypton on Earth.

Thursday, August 14, 2014

Stinky Landfill Gases Could Transform Into Clean Energy

Thursday, August 14, 2014 2:34 from Before It's News

A new technique that transforms stinky, air-polluting landfill gas could produce the sweet smell of success as it leads to development of a fuel cell generating clean electricity for homes, offices and hospitals, researchers say. The advance would convert methane gas into hydrogen, an efficient, clean form of energy.
 
Researchers have devised a catalyst that could one day turn the smelly gases at this landfill in Niterói, Brazil, into clean hydrogen fuel.
 
 
Credit: Luiz Almeida
The researchers’ report is part of the 248th National Meeting of the American Chemical Society (ACS), the world’s largest scientific society.

The meeting, attended by thousands of scientists, features nearly 12,000 reports on new advances in science and other topics. It is being held in San Francisco through Thursday.

Recently, hydrogen has received much attention as a clean alternative to fossil fuels, which release carbon dioxide — the main greenhouse gas — when burned. Hydrogen, however, only emits water vapor when it is burned. For this reason, some companies are developing hydrogen fuel cells for automobiles and homes.

One way to do this is to convert methane, another greenhouse gas, to hydrogen by reacting it with carbon dioxide. And smelly landfills are excellent sources of these gases — microbes living in the waste produce large amounts of methane and carbon dioxide as by-products.

But researchers have faced challenges bringing this idea to reality. For example, finding a proper catalyst has been a major hurdle, says Fabio B. Noronha, Ph.D., who is with the National Institute of Technology in Rio de Janeiro, Brazil. A catalyst is a substance that speeds up processes that otherwise would occur slowly. In this case, researchers are using catalysts to help turn methane and carbon dioxide into hydrogen and carbon monoxide. The problem is that carbon, which forms as a contaminant during the process, deposits onto the catalyst.
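The conversion the researchers describe is known as dry reforming of methane. As a minimal sketch (not code from the ACS report), one can check that the balanced reaction CH4 + CO2 → 2 CO + 2 H2 conserves every element:

```python
# Dry reforming of methane: CH4 + CO2 -> 2 CO + 2 H2.
# Check that each element balances across the reaction.
from collections import Counter

def atoms(formula_counts, coefficient=1):
    """Scale an atom-count dict by a stoichiometric coefficient."""
    return Counter({el: n * coefficient for el, n in formula_counts.items()})

CH4 = {"C": 1, "H": 4}
CO2 = {"C": 1, "O": 2}
CO  = {"C": 1, "O": 1}
H2  = {"H": 2}

reactants = atoms(CH4) + atoms(CO2)
products  = atoms(CO, 2) + atoms(H2, 2)

print(reactants == products)  # True: C, H and O all balance
```

The carbon deposition the article describes is a side reaction on the catalyst surface, not part of this ideal stoichiometry.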

“The heart of the process for the production of hydrogen from landfill gas is the catalyst, and this can be disrupted by the presence of carbon,” Noronha explains. “Because of carbon deposition, the catalyst loses the capacity to convert the landfill gases into hydrogen.”

To solve this problem, Noronha’s team developed a new catalyst material that removes the carbon as soon as it is formed, he says. This approach is based on the automotive catalysts developed in the past to control car and truck emissions, he adds. The material is a perovskite-type oxide supported on ceria, a component of ceramics.

Right now, the researchers are working on the reaction in the laboratory, but the new, highly stable catalyst should be ideal for commercialization. As a step in that direction, the team plans to test it on a larger scale using material from a local landfill, says Noronha.

The researchers acknowledge funding from the Fundação Carlos Chagas Filho de Amparo à Pesquisa do Estado do Rio de Janeiro, the São Paulo Research Foundation, and the Conselho Nacional de Desenvolvimento Científico e Tecnológico.


Contacts and sources:
Michael Bernstein
Katie Cottingham, Ph.D.
American Chemical Society 

Proof confirmed of 400-year-old fruit-stacking problem


  • 11:42 12 August 2014 by Jacob Aron for New Scientist

A computer-verified proof of a 400-year-old problem could pave the way for a new era of mathematics, in which machines do the grunt work and leave humans free for deeper thinking.
 
The problem is a puzzle familiar to greengrocers everywhere: what is the best way to stack a collection of spherical objects, such as a display of oranges for sale? In 1611 Johannes Kepler suggested that a pyramid arrangement was the most efficient, but couldn't prove it.
Now, a mathematician has announced the completion of an epic quest to formally prove the so-called Kepler conjecture. "An enormous burden has been lifted from my shoulders," says Thomas Hales of the University of Pittsburgh, Pennsylvania, who led the work. "I suddenly feel ten years younger!"
Hales first presented a proof that Kepler's intuition was correct in 1998. Although there are infinitely many ways to stack infinitely many spheres, most are variations on only a few thousand themes. Hales broke the problem down into the thousands of possible sphere arrangements that mathematically represent the infinite possibilities, and used software to check them all.
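For reference, the density Kepler conjectured to be optimal, and Hales proved so, is π/√18 ≈ 74.05%. A quick numerical check of that constant (not part of the proof itself), compared with the much looser simple cubic packing:

```python
import math

# Density of the face-centred cubic / pyramid ("cannonball") packing
# that Kepler conjectured, and Hales proved, to be optimal.
fcc_density = math.pi / math.sqrt(18)    # == pi / (3 * sqrt(2))

# For comparison: simple cubic packing (spheres stacked directly on
# top of one another), which wastes far more space.
simple_cubic_density = math.pi / 6

print(f"FCC:          {fcc_density:.4%}")
print(f"Simple cubic: {simple_cubic_density:.4%}")
```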
 
But the proof was a 300-page monster that took 12 reviewers four years to check for errors. Even when it was published in the journal Annals of Mathematics in 2005, the reviewers could say only that they were "99 per cent certain" the proof was correct.

So in 2003, Hales started the Flyspeck project, an effort to vindicate his proof through formal verification. His team used two formal proof software assistants called Isabelle and HOL Light, both of which are built on a small kernel of logic that has been intensely scrutinised for any errors – this provides a foundation which ensures the computer can check any series of logical statements to confirm they are true.
 
On Sunday, the Flyspeck team announced they had finally translated the dense mathematics of Hales's proof into computerised form, and verified that it is indeed correct.
"This technology cuts the mathematical referees out of the verification process," says Hales. "Their opinion about the correctness of the proof no longer matters."
 
"It has been a huge effort," says Alan Bundy of the University of Edinburgh, UK, who was not involved in the work. He adds that he hopes Flyspeck's success will inspire other mathematicians to start using proof assistants. "A world-famous mathematician has turned his hand toward automated theorem proving, that kind of sociological fact is very important," he says. "This is a case study which could start to become the norm."
 
Ideally, proof assistants would work in the background as mathematicians puzzled through new ideas. Software can already prove some basic concepts by itself, but it could be easier to use. "We need some way of exploring the proof, getting a big picture," says Bundy. "To see everything in all the gory detail is just beyond us, as humans we can't absorb that much."
 
As for Hales, he's ready to move on. "I have a box full of ideas that I have set aside while working on this formal proof," he says. "Let's hope that the next project does not take 20 years!"


Are Apes as Empathetic as Humans?


In the absence of complex emotional bonds, humans and bonobos show similar empathy, according to a study.
 
August 12, 2014, for The Scientist
 
WIKIMEDIA, CHRISTINA BERGEY

Contagious yawning has long been linked to empathy: humans and apes yawn more in response to the yawns of their kin and friends. Now, scientists studying yawn contagion have shown that humans may not always be the most empathetic species. Their results, published today (August 12) in PeerJ, show that humans yawn more than bonobos only when close family and friends trigger the yawns. In the presence of mere acquaintances, however, humans and bonobos showed similar yawn sensitivity.

“It seems that the basal level of empathetic capacity is the same in the two species,” said Elisabetta Palagi from the Natural History Museum at the University of Pisa in Italy, who co-led the study. “But when an emotional bonding comes into play, people overcome bonobos.”

Matthew Campbell, a primatologist at California State University Channel Islands who was not involved with the work, agreed. “These are interesting results,” Campbell wrote in an e-mail to The Scientist. “They show that the most basic form of empathy, also called emotional contagion, appears to work very similarly in humans and bonobos, who are as closely related to us as chimpanzees.”

Several studies have examined empathy in great apes and humans separately. But the present study is the first to directly compare empathetic abilities between species, according to Palagi. “This has probably not been done before because quantifying empathy in animals is difficult,” she said. “In addition, most empathy studies in humans have used questionnaires and direct interviews, which cannot be replicated in bonobos.”

Yawning, however, is easily detected and morphologically identical across species. Palagi’s team had previously found that both humans and bonobos show yawn contagion according to an empathetic gradient—both yawn more frequently and more often among kin and friends, followed by acquaintances, and then strangers. “So we thought of comparing the two species, and evaluating their level of sensitivity to others’ yawns as a factor of relationship quality and bonding,” she explained.
The team hypothesized that if humans are the most empathetic species, then they should show the most empathetic behaviors at all times. Between 2009 and 2013, the team observed yawn contagion among 33 humans going about their everyday activities in different locations—restaurants, offices, and waiting rooms—recording 1,375 yawns over 380 hours. Similarly, the team observed yawn contagion among 16 bonobos in two zoos while the bonobo colonies rested, moved, and foraged, recording nearly 2,130 yawns over 800 hours.
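As a back-of-envelope check, the counts reported above imply the following raw observation rates (overall yawns observed per hour, not the contagion statistic the study actually analysed):

```python
# Raw observation rates implied by the counts reported in the study.
human_yawns, human_hours = 1375, 380
bonobo_yawns, bonobo_hours = 2130, 800

human_rate = human_yawns / human_hours      # ~3.6 yawns observed per hour
bonobo_rate = bonobo_yawns / bonobo_hours   # ~2.7 yawns observed per hour

print(f"humans:  {human_rate:.2f}/h")
print(f"bonobos: {bonobo_rate:.2f}/h")
```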

“The authors have done a fantastic job in keeping the methods as similar as possible across two species,” Jingzhi Tan, who studies human and bonobo cognition and behavior at Duke University and was not involved in the study, told The Scientist in an e-mail. “This is incredibly challenging because nonhuman primates would not walk in your testing rooms, listen to your instructions word-by-word, and sit down patiently to work on your tasks.”

Contrary to the team’s hypothesis, people yawned more frequently and more promptly than bonobos only when the yawners and responders were close friends and family. However, when the yawners and responders were familiar, but not emotionally close, the two species yawned just as frequently and promptly.

“This suggests that the way emotional contagion works may not have changed much since humans split off from chimpanzees and bonobos 5 [million] to 7 million years ago,” said Campbell. “Rather, what appears to have changed is something about deep emotional connections that humans have with close family and friends.  Humans could bond with others more intensely than bonobos, or humans could do this more frequently than bonobos.”

“All of this seems in line with our general expectations about empathy-driven behavior,” primatologist Frans de Waal of Emory University who was not part of the study told The Scientist in an e-mail. “Social closeness and familiarity matter.”

Campbell, however, pointed out a potential limitation of the work. “These bonobos were in a small captive group where their choices of individuals to interact with were severely constrained, but the humans were not,” he said. “As difficult as it would be to do, studying bonobos in the wild may be necessary to rule out that the lack of choice prevented bonobos from forming deep bonds with others.”

“As for further exploration, it would be great to add more distant relatives of humans in the future,” he added. “By testing the other apes, adding monkeys, and some non-primate species, we could start to get a picture of how emotional contagion evolved in mammals, all of whom show this ability to some extent.”

E. Palagi et al., “Yawn contagion in humans and bonobos: emotional affinity matters more than species,” PeerJ, 2:e519, 2014.
