
Thursday, September 24, 2015

Planetary habitability


From Wikipedia, the free encyclopedia


Understanding planetary habitability is partly an extrapolation of the conditions on Earth, as this is the only planet known to support life.

Planetary habitability is the measure of a planet's or a natural satellite's potential to develop and sustain life. Life may develop directly on a planet or satellite or be transferred to it from another body, a theoretical process known as panspermia. As the existence of life beyond Earth is unknown, planetary habitability is largely an extrapolation of conditions on Earth and the characteristics of the Sun and Solar System which appear favourable to life's flourishing—in particular those factors that have sustained complex, multicellular organisms and not just simpler, unicellular creatures. Research and theory in this regard is a component of planetary science and the emerging discipline of astrobiology.

An absolute requirement for life is an energy source, and the notion of planetary habitability implies that many other geophysical, geochemical, and astrophysical criteria must be met before an astronomical body can support life. In its astrobiology roadmap, NASA has defined the principal habitability criteria as "extended regions of liquid water,[1] conditions favourable for the assembly of complex organic molecules, and energy sources to sustain metabolism."[2]

In determining the habitability potential of a body, studies focus on its bulk composition, orbital properties, atmosphere, and potential chemical interactions. Stellar characteristics of importance include mass and luminosity, stable variability, and high metallicity. Rocky, terrestrial-type planets and moons with the potential for Earth-like chemistry are a primary focus of astrobiological research, although more speculative habitability theories occasionally examine alternative biochemistries and other types of astronomical bodies.

The idea that planets beyond Earth might host life is an ancient one, though historically it was framed by philosophy as much as physical science.[a] The late 20th century saw two breakthroughs in the field. The observation and robotic spacecraft exploration of other planets and moons within the Solar System have provided critical information on defining habitability criteria and allowed for substantial geophysical comparisons between the Earth and other bodies. The discovery of extrasolar planets, beginning in the early 1990s[3][4] and accelerating thereafter, has provided further information for the study of possible extraterrestrial life. These findings confirm that the Sun is not unique among stars in hosting planets and expand the habitability research horizon beyond the Solar System.

The chemistry of life may have begun shortly after the Big Bang, 13.8 billion years ago, during a habitable epoch when the Universe was only 10–17 million years old.[5][6] According to the panspermia hypothesis, microscopic life—distributed by meteoroids, asteroids and other small Solar System bodies—may exist throughout the universe.[7] Nonetheless, Earth is the only place in the universe known to harbor life.[8][9] Estimates of habitable zones around other stars,[10][11] along with the discovery of hundreds of extrasolar planets and new insights into the extreme habitats here on Earth, suggest that there may be many more habitable places in the universe than considered possible until very recently.[12] On 4 November 2013, astronomers reported, based on Kepler space mission data, that there could be as many as 40 billion Earth-sized planets orbiting in the habitable zones of Sun-like stars and red dwarfs within the Milky Way.[13][14] 11 billion of these estimated planets may be orbiting Sun-like stars.[15] The nearest such planet may be 12 light-years away, according to the scientists.[13][14]

Suitable star systems

An understanding of planetary habitability begins with stars. While bodies that are generally Earth-like may be plentiful, it is just as important that their larger system be agreeable to life. Under the auspices of SETI's Project Phoenix, scientists Margaret Turnbull and Jill Tarter developed the "HabCat" (or Catalogue of Habitable Stellar Systems) in 2002. The catalogue was formed by winnowing the nearly 120,000 stars of the larger Hipparcos Catalogue into a core group of 17,000 "HabStars", and the selection criteria that were used provide a good starting point for understanding which astrophysical factors are necessary to habitable planets.[16] According to research published in August 2015, very large galaxies may be more favorable to the creation and development of habitable planets than smaller galaxies, like the Milky Way galaxy.[17]

Spectral class

The spectral class of a star indicates its photospheric temperature, which (for main-sequence stars) correlates to overall mass. The appropriate spectral range for "HabStars" is considered to run from "early F" or "G" to "mid-K". This corresponds to temperatures of a little more than 7,000 K down to a little more than 4,000 K (6,700 °C to 3,700 °C); the Sun, a G2 star at 5,777 K, is well within these bounds. "Middle-class" stars of this sort have a number of characteristics considered important to planetary habitability:
  • They live at least a few billion years, allowing life a chance to evolve. More luminous main-sequence stars of the "O", "B", and "A" classes usually live less than a billion years and in exceptional cases less than 10 million.[18][b] (A rough mass–lifetime scaling is sketched after this list.)
  • They emit enough high-frequency ultraviolet radiation to trigger important atmospheric dynamics such as ozone formation, but not so much that ionisation destroys incipient life.[19]
  • Liquid water may exist on the surface of planets orbiting them at a distance that does not induce tidal locking. K-type stars may be able to support life for long periods, far longer than the Sun.[20]
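The steep dependence of main-sequence lifetime on stellar mass can be illustrated with the rough textbook scaling t ~ 10 Gyr * (M/M_sun)^-2.5; the short Python sketch below assumes only that approximation, not any detailed stellar model.

    # Minimal sketch assuming the rough scaling t ~ 10 Gyr * (M/M_sun)^-2.5.
    def main_sequence_lifetime_gyr(mass_solar):
        """Approximate main-sequence lifetime in billions of years."""
        return 10.0 * mass_solar ** -2.5

    for mass in (0.5, 1.0, 1.5, 3.0, 10.0):   # K dwarf, Sun, F star, B/A star, O/B star
        print(f"{mass:4.1f} M_sun -> ~{main_sequence_lifetime_gyr(mass):7.2f} Gyr")
    # A 10 M_sun star yields only ~0.03 Gyr, consistent with the short lifetimes of
    # O, B and A class stars noted in the first point above.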
This spectral range probably accounts for between 5% and 10% of stars in the local Milky Way galaxy. Whether fainter late-K and M class red dwarf stars are also suitable hosts for habitable planets is perhaps the most important open question in the entire field of planetary habitability, given their prevalence (see Red dwarf systems below). Gliese 581 c, a "super-Earth", has been found orbiting in the "habitable zone" of a red dwarf and may possess liquid water. However, it is also possible that a greenhouse effect may render it too hot to support life, while its neighbor, Gliese 581 d, may be a more likely candidate for habitability.[21] In September 2010, the discovery was announced of another planet, Gliese 581 g, in an orbit between these two planets. However, reviews of the discovery have placed the existence of this planet in doubt, and it is listed as "unconfirmed". In September 2012, the discovery of two planets orbiting Gliese 163[22] was announced.[23][24] One of the planets, Gliese 163 c, about 6.9 times the mass of Earth and somewhat hotter, was considered to be within the habitable zone.[23][24]

A recent study suggests that cooler stars that emit more of their light in the infrared and near-infrared may host warmer planets with less ice and a lower incidence of snowball states, because these wavelengths are absorbed by the planets' ice and greenhouse gases, keeping the surface warmer.[25][26]

A stable habitable zone

The habitable zone (HZ) is a shell-shaped region of space surrounding a star in which a planet could maintain liquid water on its surface. After an energy source, liquid water is considered the most important ingredient for life, given how integral it is to all life systems on Earth. This may reflect the known dependence of life on water; however, if life is discovered in the absence of water, the definition of an HZ may have to be greatly expanded.
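Because the flux a planet receives falls off with the square of its distance, a first-order estimate of the HZ simply scales the Sun's HZ boundaries by the square root of the star's luminosity. The sketch below assumes a conservative solar HZ of roughly 0.95 to 1.37 AU; both that range and the example luminosity are illustrative assumptions rather than values from the text.

    import math

    # Minimal sketch: scale an assumed conservative solar HZ (~0.95-1.37 AU) to another
    # star by requiring equal stellar flux, i.e. distance ~ sqrt(L / L_sun).
    SUN_HZ_INNER_AU = 0.95
    SUN_HZ_OUTER_AU = 1.37

    def habitable_zone_au(luminosity_solar):
        scale = math.sqrt(luminosity_solar)
        return SUN_HZ_INNER_AU * scale, SUN_HZ_OUTER_AU * scale

    inner, outer = habitable_zone_au(0.02)   # a red dwarf with 2% of the Sun's luminosity
    print(f"HZ roughly {inner:.2f}-{outer:.2f} AU")   # ~0.13-0.19 AU, inside Mercury's orbit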
A "stable" HZ implies two factors. First, the range of an HZ should not vary greatly over time. All stars increase in luminosity as they age, and a given HZ thus migrates outwards, but if this happens too quickly (for example, with a super-massive star) planets may only have a brief window inside the HZ and a correspondingly smaller chance of developing life. Calculating an HZ range and its long-term movement is never straightforward, as negative feedback loops such as the CNO cycle will tend to offset the increases in luminosity. Assumptions made about atmospheric conditions and geology thus have as great an impact on a putative HZ range as does stellar evolution: the proposed parameters of the Sun's HZ, for example, have fluctuated greatly.[27]

Second, no large-mass body such as a gas giant should be present in or relatively close to the HZ, where it would disrupt the formation of Earth-like bodies. The matter in the asteroid belt, for example, appears to have been unable to accrete into a planet due to orbital resonances with Jupiter; if the giant had appeared in the region that is now between the orbits of Venus and Mars, Earth would almost certainly not have developed in its present form. However, a gas giant inside the HZ might have habitable moons under the right conditions.[28]

In the Solar System, the inner planets are terrestrial, and the outer ones are gas giants, but discoveries of extrasolar planets suggest that this arrangement may not be at all common: numerous Jupiter-sized bodies have been found in close orbit about their primary, disrupting potential HZs. However, present data for extrasolar planets is likely to be skewed towards that type (large planets in close orbits) because they are far easier to identify; thus it remains to be seen which type of planetary system is the norm, or indeed if there is one.

Low stellar variation

Changes in luminosity are common to all stars, but the severity of such fluctuations covers a broad range. Most stars are relatively stable, but a significant minority of variable stars often undergo sudden and intense increases in luminosity and consequently in the amount of energy radiated toward bodies in orbit. These stars are considered poor candidates for hosting life-bearing planets, as their unpredictability and energy output changes would negatively impact organisms: living things adapted to a specific temperature range could not survive too great a temperature variation. Further, upswings in luminosity are generally accompanied by massive doses of gamma ray and X-ray radiation which might prove lethal. Atmospheres do mitigate such effects, but planets orbiting variable stars might not retain their atmospheres at all, because the high-frequency energy buffeting them would continually strip away this protective covering.
The Sun, in this respect as in many others, is relatively benign: the variation between its maximum and minimum energy output is roughly 0.1% over its 11-year solar cycle. There is strong (though not undisputed) evidence that even minor changes in the Sun's luminosity have had significant effects on the Earth's climate well within the historical era: the Little Ice Age of the mid-second millennium, for instance, may have been caused by a relatively long-term decline in the Sun's luminosity.[29] Thus, a star does not have to be a true variable for differences in luminosity to affect habitability. Of known solar analogs, one that closely resembles the Sun is considered to be 18 Scorpii; unfortunately for the prospects of life existing in its proximity, the only significant difference between the two bodies is the amplitude of the solar cycle, which appears to be much greater for 18 Scorpii.[30]

High metallicity

While the bulk of material in any star is hydrogen and helium, there is a great variation in the amount of heavier elements (metals) that stars contain. A high proportion of metals in a star correlates to the amount of heavy material initially available in the protoplanetary disk. A smaller amount of metal makes the formation of planets much less likely, under the solar nebula theory of planetary system formation. Any planets that did form around a metal-poor star would probably be low in mass, and thus unfavorable for life. Spectroscopic studies of systems where exoplanets have been found to date confirm the relationship between high metal content and planet formation: "Stars with planets, or at least with planets similar to the ones we are finding today, are clearly more metal rich than stars without planetary companions."[31] This relationship between high metallicity and planet formation also means that habitable systems are more likely to be found around younger stars, since stars that formed early in the universe's history have low metal content.

Planetary characteristics


The moons of some gas giants could potentially be habitable.[32]

The chief assumption about habitable planets is that they are terrestrial. Such planets, roughly within one order of magnitude of Earth mass, are primarily composed of silicate rocks, and have not accreted the gaseous outer layers of hydrogen and helium found on gas giants. That life could evolve in the cloud tops of giant planets has not been decisively ruled out,[c] though it is considered unlikely, as they have no surface and their gravity is enormous.[35] The natural satellites of giant planets, meanwhile, remain valid candidates for hosting life.[32]

In February 2011 the Kepler Space Observatory Mission team released a list of 1235 extrasolar planet candidates, including 54 that may be in the habitable zone.[36][37] Six of the candidates in this zone are smaller than twice the size of Earth.[36] A more recent study found that one of these candidates (KOI 326.01) is much larger and hotter than first reported.[38] Based on the findings, the Kepler team estimated there to be "at least 50 billion planets in the Milky Way" of which "at least 500 million" are in the habitable zone.[39]

In analyzing which environments are likely to support life, a distinction is usually made between simple, unicellular organisms such as bacteria and archaea and complex metazoans (animals). Unicellularity necessarily precedes multicellularity in any hypothetical tree of life, and where single-celled organisms do emerge there is no assurance that greater complexity will then develop.[d] The planetary characteristics listed below are considered crucial for life generally, but in every case multicellular organisms are more picky than unicellular life.

Mass


Mars, with its rarefied atmosphere, is colder than the Earth would be if it were at a similar distance from the Sun.

Low-mass planets are poor candidates for life for two reasons. First, their lesser gravity makes atmosphere retention difficult. Constituent molecules are more likely to reach escape velocity and be lost to space when buffeted by solar wind or stirred by collision. Planets without a thick atmosphere lack the matter necessary for primal biochemistry, have little insulation and poor heat transfer across their surfaces (for example, Mars, with its thin atmosphere, is colder than the Earth would be if it were at a similar distance from the Sun), and provide less protection against meteoroids and high-frequency radiation. Further, where an atmosphere is less dense than 0.006 Earth atmospheres, water cannot exist in liquid form as the required atmospheric pressure, 4.56 mm Hg (608 Pa) (0.18 inch Hg), does not occur. The temperature range at which water is liquid is smaller at low pressures generally.
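The pressure threshold quoted above is the same value expressed in different units, as a quick conversion check shows (standard conversion factors assumed):

    # Quick unit check of the liquid-water pressure threshold quoted above.
    PA_PER_ATM = 101325.0    # pascals per standard atmosphere
    PA_PER_MMHG = 133.322    # pascals per millimetre of mercury

    threshold_pa = 0.006 * PA_PER_ATM
    print(f"{threshold_pa:.0f} Pa")                    # ~608 Pa
    print(f"{threshold_pa / PA_PER_MMHG:.2f} mm Hg")   # ~4.56 mm Hg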

Second, smaller planets have smaller diameters and thus higher surface-to-volume ratios than their larger cousins. Such bodies tend to lose the energy left over from their formation quickly and end up geologically dead, lacking the volcanoes, earthquakes and tectonic activity which supply the surface with life-sustaining material and the atmosphere with temperature moderators like carbon dioxide. Plate tectonics appear particularly crucial, at least on Earth: not only does the process recycle important chemicals and minerals, it also fosters bio-diversity through continent creation and increased environmental complexity and helps create the convective cells necessary to generate Earth's magnetic field.[40]

"Low mass" is partly a relative label: the Earth is low mass when compared to the Solar System's gas giants, but it is the largest, by diameter and mass, and the densest of all terrestrial bodies.[e] It is large enough to retain an atmosphere through gravity alone and large enough that its molten core remains a heat engine, driving the diverse geology of the surface (the decay of radioactive elements within a planet's core is the other significant component of planetary heating). Mars, by contrast, is nearly (or perhaps totally) geologically dead and has lost much of its atmosphere.[41] Thus it would be fair to infer that the lower mass limit for habitability lies somewhere between that of Mars and that of Earth or Venus: 0.3 Earth masses has been offered as a rough dividing line for habitable planets.[42] However, a 2008 study by the Harvard-Smithsonian Center for Astrophysics suggests that the dividing line may be higher. Earth may in fact lie on the lower boundary of habitability: if it were any smaller, plate tectonics would be impossible. Venus, which has 85% of Earth's mass, shows no signs of tectonic activity. Conversely, "super-Earths", terrestrial planets with higher masses than Earth, would have higher levels of plate tectonics and thus be firmly placed in the habitable range.[43]

Exceptional circumstances do offer exceptional cases: Jupiter's moon Io (which is smaller than any of the terrestrial planets) is volcanically dynamic because of the gravitational stresses induced by its orbit, and its neighbor Europa may have a liquid ocean or icy slush underneath a frozen shell also due to power generated from orbiting a gas giant.

Saturn's Titan, meanwhile, has an outside chance of harbouring life, as it has retained a thick atmosphere and has liquid methane seas on its surface. Organic-chemical reactions that only require minimum energy are possible in these seas, but whether any living system can be based on such minimal reactions is unclear, and would seem unlikely. These satellites are exceptions, but they prove that mass, as a criterion for habitability, cannot necessarily be considered definitive at this stage of our understanding.[citation needed]

A larger planet is likely to have a more massive atmosphere. A combination of higher escape velocity to retain lighter atoms, and extensive outgassing from enhanced plate tectonics may greatly increase the atmospheric pressure and temperature at the surface compared to Earth. The enhanced greenhouse effect of such a heavy atmosphere would tend to suggest that the habitable zone should be further out from the central star for such massive planets.

Finally, a larger planet is likely to have a large iron core. This allows for a magnetic field to protect the planet from stellar wind and cosmic radiation, which otherwise would tend to strip away planetary atmosphere and to bombard living things with ionized particles. Mass is not the only criterion for producing a magnetic field—as the planet must also rotate fast enough to produce a dynamo effect within its core[44]—but it is a significant component of the process.

Orbit and rotation

As with other criteria, stability is the critical consideration in evaluating the effect of orbital and rotational characteristics on planetary habitability. Orbital eccentricity is the difference between a planet's farthest and closest approach to its parent star divided by the sum of said distances. It is a ratio describing the shape of the elliptical orbit. The greater the eccentricity the greater the temperature fluctuation on a planet's surface. Although they are adaptive, living organisms can stand only so much variation, particularly if the fluctuations overlap both the freezing point and boiling point of the planet's main biotic solvent (e.g., water on Earth). If, for example, Earth's oceans were alternately boiling and freezing solid, it is difficult to imagine life as we know it having evolved. The more complex the organism, the greater the temperature sensitivity.[45] The Earth's orbit is almost wholly circular, with an eccentricity of less than 0.02; other planets in the Solar System (with the exception of Mercury) have eccentricities that are similarly benign.
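Because stellar flux falls off as the square of distance, the eccentricity defined above translates directly into the flux swing a planet experiences between closest and farthest approach; the sketch below encodes only that relation, and the example eccentricities are illustrative.

    def flux_ratio(eccentricity):
        """Ratio of stellar flux at periastron, a(1 - e), to that at apastron, a(1 + e)."""
        e = eccentricity
        return ((1 + e) / (1 - e)) ** 2

    print(flux_ratio(0.017))   # a near-circular, Earth-like orbit: ~1.07, a ~7% flux swing
    print(flux_ratio(0.25))    # eccentricity typical of known exoplanets: ~2.8x flux swing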

Data collected on the orbital eccentricities of extrasolar planets has surprised most researchers: 90% have an orbital eccentricity greater than that found within the Solar System, and the average is fully 0.25.[46] This means that the vast majority of known exoplanets have highly eccentric orbits; even if their average distance from their star is deemed to be within the HZ, they would nonetheless spend only a small portion of their time within the zone.

A planet's movement around its rotational axis must also meet certain criteria if life is to have the opportunity to evolve. A first assumption is that the planet should have moderate seasons. If there is little or no axial tilt (or obliquity) relative to the perpendicular of the ecliptic, seasons will not occur and a main stimulant to biospheric dynamism will disappear. The planet would also be colder than it would be with a significant tilt: when the greatest intensity of radiation is always within a few degrees of the equator, warm weather cannot move poleward and a planet's climate becomes dominated by colder polar weather systems.

If a planet is radically tilted, meanwhile, seasons will be extreme and make it more difficult for a biosphere to achieve homeostasis. The axial tilt of the Earth is higher now (in the Quaternary) than it has been in the past, coinciding with reduced polar ice, warmer temperatures and less seasonal variation. Scientists do not know whether this trend will continue indefinitely with further increases in axial tilt (see Snowball Earth).

The exact effects of these changes can only be computer modelled at present, and studies have shown that even extreme tilts of up to 85 degrees do not absolutely preclude life "provided it does not occupy continental surfaces plagued seasonally by the highest temperature."[47] Not only the mean axial tilt, but also its variation over time must be considered. The Earth's tilt varies between 21.5 and 24.5 degrees over 41,000 years. A more drastic variation, or a much shorter periodicity, would induce climatic effects such as variations in seasonal severity.

Other orbital considerations include:
  • The planet should rotate relatively quickly so that the day-night cycle is not overlong. If a day takes years, the temperature differential between the day and night side will be pronounced, and problems similar to those noted with extreme orbital eccentricity will come to the fore.
  • The planet also should rotate quickly enough so that a magnetic dynamo may be started in its iron core to produce a magnetic field.
  • Change in the direction of the axis of rotation (precession) should not be pronounced. In itself, precession need not affect habitability as it changes the direction of the tilt, not its degree. However, precession tends to accentuate variations caused by other orbital deviations; see Milankovitch cycles. Precession on Earth occurs over a 26,000-year cycle.
The Earth's Moon appears to play a crucial role in moderating the Earth's climate by stabilising the axial tilt. It has been suggested that a chaotic tilt may be a "deal-breaker" in terms of habitability—i.e. a satellite the size of the Moon is not only helpful but required to produce stability.[48] This position remains controversial.[f]

Geochemistry

It is generally assumed that any extraterrestrial life that might exist will be based on the same fundamental biochemistry as found on Earth, as the four elements most vital for life, carbon, hydrogen, oxygen, and nitrogen, are also the most common chemically reactive elements in the universe. Indeed, simple biogenic compounds, such as the very simple amino acid glycine, have been found in meteorites and in the interstellar medium.[49] These four elements together comprise over 96% of Earth's collective biomass. Carbon has an unparalleled ability to bond with itself and to form a massive array of intricate and varied structures, making it an ideal material for the complex mechanisms that form living cells. Hydrogen and oxygen, in the form of water, compose the solvent in which biological processes take place and in which the first reactions occurred that led to life's emergence. The energy released in the formation of powerful covalent bonds between carbon and oxygen, available by oxidizing organic compounds, is the fuel of all complex life-forms. These four elements together make up amino acids, which in turn are the building blocks of proteins, the substance of living tissue. In addition, neither sulfur, required for the building of proteins, nor phosphorus, needed for the formation of DNA, RNA, and the adenosine phosphates essential to metabolism, is rare.
Relative abundance in space does not always mirror differentiated abundance within planets; of the four life elements, for instance, only oxygen is present in any abundance in the Earth's crust.[50] This can be partly explained by the fact that many of these elements, such as hydrogen and nitrogen, along with their simplest and most common compounds, such as carbon dioxide, carbon monoxide, methane, ammonia, and water, are gaseous at warm temperatures. In the hot region close to the Sun, these volatile compounds could not have played a significant role in the planets' geological formation. Instead, they were trapped as gases underneath the newly formed crusts, which were largely made of rocky, involatile compounds such as silica (a compound of silicon and oxygen, accounting for oxygen's relative abundance). Outgassing of volatile compounds through the first volcanoes would have contributed to the formation of the planets' atmospheres. The Miller–Urey experiment showed that, with the application of energy, simple inorganic compounds exposed to a primordial atmosphere can react to synthesize amino acids.[51]

Even so, volcanic outgassing could not have accounted for the amount of water in Earth's oceans.[52] The vast majority of the water, and arguably carbon, necessary for life must have come from the outer Solar System, away from the Sun's heat, where it could remain solid. Comets impacting the Earth in the Solar System's early years would have deposited vast amounts of water, along with the other volatile compounds life requires, onto the early Earth, providing a kick-start to the origin of life.

Thus, while there is reason to suspect that the four "life elements" ought to be readily available elsewhere, a habitable system probably also requires a supply of long-term orbiting bodies to seed inner planets. Without comets there is a possibility that life as we know it would not exist on Earth.

Microenvironments and extremophiles


The Atacama Desert provides an analog to Mars and an ideal environment to study the boundary between sterility and habitability.

One important qualification to habitability criteria is that only a tiny portion of a planet is required to support life. Astrobiologists often concern themselves with "micro-environments", noting that "we lack a fundamental understanding of how evolutionary forces, such as mutation, selection, and genetic drift, operate in micro-organisms that act on and respond to changing micro-environments."[53] Extremophiles are Earth organisms that live in niche environments under severe conditions generally considered inimical to life. Usually (although not always) unicellular, extremophiles include acutely alkaliphilic and acidophilic organisms and others that can survive water temperatures above 100 °C in hydrothermal vents.

The discovery of life in extreme conditions has complicated definitions of habitability, but also generated much excitement amongst researchers in greatly broadening the known range of conditions under which life can persist. For example, a planet that might otherwise be unable to support an atmosphere given the solar conditions in its vicinity, might be able to do so within a deep shadowed rift or volcanic cave.[54] Similarly, craterous terrain might offer a refuge for primitive life. The Lawn Hill crater has been studied as an astrobiological analog, with researchers suggesting rapid sediment infill created a protected microenvironment for microbial organisms; similar conditions may have occurred over the geological history of Mars.[55]

Earth environments that cannot support life are still instructive to astrobiologists in defining the limits of what organisms can endure. The heart of the Atacama Desert, generally considered the driest place on Earth, appears unable to support life, and it has been studied by NASA and ESA for precisely that reason: it provides a Mars analog, and the moisture gradients along its edges are ideal for studying the boundary between sterility and habitability.[56] The Atacama was the subject of a 2003 study that partly replicated experiments from the Viking landings on Mars in the 1970s; no DNA could be recovered from two soil samples, and incubation experiments were also negative for biosignatures.[57]

Ecological factors

The two current ecological approaches for predicting potential habitability use 19 or 20 environmental factors, with emphasis on water availability, temperature, the presence of nutrients, an energy source, and protection from solar ultraviolet and galactic cosmic radiation.[58][59]

Some habitability factors[59]
  • Water
     · Activity of liquid water
     · Past or future liquid (ice) inventories
     · Salinity, pH, and Eh of available water
  • Chemical environment
     · Nutrients: C, H, N, O, P, S, essential metals, essential micronutrients; fixed nitrogen; availability/mineralogy
     · Toxin abundances and lethality: heavy metals (e.g. Zn, Ni, Cu, Cr, As, Cd, etc.; some are essential, but toxic at high levels); globally distributed oxidizing soils
  • Energy for metabolism
     · Solar (surface and near-surface only)
     · Geochemical (subsurface): oxidants, reductants, redox gradients
  • Conducive physical conditions
     · Temperature
     · Extreme diurnal temperature fluctuations
     · Low pressure (is there a low-pressure threshold for terrestrial anaerobes?)
     · Strong ultraviolet germicidal irradiation
     · Galactic cosmic radiation and solar particle events (long-term accumulated effects)
     · Solar UV-induced volatile oxidants, e.g. O2, O, H2O2, O3
     · Climate and its variability (geography, seasons, diurnal, and eventually, obliquity variations)
     · Substrate (soil processes, rock microenvironments, dust composition, shielding)
     · High CO2 concentrations in the global atmosphere
     · Transport (aeolian, ground water flow, surface water, glacial)

Uninhabited habitats

An important distinction in habitability is between habitats that contain active life (inhabited habitats) and habitats that are habitable for life, but uninhabited.[60] Uninhabited (or vacant) habitats could arise on a planet where there was no origin of life (and no transfer of life to the planet from another, inhabited, planet), but where habitable environments exist. They might also occur on a planet that is inhabited, but the lack of connectivity between habitats might mean that many habitats remain uninhabited. Uninhabited habitats underline the importance of decoupling habitability and the presence of life. Charles Cockell and co-workers discuss Mars as one plausible world that might harbor uninhabited habitats. Other stellar systems might host planets that are habitable, but devoid of life.

Alternative star systems

In determining the feasibility of extraterrestrial life, astronomers had long focused their attention on stars like the Sun. However, since planetary systems that resemble the Solar System are proving to be rare, they have begun to explore the possibility that life might form in systems very unlike our own.

Binary systems

Estimates often suggest that 50% or more of all stellar systems are binary systems. This may be partly sample bias, as massive and bright stars tend to be in binaries and these are the most easily observed and catalogued; a more precise analysis has suggested that the more common fainter stars are usually singular, and that up to two thirds of all stellar systems are therefore solitary.[61]
The separation between stars in a binary may range from less than one astronomical unit (AU, the average Earth–Sun distance) to several hundred AU. In the latter instances, the gravitational effects will be negligible on a planet orbiting an otherwise suitable star, and habitability potential will not be disrupted unless the orbit is highly eccentric (see Nemesis, for example). However, where the separation is significantly less, a stable orbit may be impossible. If a planet's distance to its primary exceeds about one fifth of the closest approach of the other star, orbital stability is not guaranteed.[62] Whether planets might form in binaries at all had long been unclear, given that gravitational forces might interfere with planet formation. Theoretical work by Alan Boss at the Carnegie Institution has shown that gas giants can form around stars in binary systems much as they do around solitary stars.[63]
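The one-fifth rule of thumb cited above can be written as a simple check; the sketch below encodes only that stated criterion (it is not a dynamical stability calculation), and the example figures are assumed for illustration.

    def orbit_plausibly_stable(planet_sma_au, binary_sma_au, binary_eccentricity):
        """Rule of thumb from the text: the planet's distance from its primary should
        not exceed about one fifth of the companion star's closest approach."""
        closest_approach = binary_sma_au * (1.0 - binary_eccentricity)
        return planet_sma_au <= closest_approach / 5.0

    # Illustrative numbers: a planet at 1.2 AU in a binary with a 23 AU mean separation
    # and eccentricity ~0.52, giving an ~11 AU closest approach and an ~2.2 AU limit.
    print(orbit_plausibly_stable(1.2, 23.0, 0.52))   # True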

One study of Alpha Centauri, the nearest star system to the Sun, suggested that binaries need not be discounted in the search for habitable planets. Centauri A and B have an 11 AU distance at closest approach (23 AU mean), and both should have stable habitable zones. A study of long-term orbital stability for simulated planets within the system shows that planets within approximately three AU of either star may remain rather stable (i.e. the semi-major axis deviating by less than 5% during 32,000 binary periods). The HZ for Centauri A is conservatively estimated at 1.2 to 1.3 AU and that for Centauri B at 0.73 to 0.74 AU, well within the stable region in both cases.[64]

Red dwarf systems


Relative star sizes and photospheric temperatures. Any planet around a red dwarf such as the one shown here (Gliese 229A) would have to huddle close to achieve Earth-like temperatures, probably inducing tidal locking. See Aurelia. Credit: MPIA/V. Joergens.

Determining the habitability of red dwarf stars could help determine how common life in the universe might be, as red dwarfs make up 70 to 90% of all the stars in the galaxy.

Size

Astronomers for many years ruled out red dwarfs as potential abodes for life. Their small size (from 0.08 to 0.45 solar masses) means that their nuclear reactions proceed exceptionally slowly, and they emit very little light (from 3% of that produced by the Sun to as little as 0.01%). Any planet in orbit around a red dwarf would have to huddle very close to its parent star to attain Earth-like surface temperatures; from 0.3 AU (just inside the orbit of Mercury) for a star like Lacaille 8760, to as little as 0.032 AU for a star like Proxima Centauri[65] (such a world would have a year lasting just 6.3 days). At those distances, the star's gravity would cause tidal locking. One side of the planet would eternally face the star, while the other would always face away from it. The only ways in which potential life could avoid either an inferno or a deep freeze would be if the planet had an atmosphere thick enough to transfer the star's heat from the day side to the night side, or if there was a gas giant in the habitable zone, with a habitable moon, which would be locked to the planet instead of the star, allowing a more even distribution of radiation over the planet. It was long assumed that such a thick atmosphere would prevent sunlight from reaching the surface in the first place, preventing photosynthesis.
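The few-day "year" quoted above follows directly from Kepler's third law; in the sketch below the stellar mass of roughly 0.12 solar masses is an assumed value for a Proxima-like red dwarf.

    import math

    def orbital_period_days(semi_major_axis_au, stellar_mass_solar):
        """Kepler's third law in solar units: P[yr]^2 = a[AU]^3 / M[M_sun]."""
        return math.sqrt(semi_major_axis_au ** 3 / stellar_mass_solar) * 365.25

    # A planet at 0.032 AU around an assumed ~0.12 M_sun red dwarf:
    print(f"{orbital_period_days(0.032, 0.12):.1f} days")   # ~6 days, close to the figure above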

An artist's impression of GJ 667 Cc, a potentially habitable planet orbiting a red dwarf constituent in a trinary star system.

This pessimism has been tempered by research. Studies by Robert Haberle and Manoj Joshi of NASA's Ames Research Center in California have shown that a planet's atmosphere (assuming it included greenhouse gases CO2 and H2O) need only be 100 millibars, or 10% of Earth's atmosphere, for the star's heat to be effectively carried to the night side.[66] This is well within the levels required for photosynthesis, though water would still remain frozen on the dark side in some of their models. Martin Heath of Greenwich Community College has shown that seawater, too, could be effectively circulated without freezing solid if the ocean basins were deep enough to allow free flow beneath the night side's ice cap. Further research—including a consideration of the amount of photosynthetically active radiation—suggested that tidally locked planets in red dwarf systems might at least be habitable for higher plants.[67]

Other factors limiting habitability

Size is not the only factor in making red dwarfs potentially unsuitable for life, however. On a red dwarf planet, photosynthesis on the night side would be impossible, since it would never see the sun. On the day side, because the sun does not rise or set, areas in the shadows of mountains would remain so forever. Photosynthesis as we understand it would be complicated by the fact that a red dwarf produces most of its radiation in the infrared, and on the Earth the process depends on visible light. There are potential positives to this scenario. Numerous terrestrial ecosystems rely on chemosynthesis rather than photosynthesis, for instance, which would be possible in a red dwarf system. A static primary star position removes the need for plants to steer leaves toward the sun, deal with changing shade/sun patterns, or change from photosynthesis to stored energy during night. Because of the lack of a day-night cycle, including the weak light of morning and evening, far more energy would be available at a given radiation level.

Red dwarfs are far more variable and violent than their more stable, larger cousins. Often they are covered in starspots that can dim their emitted light by up to 40% for months at a time, while at other times they emit gigantic flares that can double their brightness in a matter of minutes.[68] Such variation would be very damaging for life, as it would not only destroy any complex organic molecules that could possibly form biological precursors, but also blow off sizeable portions of the planet's atmosphere.

For a planet around a red dwarf star to support life, it would need a magnetic field to protect it from the flares, which in turn requires rotation rapid enough to drive a dynamo. However, a tidally locked planet rotates only very slowly, and so cannot produce a geodynamo at its core. The violent flaring period of a red dwarf's life cycle is, however, estimated to last only roughly the first 1.2 billion years of its existence. If a planet forms far away from a red dwarf, so as to avoid tidal locking, and then migrates into the star's habitable zone after this turbulent initial period, it is possible that life may have a chance to develop.[69]

Longevity and ubiquity

There is, however, one major advantage that red dwarfs have over other stars as abodes for life: they live a long time. It took 4.5 billion years before humanity appeared on Earth, and life as we know it will see suitable conditions for 1[70] to 2.3[71] billion years more. Red dwarfs, by contrast, could live for trillions of years because their nuclear reactions are far slower than those of larger stars, meaning that life would have longer to evolve and survive.

While the odds of finding a planet in the habitable zone around any specific red dwarf are slim, the total amount of habitable zone around all red dwarfs combined is equal to the total amount around Sun-like stars given their ubiquity.[72] Furthermore, this total amount of habitable zone will last longer, because red dwarf stars live for hundreds of billions of years or even longer on the main sequence.[73]

Massive stars

Recent research suggests that very large stars, greater than ~100 solar masses, could have planetary systems consisting of hundreds of Mercury-sized planets within the habitable zone. Such systems could also contain brown dwarfs and low-mass stars (~0.1–0.3 solar masses).[74] However, the very short lifespans of stars of more than a few solar masses would scarcely allow time for a planet to cool, let alone the time needed for a stable biosphere to develop. Massive stars are thus eliminated as possible abodes for life.[75]

However, a massive-star system could be a progenitor of life in another way: through the supernova explosion of the massive star in the central part of the system. The supernova would disperse throughout its vicinity the heavier elements created during the phase when the massive star moved off the main sequence, and the systems of any low-mass stars (which are still on the main sequence) within the former massive-star system could be enriched by this relatively large supply of heavy elements delivered so close to the explosion. However, this says nothing about what types of planets would form from the supernova material, or what their habitability potential would be.

The galactic neighborhood

Along with the characteristics of planets and their star systems, the wider galactic environment may also impact habitability. Scientists have considered the possibility that particular areas of galaxies (galactic habitable zones) are better suited to life than others; the Solar System, located in the Orion Spur on the Milky Way galaxy's edge, is considered to be in a life-favorable spot:[76]
  • It is not in a globular cluster where immense star densities are inimical to life, given excessive radiation and gravitational disturbance. Globular clusters are also primarily composed of older, probably metal-poor, stars. Furthermore, in globular clusters, the great ages of the stars would mean a large amount of stellar evolution by the host or other nearby stars, which due to their proximity may cause extreme harm to life on any planets, provided that they can form.
  • It is not near an active gamma ray source.
  • It is not near the galactic center where once again star densities increase the likelihood of ionizing radiation (e.g., from magnetars and supernovae). A supermassive black hole is also believed to lie at the middle of the galaxy which might prove a danger to any nearby bodies.
  • The circular orbit of the Sun around the galactic center keeps it out of the way of the galaxy's spiral arms where intense radiation and gravitation may again lead to disruption.[77]
Thus, relative isolation is ultimately what a life-bearing system needs. If the Sun were crowded amongst other systems, the chance of being fatally close to dangerous radiation sources would increase significantly. Further, close neighbors might disrupt the stability of various orbiting bodies such as Oort cloud and Kuiper belt objects, which can bring catastrophe if knocked into the inner Solar System.

While stellar crowding proves disadvantageous to habitability, so too does extreme isolation. A star as metal-rich as the Sun would probably not have formed in the very outermost regions of the Milky Way given a decline in the relative abundance of metals and a general lack of star formation. Thus, a "suburban" location, such as the Solar System enjoys, is preferable to a Galaxy's center or farthest reaches.[78]

Other considerations

Alternative biochemistries

While most investigations of extraterrestrial life start with the assumption that advanced life-forms must have similar requirements for life as on Earth, the hypothesis of other types of biochemistry suggests the possibility of lifeforms evolving around a different metabolic mechanism. In Evolving the Alien, biologist Jack Cohen and mathematician Ian Stewart argue that astrobiology based on the Rare Earth hypothesis is restrictive and unimaginative. They suggest that Earth-like planets may be very rare, but non-carbon-based complex life could possibly emerge in other environments. The most frequently mentioned alternative to carbon is silicon-based life, while ammonia and hydrocarbons are sometimes suggested as alternative solvents to water. The astrobiologist Dirk Schulze-Makuch and other scientists have proposed a Planetary Habitability Index whose criteria include "potential for holding a liquid solvent" that is not necessarily restricted to water.[79][80]
More speculative ideas have focused on bodies altogether different from Earth-like planets. Astronomer Frank Drake, a well-known proponent of the search for extraterrestrial life, imagined life on a neutron star: submicroscopic "nuclear molecules" combining to form creatures with a life cycle millions of times quicker than Earth life.[81] Called "imaginative and tongue-in-cheek", the idea gave rise to science fiction depictions.[82] Carl Sagan, another optimist with regards to extraterrestrial life, considered the possibility of organisms that are always airborne within the high atmosphere of Jupiter in a 1976 paper.[33][34] Cohen and Stewart also envisioned life in both a solar environment and in the atmosphere of a gas giant.

"Good Jupiters"

"Good Jupiters" are gas giants, like the Solar System's Jupiter, that orbit their stars in circular orbits far enough away from the habitable zone not to disturb it but close enough to "protect" terrestrial planets in closer orbit in two critical ways. First, they help to stabilize the orbits, and thereby the climates, of the inner planets. Second, they keep the inner Solar System relatively free of comets and asteroids that could cause devastating impacts.[83] Jupiter orbits the Sun at about five times the distance between the Earth and the Sun. This is the rough distance we should expect to find good Jupiters elsewhere. Jupiter's "caretaker" role was dramatically illustrated in 1994 when Comet Shoemaker–Levy 9 impacted the giant; had Jovian gravity not captured the comet, it may well have entered the inner Solar System.

However, the story is not quite so clear cut. Research has shown that Jupiter's role in determining the rate at which objects hit the Earth is, at the very least, significantly more complicated than once thought.[84][85][86][87] Whilst for the long-period comets (which contribute only a small fraction of the impact risk to the Earth) it is true that Jupiter acts as a shield, it actually seems to increase the rate at which asteroids and short-period comets are flung towards our planet. Were Jupiter absent, it seems likely that the Earth would actually experience significantly fewer impacts from potentially hazardous objects. By extension, it is becoming clear that the presence of Jupiter-like planets is not a prerequisite for planetary habitability – indeed, our first searches for life beyond the Solar System might be better directed to systems where no such planet has formed, since in those systems, less material will be directed to impact on the potentially inhabited planets.

The role of Jupiter in the early history of the Solar System is somewhat better established, and the source of significantly less debate. Early in the Solar System's history, Jupiter is accepted as having played an important role in the hydration of our planet: it increased the eccentricity of asteroid belt orbits and enabled many to cross Earth's orbit and supply the planet with important volatiles. Before Earth reached half its present mass, icy bodies from the Jupiter–Saturn region and small bodies from the primordial asteroid belt supplied water to the Earth due to the gravitational scattering of Jupiter and, to a lesser extent, Saturn.[88] Thus, while the gas giants are now helpful protectors, they were once suppliers of critical habitability material.

In contrast, Jupiter-sized bodies that orbit too close to the habitable zone but not in it (as in 47 Ursae Majoris), or have a highly elliptical orbit that crosses the habitable zone (like 16 Cygni B) make it very difficult for an independent Earth-like planet to exist in the system. See the discussion of a stable habitable zone above. However, during the process of migrating into a habitable zone, a Jupiter-size planet may capture a terrestrial planet as a moon. Even if such a planet is initially loosely bound and following a strongly inclined orbit, gravitational interactions with the star can stabilize the new moon into a close, circular orbit that is coplanar with the planet's orbit around the star.[89]

Life's impact on habitability

A supplement to the factors that support life's emergence is the notion that life itself, once formed, becomes a habitability factor in its own right. An important Earth example was the production of oxygen by ancient cyanobacteria, and eventually photosynthesizing plants, leading to a radical change in the composition of Earth's atmosphere. This oxygen would prove fundamental to the respiration of later animal species. The Gaia hypothesis, a class of scientific models of the geo-biosphere pioneered by James Lovelock in 1975, argues that life as a whole fosters and maintains suitable conditions for itself by helping to create a planetary environment conducive to its continuity. Similarly, David Grinspoon has suggested a "living worlds hypothesis" in which our understanding of what constitutes habitability cannot be separated from life already extant on a planet. Planets that are geologically and meteorologically alive are much more likely to be biologically alive as well and "a planet and its life will co-evolve."[90]

Wednesday, September 23, 2015

Search for extraterrestrial intelligence


From Wikipedia, the free encyclopedia

Screen shot of the screensaver for SETI@home, a distributed computing project in which volunteers donate idle computer power to analyze radio signals for signs of extraterrestrial intelligence

The search for extraterrestrial intelligence (SETI) is the collective name for scientific activities undertaken to search for intelligent extraterrestrial life. For example, electromagnetic radiation is monitored for signs of transmissions from civilizations on other worlds.[1][2]
There are great challenges in searching the universe for signs of intelligent life, including their identification and interpretation. As various SETI projects have progressed, some have criticized early claims by researchers as being too "euphoric".[3]

Scientific investigation of the potential phenomenon began shortly after the advent of radio in the early 1900s. Focused international efforts to answer a variety of scientific questions have been under way since the 1980s. More recently, the British physicist Stephen Hawking and the Russian billionaire Yuri Milner, along with the SETI Institute, announced a well-funded effort, called the Breakthrough Initiatives, to expand the search for extraterrestrial life.[4]

History of SETI

Early work

As early as 1896, Nikola Tesla suggested that an extreme version of his wireless electrical transmission system could be used to contact beings on Mars.[5] In 1899, while conducting experiments at his Colorado Springs experimental station, he thought he had detected a signal from that planet, since an odd repetitive static signal seemed to cut off when Mars set in the night sky. Analyses of Tesla's research have ranged from suggestions that he detected nothing and simply misunderstood the new technology he was working with,[6] to claims that Tesla may have been observing signals from Marconi's European radio experiments, or even that he could have picked up naturally occurring Jovian plasma torus signals.[7] In the early 1900s, Guglielmo Marconi, Lord Kelvin, and David Peck Todd also stated their belief that radio could be used to contact Martians, with Marconi stating that his stations had also picked up potential Martian signals.[8][better source needed]

On August 21–23, 1924, Mars came closer to Earth at opposition than at any time in the century before or in the 80 years that followed.[9] In the United States, a "National Radio Silence Day" was promoted during a 36-hour period from August 21 to 23, with all radios quiet for five minutes on the hour, every hour. At the United States Naval Observatory, a radio receiver tuned to a wavelength between 8 and 9 kilometers (~5 miles), using a "radio-camera" developed by Amherst College and Charles Francis Jenkins, was lifted 3 kilometers (2 miles) above the ground in a dirigible. The program was led by David Peck Todd with the military assistance of Admiral Edward W. Eberle (Chief of Naval Operations), and with William F. Friedman (chief cryptographer of the United States Army) assigned to translate any potential Martian messages.[10][11]

A 1959 paper by Philip Morrison and Giuseppe Cocconi first pointed out the possibility of searching the microwave spectrum, and proposed frequencies and a set of initial targets.[12][13]

In 1960, Cornell University astronomer Frank Drake performed the first modern SETI experiment, named "Project Ozma", after the Queen of Oz in L. Frank Baum's fantasy books.[14] Drake used a radio telescope 26 meters in diameter at Green Bank, West Virginia, to examine the stars Tau Ceti and Epsilon Eridani near the 1.420 gigahertz marker frequency, a region of the radio spectrum dubbed the "water hole" due to its proximity to the hydrogen and hydroxyl radical spectral lines. A 400 kilohertz band was scanned around the marker frequency, using a single-channel receiver with a bandwidth of 100 hertz. He found nothing of interest.

Soviet scientists took a strong interest in SETI during the 1960s and performed a number of searches with omnidirectional antennas in the hope of picking up powerful radio signals. Soviet astronomer Iosif Shklovsky wrote the pioneering book in the field, Universe, Life, Intelligence (1962), which was expanded upon by American astronomer Carl Sagan as the best-selling Intelligent Life in the Universe (1966).[15]

In the March 1955 issue of Scientific American, John D. Kraus described a concept to scan the cosmos for natural radio signals using a flat-plane radio telescope equipped with a parabolic reflector. Within two years, his concept was approved for construction by Ohio State University. With US$71,000 total in grants from the National Science Foundation, construction began on a 20-acre (8.1 ha) plot in Delaware, Ohio. This Ohio State University Radio Observatory telescope was called "Big Ear". The telescope later began the world's first continuous SETI program, called the Ohio State University SETI program.

In 1971, NASA funded a SETI study that involved Drake, Bernard M. Oliver of Hewlett-Packard Corporation, and others. The resulting report proposed the construction of an Earth-based radio telescope array with 1,500 dishes known as "Project Cyclops". The price tag for the Cyclops array was US$10 billion. Cyclops was not built, but the report[16] formed the basis of much SETI work that followed.

The Wow! signal
Credit: The Ohio State University Radio Observatory and the North American AstroPhysical Observatory (NAAPO).

The OSU SETI program gained fame on August 15, 1977, when Jerry Ehman, a project volunteer, witnessed a startlingly strong signal received by the telescope. He quickly circled the indication on a printout and scribbled the exclamation "Wow!" in the margin. Dubbed the Wow! signal, it is considered by some[who?] to be the best candidate for a radio signal from an artificial, extraterrestrial source ever discovered, but it has not been detected again in several additional searches.[17]

Sentinel, META, and BETA

In 1980, Carl Sagan, Bruce Murray, and Louis Friedman founded the U.S. Planetary Society, partly as a vehicle for SETI studies.

In the early 1980s, Harvard University physicist Paul Horowitz took the next step and proposed the design of a spectrum analyzer specifically intended to search for SETI transmissions. Traditional desktop spectrum analyzers were of little use for this job, as they sampled frequencies using banks of analog filters and so were restricted in the number of channels they could acquire. However, modern integrated-circuit digital signal processing (DSP) technology could be used to build autocorrelation receivers to check far more channels. This work led in 1981 to a portable spectrum analyzer named "Suitcase SETI" that had a capacity of 131,000 narrow band channels. After field tests that lasted into 1982, Suitcase SETI was put into use in 1983 with the 26-meter (85 ft) Harvard/Smithsonian radio telescope at Oak Ridge Observatory in Harvard, Massachusetts. This project was named "Sentinel" and continued into 1985.

Even 131,000 channels were not enough to search the sky in detail at a fast rate, so Suitcase SETI was followed in 1985 by Project "META", for "Megachannel Extra-Terrestrial Assay". The META spectrum analyzer had a capacity of 8.4 million channels and a channel resolution of 0.05 hertz. An important feature of META was its use of frequency Doppler shift to distinguish between signals of terrestrial and extraterrestrial origin. The project was led by Horowitz with the help of the Planetary Society, and was partly funded by movie maker Steven Spielberg. A second such effort, META II, was begun in Argentina in 1990, to search the southern sky. META II is still in operation, after an equipment upgrade in 1996.

The follow-on to META was named "BETA", for "Billion-channel Extraterrestrial Assay", and it commenced observation on October 30, 1995. The heart of BETA's processing capability consisted of 63 dedicated fast Fourier transform (FFT) engines, each capable of performing a 2²²-point complex FFT in two seconds, and 21 general-purpose personal computers equipped with custom digital signal processing boards. This allowed BETA to receive 250 million simultaneous channels with a resolution of 0.5 hertz per channel. It scanned through the microwave spectrum from 1.400 to 1.720 gigahertz in eight hops, with two seconds of observation per hop. An important capability of the BETA search was rapid and automatic re-observation of candidate signals, achieved by observing the sky with two adjacent beams, one slightly to the east and the other slightly to the west. A successful candidate signal would first transit the east beam and then the west beam, and would do so at a speed consistent with Earth's sidereal rotation rate. A third receiver observed the horizon to veto signals of obvious terrestrial origin. On March 23, 1999, the 26-meter radio telescope on which Sentinel, META and BETA were based was blown over by strong winds and seriously damaged.[18] This forced the BETA project to cease operation.
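A back-of-the-envelope estimate (the beam separation used here is an assumption, not the published BETA geometry) shows the timescale on which a genuine sky-fixed source should move from the east beam into the west beam:

    # Back-of-the-envelope check: how long should a sky-fixed source take to drift
    # from an east beam into a west beam at the sidereal rate? (Geometry assumed.)
    import math

    dish_diameter_m = 26.0
    wavelength_m = 0.21                    # roughly 1.4 GHz
    beam_width_deg = math.degrees(1.22 * wavelength_m / dish_diameter_m)

    beam_separation_deg = beam_width_deg   # assume beams about one beamwidth apart
    sidereal_rate_deg_per_s = 360.0 / 86164.1

    declination_deg = 30.0                 # example target declination
    drift_deg_per_s = sidereal_rate_deg_per_s * math.cos(math.radians(declination_deg))
    print(f"beam width ~{beam_width_deg:.2f} deg, "
          f"east-to-west delay ~{beam_separation_deg / drift_deg_per_s:.0f} s")
    # A candidate appearing in both beams with the wrong delay, or in the wrong
    # order, can be rejected as interference.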

MOP and Project Phoenix


Sensitivity vs range for SETI radio searches. The diagonal lines show transmitters of different effective powers. The x-axis is the sensitivity of the search. The y-axis on the right is the range in light-years, and on the left is the number of Sun-like stars within this range. The vertical line labeled SS is the typical sensitivity achieved by a full sky search, such as BETA above. The vertical line labeled TS is the typical sensitivity achieved by a targeted search such as Phoenix.[19]

In 1978, the NASA SETI program was heavily criticized by Senator William Proxmire, and funding for SETI research was removed from the NASA budget by Congress in 1981;[20] however, funding was restored in 1982, after Carl Sagan talked with Proxmire and convinced him of the program's value.[20] In 1992, the U.S. government funded an operational SETI program, in the form of the NASA Microwave Observing Program (MOP). MOP was planned as a long-term effort to conduct a general survey of the sky and also to carry out targeted searches of 800 specific nearby stars. MOP was to be performed by radio antennas associated with the NASA Deep Space Network, as well as the 140-foot (43 m) radio telescope of the National Radio Astronomy Observatory at Green Bank, West Virginia, and the 1,000-foot (300 m) radio telescope at the Arecibo Observatory in Puerto Rico. The signals were to be analyzed by spectrum analyzers, each with a capacity of 15 million channels; these analyzers could be grouped together to obtain greater capacity. Those used in the targeted search had a bandwidth of 1 hertz per channel, while those used in the sky survey had a bandwidth of 30 hertz per channel.

MOP drew the attention of the United States Congress, where the program was ridiculed[21] and canceled one year after its start.[20] SETI advocates continued without government funding, and in 1995 the nonprofit SETI Institute of Mountain View, California resurrected the MOP program under the name of Project "Phoenix", backed by private sources of funding. Project Phoenix, under the direction of Jill Tarter, was a continuation of the targeted-search program from MOP and studied roughly 1,000 nearby Sun-like stars. From 1995 through March 2004, Phoenix conducted observations at the 64-meter (210 ft) Parkes radio telescope in Australia, the 140-foot (43 m) radio telescope of the National Radio Astronomy Observatory in Green Bank, West Virginia, and the 1,000-foot (300 m) radio telescope at the Arecibo Observatory in Puerto Rico. The project observed the equivalent of 800 stars over the available channels in the frequency range from 1200 to 3000 MHz. The search was sensitive enough to pick up transmitters with 1 GW EIRP out to a distance of about 200 light-years. According to Prof. Tarter, in 2012 it cost around "$2 million per year to keep SETI research going at the SETI Institute" and approximately ten times that to support "all kinds of SETI activity around the world."[22]
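The quoted figure can be sanity-checked with the inverse-square law, using only the numbers given above:

    # Inverse-square-law check of the Project Phoenix sensitivity quoted above:
    # the flux produced by a 1 GW EIRP transmitter at about 200 light-years.
    import math

    eirp_w = 1.0e9                 # 1 GW effective isotropic radiated power
    light_year_m = 9.461e15
    distance_m = 200 * light_year_m

    flux_w_per_m2 = eirp_w / (4 * math.pi * distance_m**2)
    print(f"flux at 200 light-years: {flux_w_per_m2:.1e} W/m^2")
    # Roughly 2e-29 W/m^2 -- the order of flux the targeted search had to reach.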

Ongoing radio searches


Microwave window as seen by a ground based system. From NASA report SP-419: SETI – the Search for Extraterrestrial Intelligence

Many radio frequencies penetrate Earth's atmosphere quite well, and this led to radio telescopes that investigate the cosmos using large radio antennas. Furthermore, human activity emits considerable electromagnetic radiation as a byproduct of communications such as television and radio. These signals would be easy to recognize as artificial because of their repetitive nature and narrow bandwidths. If this is typical of technological civilizations, one way of discovering an extraterrestrial civilization might be to detect artificial radio emissions from a location outside the Solar System.

Allen Telescope Array

The SETI Institute collaborated with the Radio Astronomy Laboratory at the University of California, Berkeley, to develop a specialized radio telescope array for SETI studies, something like a mini-Cyclops array. Formerly known as the One Hectare Telescope (1HT), the concept was renamed the "Allen Telescope Array" (ATA) after the project's benefactor, Paul Allen. If completed, its sensitivity would be equivalent to that of a single dish more than 100 meters in diameter. At present, the array under construction has 42 dishes at the Hat Creek Radio Observatory in rural northern California.[23][24]

The full array (ATA-350) is planned to consist of 350 or more offset-Gregorian radio dishes, each 6.1 meters (20 ft) in diameter. These dishes are the largest that can be produced with commercially available satellite television dish technology. The ATA was originally planned for completion in 2007, at a cost of US$25 million. The SETI Institute provided funds to build the ATA, while the University of California, Berkeley designed the telescope and provided operational funding. The first portion of the array (ATA-42) became operational in October 2007 with 42 antennas. The DSP system planned for ATA-350 is extremely ambitious; completion of the full 350-element array will depend on funding and on the technical results from ATA-42.

ATA-42 is designed to give multiple observers simultaneous access to the interferometer output. Typically, the ATA snapshot imager (used for astronomical surveys and SETI) runs in parallel with the beam-forming system (used primarily for SETI).[25] The ATA also supports observations in multiple synthesized pencil beams at once, through a technique known as "multibeaming". Multibeaming provides an effective filter for identifying false positives in SETI, since a very distant transmitter must appear at only one point on the sky.[26][27][28]
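The multibeaming test reduces to a few lines of logic; the sketch below is an illustrative filter with an invented data layout, not the ATA's actual software:

    # Illustrative multibeam filter (not the ATA's actual software): a genuinely
    # distant point source should appear in only one synthesized pencil beam at a
    # time, so a detection seen in several widely separated beams at once is RFI.
    from typing import Dict, List

    def classify_detection(power_by_beam: Dict[str, float], threshold: float) -> str:
        """power_by_beam maps a beam name to the detected power in that beam."""
        hot_beams: List[str] = [b for b, p in power_by_beam.items() if p > threshold]
        if not hot_beams:
            return "no detection"
        if len(hot_beams) == 1:
            return f"candidate (seen only in beam {hot_beams[0]})"
        return "interference (seen in multiple beams at once)"

    print(classify_detection({"beam_1": 12.0, "beam_2": 0.8, "beam_3": 1.1}, 5.0))
    print(classify_detection({"beam_1": 9.5, "beam_2": 8.7, "beam_3": 0.9}, 5.0))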

The SETI Institute's Center for SETI Research (CSR) uses the ATA in the search for extraterrestrial intelligence, observing 12 hours a day, 7 days a week. From 2007 to 2015, the ATA identified hundreds of millions of technological signals. So far, all of these signals have been classified as noise or radio-frequency interference, because (a) they appear to be generated by satellites or Earth-based transmitters, or (b) they disappeared before the threshold time limit of about one hour.[29][30] Researchers in CSR are presently working on ways to reduce the threshold time limit and to expand the ATA's capabilities for detecting signals that may have embedded messages.[31]

Berkeley astronomers used the ATA to pursue several science topics, some of which might have turned up transient SETI signals,[32][33][34] until 2011, when the collaboration between the University of California and the SETI Institute was terminated.

CNET published an article and pictures about the Allen Telescope Array (ATA) on December 12, 2008.[35][36]

In April 2011, the ATA was forced to enter an 8-month "hibernation" due to funding shortfalls. Regular operation of the ATA was resumed on December 5, 2011.[37][38]

In 2012, the ATA was revitalized by a US$3.6 million philanthropic donation from Franklin Antonio, Co-Founder and Chief Scientist of QUALCOMM Incorporated.[39] The gift funds upgrades of the receivers on all of the ATA dishes, giving them dramatically greater sensitivity (2x to 10x between 1 and 8 GHz) and supporting sensitive observations over a wider frequency range, from 1 to 18 GHz, although the radio-frequency electronics initially extend only to 12 GHz. As of July 2013, the first of these receivers had been installed and proven, with full installation on all 42 antennas expected in June 2014. The ATA is especially well suited to the search for extraterrestrial intelligence (SETI) and to the discovery of astronomical radio sources such as heretofore unexplained, non-repeating, possibly extragalactic pulses known as fast radio bursts (FRBs).

View of the Arecibo Observatory in Puerto Rico with its 300 m (1,000 ft) dish, the world's largest. A small fraction of its observation time is devoted to SETI searches.

SERENDIP

SERENDIP (Search for Extraterrestrial Radio Emissions from Nearby Developed Intelligent Populations) is a SETI program launched in 1979 by the University of California, Berkeley.[40] SERENDIP takes advantage of ongoing "mainstream" radio telescope observations as a "piggy-back" or "commensal" program, using large radio telescopes including the NRAO 90m telescope at Green Bank and the Arecibo 305m telescope. Rather than having its own observation program, SERENDIP analyzes deep space radio telescope data that it obtains while other astronomers are using the telescopes.
The most recently deployed SERENDIP spectrometer, SERENDIP V.v, was installed at the Arecibo Observatory in June 2009 and is currently operational. The digital back-end instrument is an FPGA-based 128 million-channel digital spectrometer covering 200 MHz of bandwidth. It takes data commensally with the seven-beam Arecibo L-band Feed Array[41] (ALFA). The program has found around 400 suspicious signals, but there is not enough data to prove that they belong to extraterrestrial intelligence.[42]
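The per-channel resolution implied by those figures is a one-line calculation:

    # Quick check of the SERENDIP V.v figures quoted above: 128 million channels
    # spanning 200 MHz of bandwidth imply the per-channel resolution below.
    bandwidth_hz = 200e6
    n_channels = 128e6
    print(f"channel width: {bandwidth_hz / n_channels:.2f} Hz")  # about 1.56 Hz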

Breakthrough Listen

Breakthrough Listen is a ten-year initiative with $100 million funding begun in July 2015 to actively search for intelligent extraterrestrial communications in the universe, in a substantially expanded way, using resources that had not previously been extensively used for the purpose.[43][44][45] It has been described as the most comprehensive search for alien communications to date.[44]
Announced in July 2015, the project will use thousands of hours every year on two major radio telescopes, the Green Bank Observatory in West Virginia and the Parkes Observatory in Australia.[46] Previously, only about 24 to 36 hours of telescope time per year were used in the search for alien life.[44] Furthermore, the Automated Planet Finder at Lick Observatory will search for optical signals from laser transmissions. The experience of SETI and SETI@home will be used to process the massive volume of data.[46] SETI founder Frank Drake is one of the project's scientists.[43][44]

Community SETI projects

SETI@home


SETI@home logo

SETI@home was conceived by David Gedye along with Craig Kasnoff and is a popular volunteer distributed computing project that was launched by the University of California, Berkeley, in May 1999. It was originally funded by The Planetary Society and Paramount Pictures, and later by the state of California. The project is run by director David P. Anderson and chief scientist Dan Werthimer. Any individual can become involved with SETI research by downloading the Berkeley Open Infrastructure for Network Computing (BOINC) software, attaching to the SETI@home project, and allowing the program to run as a background process that uses idle computer power. The SETI@home program itself runs signal analysis on a "work unit" of data recorded from the central 2.5 MHz-wide band of the SERENDIP IV instrument. After computation on the work unit is complete, the results are automatically reported back to SETI@home servers at the University of California, Berkeley. By June 28, 2009, the SETI@home project had over 180,000 active participants volunteering a total of over 290,000 computers, giving SETI@home an average computational power of 617 teraFLOPS.[47] In 2004, the radio source SHGb02+14a attracted interest as a candidate signal but was quickly shown to have a natural origin.[48][49]
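In spirit, though not in detail, the client's analysis resembles a search over Doppler drift rates: each trial drift is removed from the data, the result is Fourier transformed, and narrowband power spikes are recorded. The sketch below is heavily simplified and uses invented parameters; it is not the actual SETI@home algorithm:

    # Heavily simplified sketch of a drift-rate search over one "work unit"
    # (illustrative only; not the actual SETI@home client or its parameters).
    import numpy as np

    sample_rate_hz = 9765.625        # invented sub-band sample rate
    n = 8192                         # samples per analysis block
    t = np.arange(n) / sample_rate_hz

    rng = np.random.default_rng(0)
    data = rng.normal(size=n) + 1j * rng.normal(size=n)
    # Inject a weak tone at 1000 Hz drifting upward at 2 Hz per second.
    data += 0.4 * np.exp(2j * np.pi * (1000.0 * t + 0.5 * 2.0 * t**2))

    best = (0.0, 0.0, 0.0)           # (power, drift, frequency)
    for drift_hz_per_s in np.arange(-5.0, 5.0, 0.5):
        # Remove the trial drift, then look for a stationary narrowband spike.
        dedrifted = data * np.exp(-2j * np.pi * 0.5 * drift_hz_per_s * t**2)
        spectrum = np.abs(np.fft.fft(dedrifted)) ** 2
        k = int(np.argmax(spectrum))
        if spectrum[k] > best[0]:
            best = (float(spectrum[k]), float(drift_hz_per_s), k * sample_rate_hz / n)

    print(f"strongest spike: drift {best[1]} Hz/s at {best[2]:.1f} Hz")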

As of 2010, after 10 years of data collection, SETI@home had listened to that 2.5 MHz band at every point of more than 67 percent of the sky observable from Arecibo, with at least three scans (out of the goal of nine), covering about 20 percent of the full celestial sphere.[50]

SETI Net

SETI Net is a private search system created and operated by a single individual. It is closely affiliated with the SETI League and is one of the Project Argus stations (grid square DM12jw).

The SETI Net station consists of off-the-shelf, consumer-grade electronics, to minimize cost and to allow the design to be replicated as simply as possible. It comprises a 3-meter parabolic antenna that can be steered in azimuth and elevation, a low-noise amplifier (LNA) covering the spectrum around 1420 MHz, a receiver to reproduce the wideband audio, and a standard personal computer that serves as the control device and runs the detection algorithms.

The antenna can be pointed and locked to one sky location, enabling the system to integrate on it for long periods. Currently the Wow! signal area is being monitored when it is above the horizon. All search data are collected and made available on the Internet archive.

SETI Net started operation in the early 1980s as a way to learn about the science of the search, and has developed several software packages for the amateur SETI community. It has provided an astronomical clock, a file manager to keep track of SETI data files, a spectrum analyzer optimized for amateur SETI, remote control of the station from the Internet, and other packages.

The SETI League and Project Argus

Founded in 1994 in response to the United States Congress's cancellation of the NASA SETI program, The SETI League, Inc. is a membership-supported nonprofit organization with 1,500 members in 62 countries. This grass-roots alliance of amateur and professional radio astronomers is headed by executive director emeritus H. Paul Shuch, the engineer credited with developing the world's first commercial home satellite TV receiver. Many SETI League members are licensed radio amateurs and microwave experimenters. Others are digital signal processing experts and computer enthusiasts.

The SETI League pioneered the conversion of backyard satellite TV dishes 3 to 5 m (10–16 ft) in diameter into research-grade radio telescopes of modest sensitivity.[51] The organization concentrates on coordinating a global network of small, amateur-built radio telescopes under Project Argus, an all-sky survey seeking to achieve real-time coverage of the entire sky.[52] Project Argus was conceived as a continuation of the all-sky survey component of the late NASA SETI program (the targeted search having been continued by the SETI Institute's Project Phoenix). There are currently 143 Project Argus radio telescopes operating in 27 countries. Project Argus instruments typically exhibit sensitivity on the order of 10⁻²³ W/m², roughly equivalent to that achieved by the Ohio State University Big Ear radio telescope in 1977, when it detected the landmark "Wow!" candidate signal.

The name "Argus" derives from the mythical Greek guard-beast who had 100 eyes, and could see in all directions at once. In the SETI context, the name has been used for radio telescopes in fiction (Arthur C. Clarke, "Imperial Earth"; Carl Sagan, "Contact"), was the name initially used for the NASA study ultimately known as "Cyclops," and is the name given to an omnidirectional radio telescope design being developed at the Ohio State University.

Optical experiments

While most SETI sky searches have studied the radio spectrum, some SETI researchers have considered the possibility that alien civilizations might be using powerful lasers for interstellar communications at optical wavelengths. The idea was first suggested by R. N. Schwartz and Charles Hard Townes in a 1961 paper published in the journal Nature titled "Interstellar and Interplanetary Communication by Optical Masers". However, the 1971 Cyclops study discounted the possibility of optical SETI, reasoning that construction of a laser system that could outshine the bright central star of a remote star system would be too difficult. In 1983, Townes published a detailed study of the idea in the United States journal Proceedings of the National Academy of Sciences, which was met with widespread agreement by the SETI community.[citation needed]

There are two problems with optical SETI. The first is that lasers are highly "monochromatic", that is, they emit light at only one frequency, making it troublesome to know which frequency to look for. However, according to the uncertainty principle, emitting light in narrow pulses results in a broad spectrum of emission; the spread in frequency becomes larger as the pulse width becomes narrower, making an emission easier to detect.
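To make the relation concrete (a standard time-bandwidth estimate, not a figure from the cited study): a pulse of duration Δt occupies a spectral width of roughly 1/Δt, so a one-nanosecond pulse is spread over about a gigahertz and the exact transmission frequency matters far less.

    # Time-bandwidth check: a pulse of duration dt occupies a spectral width of
    # roughly 1/dt, so shorter pulses need less guessing about the exact frequency.
    for pulse_s in (1e-6, 1e-9, 1e-12):
        print(f"pulse {pulse_s:.0e} s  ->  bandwidth ~{1.0 / pulse_s:.0e} Hz")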

The other problem is that while radio transmissions can be broadcast in all directions, lasers are highly directional. This means that a laser beam could easily be blocked by clouds of interstellar dust, and that Earth would have to lie in the beam's direct path, by chance, in order to receive it.

Optical SETI supporters have conducted paper studies[53] of the effectiveness of using contemporary high-energy lasers and a ten-meter diameter mirror as an interstellar beacon. The analysis shows that an infrared pulse from a laser, focused into a narrow beam by such a mirror, would appear thousands of times brighter than the Sun to a distant civilization in the beam's line of fire. The Cyclops study proved incorrect in suggesting a laser beam would be inherently hard to see.
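As a rough order-of-magnitude illustration of that claim (the laser power and wavelength below are assumed values, not those of the cited studies), concentrating a high peak-power pulse into the diffraction-limited beam of a ten-meter mirror yields an effective isotropic radiated power exceeding the Sun's entire output for the duration of the pulse; within a detector's narrow time window the contrast is larger still.

    # Crude order-of-magnitude check (laser parameters are assumed, purely
    # illustrative): effective isotropic radiated power of a pulsed laser beacon
    # focused by a 10 m mirror into a diffraction-limited beam.
    import math

    wavelength_m = 1.06e-6       # near-infrared laser
    mirror_diameter_m = 10.0
    peak_power_w = 1.0e15        # assumed petawatt-class pulsed laser

    theta = 1.22 * wavelength_m / mirror_diameter_m   # approximate beam divergence (rad)
    gain = 4.0 / theta**2                             # ~ 4*pi / (pi * theta**2)
    eirp_w = peak_power_w * gain

    solar_luminosity_w = 3.8e26
    print(f"EIRP ~{eirp_w:.1e} W, "
          f"about {eirp_w / solar_luminosity_w:.0f}x the Sun's total output")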

Such a system could be made to automatically steer itself through a target list, sending a pulse to each target at a constant rate. This would allow targeting of all Sun-like stars within a distance of 100 light-years. The studies have also described an automatic laser pulse detector system with a low-cost, two-meter mirror made of carbon composite materials, focusing on an array of light detectors. This automatic detector system could perform sky surveys to detect laser flashes from civilizations attempting contact.

Several optical SETI experiments are now in progress. A Harvard-Smithsonian group that includes Paul Horowitz designed a laser detector and mounted it on Harvard's 155 centimeters (61 inches) optical telescope. This telescope is currently being used for a more conventional star survey, and the optical SETI survey is "piggybacking" on that effort. Between October 1998 and November 1999, the survey inspected about 2,500 stars. Nothing that resembled an intentional laser signal was detected, but efforts continue. The Harvard-Smithsonian group is now working with Princeton University to mount a similar detector system on Princeton's 91-centimeter (36-inch) telescope. The Harvard and Princeton telescopes will be "ganged" to track the same targets at the same time, with the intent being to detect the same signal in both locations as a means of reducing errors from detector noise.

The Harvard-Smithsonian group is now building a dedicated all-sky optical survey system along the lines of that described above, featuring a 1.8-meter (72-inch) telescope. The new optical SETI survey telescope is being set up at the Oak Ridge Observatory in Harvard, Massachusetts.

The University of California, Berkeley, home of SERENDIP and SETI@home, is also conducting optical SETI searches. One is being directed by Geoffrey Marcy, an extrasolar planet hunter, and involves examination of records of spectra taken during extrasolar planet hunts for a continuous, rather than pulsed, laser signal. The other Berkeley optical SETI effort is more like that being pursued by the Harvard-Smithsonian group and is being directed by Dan Werthimer of Berkeley, who built the laser detector for the Harvard-Smithsonian group. The Berkeley survey uses a 76-centimeter (30-inch) automated telescope at Leuschner Observatory and an older laser detector built by Werthimer.

The 74 m Colossus Telescope[54] is designed to detect optical and thermal signatures of extraterrestrial civilizations from planetary systems within 60 light-years of the Sun.

Gamma-ray bursts

Gamma-ray bursts (GRBs) are candidates for extraterrestrial communication. These high-energy bursts are observed about once per day and originate throughout the observable universe. SETI searches currently omit gamma-ray frequencies from their monitoring and analysis because such frequencies are absorbed by Earth's atmosphere and are difficult to detect with ground-based receivers. In addition, the wide burst bandwidths pose a serious analysis challenge for modern digital signal processing systems. Still, the continued mysteries surrounding gamma-ray bursts have encouraged hypotheses invoking extraterrestrials. John A. Ball of the MIT Haystack Observatory has suggested that an advanced civilization that has reached a technological singularity would be capable of transmitting a two-millisecond pulse encoding 1×10¹⁸ bits of information. This is "comparable to the estimated total information content of Earth's biosystem—genes and memes and including all libraries and computer media."[55]
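The data rate implied by those figures is straightforward to check:

    # Implied data rate of the hypothetical two-millisecond pulse described above.
    bits = 1e18
    duration_s = 2e-3
    print(f"data rate: {bits / duration_s:.1e} bits per second")  # 5e20 bits/s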

Search for extraterrestrial artifacts

The possibility of using interstellar messenger probes in the search for extraterrestrial intelligence was first suggested by Ronald N. Bracewell in 1960 (see Bracewell probe), and the technical feasibility of this approach was demonstrated by the British Interplanetary Society's starship study Project Daedalus in 1978. Starting in 1979, Robert Freitas advanced arguments[56][57][58] for the proposition that physical space-probes are a superior mode of interstellar communication to radio signals. See Voyager Golden Record.

Recognizing that any sufficiently advanced interstellar probe in the vicinity of Earth could easily monitor the terrestrial Internet, Invitation to ETI was established by Prof. Allen Tough in 1996 as a Web-based SETI experiment inviting such spacefaring probes to establish contact with humanity. The project's 100 signatories include prominent physical, biological, and social scientists, as well as artists, educators, entertainers, philosophers, and futurists. Prof. H. Paul Shuch, executive director emeritus of The SETI League, serves as the project's Principal Investigator.

Inscribing a message in matter and transporting it to an interstellar destination can be enormously more energy efficient than communication using electromagnetic waves if delays larger than light transit time can be tolerated.[59] That said, for simple messages such as "hello," radio SETI could be far more efficient.[60] If energy requirement is used as a proxy for technical difficulty, then a solarcentric Search for Extraterrestrial Artifacts (SETA)[61] may be a useful supplement to traditional radio or optical searches.[62][63]

Much like the "preferred frequency" concept in SETI radio beacon theory, the Earth-Moon or Sun-Earth libration orbits[64] might constitute the most universally convenient parking places for automated extraterrestrial spacecraft exploring arbitrary stellar systems. A viable long-term SETI program could therefore be founded upon a search for such objects.

In 1979, Freitas and Valdes conducted a photographic search of the vicinity of the Earth-Moon triangular libration points L4 and L5, and of the solar-synchronized positions in the associated halo orbits, seeking possible orbiting extraterrestrial interstellar probes, but found nothing to a detection limit of about 14th magnitude.[64] The authors conducted a second, more comprehensive photographic search for probes in 1982[65] that examined the five Earth-Moon Lagrangian positions and included the solar-synchronized positions in the stable L4/L5 libration orbits, the potentially stable nonplanar orbits near L1/L2, Earth-Moon L3, and also L2 in the Sun-Earth system. Again no extraterrestrial probes were found, to limiting magnitudes of 17–19 near L3/L4/L5, 10–18 for L1/L2, and 14–16 for Sun-Earth L2.

In June 1983, Valdes and Freitas[66] used the 26 m radio telescope at Hat Creek Radio Observatory to search for the tritium hyperfine line at 1516 MHz from 108 assorted astronomical objects, with emphasis on 53 nearby stars including all visible stars within a 20 light-year radius. The tritium frequency was deemed highly attractive for SETI work because (1) the isotope is cosmically rare, (2) the tritium hyperfine line is centered in the SETI waterhole region of the terrestrial microwave window, and (3) in addition to beacon signals, tritium hyperfine emission may occur as a byproduct of extensive nuclear fusion energy production by extraterrestrial civilizations. The wideband- and narrowband-channel observations achieved sensitivities of 5–14 × 10⁻²¹ W/m²/channel and 0.7–2 × 10⁻²⁴ W/m²/channel, respectively, but no detections were made.

Technosignatures

Technosignatures, meaning all signs of technology other than the interstellar radio messages that define traditional SETI, are a more recent avenue in the search for extraterrestrial intelligence. Technosignatures may originate from various sources, from megastructures such as Dyson spheres, space mirrors, or space shaders[67] to the atmospheric contamination created by an industrial civilization[68] or city lights on extrasolar planets, and may become detectable in the future with large hypertelescopes.[69]
Technosignatures can be divided into three broad categories: astroengineering projects, signals of planetary origin, and spacecraft within and outside the Solar System. An astroengineering installation such as a Dyson sphere, designed to convert all of the incident radiation of its host star into energy, could be detected through the observation of an infrared excess from a solar analog star.[70] After examining some 100,000 nearby large galaxies, a team of researchers concluded that none of them contains obvious signs of highly advanced technological civilizations.[71][72][73] Another form of astroengineering, the Shkadov thruster, moves its host star by reflecting some of the star's light back on itself; it could be detected by observing whether its transits across the star end abruptly, with the thruster in front.[74] Asteroid mining within the Solar System is also a detectable technosignature of the first kind.[75]

Individual extrasolar planets can be analyzed for signs of technology. Avi Loeb of the Harvard-Smithsonian Center for Astrophysics has proposed that persistent light signals on the night side of an exoplanet can be an indication of the presence of cities and an advanced civilization.[76][77] In addition, the excess infrared radiation[69][78] and chemicals[79][80] produced by various industrial processes or terraforming efforts[81] may point to intelligence.

Light and heat detected from planets must be distinguished from natural sources to conclusively prove the existence of a civilization on a planet. However, as argued by the Colossus team,[82] a civilization's heat signature should fall within a "comfortable" temperature range, like terrestrial urban heat islands, that is, only a few degrees warmer than the planet itself. Natural sources such as wildfires and volcanoes are significantly hotter, so they would be readily distinguished by having their maximum flux at a different wavelength.

Extraterrestrial craft are another target in the search for technosignatures. Magnetic sail interstellar spacecraft are detectable over thousands of light-years of distance through the synchrotron radiation they produce through interaction with the interstellar medium; other interstellar spacecraft designs can be detected at more modest distances.[83] In addition, robotic probes within the Solar System are also being sought out with optical and radio searches.[84][85]

Fermi paradox

Italian physicist Enrico Fermi suggested in the 1950s that if technologically advanced civilizations are common in the universe, then they should be detectable in one way or another. (According to those who were there,[86] Fermi either asked "Where are they?" or "Where is everybody?")
The Fermi paradox is commonly understood as asking why extraterrestrials have not visited Earth,[87] but the same reasoning applies to the question of why signals from extraterrestrials have not been heard. The SETI version of the question is sometimes referred to as "the Great Silence".

The Fermi paradox can be stated more completely as follows:
The size and age of the universe incline us to believe that many technologically advanced civilizations must exist. However, this belief seems logically inconsistent with our lack of observational evidence to support it. Either (1) the initial assumption is incorrect and technologically advanced intelligent life is much rarer than we believe, or (2) our current observations are incomplete and we simply have not detected them yet, or (3) our search methodologies are flawed and we are not searching for the correct indicators.
There are multiple explanations proposed for the Fermi paradox,[88] ranging from analyses suggesting that intelligent life is rare (the "Rare Earth hypothesis"), to analyses suggesting that although extraterrestrial civilizations may be common, they would not communicate, or would not travel across interstellar distances.

Science writer Timothy Ferris has posited that since galactic societies are most likely only transitory, an obvious solution is an interstellar communications network, or a type of library consisting mostly of automated systems. They would store the cumulative knowledge of vanished civilizations and communicate that knowledge through the galaxy. Ferris calls this the "Interstellar Internet", with the various automated systems acting as network "servers". If such an Interstellar Internet exists, the hypothesis states, communications between servers are mostly through narrow-band, highly directional radio or laser links. Intercepting such signals is, as discussed earlier, very difficult. However, the network could maintain some broadcast nodes in hopes of making contact with new civilizations.

Although this hypothesis is somewhat dated in its "information culture" arguments, and although it faces the obvious technological problems of a system that could work effectively for billions of years and of multiple lifeforms agreeing on certain basics of communications technology, it is actually testable (see below).

A significant problem is the vastness of space. Despite piggybacking on the world's most sensitive radio telescope, Charles Stuart Bowyer noted, the instrument could not detect random radio noise emanating from a civilization like ours, which has been leaking radio and TV signals for less than 100 years. For SERENDIP and most other SETI projects to detect a signal from an extraterrestrial civilization, the civilization would have to be beaming a powerful signal directly at us. It also means that Earth's own civilization is detectable only out to a distance of about 100 light-years, the maximum distance its leakage signals have yet traveled.[89]
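A rough inverse-square-law estimate shows why (the 1 MW transmitter power is an assumed, typical broadcast figure, not taken from the cited source):

    # Why leakage radiation is so hard to detect: flux from an assumed ~1 MW EIRP
    # broadcast transmitter at 100 light-years.
    import math

    eirp_w = 1.0e6
    light_year_m = 9.461e15
    distance_m = 100 * light_year_m

    flux = eirp_w / (4 * math.pi * distance_m**2)
    print(f"flux at 100 light-years: {flux:.1e} W/m^2")
    # Around 1e-31 W/m^2 -- orders of magnitude below the sensitivities quoted
    # for the searches above, which is why a deliberate beacon is assumed.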

Post detection disclosure protocol

The International Academy of Astronautics (IAA) has a long-standing SETI Permanent Study Group (SPSG, formerly called the IAA SETI Committee), which addresses matters of SETI science, technology, and international policy. The SPSG meets in conjunction with the International Astronautical Congress (IAC) held annually at different locations around the world, and sponsors two SETI Symposia at each IAC. In 2005, the IAA established the SETI: Post-Detection Science and Technology Taskgroup (Chairman, Professor Paul Davies) "to act as a Standing Committee to be available to be called on at any time to advise and consult on questions stemming from the discovery of a putative signal of extraterrestrial intelligent (ETI) origin."

When awarded the 2009 TED Prize, the SETI Institute's Jill Tarter outlined the organisation's "post-detection protocol".[90] Under NASA's funding of the project, a NASA administrator would have been informed first, with the intention of then informing the executive branch of the United States government. The current protocol at the SETI Institute is first to investigate the signal internally, seeking independent verification and confirmation. During this process, the organisation's private financiers would be informed confidentially. Once a signal had been verified, a telegram would be sent via the Central Bureau for Astronomical Telegrams. Following this process, Tarter says, the organisation would hold a press conference with the aim of broadcasting the news to the public. The SETI Institute's Seth Shostak has said that knowledge of the discovery would likely leak as early as the verification process.[91]

However, the protocols mentioned apply only to radio SETI rather than to METI (Active SETI).[92] The intentions for METI are covered under the SETI charter "Declaration of Principles Concerning Sending Communications with Extraterrestrial Intelligence".

The SETI Institute does not officially recognise the Wow! signal as being of extraterrestrial origin, since it could not be verified. The SETI Institute has also publicly denied that the candidate signal SHGb02+14a is of extraterrestrial origin,[93][94] though full details of the signal, such as its exact location, have never been disclosed to the public.[speculation?] Although other volunteer projects such as Zooniverse credit users for discoveries, there is currently no crediting or early notification by SETI@home following the discovery of a signal.

Some people, including Steven M. Greer,[95] have expressed suspicion that the general public would not be informed in the event of a genuine discovery of extraterrestrial intelligence, owing to significant vested interests. Others, such as Bruce Jakosky,[96] have argued that the official disclosure of extraterrestrial life may have far-reaching and as yet undetermined implications for society, particularly for the world's religions.

Active SETI

Active SETI, also known as messaging to extraterrestrial intelligence (METI), consists of sending signals into space in the hope that they will be picked up by an alien intelligence.

Realized interstellar radio message projects

In November 1974, a largely symbolic attempt was made at the Arecibo Observatory to send a message to other worlds. Known as the Arecibo Message, it was sent towards the globular cluster M13, which is 25,000 light-years from Earth. Further interstellar radio messages (IRMs), namely Cosmic Call, Teen Age Message, Cosmic Call 2, and A Message From Earth, were transmitted in 1999, 2001, 2003, and 2008 respectively from the Evpatoria Planetary Radar.

Debate

Physicist Stephen Hawking, in his book A Brief History of Time, suggests that "alerting" extraterrestrial intelligences to our existence is foolhardy, citing humanity's history of harsh treatment whenever civilizations have met across a significant technology gap. He suggests, in view of this history, that we "lay low".

The concern over METI was raised by the science journal Nature in an editorial in October 2006, which commented on a recent meeting of the International Academy of Astronautics SETI study group. The editor said, "It is not obvious that all extraterrestrial civilizations will be benign, or that contact with even a benign one would not have serious repercussions" (Nature, vol. 443, 12 October 2006, p. 606). Astronomer and science fiction author David Brin has expressed similar concerns.[97]

Richard Carrigan, a particle physicist at the Fermi National Accelerator Laboratory near Chicago, Illinois, suggested that passive SETI could also be dangerous and that a signal released onto the Internet could act as a computer virus.[98] Computer security expert Bruce Schneier dismissed this possibility as a "bizarre movie-plot threat".[99]

To lend a quantitative basis to discussions of the risks of transmitting deliberate messages from Earth, the SETI Permanent Study Group of the International Academy of Astronautics adopted in 2007 a new analytical tool, the San Marino Scale.[100] Developed by Prof. Ivan Almar and Prof. H. Paul Shuch, the scale evaluates the significance of transmissions from Earth as a function of signal intensity and information content. Its adoption suggests that not all such transmissions are equal, and each must be evaluated separately before establishing blanket international policy regarding active SETI.[citation needed]

However, some scientists consider these fears about the dangers of METI as panic and irrational superstition; see, for example, Alexander L. Zaitsev's papers.[101][102]

On 13 February 2015, scientists (including Geoffrey Marcy, Seth Shostak, Frank Drake, Elon Musk and David Brin) at a convention of the American Association for the Advancement of Science, discussed Active SETI and whether transmitting a message to possible intelligent extraterrestrials in the Cosmos was a good idea;[103][104] one result was a statement, signed by many, that a "worldwide scientific, political and humanitarian discussion must occur before any message is sent".[105] On 28 March 2015, a related essay was written by Seth Shostak and published in The New York Times.[106]

Breakthrough Message

The Breakthrough Message program is an open competition announced in July 2015 to design a digital message that could be transmitted from Earth to an extraterrestrial civilization, with a US$1,000,000 prize pool. The message should be "representative of humanity and planet Earth". The program pledges "not to transmit any message until there has been a wide-ranging debate at high levels of science and politics on the risks and rewards of contacting advanced civilizations".[107]

Criticism

As various SETI projects have progressed, some have criticized early claims by researchers as being too "euphoric". For example, Peter Schenkel, while remaining a supporter of SETI projects, has written that
"[i]n light of new findings and insights, it seems appropriate to put excessive euphoria to rest and to take a more down-to-earth view ... We should quietly admit that the early estimates—that there may be a million, a hundred thousand, or ten thousand advanced extraterrestrial civilizations in our galaxy—may no longer be tenable."[1]
Clive Trotman presents some sobering but realistic calculations emphasizing the vast timescales involved.[108]

SETI has also occasionally been the target of criticism by those who suggest that it is a form of pseudoscience. In particular, critics allege that no observed phenomena suggest the existence of extraterrestrial intelligence, and furthermore that the assertion of the existence of extraterrestrial intelligence has no good Popperian criteria for falsifiability.[3]

In response, SETI advocates note, among other things, that the Drake Equation was never a hypothesis, and so never intended to be testable, nor to be "solved"; it was merely a clever representation of the agenda for the world's first scientific SETI meeting in 1961, and it serves as a tool in formulating testable hypotheses. Further, they note that the existence of intelligent life on Earth is a plausible reason to expect it elsewhere, and that individual SETI projects have clearly defined "stop" conditions.
