
Saturday, January 10, 2015

Space exploration

From Wikipedia, the free encyclopedia
 
Saturn V rocket, used for the American manned lunar landing missions
The Moon as seen in a digitally processed image from data collected during a spacecraft flyby

Space exploration is the ongoing discovery and exploration of celestial structures in outer space by means of continuously evolving and growing space technology. While the study of space is carried out mainly by astronomers with telescopes, the physical exploration of space is conducted both by unmanned robotic probes and human spaceflight.

While the observation of objects in space, known as astronomy, predates reliable recorded history, it was the development of large and relatively efficient rockets during the early 20th century that allowed physical space exploration to become a reality. Common rationales for exploring space include advancing scientific research, uniting different nations, ensuring the future survival of humanity and developing military and strategic advantages against other countries.

Space exploration has often been used as a proxy competition for geopolitical rivalries such as the Cold War. The early era of space exploration was driven by a "Space Race" between the Soviet Union and the United States; the launch of the first man-made object to orbit the Earth, the USSR's Sputnik 1, on 4 October 1957, and the first Moon landing by the American Apollo 11 craft on 20 July 1969 are often taken as landmarks for this initial period. The Soviet space program achieved many of the first milestones, including the first living being in orbit in 1957, the first human spaceflight (Yuri Gagarin aboard Vostok 1) in 1961, the first spacewalk (by Aleksei Leonov) on 18 March 1965, the first automatic landing on another celestial body in 1966, and the launch of the first space station (Salyut 1) in 1971.

After the first 20 years of exploration, focus shifted from one-off flights to reusable hardware, such as the Space Shuttle program, and from competition to cooperation, as with the International Space Station (ISS).

With the substantial completion of the ISS[1] following STS-133 in March 2011, plans for space exploration by the USA remain in flux. Constellation, a Bush Administration program for a return to the Moon by 2020,[2] was judged inadequately funded and unrealistic by an expert review panel reporting in 2009.[3] The Obama Administration proposed a revision of Constellation in 2010 to focus on the development of the capability for crewed missions beyond low Earth orbit (LEO), envisioning extending the operation of the ISS beyond 2020, transferring the development of launch vehicles for human crews from NASA to the private sector, and developing technology to enable missions beyond LEO, such as Earth–Moon L1, the Moon, Earth–Sun L2, near-Earth asteroids, and Phobos or Mars orbit.[4]

In the 2000s, the People's Republic of China initiated a successful manned spaceflight program, while the European Union, Japan, and India have also planned future manned space missions. China, Russia, Japan, and India have advocated manned missions to the Moon during the 21st century, while the European Union has advocated manned missions to both the Moon and Mars during the 21st century.

From the 1990s onwards, private interests began promoting space tourism and then private space exploration of the Moon (see Google Lunar X Prize).

History of exploration in the 20th century


Most orbital flight actually takes place in upper layers of the atmosphere, especially in the thermosphere (not to scale)
Timeline of Solar System exploration.
In July 1950 the first Bumper rocket was launched from Cape Canaveral, Florida. The Bumper was a two-stage rocket consisting of a post-war V-2 topped by a WAC Corporal rocket. It could reach then-record altitudes of almost 400 km. Launched by the General Electric Company, this Bumper was used primarily for testing rocket systems and for research on the upper atmosphere. Bumper rockets carried small payloads that allowed them to measure attributes including air temperature and cosmic ray impacts.

The first steps of putting a man-made object into space were taken by German scientists during World War II while testing the V-2 rocket, which became the first man-made object in space on 3 October 1942 with the launching of the A-4. After the war, the U.S. used German scientists and their captured rockets in programs for both military and civilian research. The first scientific exploration from space was the cosmic radiation experiment launched by the U.S. on a V-2 rocket on 10 May 1946.[5] The first images of Earth taken from space followed the same year[6][7] while the first animal experiment saw fruit flies lifted into space in 1947, both also on modified V-2s launched by Americans. Starting in 1947, the Soviets, also with the help of German teams, launched sub-orbital V-2 rockets and their own variant, the R-1, including radiation and animal experiments on some flights. These suborbital experiments only allowed a very short time in space which limited their usefulness.

First flights

Sputnik 1, the first artificial satellite, orbited Earth at altitudes of 215 to 939 km (134 to 583 mi) in 1957, and was soon followed by Sputnik 2. See First satellite by country (replica pictured)
Apollo CSM in lunar orbit
Apollo 17 astronaut Harrison Schmitt standing next to a boulder at Taurus-Littrow.

The first successful orbital launch was of the Soviet unmanned Sputnik 1 ("Satellite 1") mission on 4 October 1957. The satellite weighed about 83 kg (183 lb), and is believed to have orbited Earth at a height of about 250 km (160 mi). It had two radio transmitters (20 and 40 MHz), which emitted "beeps" that could be heard by radios around the globe. Analysis of the radio signals was used to gather information about the electron density of the ionosphere, while temperature and pressure data was encoded in the duration of radio beeps. The results indicated that the satellite was not punctured by a meteoroid. Sputnik 1 was launched by an R-7 rocket. It burned up upon re-entry on 3 January 1958.

This success led to an escalation of the American space program, which unsuccessfully attempted to launch a Vanguard satellite into orbit two months later. On 31 January 1958, the U.S. successfully orbited Explorer 1 on a Juno rocket. In the meantime, the Soviet dog Laika became the first animal in orbit on 3 November 1957.

First human flights

The first successful human spaceflight was Vostok 1 ("East 1"), carrying the 27-year-old Russian cosmonaut Yuri Gagarin on 12 April 1961. The spacecraft completed one orbit around the globe, lasting about 1 hour and 48 minutes. Gagarin's flight resonated around the world; it was a demonstration of the advanced Soviet space program and it opened an entirely new era in space exploration: human spaceflight.

The U.S. first launched a person into space within a month of Vostok 1 with Alan Shepard's suborbital flight in Mercury-Redstone 3. Orbital flight was achieved by the United States when John Glenn's Mercury-Atlas 6 orbited the Earth on 20 February 1962.

Valentina Tereshkova, the first woman in space, orbited the Earth 48 times aboard Vostok 6 on 16 June 1963.

China first launched a person into space 42 years after the launch of Vostok 1, on 15 October 2003, with the flight of Yang Liwei aboard the Shenzhou 5 (Spaceboat 5) spacecraft.

First planetary explorations

The first artificial object to reach another celestial body was Luna 2 in 1959.[8] The first automatic landing on another celestial body was performed by Luna 9[9] in 1966. Luna 10 became the first artificial satellite of the Moon.[10]

The first manned landing on another celestial body was performed by Apollo 11 in its lunar landing on 20 July 1969.

The first successful interplanetary flyby was the 1962 Mariner 2 flyby of Venus (closest approach 34,773 kilometers). Flybys for the other planets were first achieved in 1965 for Mars by Mariner 4, 1973 for Jupiter by Pioneer 10, 1974 for Mercury by Mariner 10, 1979 for Saturn by Pioneer 11, 1986 for Uranus by Voyager 2, and 1989 for Neptune by Voyager 2.

The first interplanetary surface mission to return at least limited surface data from another planet was the 1970 landing of Venera 7 on Venus, which returned data to Earth for 23 minutes. In 1971 the Mars 3 mission achieved the first soft landing on Mars, returning data for almost 20 seconds. Later, much longer-duration surface missions were achieved, including over six years of Mars surface operation by Viking 1 from 1975 to 1982 and over two hours of transmission from the surface of Venus by Venera 13 in 1982, the longest-ever Soviet planetary surface mission.

Key people in early space exploration

The dream of stepping into the outer reaches of the Earth's atmosphere was driven by the fiction of Jules Verne[11][12][13] and H. G. Wells,[14] and rocket technology was developed to try to realise this vision.

The German V-2 was the first rocket to travel into space, overcoming the problems of thrust and material failure. During the final days of World War II this technology was obtained by both the Americans and Soviets, as were its designers. The initial driving force for further development of the technology was a weapons race for intercontinental ballistic missiles (ICBMs) to be used as long-range carriers for fast nuclear weapon delivery, but in 1961, when the USSR launched the first man into space, the U.S. declared itself to be in a "Space Race" with the Soviets.

Konstantin Tsiolkovsky, Robert Goddard, Hermann Oberth, and Reinhold Tiling laid the groundwork of rocketry in the early years of the 20th century.

Wernher von Braun was the lead rocket engineer for Nazi Germany's World War II V-2 rocket project. In the last days of the war he led a caravan of workers in the German rocket program to the American lines, where they surrendered and were brought to the USA to work on U.S. rocket development ("Operation Paperclip"). He acquired American citizenship and led the team that developed and launched Explorer 1, the first American satellite. Von Braun later led the team at NASA's Marshall Space Flight Center which developed the Saturn V moon rocket.

Initially the race for space was often led by Sergei Korolyov, whose legacy includes both the R-7 and Soyuz—which remain in service to this day. Korolyov was the mastermind behind the first satellite, the first man (and first woman) in orbit, and the first spacewalk. Until his death his identity was a closely guarded state secret; not even his mother knew that he was responsible for creating the Soviet space program.
Kerim Kerimov was one of the founders of the Soviet space program and was one of the lead architects behind the first human spaceflight (Vostok 1) alongside Sergey Korolyov. After Korolyov's death in 1966, Kerimov became the lead scientist of the Soviet space program and was responsible for the launch of the first space stations from 1971 to 1991, including the Salyut and Mir series, and their precursors in 1967, the Cosmos 186 and Cosmos 188.[15][16]

Other key people

  • Valentin Glushko held the role of Chief Engine Designer for the USSR. Glushko designed many of the engines used on the early Soviet rockets, but was constantly at odds with Korolyov.
  • Vasily Mishin was a Chief Designer working under Sergei Korolyov and one of the first Soviets to inspect the captured German V-2 design. Following Korolyov's death, Mishin was held responsible for the Soviet failure to be the first country to place a man on the Moon.
  • Robert Gilruth was the NASA head of the Space Task Force and director of 25 manned space flights. Gilruth was the person who suggested to John F. Kennedy that the Americans take the bold step of reaching the Moon in an attempt to reclaim space superiority from the Soviets.
  • Christopher C. Kraft, Jr. was NASA's first flight director, who oversaw development of Mission Control and associated technologies and procedures.
  • Maxime Faget was the designer of the Mercury capsule; he played a key role in designing the Gemini and Apollo spacecraft, and contributed to the design of the Space Shuttle.

Targets of exploration

Image of the Sun from 7 June 1992 showing some sunspots

The Sun

While the Sun will probably not be physically explored in the near future, one of the reasons for going into space is to learn more about it. Getting above the atmosphere, and beyond the Earth's magnetic field, gives access to the solar wind and to the infrared and ultraviolet radiation that cannot reach the surface of the Earth. The Sun generates most space weather, which can affect power generation and transmission systems on Earth and interfere with, and even damage, satellites and space probes.
MESSENGER image of Mercury

Mercury

Mercury remains the least explored of the inner planets. As of May 2013, the Mariner 10 and MESSENGER missions have been the only missions that have made close observations of Mercury. MESSENGER entered orbit around Mercury in March 2011, to further investigate the observations made by Mariner 10 in 1975 (Munsell, 2006b).
A MESSENGER image from 18,000 km showing a region about 500 km across

A third mission to Mercury, BepiColombo, scheduled to arrive in 2020, is to include two probes. BepiColombo is a joint mission between Japan and the European Space Agency. MESSENGER and BepiColombo are intended to gather complementary data to help scientists understand many of the mysteries discovered by Mariner 10's flybys.

Flights to other planets within the Solar System are accomplished at a cost in energy, which is described by the net change in velocity of the spacecraft, or delta-v. Due to the relatively high delta-v to reach Mercury and its proximity to the Sun, it is difficult to explore and orbits around it are rather unstable.
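The scale of that energy cost can be illustrated with a simple Sun-centred two-body estimate. The sketch below is illustrative only: it assumes circular, coplanar planetary orbits, ignores the planets' own gravity wells and any gravity assists, and uses standard published values for the solar gravitational parameter and orbital radii. It computes the two-burn Hohmann-transfer delta-v from Earth's orbit to Mercury, Mars, and Jupiter using the vis-viva equation.

```python
import math

MU_SUN = 1.327e20      # Sun's gravitational parameter, m^3/s^2
AU = 1.496e11          # astronomical unit, m

def hohmann_delta_v(r1, r2):
    """Total delta-v (m/s) for a two-burn Hohmann transfer between
    circular, coplanar heliocentric orbits of radii r1 and r2."""
    a = (r1 + r2) / 2.0                               # semi-major axis of transfer ellipse
    v1 = math.sqrt(MU_SUN / r1)                       # circular speed at departure orbit
    v2 = math.sqrt(MU_SUN / r2)                       # circular speed at arrival orbit
    vt1 = math.sqrt(MU_SUN * (2.0 / r1 - 1.0 / a))    # transfer-orbit speed at r1 (vis-viva)
    vt2 = math.sqrt(MU_SUN * (2.0 / r2 - 1.0 / a))    # transfer-orbit speed at r2 (vis-viva)
    return abs(vt1 - v1) + abs(v2 - vt2)

targets = {"Mercury": 0.387 * AU, "Mars": 1.524 * AU, "Jupiter": 5.203 * AU}
for name, r in targets.items():
    dv = hohmann_delta_v(1.0 * AU, r)
    print(f"Earth orbit -> {name}: ~{dv / 1000:.1f} km/s")
# Prints roughly 17 km/s for Mercury, 5.6 km/s for Mars and 14 km/s for Jupiter,
# showing why Mercury is, in delta-v terms, one of the hardest planetary targets.
```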
Mariner 10 image of Venus

Venus

Venus was the first target of interplanetary flyby and lander missions and, despite having one of the most hostile surface environments in the Solar System, has had more landers sent to it (nearly all from the Soviet Union) than any other planet. The first successful Venus flyby was by the American Mariner 2 spacecraft, which flew past Venus in 1962. Mariner 2 was followed by several other flybys by multiple space agencies, often as part of missions using a Venus flyby to provide a gravitational assist en route to other celestial bodies. In 1967 Venera 4 became the first probe to enter and directly examine the atmosphere of Venus. In 1970 Venera 7 became the first successful lander to reach the surface of Venus, and by 1985 it had been followed by eight additional successful Soviet Venus landers which provided images and other direct surface data. Starting in 1975 with the Soviet orbiter Venera 9, some ten successful orbiter missions have been sent to Venus, including later missions which were able to map the surface of Venus using radar to pierce the obscuring atmosphere.
The "marble" Earth picture taken by Apollo 17
First television image of Earth from space

Earth

Space exploration has been used as a tool to understand the Earth as a celestial object in its own right. Orbital missions can provide data for the Earth that can be difficult or impossible to obtain from a purely ground-based point of reference.
For example, the existence of the Van Allen belts was unknown until their discovery by the United States' first artificial satellite, Explorer 1. These belts contain radiation trapped by the Earth's magnetic field, which currently renders construction of habitable space stations above 1000 km impractical. Following this early unexpected discovery, a large number of Earth observation satellites have been deployed specifically to explore the Earth from a space-based perspective. These satellites have significantly contributed to the understanding of a variety of Earth-based phenomena. For instance, the hole in the ozone layer was found by an artificial satellite that was exploring Earth's atmosphere, and satellites have allowed for the discovery of archeological sites and geological formations that were difficult or impossible to otherwise identify.
The Moon as seen from the Earth
Apollo 16 astronaut John Young

Earth's Moon

Earth's Moon was the first celestial body to be the object of space exploration. It holds the distinctions of being the first remote celestial object to be flown by, orbited, and landed upon by spacecraft, and the only remote celestial object ever to be visited by humans.
In 1959 the Soviets obtained the first images of the far side of the Moon, never previously visible to humans. The U.S. exploration of the Moon began with the Ranger 4 impactor in 1962. Starting in 1966 the Soviets successfully deployed a number of landers to the Moon which were able to obtain data directly from the Moon's surface; just four months later, Surveyor 1 marked the debut of a successful series of U.S. landers. The Soviet unmanned missions culminated in the Lunokhod program in the early 1970s, which included the first unmanned rovers and also successfully returned lunar soil samples to the Earth for study, the first automated return of extraterrestrial soil samples. Unmanned exploration of the Moon continues, with various nations periodically deploying lunar orbiters and, in 2008, the Indian Moon Impact Probe.

Manned exploration of the Moon began in 1968 with the Apollo 8 mission that successfully orbited the Moon, the first time any extraterrestrial object was orbited by humans. In 1969 the Apollo 11 mission marked the first time humans set foot upon another world. Manned exploration of the Moon did not continue for long, however. The Apollo 17 mission in 1972 marked the most recent human visit there, and the next, Exploration Mission 2, is due to orbit the Moon in 2019. Robotic missions are still pursued vigorously.
Mars as seen by the HST
Surface of Mars as imaged by the Spirit rover in 2004

Mars

The exploration of Mars has been an important part of the space exploration programs of the Soviet Union (later Russia), the United States, Europe, and Japan. Dozens of robotic spacecraft, including orbiters, landers, and rovers, have been launched toward Mars since the 1960s. These missions were aimed at gathering data about current conditions and answering questions about the history of Mars. The questions raised by the scientific community are expected to not only give a better appreciation of the red planet but also yield further insight into the past, and possible future, of Earth.
The exploration of Mars has come at a considerable financial cost, with roughly two-thirds of all spacecraft destined for Mars failing before completing their missions, some before they even began. Such a high failure rate can be attributed to the complexity and large number of variables involved in an interplanetary journey, and has led researchers to jokingly speak of the Great Galactic Ghoul,[17] which subsists on a diet of Mars probes. This phenomenon is also informally known as the Mars Curse.[18] In contrast to the overall high failure rate in the exploration of Mars, India became the first country to succeed on its maiden attempt. India's Mars Orbiter Mission (MOM)[19][20][21] is one of the least expensive interplanetary missions ever undertaken, with an approximate total cost of INR 450 crore (US$73 million).[22][23]

Phobos

The Russian space mission Fobos-Grunt, which launched on 9 November 2011, experienced a failure that left it stranded in low Earth orbit.[24] It was to begin exploration of Phobos and of circum-Martian orbit, and to study whether the moons of Mars, or at least Phobos, could be a "trans-shipment point" for spaceships travelling to Mars.[25]

Jupiter

Voyager 1 image of Jupiter
Image of Io taken by the Galileo spacecraft

The exploration of Jupiter has consisted solely of a number of automated NASA spacecraft visiting the planet since 1973. A large majority of the missions have been "flybys", in which detailed observations are taken without the probe landing or entering orbit; the Galileo spacecraft is the only one to have orbited the planet. As Jupiter is believed to have only a relatively small rocky core and no real solid surface, a landing mission is nearly impossible.

Reaching Jupiter from Earth requires a delta-v of 9.2 km/s,[26] which is comparable to the 9.7 km/s delta-v needed to reach low Earth orbit.[27] Fortunately, gravity assists through planetary flybys can be used to reduce the energy required at launch to reach Jupiter, albeit at the cost of a significantly longer flight duration.[26]
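As a rough illustration of how a flyby adds heliocentric speed without burning propellant, here is a minimal 2-D sketch. The geometry and numbers are illustrative (roughly a Hohmann-style arrival at Jupiter's distance), not those of any real mission: in the planet's frame the spacecraft's excess speed is unchanged, but its direction is rotated through the hyperbolic turn angle, and adding the planet's orbital velocity back in leaves the spacecraft moving faster relative to the Sun.

```python
import math

MU_JUPITER = 1.267e17     # Jupiter's gravitational parameter, m^3/s^2
R_JUPITER = 7.1492e7      # Jupiter's equatorial radius, m

# Illustrative pre-flyby state (roughly a Hohmann arrival at Jupiter's distance):
v_planet = (13060.0, 0.0)        # Jupiter's heliocentric velocity, m/s (along +x)
v_sc_in = (7420.0, 0.0)          # spacecraft heliocentric velocity at arrival, m/s

# Velocity relative to the planet ("v-infinity"); its magnitude is conserved by the flyby.
vin = (v_sc_in[0] - v_planet[0], v_sc_in[1] - v_planet[1])
v_inf = math.hypot(*vin)

# Turn angle of the flyby hyperbola for a chosen periapsis distance (here 3 Jupiter radii).
r_p = 3.0 * R_JUPITER
e = 1.0 + r_p * v_inf**2 / MU_JUPITER        # eccentricity of the flyby hyperbola
delta = 2.0 * math.asin(1.0 / e)             # total turn angle, radians

# Rotate the incoming relative velocity by the turn angle (sign = which side we pass).
c, s = math.cos(delta), math.sin(delta)
vout = (c * vin[0] - s * vin[1], s * vin[0] + c * vin[1])

# Back in the Sun's frame, the planet's orbital velocity is added back unchanged.
v_sc_out = (v_planet[0] + vout[0], v_planet[1] + vout[1])

print(f"heliocentric speed before flyby: {math.hypot(*v_sc_in)/1000:.1f} km/s")
print(f"heliocentric speed after flyby:  {math.hypot(*v_sc_out)/1000:.1f} km/s")
# The spacecraft leaves roughly 10 km/s faster relative to the Sun in this geometry;
# the planet gives up a correspondingly tiny, unmeasurable share of its orbital energy.
```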

Jupiter has over 60 known moons, about many of which relatively little is known.
A picture of Saturn taken by Voyager 2.
Huygens image from the surface of Titan

Saturn

Saturn has been explored only through unmanned spacecraft launched by NASA, including one mission (Cassini–Huygens) planned and executed in cooperation with other space agencies. These missions consist of flybys in 1979 by Pioneer 11, in 1980 by Voyager 1, in 1982 by Voyager 2, and an orbital mission by the Cassini spacecraft, which entered orbit in 2004 and is expected to continue operating into 2017.
Saturn has at least 62 known moons, although the exact number is debatable since Saturn's rings are made up of vast numbers of independently orbiting objects of varying sizes. The largest of the moons is Titan. Titan holds the distinction of being the only moon in the solar system with an atmosphere denser and thicker than that of the Earth. As a result of the deployment from the Cassini spacecraft of the Huygens probe and its successful landing on Titan, Titan also holds the distinction of being the only moon (apart from Earth's own Moon) to be successfully explored with a lander.
Uranus from Voyager 2
Voyager 2 image showing the tortured surface of Miranda

Uranus

The exploration of Uranus has been entirely through the Voyager 2 spacecraft, with no other visits currently planned. Given its axial tilt of 97.77°, with its polar regions exposed to sunlight or darkness for long periods, scientists were not sure what to expect at Uranus. The closest approach to Uranus occurred on 24 January 1986. Voyager 2 studied the planet's unique atmosphere and magnetosphere. Voyager 2 also examined its ring system and the moons of Uranus including all five of the previously known moons, while discovering an additional ten previously unknown moons.
Images of Uranus proved to have a very uniform appearance, with no evidence of the dramatic storms or atmospheric banding evident on Jupiter and Saturn. Great effort was required to even identify a few clouds in the images of the planet. The magnetosphere of Uranus, however, proved to be unique, profoundly affected by the planet's unusual axial tilt. In contrast to the bland appearance of Uranus itself, striking images were obtained of the moons of Uranus, including evidence that Miranda had been unusually geologically active.
Picture of Neptune taken by Voyager 2
Triton as imaged by Voyager 2

Neptune

The exploration of Neptune began with the 25 August 1989 Voyager 2 flyby, the sole visit to the system as of 2014. The possibility of a Neptune Orbiter has been discussed, but no other missions have been given serious thought.
Although the extremely uniform appearance of Uranus during Voyager 2's visit in 1986 had led to expectations that Neptune would also have few visible atmospheric phenomena, Voyager 2 found that Neptune had obvious banding, visible clouds, auroras, and even a conspicuous anticyclonic storm system, the Great Dark Spot, comparable in relative size to Jupiter's Great Red Spot. Neptune also proved to have the fastest winds of any planet in the solar system, measured as high as 2,100 km/h.[28] Voyager 2 also examined Neptune's ring and moon system. It found that Neptune has faint but complete rings, with denser partial ring "arcs" embedded in them. In addition to examining Neptune's three previously known moons, Voyager 2 also discovered five previously unknown moons, one of which, Proteus, proved to be the second-largest moon in the system. Data from Voyager further reinforced the view that Neptune's largest moon, Triton, is a captured Kuiper belt object.[29]

Other objects in the Solar System

Pluto

Pluto and Charon (1994)

The dwarf planet Pluto (considered a planet until the IAU redefined "planet" in August 2006[30]) presents significant challenges for spacecraft because of its great distance from Earth (requiring high velocity for reasonable trip times) and small mass (making capture into orbit very difficult at present). Voyager 1 could have visited Pluto, but controllers opted instead for a close flyby of Saturn's moon Titan, resulting in a trajectory incompatible with a Pluto flyby. Voyager 2 never had a plausible trajectory for reaching Pluto.[31]

Pluto continues to be of great interest, despite its reclassification as the lead and nearest member of a new and growing class of distant icy bodies of intermediate size, in mass between the remaining eight planets and the small rocky objects historically termed asteroids (and also the first member of the important subclass, defined by orbit and known as "Plutinos"). After an intense political battle, a mission to Pluto dubbed New Horizons was granted funding from the US government in 2003.[32] New Horizons was launched successfully on 19 January 2006. In early 2007 the craft made use of a gravity assist from Jupiter. Its closest approach to Pluto will be on 14 July 2015; scientific observations of Pluto will begin five months prior to closest approach and will continue for at least a month after the encounter.

Asteroids and comets

Asteroid 4 Vesta, imaged by the Dawn spacecraft

Until the advent of space travel, objects in the asteroid belt were merely pinpricks of light in even the largest telescopes, their shapes and terrain remaining a mystery. Several asteroids have now been visited by probes, the first of which was Galileo, which flew past two: 951 Gaspra in 1991, followed by 243 Ida in 1993. Both of these lay near enough to Galileo's planned trajectory to Jupiter that they could be visited at acceptable cost. The first landing on an asteroid was performed by the NEAR Shoemaker probe in 2000, following an orbital survey of the object. The dwarf planet Ceres and the asteroid 4 Vesta, two of the three largest asteroids, are targets of NASA's Dawn mission, launched in 2007.

While many comets have been closely studied from Earth, sometimes with centuries' worth of observations, only a few comets have been visited at close range. In 1985, the International Cometary Explorer conducted the first comet flyby (of 21P/Giacobini–Zinner) before joining the Halley Armada studying the famous comet. The Deep Impact probe smashed into 9P/Tempel to learn more about its structure and composition, while the Stardust mission returned samples of another comet's tail. The Philae lander successfully landed on comet 67P/Churyumov–Gerasimenko in 2014 as part of the broader Rosetta mission.

Hayabusa was an unmanned spacecraft developed by the Japan Aerospace Exploration Agency to return a sample of material from a small near-Earth asteroid named 25143 Itokawa to Earth for further analysis. Hayabusa was launched on 9 May 2003 and rendezvoused with Itokawa in mid-September 2005. After arriving at Itokawa, Hayabusa studied the asteroid's shape, spin, topography, colour, composition, density, and history. In November 2005, it landed on the asteroid to collect samples. The spacecraft returned to Earth on 13 June 2010.

Deep space exploration

Chandra, Hubble, and Spitzer image NGC 1952
Star cluster Pismis 24 and NGC 6357
Whirlpool Galaxy (Messier 51)

Future of space exploration

Concept art for a NASA Vision mission

In the 2000s, several plans for space exploration were announced; both government entities and the private sector have space exploration objectives. China has announced plans to have a 60-ton multi-module space station in orbit by 2020.

The NASA Authorization Act of 2010 provided a re-prioritized list of objectives for the American space program, as well as funding for the first priorities. NASA proposes to move forward with the development of the Space Launch System (SLS), which will be designed to carry the Orion Multi-Purpose Crew Vehicle, as well as important cargo, equipment, and science experiments, to Earth orbit and destinations beyond. Additionally, the SLS will serve as a backup for commercial and international partner transportation services to the International Space Station. The SLS rocket will incorporate technological investments from the Space Shuttle program and the Constellation program in order to take advantage of proven hardware and reduce development and operations costs. The first developmental flight is targeted for the end of 2017.[33]

AI in Space Exploration

The idea of using high-level automated systems for space missions has become a desirable goal for space agencies all around the world. Such systems are believed to yield benefits such as lower cost, less human oversight, and the ability to explore deeper into space, which is usually restricted by long communication delays with human controllers.[34]

Autonomous System

Autonomy is defined by three requirements (a minimal code sketch follows the list):[34]
  1. Being able to sense the world and its own state, make decisions, and carry them out on its own
  2. Being able to interpret a given goal as a list of actions to take
  3. Being able to fail flexibly
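Here is the minimal sketch referred to above, mapping the three requirements onto a toy sense-decide-act loop. It is purely illustrative: the class, method names and behaviours are invented for this example and do not correspond to any real flight software.

```python
import random

class AutonomousProbe:
    """Toy sense-decide-act loop illustrating the three requirements above.
    All names and behaviours here are hypothetical."""

    def __init__(self, goal):
        self.goal = goal                        # e.g. "survey target region"
        self.plan = self.interpret_goal(goal)   # requirement 2: goal -> list of actions

    def sense(self):
        # Requirement 1 (part): sense the world and its own state.
        return {"power_ok": random.random() > 0.05, "target_visible": random.random() > 0.5}

    def interpret_goal(self, goal):
        # Requirement 2: turn a high-level goal into concrete actions.
        return ["point_instrument", "acquire_image", "analyse_image", "downlink_if_interesting"]

    def act(self, action, state):
        # Requirement 1 (part): decide and carry out actions without ground contact.
        if not state["power_ok"]:
            return self.fail_flexibly(action)
        print(f"executing: {action}")
        return True

    def fail_flexibly(self, action):
        # Requirement 3: degrade gracefully instead of halting the mission.
        print(f"skipping '{action}': entering safe mode, will replan later")
        return False

    def run(self):
        for action in self.plan:
            self.act(action, self.sense())

AutonomousProbe("survey target region").run()
```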

Benefits

Autonomous technologies would be able to perform beyond predetermined actions. They would analyze the states of, and events happening in, their surroundings and come up with a safe response. In addition, such technologies can reduce launch cost and ground involvement. Performance would increase as well. Autonomous systems would be able to respond quickly upon encountering an unforeseen event, especially in deep space exploration where communication back to Earth would take too long.[34]

NASA’s Autonomous Science Experiment

NASA began its Autonomous Science Experiment (ASE) on Earth Observing-1 (EO-1), NASA's first satellite in the New Millennium Program Earth-observing series, launched on 21 November 2000. The autonomy software of ASE is capable of on-board science analysis, replanning, robust execution, and, added later, model-based diagnostics. Images obtained by EO-1 are analyzed on board and downlinked when a change or an interesting event occurs. The ASE software has successfully provided over 10,000 science images.[34]
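As a rough, hedged illustration of the "analyse on board, downlink only what is interesting" idea, the toy sketch below compares successive observations of the same scene and flags a downlink only when enough pixels have changed. The real ASE software uses far more sophisticated, science-driven classifiers; the threshold and array sizes here are arbitrary.

```python
import numpy as np

def worth_downlinking(previous, current, change_fraction=0.02):
    """Flag an image for downlink when enough pixels changed significantly.
    Thresholds are arbitrary, for illustration only."""
    changed = np.abs(current.astype(float) - previous.astype(float)) > 25  # per-pixel change
    return changed.mean() > change_fraction                               # scene-level change

rng = np.random.default_rng(0)
baseline = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)

quiet_scene = baseline.copy()                      # nothing happened since last pass
event_scene = baseline.copy()
event_scene[40:80, 40:80] = 255                    # simulated flood / eruption / cloud change

print("quiet scene downlinked:", worth_downlinking(baseline, quiet_scene))   # False
print("event scene downlinked:", worth_downlinking(baseline, event_scene))   # True
```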

Rationales

Astronaut Buzz Aldrin had a personal Communion service when he first arrived on the surface of the Moon.

The research that is conducted by national space exploration agencies, such as NASA and Roscosmos, is one of the reasons supporters cite to justify government expenses. Economic analyses of the NASA programs often showed ongoing economic benefits (such as NASA spin-offs), generating many times the revenue of the cost of the program.[35] It is also argued that space exploration would lead to the extraction of resources on other planets and especially asteroids, which contain billions of dollars' worth of minerals and metals, and that such expeditions could generate substantial revenue.[36] As well, it has been argued that space exploration programs help inspire youth to study science and engineering.[37]

Another claim is that space exploration is a necessity to mankind and that staying on Earth will lead to extinction. Some of the cited risks are the depletion of natural resources, comet impacts, nuclear war, and worldwide epidemics. Stephen Hawking, the renowned British theoretical physicist, said that "I don't think the human race will survive the next thousand years, unless we spread into space. There are too many accidents that can befall life on a single planet. But I'm an optimist. We will reach out to the stars."[38]

NASA has produced a series of public service announcement videos supporting the concept of space exploration.[39]

Overall, the public remains largely supportive of both manned and unmanned space exploration. According to an Associated Press Poll conducted in July 2003, 71% of U.S. citizens agreed with the statement that the space program is "a good investment", compared to 21% who did not.[40]

Arthur C. Clarke (1950) presented a summary of motivations for the human exploration of space in his non-fiction semi-technical monograph Interplanetary Flight.[41] He argued that humanity's choice is essentially between expansion off the Earth into space, versus cultural (and eventually biological) stagnation and death.

Topics

Delta-v's in km/s for various orbital maneuvers

Spaceflight

Spaceflight is the use of space technology to achieve the flight of spacecraft into and through outer space.
Spaceflight is used in space exploration, and also in commercial activities like space tourism and satellite telecommunications. Additional non-commercial uses of spaceflight include space observatories, reconnaissance satellites and other earth observation satellites.

A spaceflight typically begins with a rocket launch, which provides the initial thrust to overcome the force of gravity and propels the spacecraft from the surface of the Earth. Once in space, the motion of a spacecraft—both when unpropelled and when under propulsion—is covered by the area of study called astrodynamics. Some spacecraft remain in space indefinitely, some disintegrate during atmospheric reentry, and others reach a planetary or lunar surface for landing or impact.

Satellites

Satellites are used for a large number of purposes. Common types include military (spy) and civilian Earth observation satellites, communication satellites, navigation satellites, weather satellites, and research satellites. Space stations and human spacecraft in orbit are also satellites.

Commercialization of space

Current examples of the commercial use of space include satellite navigation systems, satellite television and satellite radio. Space tourism is the recent phenomenon of space travel by individuals for the purpose of personal pleasure.

Alien life

Astrobiology is the interdisciplinary study of life in the universe, combining aspects of astronomy, biology and geology.[42] It is focused primarily on the study of the origin, distribution and evolution of life. It is also known as exobiology (from Greek: έξω, exo, "outside").[43][44][45] The term "xenobiology" has been used as well, but this is technically incorrect because it literally means "biology of the foreigners".[46] Astrobiologists must also consider the possibility of life that is chemically entirely distinct from any life found on Earth.[47] In the Solar System some of the prime locations for current or past astrobiology are on Enceladus, Europa, Mars, and Titan.[48]

Living in space

The European Space Agency's Columbus Module at the International Space Station, launched into space on the U.S. Space Shuttle mission STS-122 in 2008

Space colonization, also called space settlement and space humanization, would be the permanent autonomous (self-sufficient) human habitation of locations outside Earth, especially of natural satellites or planets such as the Moon or Mars, using significant amounts of in-situ resource utilization.

To date, the longest continuous human presence in space has been aboard the International Space Station, which has been in continuous use for 14 years and 69 days. Valeri Polyakov's record single spaceflight of almost 438 days aboard the Mir space station has not been surpassed. Long-term stays in space reveal issues with bone and muscle loss in low gravity, immune system suppression, and radiation exposure.

Many past and current concepts for the continued exploration and colonization of space focus on a return to the Moon as a "stepping stone" to the other planets, especially Mars. At the end of 2006 NASA announced they were planning to build a permanent Moon base with continual presence by 2024.[49]

Beyond the technical factors that could make living in space more widespread, it has been suggested that the lack of private property, the inability or difficulty in establishing property rights in space, has been an impediment to the development of space for human habitation. Since the advent of space technology in the latter half of the twentieth century, the ownership of property in space has been murky, with strong arguments both for and against. In particular, the making of national territorial claims in outer space and on celestial bodies has been specifically proscribed by the Outer Space Treaty, which had been, as of 2012, ratified by all spacefaring nations.[50]

Friday, January 9, 2015

On the futility of climate models: ‘simplistic nonsense’

Guest essay by Leo Smith – elevated from a comment left on WUWT on January 6, 2015 at 2:11 am (h/t to dbs)


As an engineer, my first experience of a computer model taught me nearly all I needed to know about models.

I was tasked with designing a high voltage video amplifier to drive a military heads up display featuring a CRT.

Some people suggested I make use of the acoustic coupler to input my design and optimise it with one of the circuit modelling programs they had devised. The results were encouraging, so I built it. The circuit itself was a dismal failure.

Investigation revealed the reason instantly: the model parametrised parasitic capacitance into a simple single value; the reality of semiconductors is that the capacitance varies with applied voltage – an effect made use of in every radio today as the ‘varicap diode’. For small signals this is an acceptable compromise. Over large voltage swings the effect is massively non-linear. The model was simply inadequate.
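The point about the parametrisation can be made numerically. The sketch below uses illustrative component values (not those of the original amplifier) and the standard reverse-bias junction-capacitance law C(V) = C0/(1 + V/V0)^m: over a few hundred millivolts a single fixed capacitance is a fair stand-in, but over a large video swing the real capacitance changes by an order of magnitude, which is exactly the non-linearity the simple model threw away.

```python
# Junction (varicap-style) capacitance versus a fixed small-signal value.
# Component values are illustrative, not taken from the original design.
C_J0 = 10e-12   # zero-bias junction capacitance, farads
V_BI = 0.7      # built-in potential, volts
M = 0.5         # grading coefficient for an abrupt junction

def junction_capacitance(v_reverse):
    """Reverse-bias junction capacitance, C(V) = C0 / (1 + V/Vbi)^m."""
    return C_J0 / (1.0 + v_reverse / V_BI) ** M

fixed_model = junction_capacitance(1.0)   # the single value a crude model might assume

for v in (0.0, 0.3, 1.0, 10.0, 50.0, 100.0):
    c = junction_capacitance(v)
    print(f"{v:6.1f} V reverse bias: C = {c*1e12:5.2f} pF "
          f"(fixed model assumes {fixed_model*1e12:.2f} pF)")
# Over a small swing the error is modest; over a 100 V video swing the real
# capacitance falls by an order of magnitude, so a single-value model fails.
```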

Most of engineering is to design things so that small unpredictable effects are swamped by large predictable ones. Any stable design has to work like that. If it doesn’t, it ain’t stable. Or reproducible.

That leads to a direct piece of engineering wisdom: if a system is not dominated by a few major feedback factors, it ain’t stable. And if it has regions of stability, then perturbing it outside those regions will result in gross instability, and the system will be short lived.

Climate has been, in real terms, amazingly stable. For millions of years. It has maintained an average of about 282 degrees absolute, ± about 5 degrees, since forever.

So-called ‘climate science’ relies on net positive feedback to create alarmist views – and that positive feedback allegedly has nothing to do with CO2: on the contrary, it is a temperature-change amplifier pure and simple.

If such a feedback existed, any driver of temperature, from a minor change in the sun’s output to a volcanic eruption, must inevitably trigger massive temperature changes. But it simply never has. Or we wouldn’t be here to spout such nonsense.

With all simple known factors taken care of the basic IPCC style equation boils down to:

∆T = λ.k.log( ∆CO2)

where lambda (λ) is the climate sensitivity: it expresses the presupposed propensity of any warming directly attributable to CO2 radiative forcing (k.log(∆CO2)), and its resultant direct temperature change, to be amplified by some unexplained and unknown feedback factor, which is adjusted to match such late-20th-century warming as was reasonably certain.
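To see how that formula behaves numerically, here is a small worked example. The forcing coefficient and the λ values are illustrative stand-ins of the kind commonly quoted (about 5.35 W/m² per natural log of the concentration ratio, and a spread of sensitivities); they are not the essay's own numbers, and the point is only that λ rescales the same logarithmic shape.

```python
import math

K_FORCING = 5.35   # W/m^2 per natural log of CO2 concentration ratio (commonly quoted value)

def delta_t(c_ratio, lam):
    """IPCC-style form: delta T = lambda * k * ln(C/C0).
    lam is a sensitivity in kelvin per W/m^2 (illustrative values only)."""
    return lam * K_FORCING * math.log(c_ratio)

for lam in (0.3, 0.8, 1.2):
    warming = delta_t(2.0, lam)   # warming for a doubling of CO2
    print(f"lambda = {lam:.1f} K/(W/m^2): doubling CO2 gives ~{warming:.1f} K")
# Roughly 1.1 K, 3.0 K and 4.5 K respectively: arguing over lambda changes the
# scale of the answer, but not the logarithmic shape of the curve.
```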

Everyone argues over the value of lambda. No one is arguing over the actual shape of the equation itself.

And that is the sleight of hand of the IPCC…arguments about climate sensitivity are pure misdirection away from the actuality of what is going on.

Consider an alternative:

∆T = k.log( ∆CO2) + f(∆x)

In terms of matching late 20th century warming, this is equally good, and relies merely on introducing another unknown to replace the unknown lambda, this time not as a multiplier of CO2-driven change, but as a completely independent variable.

Philosophically both have one unknown. There is little to choose between them.

Scientifically both the rise and the pause together fit the second model far better.

Worse, consider some possible mechanisms for what X might be….

∆T = k.log( ∆CO2) + f(∆T).

Let’s say that f(∆T) is in fact a function whose current value depends, non-linearly and with time delays, on past values of temperature. So it does indeed represent temperature feedback creating new temperatures!

This is quite close to the IPCC model, but with one important proviso. The overall long term feedback MUST be negative, otherwise temperatures would be massively unstable over geological timescales.

BUT we know that short term fluctuations of quite significant values – ice ages and warm periods – are also in evidence.

Can long term negative feedback create shorter term instability? Hell yes! If you have enough terms and some time delay, it’s a piece of piss.
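That claim is easy to check numerically. The toy sketch below integrates the simplest possible delayed feedback, dT/dt = -k·T(t - τ): with no external forcing at all, purely negative feedback gives decaying oscillations when k·τ is moderate, and self-sustaining, growing oscillations once k·τ exceeds π/2. The constants are arbitrary illustration values, not a climate model.

```python
import math

def simulate(k, tau, t_end=60.0, dt=0.01):
    """Euler-integrate dT/dt = -k * T(t - tau) from a tiny initial perturbation."""
    n_delay = int(tau / dt)
    history = [0.001] * (n_delay + 1)       # past values of T, newest last
    peaks = []
    prev, cur = history[-2], history[-1]
    t = 0.0
    while t < t_end:
        nxt = cur + dt * (-k * history[-n_delay - 1])   # feedback acts on the delayed value
        history.append(nxt)
        if cur > prev and cur > nxt:                    # record local maxima (oscillation peaks)
            peaks.append(cur)
        prev, cur, t = cur, nxt, t + dt
    return peaks

for k, tau in ((1.0, 1.0), (1.0, 2.0)):                 # k*tau below and above pi/2
    peaks = simulate(k, tau)
    trend = "growing" if len(peaks) > 1 and peaks[-1] > peaks[0] else "decaying"
    print(f"k*tau = {k*tau:.1f}: {len(peaks)} oscillation peaks, amplitude {trend}")
```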

The climate has all the elements needed: temperature and water. Water vapour (a greenhouse gas: acts to increase temperatures), clouds (reduce daytime temps, increase night-time temps) and ice (massive albedo modifier: acts to reduce temperatures) are functions of sea and air temperature, and sea and air temperature are a function, via albedo and greenhouse modifiers, of water vapour concentrations. Better yet, the latent heat of ice/water represents massive amounts of energy needed to effect a phase transition at a single temperature. Lots of lovely non-linearity there. Plus huge delays of decadal or multidecadal length in terms of ocean current circulations and the melting/freezing of ice sheets and permafrost.

Not to mention continental drift, which adds further water cycle variables into the mix.

Or glaciation that causes falling sea levels, thus exposing more land to lower the albedo where the earth is NOT frozen, and glaciation that strips water vapour out of the air reducing cloud albedo in non glaciated areas.

It’s a massive, non-linear, hugely time-delayed negative feedback system. And that’s just water and ice. Before we toss in volcanic action, meteor strikes, continental drift, solar variability, and Milankovitch cycles…

The miracle of AGW is that all this has been simply tossed aside, or considered some kind of constant, or a multiplier of the only driver in town, CO2.

When all you know is linear systems analysis everything looks like a linear system perturbed by an external driver.

When the only driver you have come up with is CO2, everything looks like CO2.

Engineers who have done control system theory are not so arrogant. And they can recognise in the irregular sawtooth of the ice-age temperature record a system that looks remarkably like a nasty multiple (negative) feedback, time-delayed relaxation oscillator.

Oscillators don’t need external inputs to change, they do that entirely within the feedback that comprises them. Just one electron of thermal noise will start them off.

What examination of the temperature record shows is that glaciation is slow. It takes many, many thousands of years as the ice increases before the lowest temperatures are reached, but the warming transitions are much faster – we are only 10,000 years out of the last one.

The point finally is this: to an engineer, climate science as the IPCC has it is simplistic nonsense. There are far, far better models available to explain climate change, based on the complexity of water interactions with temperature. Unfortunately they are far too complex even for the biggest of computers to be much use in simulating climate. And they have no political value anyway, since they will essentially say ‘Climate changes irrespective of human activity, over 100-thousand-year major cycles, and within that it’s simply unpredictable noise due to many factors, none of which we have any control over’.

UPDATE: An additional and clarifying comment has been posted by Leo Smith on January 6, 2015 at 6:32 pm

Look, this post was elevated (without me being aware…) from a blog comment typed in in a hurry. I accept the formula isn’t quite what I meant, but you get the general idea OK?

If I had known it was going to become a post I’d have taken a lot more care over it.

Not used k where it might confuse. Spotted that delta log is not the same as log delta.

But the main points stand:

(i) The IPCC ‘formula’ fits the data less well than other equally simple formulae with just as many unknowns.

(ii) The IPCC formula is a linear differential equation.

(iii) There is no reason to doubt that large parts of the radiative/convective thermal cycle/balance of the climate are non-linear.

(iv) There are good historical reasons to suppose that the overall feedback of the climate system is negative, not positive as the IPCC assumes.

(v) Given the number of feedback paths and the lags associated with them, there is more than enough scope in the climate for self-generated chaotic quasi-periodic fluctuations to be generated even without any external inputs beyond a steady sun.

(vi) Given the likely shape of the overall real climate equation, there is no hope of anything like a realistic forecast ever being obtained with the current generation of computer systems and mathematical techniques. Chaos style equations are amongst the hardest and most intractable problems we have, and indeed there may well be no final answer to climate change beyond a butterfly flapping its wings in Brazil and tipping the climate into a new ice age, or a warm period, depending ;-)

(vii) A point I didn’t make: a chaotic system is never ‘in balance’, and even its average value has little meaning, because it’s simply a mathematical oddity – a single point in a range where the system never rests – it merely represents a point between the upper and lower bounds. Worse, in a system with multiple attractors, it may not even be anywhere near where the system orbits for any length of time.
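Points (vi) and (vii) can be illustrated with the textbook Lorenz system, the standard toy example of chaotic sensitivity (it is not a climate model, just a demonstration of the kind of behaviour described): two runs whose starting points differ by one part in a billion become completely unrelated within a few dozen time units, and neither ever settles at its long-term average.

```python
def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz equations (standard chaotic parameters)."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)          # identical except for a one-part-in-a-billion nudge

for step in range(1, 50001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 10000 == 0:
        separation = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"t = {step * 0.001:5.1f}: separation = {separation:.3e}")
# The separation grows by many orders of magnitude: after a few dozen time units the
# two trajectories are effectively unrelated, which is what makes long-range
# prediction of such systems hopeless in practice.
```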

In short my current thinking says:
– there is no such thing as a normal climate, nor does it have a balance that man has disturbed, or could disturb. It’s constantly changing, and may go anywhere from ice age to seriously warm over extremely long periods of time. It does this all by itself. There need be no external drivers to move it from one attractor to another or cause it to orbit any given attractor. That climate changes is unarguable; that anything beyond climate itself is causing it is deeply doubtful. That CO2 has a major effect is, on the data, as absurd as claiming that CO2 has no effect at all.

What we are looking at here is very clever misdirection cooked up for economic and political motives: it suited many people’s books to paint CO2 emissions as a scary pollutant, and a chance temporary correlation of rising temperatures and CO2 was combined in a linear way that any third-rate scientist could understand, to present a plausible formula for scary AGW. I have pointed out that other interpretations of the data make a non-scary scenario and indeed, post the Pause, actually fit the data better.

Occam’s razor has nothing to say in defence of either.

Popper’s falsifiability is no help, because the one model – the IPCC’s – has been falsified. The other can make no predictions beyond ‘change happens all by itself in ways we cannot hope to predict’. So that cannot be falsified. If you want to test Newton’s laws, the last experiment you would use is throwing an egg at a spike to predict where the bits of eggshell are going to land…

Net result is climate science isn’t worth spending a plugged nickel on, and we should spend the money on being reasonably ready for moderate climate change in either direction. Some years ago my business partner – ten years my junior – wanted to get key man insurance in case I died or fell under a bus. ‘How much for how much?’ ‘Well, you are a smoker, and old, so it’s a lot.’ It was enough, in fact, to wipe out the annual profits, and the business, twice over. Curiously he is now dead from prostate cancer, and I have survived testicular cancer and, with luck, a blocked coronary artery. Sometimes you just take the risk because insuring against it costs more… If we had been really serious about climate change we would be 100% nuclear by now. It was proven, safe technology and, dollar for dollar, has ten times the carbon-reduction impact of renewables. But of course carbon reduction was not the actual game plan. Political control of energy was. It’s so much easier and cheaper to bribe governments than compete in a free market…
IF – and this is something that should be demonstrable – the dominant feedback terms in the real climate equations are non linear, and multiple and subject to time delay, THEN we have a complex chaotic system that will be in constant more or less unpredictable flux.

And we are pissing in the wind trying to model it with simple linear differential equations and parametrised nonsense.

The whole sleight of hand of the AGW movement has been to convince scientists who do NOT understand non-linear control theory that they didn’t NEED to understand it to model climate, that any fluctuations MUST be ’caused’ by an externality, and to pick on the most politically and commercially convenient one – CO2 – that resonated with a vastly anti-science and non-commercial sentiment left over from the Cold War ideological battles. AGW is AgitProp, not science. AGW flatters all the worst people into thinking they are more important than they are. To a man, every grass-roots green movement has taken a government coin, as have the universities, and they are all dancing to the piper who is paid by the unholy aggregation of commercial interest, political power broking and political marketing.

They bought them all. They couldn’t however buy the climate. Mother Nature is not a whore.

Whether AGW is a deliberate fraud, an honest mistake, or mere sloppy ignorant science is moot. At any given level it is one or the other or any combination.

What it really is, is an emotional narrative, geared to flatter the stupid and pander to their bigotry, in order to make them allies in a process that, if they knew its intentions, they would utterly oppose.

Enormous damage to the environment is justified by environmentalists because the Greater Cause says that windmills and solar panels will Save the Planet. Even when it’s possible to demonstrate that they have almost no effect on emissions at all, and it is deeply doubtful whether those emissions are in any way significant anyway.

Green is utterly anti-nuclear. Yet which – even on their own claims – is less harmful: a few hundred tonnes of long-lived radionuclides encased in glass and dumped a mile underground, or a billion tonnes of CO2?

Apparently the radiation, which hasn’t injured or killed a single person at Fukushima, is far, far more dangerous than the CO2, because Germany would rather burn stinking lignite, having utterly polluted its rivers in strip-mining it, than allow a nuclear power plant to operate inside its borders.

Years ago Roy Harper sang
“You can lead a horse to water, but you cannot make him drink
You can lead a man to slaughter, but you’ll never make him think”

I had a discussion with a gloomy friend today. We agreed the world is a mess because people don’t think; they follow leaders, trends, emotional narratives, received wisdom. Never once do they step back and ask, ‘What really is going on here?’. Another acquaintance, doing management training in the financial arena, chalked up on the whiteboard: ‘Anyone who presages a statement with the words “I think” and then proceeds to regurgitate someone else’s opinions, analysis or received wisdom, will fail this course and be summarily ejected.’

And finally Anthony, I am not sure I wanted that post to become an article. I don’t want to be someone else’s received wisdom. I want the buggers to start thinking for themselves.

If that means studying control theory, systems analysis and chaos mathematics, then do it. And form your own opinions.

“Don’t follow leaders, watch your parking meters”

I say people don’t think. Prove me wrong. Don’t believe what I say, do your own analysis. Stop trusting and start thinking.

I’ll leave you with a final chilling thought. Consider the following statement:

“100% of all media ‘news’ and 90% of what is called ‘science’ and an alarming amount of blog material is not what is the case, or even what people think is the case, but what people for reasons of their own, want you to think is the case”

Finally, if I ever get around to finishing it, for those who ask ‘how can it be possible that so many people are caught up in what you claim to be a grand conspiracy or something of that nature?’, I am in the business of writing a philosophical, psychological and social explanation. It is entitled ‘Convenient Lies’, and it shows that bigotry, prejudice, stupidity and venality are in fact useful techniques for species survival most of the time.

Of course the interesting facet is the ‘Black Swan’ times, when it’s the most dangerous thing in the world.

Following the herd is safer than straying off alone. Unless the herd is approaching the cliff edge and the leaders are more concerned with who is following them than where they are going…

AGW is one of the great dangers facing mankind, not because its true, but because it is widely believed, and demonstrably false.

My analysis of convenient lies shows that they are most dangerous in times of deep social and economic change in society, when the old orthodoxies are simply no good.

I feel more scared these days than at any time in the cold war. Then, one felt that no one would be stupid enough to start world war three. Today, I no longer have that conviction. Two generations of social engineering aimed at removing all risk, and all need to actually think, from society has led to a generation which is stupid enough, and smug enough, and feels safe enough, to utterly destroy western civilisation simply because they take it totally for granted. To them the promotion of the AGW meme is a success story in terms of political and commercial marketing. The fact that where they are taking us is over a cliff edge into a new dark age is something they simply haven’t considered at all.

They have socially engineered risk and dissent out of society. For profit. Leaving behind a population that cannot think for itself, and has no need to. It’s told to blindly follow the rules.

Control system theory says that that, unlike the climate, is a deeply unstable situation.

Wake up, smell the coffee. AGW is simply another element in a tendency towards political control of everything, and the subjugation of the individual into the mass of society at large. No decision is to be taken by the individual, all is to be taken by centralised bureaucratic structures – such as the IPCC. The question is, is that a functional and effective way to structure society?

My contention is that it’s deeply dangerous. It introduces massive and laggy overall centralised feedback. Worse, it introduces a single point of failure. If central government breaks down or falters, people simply do not know what to do any more. No one has the skill or practice in making localised decisions anymore.

The point is to see AGW and the whole greenspin machine as just an aspect of a particular stage in political and societal evolution, and understand it in those terms. Prior to the age of the telegraph and instantaneous communications, government had to be devolved – the lag was too great to pass the decisions back to central authority. Today we think we can, but there is another lag – bureaucratic lag. As well as bureaucratic incompetence.

System theory applied to political systems, gives a really scary prediction. We are on the point of almost total collapse, and we do not have the localised systems in place to replace centralised structures that are utterly dysfunctional. Sooner or later an externality is going to come along that will overwhelm the ability of centralized bureaucracy to deal with it, and it will fail. And nothing else will succeed, because people can no longer think for themselves.

Because they were lazy and let other people do the thinking for them. And paid them huge sums to do it, and accepted the results unquestioningly.
