Thursday, December 12, 2024

Ecological resilience

From Wikipedia, the free encyclopedia
Temperate lake and Mulga woodland ecosystems with alternative stable states

In ecology, resilience is the capacity of an ecosystem to respond to a perturbation or disturbance by resisting damage and subsequently recovering. Such perturbations and disturbances can include stochastic events such as fires, flooding, windstorms and insect population explosions, as well as human activities such as deforestation, hydraulic fracturing for oil and gas extraction, the application of pesticides to soil, and the introduction of exotic plant or animal species. Disturbances of sufficient magnitude or duration can profoundly affect an ecosystem and may force it across a threshold beyond which a different regime of processes and structures predominates. When such thresholds are associated with a critical or bifurcation point, these regime shifts may also be referred to as critical transitions.

Human activities that adversely affect ecological resilience, such as reduction of biodiversity, exploitation of natural resources, pollution, land use change, and anthropogenic climate change, are increasingly causing regime shifts in ecosystems, often to less desirable and degraded conditions. Interdisciplinary discourse on resilience now includes consideration of the interactions of humans and ecosystems via socio-ecological systems, and the need for a shift from the maximum sustainable yield paradigm to environmental resource management and ecosystem management, which aim to build ecological resilience through "resilience analysis, adaptive resource management, and adaptive governance". Ecological resilience has inspired other fields and continues to challenge the way they interpret resilience, e.g. supply chain resilience.

Definitions

The IPCC Sixth Assessment Report defines resilience as “not just the ability to maintain essential function, identity and structure, but also the capacity for transformation.” The IPCC considers resilience both in terms of ecosystem recovery and in terms of the recovery and adaptation of human societies to natural disasters.

The concept of resilience in ecological systems was first introduced by the Canadian ecologist C.S. Holling in order to describe the persistence of natural systems in the face of changes in ecosystem variables due to natural or anthropogenic causes. Resilience has been defined in two ways in ecological literature:

  1. as the time required for an ecosystem to return to an equilibrium or steady-state following a perturbation (which is also defined as stability by some authors). This definition of resilience is used in other fields such as physics and engineering, and hence has been termed ‘engineering resilience’ by Holling.
  2. as "the capacity of a system to absorb disturbance and reorganize while undergoing change so as to still retain essentially the same function, structure, identity, and feedbacks".

The second definition has been termed ‘ecological resilience’, and it presumes the existence of multiple stable states or regimes.

For example, some shallow temperate lakes can exist in either a clear-water regime, which provides many ecosystem services, or a turbid-water regime, which provides reduced ecosystem services and can produce toxic algal blooms. The regime or state depends on the lake's phosphorus cycle, and either regime can be resilient depending on the lake's ecology and management.

Likewise, Mulga woodlands of Australia can exist in a grass-rich regime that supports sheep herding, or a shrub-dominated regime of no value for sheep grazing. Regime shifts are driven by the interaction of fire, herbivory, and variable rainfall. Either state can be resilient depending on management.
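
The notion of alternative stable states can be made concrete with a toy model. The sketch below (Python) uses a Carpenter-style lake phosphorus equation with made-up parameter values, chosen purely for illustration rather than taken from any real lake; with these values the model has a clear-water equilibrium near P ~ 0.5, a turbid equilibrium near P ~ 1.5, and an unstable threshold between them, so a large enough perturbation can tip the lake from one resilient regime to the other.

# Minimal sketch of alternative stable states in a shallow lake, assuming a
# Carpenter-style phosphorus model with illustrative (hypothetical) parameters,
# not calibrated to any real lake:
#
#   dP/dt = L - s*P + r * P**q / (m**q + P**q)
#
# L: external phosphorus loading, s: loss rate, r: maximum internal recycling
# from sediments, m: half-saturation level, q: steepness of the recycling curve.

import numpy as np

L, s, r, m, q = 0.52, 1.0, 1.0, 1.0, 8.0

def dPdt(P):
    return L - s * P + r * P**q / (m**q + P**q)

# Scan for equilibria (sign changes of dP/dt) and classify their stability.
P_grid = np.linspace(0.0, 3.0, 30001)
f = dPdt(P_grid)
for i in np.where(np.sign(f[:-1]) != np.sign(f[1:]))[0]:
    P_eq = P_grid[i]
    kind = "stable" if f[i] > 0 > f[i + 1] else "unstable (threshold)"
    print(f"equilibrium at P ~ {P_eq:.2f}  [{kind}]")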

Theory

Three levels of a panarchy, three adaptive cycles, and two cross-level linkages (remember and revolt)

Ecologists Brian Walker, C.S. Holling and others describe four critical aspects of resilience: latitude, resistance, precariousness, and panarchy.

The first three can apply both to a whole system or the sub-systems that make it up.

  1. Latitude: the maximum amount a system can be changed before losing its ability to recover (before crossing a threshold which, if breached, makes recovery difficult or impossible).
  2. Resistance: the ease or difficulty of changing the system; how “resistant” it is to being changed.
  3. Precariousness: how close the current state of the system is to a limit or “threshold”.
  4. Panarchy: the degree to which a certain hierarchical level of an ecosystem is influenced by other levels. For example, organisms living in communities that are in isolation from one another may be organized differently from the same type of organism living in a large continuous population, thus the community-level structure is influenced by population-level interactions.

Closely linked to resilience is adaptive capacity, which is the property of an ecosystem that describes change in stability landscapes and resilience. Adaptive capacity in socio-ecological systems refers to the ability of humans to deal with change in their environment by observation, learning and altering their interactions.

Human impacts

Resilience refers to an ecosystem's stability and its capacity to tolerate disturbance and restore itself. If the disturbance is of sufficient magnitude or duration, a threshold may be reached at which the ecosystem undergoes a regime shift, possibly permanently. Sustainable use of environmental goods and services requires understanding and consideration of the resilience of the ecosystem and its limits. However, the factors that influence ecosystem resilience are complex: elements such as the water cycle, soil fertility, biodiversity, plant diversity and climate interact strongly and affect many different systems.

There are many areas where human activity impacts upon and is also dependent upon the resilience of terrestrial, aquatic and marine ecosystems. These include agriculture, deforestation, pollution, mining, recreation, overfishing, dumping of waste into the sea and climate change.

Agriculture

Agriculture is a significant case study in which the resilience of terrestrial ecosystems should be considered. Soil organic matter, which supplies carbon and nitrogen and is replenished by a diversity of plants, is the main source of nutrients for crop growth. In response to global food demand and shortages, however, intensive agricultural practices, including the application of herbicides to control weeds, fertilisers to accelerate and increase crop growth, and pesticides to control insects, reduce plant biodiversity while diminishing the supply of organic matter that replenishes soil nutrients and prevents surface runoff. This leads to a reduction in soil fertility and productivity. More sustainable agricultural practices would take into account and estimate the resilience of the land, and monitor and balance the input and output of organic matter.

Deforestation

Deforestation, in this sense, means disturbance that crosses the threshold of a forest's resilience, so that the forest loses its ability to return to its original stable state. To recover, a forest ecosystem needs suitable interactions between climatic conditions and biotic processes, as well as sufficient area. In general, the resilience of a forest system allows recovery from relatively small-scale damage (such as lightning strikes or landslides) affecting up to about 10 percent of its area. The larger the scale of the damage, the more difficult it is for the forest ecosystem to restore and maintain its balance.

Deforestation also decreases the biodiversity of both plant and animal life and can lead to an alteration of the climatic conditions of an entire area. According to the IPCC Sixth Assessment Report, carbon emissions due to land use and land use changes predominantly come from deforestation, thereby increasing the long-term exposure of forest ecosystems to drought and other climate change-induced damage. Deforestation can also lead to species extinction, which can have a domino effect, particularly when keystone species are removed or when a significant number of species are removed and their ecological functions are lost.

Climate change

Climate resilience is a concept to describe how well people or ecosystems are prepared to bounce back from certain climate hazard events. The formal definition of the term is the "capacity of social, economic and ecosystems to cope with a hazardous event or trend or disturbance". For example, climate resilience can be the ability to recover from climate-related shocks such as floods and droughts. Different actions can increase climate resilience of communities and ecosystems to help them cope. They can help to keep systems working in the face of external forces. For example, building a seawall to protect a coastal community from flooding might help maintain existing ways of life there.

Overfishing

The United Nations Food and Agriculture Organisation has estimated that over 70% of the world's fish stocks are either fully exploited or depleted, which means that overfishing, driven largely by the rapid growth of fishing technology, threatens marine ecosystem resilience. One of the negative effects on marine ecosystems is that coastal fish stocks have been severely reduced over the last half-century as a result of overfishing for its economic benefits. Bluefin tuna is at particular risk of extinction. Depletion of fish stocks lowers biodiversity, unbalances the food chain, and increases vulnerability to disease.

In addition to overfishing, coastal communities are suffering the impacts of growing numbers of large commercial fishing vessels, which are causing reductions in small local fishing fleets. Many local lowland rivers that are sources of fresh water have also become degraded by inflows of pollutants and sediments.

Dumping of waste into the sea

Dumping depends upon ecosystem resilience even as it threatens it. Sewage and other contaminants are often dumped into the ocean because of the dispersive nature of the oceans and the assumed ability of marine life to process and adapt to marine debris and contaminants. However, waste dumping threatens marine ecosystems through the poisoning of marine life and eutrophication.

Poisoning marine life

According to the International Maritime Organisation, oil spills can have serious effects on marine life. The OILPOL Convention recognized that most oil pollution resulted from routine shipboard operations such as the cleaning of cargo tanks. In the 1950s, the normal practice was simply to wash the tanks out with water and then pump the resulting mixture of oil and water into the sea. OILPOL 54 prohibited the dumping of oily wastes within a certain distance from land and in 'special areas' where the danger to the environment was especially acute. In 1962 the limits were extended by means of an amendment adopted at a conference organized by IMO. Meanwhile, in 1965 IMO set up a Subcommittee on Oil Pollution, under the auspices of its Maritime Safety Committee, to address oil pollution issues.

The threat of oil spills to marine life is recognised by those likely to be responsible for the pollution, such as the International Tanker Owners Pollution Federation:

The marine ecosystem is highly complex and natural fluctuations in species composition, abundance and distribution are a basic feature of its normal function. The extent of damage can therefore be difficult to detect against this background variability. Nevertheless, the key to understanding damage and its importance is whether spill effects result in a downturn in breeding success, productivity, diversity and the overall functioning of the system. Spills are not the only pressure on marine habitats; chronic urban and industrial contamination or the exploitation of the resources they provide are also serious threats.

Eutrophication and algal blooms

The Woods Hole Oceanographic Institution calls nutrient pollution the most widespread, chronic environmental problem in the coastal ocean. The discharges of nitrogen, phosphorus, and other nutrients come from agriculture, waste disposal, coastal development, and fossil fuel use. Once nutrient pollution reaches the coastal zone, it stimulates harmful overgrowths of algae, which can have direct toxic effects and ultimately result in low-oxygen conditions. Certain types of algae are toxic. Overgrowths of these algae result in harmful algal blooms, which are more colloquially referred to as "red tides" or "brown tides". Zooplankton eat the toxic algae and begin passing the toxins up the food chain, affecting edibles like clams, and ultimately working their way up to seabirds, marine mammals, and humans. The result can be illness and sometimes death.

Sustainable development

There is increasing awareness that a greater understanding of, and emphasis on, ecosystem resilience is required to reach the goal of sustainable development. A similar conclusion is drawn by Perman et al., who use resilience to describe one of six concepts of sustainability: "A sustainable state is one which satisfies minimum conditions for ecosystem resilience through time". Resilience science has been evolving over the past decade, expanding beyond ecology to reflect systems thinking in fields such as economics and political science. As more and more people move into densely populated cities, using massive amounts of water, energy, and other resources, the need to combine these disciplines to consider the resilience of urban ecosystems and cities becomes paramount.

Academic perspectives

The interdependence of ecological and social systems has gained renewed recognition since the late 1990s from academics including Berkes and Folke, and was developed further in 2002 by Folke et al. Meanwhile, the concept of sustainable development has evolved beyond its three pillars to place greater political emphasis on economic development, a movement which causes wide concern in environmental and social forums and which Clive Hamilton describes as "the growth fetish".

The proposed purpose of ecological resilience is ultimately to avert our extinction, as Walker, citing Holling, puts it: "resilience is concerned with [measuring] the probabilities of extinction" (1973, p. 20). The significance of the environment and of resilience for sustainable development is becoming more apparent in academic writing. Folke et al. state that the likelihood of sustaining development is raised by "Managing for resilience", whilst Perman et al. propose that safeguarding the environment so that it can "deliver a set of services" should be a "necessary condition for an economy to be sustainable". The growing application of resilience to sustainable development has produced a diversity of approaches and scholarly debates.

The flaw of the free market

The challenge of applying the concept of ecological resilience to the context of sustainable development is that it sits at odds with conventional economic ideology and policy making. Resilience questions the free market model within which global markets operate. Inherent to the successful operation of a free market is the specialisation required to achieve efficiency and increase productivity. This very act of specialisation weakens resilience by permitting systems to become accustomed to and dependent upon their prevailing conditions. In the event of unanticipated shocks, this dependency reduces the ability of the system to adapt to these changes. Correspondingly, Perman et al. note that "Some economic activities appear to reduce resilience, so that the level of disturbance to which the ecosystem can be subjected to without parametric change taking place is reduced".

Moving beyond sustainable development

Berkes and Folke table a set of principles to assist with "building resilience and sustainability" which consolidate approaches of adaptive management, local knowledge-based management practices and conditions for institutional learning and self-organisation.

More recently, it has been suggested by Andrea Ross that the concept of sustainable development is no longer adequate in assisting policy development fit for today's global challenges and objectives. This is because the concept of sustainable development is "based on weak sustainability" which doesn't take account of the reality of "limits to earth's resilience". Ross draws on the impact of climate change on the global agenda as a fundamental factor in the "shift towards ecological sustainability" as an alternative approach to that of sustainable development.

Because climate change is a major and growing driver of biodiversity loss, and because biodiversity and ecosystem functions and services significantly contribute to climate change adaptation, mitigation and disaster risk reduction, proponents of ecosystem-based adaptation argue that the resilience of vulnerable human populations and of the ecosystem services upon which they depend is a critical factor for sustainable development in a changing climate.

In environmental policy

Scientific research associated with resilience is beginning to play a role in influencing policy-making and subsequent environmental decision making.

This occurs in a number of ways:

  • Observed resilience within specific ecosystems drives management practice. When resilience is observed to be low, or impacts appear to be approaching a threshold, the management response can be to alter human behavior so that it has less adverse impact on the ecosystem.
  • Ecosystem resilience impacts upon the way that development is permitted and environmental decision making is undertaken, similar to the way that existing ecosystem health impacts upon what development is permitted. For instance, remnant vegetation in the states of Queensland and New South Wales is classified in terms of ecosystem health and abundance. Any impact that development has upon threatened ecosystems must consider the health and resilience of these ecosystems. This is governed by the Threatened Species Conservation Act 1995 in New South Wales and the Vegetation Management Act 1999 in Queensland.
  • International-level initiatives aim to improve socio-ecological resilience worldwide through the cooperation and contributions of scientific and other experts. An example of such an initiative is the Millennium Ecosystem Assessment, whose objective is "to assess the consequences of ecosystem change for human well-being and the scientific basis for action needed to enhance the conservation and sustainable use of those systems and their contribution to human well-being". Similarly, the United Nations Environment Programme's aim is "to provide leadership and encourage partnership in caring for the environment by inspiring, informing, and enabling nations and peoples to improve their quality of life without compromising that of future generations".

Environmental management in legislation

Ecological resilience and the thresholds by which resilience is defined are closely interrelated in the way that they influence environmental policy-making, legislation and subsequently environmental management. The ability of ecosystems to recover from certain levels of environmental impact is not explicitly noted in legislation, however, because of ecosystem resilience, some levels of environmental impact associated with development are made permissible by environmental policy-making and ensuing legislation.

Some examples of the consideration of ecosystem resilience within legislation include:

  • Environmental Planning and Assessment Act 1979 (NSW)  – A key goal of the Environmental Assessment procedure is to determine whether proposed development will have a significant impact upon ecosystems.
  • Protection of the Environment (Operations) Act 1997 (NSW)  – Pollution control is dependent upon keeping levels of pollutants emitted by industrial and other human activities below levels which would be harmful to the environment and its ecosystems. Environmental protection licenses are administered to maintain the environmental objectives of the POEO Act and breaches of license conditions can attract heavy penalties and in some cases criminal convictions.
  • Threatened Species Conservation Act 1995 (NSW)  – This Act seeks to protect threatened species while balancing it with development.

History

The theoretical basis for many of the ideas central to climate resilience has existed since the 1960s. Originally an idea defined for strictly ecological systems, resilience in ecology was initially outlined by C.S. Holling as the capacity for ecological systems and relationships within those systems to persist and absorb changes to "state variables, driving variables, and parameters." This definition helped form the foundation for the notion of ecological equilibrium: the idea that the behavior of natural ecosystems is dictated by a homeostatic drive towards some stable set point. Under this school of thought (which was dominant at the time), ecosystems were perceived to respond to disturbances largely through negative feedback systems – if there is a change, the ecosystem would act to mitigate that change as much as possible and attempt to return to its prior state.

As more scientific research in ecological adaptation and natural resource management was conducted, it became clear that natural systems are often subject to dynamic, transient behaviors that change how they react to significant changes in state variables: rather than working back towards a predetermined equilibrium, the absorbed change is harnessed to establish a new baseline to operate under. Rather than minimizing imposed changes, ecosystems can integrate and manage those changes, and use them to fuel the evolution of novel characteristics. This new perspective of resilience as a concept that works synergistically with uncertainty and entropy first began to drive changes in adaptive management and environmental resource management, again through work built on foundations laid by Holling and colleagues.

By the mid 1970s, resilience began gaining momentum as an idea in anthropology, culture theory, and other social sciences. There was significant work in these relatively non-traditional fields that helped facilitate the evolution of the resilience perspective as a whole. Part of the reason resilience began moving away from an equilibrium-centric view and towards a more flexible, malleable description of social-ecological systems was due to work such as that of Andrew Vayda and Bonnie McCay in the field of social anthropology, where more modern versions of resilience were deployed to challenge traditional ideals of cultural dynamics.

Interplanetary spaceflight

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Interplanetary_spaceflight

Interplanetary spaceflight or interplanetary travel is the crewed or uncrewed travel between planets, usually within a single planetary system. In practice, spaceflights of this type are confined to travel between the planets of the Solar System. Uncrewed space probes have flown to all the observed planets in the Solar System as well as to the dwarf planets Pluto and Ceres, and several asteroids. Orbiters and landers return more information than fly-by missions. Crewed flights have landed on the Moon and have been planned, from time to time, for Mars, Venus and Mercury. While many scientists appreciate the knowledge value that uncrewed flights provide, the value of crewed missions is more controversial. Science fiction writers propose a number of benefits, including the mining of asteroids, access to solar power, and room for colonization in the event of an Earth catastrophe.

A number of techniques have been developed to make interplanetary flights more economical. Advances in computing and theoretical science have already improved some techniques, while new proposals may lead to improvements in speed, fuel economy, and safety. Travel techniques must take into consideration the velocity changes necessary to travel from one body to another in the Solar System. For orbital flights, an additional adjustment must be made to match the orbital speed of the destination body. Other developments are designed to improve rocket launching and propulsion, as well as the use of non-traditional sources of energy. Using extraterrestrial resources for energy, oxygen, and water would reduce costs and improve life support systems.

Any crewed interplanetary flight must include certain design requirements. Life support systems must be capable of supporting human lives for extended periods of time. Preventative measures are needed to reduce exposure to radiation and ensure optimum reliability.

Current achievements in interplanetary travel

The plains of Pluto, as seen by New Horizons after its nearly 10-year voyage

Remotely guided space probes have flown by all of the observed planets of the Solar System from Mercury to Neptune, with the New Horizons probe having flown by the dwarf planet Pluto and the Dawn spacecraft currently orbiting the dwarf planet Ceres. The most distant spacecraft, Voyager 1 and Voyager 2, have left the Solar System as of 8 December 2018, while Pioneer 10, Pioneer 11, and New Horizons are on course to leave it.

In general, planetary orbiters and landers return much more detailed and comprehensive information than fly-by missions. Space probes have been placed into orbit around all five of the planets known to the ancients: Venus (Venera 7, 1970), Mars (Mariner 9, 1971), Jupiter (Galileo, 1995), Saturn (Cassini/Huygens, 2004), and most recently Mercury (MESSENGER, March 2011), and have returned data about these bodies and their natural satellites.

The NEAR Shoemaker mission in 2000 orbited the large near-Earth asteroid 433 Eros, and was even successfully landed there, though it had not been designed with this maneuver in mind. The Japanese ion-drive spacecraft Hayabusa in 2005 also orbited the small near-Earth asteroid 25143 Itokawa, landing on it briefly and returning grains of its surface material to Earth. Another ion-drive mission, Dawn, has orbited the large asteroid Vesta (July 2011 – September 2012) and later moved on to the dwarf planet Ceres, arriving in March 2015.

Remotely controlled landers such as Viking, Pathfinder and the two Mars Exploration Rovers have landed on the surface of Mars and several Venera and Vega spacecraft have landed on the surface of Venus, with the latter deploying balloons to the planet's atmosphere. The Huygens probe successfully landed on Saturn's moon, Titan.

No crewed missions have been sent to any planet of the Solar System. NASA's Apollo program, however, landed twelve people on the Moon and returned them to Earth. The American Vision for Space Exploration, originally introduced by U.S. President George W. Bush and put into practice through the Constellation program, had as a long-term goal to eventually send human astronauts to Mars. However, on February 1, 2010, President Barack Obama proposed cancelling the program in Fiscal Year 2011. An earlier project which received some significant planning by NASA included a crewed fly-by of Venus in the Manned Venus Flyby mission, but was cancelled when the Apollo Applications Program was terminated due to NASA budget cuts in the late 1960s.

Reasons for interplanetary travel

Space colony on the O'Neill cylinder

The costs and risk of interplanetary travel receive a lot of publicity—spectacular examples include the malfunctions or complete failures of probes without a human crew, such as Mars 96, Deep Space 2, and Beagle 2 (the article List of Solar System probes gives a full list).

Many astronomers, geologists and biologists believe that exploration of the Solar System provides knowledge that could not be gained by observations from Earth's surface or from orbit around Earth. However, they disagree about whether human-crewed missions justify their cost and risk. Critics of human spaceflight argue that robotic probes are more cost-effective, producing more scientific knowledge per dollar spent; robots do not need costly life-support systems, can be sent on one-way missions, and are becoming more capable as artificial intelligence advances. Others argue that either astronauts or spacefaring scientists, advised by Earth-based scientists, can respond more flexibly and intelligently to new or unexpected features of whatever region they are exploring.

Some members of the general public mainly value space activities for whatever tangible benefits they may deliver to themselves or to the human race as a whole. So far the only benefits of this type have been "spin-off" technologies which were developed for space missions and then were found to be at least as useful in other activities. However, public support, at least in the US, remains higher for basic scientific research than for human space flight; a 2023 survey found that Americans rate basic research as their third-highest priority for NASA, after monitoring Earth-endangering asteroids and understanding climate change. Support for scientific research is about four times higher than for human flight to the Moon or Mars.

Besides spinoffs, other practical motivations for interplanetary travel are more speculative. But science fiction writers have a fairly good track record in predicting future technologies—for example geosynchronous communications satellites (Arthur C. Clarke) and many aspects of computer technology (Mack Reynolds).

Many science fiction stories feature detailed descriptions of how people could extract minerals from asteroids and energy from sources including orbital solar panels (unhampered by clouds) and the very strong magnetic field of Jupiter. Some claim that such techniques may be the only way to provide rising standards of living without being stopped by pollution or by depletion of Earth's resources (for example peak oil).

There are also non-scientific motives for human spaceflight, such as adventure or the belief that humans have a spiritually fated destiny in space.

Finally, establishing completely self-sufficient colonies in other parts of the Solar System could, if feasible, prevent the human species from being exterminated by several possible events (see Human extinction). One of these possible events is an asteroid impact like the one which may have resulted in the Cretaceous–Paleogene extinction event. Although various Spaceguard projects monitor the Solar System for objects that might come dangerously close to Earth, current asteroid deflection strategies are crude and untested. To make the task more difficult, carbonaceous chondrites are rather sooty and therefore very hard to detect. Although carbonaceous chondrites are thought to be rare, some are very large and the suspected "dinosaur-killer" may have been a carbonaceous chondrite.

Some scientists, including members of the Space Studies Institute, argue that the vast majority of mankind eventually will live in space and will benefit from doing so.

Economical travel techniques

One of the main challenges in interplanetary travel is producing the very large velocity changes necessary to travel from one body to another in the Solar System.

Due to the Sun's gravitational pull, a spacecraft moving farther from the Sun will slow down, while a spacecraft moving closer will speed up. Also, since any two planets are at different distances from the Sun, the planet from which the spacecraft starts is moving around the Sun at a different speed than the planet to which the spacecraft is travelling (in accordance with Kepler's Third Law). Because of these facts, a spacecraft desiring to transfer to a planet closer to the Sun must decrease its speed with respect to the Sun by a large amount in order to intercept it, while a spacecraft traveling to a planet farther out from the Sun must increase its speed substantially. Then, if additionally the spacecraft wishes to enter into orbit around the destination planet (instead of just flying by it), it must match the planet's orbital speed around the Sun, usually requiring another large velocity change.

Simply doing this by brute force – accelerating in the shortest route to the destination and then matching the planet's speed – would require an extremely large amount of fuel. And the fuel required for producing these velocity changes has to be launched along with the payload, and therefore even more fuel is needed to put both the spacecraft and the fuel required for its interplanetary journey into orbit. Thus, several techniques have been devised to reduce the fuel requirements of interplanetary travel.

As an example of the velocity changes involved, a spacecraft travelling from low Earth orbit to Mars using a simple trajectory must first undergo a change in speed (also known as a delta-v), in this case an increase, of about 3.8 km/s. Then, after intercepting Mars, it must change its speed by another 2.3 km/s in order to match Mars' orbital speed around the Sun and enter an orbit around it. For comparison, launching a spacecraft into low Earth orbit requires a change in speed of about 9.5 km/s.
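
Those figures can be reproduced roughly from the vis-viva equation and a patched-conic approximation. The sketch below (Python) assumes circular, coplanar planetary orbits and illustrative 300 km parking-orbit altitudes at Earth and Mars, so its results (about 3.6 km/s and 2.1 km/s) only approximate the figures quoted above; real mission values depend on the launch window and the orbits chosen.

# Rough patched-conic estimate of the delta-v for a low-Earth-orbit to
# low-Mars-orbit Hohmann transfer.  Circular, coplanar planetary orbits
# assumed; the 300 km parking-orbit altitudes are illustrative choices.

from math import sqrt

mu_sun   = 1.32712e11    # km^3/s^2
mu_earth = 3.986e5
mu_mars  = 4.283e4
r_earth_orbit = 1.496e8  # km (1 AU)
r_mars_orbit  = 2.279e8  # km
r_leo = 6378.0 + 300.0   # 300 km Earth parking orbit
r_lmo = 3390.0 + 300.0   # 300 km Mars orbit

# Heliocentric Hohmann transfer: speeds at perihelion/aphelion from vis-viva.
a_t = (r_earth_orbit + r_mars_orbit) / 2
v_dep = sqrt(mu_sun * (2 / r_earth_orbit - 1 / a_t))   # at Earth's distance
v_arr = sqrt(mu_sun * (2 / r_mars_orbit  - 1 / a_t))   # at Mars' distance
v_inf_earth = v_dep - sqrt(mu_sun / r_earth_orbit)     # hyperbolic excess at Earth
v_inf_mars  = sqrt(mu_sun / r_mars_orbit) - v_arr      # hyperbolic excess at Mars

# Burns performed deep in each planet's gravity well (Oberth effect).
dv_depart  = sqrt(v_inf_earth**2 + 2 * mu_earth / r_leo) - sqrt(mu_earth / r_leo)
dv_capture = sqrt(v_inf_mars**2  + 2 * mu_mars  / r_lmo) - sqrt(mu_mars  / r_lmo)

print(f"trans-Mars injection from LEO : {dv_depart:.2f} km/s")   # ~3.6
print(f"Mars orbit insertion          : {dv_capture:.2f} km/s")  # ~2.1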

Hohmann transfers

Hohmann Transfer Orbit: a spaceship leaves from point 2 in Earth's orbit and arrives at point 3 in Mars' (not to scale).

For many years economical interplanetary travel meant using the Hohmann transfer orbit. Hohmann demonstrated that the lowest energy route between any two orbits is an elliptical "orbit" which forms a tangent to the starting and destination orbits. Once the spacecraft arrives, a second application of thrust will re-circularize the orbit at the new location. In the case of planetary transfers this means directing the spacecraft, originally in an orbit almost identical to Earth's, so that the aphelion of the transfer orbit is on the far side of the Sun near the orbit of the other planet. A spacecraft traveling from Earth to Mars via this method will arrive near Mars orbit in approximately 8.5 months, but because the orbital velocity is greater when closer to the center of mass (i.e. the Sun) and slower when farther from the center, the spacecraft will be traveling quite slowly and only a small application of thrust is needed to put it into a circular orbit around Mars. If the maneuver is timed properly, Mars will be "arriving" under the spacecraft when this happens.

The Hohmann transfer applies to any two orbits, not just those with planets involved. For instance it is the most common way to transfer satellites into geostationary orbit, after first being "parked" in low Earth orbit. However, the Hohmann transfer takes a time equal to half of the orbital period of the transfer ellipse, so in the case of the outer planets this is many years – too long to wait. It is also based on the assumption that the points at both ends are massless, as in the case when transferring between two orbits around Earth for instance. With a planet at the destination end of the transfer, calculations become considerably more difficult.
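
That travel time follows directly from Kepler's third law: the transfer takes half the period of an ellipse whose semi-major axis is the average of the two orbital radii. A minimal sketch (Python, circular coplanar orbits assumed) reproduces the roughly 8.5-month Earth-to-Mars figure and shows how quickly the times grow for the outer planets.

# Transfer time for a Hohmann ellipse is half the ellipse's orbital period
# (Kepler's third law).  Circular, coplanar orbits assumed.

from math import pi, sqrt

mu_sun = 1.32712e11                     # km^3/s^2
AU = 1.496e8                            # km

def hohmann_time_days(r1_au, r2_au):
    a = (r1_au + r2_au) * AU / 2        # semi-major axis of the transfer ellipse
    period = 2 * pi * sqrt(a**3 / mu_sun)
    return 0.5 * period / 86400.0

print(f"Earth -> Mars   : {hohmann_time_days(1.0, 1.524):.0f} days")   # ~259 days (~8.5 months)
print(f"Earth -> Jupiter: {hohmann_time_days(1.0, 5.203):.0f} days")   # ~2.7 years
print(f"Earth -> Neptune: {hohmann_time_days(1.0, 30.07):.0f} days")   # ~31 years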

Gravitational slingshot

Plot of Voyager 2's heliocentric velocity against its distance from the Sun, illustrating the use of gravity assist to accelerate the spacecraft by Jupiter, Saturn and Uranus. To observe Triton, Voyager 2 passed over Neptune's north pole resulting in an acceleration out of the plane of the ecliptic and reduced velocity away from the Sun.

The gravitational slingshot technique uses the gravity of planets and moons to change the speed and direction of a spacecraft without using fuel. In a typical example, a spacecraft is sent to a distant planet on a path that is much faster than what the Hohmann transfer would call for. This would typically mean that it would arrive at the planet's orbit and continue past it. However, if there is a planet between the departure point and the target, it can be used to bend the path toward the target, and in many cases the overall travel time is greatly reduced. A prime example of this is the two spacecraft of the Voyager program, which used slingshot effects to change trajectories several times in the outer Solar System. It is difficult to use this method for journeys in the inner part of the Solar System, although it is possible to use other nearby planets such as Venus or even the Moon as slingshots in journeys to the outer planets.

This maneuver can only change an object's velocity relative to a third, uninvolved object – possibly the “centre of mass” or the Sun. There is no change in the velocities of the two objects involved in the maneuver relative to each other. The Sun cannot be used in a gravitational slingshot because it is stationary compared to the rest of the Solar System, which orbits the Sun. It may, however, be used to send a spaceship or probe into the galaxy, because the Sun revolves around the center of the Milky Way.
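
The mechanism can be illustrated with a small planar sketch: the flyby leaves the spacecraft's speed relative to the planet unchanged, but rotating that relative velocity vector through the hyperbolic turn angle changes the heliocentric speed. The Python sketch below uses made-up arrival conditions and a made-up periapsis distance loosely inspired by a Jupiter flyby; it is not a reconstruction of any actual mission, and the sign of the turn (and hence whether speed is gained or lost) depends on which side of the planet the spacecraft passes.

# Idealized planar gravity-assist sketch: the speed relative to the planet is
# unchanged, but rotating that relative velocity vector changes the
# heliocentric speed.  All numbers are illustrative assumptions.

import numpy as np

mu_jupiter = 1.2669e8                        # km^3/s^2
v_planet   = np.array([13.1, 0.0])           # Jupiter's heliocentric velocity, km/s
v_in       = np.array([8.0, -6.0])           # spacecraft heliocentric velocity at arrival (assumed)
r_p        = 200_000.0                       # flyby periapsis radius, km (assumed)

v_rel_in = v_in - v_planet                   # velocity relative to the planet
v_inf = np.linalg.norm(v_rel_in)

# Hyperbolic turn angle for this periapsis: delta = 2*asin(1/e).
e = 1.0 + r_p * v_inf**2 / mu_jupiter
delta = 2.0 * np.arcsin(1.0 / e)

# Rotate the incoming relative velocity by the turn angle (rotation sense
# corresponds to the side of the planet being passed).
c, s = np.cos(delta), np.sin(delta)
v_rel_out = np.array([c * v_rel_in[0] - s * v_rel_in[1],
                      s * v_rel_in[0] + c * v_rel_in[1]])
v_out = v_rel_out + v_planet

print(f"speed relative to planet : {v_inf:.2f} km/s (unchanged by the flyby)")
print(f"heliocentric speed in/out: {np.linalg.norm(v_in):.2f} -> {np.linalg.norm(v_out):.2f} km/s")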

Powered slingshot

A powered slingshot is the use of a rocket engine at or around closest approach to a body (periapsis). Thrust applied at this point has a multiplied effect on the final trajectory, giving a bigger change than the same burn made elsewhere (the Oberth effect).
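
The advantage can be quantified with a short sketch. The Python example below, using an assumed 300 km Earth perigee and an assumed 2 km/s burn applied to a barely escaping (parabolic) trajectory, compares the hyperbolic excess speed left over when the burn is made at periapsis with the same burn made far from the planet.

# Why a burn at periapsis is worth more (the Oberth effect, as exploited by a
# powered slingshot).  Compare the hyperbolic excess speed left over after the
# same delta-v applied deep in a gravity well vs. far from the body.
# Illustrative numbers: a 300 km perigee burn at Earth.

from math import sqrt

mu_earth = 3.986e5            # km^3/s^2
r_p = 6378.0 + 300.0          # periapsis radius, km
dv = 2.0                      # burn size, km/s (assumed)

v_esc = sqrt(2 * mu_earth / r_p)     # escape speed at periapsis (~10.9 km/s)

# Start on a parabolic (barely-escaping) trajectory, so speed at periapsis = v_esc.
v_inf_periapsis_burn = sqrt((v_esc + dv)**2 - v_esc**2)   # burn at periapsis
v_inf_distant_burn   = dv                                  # same burn far from Earth

print(f"excess speed, burn at periapsis : {v_inf_periapsis_burn:.2f} km/s")  # ~6.9
print(f"excess speed, burn far away     : {v_inf_distant_burn:.2f} km/s")    # 2.0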

Fuzzy orbits

Computers did not exist when Hohmann transfer orbits were first proposed (1925) and were slow, expensive and unreliable when gravitational slingshots were developed (1959). Recent advances in computing have made it possible to exploit many more features of the gravity fields of astronomical bodies and thus calculate even lower-cost trajectories. Paths have been calculated which link the Lagrange points of the various planets into the so-called Interplanetary Transport Network. Such "fuzzy orbits" use significantly less energy than Hohmann transfers but are much, much slower. They aren't practical for human crewed missions because they generally take years or decades, but may be useful for high-volume transport of low-value commodities if humanity develops a space-based economy.

Aerobraking

Apollo command module flying at a high angle of attack to aerobrake by skimming the atmosphere (artistic rendition)

Aerobraking uses the atmosphere of the target planet to slow down. It was first used on the Apollo program, where the returning spacecraft did not enter Earth orbit but instead used an S-shaped vertical descent profile (starting with an initially steep descent, followed by a leveling out, followed by a slight climb, followed by a return to a positive rate of descent continuing to splash-down in the ocean) through Earth's atmosphere to reduce its speed until the parachute system could be deployed, enabling a safe landing. Aerobraking does not require a thick atmosphere – for example most Mars landers use the technique, and Mars' atmosphere is only about 1% as thick as Earth's.

Aerobraking converts the spacecraft's kinetic energy into heat, so it requires a heatshield to prevent the craft from burning up. As a result, aerobraking is only helpful in cases where the fuel needed to transport the heatshield to the planet is less than the fuel that would be required to brake an unshielded craft by firing its engines. This can be addressed by creating heatshields from material available near the target.

Improved technologies and methodologies

Several technologies have been proposed which both save fuel and provide significantly faster travel than the traditional methodology of using Hohmann transfers. Some are still just theoretical, but over time, several of the theoretical approaches have been tested on spaceflight missions. For example, the Deep Space 1 mission was a successful test of an ion drive. These improved technologies typically focus on one or more of:

  • Space propulsion systems with much better fuel economy. Such systems would make it possible to travel much faster while keeping the fuel cost within acceptable limits.
  • Using solar energy and in-situ resource utilization to avoid or minimize the expensive task of shipping components and fuel up from the Earth's surface, against the Earth's gravity (see "Using non-terrestrial resources", below).
  • Novel methodologies of using energy at different locations or in different ways that can shorten transport time or reduce the cost per unit mass of space transport.

Besides making travel faster or cost less, such improvements could also allow greater design "safety margins" by reducing the imperative to make spacecraft lighter.

Improved rocket concepts

All rocket concepts are limited by the Tsiolkovsky rocket equation, which sets the characteristic velocity available as a function of exhaust velocity and the mass ratio of initial (M0, including fuel) to final (M1, fuel depleted) mass. The main consequence is that mission velocities of more than a few times the velocity of the rocket motor exhaust (with respect to the vehicle) rapidly become impractical, as the dry mass (mass of payload and rocket without fuel) falls to below 10% of the entire rocket's wet mass (mass of rocket with fuel).
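
In symbols, the rocket equation is delta-v = v_e ln(M0/M1), so the propellant fraction needed for a mission is 1 - exp(-delta-v/v_e). The short Python sketch below uses an assumed exhaust velocity of 4.4 km/s (roughly a good hydrogen/oxygen engine) and a few illustrative mission delta-vs to show how quickly the dry-mass fraction collapses once delta-v exceeds a few times the exhaust velocity.

# Tsiolkovsky rocket equation: delta_v = v_exhaust * ln(M0 / M1).
# Equivalently, the propellant fraction needed is 1 - exp(-delta_v / v_exhaust),
# which is why mission velocities beyond a few times the exhaust velocity
# quickly become impractical.

from math import exp

v_exhaust = 4.4     # km/s, roughly a good chemical (hydrogen/oxygen) engine

for dv in (3.0, 9.0, 13.0, 22.0):      # km/s, illustrative mission delta-vs
    propellant_fraction = 1.0 - exp(-dv / v_exhaust)
    dry_fraction = 1.0 - propellant_fraction
    print(f"delta-v {dv:5.1f} km/s -> dry mass is {100*dry_fraction:4.1f}% of the wet mass")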

Nuclear thermal and solar thermal rockets

Sketch of nuclear thermal rocket

In a nuclear thermal rocket or solar thermal rocket a working fluid, usually hydrogen, is heated to a high temperature, and then expands through a rocket nozzle to create thrust. The energy replaces the chemical energy of the reactive chemicals in a traditional rocket engine. Due to the low molecular mass and hence high thermal velocity of hydrogen these engines are at least twice as fuel efficient as chemical engines, even after including the weight of the reactor.

The US Atomic Energy Commission and NASA tested a few designs from 1959 to 1968. The NASA designs were conceived as replacements for the upper stages of the Saturn V launch vehicle, but the tests revealed reliability problems, mainly caused by the vibration and heating involved in running the engines at such high thrust levels. Political and environmental considerations make it unlikely such an engine will be used in the foreseeable future, since nuclear thermal rockets would be most useful at or near the Earth's surface and the consequences of a malfunction could be disastrous. Fission-based thermal rocket concepts produce lower exhaust velocities than the electric and plasma concepts described below, and are therefore less attractive solutions. For applications requiring high thrust-to-weight ratio, such as planetary escape, nuclear thermal is potentially more attractive.

Electric propulsion

Electric propulsion systems use an external source such as a nuclear reactor or solar cells to generate electricity, which is then used to accelerate a chemically inert propellant to speeds far higher than achieved in a chemical rocket. Such drives produce feeble thrust, and are therefore unsuitable for quick maneuvers or for launching from the surface of a planet. But they are so economical in their use of working mass that they can keep firing continuously for days or weeks, while chemical rockets use up reaction mass so quickly that they can only fire for seconds or minutes. Even a trip to the Moon is long enough for an electric propulsion system to outrun a chemical rocket – the Apollo missions took 3 days in each direction.

NASA's Deep Space One was a very successful test of a prototype ion drive, which fired for a total of 678 days and enabled the probe to run down Comet Borrelly, a feat which would have been impossible for a chemical rocket. Dawn, the first NASA operational (i.e., non-technology demonstration) mission to use an ion drive for its primary propulsion, successfully orbited the large main-belt asteroids 1 Ceres and 4 Vesta. A more ambitious, nuclear-powered version was intended for a Jupiter mission without human crew, the Jupiter Icy Moons Orbiter (JIMO), originally planned for launch sometime in the next decade. Due to a shift in priorities at NASA that favored human crewed space missions, the project lost funding in 2005. A similar mission is currently under discussion as the US component of a joint NASA/ESA program for the exploration of Europa and Ganymede.

A NASA multi-center Technology Applications Assessment Team led from the Johnson Space Center has, as of January 2011, described "Nautilus-X", a concept study for a multi-mission space exploration vehicle useful for missions beyond low Earth orbit (LEO), of up to 24 months duration for a crew of up to six. Although Nautilus-X is adaptable to a variety of mission-specific propulsion units of various low-thrust, high specific impulse (Isp) designs, nuclear ion-electric drive is shown for illustrative purposes. It is intended for integration and checkout at the International Space Station (ISS), and would be suitable for deep-space missions from the ISS to and beyond the Moon, including Earth/Moon L1, Sun/Earth L2, near-Earth asteroidal, and Mars orbital destinations. It incorporates a reduced-g centrifuge providing artificial gravity for crew health to ameliorate the effects of long-term 0g exposure, and the capability to mitigate the space radiation environment.

Fission powered rockets

The electric propulsion missions already flown, or currently scheduled, have used solar electric power, limiting their capability to operate far from the Sun, and also limiting their peak acceleration due to the mass of the electric power source. Nuclear-electric or plasma engines, operating for long periods at low thrust and powered by fission reactors, can reach speeds much greater than chemically powered vehicles.

Fusion rockets

Fusion rockets, powered by nuclear fusion reactions, would "burn" such light element fuels as deuterium, tritium, or 3He. Because fusion yields about 1% of the mass of the nuclear fuel as released energy, it is energetically more favorable than fission, which releases only about 0.1% of the fuel's mass-energy. However, either fission or fusion technologies can in principle achieve velocities far higher than needed for Solar System exploration, and fusion energy still awaits practical demonstration on Earth.
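
A back-of-envelope calculation shows why those energy fractions matter. If a fraction of the propellant's rest-mass energy can be turned into exhaust kinetic energy, the ideal exhaust velocity is at most about c times the square root of twice that fraction; the sketch below (Python) applies this bound to the roughly 0.1% and 1% figures above, plus an assumed value of about 10^-10 for chemical combustion, included purely for comparison.

# Back-of-envelope exhaust velocities implied by the energy fractions above.
# If a fraction eps of the propellant's rest-mass energy ends up as exhaust
# kinetic energy, the ideal (non-relativistic) exhaust velocity is roughly
#     v_e ~ c * sqrt(2 * eps)
# This is an upper bound: real engines convert only part of that energy into
# directed exhaust motion.  The chemical value of eps is a rough assumption.

from math import sqrt

c = 299_792.458      # km/s

for label, eps in (("chemical", 1.0e-10), ("fission", 1.0e-3), ("fusion", 1.0e-2)):
    v_e = c * sqrt(2 * eps)
    print(f"{label:8s}: eps = {eps:.0e}  ->  v_e up to ~{v_e:,.0f} km/s")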

One proposal using a fusion rocket was Project Daedalus. Another fairly detailed vehicle system, designed and optimized for crewed Solar System exploration, "Discovery II", based on the D3He reaction but using hydrogen as reaction mass, has been described by a team from NASA's Glenn Research Center. It achieves characteristic velocities of >300 km/s with an acceleration of ~1.7×10^-3 g, with a ship initial mass of ~1700 metric tons and a payload fraction above 10%.

Fusion rockets are considered to be a likely source of interplanetary transport for a planetary civilization.

Exotic propulsion

See the spacecraft propulsion article for a discussion of a number of other technologies that could, in the medium to longer term, be the basis of interplanetary missions. Unlike the situation with interstellar travel, the barriers to fast interplanetary travel involve engineering and economics rather than any basic physics.

Solar sails

NASA illustration of a solar-sail propelled spacecraft

Solar sails rely on the fact that light reflected from a surface exerts pressure on the surface. The radiation pressure is small and decreases with the square of the distance from the Sun, but unlike rockets, solar sails require no fuel. Although the thrust is small, it continues as long as the Sun shines and the sail is deployed.
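
For a perfectly reflecting sail facing the Sun, the thrust is roughly F = 2SA/c, where S is the local solar flux (about 1361 W/m^2 at 1 AU) and A is the sail area. The sketch below (Python) uses an assumed 800 m^2 sail and an assumed 500 kg spacecraft, chosen only for illustration, to show the order of magnitude of the thrust and of the daily velocity change at a few distances from the Sun.

# Thrust on an ideal, perfectly reflecting solar sail: F = 2 * S * A / c,
# where S is the solar flux (about 1361 W/m^2 at 1 AU, falling off as 1/r^2)
# and A is the sail area.  Sail size and spacecraft mass are assumed values.

S_1AU = 1361.0        # W/m^2 at 1 AU
c = 2.998e8           # m/s
area = 800.0          # m^2 (assumed sail)
mass = 500.0          # kg  (assumed spacecraft)

for r_au in (0.39, 1.0, 5.2):                  # Mercury, Earth, Jupiter distances
    flux = S_1AU / r_au**2
    force = 2.0 * flux * area / c              # newtons
    accel = force / mass                       # m/s^2
    print(f"r = {r_au:4.2f} AU: thrust = {force*1000:7.2f} mN, "
          f"delta-v = {accel*86400:.3f} m/s per day")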

The original concept relied only on radiation from the Sun – for example in Arthur C. Clarke's 1965 story "Sunjammer". More recent light sail designs propose to boost the thrust by aiming ground-based lasers or masers at the sail. Ground-based lasers or masers can also help a light-sail spacecraft to decelerate: the sail splits into an outer and inner section, the outer section is pushed forward and its shape is changed mechanically to focus reflected radiation on the inner portion, and the radiation focused on the inner section acts as a brake.

Although most articles about light sails focus on interstellar travel, there have been several proposals for their use within the Solar System.

Currently, the only spacecraft to use a solar sail as the main method of propulsion is IKAROS which was launched by JAXA on May 21, 2010. It has since been successfully deployed, and shown to be producing acceleration as expected. Many ordinary spacecraft and satellites also use solar collectors, temperature-control panels and Sun shades as light sails, to make minor corrections to their attitude and orbit without using fuel. A few have even had small purpose-built solar sails for this use (for example Eurostar E3000 geostationary communications satellites built by EADS Astrium).

Cyclers

It is possible to put stations or spacecraft on orbits that cycle between different planets; for example, a Mars cycler would synchronously cycle between Mars and Earth, with very little propellant usage to maintain the trajectory. Cyclers are conceptually a good idea, because massive radiation shields, life support and other equipment only need to be put onto the cycler trajectory once. A cycler could combine several roles: a habitat (for example, it could spin to produce an "artificial gravity" effect) and a mothership (providing life support for the crews of smaller spacecraft which hitch a ride on it). Cyclers could also make excellent cargo ships for resupply of a colony.

Space elevator

A space elevator is a theoretical structure that would transport material from a planet's surface into orbit. The idea is that, once the expensive job of building the elevator is complete, an indefinite number of loads can be transported into orbit at minimal cost. Even the simplest designs avoid the vicious circle of rocket launches from the surface, wherein the fuel needed to travel the last 10% of the distance into orbit must be lifted all the way from the surface, requiring even more fuel, and so on. More sophisticated space elevator designs reduce the energy cost per trip by using counterweights, and the most ambitious schemes aim to balance loads going up and down and thus make the energy cost close to zero. Space elevators have also sometimes been referred to as "beanstalks", "space bridges", "space lifts", "space ladders" and "orbital towers".

A terrestrial space elevator is beyond our current technology, although a lunar space elevator could theoretically be built using existing materials.

Skyhook

Non-rotating skyhook first proposed by E. Sarmont in 1990

A skyhook is a theoretical class of orbiting tether propulsion intended to lift payloads to high altitudes and speeds. Proposals for skyhooks include designs that employ tethers spinning at hypersonic speed for catching high-speed payloads or high-altitude aircraft and placing them in orbit. However, it has been suggested that the rotating skyhook is "not engineeringly feasible using presently available materials".

Launch vehicle and spacecraft reusability

The SpaceX Starship is designed to be fully and rapidly reusable, making use of the SpaceX reusable technology that was developed during 2011–2018 for Falcon 9 and Falcon Heavy launch vehicles.

SpaceX CEO Elon Musk estimates that the reusability capability alone, on both the launch vehicle and the spacecraft associated with the Starship will reduce overall system costs per tonne delivered to Mars by at least two orders of magnitude over what NASA had previously achieved.

Staging propellants

When launching interplanetary probes from the surface of Earth, carrying all the energy needed for the long-duration mission, payload quantities are necessarily extremely limited, due to the mass limitations described theoretically by the rocket equation. One alternative to transport more mass on interplanetary trajectories is to use up nearly all of the upper stage propellant on launch, and then refill the propellants in Earth orbit before firing the rocket to escape velocity for a heliocentric trajectory. These propellants could be stored on orbit at a propellant depot, or carried to orbit in a propellant tanker to be directly transferred to the interplanetary spacecraft. For returning mass to Earth, a related option is to mine raw materials from a Solar System celestial object, then refine, process, and store the reaction products (propellant) on the Solar System body until such time as a vehicle needs to be loaded for launch.

On-orbit tanker transfers

As of 2019, SpaceX is developing a system in which a reusable first stage vehicle would transport a crewed interplanetary spacecraft to Earth orbit, detach, and return to its launch pad, where a tanker spacecraft would be mounted atop it; both would then be fueled and launched again to rendezvous with the waiting crewed spacecraft. The tanker would then transfer its fuel to the crewed spacecraft for use on its interplanetary voyage. The SpaceX Starship is a stainless steel-structure spacecraft propelled by six Raptor engines operating on densified methane/oxygen propellants. It is 55 m (180 ft) long and 9 m (30 ft) in diameter at its widest point, and is capable of transporting up to 100 tonnes (220,000 lb) of cargo and passengers per trip to Mars, with on-orbit propellant refilling before the interplanetary part of the journey.

Propellant plant on a celestial body

As an example of a funded project currently under development, a key part of the system SpaceX has designed for Mars, in order to radically decrease the cost of spaceflight to interplanetary destinations, is the placement and operation of a physical plant on Mars to produce and store the propellant components necessary to launch and fly Starships back to Earth, or perhaps to increase the mass that can be transported onward to destinations in the outer Solar System.

The first Starship to Mars will carry a small propellant plant as part of its cargo load. The plant will be expanded over multiple synodic periods (the roughly 26-month cycle of Mars launch windows) as more equipment arrives, is installed, and is placed into mostly-autonomous production.

The SpaceX propellant plant will take advantage of the large supplies of carbon dioxide and water on Mars, mining the water (H2O) from subsurface ice and collecting CO2 from the atmosphere. A chemical plant will process the raw materials by means of electrolysis and the Sabatier process to produce oxygen (O2) and methane (CH4), and then liquefy them to facilitate long-term storage and ultimate use.
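
The chemistry sets a simple lower bound on the raw materials required. Electrolysis splits water (2 H2O -> 2 H2 + O2) and the Sabatier reaction combines the hydrogen with atmospheric CO2 (CO2 + 4 H2 -> CH4 + 2 H2O, with the water recycled), so the net loop consumes one CO2 and two H2O per methane molecule and co-produces two O2. The Python sketch below works out the idealized tonnages per tonne of methane; it ignores conversion losses, and the 3.6:1 oxygen-to-methane engine mixture ratio mentioned at the end is an assumed, typical value rather than a figure from this article.

# Idealized mass balance for the in-situ propellant process described above:
#   electrolysis : 2 H2O      -> 2 H2 + O2
#   Sabatier     : CO2 + 4 H2 -> CH4 + 2 H2O   (the water is recycled)
# Per mole of methane the closed loop consumes 1 CO2 + 2 H2O (net) and yields
# 1 CH4 + 2 O2.  Conversion losses are ignored, so these are lower bounds.

M_CO2, M_H2O, M_CH4, M_O2 = 44.01, 18.02, 16.04, 32.00   # g/mol

def feedstock_per_tonne_methane():
    mol_ch4 = 1.0e6 / M_CH4                  # moles of CH4 in one tonne
    co2   = mol_ch4 * 1 * M_CO2 / 1e6        # tonnes of CO2 from the atmosphere
    water = mol_ch4 * 2 * M_H2O / 1e6        # tonnes of water ice, net
    o2    = mol_ch4 * 2 * M_O2  / 1e6        # tonnes of oxygen co-produced
    return co2, water, o2

co2, water, o2 = feedstock_per_tonne_methane()
print(f"per tonne of CH4: {co2:.2f} t CO2, {water:.2f} t H2O ice, {o2:.2f} t O2 produced")
# An O2:CH4 engine mixture ratio near 3.6:1 (assumed) is therefore covered,
# with a small oxygen surplus.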

Using extraterrestrial resources

Langley's Mars Ice Dome design from 2016 for a Mars base would use in-situ water to make a sort of space-igloo.

Current space vehicles attempt to launch with all the fuel (propellants and energy supplies) on board that they will need for their entire journey, and current space structures are lifted from the Earth's surface. Non-terrestrial sources of energy and materials are mostly much further away, but most would not require lifting out of a strong gravity field and therefore should be much cheaper to use in space in the long term.

The most important non-terrestrial resource is energy, because it can be used to transform non-terrestrial materials into useful forms (some of which may also produce energy). At least two fundamental non-terrestrial energy sources have been proposed: solar-powered energy generation (unhampered by clouds), either directly by solar cells or indirectly by focusing solar radiation on boilers which produce steam to drive generators; and electrodynamic tethers which generate electricity from the powerful magnetic fields of some planets (Jupiter has a very powerful magnetic field).

Water ice would be very useful and is widespread on the moons of Jupiter and Saturn:

  • The low gravity of these moons would make them a cheaper source of water for space stations and planetary bases than lifting it up from Earth's surface.
  • Non-terrestrial power supplies could be used to electrolyse water ice into oxygen and hydrogen for use in bipropellant rocket engines.
  • Nuclear thermal rockets or solar thermal rockets could use it as reaction mass. Hydrogen has also been proposed for use in these engines and would provide much greater specific impulse (impulse per kilogram of reaction mass), but it has been claimed that water will beat hydrogen in cost/performance terms by orders of magnitude, despite its much lower specific impulse.
  • A spacecraft with an adequate water supply could carry the water under the hull, which could provide a considerable additional safety margin for the vessel and its occupants:
    • The water would absorb and conduct solar energy, thus acting as a heat shield. A vessel traveling in the inner Solar System could maintain a constant heading relative to the Sun without overheating the side of the spacecraft facing the Sun, provided the water under the hull was constantly circulated to evenly distribute the solar heat throughout the hull;
    • The water would provide some additional protection against ionizing radiation;
    • The water would act as an insulator against the extreme cold assuming it was kept heated, whether by the Sun when traveling in the inner Solar System or by an on board power source when traveling further away from the Sun;
    • The water would provide some additional protection against micrometeoroid impacts, provided the hull was compartmentalized so as to ensure any leak could be isolated to a small section of the hull.
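The cost/performance trade mentioned in the third bullet above can be made concrete with the Tsiolkovsky rocket equation. The sketch below compares the propellant mass needed for the same manoeuvre at two specific impulses; the dry mass, delta-v, and both Isp values are illustrative assumptions, not figures for any particular engine.

    # Propellant mass needed for a given delta-v, from the Tsiolkovsky rocket equation:
    #   delta_v = Isp * g0 * ln(m0 / m_dry)  =>  m_prop = m_dry * (exp(delta_v / (Isp * g0)) - 1)
    from math import exp

    G0 = 9.81  # standard gravity, m/s^2

    def propellant_mass(dry_mass_kg, delta_v_ms, isp_s):
        """Propellant mass (kg) required to give dry_mass_kg a velocity change of delta_v_ms."""
        return dry_mass_kg * (exp(delta_v_ms / (isp_s * G0)) - 1.0)

    DRY_MASS = 10_000.0  # kg, illustrative spacecraft
    DELTA_V = 4_000.0    # m/s, illustrative manoeuvre

    # Assumed, illustrative specific impulses for thermal rockets:
    for propellant, isp in [("water (steam)", 190.0), ("hydrogen", 900.0)]:
        tonnes = propellant_mass(DRY_MASS, DELTA_V, isp) / 1000.0
        print(f"{propellant}: ~{tonnes:.1f} t of propellant")
    # water: ~75 t, hydrogen: ~5.7 t for this example

On these assumed numbers hydrogen needs roughly an order of magnitude less propellant, so the claim that water wins on cost rests on water being mined locally from low-gravity moons rather than lifted from Earth.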

Oxygen is a common constituent of the Moon's crust, and is probably abundant in most other bodies in the Solar System. Non-terrestrial oxygen would be valuable as a source of water only if an adequate source of hydrogen can be found. Possible uses include:

  • In the life support systems of space ships, space stations and planetary bases.
  • In rocket engines. Even if the other propellant has to be lifted from Earth, using non-terrestrial oxygen could reduce propellant launch costs by up to 2/3 for hydrocarbon fuel, or 85% for hydrogen. The savings are so high because oxygen accounts for the majority of the mass in most rocket propellant combinations (see the sketch below).
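The quoted savings follow directly from typical oxidizer-to-fuel mass ratios. The sketch below shows the arithmetic; the mixture ratios are commonly cited round numbers used here as assumptions.

    # Fraction of total propellant mass that is oxygen, for typical mixture ratios.
    # Mixture ratios (oxidizer mass / fuel mass) are assumed, commonly cited values.
    MIXTURE_RATIOS = {
        "LOX / kerosene (hydrocarbon)": 2.6,
        "LOX / liquid hydrogen": 6.0,
    }

    for combination, of_ratio in MIXTURE_RATIOS.items():
        oxygen_fraction = of_ratio / (of_ratio + 1.0)
        print(f"{combination}: oxygen is ~{oxygen_fraction:.0%} of the propellant mass")
    # ~72% for kerosene and ~86% for hydrogen, which is where the "up to 2/3" and
    # "85%" launch-cost reductions come from if the oxygen need not be lifted from Earth.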

Unfortunately, hydrogen and other volatiles such as carbon and nitrogen are much less abundant than oxygen in the inner Solar System.

Scientists expect to find a vast range of organic compounds in some of the planets, moons and comets of the outer Solar System, and the range of possible uses is even wider. For example, methane can be used as a fuel (burned with non-terrestrial oxygen), or as a feedstock for petrochemical processes such as making plastics. And ammonia could be a valuable feedstock for producing fertilizers to be used in the vegetable gardens of orbital and planetary bases, reducing the need to lift food to them from Earth.

Even unprocessed rock may be useful as rocket propellant if mass drivers are employed.

Design requirements for crewed interplanetary travel

An artist's vision (1989) of a spacecraft that provides artificial gravity by spinning.
Deep Space Transport and Lunar Gateway

Life support

Life support systems must be capable of supporting human life for weeks, months or even years. A breathable atmosphere of at least 35 kPa (5.1 psi) must be maintained, with adequate amounts of oxygen and nitrogen and controlled levels of carbon dioxide, trace gases, and water vapor.
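A life support controller ultimately reasons in partial pressures rather than total pressure. The sketch below checks a candidate cabin mix against illustrative limits; the total pressure and thresholds are assumptions chosen for the example, not an agency specification.

    # Check a candidate cabin atmosphere against illustrative partial-pressure limits.
    # All limits and the total pressure are assumed values for this example.
    LIMITS_KPA = {
        "O2":  (18.0, 24.0),  # roughly Earth sea-level oxygen partial pressure
        "CO2": (0.0, 0.5),    # keep carbon dioxide low
    }
    TOTAL_PRESSURE_KPA = 70.0  # a reduced-pressure cabin, above the 35 kPa floor

    def check_atmosphere(mole_fractions):
        """mole_fractions: dict mapping gas name to its mole fraction of the cabin mix."""
        for gas, (low, high) in LIMITS_KPA.items():
            partial = TOTAL_PRESSURE_KPA * mole_fractions.get(gas, 0.0)
            status = "OK" if low <= partial <= high else "OUT OF RANGE"
            print(f"{gas}: {partial:.2f} kPa ({status})")

    check_atmosphere({"O2": 0.30, "N2": 0.695, "CO2": 0.005})
    # O2: 21.00 kPa (OK), CO2: 0.35 kPa (OK)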

In October 2015, the NASA Office of Inspector General issued a health hazards report related to human spaceflight, including a human mission to Mars.

Radiation

Once a vehicle leaves low Earth orbit and the protection of Earth's magnetosphere, it passes through the Van Allen radiation belts, a region of high radiation. Beyond the belts, radiation levels generally decrease but fluctuate over time. The high-energy cosmic rays found there pose a health threat: even the minimum radiation levels during these fluctuations are comparable to the current annual limit for astronauts in low Earth orbit.
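To give a sense of scale, the sketch below accumulates a deep-space dose rate over a mission and compares it with an annual limit. Both figures are assumed, order-of-magnitude values for illustration; measured dose rates and agency limits vary with shielding, solar cycle, and the applicable standard.

    # Order-of-magnitude comparison of mission radiation dose with an annual limit.
    # Both constants below are assumptions for illustration, not measured or official values.
    DOSE_RATE_MSV_PER_DAY = 1.8  # assumed deep-space galactic-cosmic-ray dose rate
    ANNUAL_LIMIT_MSV = 500.0     # assumed annual astronaut limit in low Earth orbit

    def mission_dose_msv(days):
        return days * DOSE_RATE_MSV_PER_DAY

    for leg, days in [("one-way cruise to Mars", 180), ("round trip with surface stay", 900)]:
        dose = mission_dose_msv(days)
        print(f"{leg}: ~{dose:.0f} mSv, about {dose / ANNUAL_LIMIT_MSV:.1f}x the annual limit")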

Scientists at the Russian Academy of Sciences are searching for methods of reducing the risk of radiation-induced cancer in preparation for a mission to Mars. One option they are considering is a life support system that generates drinking water with a low content of deuterium (a stable isotope of hydrogen) for the crew members. Preliminary investigations have shown that deuterium-depleted water has certain anti-cancer effects, so it is considered to have the potential to lower the risk of cancer caused by the extreme radiation exposure of a Martian crew.

In addition, coronal mass ejections from the Sun are highly dangerous and can be fatal to humans within a very short time unless they are protected by massive shielding.

Reliability

Any major failure of a spacecraft en route is likely to be fatal, and even a minor one could have dangerous results if not repaired quickly, something difficult to accomplish in open space. The crew of the Apollo 13 mission survived despite an explosion caused by a faulty oxygen tank (1970).

Launch windows

For astrodynamic reasons, economical spacecraft travel to other planets is only practical within certain launch windows. Outside these windows the planets are essentially inaccessible from Earth with current technology. This constrains flights and limits rescue options in the case of an emergency.
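For two roughly circular, coplanar orbits, the spacing of those windows is set by the synodic period, 1 / |1/T1 - 1/T2|, where T1 and T2 are the planets' orbital periods. A minimal sketch with approximate orbital periods:

    # Spacing of launch windows between Earth and another planet: the synodic period,
    #   T_syn = 1 / |1/T1 - 1/T2|
    # Orbital periods below are approximate values in days.
    def synodic_period_days(t1_days, t2_days):
        return 1.0 / abs(1.0 / t1_days - 1.0 / t2_days)

    EARTH_DAYS = 365.25
    for planet, period in [("Mars", 687.0), ("Venus", 224.7), ("Jupiter", 4332.6)]:
        window = synodic_period_days(EARTH_DAYS, period)
        print(f"Earth-{planet}: a favourable window roughly every {window:.0f} days")
    # Earth-Mars: ~780 days, i.e. about every 26 months.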

Wednesday, December 11, 2024

Cognitive rehabilitation therapy

From Wikipedia, the free encyclopedia
 
Effects of cognitive rehabilitation therapy, assessed using fMRI.

Cognitive rehabilitation refers to a wide range of evidence-based interventions designed to improve cognitive functioning in brain-injured or otherwise cognitively impaired individuals to restore normal functioning, or to compensate for cognitive deficits. It entails an individualized program of specific skills training and practice plus metacognitive strategies. Metacognitive strategies include helping the patient increase self-awareness regarding problem-solving skills by learning how to monitor the effectiveness of these skills and self-correct when necessary.

Cognitive rehabilitation therapy (offered by a trained therapist) is a subset of cognitive rehabilitation (community-based rehabilitation, often for traumatic brain injury, provided by rehabilitation professionals) and has been shown to be effective for individuals who have had a stroke in the left or right hemisphere, or brain trauma. A computer-assisted type of cognitive rehabilitation therapy called cognitive remediation therapy has been used to treat schizophrenia, ADHD, and major depressive disorder.

Cognitive rehabilitation builds upon brain injury strategies involving memory, executive functions, activity planning, and follow-through (e.g., memory aids, task sequencing, lists).

It may also be recommended for traumatic brain injury, the primary population for which it was developed in the university medical and rehabilitation communities, such as that sustained by U.S. Representative Gabby Giffords; according to Dr. Gregory J. O'Shanick of the Brain Injury Association of America, her doctor has confirmed that it will be part of her rehabilitation. Cognitive rehabilitation may be part of a comprehensive community services program and integrated into residential services, such as supported living, supported employment, family support, professional education, home health (as personal assistance), recreation, or education programs in the community.

Cognitive rehabilitation for spatial neglect following stroke

The current body of evidence is uncertain regarding the efficacy of cognitive rehabilitation for reducing the disabling effects of neglect and increasing independence. However, there is limited evidence that cognitive rehabilitation may have an immediate beneficial effect on tests of neglect. Overall, no rehabilitation approach for spatial neglect is yet supported by strong evidence.

Assessments

According to the standard text by Sohlberg and Mateer:

Individuals and families respond differently to different interventions, in different ways, at different times after injury. Premorbid functioning, personality, social support, and environmental demands are but a few of the factors that can profoundly influence outcome. In this variable response to treatment, cognitive rehabilitation is no different from treatment for cancer, diabetes, heart disease, Parkinson's disease, spinal cord injury, psychiatric disorders, or any other injury or disease process for which variable response to different treatments is the norm.

Nevertheless, many different statistical analyses of the benefits of this therapy have been carried out. One study conducted in 2002 analyzed 47 treatment comparisons and reported "a differential benefit in favor of cognitive rehabilitation in 37 of 47 (78.7%) comparisons, with no comparison demonstrating a benefit in favor of the alternative treatment condition."

An internal study conducted by the Tricare Management Agency in 2009 is cited by the US Department of Defense as its reason for refusing to pay for this therapy for veterans who have had traumatic brain injury. According to Tricare, "There is insufficient, evidence-based research available to conclude that cognitive rehabilitation therapy is beneficial in treating traumatic brain injury." The ECRI Institute, whose report serves as the basis for this decision by the Department of Defense, has summed up their own findings this way:

In our report, we carried out several meta-analyses using data from 18 randomized controlled trials. Based on data from these studies, we were able to conclude the following:

  • Adults with moderate to severe traumatic brain injury who receive social skills training perform significantly better on measures of social communication than patients who receive no treatment.
  • Adults with traumatic brain injury who receive comprehensive cognitive rehabilitation therapy report significant improvement on measures of quality of life compared to patients who receive a less intense form of therapy.

The strength of the evidence supporting our conclusions was low due to the small number of studies that addressed the outcomes of interest. Further, the evidence was too weak to draw any definitive conclusions about the effectiveness of cognitive rehabilitation therapy for treating deficits related to the following cognitive areas: attention, memory, visuospatial skills, and executive function. The following factors contributed to the weakness of the evidence: differences in the outcomes assessed in the studies, differences in the types of cognitive rehabilitation therapy methods/strategies employed across studies, differences in the control conditions, and/or insufficient number of studies addressing an outcome.

Citing this 2009 assessment, the US Department of Defense, one of the federal agencies not responsible for health care decisions in the US, has declared that cognitive rehabilitation therapy is scientifically unproven; critics argue that such concerns should instead be referred to the US Department of Health and Human Services, the US Office of Management and Budget, and/or the Government Accountability Office (GAO). As a result, it refuses to cover the cost of cognitive rehabilitation for brain-injured veterans. Cost-benefit and cost-effectiveness studies, together with an analysis of personnel and veterans' services for new or emerging groups with head and brain injuries, have been recommended.

Bad trip

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Bad_trip A bad trip (also known as...