
Saturday, January 11, 2025

Experimental philosophy

From Wikipedia, the free encyclopedia

Experimental philosophy is an emerging field of philosophical inquiry that makes use of empirical data—often gathered through surveys which probe the intuitions of ordinary people—in order to inform research on philosophical questions. Experimental philosophers widely see this use of empirical data as opposed to a philosophical methodology that relies mainly on a priori justification, sometimes called "armchair" philosophy. Experimental philosophy initially focused on philosophical questions related to intentional action, the putative conflict between free will and determinism, and causal vs. descriptive theories of linguistic reference, but it has continued to expand into new areas of research.

Disagreement about what experimental philosophy can accomplish is widespread. One claim is that the empirical data gathered by experimental philosophers can have an indirect effect on philosophical questions by allowing for a better understanding of the underlying psychological processes which lead to philosophical intuitions. Others claim that experimental philosophers are engaged in conceptual analysis, but taking advantage of the rigor of quantitative research to aid in that project. Finally, some work in experimental philosophy can be seen as undercutting the traditional methods and presuppositions of analytic philosophy. Several philosophers have offered criticisms of experimental philosophy.

History

First lecture in Experimental Philosophy, London 1748

Though, in early modern philosophy, natural philosophy was sometimes referred to as "experimental philosophy", the field associated with the current sense of the term dates its origins to around 2000, when a small number of students experimented with the idea of fusing philosophy with the experimental rigor of psychology.

While the modern experimental philosophy movement began growing around 2000, there are some earlier examples, such as Hewson (1994) and Naess (1938), and the use of empirical methods in philosophy far predates the emergence of the recent academic field. Current experimental philosophers claim that the movement is actually a return to the methodology used by many ancient philosophers. Further, philosophers like David Hume, René Descartes and John Locke are often held up as early models of philosophers who appealed to empirical methodology.

Areas of research

Consciousness

The questions of what consciousness is, and what conditions are necessary for conscious thought have been the topic of a long-standing philosophical debate. Experimental philosophers have approached this question by trying to get a better grasp on how exactly people ordinarily understand consciousness. For instance, work by Joshua Knobe and Jesse Prinz (2008) suggests that people may have two different ways of understanding minds generally, and Justin Sytsma and Edouard Machery (2009) have written about the proper methodology for studying folk intuitions about consciousness. Bryce Huebner, Michael Bruno, and Hagop Sarkissian (2010) have further argued that the way Westerners understand consciousness differs systematically from the way that East Asians understand consciousness, while Adam Arico (2010) has offered some evidence for thinking that ordinary ascriptions of consciousness are sensitive to framing effects (such as the presence or absence of contextual information). Some of this work has been featured in the Online Consciousness Conference.

Other experimental philosophers have approached the topic of consciousness by trying to uncover the cognitive processes that guide everyday attributions of conscious states. Adam Arico, Brian Fiala, Rob Goldberg, and Shaun Nichols, for instance, propose a cognitive model of mental state attribution (the AGENCY model), whereby an entity's displaying certain relatively simple features (e.g., eyes, distinctive motions, interactive behavior) triggers a disposition to attribute conscious states to that entity. Additionally, Bryce Huebner has argued that ascriptions of mental states rely on two divergent strategies: one sensitive to considerations of an entity's behavior being goal-directed; the other sensitive to considerations of personhood.

Cultural diversity

Following the work of Richard Nisbett, which showed that there were differences in a wide range of cognitive tasks between Westerners and East Asians, Jonathan Weinberg, Shaun Nichols and Stephen Stich (2001) compared the epistemic intuitions of Western college students and East Asian college students. The students were presented with a number of cases, including some Gettier cases, and asked to judge whether the person in the case really knew some fact or merely believed it. They found that the East Asian participants were more likely to judge that the person really knew. Later, Edouard Machery, Ron Mallon, Nichols and Stich performed a similar experiment concerning intuitions about the reference of proper names, using cases from Saul Kripke's Naming and Necessity (1980). Again, they found significant cultural differences. Each group of authors argued that these cultural variances undermined the philosophical project of using intuitions to create theories of knowledge or reference. However, subsequent studies have consistently failed to replicate Weinberg et al.'s (2001) results for other Gettier cases. Indeed, more recent studies have provided evidence for the opposite hypothesis: that people from a variety of different cultures have surprisingly similar intuitions in these cases.

Determinism and moral responsibility

One area of philosophical inquiry has been concerned with whether a person can be morally responsible if their actions are entirely determined, e.g., by the laws of Newtonian physics. One side of the debate, whose proponents are called 'incompatibilists,' argues that there is no way for people to be morally responsible for immoral acts if they could not have done otherwise. The other side argues instead that people can be morally responsible for their immoral actions even when they could not have done otherwise; people who hold this view are often referred to as 'compatibilists.' It was generally claimed that non-philosophers were naturally incompatibilist, that is, they think that if you couldn't have done anything else, then you are not morally responsible for your action. Experimental philosophers have addressed this question by presenting people with hypothetical situations in which it is clear that a person's actions are completely determined; the person then does something morally wrong, and people are asked whether that person is morally responsible for what she or he did. Using this technique, Nichols and Knobe (2007) found that "people's responses to questions about moral responsibility can vary dramatically depending on the way in which the question is formulated" and argue that "people tend to have compatibilist intuitions when they think about the problem in a more concrete, emotional way but that they tend to have incompatibilist intuitions when they think about the problem in a more abstract, cognitive way".

Epistemology

Recent work in experimental epistemology has tested the apparently empirical claims of various epistemological views. For example, research on epistemic contextualism has proceeded by conducting experiments in which ordinary people are presented with vignettes that involve a knowledge ascription. Participants are then asked to report on the status of that knowledge ascription. The studies address contextualism by varying the context of the knowledge ascription (for example, how important it is that the agent in the vignette has accurate knowledge). Data gathered thus far show no support for what contextualism says about ordinary use of the term "knows". Other work in experimental epistemology includes, among other things, the examination of moral valence on knowledge attributions (the so-called "epistemic side-effect effect"), of the knowing-that / knowing-how distinction, and of laypeople's intuitions about lying, improper assertion, and insincerity.

Intentional action

A prominent topic in experimental philosophy is intentional action. Work by Joshua Knobe has especially been influential. "The Knobe Effect", as it is often called, concerns an asymmetry in our judgments of whether an agent intentionally performed an action. It is "one of the first, most important, and most widely studied effects" in experimental philosophy. Knobe (2003a) asked people to suppose that the CEO of a corporation is presented with a proposal that would, as a side effect, affect the environment. In one version of the scenario, the effect on the environment will be negative (it will "harm" it), while in another version the effect on the environment will be positive (it will "help" it). In both cases, the CEO opts to pursue the policy and the effect does occur (the environment is harmed or helped by the policy). However, the CEO only adopts the program because he wants to raise profits; he does not care about the effect that the action will have on the environment. Although all features of the scenarios are held constant—except for whether the side effect on the environment will be positive or negative—a majority of people judge that the CEO intentionally hurt the environment in the one case, but did not intentionally help it in the other. Knobe ultimately argues that the effect is a reflection of a feature of the speakers' underlying concept of intentional action: broadly moral considerations affect whether we judge that an action is performed intentionally. However, his exact views have changed in response to further research.
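For readers curious about the statistics behind such findings, the asymmetry in a harm/help study of this kind is typically assessed with a simple test of independence between condition and judgment. The sketch below is illustrative only: the response counts are invented (chosen to mimic the commonly reported pattern of a large "intentional" majority in the harm condition and a small minority in the help condition), not Knobe's actual data.

```python
# Illustrative only: testing a harm/help asymmetry with a chi-square
# test of independence on a 2x2 table (condition x judgment).
# The counts below are hypothetical, not data from Knobe (2003a).
from scipy.stats import chi2_contingency

#           judged intentional   judged not intentional
table = [
    [31, 7],   # "harm" condition
    [9, 31],   # "help" condition
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.1f}, p = {p:.2g}")
# A small p-value indicates that intentionality judgments depend on
# whether the side effect was harmful or helpful.
```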

Experimental jurisprudence

Experimental jurisprudence ("X-Jur") is an emerging topic in experimental philosophy and legal scholarship that explores the nature of legal phenomena through psychological investigations of legal concepts. The field departs from traditional analytic legal philosophy in its ambition to elucidate common intuitions in a systematic fashion. Equally, unlike research in legal psychology, experimental jurisprudence emphasises the philosophical implications of its findings, notably for questions about whether, how, and in what respects the law's content is a matter of moral perspective. Experimental jurisprudence scholarship has argued that philosophers' appeals to the content of folk legal concepts ought to be tested empirically, so that the ‘big [philosophical] cost of rely[ing]... on… a concept that is distinct from that used by folk’ may be allocated correctly. Whereas some legal theorists have welcomed X-Jur's emergence, others have expressed reservations about the contributions it seeks to make.

Predicting philosophical disagreement

Research suggests that some fundamental philosophical intuitions are related to stable individual differences in personality. Although there are notable limits, philosophical intuitions and disagreements can be predicted by heritable Big Five personality traits and their facets. Extraverts are much more likely to be compatibilists, particularly if they are high in “warmth.” Extraverts show larger biases and different patterns of beliefs in the Knobe side effect cases. Neuroticism is related to susceptibility to manipulation-style free will arguments. Emotional Stability predicts who will attribute virtues to others. Openness to experience predicts non-objectivist moral intuitions. The link between personality and philosophical intuitions is independent of cognitive abilities, training, education, and expertise. Similar effects have also been found cross-culturally and in different languages including German and Spanish.

Because the Big Five Personality Traits are highly heritable, some have argued that many contemporary philosophical disputes are likely to persist through the generations. This may mean that some historical philosophical disputes are unlikely to be solved by purely rational, traditional philosophical methods and may require empirical data and experimental philosophy.

Additional research suggests that variance in philosophical tendencies is explained in part by differences in thinking styles (e.g., the intuitive or reflective reasoning of dual process theory), even among philosophers. For example, accepting faulty intuitions on reflection tests has predicted belief in God and disbelief that scientific theories are true, while correct responses on reflection tests predict decisions to minimize harm (à la utilitarianism) or to avoid causing harm (à la deontology) on the trolley problem. These data suggest that reasoning habits may be related to philosophical thinking. However, it has been difficult to detect a causal connection between reasoning habits and philosophical thinking.

Criticisms

In 2006, J. David Velleman attacked experimental philosophy on the blog Left2Right, prompting a response from its defenders on Brian Leiter's blog.

Antti Kauppinen (2007) has argued that intuitions will not reflect the content of folk concepts unless they are the intuitions of competent concept users who reflect in ideal circumstances and whose judgments reflect the semantics of their concepts rather than pragmatic considerations. Experimental philosophers are aware of these concerns and acknowledge the criticism.

Timothy Williamson (2008) has argued that we should not construe philosophical evidence as consisting of intuitions.

Other experimental philosophers have noted that experimental philosophy often fails to meet basic standards of experimental social science. Many of the experiments fail to include enough female participants. Analysis of experimental data is often plagued by improper use of statistics and reliance on data mining. Others have pointed out that many participants in experimental philosophy studies fail to comprehend the often abstract and complicated materials, and few studies report comprehension checks. Holtzman argues that a number of experimental philosophers are guilty of suppressing evidence. Critics further contend that, in lumping together all people's intuitions as those of the 'folk,' experimental philosophers may be ignoring basic concerns identified by standpoint feminists.

Some research in experimental philosophy is misleading because it examines averaged responses to surveys even though, in almost all of the studies, there have been substantial dissenting minorities. Ignoring individual differences may result in a distorted view of folk intuitions or concepts, and may produce the kind of strange theoretical fictions about everyday intuitions or concepts that experimental philosophy was designed to avoid, akin to concluding that the average human is neither a man nor a woman but the average of the two (e.g., that the average person has one ovary and one testicle). This criticism is not unique to experimental philosophy; it also applies to other sciences such as psychology and chemistry, although experimental philosophers may lack the training to recognize it.
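The "one ovary and one testicle" worry can be made concrete with a few lines of code. The numbers below are invented for illustration; the point is only that a mean can describe no actual respondent when the sample is polarized.

```python
# Illustrative only: averaged survey responses can misrepresent a
# polarized sample. Suppose 60 respondents answer 1 ("strongly
# disagree") and 40 answer 7 ("strongly agree") on a 7-point scale.
import statistics

responses = [1] * 60 + [7] * 40

print(statistics.mean(responses))       # 3.4 -- an answer nobody gave
print(statistics.multimode(responses))  # [1] -- the largest camp
# Reporting only the mean (3.4) suggests mild, uniform ambivalence;
# the distribution actually contains two firmly opposed groups.
```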

Problem of reproducibility

In a series of studies published in 2012 and later peer-reviewed, Hamid Seyedsayamdost showed that some of the most famous results in experimental philosophy were not reproducible. This work gave rise to a focused attention on reproducibility in experimental philosophy. Several philosophers have carried out independent replications and to date all have confirmed Seyedsayamdost's results.

Some of the areas covered in this debate include the instability and malleability of philosophical intuitions, determinism and moral responsibility, cultural diversity, gender differences and socioeconomic diversity. A large amount of research has also focused on epistemology, as Stephen Stich argued early on that findings reported by him and co-authors suggested that long-practiced methods in philosophy had to be discarded, famously noting that in light of their findings a "reasonable conclusion is that philosophy's 2400 year long infatuation with Plato's method has been a terrible mistake." Since the publication of Seyedsayamdost's papers, Stich and collaborators have reversed their research direction on this question. The reason for these problems in experimental philosophy is not entirely clear, although a parallel with experimental psychology has been suggested.

At least one recent study, in which a team attempted to replicate various influential studies in experimental philosophy, found that roughly 70% of them could be replicated. The reasons for the discrepancy with Seyedsayamdost's original results are not yet known.

Weather modification

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Weather_modification
A tornado near Anadarko, Oklahoma during the 1999 Oklahoma tornado outbreak. Weather researchers may aspire to eliminate or control dangerous types of weather such as this.
Weather modification is the act of intentionally manipulating or altering the weather. The most common form of weather modification is cloud seeding, which increases rainfall or snowfall, usually for the purpose of increasing the local water supply. Weather modification can also have the goal of preventing damaging weather, such as hail or hurricanes, from occurring; or of provoking damaging weather against an enemy, as a tactic of military or economic warfare like Operation Popeye, where clouds were seeded to prolong the monsoon in Vietnam. Weather modification in warfare has been banned by the United Nations under the Environmental Modification Convention.

History

A popular belief in Northern Europe was that shooting prevents hail, which led many agricultural towns to fire cannons without ammunition. Veterans of the Seven Years' War, the Napoleonic Wars, and the American Civil War reported that rain fell after every large battle. After their stories were collected in War and Weather, the United States Department of War in the late 19th century purchased $9,000 of gunpowder and explosives to detonate in Texas, in hopes of condensing water vapor into rain. The results of the test, supervised by Robert Dyrenforth, were inconclusive.

Wilhelm Reich performed cloudbusting experiments in the 1950s, the results of which are controversial and were not widely accepted by mainstream science.

In November 1954 the Thailand Royal Rainmaking Project (Thai: โครงการฝนหลวง) was initiated by King Bhumibol Adulyadej. He discovered that many areas faced the problem of drought. Over 82 percent of Thai agricultural land relied on rainfall. Thai farmers were not able to grow crops for lack of water. The royal rainmaking project debuted on 20 July 1969 at his behest, when the first rainmaking attempt was made at Khao Yai National Park. Dry ice flakes were scattered over clouds. Reportedly, some rainfall resulted. In 1971, the government established the Artificial Rainmaking Research and Development Project within the Thai Ministry of Agriculture and Cooperatives.

In January 2011, several newspapers and magazines, including the UK's Sunday Times and Arabian Business, reported that scientists backed by the government of Abu Dhabi, the capital of the United Arab Emirates, had created over 50 artificial rainstorms between July and August 2010 near Al Ain, a city which lies close to the country's border with Oman and is the second-largest city in the Abu Dhabi Emirate. The artificial rainstorms were said to have sometimes caused hail, gales and thunderstorms, baffling local residents.

In the run-up to the 2008 Beijing Olympic Games, the Chinese government said it could control precipitation to some extent and that the Games would not be hampered by bad weather conditions. For this purpose it established a government office, the Beijing Weather Modification Office, which is under the national weather control office.

Cloud seeding

Cloud seeding can be done by ground generators (left) or planes

Cloud seeding is a common technique used to enhance precipitation. It entails spraying small particles, such as silver iodide, onto clouds in an attempt to affect their development, usually with the goal of increasing precipitation. Cloud seeding only works to the extent that there is already water vapor present in the air, and critics generally contend that claimed successes occur in conditions which were going to lead to rain anyway. The technique is used in a variety of drought-prone countries, including the United States, China, India, and Russia. In China, there is a perceived dependency upon it in dry regions, and there is a strong suspicion that it is used to "wash the air" in dry and heavily polluted places, such as Beijing. In mountainous areas of the United States, such as the Rocky Mountains and Sierra Nevada, cloud seeding has been employed since the 1950s.

Project Cirrus was an attempt by General Electric to modify the weather, which ran from 1947 to 1952. During that time, under the supervision of the United States Air Force, attempts were made to create snowstorms and to seed hurricanes with silver iodide. While General Electric reported positive results, it also acknowledged that its experiments were controversial.

The United Arab Emirates has been cloud seeding since the 2000s and aims to increase rainfall by 15 to 30% per year. The seeding agents used include potassium chloride, sodium chloride, and magnesium.

Consequences

Societal

Not having adequate systems to handle weather modification may have disastrous consequences. The city of Jeddah in western Saudi Arabia was damaged by floods in 2009 that reportedly killed more than 100 people, igniting questions about why the country does not have effective drainage systems in place.

Human

The U.S. National Library of Medicine notes that silver iodide has no known "ill effects" on people, although people's "hands may have remained yellowed for weeks" after being exposed to it.

Storm prevention

Project Stormfury

Project Stormfury was an attempt to weaken tropical cyclones by flying aircraft into storms and seeding the eyewall with silver iodide. The project was run by the United States Government from 1962 to 1983. A similar project using soot was run in 1958, with inconclusive results. Various methods have been proposed to reduce the harmful effects of hurricanes. Moshe Alamaro of the Massachusetts Institute of Technology proposed using barges with upward-pointing jet engines to trigger smaller storms to disrupt the progress of an incoming hurricane; critics doubt the jets would be powerful enough to make any noticeable difference.

Alexandre Chorin of the University of California, Berkeley, proposed dropping large amounts of environmentally friendly oils on the sea surface to prevent droplet formation. Experiments by Kerry Emanuel of MIT in 2002 suggested that hurricane-force winds would disrupt the oil slick, making it ineffective. Other scientists disputed the factual basis of the theoretical mechanism assumed by this approach.

The Florida company Dyn-O-Mat and its CEO, Peter Cordani, proposed the use of a patented product it developed, called Dyn-O-Gel, to reduce the strength of hurricanes. The substance is a polymer in powder form (a polyacrylic acid derivative) which reportedly has the ability to absorb 1,500 times its own weight in water. The theory is that the polymer is dropped into clouds to remove their moisture and force the storm to use more energy to move the heavier water drops, thus helping to dissipate the storm. When the gel reaches the ocean surface, it reportedly dissolves. Peter Cordani teamed up with Mark Daniels and Victor Miller, the owners of AeroGroup, a government contracting aviation firm which operated ex-military aircraft commercially. Using a high-altitude B-57 bomber, AeroGroup tested the substance, dropping 9,000 pounds from the aircraft's large bomb bay and dispersing it into a large thunderstorm cell just off the east coast of Florida. The tests were documented on film and made international news, with monitored Doppler radar showing that the seeded storms dissipated. In 2003, the program was shut down because of political pressure through NOAA. However, numerical simulations performed by NOAA showed that the approach would not be a practical solution for large systems like a tropical cyclone.

Hail cannons at an international congress on hail shooting held in 1901

Hail cannons have been used by some farmers since the 19th century in an attempt to ward off hail, but there is no reliable scientific evidence to confirm their effectiveness. Another proposed anti-hurricane technology is a method for reducing tropical cyclones' destructive force by pumping sea water into the wind at the bottom of the cyclone's eyewall and diffusing it there.

Hurricane modification

NOAA has published a page addressing various ideas for tropical cyclone manipulation. In 2007, the article "How to stop a hurricane" explored several such ideas, and researchers from NOAA's hurricane research division have likewise addressed hurricane-control proposals.

Later ideas (2017) include laser inversion, along the same lines as laser cooling (normally used at cryogenic temperatures) but intended to cool the top 1 mm of water. If enough power were used, it might be possible, combined with computer modelling, to form an interference pattern able to inhibit a hurricane or significantly reduce its strength by depriving it of heat energy.

Other proposals for hurricane modification include the construction of a large array of offshore wind turbines along the East Coast of the United States. Such turbines would have the dual purpose of generating plentiful energy whilst also reducing the power of oncoming hurricanes before they make landfall.

Pumping up deep ocean waters to cool the surface

Pumping up colder deep ocean water in front of a tropical storm to cool the sea surface skin temperature could be a technique used to fight hurricanes in the Atlantic before they develop into major hurricanes.

The idea is purely speculative and difficult to realize, since placing such pumps in the path of a hurricane would be difficult. Furthermore, any such project would need a large number of pumps to upwell enough water to cool a sea surface area large enough to have any effect, to say nothing of the enormous amount of energy needed to power the pumps and the effects on marine life.

In military

Operation Popeye was a highly classified operation run by the US military from 1967 to 1972. Its purpose was to prolong the monsoon in Southeast Asia. The resulting precipitation successfully disrupted the tactical logistics of the North Vietnamese army. Operation Popeye is believed to be the first successful use of weather modification technology in warfare. After it was revealed, weather modification in warfare was banned by the Environmental Modification Convention (ENMOD).

In "Benign Weather Modification" published in March 1997, Air Force Major Barry B. Coble superficially documents the existence of weather modification science where he traces the developments that have occurred, notably, in the hands of the Pentagon and CIA's staunchest ideological enemies.

  • The first scientifically controlled and monitored effort generally recognized by the meteorological community as constituting weather modification occurred in 1948, when Dr. Irving Langmuir first experimented with artificially seeding clouds to produce rain in New Mexico. His experiments showed positive results, sparking tremendous interest in the field nearly overnight.
  • Many countries throughout the world practice weather modification. The Russians have long been interested in using weather modification as a way to control hail.

In the 1990s, a directive from the chief of staff of the Air Force, Ronald R. Fogleman, was issued to examine the concepts, capabilities, and technologies the United States would require to remain the dominant air and space force in the future.

In law

US and Canada agreement

In 1975, the US and Canada entered into an agreement under the auspices of the United Nations for the exchange of information on weather modification activity.

1977 UN Environmental Modification Convention

Weather modification, particularly hostile weather warfare, was addressed by the "United Nations General Assembly Resolution 31/72, TIAS 9614 Convention on the Prohibition of Military or Any Other Hostile Use of Environmental Modification Techniques." The Convention was signed in Geneva on May 18, 1977; entered into force on October 5, 1978; ratified by U.S. President Jimmy Carter on December 13, 1979; and the U.S. ratification deposited in New York on January 17, 1980.

US National Oceanic and Atmospheric Administration

In the US, the National Oceanic and Atmospheric Administration keeps records of weather modification projects on behalf of the Secretary of Commerce, under the authority of Public Law 92-205, 15 USC § 330B, enacted in 1971.

Proposed US legislation

U.S. Senate Bill 517 and U.S. House Bill 2995 were two bills proposed in 2005 that would have expanded experimental weather modification, established a Weather Modification Operations and Research Board, and implemented a national weather modification policy. Neither became law.

Senate Bill 1807 and House Bill 3445, identical bills introduced on July 17, 2007, proposed to establish a Weather Mitigation Advisory and Research Board to fund weather modification research.

Passed Tennessee legislation

Tennessee bill HB 2063/SB 2691 was signed into law on April 11, 2024. This bill bans the "intentional injection, release, or dispersion" of chemicals within Tennessee "with the express purpose of affecting temperature, weather, or the intensity of the sunlight."

The text of the bill does not explicitly reference the chemtrail conspiracy theory, but the bill's sponsor, Sen. Steve Southerland, said that it is one of the bill's intended targets.

In religion and mythology

Witches concoct a brew to summon a hailstorm.

Magical and religious practices to control the weather are attested in a variety of cultures. In ancient India, it is said that yajna or Vedic rituals of chanting mantras and offerings were performed by rishis to bring sudden bursts of rainfall in rain starved regions. Some Indigenous Americans, like some Europeans, had rituals which they believed could induce rain.

In the early modern era, people observed that the firing of cannons and other firearms during battles often seemed to initiate precipitation.

In Greek mythology, Iphigenia was offered as a human sacrifice to appease the wrath of the goddess Artemis, who had becalmed the Achaean fleet at Aulis at the beginning of the Trojan War. In Homer's Odyssey, Aeolus, keeper of the winds, gave Odysseus and his crew the gift of the four winds in a bag. However, the sailors opened the bag while Odysseus slept, looking for booty, and as a result were blown off course by the resulting gale. In ancient Rome, the lapis manalis was a sacred stone kept outside the walls of Rome in a temple of Mars. When Rome suffered from drought, the stone was dragged into the city. The North Berwick witches of Scotland were found guilty of using black magic to summon storms to murder King James VI of Scotland by seeking to sink the ship upon which he travelled. Scandinavian witches allegedly claimed to sell the wind, held in bags or magically confined in wooden staves, to seamen who could release it when becalmed. In various towns of Navarre, prayers petitioned Saint Peter to grant rain in times of drought. If the rain was not forthcoming, the statue of St Peter was removed from the church and tossed into a river.

In the Hebrew Bible, it is recorded that Elijah, as an act of judgement, told King Ahab that neither dew nor rain would fall until Elijah called for it. It is further recorded that the ensuing drought lasted for three and a half years, at which time Elijah called the rains to come again and the land was restored. The New Testament records Jesus Christ calming a storm by speaking to it.

In Islam, Salat Al-Istisqa’ (Prayer for Rain) is taken as a recourse when seeking rain from God during times of drought.

Conspiracy theories

Weather modification, along with climate engineering, is a recurring theme in conspiracy theories. The chemtrail conspiracy theory supposes that jet contrails are chemically altered to modify the weather and other phenomena. Other theories attempt to implicate scientific infrastructure such as the High-frequency Active Auroral Research Program (HAARP).

In literature

Frank Herbert's Dune series features weather control technology, mainly on two planets: Arrakis, where the technology is used by the Fremen to ensure privacy from observation and to hide from the Imperium their true population and their plans to terraform the planet; and Chapterhouse, where the Bene Gesserit intend to turn the planet into a desert.

The ability to manipulate the weather has become a common superpower in superhero fiction. A notable example is the Marvel Comics character Storm.

In the children's book Cloudy with a Chance of Meatballs, the fictional town of Chewandswallow has weather that rains down food instead of rain or snow, which becomes so extreme that it forces the citizens to move to a different town. The book was adapted into a movie in which Flint Lockwood, the town's outcast scientist, creates a machine that converts water from the clouds into food.

Educational inflation

From Wikipedia, the free encyclopedia

Educational inflation is the imposition of ever-higher educational requirements for occupations that do not actually require them. Credential inflation is the growing overqualification of workers for occupations that results as employers demand ever-higher credentials.


Some occupations that used to require a primary school diploma, such as construction worker, shoemaker, and cleaner, now require a high school diploma. Some that required a high school diploma, such as construction supervisor, loans officer, insurance clerk, and executive assistant, increasingly require a bachelor's degree. Some jobs that formerly required candidates to have a bachelor's degree, such as becoming a director in the federal government, tutoring students, or being a history tour guide at a historic site, now require a master's degree. Some jobs that used to require a master's degree, such as junior scientific researcher positions and sessional lecturer jobs, now require a PhD. And some jobs that formerly required only a PhD, such as university professor positions, increasingly require one or more postdoctoral fellowship appointments. Often, increased requirements are simply a way to reduce the number of applicants for a position. The increasingly global nature of competition for high-level positions may be another cause of credential creep.

Credentialism and professionalization

Credentialism is a reliance on formal qualifications or certifications to determine whether someone is permitted to undertake a task, speak as an "expert" or work in a certain field. It has also been defined as "excessive reliance on credentials, especially academic degrees, in determining hiring or promotion policies."

Professionalization is the social process by which any trade or occupation is transformed into a true "profession of the highest integrity and competence". This process tends to involve establishing acceptable qualifications, a professional body or association to oversee the conduct of members of the profession and some degree of demarcation of the qualified from unqualified amateurs. This creates "a hierarchical divide between the knowledge-authorities in the professions and a deferential citizenry." This demarcation is often termed "occupational closure", as it means that the profession then becomes closed to entry from outsiders, amateurs and the unqualified: a stratified occupation "defined by professional demarcation and grade".

Causes

Knowledge economy

The developed world has transitioned from an agricultural economy (pre-1760s) to an industrial economy (1760s – 1900s) to a knowledge economy (late 1900s – present) due to increases in innovation. This latest stage is marked by technological advancement and global competition to produce new products and research. The shift to a knowledge economy, a term coined by Peter Drucker, has led to a decrease in the demand for physical labor (such as that seen during the Industrial Age) and an increase in the demand for intellect, a shift that has brought a number of problems with it. Economists from the Federal Reserve Bank of St. Louis, who categorized jobs as routine cognitive, routine manual, nonroutine cognitive or nonroutine manual, have documented an increase of 30 million in the number of nonroutine cognitive jobs over the past 30 years, making it the most common job type. These nonroutine cognitive jobs, according to researchers, require "high intellectual skill", which can be rather difficult to measure in potential employees. Additionally, production outputs differ among labor types: the results of manual labor are tangible, whereas the results of knowledge labor are not. Management consultant Fred Nickols identifies an issue with this:

The working behaviors of the manual worker are public and those of the knowledge worker are private. From the perspective of a supervisor or an industrial engineer, this means the visibility of working is high for a manual worker and low for a knowledge worker.

Decreased visibility in the workplace correlates with a greater risk of employees underperforming in cognitive tasks. This, along with the previously mentioned difficulty of measuring cognitive skill, has resulted in employers requiring credentials such as college degrees. Matt Sigelman, CEO of a labor market analysis firm, elaborates on why employers value degrees:

Many employers are using the bachelor's degree as a proxy for quality employees—a rough, rule-of-thumb screening mechanism to sort through the resume pile. Employers believe in the college experience, not just as an incubator for job-specific skills but particularly for the so-called soft skills, such as writing, analytical thinking and even maturity.

History

Western culture, specifically that in the United States, has experienced a rise in the attractiveness of professions and a decline in the attractiveness of manufacturing and independent business. This shift could be attributed to the class stratification that occurred during the Gilded Age.

The Gilded Age was a period marked by the rise of big business and globalization, particularly within the construction and oil industries. During the Long Depression, the monopoly trusts dispossessed family and subsistence farmers of their land. This, combined with the mechanization of farm work, led to mass proletarianization (employers and the self-employed becoming wage laborers) as individuals took jobs working on large projects such as the Transcontinental Railroad. Rapid advancements such as railroad development and the increased use of steamboats to import and export goods made cities such as New York and Chicago convenient places to operate a business, and therefore ideal places to find work. Local business owners had a difficult time competing with large companies such as Standard Oil and Armour and Company operating out of cities. The ability of people to become entrepreneurs declined, and people began taking underpaying jobs at these companies. This fueled a class divide between the working class and industrialists (also called "robber barons") such as Andrew Carnegie and John Rockefeller.

Attempting to increase the prestige of one's occupation became standard among working class individuals trying to recover from the financial hardships of this time. Unqualified individuals turned to professions such as medicine and law, which had low barriers to entry. Referring to this phenomenon, historian Robert Huddleston Wiebe once commented:

The concept of a middle class crumbled to a touch. Small businesses appeared and disappeared at a frightening rate. The so-called professions meant little as long as anyone with a bag of pills and a bottle of syrup could pass for a doctor, a few books and a corrupt judge made a man a lawyer, and an unemployed literate qualified as a teacher. Nor did the growing number of clerks, salesmen, and secretaries of the city share much more than a common sense of drift as they fell into jobs that attached them to nothing in particular, beyond a salary, a set of clean clothes, and a hope that they would somehow rise in the world.

The establishment of legitimized professional certifications began after the turn of the twentieth century, when the Carnegie Foundation published reports on medical and legal education. One example of such reports is the Flexner Report, written by educator Abraham Flexner. This research led to the closing of low-quality medical and law schools. The impact of the many unqualified workers of the Gilded Age also increased the motivation to weed out unqualified workers in other professions. Professionalization increased, and the number of professions and professionals multiplied. There were economic benefits to this, because weeding out unqualified candidates lowered the competition for jobs and drove up salaries.

The alliance of employers with educational institutions progressed throughout the twentieth century as businesses and technological advancements progressed. Businessmen were unable to keep schedules or accounts in their heads like the small-town merchant had once done. New systems of accounting, organization, and business management were developed. In his book The Visible Hand, Alfred Chandler of Harvard Business School explained that the increase in large corporations with multiple divisions killed off the hybrid owner/managers of simpler times and created a demand for salaried, "scientific" management. The development of professional management societies, research groups, and university business programs began in the early 1900s. By 1910, Harvard and Dartmouth offered graduate business programs, and NYU, the University of Chicago, and the University of Pennsylvania offered undergraduate business programs. By the 1960s, nearly half of all managerial jobs formally required either an undergraduate or graduate degree.

Academic inflation

Academic inflation is the contention that an excess of college-educated individuals with lower degrees (associate and bachelor's degrees) and even higher qualifications (master's or doctorate degrees) compete for too few jobs that require these degrees.

Academic inflation occurs when university graduates take up work that was not formerly done by graduates of a certain level, and higher-degree holders continue to migrate to this particular occupation until it eventually becomes a field known as a "graduate profession" and the minimum job requirements have been inflated academically for low-level job tasks.

The institutionalizing of professional education has resulted in fewer and fewer opportunities for young people to work their way up by "learning on the job". Academic inflation leads employers to put more faith into certificates and diplomas awarded on the basis of other people's assessments.

The term "academic inflation" was popularized by Ken Robinson in his TED Talk entitled "Schools Kill Creativity".

Academic inflation has been analogized to the inflation of paper currencies where too much currency chases too few commodities.

Grade inflation

Grade inflation is the tendency to award progressively higher academic grades for work that would have received lower grades in the past. It is frequently discussed in relation to education in the United States, and to GCSEs and A levels in England and Wales. It is also discussed as an issue in Canada and many other nations, especially Australia and New Zealand.

Credential inflation or degree inflation

Credential inflation refers to the devaluation of educational or academic credentials over time and a corresponding decrease in the expected advantage given a degree holder in the job market. Credential inflation is thus similar to price inflation, and describes the declining value of earned certificates and degrees. Credential inflation, in the form of increased educational requirements and testing, can also create artificial labor shortages.

Credential inflation has been recognized as an enduring trend over the past century in Western higher education, and is also known to have occurred in ancient China and Japan, and at Spanish universities of the 17th century.

For instance, in the late 1980s, a bachelor's degree was the standard qualification to enter the profession of physical therapy. By the 1990s, a master's degree was expected. Today, a doctorate is becoming the norm.

State requirements that registered nurses must hold bachelor's degrees have also contributed to a nursing shortage.

Indications

A good example of credential inflation is the decline in the value of the US high school diploma since the beginning of the 20th century, when it was held by less than 10 percent of the population. At the time, high school diplomas attested to middle-class respectability and for many years even provided access to managerial level jobs. In the 21st century, however, a high school diploma often barely qualifies the graduate for menial service work.

One indicator of credential inflation is the relative decline in the wage differential between those with college degrees and those with only high school diplomas. An additional indicator is the gap between the credentials requested by employers in job postings and the qualifications of those already in those occupations. A 2014 study in the United States found, for example, that 65% of job postings for executive secretaries and executive assistants now call for a bachelor's degree, but only 19% of those currently employed in these roles have a degree. Jobs that were open to high school graduates decades ago now routinely require higher education as well—without an appreciable change in required skills. In some cases, such as IT help desk roles, a study found there was little difference in advertised skill requirements between jobs requiring a college degree and those that do not.

According to the New York Federal Reserve Bank, about one third of all college graduates are underemployed, meaning they are employed in jobs below the level of their degrees. That share has remained largely unchanged for thirty years, although the chance of an underemployed graduate holding a good job has gone down 28.0% for recent hires and 20.6% overall.

Causes

The causes of credential inflation are controversial, but it is generally thought to be the result of increased access to higher education. This has resulted in entry-level jobs requesting a bachelor's (or higher) degree when they were once open to high school graduates. Potential sources of credential inflation include: degree requirements by employers, self-interest of individuals and families, increased standards of living which allow for additional years of education, cultural pushes for being educated, and the availability of federal student loans which allow many more individuals to obtain credentials than could otherwise afford to do so.

In particular, the internal dynamics of credential inflation threaten higher education initiatives around the world because credential inflation appears to operate independently of market demand for credentials.

The push for more Americans to get a higher education rests on the well-evidenced idea that those without a college degree are less employable. Many critics of higher education, in turn, complain that a surplus of college graduates has produced an "employer's market". Economist Bryan Caplan has argued the combination of more college graduates and weaker learning outcomes has led to employers asking for college degrees for jobs that don't need one and previously did not require one.

Problems

Credential inflation is a controversial topic. There is very little consensus on how, or if, this type of inflation impacts higher education, the job market, and salaries. Some common concerns discussed in this topic are:

  • A grandiose and superficial credential culture has produced a paradox within China's employment landscape, particularly evident in disproportionate educational requirements for roles that traditionally had none. Previously, no academic qualifications were required for security guard positions, yet some employers now demand that applicants for such roles hold a master's degree or higher. In fact, many such jobs do not require formal education and can be performed adequately after on-the-job training or apprenticeships. Moreover, applicants who meet the educational requirements and are hired often find that the subject of their diploma does not align with the work they are doing; they must start from scratch in their jobs, wasting educational resources.
  • College tuition and fee increases have been blamed on degree inflation, though the current data do not generally support this assertion.
  • Credential-driven students may be less engaged than those who are attending college for personal enrichment.
  • Devaluation of other forms of learning.
  • Opportunity costs of attending graduate school, which can include delayed savings, fewer years in the workforce (and lower earnings), and postponement of starting a family.
  • Lack of adequately trained faculty and a rise in the number of adjunct professors, which can adversely impact the quality of education.
  • Grade inflation has been correlated to degree inflation by some academics, though the causal direction is debated.
  • Some have accused degree inflation of devaluing job and employment experience, though most data show that degrees are not as highly sought after as relevant experience, which is the cited reason for student loan debt that cannot be paid back.

In non-US countries

China

Chinese educational competition has been described as breakneck and cut-throat. The word "neijuan", or "involution", has been used to describe people competing for diminishing returns. China exhibits high wealth inequality and meager social mobility, raising the stakes of getting into the few available managerial positions. The entrenched high-stakes testing culture, coupled with inconsistent governance, has led to unusually high levels of cheating among the fuerdai (China's second-generation rich). The practice includes whole cheating rings and persists despite extreme penalties, as high as seven years in prison. To combat this self-defeating testing culture, the Chinese government has banned cram schools and for-profit tutoring businesses, as well as tutoring on weekends. "Tang ping", or "lying flat", refers to a peaceful Chinese protest movement calling attention to the desire not to be burned up in an economic race that so many cannot seem to win. Six hundred thousand lives are lost in China each year as a result of "guolaosi" (simplified Chinese: 过劳死; traditional Chinese: 過勞死), or "death by overwork".

South Korea

South Korea has a very high-pressure education system. 70% of South Koreans have postsecondary diplomas, and South Koreans score at or near the top when compared with other countries, yet they are left to fight for few jobs in a high-maintenance economy. Aside from having to work very hard, they also face an immense housing crisis. In 2021, suicide was the leading cause of death for those under 40, responsible for 44% of teenage deaths and 56.8% of deaths among those in their 20s. Among OECD nations, South Korea has the highest suicide rate. Only 23.6% of teachers expressed satisfaction with their work in a 2023 poll. The country has also suffered from cripplingly low birth rates, fewer than one child per woman, a testament to the strain that would-be parents endure.

Water vapor

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Water_vapor

Water vapor, water vapour or aqueous vapor is the gaseous phase of water. It is one state of water within the hydrosphere. Water vapor can be produced from the evaporation or boiling of liquid water or from the sublimation of ice. Water vapor is transparent, like most constituents of the atmosphere. Under typical atmospheric conditions, water vapor is continuously generated by evaporation and removed by condensation. It is less dense than most of the other constituents of air and triggers convection currents that can lead to clouds and fog.

Being a component of Earth's hydrosphere and hydrologic cycle, it is particularly abundant in Earth's atmosphere, where it acts as a greenhouse gas and a warming feedback, contributing more to the total greenhouse effect than non-condensable gases such as carbon dioxide and methane. The use of water vapor, as steam, has been important for cooking and, since the industrial revolution, as a major component in energy production and transport systems.

Water vapor is a relatively common atmospheric constituent, present even in the solar atmosphere as well as on every planet in the Solar System and many astronomical objects, including natural satellites, comets and even large asteroids. Likewise, the detection of extrasolar water vapor would indicate a similar distribution in other planetary systems. Water vapor can also be indirect evidence supporting the presence of extraterrestrial liquid water in the case of some planetary-mass objects.

Water vapor is a greenhouse gas. Because its concentration responds to temperature changes, it is referred to as a 'feedback': it amplifies the effect of forcings that initially cause the warming.

Properties

Evaporation

Whenever a water molecule leaves a surface and diffuses into a surrounding gas, it is said to have evaporated. Each individual water molecule which transitions between a more associated (liquid) and a less associated (vapor/gas) state does so through the absorption or release of kinetic energy. The aggregate measurement of this kinetic energy transfer is defined as thermal energy and occurs only when there is a differential in the temperature of the water molecules. Liquid water that becomes water vapor takes a parcel of heat with it, in a process called evaporative cooling. The amount of water vapor in the air determines how frequently molecules will return to the surface. When net evaporation occurs, the body of water undergoes a net cooling directly related to the loss of water.

In the US, the National Weather Service measures the actual rate of evaporation from a standardized "pan" open water surface outdoors, at various locations nationwide. Others do likewise around the world. The US data is collected and compiled into an annual evaporation map. The measurements range from under 30 to over 120 inches per year. Formulas can be used for calculating the rate of evaporation from a water surface such as a swimming pool. In some countries, the evaporation rate far exceeds the precipitation rate.
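As a sketch of what such a formula looks like, the snippet below estimates evaporation from an open water surface using a simple empirical correlation of the kind found in engineering references, in which the evaporated mass flow is proportional to the air speed over the surface and to the humidity-ratio difference between saturated air at the water temperature and the ambient air. The coefficients (25 and 19, plus the Magnus constants for saturation pressure) are one commonly quoted set, used here for illustration rather than design work.

```python
# Rough estimate of evaporation from an open water surface (e.g., a
# swimming pool): E [kg/h] = (25 + 19*v) * A * (x_s - x), where v is
# air speed (m/s), A is surface area (m^2), and x_s, x are humidity
# ratios (kg water per kg dry air). Illustrative coefficients only.
import math

P_ATM = 101_325.0  # ambient pressure, Pa

def saturation_vapor_pressure(t_c: float) -> float:
    """Magnus approximation for saturation vapor pressure over water, Pa."""
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

def humidity_ratio(t_c: float, rh_percent: float) -> float:
    """Humidity ratio (kg vapor / kg dry air) at pressure P_ATM."""
    e = saturation_vapor_pressure(t_c) * rh_percent / 100.0
    return 0.622 * e / (P_ATM - e)

def evaporation_rate(area_m2, water_t_c, air_t_c, rh_percent, wind_m_s):
    """Evaporated mass flow, kg/h."""
    x_s = humidity_ratio(water_t_c, 100.0)   # saturated at water temperature
    x = humidity_ratio(air_t_c, rh_percent)  # ambient air
    return (25 + 19 * wind_m_s) * area_m2 * (x_s - x)

# Example: 50 m^2 pool at 25 C, air at 28 C and 50% RH, light breeze.
print(f"{evaporation_rate(50, 25.0, 28.0, 50.0, 0.5):.1f} kg/h")  # ~14 kg/h
```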

Evaporative cooling is restricted by atmospheric conditions. Humidity is the amount of water vapor in the air. The vapor content of air is measured with devices known as hygrometers. The measurements are usually expressed as specific humidity or percent relative humidity. The temperatures of the atmosphere and the water surface determine the equilibrium vapor pressure; 100% relative humidity occurs when the partial pressure of water vapor is equal to the equilibrium vapor pressure. This condition is often referred to as complete saturation. Humidity ranges from 0 grams per cubic metre in dry air to 30 grams per cubic metre (0.03 ounce per cubic foot) when the vapor is saturated at 30 °C.
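The 30 grams per cubic metre figure can be checked from first principles by combining an approximation for saturation vapor pressure with the ideal gas law. The short sketch below does this using the Magnus approximation (one common coefficient set); it is a cross-check, not a meteorological-grade calculation.

```python
# Mass of water vapor held by saturated air at a given temperature,
# from the Magnus approximation and the ideal gas law.
import math

M_WATER = 0.018015  # molar mass of water, kg/mol
R = 8.314           # universal gas constant, J/(mol K)

def saturation_vapor_pressure(t_c: float) -> float:
    """Magnus approximation for saturation vapor pressure over water, Pa."""
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

def saturation_vapor_density(t_c: float) -> float:
    """Saturated water vapor density, grams per cubic metre."""
    e_s = saturation_vapor_pressure(t_c)
    return e_s * M_WATER / (R * (t_c + 273.15)) * 1000.0

print(f"{saturation_vapor_density(30.0):.1f} g/m^3")  # ~30 g/m^3, as stated
```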

Recovery of meteorites in Antarctica (ANSMET)
 
Electron micrograph of freeze-etched capillary tissue

Sublimation

Sublimation is the process by which water molecules directly leave the surface of ice without first becoming liquid water. Sublimation accounts for the slow mid-winter disappearance of ice and snow at temperatures too low to cause melting. Antarctica shows this effect to a unique degree because it is by far the continent with the lowest rate of precipitation on Earth. As a result, there are large areas where millennial layers of snow have sublimed, leaving behind whatever non-volatile materials they had contained. This is extremely valuable to certain scientific disciplines, a dramatic example being the collection of meteorites that are left exposed in unparalleled numbers and excellent states of preservation.

Sublimation is important in the preparation of certain classes of biological specimens for scanning electron microscopy. Typically the specimens are prepared by cryofixation and freeze-fracture, after which the broken surface is freeze-etched, being eroded by exposure to vacuum until it shows the required level of detail. This technique can display protein molecules, organelle structures and lipid bilayers with very low degrees of distortion.

Condensation

Clouds, formed by condensed water vapor

Water vapor will only condense onto another surface when that surface is cooler than the dew point temperature, or when the water vapor equilibrium in air has been exceeded. When water vapor condenses onto a surface, a net warming occurs on that surface, because the water molecules bring latent heat with them; in turn, the temperature of the atmosphere drops slightly. In the atmosphere, condensation produces clouds, fog and precipitation (usually only when facilitated by cloud condensation nuclei). The dew point of an air parcel is the temperature to which it must cool before water vapor in the air begins to condense; condensation in the atmosphere forms cloud droplets.

Also, a net condensation of water vapor occurs on surfaces when the temperature of the surface is at or below the dew point temperature of the atmosphere. Deposition is a phase transition separate from condensation which leads to the direct formation of ice from water vapor. Frost and snow are examples of deposition.

There are several mechanisms of cooling by which condensation occurs:

  • Direct loss of heat by conduction or radiation.
  • Cooling from the drop in air pressure which occurs with uplift of air, also known as adiabatic cooling; a rule-of-thumb calculation follows this list. Air can be lifted by mountains, which deflect the air upward, by convection, and by cold and warm fronts.
  • Advective cooling: cooling due to the horizontal movement of air.
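For the adiabatic case, a common rule of thumb (Espy's approximation, not mentioned in the text) puts the condensation level roughly 125 m above the surface for every degree Celsius separating the air temperature from its dew point:

```python
def lifting_condensation_level_m(temp_c: float, dew_point_c: float) -> float:
    """Approximate cloud-base height via Espy's rule of thumb:
    ~125 m of adiabatic lift per degree C of dew-point spread."""
    return 125.0 * max(temp_c - dew_point_c, 0.0)

# Surface air at 30 C with a 20 C dew point condenses roughly 1250 m up.
print(lifting_condensation_level_m(30.0, 20.0))  # 1250.0
```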

Importance and uses

  • Provides water for plants and animals: water vapor converts to rain and snow, which serve as a natural source of water for plants and animals.
  • Controls evaporation: excess water vapor in the air decreases the rate of evaporation.
  • Determines climatic conditions: excess water vapor in the air produces rain, fog, snow, etc., and hence helps determine climatic conditions.

Chemical reactions

A number of chemical reactions have water as a product. If the reactions take place at temperatures higher than the dew point of the surrounding air, the water is formed as vapor and increases the local humidity; if below the dew point, local condensation occurs. Typical reactions that result in water formation are the burning of hydrogen or hydrocarbons in air or other oxygen-containing gas mixtures (for example, methane combustion: CH4 + 2 O2 → CO2 + 2 H2O), and reactions with oxidizers.

In a similar fashion, other chemical or physical processes can take place in the presence of water vapor, resulting in new chemicals forming, such as rust on iron or steel; in polymerization (certain polyurethane foams and cyanoacrylate glues cure with exposure to atmospheric humidity); or in changes of form, as when anhydrous chemicals absorb enough vapor to form or alter a crystalline structure, sometimes producing characteristic color changes that can be used for measurement.

Measurement

Measuring the quantity of water vapor in a medium can be done directly or remotely with varying degrees of accuracy. Remote methods such as electromagnetic absorption are possible from satellites above planetary atmospheres. Direct methods may use electronic transducers, moistened thermometers, or hygroscopic materials that measure changes in physical properties or dimensions.


| Method | Medium | Temperature range (°C) | Measurement uncertainty | Typical measurement frequency | System cost | Notes |
|---|---|---|---|---|---|---|
| Sling psychrometer | air | −10 to 50 | low to moderate | hourly | low | |
| Satellite-based spectroscopy | air | −80 to 60 | low | | very high | |
| Capacitive sensor | air/gases | −40 to 50 | moderate | 2 to 0.05 Hz | medium | prone to becoming saturated/contaminated over time |
| Warmed capacitive sensor | air/gases | −15 to 50 | moderate to low | 2 to 0.05 Hz (temperature dependent) | medium to high | prone to becoming saturated/contaminated over time |
| Resistive sensor | air/gases | −10 to 50 | moderate | 60 seconds | medium | prone to contamination |
| Lithium chloride dewcell | air | −30 to 50 | moderate | continuous | medium | see dewcell |
| Cobalt(II) chloride | air/gases | 0 to 50 | high | 5 minutes | very low | often used in humidity indicator cards |
| Absorption spectroscopy | air/gases | | moderate | | high | |
| Aluminum oxide | air/gases | | moderate | | medium | see moisture analysis |
| Silicon oxide | air/gases | | moderate | | medium | see moisture analysis |
| Piezoelectric sorption | air/gases | | moderate | | medium | see moisture analysis |
| Electrolytic | air/gases | | moderate | | medium | see moisture analysis |
| Hair tension | air | 0 to 40 | high | continuous | low to medium | affected by temperature; adversely affected by prolonged high concentrations |
| Nephelometer | air/other gases | | low | | very high | |
| Goldbeater's skin (cow peritoneum) | air | −20 to 30 | moderate (with corrections) | slow, slower at lower temperatures | low | ref: WMO Guide to Meteorological Instruments and Methods of Observation No. 8, 2006 (pages 1.12–1) |
| Lyman-alpha | | | | high frequency | high | requires frequent calibration; see http://amsglossary.allenpress.com/glossary/search?id=lyman-alpha-hygrometer1 |
| Gravimetric hygrometer | | | very low | | very high | often called the primary source; national independent standards developed in the US, UK, EU and Japan |

Impact on air density

Water vapor is lighter, or less dense, than dry air, so at equivalent temperatures it is buoyant with respect to dry air. The density of dry air at standard temperature and pressure (273.15 K, 101.325 kPa) is about 1.29 g/L, while water vapor at standard temperature has a partial vapor pressure of 0.6 kPa and a much lower density of about 0.0048 g/L.

Calculations

Water vapor and dry air density calculations at 0 °C:

  • The molar mass of water is 18.02 g/mol, as calculated from the sum of the atomic masses of its constituent atoms.
  • The average molar mass of dry air (approx. 78% nitrogen, N2; 21% oxygen, O2; 1% other gases) is about 28.96 g/mol at standard temperature and pressure (STP).
  • Obeying Avogadro's law and the ideal gas law, moist air has a lower density than dry air: at maximum saturation (i.e., relative humidity 100% at 0 °C), the average molar mass of the mixture falls to about 28.89 g/mol (see the density sketch after this list).
  • STP conditions imply a temperature of 0 °C, at which the ability of water to become vapor is very restricted and its concentration in air is very low. The maximum concentration of water vapor expected for a given temperature rises significantly as the temperature rises, approaching 100% (steam, pure water vapor) at 100 °C. Even then, the difference in densities between air and water vapor would still exist (0.598 vs. 1.29 g/L).
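The density figures in this list follow directly from the ideal gas law; a minimal sketch, using the molar masses quoted above:

```python
R = 8.314  # universal gas constant, J/(mol K)

def gas_density_g_per_l(molar_mass_g_mol: float, pressure_pa: float, temp_k: float) -> float:
    """Ideal-gas density rho = p*M/(R*T); g/L is numerically equal to kg/m^3."""
    return pressure_pa * molar_mass_g_mol / 1000.0 / (R * temp_k)

# Dry air (M ~ 28.96 g/mol) at STP vs pure water vapor (M = 18.02 g/mol)
# at its ~0.6 kPa partial pressure and the same temperature:
print(round(gas_density_g_per_l(28.96, 101_325, 273.15), 2))  # ~1.29 g/L
print(round(gas_density_g_per_l(18.02, 600, 273.15), 4))      # ~0.0048 g/L
```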

At equal temperatures

At the same temperature, a column of dry air will be denser, and hence heavier, than a column of air containing any water vapor, the molar mass of diatomic nitrogen and diatomic oxygen both being greater than the molar mass of water. Thus, any volume of dry air will sink if placed in a larger volume of moist air, and a volume of moist air will rise or be buoyant if placed in a larger region of dry air. As the temperature rises, the proportion of water vapor in the air increases, and its buoyancy increases with it. This increase in buoyancy can have a significant atmospheric impact, giving rise to powerful, moisture-rich, upward air currents when the air temperature and sea temperature reach 25 °C or above. This phenomenon provides a significant driving force for cyclonic and anticyclonic weather systems (typhoons and hurricanes).

Respiration and breathing

Water vapor is a by-product of respiration in plants and animals. As its concentration increases, its partial-pressure contribution to air pressure increases, lowering the partial-pressure contribution of the other atmospheric gases (Dalton's law), since the total air pressure must remain constant. The presence of water vapor in the air thus naturally dilutes or displaces the other air components as its concentration increases.

This can have an effect on respiration. In very warm air (35 °C) the proportion of water vapor is large enough to give rise to the stuffiness that can be experienced in humid jungle conditions or in poorly ventilated buildings.
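The effect can be quantified. Assuming saturation, water vapor occupies roughly 6% of sea-level air pressure at body temperature; the check below reuses the Magnus approximation (an assumption, since the text specifies no formula):

```python
import math

def saturation_vapor_pressure_kpa(t_celsius: float) -> float:
    """Equilibrium vapor pressure over liquid water, kPa (Magnus approximation)."""
    return 0.6112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

# Share of sea-level air pressure (101.325 kPa) taken up by water vapor
# in fully humidified air at body temperature and in saturated 35 C air:
print(round(100 * saturation_vapor_pressure_kpa(37.0) / 101.325, 1))  # ~6.2 %
print(round(100 * saturation_vapor_pressure_kpa(35.0) / 101.325, 1))  # ~5.5 %
```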

Lifting gas

Water vapor is less dense than air and is therefore buoyant in air, but at ambient temperatures its vapor pressure is lower than the total air pressure. When water vapor is used as a lifting gas by a thermal airship, the water is heated to form steam so that its vapor pressure exceeds the surrounding air pressure, maintaining the shape of a theoretical "steam balloon"; this yields approximately 60% of the lift of helium and twice that of hot air.
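The quoted lift figures can be sanity-checked from ideal-gas densities. In the sketch below, the 20 °C ambient temperature and the 100 °C hot-air temperature are illustrative assumptions:

```python
R = 8.314       # universal gas constant, J/(mol K)
P = 101_325.0   # sea-level pressure, Pa
T_AMBIENT = 293.15  # assumed ambient temperature, K (20 C)

def density(molar_mass_g_mol: float, temp_k: float) -> float:
    """Ideal-gas density in kg/m^3 at sea-level pressure."""
    return P * molar_mass_g_mol / 1000.0 / (R * temp_k)

air = density(28.96, T_AMBIENT)
lift_steam = air - density(18.02, 373.15)    # steam at 100 C
lift_helium = air - density(4.00, T_AMBIENT) # helium at ambient temperature
lift_hot_air = air - density(28.96, 373.15)  # air heated to 100 C

print(round(lift_steam / lift_helium, 2))   # ~0.59, i.e. ~60% of helium's lift
print(round(lift_steam / lift_hot_air, 2))  # ~2.4, i.e. roughly twice hot air
```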

General discussion

The amount of water vapor in an atmosphere is constrained by partial pressure and temperature. Dew point temperature and relative humidity serve as guidelines for the behavior of water vapor in the water cycle. Energy input, such as sunlight, can trigger more evaporation from an ocean surface or more sublimation from a chunk of ice on top of a mountain. The balance between condensation and evaporation gives the quantity called the vapor partial pressure.

The maximum partial pressure (saturation pressure) of water vapor in air varies with the temperature of the air and water vapor mixture. A variety of empirical formulas exist for this quantity; the most used reference formula is the Goff-Gratch equation for the SVP over liquid water below zero degrees Celsius:

\[
\log_{10} p = -7.90298\left(\frac{373.16}{T} - 1\right) + 5.02808\,\log_{10}\frac{373.16}{T} - 1.3816\times 10^{-7}\left(10^{\,11.344\,(1 - T/373.16)} - 1\right) + 8.1328\times 10^{-3}\left(10^{-3.49149\,(373.16/T - 1)} - 1\right) + \log_{10}(1013.246)
\]

where T, the temperature of the moist air, is given in units of kelvin, and p is given in units of millibars (hectopascals).

The formula is valid from about −50 to 102 °C; however, only a very limited number of measurements of the vapor pressure of water over supercooled liquid water exist. A number of other formulas can also be used.
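For reference, a direct transcription of the equation above into code, with sanity checks at the ice and steam points (a sketch; operational software would use a vetted implementation):

```python
import math

STEAM_POINT_K = 373.16      # steam-point temperature used by Goff-Gratch
STEAM_POINT_HPA = 1013.246  # saturation pressure at the steam point, hPa

def goff_gratch_svp_hpa(temp_k: float) -> float:
    """Saturation vapor pressure over liquid water, hPa (Goff-Gratch, 1946)."""
    ratio = STEAM_POINT_K / temp_k
    log10_p = (
        -7.90298 * (ratio - 1.0)
        + 5.02808 * math.log10(ratio)
        - 1.3816e-7 * (10.0 ** (11.344 * (1.0 - 1.0 / ratio)) - 1.0)
        + 8.1328e-3 * (10.0 ** (-3.49149 * (ratio - 1.0)) - 1.0)
        + math.log10(STEAM_POINT_HPA)
    )
    return 10.0 ** log10_p

print(round(goff_gratch_svp_hpa(273.15), 2))  # ~6.1 hPa at the ice point
print(round(goff_gratch_svp_hpa(373.15), 1))  # ~1012.9 hPa, ~1 atm at boiling
```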

When the boiling temperature of water is reached, net evaporation always occurs under standard atmospheric conditions, regardless of the relative humidity; this rapid process dispels massive amounts of water vapor into a cooler atmosphere.

Exhaled air is almost fully saturated with water vapor at body temperature. In cold air the exhaled vapor quickly condenses, showing up as fog or a mist of water droplets and as condensation or frost on surfaces. Forcibly condensing these water droplets from exhaled breath is the basis of exhaled breath condensate, an evolving medical diagnostic test.

Controlling water vapor in air is a key concern in the heating, ventilating, and air-conditioning (HVAC) industry, since thermal comfort depends on the moist air conditions. Non-human comfort applications are called refrigeration, and are also affected by water vapor. For example, many food stores, like supermarkets, use open chiller cabinets, or food cases, which can significantly lower the water vapor pressure (lowering humidity). This practice delivers several benefits as well as problems.

In Earth's atmosphere

Evidence for increasing amounts of stratospheric water vapor over time in Boulder, Colorado.

Gaseous water represents a small but environmentally significant constituent of the atmosphere. The percentage of water vapor in surface air varies from 0.01% at −42 °C (−44 °F) to 4.24% when the dew point is 30 °C (86 °F). Over 99% of atmospheric water is in the form of vapor rather than liquid water or ice, and approximately 99.13% of the water vapor is contained in the troposphere. The condensation of water vapor to the liquid or ice phase is responsible for clouds, rain, snow, and other precipitation, all of which count among the most significant elements of what we experience as weather. Less obviously, the latent heat of vaporization, which is released to the atmosphere whenever condensation occurs, is one of the most important terms in the atmospheric energy budget on both local and global scales. For example, latent heat release in atmospheric convection is directly responsible for powering destructive storms such as tropical cyclones and severe thunderstorms. Water vapor is an important greenhouse gas owing to the hydroxyl (O-H) bond, which absorbs strongly in the infrared.

Water vapor is the "working medium" of the atmospheric thermodynamic engine which transforms heat energy from sun irradiation into mechanical energy in the form of winds. Transforming thermal energy into mechanical energy requires an upper and a lower temperature level, as well as a working medium which shuttles forth and back between both. The upper temperature level is given by the soil or water surface of the Earth, which absorbs the incoming sun radiation and warms up, evaporating water. The moist and warm air at the ground is lighter than its surroundings and rises up to the upper limit of the troposphere. There the water molecules radiate their thermal energy into outer space, cooling down the surrounding air. The upper atmosphere constitutes the lower temperature level of the atmospheric thermodynamic engine. The water vapor in the now cold air condenses out and falls down to the ground in the form of rain or snow. The now heavier cold and dry air sinks down to ground as well; the atmospheric thermodynamic engine thus establishes a vertical convection, which transports heat from the ground into the upper atmosphere, where the water molecules can radiate it to outer space. Due to the Earth's rotation and the resulting Coriolis forces, this vertical atmospheric convection is also converted into a horizontal convection, in the form of cyclones and anticyclones, which transport the water evaporated over the oceans into the interior of the continents, enabling vegetation to grow.

Water in Earth's atmosphere is not merely below its boiling point (100 °C); at altitude it falls below its freezing point (0 °C) as well, owing to water's highly polar attraction. Combined with its quantity, water vapor thus has a relevant dew point and frost point, unlike, e.g., carbon dioxide and methane. Water vapor therefore has a scale height that is a fraction of that of the bulk atmosphere, as the water condenses and exits, primarily in the troposphere, the lowest layer of the atmosphere. Carbon dioxide (CO2) and methane, being well mixed in the atmosphere, tend to rise above water vapor. The absorption and emission of both compounds contribute to Earth's emission to space, and thus to the planetary greenhouse effect. This greenhouse forcing is directly observable via spectral features distinct from those of water vapor, and is observed to rise with rising CO2 levels. Conversely, adding water vapor at high altitudes has a disproportionate impact, which is why jet traffic has a disproportionately high warming effect. Oxidation of methane is also a major source of water vapor in the stratosphere, adding about 15% to methane's global warming effect.

In the absence of other greenhouse gases, Earth's water vapor would condense to the surface; this has likely happened, possibly more than once. Scientists thus distinguish between non-condensable (driving) and condensable (driven) greenhouse gases, the latter exemplified by the water vapor feedback described above.

Fog and clouds form through condensation around cloud condensation nuclei. In the absence of nuclei, condensation will only occur at much lower temperatures. Under persistent condensation or deposition, cloud droplets or snowflakes form, which precipitate when they reach a critical mass.

Atmospheric concentration of water vapor is highly variable between locations and times, from 10 ppmv in the coldest air to 5% (50,000 ppmv) in humid tropical air, and can be measured with a combination of land observations, weather balloons and satellites. The water content of the atmosphere as a whole is constantly depleted by precipitation. At the same time it is constantly replenished by evaporation, most prominently from oceans, lakes, rivers, and moist earth. Other sources of atmospheric water include combustion, respiration, volcanic eruptions, the transpiration of plants, and various other biological and geological processes. At any given time there is about 1.29 × 10¹⁶ litres (3.4 × 10¹⁵ gal) of water in the atmosphere. The atmosphere holds 1 part in 2,500 of the fresh water, and 1 part in 100,000 of the total water on Earth. The mean global content of water vapor in the atmosphere is roughly sufficient to cover the surface of the planet with a layer of liquid water about 25 mm deep. The mean annual precipitation for the planet is about 1 metre, a comparison which implies a rapid turnover of water in the air – on average, the residence time of a water molecule in the troposphere is about 9 to 10 days.
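The residence time quoted here follows directly from the two preceding figures, as a quick check shows:

```python
# Mean residence time of water vapor in the troposphere, from the
# figures above: standing equivalent depth / annual precipitation.
standing_depth_mm = 25.0            # global mean precipitable water
precipitation_mm_per_year = 1000.0  # mean annual precipitation (~1 m)

residence_days = standing_depth_mm / precipitation_mm_per_year * 365.25
print(round(residence_days, 1))  # ~9.1 days, consistent with "9 to 10 days"
```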

Some effects of global warming can either enhance (positive feedbacks such as increased water vapor concentration) or inhibit (negative feedbacks) warming.

Global mean water vapor is about 0.25% of the atmosphere by mass and also varies seasonally: its contribution to atmospheric pressure ranges between 2.62 hPa in July and 2.33 hPa in December. IPCC AR6 expresses medium confidence in an increase of total water vapor of about 1–2% per decade; it is expected to increase by around 7% per °C of warming.

Episodes of surface geothermal activity, such as volcanic eruptions and geysers, release variable amounts of water vapor into the atmosphere. Such eruptions may be large in human terms, and major explosive eruptions may inject exceptionally large masses of water exceptionally high into the atmosphere, but as a percentage of total atmospheric water the role of such processes is trivial. The relative concentrations of the various gases emitted by volcanoes vary considerably according to the site and the particular event at any one site. However, water vapor is consistently the commonest volcanic gas; as a rule, it comprises more than 60% of total emissions during a subaerial eruption.

Atmospheric water vapor content is expressed using various measures. These include vapor pressure, specific humidity, mixing ratio, dew point temperature, and relative humidity.
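Given pressure and temperature, these measures are interconvertible. A minimal sketch of two standard conversions, again leaning on the Magnus approximation as a stand-in for a reference formula:

```python
import math

def svp_kpa(t_c: float) -> float:
    """Saturation vapor pressure, kPa (Magnus approximation)."""
    return 0.6112 * math.exp(17.62 * t_c / (243.12 + t_c))

def mixing_ratio_g_kg(vapor_pressure_kpa: float, pressure_kpa: float = 101.325) -> float:
    """Mass of water vapor per kg of dry air; 0.622 is the ratio of the
    molar masses of water and dry air."""
    return 1000.0 * 0.622 * vapor_pressure_kpa / (pressure_kpa - vapor_pressure_kpa)

def dew_point_c(vapor_pressure_kpa: float) -> float:
    """Dew point obtained by inverting the Magnus form."""
    x = math.log(vapor_pressure_kpa / 0.6112)
    return 243.12 * x / (17.62 - x)

e = 0.5 * svp_kpa(30.0)                # vapor pressure at 50% RH and 30 C
print(round(mixing_ratio_g_kg(e), 1))  # ~13.3 g/kg
print(round(dew_point_c(e), 1))        # ~18.4 C
```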

Radar and satellite imaging

MODIS/Terra global mean atmospheric water vapor in atm-cm (centimeters of water in an atmospheric column if it condensed)

Because water molecules absorb microwaves and other radio wave frequencies, water in the atmosphere attenuates radar signals. In addition, atmospheric water will reflect and refract signals to an extent that depends on whether it is vapor, liquid or solid.

Generally, radar signals lose strength progressively the farther they travel through the troposphere. Different frequencies attenuate at different rates, such that some components of air are opaque to some frequencies and transparent to others. Radio waves used for broadcasting and other communication experience the same effect.

Water vapor reflects radar to a lesser extent than do water's other two phases. In the form of drops and ice crystals, water acts as a prism, which it does not do as an individual molecule; however, the existence of water vapor in the atmosphere causes the atmosphere to act as a giant prism.

A comparison of GOES-12 satellite images shows the distribution of atmospheric water vapor relative to the oceans, clouds and continents of the Earth. Vapor surrounds the planet but is unevenly distributed. The image loop on the right shows the monthly average of water vapor content, with units given in centimeters: the precipitable water, or equivalent amount of water that could be produced if all the water vapor in the column were to condense. The lowest amounts of water vapor (0 centimeters) appear in yellow, and the highest amounts (6 centimeters) appear in dark blue. Areas of missing data appear in shades of gray. The maps are based on data collected by the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor on NASA's Aqua satellite.

The most noticeable pattern in the time series is the influence of seasonal temperature changes and incoming sunlight on water vapor. In the tropics, a band of extremely humid air wobbles north and south of the equator as the seasons change. This band of humidity is part of the Intertropical Convergence Zone, where the easterly trade winds from each hemisphere converge and produce near-daily thunderstorms and clouds. Farther from the equator, water vapor concentrations are high in the hemisphere experiencing summer and low in the one experiencing winter. Another pattern that shows up in the time series is that water vapor amounts over land areas decrease more in winter months than adjacent ocean areas do. This is largely because air temperatures over land drop more in the winter than temperatures over the ocean, and water vapor condenses more rapidly in colder air.

As water vapor absorbs light in the visible spectral range, its absorption can be used in spectroscopic applications (such as DOAS) to determine the amount of water vapor in the atmosphere. This is done operationally, e.g. from the Global Ozone Monitoring Experiment (GOME) spectrometers on ERS (GOME) and MetOp (GOME-2). The weaker water vapor absorption lines in the blue spectral range and further into the UV up to its dissociation limit around 243 nm are mostly based on quantum mechanical calculations and are only partly confirmed by experiments.

Lightning generation

Water vapor plays a key role in lightning production in the atmosphere. In cloud physics, clouds are the real generators of static charge found in Earth's atmosphere. The ability of clouds to hold massive amounts of electrical energy is directly related to the amount of water vapor present in the local system.

The amount of water vapor directly controls the permittivity of the air. During times of low humidity, static discharge is quick and easy. During times of higher humidity, fewer static discharges occur. Permittivity and capacitance work hand in hand to produce the megawatt outputs of lightning.

After a cloud has started on its way to becoming a lightning generator, atmospheric water vapor acts as an insulator, decreasing the ability of the cloud to discharge its electrical energy. Over time, if the cloud continues to generate and store more static electricity, the barrier created by the atmospheric water vapor ultimately breaks down, and the stored electrical potential energy is released to a locally oppositely charged region in the form of lightning. The strength of each discharge is directly related to the atmospheric permittivity, the capacitance, and the source's charge-generating ability.

Extraterrestrial

Water vapor is common in the Solar System and, by extension, other planetary systems. Its signature has been detected in the atmosphere of the Sun, occurring in sunspots. The presence of water vapor has been detected in the atmospheres of all seven of the Solar System's other planets, on the Earth's Moon, and on the moons of other planets, although typically in only trace amounts.

Cryogeyser erupting on Jupiter's moon Europa (artist concept)
Artist's illustration of the signatures of water in exoplanet atmospheres detectable by instruments such as the Hubble Space Telescope.

Geological formations such as cryogeysers are thought to exist on the surface of several icy moons, ejecting water vapor due to tidal heating, and may indicate the presence of substantial quantities of subsurface water. Plumes of water vapor have been detected on Jupiter's moon Europa and are similar to plumes of water vapor detected on Saturn's moon Enceladus. Traces of water vapor have also been detected in the stratosphere of Titan. Water vapor has been found to be a major constituent of the atmosphere of the dwarf planet Ceres, the largest object in the asteroid belt. The detection was made using the far-infrared capabilities of the Herschel Space Observatory. The finding is unexpected because comets, not asteroids, are typically considered to "sprout jets and plumes." According to one of the scientists, "The lines are becoming more and more blurred between comets and asteroids." Scientists studying Mars hypothesize that if water moves about the planet, it does so as vapor.

The brilliance of comet tails comes largely from water vapor. On approach to the Sun, the ice that many comets carry sublimes to vapor. Knowing a comet's distance from the Sun, astronomers may deduce the comet's water content from its brilliance.

Water vapor has also been confirmed outside the Solar System. Spectroscopic analysis of HD 209458 b, an extrasolar planet in the constellation Pegasus, provided the first evidence of atmospheric water vapor beyond the Solar System. The aging massive star CW Leonis was found to have a ring of vast quantities of water vapor circling it. A NASA satellite designed to study chemicals in interstellar gas clouds made the discovery with an onboard spectrometer. Most likely, "the water vapor was vaporized from the surfaces of orbiting comets." Other exoplanets with evidence of water vapor include HAT-P-11b and K2-18b.
