
Sunday, February 27, 2022

Petroleum politics

From Wikipedia, the free encyclopedia
 
Argentine president Néstor Kirchner and Venezuelan President Hugo Chávez discuss the Gran Gasoducto del Sur, an energy and trade integration project for South America. They met on November 21, 2005 in Venezuela.

Petroleum politics have been an increasingly important aspect of diplomacy since the rise of the petroleum industry in the Middle East in the early 20th century. As competition continues for a vital resource, the strategic calculations of major and minor countries alike place prominent emphasis on the pumping, refining, transport, sale and use of petroleum products.

Quota agreements

The Achnacarry Agreement or "As-Is Agreement" was an early attempt to restrict petroleum production, signed in Scotland on 17 September 1928. The discovery of the East Texas Oil Field in the 1930s led to a boom in production that caused prices to fall, leading the Railroad Commission of Texas to control production. The Commission retained de facto control of the market until the rise of OPEC in the 1970s.

The Anglo-American Petroleum Agreement of 1944 tried to extend these restrictions internationally, but it was opposed by the industry in the United States, and Franklin Roosevelt withdrew from the deal.

Venezuela was the first country to move towards the establishment of OPEC by approaching Iran, Gabon, Libya, Kuwait and Saudi Arabia in 1949, but OPEC was not set up until 1960, when the United States forced import quotas on Venezuelan and Persian Gulf oil in order to support the Canadian and Mexican oil industries. OPEC first wielded its power with the 1973 oil embargo against the United States and Western Europe.

Oil and international conflict

The term "petro-aggression" has been used to describe the tendency of oil-rich states to instigate international conflicts. There are many examples including: Iraq's invasion of Iran and Kuwait; Libya's repeated incursions into Chad in the 1970s and 1980s; Iran's long-standing suspicion of Western powers. Some scholars have also suggested that oil-rich states are frequently the targets of "resource wars."

Peak oil

In 1956, a Shell geophysicist named M. King Hubbert accurately predicted that U.S. oil production would peak in 1970.

In June 2006, former U.S. president Bill Clinton said in a speech,

"We may be at a point of peak oil production. You may see $100 a barrel oil in the next two or three years, but what still is driving this globalization is the idea that is you cannot possibly get rich, stay rich and get richer if you don’t release more greenhouse gases into the atmosphere. That was true in the industrial era; it is simply factually not true. What is true is that the old energy economy is well organized, financed and connected politically."

In a 1999 speech, Dick Cheney, the US vice president and former CEO of Halliburton (one of the world's largest energy services corporations), said,

"By some estimates there will be an average of two per cent annual growth in global oil demand over the years ahead along with conservatively a three per cent natural decline in production from existing reserves. That means by 2010 we will need on the order of an additional fifty million barrels a day. So where is the oil going to come from?....While many regions of the world offer great oil opportunities, the Middle East with two thirds of the world's oil and the lowest cost, is still where the prize ultimately lies, even though companies are anxious for greater access there, progress continues to be slow."

Cheney went on to argue that the oil industry should become more active in politics:

"Oil is the only large industry whose leverage has not been all that effective in the political arena. Textiles, electronics, agriculture all seem often to be more influential. Our constituency is not only oilmen from Louisiana and Texas, but software writers in Massachusetts and specialty steel producers in Pennsylvania. I am struck that this industry is so strong technically and financially yet not as politically successful or influential as are often smaller industries. We need to earn credibility to have our views heard."

Pipeline diplomacy in the Caspian Sea area

Location of Baku–Tbilisi–Ceyhan pipeline

The Baku–Tbilisi–Ceyhan pipeline was built to transport crude oil, and the Baku–Tbilisi–Erzurum pipeline to transport natural gas, from the western side (Azerbaijani sector) of the Caspian Sea to the Mediterranean Sea, bypassing Russian pipelines and thus Russian control. Following the construction of the pipelines, the United States and the European Union proposed extending them by means of the proposed Trans-Caspian Oil Pipeline and the Trans-Caspian Gas Pipeline under the Caspian Sea to oil and gas fields on the eastern side (Kazakhstan and Turkmenistan sectors) of the Caspian Sea. In 2007, Russia signed agreements with Turkmenistan and Kazakhstan to connect their oil and gas fields to the Russian pipeline system, effectively killing the undersea route.

China has completed the Kazakhstan–China oil pipeline, which runs from the Kazakhstan oil fields to the Alashankou–Dushanzi crude oil pipeline in China, and is working on the Kazakhstan–China gas pipeline, which will connect the Kazakhstan gas fields to China's West–East Gas Pipeline.

Politics of oil nationalization

Several countries have nationalised foreign-run oil businesses, often without compensating investors. Enrique Mosconi, director of the Argentine state-owned oil company Yacimientos Petrolíferos Fiscales (YPF), advocated oil nationalization among Latin American countries in the late 1920s. YPF was the first state-owned oil company in the world, preceding the French Compagnie française des pétroles (CFP, French Company of Petroleums), created in 1924 by the conservative Raymond Poincaré. Nationalization was achieved in Mexico during Lázaro Cárdenas's rule, with the Expropiación petrolera of 1938.

Similarly, Venezuela nationalized its oil industry in 1976.

Politics of alternative fuels

Vinod Khosla, a well-known investor in IT firms and alternative energy, has argued that the political interests of environmental advocates, agricultural businesses, energy security advocates (such as ex-CIA director James Woolsey) and automakers are all aligned in favor of increased ethanol production. He pointed out that from 2003 to 2006, ethanol fuel in Brazil replaced 40% of the country's gasoline consumption, while flex-fuel vehicles went from 3% of car sales to 70%. Brazilian ethanol, which is produced from sugarcane, reduces greenhouse gas emissions by 60–80% (versus about 20% for corn-based ethanol). Khosla also said that ethanol was about 10% cheaper per given distance. There are currently ethanol subsidies in the United States, but they are all blender's credits, meaning the oil refineries receive the subsidies rather than the farmers; there are also indirect subsidies through payments to farmers to produce corn. Khosla says that after one of his presentations in Davos, a senior Saudi oil official approached him and threatened: "If biofuels start to take off, we will drop the price of oil." Khosla has since recommended that oil be taxed if it drops below $40 per barrel, in order to counter such price manipulation.

Ex-CIA director James Woolsey and U.S. Senator Richard Lugar are also vocal proponents of ethanol.

In 2005, Sweden announced plans to end its dependence on fossil fuels by the year 2020.

It is argued that international climate policy and unconventional oil and gas developments may change the balance of power between petroleum exporting and importing countries with major negative implications expected for the exporting states. From around 2015 onwards, there was increasing discussion about whether the geopolitics of oil and gas would be replaced by the geopolitics of renewable energy resources and critical materials for renewable energy technologies.

Some authors argue that compared to the geopolitics of fossil fuels, renewable energy may cause more small-scale conflicts but reduce the risk of large inter-state conflicts.

Geopolitics of oil money

Multibillion-dollar inflows and outflows of petroleum money have worldwide macroeconomic consequences, and major oil exporters can gain substantial influence from their petrodollar recycling activities.

Key oil producing countries

Canada

As development of the Alberta oil sands, deep-sea drilling in the North Atlantic and the prospect of Arctic oil continue to grow, Canada is becoming an increasingly important global oil exporter. Three major pipelines are currently under proposal that would ship oil to Pacific, Atlantic and Gulf ports. These projects have stirred internal controversy, receiving fierce opposition from First Nations groups and environmentalists.

Iran

The discovery of oil in 1908 at Masjed Soleiman in Iran initiated the quest for oil in the Middle East, and the Anglo-Iranian Oil Company (AIOC) was founded in 1909. In 1951, Iran nationalized its oil fields, initiating the Abadan Crisis. In 1953, the United States and Great Britain punished Iran by arranging a coup against its democratically elected prime minister, Mosaddeq, arresting him and bringing the former Shah's son back to power as a dictator. Iran exports oil to China and Russia.

Iraq

A U.S. soldier stands guard duty near a burning oil well in the Rumaila oil field, Iraq, April 2003

Iraq holds the world's second-largest proven oil reserves, with increasing exploration expected to enlarge them beyond 200 billion barrels (3.2×10¹⁰ m³) of "high-grade crude, extraordinarily cheap to produce." Organizations such as the Global Policy Forum (GPF) have asserted that Iraq's oil is "the central feature of the political landscape" there, and that as a result of the 2003 invasion, "'friendly' companies expect to gain most of the lucrative oil deals that will be worth hundreds of billions of dollars in profits in the coming decades." According to GPF, U.S. influence over the 2005 Constitution of Iraq has made sure it "contains language that guarantees a major role for foreign companies."

Mexico

Mexico is the world's seventh-largest producer of petroleum, and its economy is largely oil-based. Though Mexico has gradually explored alternative energy sources, oil remains crucial, recently generating 10% of the country's revenue.

Before 1938, all petroleum companies in Mexico were foreign-based, often from the United States or Europe. The petroleum industry was nationalized in 1938 by then-president Lázaro Cárdenas, creating PEMEX, and Mexico's oil industry remains heavily nationalized. Though oil production has fallen in recent years, Mexico still holds seventh place.

Norway

Although Norway relies heavily on natural gas and oil for export income, the country consumes almost none of the petroleum resources it produces. In 2017, Norway ranked third, behind Russia and Qatar, among the world's largest natural gas exporters, and was the eighth-largest exporter of crude oil. These industries are a vital part of Norway's economy, making up nearly 50% of the country's total export value and accounting for 17% of its GDP.

Norway, however, generates about 98% of its electricity from renewable sources, with a heavy emphasis on hydroelectric power. This development of renewable energy allows Norway to export its non-renewable energy at a profit. "Nearly all oil and gas produced on the Norwegian shelf is exported, and combined, oil and gas equals about half of the total value of Norwegian exports of goods. This makes oil and gas the most important export commodities in the Norwegian economy." This has put Norway in a unique position: it aspires to lead the world in combating climate change while still drilling in the North Sea, Norwegian Sea, and Barents Sea.

Several Norwegian environmental groups, such as Greenpeace Norway and Young Friends of the Earth Norway, have sued the Norwegian government over the opening of new areas of the Arctic to oil and natural gas development. Even as Norway pursues a green future, its government seeks ways to justify a place for petroleum in a low-carbon world, continuing to pass measures that bolster the oil and natural gas sectors while simultaneously passing legislation to further its environmental agenda. Continued dependence on oil production has strong support in the government: "...the petroleum policy in Norway has been supported by the largest political parties across the left-right cleavage in Norwegian politics. Although there have been tensions on certain issues, the Labour Party, the Conservative Party, and the Progress Party make up the majority of support for the current petroleum policy in Parliament."

Norway has avoided the "oil curse," or Dutch disease, that many oil-producing countries have experienced, in part because it began to harvest petroleum resources at a time when its government regulation was well developed and the state was already in an economically strong position: "Norway had the advantage of entering its oil era with a mature, open democracy as well as bureaucratic institutions with experience regulating other natural resource industries (hydropower generation, fishing, and mining for example)".

As Norway began to exploit its petroleum resources, the government took steps to ensure that the natural resource industry did not deplete other industries in Norway, funneling profits from the state-owned operations into a pension fund known as the Government Pension Fund Global (GPFG). The GPFG is the world's largest sovereign wealth fund and was established to invest the surplus revenues of Norway's petroleum sector. "The Norwegian government receives these funds from their market shares within oil industries, such as their two-thirds share of Statoil, and allocates it through their government-controlled domestic economy."

Nigeria

Petroleum in Nigeria was discovered in 1955 at Oloibiri in the Niger Delta. High oil prices have been the driving force behind Nigeria's economic growth, making the Nigerian economy the largest in Africa, surpassing both Egypt and South Africa, and the 24th largest in the world. The Nigerian economy is heavily dependent on the oil sector, which accounts for 98% of export earnings and 83% of federal government revenues, as well as generating 14% of GDP.

Even with its substantial oil wealth, Nigeria ranks as one of the poorest countries in the world, with a $1,000 per capita income and more than 70 percent of the population living in poverty. In October 2005, the 15-member Paris Club announced that it would cancel 60 percent of the debt owed by Nigeria; Nigeria, however, still had to pay $12.4 billion in arrears and meet other conditions. In March 2006, phase two of the Paris Club agreement added a further 34 percent debt cancellation, with Nigeria responsible for paying back any remaining eligible debts to the lending nations. The International Monetary Fund (IMF), which praised the Nigerian government for adopting tighter fiscal policies, is allowed to monitor Nigeria without having to disburse loans to the country.

Russia

On 21 May 2014, Russia and China signed a $400 billion gas deal for natural gas supplies via the Eastern Route.

High-priced oil allowed the Soviet Union to subsidize the struggling economies of the Soviet bloc for a time, and the loss of petrodollar income during the 1980s oil glut contributed to the bloc's collapse in 1989.

Saudi Arabia

In 1973, Saudi Arabia and other Arab nations imposed an oil embargo against the United States, United Kingdom, Japan and other Western nations which supported Israel in the Yom Kippur War of October 1973. The embargo caused an oil crisis with many short- and long-term effects on global politics and the global economy.

Saudi Arabia is an oil-based economy with strong government controls over major economic activities. It possesses the world's largest known oil reserves, about 25% of the world's proven total, and produces more oil than any other country. As of 2005, the Ghawar field accounted for about half of Saudi Arabia's total oil production capacity.

Saudi Arabia ranks as the largest exporter of petroleum and plays a leading role in OPEC; its decisions to raise or cut production almost immediately impact world oil prices. It is perhaps the best example of a contemporary energy superpower in terms of power and influence on the global stage, owing to its energy reserves and its production of natural gas as well as oil. Saudi Arabia is often referred to as the world's only "oil superpower".

It has been suggested that the Iran–Saudi Arabia proxy conflict was a powerful influence on the Saudi decision to launch the price war in 2014, as was Cold War rivalry between the United States and Russia. Larry Elliott argued that "with the help of its Saudi ally, Washington is trying to drive down the oil price by flooding an already weak market with crude. As the Russians and the Iranians are heavily dependent on oil exports, the assumption is that they will become easier to deal with." The vice president of Rosneft, Russia's largest oil company, accused Saudi Arabia of conspiring against Russia.

United States

U.S. President Donald Trump with Saudi Crown Prince Mohammad bin Salman in June 2019

In 1998, about 40% of the energy consumed by the United States came from oil. The United States is responsible for 25% of the world's oil consumption, while having only 3% of the world's proven oil reserves and less than 5% of the world's population. In January 1980, President Jimmy Carter explicitly declared: "An attempt by any outside force to gain control of the Persian Gulf region will be regarded as an assault on the vital interests of the United States."

Venezuela

According to the Oil and Gas Journal (OGJ), Venezuela has 77.2 billion barrels (1.227×10¹⁰ m³) of proven conventional oil reserves, the largest of any country in the Western Hemisphere. In addition it has non-conventional oil deposits similar in size to Canada's: at approximately 1,200 billion barrels (1.9×10¹¹ m³), they roughly equal the world's reserves of conventional oil. About 267 billion barrels (4.24×10¹⁰ m³) of this may be producible at current prices using current technology. Venezuela's Orinoco tar sands are less viscous than Canada's Athabasca oil sands, meaning they can be produced by more conventional means, but they are buried deeper, meaning they cannot be extracted by surface mining. In an attempt to have these extra-heavy oil reserves recognized by the international community, Venezuela has moved to add them to its conventional reserves, for nearly 350 billion barrels (5.6×10¹⁰ m³) of total oil reserves. This would give it the largest oil reserves in the world, ahead even of Saudi Arabia.

Venezuela nationalized its oil industry in 1975–1976, creating Petróleos de Venezuela S.A. (PDVSA), the country's state-run oil and natural gas company. Along with being Venezuela's largest employer, PDVSA accounts for about one-third of the country's GDP, 50 percent of the government's revenue and 80 percent of Venezuela's export earnings. In recent years, under the influence of President Chávez, the Venezuelan government has reduced PDVSA's previous autonomy and amended the rules regulating the country's hydrocarbons sector.

Proven world oil reserves, 2013

In the 1990s, Venezuela opened its upstream oil sector to private investment. This collection of policies, called apertura, facilitated the creation of 32 operating service agreements (OSAs) with 22 separate foreign oil companies, including international oil majors like Chevron, BP, Total, and Repsol-YPF. Hugo Chávez, the President of Venezuela, sharply diverged from previous administrations' economic policies: PDVSA is now used as a cash cow and an employer of last resort, and foreign oil businesses were nationalised without the government paying compensation.

Estimates of Venezuelan oil production vary. Venezuela claims its oil production is over 3 million barrels per day (480,000 m³/d), but oil industry analysts and the U.S. Energy Information Administration believe it to be much lower. In addition to other reporting irregularities, much of its production is extra-heavy oil, which may or may not be included with conventional oil in the various production estimates. The U.S. Energy Information Administration estimated Venezuela's oil production in December 2006 at only 2.5 million barrels per day (400,000 m³/d), a 24% decline from its peak of 3.3 million in 1997.

Recently, Venezuela has pushed the creation of regional oil initiatives for the Caribbean (Petrocaribe), the Andean region (Petroandino), South America (Petrosur), and Latin America (Petroamerica). The initiatives include assistance for oil developments, investments in refining capacity, and preferential oil pricing. The most developed of these is the Petrocaribe initiative, with 13 nations signing a preliminary agreement in 2005. Under Petrocaribe, Venezuela offers crude oil and petroleum products to Caribbean nations on preferential terms and prices; Jamaica was the first nation to sign on, in August 2005.

Saturday, February 26, 2022

Population history of Indigenous peoples of the Americas

Contemporary illustration of the 1868 Washita Massacre by the 7th Cavalry against Black Kettle's band of Cheyennes, during the American Indian Wars. Violence and conflict with colonists were also important causes of the decline of certain indigenous American populations since the 16th century.

Population figures for the Indigenous people of the Americas prior to colonization have proven difficult to establish. Scholars rely on archaeological data and written records from European settlers. By the end of the 20th century most scholars gravitated toward an estimate of around 50 million—with some historians arguing for an estimate of 100 million or more.

In an effort to circumvent the hold the Ottoman Empire held on the overland trade routes to East Asia and the hold that the Aeterni regis granted to Portugal on maritime routes via the African coast and the Indian Ocean, the monarchs of the nascent Spanish Empire decided to fund Columbus' voyage in 1492, which eventually led to the establishment of settler-colonial states and the migration of millions of Europeans to the Americas. The population of African and European peoples in the Americas grew steadily starting in 1492, while at the same time the indigenous population began to plummet. Eurasian diseases such as influenza, pneumonic plagues, and smallpox devastated the Native Americans, who did not have immunity to them. Conflict and outright warfare with Western European newcomers and other American tribes further reduced populations and disrupted traditional societies. The extent and causes of decline have been characterized as genocide by some scholars.

Population overview

Natives of North America.
 
Natives of South America.
 
Columbus landing on Hispaniola, Dec. 6, 1492; greeted by Arawak Indians. Theodore de Bry

Pre-Columbian population figures are difficult to estimate due to the fragmentary nature of the evidence; estimates range from 8 to 112 million. Scholars have varied widely on the estimated size of the indigenous populations prior to colonization and on the effects of European contact. Estimates are made by extrapolations from small bits of data. In 1976, geographer William Denevan used the existing estimates to derive a "consensus count" of about 54 million people. Nonetheless, more recent estimates still range widely. In 1992, Denevan suggested that the total population was approximately 53.9 million, distributed regionally as approximately 3.8 million for the United States and Canada, 17.2 million for Mexico, 5.6 million for Central America, 3 million for the Caribbean, 15.7 million for the Andes and 8.6 million for lowland South America. Historian David Stannard estimates that the extermination of indigenous peoples took the lives of 100 million people: "...the total extermination of many American Indian peoples and the near-extermination of others, in numbers that eventually totaled close to 100,000,000." A 2019 study estimates that the pre-Columbian indigenous population was more than 60 million people but dropped to 6 million by 1600, based on a drop in atmospheric CO2 during that period.

The indigenous population of the Americas in 1492 was not necessarily at a high point and may actually have been in decline in some areas. Indigenous populations in most areas of the Americas reached a low point by the early 20th century.

Using an estimate of approximately 37 million people in Mexico, Central and South America in 1492 (including 6 million in the Aztec Empire, 5–10 million in the Mayan States, 11 million in what is now Brazil, and 12 million in the Inca Empire), the lowest estimates give a death toll due to disease of 80% by the end of the 17th century (nine million people in 1650). Latin America would match its 15th-century population early in the 19th century; it numbered 17 million in 1800, 30 million in 1850, 61 million in 1900, 105 million in 1930, 218 million in 1960, 361 million in 1980, and 563 million in 2005.

In the last three decades of the 16th century, the population of present-day Mexico dropped to about one million people. The Maya population is today estimated at six million, which is about the same as at the end of the 15th century, according to some estimates. In what is now Brazil, the indigenous population declined from a pre-Columbian high of an estimated four million to some 300,000. Over 60 million Brazilians possess at least one Native South American ancestor, according to a mitochondrial DNA study.

While it is difficult to determine exactly how many Natives lived in North America before Columbus, estimates range from the 3.8 million mentioned above, through 7 million, to a high of 18 million. The aboriginal population of Canada during the late 15th century is estimated to have been between 500,000 and two million. Repeated outbreaks of Old World infectious diseases such as influenza, measles and smallpox (to which they had no natural immunity) were the main cause of depopulation. Combined with other factors, such as dispossession from European/Canadian settlements and numerous violent conflicts, this resulted in a forty- to eighty-percent aboriginal population decrease after contact. For example, during the late 1630s, smallpox killed over half of the Wyandot (Huron), who controlled most of the early North American fur trade in what became Canada; they were reduced to fewer than 10,000 people.

The population debate has often had ideological underpinnings. Low estimates were sometimes reflective of European notions of cultural and racial superiority. Historian Francis Jennings argued, "Scholarly wisdom long held that Indians were so inferior in mind and works that they could not possibly have created or sustained large populations." In 1998, Africanist Historian David Henige said many population estimates are the result of arbitrary formulas applied from unreliable sources.

Estimations

Comparative table of estimates of the pre-Columbian population (in thousands)

| Author | Date | USA and Canada | Mexico | Mesoamerica | Caribbean | Andes | Patagonia and Amazonia | Total |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Sapper | 1924 | 2,000–3,000 | 12,000–15,000 | 5,000–6,000 | 3,000–4,000 | 12,000–15,000 | 3,000–5,000 | 37,000–48,500 |
| Kroeber | 1939 | 900 | 3,200 | 100 | 200 | 3,000 | 1,000 | 8,400 |
| Steward | 1949 | 1,000 | 4,500 | 740 | 220 | 6,130 | 2,900 | 15,490 |
| Rosenblat | 1954 | 1,000 | 4,500 | 800 | 300 | 4,750 | 2,030 | 13,380 |
| Dobyns | 1966 | 9,800–12,250 | 30,000–37,500 | 10,800–13,500 | 440–550 | 30,000–37,500 | 9,000–11,250 | 90,040–112,550 |
| Ubelaker | 1988 | 1,213–2,639 | – | – | – | – | – | – |
| Denevan | 1992 | 3,790 | 17,174 | 5,625 | 3,000 | 15,696 | 8,619 | 53,904 |
| Snow | 2001 | 3,440 | – | – | – | – | – | – |
| Alchon | 2003 | 3,500 | 16,000–18,000 | 5,000–6,000 | 2,000–3,000 | 13,000–15,000 | 7,000–8,000 | 46,500–53,500 |
| Thornton | 2005 | 7,000 | – | – | – | – | – | – |

Estimations by tribe

Population size for Native American tribes is difficult to state definitively, but at least one writer has made estimates, often based on an assumed proportion of warriors to total population for the tribe. Typical proportions were five people per warrior and at least 2 to 5 warriors (therefore at least 10 to 25 people) per lodge, cabin or house.

The total peak population size only for the tribes listed in this table is 2,891,428.

Pre-Columbian Americas

Statue of Cuauhtemoc in el Zócalo, Mexico City.
 
Ataw Wallpa portrait.

Genetic diversity and population structure in the American land mass have been analyzed using DNA micro-satellite markers (genotype) sampled from North, Central, and South America, compared against similar data from other indigenous populations worldwide. The Amerindian populations show lower genetic diversity than populations from other continental regions. Genetic diversity decreases with geographic distance from the Bering Strait, as does genetic similarity to Siberian populations from Alaska (the genetic entry point). A higher level of diversity and a lower level of population structure are observed in western South America compared to eastern South America. The relative lack of differentiation between Mesoamerican and Andean populations implies that coastal routes were easier than inland routes for migrating peoples (Paleo-Indians) to traverse. The overall pattern suggests that the Americas were colonized by a small number of individuals (effective size of about 70–250), whose population then grew by a factor of 10 over 800–1,000 years. The data also show that there have been genetic exchanges between Asia, the Arctic and Greenland since the initial peopling of the Americas. A study in early 2018 suggests that the effective population size of the original founding population of Native Americans was about 250 people.

Depopulation from disease

Sixteenth-century Aztec drawings of victims of smallpox (above) and measles (below)
Graph demonstrating the population collapse in Central Mexico brought on by successive epidemics in the early colonial period.

Early explanations for the population decline of the Indigenous peoples of the Americas include the brutal practices of the Spanish conquistadores, as recorded by the Spaniards themselves, such as the encomienda system, which was ostensibly set up to protect people from warring tribes as well as to teach them the Spanish language and the Catholic religion, but in practice was tantamount to serfdom and slavery. The most notable account was that of the Dominican friar Bartolomé de las Casas, whose writings vividly depict Spanish atrocities committed in particular against the Taínos. The second European explanation was a perceived divine approval, in which God removed the natives as part of His "divine plan" to make way for a new Christian civilization. Many Native Americans viewed their troubles in a religious framework within their own belief systems.

According to later academics such as Noble David Cook, a community of scholars began "quietly accumulating piece by piece data on early epidemics in the Americas and their relation to the subjugation of native peoples." Scholars like Cook believe that widespread epidemic disease, to which the natives had no prior exposure or resistance, was the primary cause of the massive population decline of the Native Americans. One of the most devastating diseases was smallpox, but other deadly diseases included typhus, measles, influenza, bubonic plague, cholera, malaria, tuberculosis, mumps, yellow fever and pertussis, which were chronic in Eurasia.

However, recent scholarship has shown a connection between colonial violence and disease. Historian Andrés Reséndez of the University of California, Davis asserts that these scholarly studies have shown that the conditions created by colonialism, such as forced labor and the removal of Indigenous peoples from traditional homelands and medicines, alongside introduced disease, are the reasons for depopulation. In this way, "slavery has emerged as a major killer" of the indigenous populations of the Caribbean between 1492 and 1550, as it set the conditions for diseases such as smallpox, influenza and malaria to flourish. He posits that, unlike the populations of Europe who rebounded following the Black Death, no such rebound occurred for the Indigenous populations. He concludes that, even though the Spanish were aware of deadly diseases such as smallpox, there is no mention of them in the New World until 1519, meaning perhaps they did not spread as fast as initially believed, and that unlike Europeans, the Indigenous populations were subjected to brutal forced labor on a massive scale. Anthropologist Jason Hickel estimates that a third of Arawak workers died every six months from lethal forced labor in the mines.

Similarly, historian Jeffrey Ostler at the University of Oregon has argued that population collapses in the Americas throughout colonization were not mainly due to lack of Native immunity to European disease. Instead, he claims that "When severe epidemics did hit, it was often less because Native bodies lacked immunity than because European colonialism disrupted Native communities and damaged their resources, making them more vulnerable to pathogens." In specific regard to Spanish colonization of northern Florida and southeastern Georgia, Native peoples there "were subject to forced labor and, because of poor living conditions and malnutrition, succumbed to wave after wave of unidentifiable diseases." Further, in relation to British colonization in the Northeast, Algonquian-speaking tribes in Virginia and Maryland "suffered from a variety of diseases, including malaria, typhus, and possibly smallpox." These diseases were not solely a case of Native susceptibility, however, because "as colonists took their resources, Native communities were subject to malnutrition, starvation, and social stress, all making people more vulnerable to pathogens. Repeated epidemics created additional trauma and population loss, which in turn disrupted the provision of healthcare." Such conditions would continue, alongside rampant disease in Native communities, throughout colonization, the formation of the United States, and multiple forced removals, as Ostler explains that many scholars "have yet to come to grips with how U.S. expansion created conditions that made Native communities acutely vulnerable to pathogens and how severely disease impacted them. ... Historians continue to ignore the catastrophic impact of disease and its relationship to U.S. policy and action even when it is right before their eyes."

Historian David Stannard says that by "focusing almost entirely on disease ... contemporary authors increasingly have created the impression that the eradication of those tens of millions of people was inadvertent—a sad, but both inevitable and "unintended consequence" of human migration and progress," and asserts that their destruction "was neither inadvertent nor inevitable," but the result of microbial pestilence and purposeful genocide working in tandem.

The European colonization of the Americas resulted in the deaths of so many people it contributed to climatic change and temporary global cooling, according to scientists from University College London. A century after the arrival of Christopher Columbus, some 90% of indigenous Americans had perished from "wave after wave of disease", along with mass slavery and war, in what researchers have described as the "great dying". According to one of the researchers, UCL Geography Professor Mark Maslin, the large death toll also boosted the economies of Europe: "the depopulation of the Americas may have inadvertently allowed the Europeans to dominate the world. It also allowed for the Industrial Revolution and for Europeans to continue that domination."

Photograph from 1892 of a pile of American bison skulls in Detroit waiting to be ground for fertilizer or charcoal.
 
"Rath & Wright's buffalo hide yard in 1878, showing 40,000 buffalo hides, Dodge City, Kansas."

Biological warfare

When Old World diseases were first carried to the Americas at the end of the fifteenth century, they spread throughout the southern and northern hemispheres, leaving the indigenous populations in near ruins. No evidence has been discovered that the earliest Spanish colonists and missionaries deliberately attempted to infect the American natives, and some efforts were made to limit the devastating effects of disease before it killed off what remained of the forced labor supply under the encomienda system. The cattle introduced by the Spanish contaminated various water reserves which Native Americans dug in the fields to accumulate rainwater. In response, the Franciscans and Dominicans created public fountains and aqueducts to guarantee access to drinking water. But when the Franciscans lost their privileges in 1572, many of these fountains were no longer guarded, and so deliberate well poisoning may have happened. Although no proof of such poisoning has been found, some historians believe the decrease of the population correlates with the end of religious orders' control of the water.

In the centuries that followed, accusations and discussions of biological warfare were common. Well-documented incidents involving both threats and acts of deliberate infection are very rare, but such acts may have occurred more frequently than scholars have previously acknowledged. Many of the instances likely went unreported, and it is possible that documents relating to such acts were deliberately destroyed or sanitized. By the middle of the 18th century, colonists had the knowledge and technology to attempt biological warfare with the smallpox virus. They well understood the concept of quarantine, that contact with the sick could infect the healthy with smallpox, and that those who survived the illness would not be infected again. Whether the threats were carried out, or how effective individual attempts were, is uncertain.

One such threat was delivered by fur trader James McDougall, who is quoted as saying to a gathering of local chiefs, "You know the smallpox. Listen: I am the smallpox chief. In this bottle I have it confined. All I have to do is to pull the cork, send it forth among you, and you are dead men. But this is for my enemies and not my friends." Likewise, another fur trader threatened Pawnee Indians that if they didn't agree to certain conditions, "he would let the smallpox out of a bottle and destroy them." The Reverend Isaac McCoy was quoted in his History of Baptist Indian Missions as saying that the white men had deliberately spread smallpox among the Indians of the southwest, including the Pawnee tribe, and the havoc it made was reported to General Clark and the Secretary of War. Artist and writer George Catlin observed that Native Americans were also suspicious of vaccination, "They see white men urging the operation so earnestly they decide that it must be some new mode or trick of the pale face by which they hope to gain some new advantage over them." So great was the distrust of the settlers that the Mandan chief Four Bears denounced the white man, whom he had previously treated as brothers, for deliberately bringing the disease to his people.

During the Seven Years' War, British militia took blankets from their smallpox hospital and gave them as gifts to two neutral Lenape Indian dignitaries during a peace settlement negotiation, according to the entry in the Captain's ledger, "To convey the Smallpox to the Indians". In the following weeks, the high commander of the British forces in North America conspired with his Colonel to "Extirpate this Execrable Race" of Native Americans, writing, "Could it not be contrived to send the small pox among the disaffected tribes of Indians? We must on this occasion use every stratagem in our power to reduce them." His Colonel agreed to try. Most scholars have asserted that the 1837 Great Plains smallpox epidemic was "started among the tribes of the upper Missouri River by failure to quarantine steamboats on the river", and Captain Pratt of the St. Peter "was guilty of contributing to the deaths of thousands of innocent people. The law calls his offense criminal negligence. Yet in light of all the deaths, the almost complete annihilation of the Mandans, and the terrible suffering the region endured, the label criminal negligence is benign, hardly befitting an action that had such horrendous consequences." However, some sources attribute the 1836–40 epidemic to the deliberate communication of smallpox to Native Americans, with historian Ann F. Ramenofsky writing, "Variola Major can be transmitted through contaminated articles such as clothing or blankets. In the nineteenth century, the U.S. Army sent contaminated blankets to Native Americans, especially Plains groups, to control the Indian problem." Well into the 20th century, deliberate infection attacks continued, as Brazilian settlers and miners transported infections intentionally to the native groups whose lands they coveted.

Vaccination

After Edward Jenner's 1796 demonstration that the smallpox vaccination worked, the technique became better known and smallpox became less deadly in the United States and elsewhere. Many colonists and natives were vaccinated, although, in some cases, officials tried to vaccinate natives only to discover that the disease was too widespread to stop. At other times, trade demands led to broken quarantines. In other cases, natives refused vaccination because of suspicion of whites. The first international healthcare expedition in history was the Balmis expedition which had the aim of vaccinating indigenous peoples against smallpox all along the Spanish Empire in 1803. In 1831, government officials vaccinated the Yankton Sioux at Sioux Agency. The Santee Sioux refused vaccination and many died.

Depopulation from European conquest

War and violence

An 1899 chromolithograph of U.S. cavalry pursuing American Indians, artist unknown.
 
Storming of the Teocalli by Cortez and His Troops by Emanuel Leutze
 
An 1899 chromolithograph from the Werner Company of Akron, Ohio titled Custer Massacre at Big Horn, Montana – June 25, 1876.

While epidemic disease was a leading factor of the population decline of the American indigenous peoples after 1492, there were other contributing factors, all of them related to European contact and colonization. One of these factors was warfare. According to demographer Russell Thornton, although many lives were lost in wars over the centuries, and war sometimes contributed to the near extinction of certain tribes, warfare and death by other violent means was a comparatively minor cause of overall native population decline.

According to the U.S. Bureau of the Census in 1894, the wars between the government and the Indigenous peoples over the previous 100 years had numbered more than 40. These wars cost the lives of approximately 19,000 white people and about 30,000 Indians, including men, women, and children. The Bureau conservatively estimated that the number of Native people killed or wounded was actually around fifty percent higher than recorded.

There is some disagreement among scholars about how widespread warfare was in pre-Columbian America, but there is general agreement that war became deadlier after the arrival of the Europeans and their firearms. The infrastructure of South and Central America allowed thousands of European conquistadors and tens of thousands of their Indian auxiliaries to attack the dominant indigenous civilizations. Empires such as the Inca depended on a highly centralized administration for the distribution of resources; disruption caused by war and colonization hampered the traditional economy and possibly led to shortages of food and materials. Across the Western Hemisphere, war with various Native American civilizations produced alliances based on both necessity and economic prosperity, and resulted in mass-scale intertribal warfare. European colonization of North America also contributed to a number of wars between Native Americans, who fought over which of them should have first access to new technology and weaponry, as in the Beaver Wars.

Exploitation

D'Albertis Castle, Genoa, Museum of World Cultures

Some Spaniards objected to the encomienda system of labor, notably Bartolomé de las Casas, who insisted that the indigenous people were humans with souls and rights. Due to many revolts and military encounters, Emperor Charles V helped relieve the strain on both the native laborers and the Spanish vanguards probing the Caribana for military and diplomatic purposes. The New Laws were promulgated in Spain in 1542 to protect isolated natives, but the abuses in the Americas were never entirely or permanently abolished. The Spanish also employed the pre-Columbian draft system called the mita, and treated their subjects as something between slaves and serfs. Serfs stayed to work the land; slaves were exported to the mines, where large numbers of them died. In other areas the Spaniards replaced the ruling Aztecs and Incas and divided the conquered lands among themselves, ruling as the new feudal lords, with frequent but unsuccessful lobbying of the viceroys of the Spanish crown to pay Tlaxcalan war indemnities. The infamous Bandeirantes from São Paulo, adventurers mostly of mixed Portuguese and native ancestry, penetrated steadily westward in their search for Indian slaves. Serfdom existed as such in parts of Latin America well into the 19th century, past independence.

Historian Andrés Reséndez argues that even though the Spanish were aware of the spread of smallpox, they made no mention of it until 1519, a quarter century after Columbus arrived in Hispaniola. Instead he contends that enslavement in gold and silver mines was the primary reason why the Native American population of Hispaniola dropped so significantly, and that even though disease was a factor, the native population would have rebounded the same way Europeans did following the Black Death were it not for the constant enslavement they were subject to. He further contends that enslavement of Native Americans was in fact the primary cause of their depopulation in Spanish territories; that the majority of Indians enslaved were women and children, compared to the enslavement of Africans, which mostly targeted adult males, and that they were in turn sold at a 50% to 60% higher price; and that between 2,462,000 and 4,985,000 Amerindians were enslaved between Columbus's arrival and 1900.

Massacres

Mass grave of Lakota dead after the 1890 Wounded Knee Massacre.
 
  • The conquest of Mexico.
  • The Pequot War in early New England.
  • In mid-19th century Argentina, post-independence leaders Juan Manuel de Rosas and Julio Argentino Roca engaged in what they presented as a "Conquest of the Desert" against the natives of the Argentinian interior, leaving over 1,300 indigenous dead.
  • While some California tribes were settled on reservations, others were hunted down and massacred by 19th century American settlers. It is estimated that at least 9,400 to 16,000 California Indians were killed by non-Indians, mostly occurring in more than 370 massacres (defined as the "intentional killing of five or more disarmed combatants or largely unarmed noncombatants, including women, children, and prisoners, whether in the context of a battle or otherwise").

Displacement and disruption

Throughout history, Indigenous people have been subjected to repeated forced removal from their land. Beginning in the 1830s, an estimated 100,000 Indigenous people in the United States were relocated in what became known as the "Trail of Tears". The tribes affected by this removal were the Five Civilized Tribes: the Cherokee, Creek, Chickasaw, Choctaw, and Seminole. The Treaty of New Echota stated that the United States "would give Cherokee land west of the Mississippi in exchange for $5,000,000". According to Jeffrey Ostler, "Of the 80,000 Native people who were forced west from 1830 into the 1850s, between 12,000 and 17,000 perished." Ostler states that "the large majority died of interrelated factors of starvation, exposure and disease".

In addition to the removal of the Southern Tribes, there were multiple other removals of Northern Tribes also known as "Trails of Tears." For example, "In the free labor states of the North, federal and state officials, supported by farmers, speculators and business interests, evicted Shawnees, Delawares, Senecas, Potawatomis, Miamis, Wyandots, Ho-Chunks, Ojibwes, Sauks and Meskwakis." These Nations were moved West of the Mississippi into what is now known as Eastern Kansas, and numbered 17,000 on arrival. According to Ostler, "by 1860, their numbers had been cut in half" due to low fertility, high infant mortality, and increased disease caused by conditions such as polluted drinking water, few resources, and social stress.

Ostler also writes that the areas to which Northern tribes were removed were already inhabited: "The areas west of the Mississippi River were home to other Indigenous nations— Osages, Kanzas, Omahas, Ioways, Otoes and Missourias. To make room for thousands of people from the East, the government dispossessed these nations of much of their lands." Ostler writes that in 1840, when Northern Nations were moved onto their land, "The combined population of these western nations was 9,000 ... 20 years later, it had fallen to 6,000."

Later apologies from government officials

On 8 September 2000, the head of the United States Bureau of Indian Affairs (BIA) formally apologized for the agency's participation in the ethnic cleansing of Western tribes. In a speech before representatives of Native American peoples in June 2019, California governor Gavin Newsom apologized for the "California Genocide." Newsom said, "That's what it was, a genocide. No other way to describe it. And that's the way it needs to be described in the history books."

Causal inference

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Causal_inference

Causal inference is the process of determining the independent, actual effect of a particular phenomenon that is a component of a larger system. The main difference between causal inference and inference of association is that causal inference analyzes the response of an effect variable when a cause of the effect variable is changed. The science of why things occur is called etiology. Causal inference is said to provide the evidence of causality theorized by causal reasoning.

Causal inference is widely studied across all sciences, and methodological innovations designed to determine causality have proliferated in recent decades. Causal inference remains especially difficult where experimentation is difficult or impossible, which is common throughout most sciences.

The approaches to causal inference are broadly applicable across all types of scientific disciplines, and many methods of causal inference that were designed for certain disciplines have found use in other disciplines. This article outlines the basic process behind causal inference and details some of the more conventional tests used across different disciplines; however, this should not be mistaken as a suggestion that these methods apply only to those disciplines, merely that they are the most commonly used in that discipline.

Causal inference is difficult to perform and there is significant debate amongst scientists about the proper way to determine causality. Despite other innovations, there remain concerns of misattribution by scientists of correlative results as causal, of the usage of incorrect methodologies by scientists, and of deliberate manipulation by scientists of analytical results in order to obtain statistically significant estimates. Particular concern is raised in the use of regression models, especially linear regression models.

Definition

Inferring the cause of something has been described as:

  • "...reason[ing] to the conclusion that something is, or is likely to be, the cause of something else".
  • "Identification of the cause or causes of a phenomenon, by establishing covariation of cause and effect, a time-order relationship with the cause preceding the effect, and the elimination of plausible alternative causes."

Methodology

General

Causal inference is conducted via the study of systems where the measure of one variable is suspected to affect the measure of another, and it proceeds according to the scientific method. The first step is to formulate a falsifiable null hypothesis, which is subsequently tested with statistical methods. Frequentist statistical inference uses statistical methods to determine the probability that the data would occur under the null hypothesis by chance; Bayesian inference is used to determine the effect of an independent variable. Statistical inference in general is used to distinguish variation in the data that is random from variation that reflects a well-specified causal mechanism. Notably, correlation does not imply causation, so the study of causality is as concerned with potential causal mechanisms as it is with variation in the data. A frequently sought-after standard of causal inference is an experiment in which treatment is randomly assigned while all other confounding factors are held constant; most efforts in causal inference attempt to replicate such experimental conditions.

Epidemiological studies employ different epidemiological methods of collecting and measuring evidence of risk factors and effects and different ways of measuring the association between the two. A 2020 review of methods for causal inference found that using the existing literature in clinical training programs can be challenging, because published articles often assume an advanced technical background, may be written from multiple statistical, epidemiological, computer science, or philosophical perspectives, methodological approaches continue to expand rapidly, and many aspects of causal inference receive limited coverage.

Common frameworks for causal inference include the causal pie model (component-cause), Pearl's structural causal model (causal diagram + do-calculus), structural equation modeling, and Rubin causal model (potential-outcome), which are often used in areas such as social sciences and epidemiology.
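These frameworks can be made concrete with a small simulation. The following Python sketch, using entirely synthetic data and hypothetical variable names, illustrates the back-door adjustment idea from Pearl's structural causal model: with a single observed confounder Z, the interventional quantity P(Y | do(X)) can be recovered by averaging stratum-specific contrasts over the distribution of Z. It is a minimal sketch of the general idea, not a complete implementation.

```python
# Minimal sketch of back-door adjustment with one observed confounder Z.
# All data are synthetic and all variable names are hypothetical.
# Under Pearl's model: P(Y | do(X = x)) = sum_z P(Y | X = x, Z = z) P(Z = z).
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
z = rng.binomial(1, 0.5, n)                 # confounder
x = rng.binomial(1, 0.2 + 0.6 * z)          # treatment; depends on Z
y = rng.normal(1.0 * x + 2.0 * z, 1.0)      # outcome; true causal effect of X is 1.0

# Naive contrast is biased because the back-door path X <- Z -> Y is open.
naive = y[x == 1].mean() - y[x == 0].mean()

# Adjustment: weight the within-stratum contrasts by P(Z = z).
adjusted = sum(
    (y[(x == 1) & (z == v)].mean() - y[(x == 0) & (z == v)].mean()) * (z == v).mean()
    for v in (0, 1)
)
print(f"naive: {naive:.2f}  adjusted: {adjusted:.2f}")  # roughly 2.2 vs 1.0
```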

Experimental

Experimental verification of causal mechanisms is possible using experimental methods. The main motivation behind an experiment is to hold other experimental variables constant while purposefully manipulating the variable of interest. If the experiment produces statistically significant effects as a result of only the treatment variable being manipulated, there are grounds to believe that a causal effect can be assigned to the treatment variable, assuming that other standards for experimental design have been met.
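As a minimal illustration of why randomization supports causal claims, the Python sketch below (with synthetic data and a hypothetical effect size) estimates an average treatment effect by a simple difference in group means plus a t-test; because assignment is random, confounders are balanced between groups in expectation.

```python
# Minimal sketch of a randomized experiment: with random assignment, the
# difference in group means estimates the average treatment effect (ATE).
# Data and the true effect size (0.3) are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 1_000
treated = rng.binomial(1, 0.5, n).astype(bool)      # random assignment
outcome = rng.normal(0.0, 1.0, n) + 0.3 * treated   # true ATE = 0.3

ate = outcome[treated].mean() - outcome[~treated].mean()
t_stat, p_value = stats.ttest_ind(outcome[treated], outcome[~treated])
print(f"estimated ATE = {ate:.2f} (t = {t_stat:.2f}, p = {p_value:.4f})")
```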

Quasi-experimental

Quasi-experimental verification of causal mechanisms is conducted when traditional experimental methods are unavailable. This may result from the prohibitive cost of conducting an experiment, or from the inherent infeasibility of conducting one, especially for experiments concerned with large systems such as economies or electoral systems, or for treatments that are considered to present a danger to the well-being of test subjects. Quasi-experiments may also occur where information is withheld for legal reasons.
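One widely used quasi-experimental design, not named above, is difference-in-differences. The sketch below uses entirely hypothetical numbers and rests on the assumption that the treated and control groups would have followed parallel trends in the absence of treatment; it illustrates the design rather than any particular study.

```python
# Minimal difference-in-differences (DiD) sketch with hypothetical group means.
# The control group's before/after change stands in for what the treated group
# would have done without treatment (the "parallel trends" assumption).
treated_before, treated_after = 10.0, 14.0   # treated group outcome means
control_before, control_after = 9.0, 11.0    # control group outcome means

did = (treated_after - treated_before) - (control_after - control_before)
print(f"estimated treatment effect: {did:.1f}")  # (14 - 10) - (11 - 9) = 2.0
```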

Approaches in epidemiology

Epidemiology studies patterns of health and disease in defined populations of living beings in order to infer causes and effects. An association between exposure to a putative risk factor and a disease may be suggestive of, but is not equivalent to, causality, because correlation does not imply causation. Historically, Koch's postulates have been used since the 19th century to decide whether a microorganism was the cause of a disease. In the 20th century, the Bradford Hill criteria, described in 1965, have been used to assess causality of variables outside microbiology, although even these criteria are not exclusive ways to determine causality.

In molecular epidemiology the phenomena studied are on a molecular biology level, including genetics, where biomarkers are evidence of cause or effects.

A recent trend is to identify evidence for the influence of an exposure on molecular pathology within diseased tissue or cells, in the emerging interdisciplinary field of molecular pathological epidemiology (MPE). Linking the exposure to the molecular pathologic signatures of the disease can help assess causality. Given the inherent heterogeneity of a given disease (the unique disease principle), disease phenotyping and subtyping are growing trends in the biomedical and public health sciences, exemplified by personalized medicine and precision medicine.

Approaches in computer science

Determination of cause and effect from joint observational data for two time-independent variables, say X and Y, has been tackled by exploiting the asymmetry between the evidence for a model in the direction X → Y and the evidence for a model in the direction Y → X. The primary approaches are based on algorithmic information theory models and noise models.

Noise models

Noise-model approaches incorporate an independent noise term into the model and compare the evidence for the two candidate directions.

Here are some of the noise models for the hypothesis X → Y with noise E:

  • Additive noise: Y = F(X) + E
  • Linear noise: Y = pX + qE
  • Post-nonlinear: Y = G(F(X) + E)
  • Heteroskedastic noise: Y = F(X) + E·G(X)
  • Functional noise: Y = F(X, E)

The common assumptions in these models are:

  • There are no other causes of Y.
  • X and E have no common causes.
  • The distribution of the cause is independent of the causal mechanism.

On an intuitive level, the idea is that the factorization of the joint distribution P(Cause, Effect) into P(Cause)*P(Effect | Cause) typically yields models of lower total complexity than the factorization into P(Effect)*P(Cause | Effect). Although the notion of "complexity" is intuitively appealing, it is not obvious how it should be precisely defined. A different family of methods attempts to discover causal "footprints" from large amounts of labeled data, allowing the prediction of more flexible causal relations.
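The following sketch illustrates the additive-noise idea under stated assumptions (simulated data; a crude binned dependence score stands in for a proper kernel independence test such as HSIC): each direction is fitted with a flexible regression, and the direction whose residuals look independent of the input is preferred:

    # Additive-noise sketch: fit each direction with a flexible regression
    # and prefer the direction whose residuals look independent of the input.
    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.uniform(-3, 3, size=5_000)
    y = x ** 3 + rng.normal(0, 1, size=x.size)     # true model: X -> Y

    def dependence(u, resid, bins=20):
        # Spread of residual means across bins of u: near 0 if independent.
        edges = np.quantile(u, np.linspace(0, 1, bins + 1))
        idx = np.clip(np.searchsorted(edges, u) - 1, 0, bins - 1)
        return np.var([resid[idx == b].mean() for b in range(bins)])

    def resid(u, v, degree=5):
        return v - np.polyval(np.polyfit(u, v, degree), u)

    print("score X->Y:", dependence(x, resid(x, y)))   # small
    print("score Y->X:", dependence(y, resid(y, x)))   # larger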

Approaches in social sciences

Social science

The social sciences in general have moved increasingly toward including quantitative frameworks for assessing causality. Much of this has been described as a means of providing greater rigor to social science methodology. Political science was significantly influenced by the publication of Designing Social Inquiry, by Gary King, Robert Keohane, and Sidney Verba, in 1994. King, Keohane, and Verba recommend that researchers apply both quantitative and qualitative methods and adopt the language of statistical inference to be clearer about their subjects of interest and units of analysis. Proponents of quantitative methods have also increasingly adopted the potential outcomes framework, developed by Donald Rubin, as a standard for inferring causality.

While much of the emphasis remains on statistical inference in the potential outcomes framework, social science methodologists have developed new tools to conduct causal inference with both qualitative and quantitative methods, sometimes called a "mixed methods" approach. Advocates of diverse methodological approaches argue that different methodologies are better suited to different subjects of study. Sociologist Herbert Smith and political scientists James Mahoney and Gary Goertz have cited the observation of Paul Holland, a statistician and author of the 1986 article "Statistics and Causal Inference", that statistical inference is most appropriate for assessing the "effects of causes" rather than the "causes of effects". Qualitative methodologists have argued that formalized models of causation, including process tracing and fuzzy set theory, provide opportunities to infer causation through the identification of critical factors within case studies or through comparison among several case studies. These methodologies are also valuable for subjects in which a limited number of potential observations or the presence of confounding variables would limit the applicability of statistical inference.

Economics and political science

In economics and political science, causal inference is often difficult, owing to the real-world complexity of economic and political systems and the inability to recreate many large-scale phenomena within controlled experiments. Causal inference in these fields continues to improve in methodology and rigor, owing to the increased level of technology available to social scientists, the growing number of social scientists and studies, and improvements to causal inference methodologies throughout the social sciences.

Despite the difficulties inherent in determining causality in economic systems, several widely employed methods exist throughout those fields.

Theoretical methods

Economists and political scientists can use theory (often studied in theory-driven econometrics) to estimate the magnitude of supposedly causal relationships in cases where they believe a causal relationship exists. Theorists can presuppose a mechanism believed to be causal and describe the effects using data analysis to justify their proposed theory. For example, theorists can use logic to construct a model, such as theorizing that rain causes fluctuations in economic productivity but that the converse is not true. However, the use of purely theoretical claims that offer no predictive insight has been called "pre-scientific," because there is no way to predict the impact of the supposed causal properties. It is worth reiterating that regression analysis in the social sciences does not inherently imply causality, as many phenomena may correlate in the short run or in particular datasets but show no correlation in other time periods or other datasets. Thus, attributing causality to correlative properties is premature absent a well-defined and reasoned causal mechanism.

Instrumental variables

The instrumental variables (IV) technique is a method of determining causality that involves eliminating a correlation between one of a model's explanatory variables and the model's error term. The reasoning is that, if a model's error term moves with the variation of another variable, then the error term is probably an effect of variation in that explanatory variable. Eliminating this correlation by introducing a new instrumental variable thus reduces the error present in the model as a whole.
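A minimal two-stage least squares sketch (simulated supply-and-demand-style data; all names and coefficients hypothetical) shows the mechanics: ordinary least squares is biased by an unobserved confounder, while regressing on the instrument-predicted values recovers the causal slope:

    # Instrumental-variables sketch: price is endogenous (correlated with
    # the error); the instrument z shifts price but not the outcome directly.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 50_000
    z = rng.normal(size=n)                     # instrument
    u = rng.normal(size=n)                     # unobserved confounder
    price = z + u + rng.normal(size=n)         # endogenous regressor
    demand = -2.0 * price + 5.0 * u + rng.normal(size=n)

    # Naive OLS is biased by the confounder u.
    ols = np.cov(price, demand)[0, 1] / np.var(price)

    # 2SLS: regress price on z, then demand on the fitted price.
    price_hat = np.polyval(np.polyfit(z, price, 1), z)
    iv = np.cov(price_hat, demand)[0, 1] / np.var(price_hat)

    print(f"OLS slope {ols:.2f} (biased), 2SLS slope {iv:.2f} (true -2.0)")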

Model specification

Model specification is the act of selecting a model to be used in data analysis. Social scientists (and, indeed, all scientists) must determine the correct model to use because different models are good at estimating different relationships.

Model specification can be useful in determining causality that is slow to emerge, where the effects of an action in one period are only felt in a later period. It is worth remembering that correlations measure only whether two variables have similar variation, not whether they affect one another in a particular direction; thus, one cannot determine the direction of a causal relation from correlations alone. Because causes are believed to precede their effects, social scientists can use a model that looks specifically for the effect of one variable on another over a period of time: variables representing phenomena that happen earlier are treated as treatments, and econometric tests look for later changes in the data attributed to them, so that a meaningful difference in outcomes following a meaningful difference in treatments may indicate causality between the two (e.g., Granger-causality tests). Such studies are examples of time-series analysis.
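A minimal Granger-style sketch (simulated series, one lag each, hypothetical coefficients): lagged X "Granger-causes" Y if adding it to a regression of Y on its own lag significantly reduces the residual sum of squares:

    # Granger-style sketch: X "Granger-causes" Y if lagged X improves the
    # prediction of Y beyond Y's own lags.
    import numpy as np

    rng = np.random.default_rng(4)
    T = 5_000
    x = rng.normal(size=T)
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()

    def rss(design, target):
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        r = target - design @ beta
        return r @ r

    ones = np.ones(T - 1)
    restricted = np.column_stack([ones, y[:-1]])       # Y on its own lag
    full = np.column_stack([ones, y[:-1], x[:-1]])     # add lagged X
    target = y[1:]

    # F-statistic for the single added regressor.
    rss_r, rss_f = rss(restricted, target), rss(full, target)
    F = (rss_r - rss_f) / (rss_f / (T - 1 - full.shape[1]))
    print("F statistic for lagged X:", round(F, 1))    # large => predictive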

Sensitivity analysis

Other variables, or regressors in regression analysis, are either included or excluded across various implementations of the same model so that different sources of variation can be studied separately from one another. This is a form of sensitivity analysis: the study of how sensitive an implementation of a model is to the addition of one or more new variables.

A chief motivating concern in the use of sensitivity analysis is the discovery of confounding variables: variables that have a large impact on the results of a statistical test but are not the variable the causal inference is trying to study. Confounding variables may cause a regressor to appear significant in one implementation but not in another.
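A minimal sketch of this use of sensitivity analysis (simulated data, hypothetical coefficients): the coefficient on the variable of interest moves sharply when a confounder is added, flagging the first specification as unreliable:

    # Sensitivity-analysis sketch: rerun the same regression with and
    # without a suspected confounder and watch the coefficient move.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 20_000
    confounder = rng.normal(size=n)
    x = confounder + rng.normal(size=n)
    y = 1.0 * x + 4.0 * confounder + rng.normal(size=n)

    def coef_of_x(columns):
        design = np.column_stack([np.ones(n)] + columns)
        beta, *_ = np.linalg.lstsq(design, y, rcond=None)
        return beta[1]                       # coefficient on x

    print("x alone:        ", round(coef_of_x([x]), 2))              # ~3.0
    print("x + confounder: ", round(coef_of_x([x, confounder]), 2))  # ~1.0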

Multicollinearity

Another reason for using sensitivity analysis is to detect multicollinearity: the phenomenon in which the correlation between two variables is very high. A high level of correlation between two variables can dramatically affect the outcome of a statistical analysis: small variations in highly correlated data can flip the estimated effect of a variable from positive to negative, or vice versa; this is an inherent property of variance testing. Identifying multicollinearity is useful in sensitivity analysis because eliminating highly correlated variables across model implementations can prevent the dramatic swings in results that their inclusion produces.
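One common diagnostic, shown in the sketch below (simulated data), is the variance inflation factor: each regressor is regressed on the others, and a high R² there signals multicollinearity:

    # Multicollinearity sketch: the variance inflation factor (VIF) for a
    # regressor is 1 / (1 - R^2) from regressing it on the other regressors.
    # VIF near 1 is benign; values above ~10 are a common warning sign.
    import numpy as np

    rng = np.random.default_rng(6)
    n = 10_000
    a = rng.normal(size=n)
    b = a + 0.1 * rng.normal(size=n)        # nearly a copy of a
    c = rng.normal(size=n)                  # independent

    def vif(target, others):
        design = np.column_stack([np.ones(n)] + others)
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        resid = target - design @ beta
        total = (target - target.mean()) @ (target - target.mean())
        r2 = 1 - (resid @ resid) / total
        return 1 / (1 - r2)

    print("VIF(a | b, c):", round(vif(a, [b, c]), 1))   # large: a ~ b
    print("VIF(c | a, b):", round(vif(c, [a, b]), 1))   # ~1: independent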

However, there are limits to the ability of sensitivity analysis to prevent the deleterious effects of multicollinearity, especially in the social sciences, where systems are complex. Because it is theoretically impossible to include, or even measure, all of the confounding factors in a sufficiently complex system, econometric models are susceptible to the common-cause fallacy, in which causal effects are incorrectly attributed to the wrong variable because the correct variable was never captured in the original data. This is an example of failing to account for a lurking variable.

Design-based econometrics

Recently, improved methodology in design-based econometrics has popularized the use of natural experiments and quasi-experimental research designs to study the causal mechanisms such experiments are believed to identify.
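A canonical design-based tool is the difference-in-differences comparison; the sketch below (simulated data, hypothetical policy effect) compares the before/after change in an exposed group with the same change in an unexposed group:

    # Difference-in-differences sketch, a standard design-based tool for
    # natural experiments: the treated group's change minus the untreated
    # group's change isolates the effect under a common-trend assumption.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 10_000
    group = rng.random(n) < 0.5              # True = exposed to the policy
    pre = 10 + 3 * group + rng.normal(size=n)             # level difference
    post = pre + 1.0 + 2.0 * group + rng.normal(size=n)   # trend + effect

    did = ((post[group].mean() - pre[group].mean())
           - (post[~group].mean() - pre[~group].mean()))
    print(f"difference-in-differences estimate: {did:.2f} (true effect 2.0)")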

Malpractice in causal inference

Despite the advancements in the development of methodologies used to determine causality, significant weaknesses in determining causality remain. These weaknesses can be attributed both to the inherent difficulty of determining causal relations in complex systems and to cases of scientific malpractice.

Separate from the difficulties of causal inference itself, the perception that large numbers of scholars in the social sciences engage in non-scientific methodology is held by some large groups of social scientists. Criticism of economists and social scientists for passing off descriptive studies as causal studies is rife within those fields.

Scientific malpractice and flawed methodology

In the sciences, especially in the social sciences, there is concern among scholars that scientific malpractice is widespread. As scientific study is a broad topic, there are theoretically limitless ways for a causal inference to be undermined through no fault of the researcher. Nonetheless, concerns remain among scientists that large numbers of researchers do not perform basic duties or practice sufficiently diverse methods in causal inference.

One prominent example of common non-causal methodology is the erroneous assumption that correlative properties are causal properties. There is no inherent causality in phenomena that correlate. Regression models are designed to measure variance within data relative to a theoretical model: nothing suggests that data exhibiting high levels of covariance have any meaningful relationship (absent a proposed causal mechanism with predictive properties or a random assignment of treatment). The use of flawed methodology has been claimed to be widespread, with common examples being the overuse of correlative models, especially regression models and particularly linear regression models. The presupposition that two correlated phenomena are inherently related is a logical fallacy known as spurious correlation. Some social scientists claim that the widespread use of methodology attributing causality to spurious correlations has been detrimental to the integrity of the social sciences, although improvements stemming from better methodologies have been noted.

A potential effect of scientific studies that erroneously conflate correlation with causality is an increase in the number of scientific findings whose results are not reproducible by third parties. Such non-reproducibility is a logical consequence of findings that correlate only temporarily being overgeneralized into mechanisms with no inherent relationship, where new data do not contain the idiosyncratic correlations of the original data. Debates over the effect of malpractice versus the effect of the inherent difficulties of searching for causality are ongoing. Critics of widely practiced methodologies argue that researchers have engaged in statistical manipulation in order to publish articles that supposedly demonstrate evidence of causality but are actually examples of spurious correlation being touted as such; these endeavors may be referred to as p-hacking. To prevent this, some have advocated that researchers preregister their research designs before conducting their studies, so that they do not inadvertently overemphasize a non-reproducible finding that was not the initial subject of inquiry but was found to be statistically significant during data analysis.
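The mechanics of the problem are easy to simulate (hypothetical setup below): when many pure-noise predictors are screened against an outcome at the conventional 0.05 threshold, roughly five percent will appear "significant" by chance:

    # p-hacking sketch: test 100 candidate predictors that are all pure
    # noise; at the 0.05 level, roughly five look "significant" by chance.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    n, candidates = 200, 100
    outcome = rng.normal(size=n)
    noise_predictors = rng.normal(size=(candidates, n))

    pvalues = np.array([stats.pearsonr(pred, outcome)[1]
                        for pred in noise_predictors])
    print("spurious 'discoveries' at p < 0.05:", int((pvalues < 0.05).sum()))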
