Climate justice

Fridays for Future demonstration in Berlin in September 2021 with the slogan "fight for climate justice"

Climate justice is a type of environmental justice that focuses on the unequal impacts of climate change on marginalized or otherwise vulnerable populations. Climate justice seeks to achieve an equitable distribution of both the burdens of climate change and the efforts to mitigate climate change through advocacy and policy change. The economic burden of climate change mitigation is estimated by some at around 1% to 2% of GDP. Climate justice examines concepts such as equality, human rights, collective rights, justice and the historical responsibilities for climate change.

Climate justice recognizes that those who have benefited most from industrialization (such as coal, oil, and gas enterprises) are disproportionately responsible for the accumulation of carbon dioxide in the earth's atmosphere, and thus for climate change. Meanwhile, there is growing consensus that people in regions least responsible for climate change, as well as the world's poorest and most marginalized communities, often suffer the greatest consequences, for example health problems caused by growing up in an unhealthy environment. Depending on the country and context, this often includes people with low incomes, indigenous communities or communities of color. They might also be further disadvantaged by responses to climate change, which can exacerbate existing inequalities around race, gender, sexuality and disability. When those affected the most by climate change, despite having contributed the least to causing it, are also harmed by responses to it, this is known as the 'triple injustice' of climate change.

Conceptions of climate justice can be grouped along the lines of procedural justice and distributive justice. The former stresses fair, transparent and inclusive decision-making. The latter stresses a fair distribution of the costs and outcomes of climate change (substantive rights). There are at least ten different principles that can help distribute climate costs fairly. Climate justice also tries to address the social implications of climate change mitigation: if these are not addressed properly, the result could be profound economic and social tensions, and even delays in necessary changes.

Climate justice actions can include the growing global body of climate litigation. In 2017, a report of the United Nations Environment Programme identified 894 ongoing legal actions worldwide.

Definition and objectives

Use and popularity of climate justice language have increased dramatically in recent years, yet climate justice is understood in many ways, and the different meanings are sometimes contested. At its simplest, conceptions of climate justice can be grouped along the following two lines:

  • procedural justice, which emphasizes fair, transparent and inclusive decision making, and
  • distributive justice, which places the emphasis on who bears the costs of both climate change and the actions taken to address it.

The objectives of climate justice can be described as encompassing "a set of rights and obligations, which corporations, individuals and governments have towards those vulnerable people who will be in a way significantly disproportionately affected by climate change."

Climate justice examines concepts such as equality, human rights, collective rights, and the historical responsibilities for climate change. There are procedural dimensions of climate change mitigation, as well as distributive ethical ones. Recognition and respect are the underlying basis for distributive and procedural justice.

Related fields are environmental justice and social justice.

Causes of injustice

Economic systems

Among major emitters, the U.S. has higher annual per capita emissions than China, which has more total annual emissions.
 
Cumulatively, U.S. and China emissions have caused the most greenhouse gas-related economic damage.

Whether fundamental differences between economic systems, such as capitalism and socialism, are a root cause of climate injustice is an often debated and contentious issue. In this context, fundamental disagreements arise between conservationist environmental groups on one side and leftist organizations on the other. While the former often tend to blame the excesses of neoliberalism for climate change and argue in favor of market-based reform within capitalism, the latter view capitalism, with its exploitative traits, as the underlying central issue. Other possible causal explanations include hierarchies based on group differences and the nature of the fossil fuel industry itself.

Systemic causes

Many participants of grassroots movements that demand climate justice also ask for system change.

The rapid rate of climate change, along with its unequal distribution of burdens, is seen as a structural injustice perpetuated by systemic issues. There is political responsibility for the maintenance and support of existing structural processes, despite the assumed existence of viable alternative models based on novel technologies and means. As a criterion for determining responsibility for climate change, individual causal contribution matters less than responsibility for the perpetuation of carbon-intensive practices and institutions. It has been argued that these systemic issues have evolved from, and been perpetuated by, a long history of practices such as colonization. These systemic causes affect different groups in different ways. For example, issues with pipelines and oil drilling in the United States often stem from the fact that pipelines are built on Indigenous land. Because systems of oppression such as colonialism and settler colonialism have made Indigenous communities more susceptible to being treated as expendable, it is often difficult for these communities to take action against large corporations. Systemically rooted climate justice issues are seen globally, especially in places where colonization has occurred (e.g. Gaelic Ireland, Scotland, Australia, India). On this view, these structures constitute the global politico-economic system and stand in the way of structural changes towards a system that does not facilitate exploitation of people and nature.

For others, climate justice could be pursued through existing economic frameworks, global organizations and policy mechanisms. On this view, the root causes are whatever has so far inhibited global implementation of measures like emissions trading schemes.

Disproportionality between causality and burden

Emissions of the richest 1% are more than twice those of the poorest 50%. Compliance with the Paris Agreement's 1.5°C goal would require the richest 1% to reduce their per-person emissions by a factor of at least 30, while per-person emissions of the poorest 50% could approximately triple.
 
Though total CO2 emissions (size of pie charts) differ substantially among high-emitting regions, the pattern of higher income classes emitting more than lower income classes is consistent across regions. The world's top 1% of emitters emit over 1000 times more than the bottom 1%.
 
Richer (developed) countries emit more CO2 per person than poorer (developing) countries. Emissions are roughly proportional to GDP per person, though the rate of increase diminishes above an average GDP per person of about $10,000.
 
A country-by-country visualisation of each country's vulnerability to effects of climate change (country size) and greenhouse gas emissions (country colour intensity). High emitting countries are generally not the most vulnerable.

The responsibility for climate change differs substantially among individuals and groups. Many of the people and nations most affected by climate change are among the least responsible for it. The most affluent citizens of the world are responsible for most environmental impacts. Robust action by them and their governments is necessary to reduce these impacts.

According to a 2020 report by Oxfam and the Stockholm Environment Institute, the richest 1% of the global population caused twice as much carbon emissions as the poorest 50% over the 25 years from 1990 to 2015: respectively, 15% of cumulative emissions during that period compared to 7%. A second report, from 2023, found that the richest 1% of humans produce more carbon emissions than the poorest 66%, while the richest 10% account for more than half of global carbon emissions.

The bottom half of the population is directly responsible for less than 20% of energy footprints and consumes less than the top 5% in terms of trade-corrected energy. High-income people usually have higher energy footprints, as they use more energy-intensive goods. The largest disproportionality was identified in the domain of transport, where the top 10% consume 56% of vehicle fuel and account for 70% of vehicle purchases.

A 2023 review article estimated that a 2 °C temperature rise by 2100 would cause roughly 1 billion deaths, primarily among poor people, as a result of greenhouse gas emissions produced primarily by wealthy people.

Some existing effects of climate change hit people with high incomes harder. Damage from the increase of wildfires in the western USA has "disproportionately been borne by high-income, white, and older residents, and by owners of high-value properties", because those properties have more greenery. A similar effect has been observed with floods.

Intergenerational equity

Successive generations are predicted to experience progressively greater unprecedented lifetime exposure (ULE) events such as heat waves. About 111 million children born in 2020 will live with unprecedented heatwave exposure in a world that warms by 3.5 °C, compared with 62 million with only 1.5 °C of warming.

Preventable severe effects of climate change are likely to occur during the lifetime of the present adult population. Under current climate policy pledges, children born in 2020 (e.g. "Generation Alpha") will experience over their lifetimes 2–7 times as many heat waves, as well as more of other extreme weather events, compared to people born in 1960. This raises issues of intergenerational equity, as it is earlier generations (individuals and their collective governance and economic systems) who are mainly responsible for the burden of climate change.

This illustrates that emissions produced by any given generation can lock in damage for one or more future generations. Climate change could become progressively more threatening for the generations affected than for the generation responsible for the threats. The climate system contains tipping points, such as the amount of deforestation of the Amazon that would launch the forest's irreversible decline. A generation whose continued emissions drive the climate system past such tipping points inflicts severe injustice on multiple future generations.

Disproportionate impacts on disadvantaged groups

Disadvantaged groups will continue to be especially affected as climate change persists. These groups are affected by inequalities based on demographic characteristics such as gender, race, ethnicity, age, and income. Inequality increases the exposure of disadvantaged groups to the harmful effects of climate change. The damage is worsened because disadvantaged groups are the last to receive emergency relief and are rarely included in planning processes at local, national and international levels for coping with the impacts of climate change. These harms are also exacerbated by systemic injustices that leave marginalized groups treated as expendable by governments. Unless steps are taken to give these groups more access to universal resources and protection, they will continue to suffer the most from climate justice issues.

Communities of color have long been targets of climate-related injustices. Systems of racism and colonialism have created power imbalances in which communities of color often suffer when it comes to environmental justice issues. Communities of color are often also low-income communities and suffer from historical injustices like redlining that make it significantly harder to fight back against climate-related issues.

Women are also disadvantaged and will be affected by climate change differently than men. Women are more likely to experience gender-based violence such as assault and rape, and such violence often follows climate justice crises. For example, oil pipeline projects frequently house workers in isolated communities known as "man camps". These camps of primarily male workers have been found to bring higher rates of gender-based violence to the local communities around them, especially for indigenous women. Overall, a history of being seen as lesser and more expendable means that women's voices are often not valued as much in times of environmental crisis.

Indigenous groups are affected by the consequences of climate change even though they historically have contributed the least to causing it. Indigenous peoples are often initially affected by settler colonialism and displacement by colonizers, which then makes it difficult to establish grounds to fight back against climate injustices. In the United States, Indigenous land is often exploited for resources like oil and critical minerals. Historically, instances like the Dawes Act (1887) have created cases of environmental injustice through the removal of Indigenous peoples from their land. Their land is also often treated as dumping sites for hazardous materials, such as nuclear waste. Indigenous people are unjustly impacted, and they continue to have fewer resources to cope with climate change.

Low-income communities face higher vulnerability to climate change. They often become places where companies establish harmful factories or mining operations, leading to problems like ecological and chemical runoff. An example is Norco, Louisiana, home to multiple oil refineries in a region frequently referred to as "Cancer Alley". Low-income communities are also often disproportionately affected by heat waves, poor air quality, and extreme weather events.

Responses to improve climate justice

Burden on future generations

One generation must not be allowed to consume large portions of the CO2 budget while bearing a relatively minor share of the reduction effort if this would involve leaving subsequent generations with a drastic reduction burden and expose their lives to comprehensive losses of freedom.

— German Federal Constitutional Court
April 2021
 
Conclusion on the Rights of Nature

     The rights of nature protect ecosystems and natural processes for their intrinsic value, thus complementing them with the human right to a healthy and ecologically balanced environment. The rights of nature, like all constitutional rights, are justiciable and, consequently, judges are obliged to guarantee them.

Common principles of justice in burden-sharing

There are three principles of justice in burden-sharing that can be used in deciding who bears the larger burdens of climate change globally and domestically: a) those who most caused the problem, b) those with the most burden-carrying ability, and c) those who have benefited most from the activities that cause climate change. A 2023 study estimated that the top 21 fossil fuel companies would owe cumulative climate reparations of $5.4 trillion over the period 2025–2050. In practice, some cities have begun to address such inequalities through intersectional adaptation policies. Barcelona, for example, has implemented intersectional climate justice measures that include regulating short-term rentals, providing property tax support, and requiring 30% of new housing developments to be social housing units.

Another method of decision-making starts from the objective of preventing climate change beyond a given limit, e.g. 1.5 °C of warming, and reasons backwards from there to who should do what, using the principles of justice in burden-sharing to maintain fairness.
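To make the burden-sharing logic concrete, the following sketch allocates a global mitigation burden across countries by combining the three principles above: causation, ability to pay, and benefit. It is an illustration only; the country figures and the equal weights are hypothetical, not values from the literature.

    # Sketch: composite burden shares from three burden-sharing principles.
    # All inputs and the equal weighting are hypothetical assumptions.
    def burden_shares(countries, weights=(1/3, 1/3, 1/3)):
        """Each country's share of the global mitigation burden, as a weighted
        mix of its shares of historical emissions (causation), GDP (ability
        to pay), and benefits derived from emitting activities."""
        w_cause, w_ability, w_benefit = weights
        totals = {k: sum(c[k] for c in countries.values())
                  for k in ("historical_emissions", "gdp", "benefit")}
        return {
            name: w_cause * c["historical_emissions"] / totals["historical_emissions"]
                  + w_ability * c["gdp"] / totals["gdp"]
                  + w_benefit * c["benefit"] / totals["benefit"]
            for name, c in countries.items()
        }

    # Hypothetical inputs (emissions in GtCO2; GDP and benefit in index units):
    countries = {
        "A": {"historical_emissions": 400, "gdp": 25, "benefit": 30},
        "B": {"historical_emissions": 250, "gdp": 18, "benefit": 20},
        "C": {"historical_emissions": 50,  "gdp": 3,  "benefit": 4},
    }
    for name, share in burden_shares(countries).items():
        print(f"{name}: {share:.1%} of the global mitigation burden")

Changing the weights shifts the allocation between the principles; the backwards-reasoning method described above would instead fix the total burden first (the reductions needed to stay below 1.5 °C) and then apply such shares to it.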

Court cases and litigation

Existential problem of planetary proportions

(Climate change is) an existential problem of planetary proportions that imperils all forms of life and the very health of our planet. ... A complete solution to this daunting, and self-inflicted, problem requires the contribution of all fields of human knowledge, whether law, science, economics or any other.

In 2019, the Supreme Court of the Netherlands confirmed that the government must cut carbon dioxide emissions further, as climate change threatens citizens' human rights.

By December 2022, the number of climate change-related lawsuits had grown to 2,180, more than half of them in the U.S. (1,522 lawsuits). The organization Our Children's Trust filed a lawsuit on the basis that the government had been ineffective in protecting the constitutional rights to life, liberty and protection of the youth of the United States. Based on existing laws, some relevant parties can already be forced into action through the courts, as in the case Saúl v. RWE.

Climate change litigation, also known as climate litigation, is an emerging body of environmental law using legal practice to set case law precedent to further climate change mitigation efforts by public institutions, such as governments and companies. In the face of slow climate change politics delaying climate change mitigation, activists and lawyers have increased efforts to use national and international judiciary systems to advance the effort. Climate litigation typically engages in one of five types of legal claims: constitutional law (focused on breaches of constitutional rights by the state), administrative law (challenging the merits of administrative decision making), private law (challenging corporations or other organizations for negligence, nuisance, etc.), fraud or consumer protection (challenging companies for misrepresenting information about climate impacts), or human rights (claiming that failure to act on climate change is a failure to protect human rights). Litigants pursuing such cases have had mixed results.
 
Rally for climate justice: Mass mobilization at the Chevron Oil Refinery in Richmond, California (2009)
 
Tens of thousands marching in Copenhagen for climate justice (2009)

Human rights

     ... acknowledging that climate change is a common concern of humankind, Parties should, when taking action to address climate change, respect, promote and consider their respective obligations on human rights, the right to health, the rights of indigenous peoples, local communities, migrants, children, persons with disabilities and people in vulnerable situations and the right to development, as well as gender equality, empowerment of women and intergenerational equity, ...

— The Glasgow Climate Pact, 13 November 2021

Human rights and climate change is a conceptual and legal framework under which international human rights and their relationship to global warming are studied, analyzed, and addressed. The framework has been employed by governments, United Nations organizations, intergovernmental and non-governmental organizations, human rights and environmental advocates, and academics to guide national and international policy on climate change under the United Nations Framework Convention on Climate Change (UNFCCC) and the core international human rights instruments. In 2022 Working Group II of the IPCC suggested that "climate justice comprises justice that links development and human rights to achieve a rights-based approach to addressing climate change".

Challenges

Societal disruption and policy support

Climate justice may conflict with social stability. For example, interventions that establish more just product pricing could result in social unrest. Decarbonization interventions could require giving up material possessions, comfort, and established habits.

Multiple studies estimate that a rapid transition could increase the overall number of jobs, at least temporarily, due to increased demand for labor to, for example, build public infrastructure and the renewable energy system, among other green jobs.

The urgent need for changes, especially when seeking to facilitate lifestyle changes and industry-scale shifts, could lead to social tension and decrease public support for political parties in power. For instance, keeping gas prices low is often "really good for the poor and the middle class". Additionally, according to sociologist David Pellow and critical geographer Laura Pulido, the state is often complicit in environmental justice issues because of the economic benefits of ignoring climate injustices. This can create significant barriers for climate justice movements, as it makes it more difficult to achieve progress through actions like lawmaking and protesting. Documents such as the Bali Principles of Climate Justice have been drafted to counter this neglect. The principles stress the importance of communities coming together to make changes when the state remains complicit in injustices.

Contrary to commonly held beliefs, people in rich nations are sometimes willing to give money to poorer nations to help stop climate change. According to one study published on the Social Science Research Network in May 2023, distribution of money from the rich to the poor through a global emissions trading scheme is supported by 76% of Europeans and 54% of US citizens. Whether this actually occurs remains to be seen.

Loss and damage discussions

Some may see climate justice arguments for compensation by rich countries for natural disasters in developing countries as opening the door to "limitless liability". High levels of compensation could drain a society's resources, efforts, focus and financial funds away from efficient preventive climate change mitigation and towards, for example, immediate climate change relief compensations.

Fossil-fuels dependent states

The US, China and Russia have cumulatively contributed the greatest amounts of CO2 since 1850.
 
Many of the heaviest users of fossil fuels rely on them for a high percentage of their electricity.

Fossil fuel phase-out is projected to affect states and citizens with large or central fossil-fuel extraction industries – including OPEC states – differently than other nations. These states have obstructed climate negotiations, and it has been argued that, due to their wealth, they should not need financial support from other countries but could fund adequate transitions on their own.

A study suggested that governments of nations that have historically benefited from extraction should take the lead, while countries with a high dependency on fossil fuels but low capacity for transition would need some support to follow. In particular, the transitional impacts of a rapid extraction phase-out are thought to be better absorbed in diversified, wealthier economies, which may have more capacity for enacting absorptive socioeconomic policies.

Conflicting interest-driven interpretations as barriers to agreements

Net income of the global oil and gas industry reached a record US$4 trillion in 2022.

Different interpretations and perspectives, arising from different interests, needs, circumstances, expectations, considerations and histories, can lead to highly varying ideas of what is fair. This may make it more difficult for countries to reach an agreement. Developing effective, legitimate and enforceable agreements could be complicated. This is especially the case if traditional methods or tools of policy-making are used.

Fundamental fairness principles could include responsibility, capability and rights (needs). For these principles, country characteristics can predict relative support.

After recovering from the COVID-19 pandemic, energy company profits increased with greater revenues from higher fuel prices resulting from the Russian invasion of Ukraine, falling debt levels, tax write-downs of projects shut down in Russia, and backing off from earlier plans to reduce greenhouse gas emissions. Record profits sparked public calls for windfall taxes.

History

Developed countries, as the main cause of climate change, in assuming their historical responsibility, must recognize and honor their climate debt in all of its dimensions as the basis for a just, effective, and scientific solution to climate change. (...) The focus must not be only on financial compensation, but also on restorative justice, understood as the restitution of integrity to our Mother Earth and all its beings.

— World People's Conference on Climate Change and the Rights of Mother Earth, People's Agreement, April 2010, Cochabamba, Bolivia

Though the U.S.'s per capita and per-GDP emissions have declined significantly, the raw numerical decline in emissions is much less substantial. Growing populations and increased economic activity work against mitigation attempts.

The concept of climate justice was deeply influential on climate negotiations years before the term "climate justice" was regularly applied to the concept. There have since been a multitude of frameworks written and used in environmental legislation, such as the 2002 Bali Principles of Climate Justice.

Climate justice issues have been found to have roots in historical inequities and exploitative practices. A prime example is the set of climate justice issues that have stemmed from colonialism. These issues, while specific to their locations, often have very similar roots and effects worldwide. According to environmental scholars such as Kyle Powys Whyte, Zoe Todd, and Dina Gilio-Whitaker, early colonization was focused on extraction and included practices such as clear-cutting and unsustainable agriculture as means of getting as many resources out of the land as possible. Examples have been seen worldwide, with British colonialism in Ireland seen as a predecessor of what would eventually occur in the United States. These practices also affected Indigenous populations, creating areas where environmental justice violations could easily take place. Concepts like Manifest Destiny and laws like the Indian Removal Act (1830) and Dawes Act (1887) allowed the exploitation of these communities in pursuit of expansion and progress.

In December 1990 the United Nations appointed an Intergovernmental Negotiating Committee (INC) to draft what became the Framework Convention on Climate Change (FCCC), adopted at the UN Conference on Environment and Development (UNCED) in Rio de Janeiro in June 1992. As the name "Environment and Development" indicated, the fundamental goal was to coordinate action on climate change with action on sustainable development. It was impossible to draft the text of the FCCC without confronting central questions of climate justice concerning how to share the responsibilities of slowing climate change fairly between developed and developing nations.

The issue of the fair terms for sharing responsibility was raised forcefully for the INC by statements about climate justice from developing countries. In response, the FCCC adopted the now-famous (and still-contentious) principles of climate justice embodied in Article 3.1: "The Parties should protect the climate system for the benefit of present and future generations of humankind, on the basis of equity and in accordance with their common but differentiated responsibilities and respective capabilities. Accordingly, the developed country Parties should take the lead in combating climate change and the adverse effects thereof." The first principle of climate justice embedded in Article 3.1 is that calculations of benefits (and burdens) must include not only those for the present generation but also those for future generations. The second is that responsibilities are "common but differentiated", that is, every country has some responsibilities, but equitable responsibilities are different for different types of countries. The third is that a crucial instance of different responsibilities is that in fairness developed countries' responsibilities must be greater. How much greater continues to be debated politically.

In 2000, at the same time as the Sixth Conference of the Parties (COP 6), the first Climate Justice Summit took place in The Hague. This summit aimed to "affirm that climate change is a rights issue" and to "build alliances across states and borders" against climate change and in favor of sustainable development.

Subsequently, in August–September 2002, international environmental groups met in Johannesburg for the Earth Summit. At this summit, also known as Rio+10 because it took place ten years after the 1992 Earth Summit, the Bali Principles of Climate Justice were adopted. These framed climate justice as a social and human rights issue rather than a technical or logistical problem, with an emphasis on the right to life and the importance of community in the protection of environmental rights. The Bali Principles push for offending parties, such as the oil industry and Global North nations, to take responsibility for climate change. They also address issues of equity between disadvantaged groups and encourage protection of the environment for the sake of future generations.

Climate Justice affirms the rights of communities dependent on natural resources for their livelihood and cultures to own and manage the same in a sustainable manner, and is opposed to the commodification of nature and its resources.

— Bali Principles of Climate Justice, article 18, August 29, 2002

In 2004, the Durban Group for Climate Justice was formed at an international meeting in Durban, South Africa. Here representatives from NGOs and peoples' movements discussed realistic policies for addressing climate change.

In 2007 at the 13th Conference of the Parties (COP 13) in Bali, the global coalition Climate Justice Now! was founded, and, in 2008, the Global Humanitarian Forum focused on climate justice at its inaugural meeting in Geneva.

In 2009, the Climate Justice Action Network was formed during the run-up to the Copenhagen Summit. It proposed civil disobedience and direct action during the summit, and many climate activists used the slogan 'system change not climate change'.

In April 2010, the World People's Conference on Climate Change and the Rights of Mother Earth took place in Tiquipaya, Bolivia. It was hosted by the government of Bolivia as a global gathering of civil society and governments. The conference published a "People's Agreement" calling, among other things, for greater climate justice.

In September 2013 the Climate Justice Dialogue convened by the Mary Robinson Foundation and the World Resources Institute released their Declaration on Climate Justice in an appeal to those drafting the proposed agreement to be negotiated at COP-21 in Paris in 2015.

In December 2018, the People's Demands for Climate Justice, signed by 292,000 individuals and 366 organizations, called upon government delegates at COP24 to comply with a list of six climate justice demands. One of the demands was to "Ensure developed countries honor their "Fair Shares" for largely fueling this crisis."

Some progress was achieved at the Paris climate finance summit in June 2023. The World Bank agreed to allow low-income countries to temporarily stop paying debts if they are hit by a climate disaster. Most financial help to climate-vulnerable countries comes in the form of loans, which often worsens the situation, as those countries are already overburdened with debt. Around 300 billion dollars was pledged as financial help for the coming years, but trillions are needed to really solve the problem. More than 100 leading economists signed a letter calling for a tax on extreme wealth as a solution (a 2% tax could generate around $2.5 trillion). Such a tax could serve as a loss and damage mechanism, as the richest 1% of people are responsible for twice as much emissions as the poorest 50%.

Examples

Subsistence farmers in Latin America

Several studies that investigated the impacts of climate change on agriculture in Latin America suggest that in the poorer countries of Latin America, agriculture constitutes the most important economic sector and the primary form of sustenance for small farmers. Maize is the only grain still produced as a subsistence crop on small farms in Latin American nations. The projected decrease of this grain and other crops threatens the welfare and economic development of subsistence communities in Latin America. Food security is of particular concern to rural areas with weak or non-existent food markets to rely on in the case of food shortages. In August 2019, Honduras declared a state of emergency when a drought caused the southern part of the country to lose 72% of its corn and 75% of its beans. Food security issues are expected to worsen across Central America due to climate change. It is predicted that by 2070, corn yields in Central America may fall by 10%, beans by 29%, and rice by 14%. With Central American crop consumption dominated by corn (70%), beans (25%), and rice (6%), the expected drop in staple crop yields could have devastating consequences.
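The arithmetic behind that last claim can be made explicit with a consumption-weighted average of the projected declines. This is an illustrative simplification, not a method from the cited studies (note the quoted consumption shares sum to 101%):

    # Consumption-weighted average decline in staple crop yields by 2070,
    # using the projections quoted above. Illustrative aggregation only.
    consumption_share = {"corn": 0.70, "beans": 0.25, "rice": 0.06}
    projected_decline = {"corn": 0.10, "beans": 0.29, "rice": 0.14}

    weighted = sum(consumption_share[c] * projected_decline[c]
                   for c in consumption_share)
    print(f"{weighted:.1%}")  # ~15.1%

Under these assumptions, the region would lose roughly 15% of its staple crop supply.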

The expected impacts of climate change on subsistence farmers in Latin America and other developing regions are unjust for two reasons. First, subsistence farmers in developing countries, including those in Latin America, are disproportionately vulnerable to climate change. Second, these nations were the least responsible for causing the problem of anthropogenic climate change.

Disproportionate vulnerability to climate disasters is socially determined. For example, socioeconomic and policy trends affecting smallholder and subsistence farmers limit their capacity to adapt to change. A history of policies and economic dynamics has negatively impacted rural farmers. During the 1950s and through the 1980s, high inflation and appreciated real exchange rates reduced the value of agricultural exports, so farmers in Latin America received lower prices for their products compared to world market prices. In response, Latin American policies and national crop programs aimed to stimulate agricultural intensification, but these programs benefited larger commercial farmers more. In the 1980s and 1990s, low world market prices for cereals and livestock resulted in decreased agricultural growth and increased rural poverty.

Perceived vulnerability to climate change differs even within communities, as in the example of subsistence farmers in Calakmul, Mexico.

Adaptive planning is challenged by the difficulty of predicting local-scale climate change impacts. A crucial component of adaptation should be government efforts to lessen the effects of food shortages and famines. Planning for equitable adaptation and agricultural sustainability will require the engagement of farmers in decision-making processes.

Hurricane Katrina

A house is crushed and swept off its foundations by flooding from a breached levee in the Ninth Ward, New Orleans, Louisiana, due to a storm surge from Hurricane Katrina. Around 90% of the Ninth Ward's population is black.

Due to climate change, tropical cyclones are expected to increase in intensity, bring more rainfall, and produce larger storm surges. These changes are driven by rising sea temperatures and the increased maximum water vapor content of the atmosphere as the air heats up. Hurricane Katrina in 2005 showed how climate change disasters affect different people differently, as it had a disproportionate effect on low-income and minority groups. A study on the race and class dimensions of Hurricane Katrina suggests that those most vulnerable included poor, black, brown, elderly, sick, and homeless people. Low-income and black communities had few resources and limited mobility to evacuate before the storm. After the hurricane, low-income communities were most affected by contamination, which was made worse by the failure of government relief measures to adequately assist those most at risk.

Pakistan Floods (2022)

In 2022, Pakistan faced catastrophic floods that affected over 33 million people and resulted in significant loss of life and property. The unprecedented monsoon rains and melting glaciers, attributed to climate change, submerged one-third of the country under water. Despite contributing less than 1% to global greenhouse gas emissions, Pakistan is disproportionately impacted by climate-induced disasters. This situation highlights the essence of climate justice, emphasizing how nations with minimal contributions to global emissions suffer the most severe consequences.

Chlordecone use in the French Antilles

The islands of Martinique and Guadeloupe are heavily contaminated with chlordecone, following years of its massive unrestricted use on banana plantations in the region. Chlordecone was banned globally by the Stockholm Convention on Persistent Organic Pollutants in 2009. Since 2003, local authorities in the two islands have restricted the cultivation of various food crops because the soil is badly contaminated by chlordecone. A 2018 large-scale study by the French public health agency, Santé publique France, showed that 95% of the inhabitants of Guadeloupe and 92% of those of Martinique are contaminated by the chemical, far higher than the world average.

Linear no-threshold model

Different assumptions on the extrapolation of the cancer risk vs. radiation dose to low-dose levels, given a known risk at a high dose: (A) supra-linearity, (B) linear, (C) linear-quadratic, (D) hormesis.

The linear no-threshold model (LNT) is a dose-response model used in radiation protection to estimate stochastic health effects such as radiation-induced cancer, genetic mutations and teratogenic effects on the human body due to exposure to ionizing radiation. The model assumes a linear relationship between dose and health effects, even for very low doses where biological effects are more difficult to observe. The LNT model implies that all exposure to ionizing radiation is harmful, regardless of how low the dose is, and that the effect is cumulative over a lifetime.

The LNT model is commonly used by regulatory bodies as a basis for formulating public health policies that set regulatory dose limits to protect against the effects of radiation. The validity of the LNT model, however, is disputed, and other models exist: the threshold model, which assumes that very small exposures are harmless, the radiation hormesis model, which says that radiation at very small doses can be beneficial, and the supra-linear model. It has been argued that the LNT model may have created an irrational fear of radiation.
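The rival models can be made concrete with a small sketch. The Python functions below use purely illustrative parameters (none of the slopes, thresholds or exponents come from the source), scaled so the curves roughly agree at a high dose of 1 unit:

    import math

    def lnt(dose, slope=0.05):
        """Linear no-threshold: risk proportional to dose at every level."""
        return slope * dose

    def supra_linear(dose, slope=0.05, power=0.5):
        """Supra-linear: risk per unit dose is higher at low doses."""
        return slope * dose ** power

    def threshold(dose, slope=0.05, cutoff=0.1):
        """Threshold: no risk below the cutoff, linear above it."""
        return slope * max(0.0, dose - cutoff)

    def linear_quadratic(dose, alpha=0.02, beta=0.03):
        """Linear-quadratic: rises slowly at low doses, faster at high ones."""
        return alpha * dose + beta * dose ** 2

    def hormesis(dose, slope=0.05, benefit=0.1, scale=0.2):
        """Hormesis: a small net benefit at very low doses, harm at higher ones."""
        return slope * dose - benefit * dose * math.exp(-dose / scale)

    for d in (0.01, 0.1, 1.0):
        print(d, lnt(d), supra_linear(d), threshold(d),
              linear_quadratic(d), hormesis(d))

Only the LNT curve passes through the origin with a constant slope, which is what makes risk strictly proportional, and cumulative, at every dose.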

Scientific organizations and government regulatory bodies generally support the use of the LNT model, particularly for optimization. However, some caution against estimating health effects from doses below a certain level (see § Controversy).

Introduction

Stochastic health effects are those that occur by chance, and whose probability is proportional to the dose, but whose severity is independent of the dose. The LNT model assumes there is no lower threshold at which stochastic effects start, and assumes a linear relationship between dose and the stochastic health risk. In other words, LNT assumes that radiation has the potential to cause harm at any dose level, however small, and the sum of several very small exposures is just as likely to cause a stochastic health effect as a single larger exposure of equal dose value. In contrast, deterministic health effects are radiation-induced effects such as acute radiation syndrome, which are caused by tissue damage. Deterministic effects reliably occur above a threshold dose and their severity increases with dose. Because of the inherent differences, LNT is not a model for deterministic effects, which are instead characterized by other types of dose-response relationships.
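That additivity claim is easy to check numerically. In the sketch below (the coefficients are hypothetical), ten exposures of 0.1 dose units carry the same total risk as a single exposure of 1 unit under LNT, but a smaller total under a linear-quadratic alternative:

    SLOPE = 0.05                # hypothetical LNT excess risk per unit dose
    ALPHA, BETA = 0.02, 0.03    # hypothetical linear-quadratic coefficients

    def lnt_risk(dose):
        return SLOPE * dose

    def lq_risk(dose):
        return ALPHA * dose + BETA * dose ** 2

    single = 1.0            # one exposure of 1 dose unit
    split = [0.1] * 10      # ten exposures of 0.1 dose units each

    # LNT: totals match (up to float rounding); LQ: splitting lowers the risk.
    print(lnt_risk(single), sum(lnt_risk(d) for d in split))  # 0.05 vs 0.05
    print(lq_risk(single), sum(lq_risk(d) for d in split))    # 0.05 vs 0.023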

LNT is a common model for calculating the probability of radiation-induced cancer both at high doses, where epidemiological studies support its application, and, more controversially, at low doses, a dose region with lower predictive statistical confidence. Nonetheless, regulatory bodies such as the Nuclear Regulatory Commission (NRC) commonly use LNT as a basis for regulatory dose limits to protect against stochastic health effects, as found in many public health policies. Whether the LNT model describes the reality of small-dose exposures is disputed, and challenges to the LNT model used by the NRC for setting radiation protection regulations have been submitted. The NRC rejected the petitions in 2021 because "they fail to present an adequate basis supporting the request to discontinue use of the LNT model".

Other dose models include: the threshold model, which assumes that very small exposures are harmless, and the radiation hormesis model, which claims that radiation at very small doses can be beneficial. Because the current data is inconclusive, scientists disagree on which model should be used, though most national and international cancer research organizations explicitly endorse LNT for regulating exposures to low dose radiation. The model is sometimes used to quantify the cancerous effect of collective doses of low-level radioactive contaminations, which is controversial. Such practice has been criticized by the International Commission on Radiological Protection since 2007.

Origins

Increased risk of solid cancer with dose for A-bomb survivors, from the BEIR report. Notably, this exposure occurred as essentially a massive spike or pulse of radiation during the brief instant of the bomb's explosion, which, while somewhat similar to the environment of a CT scan, is wholly unlike the low dose rate of living in a contaminated area such as Chernobyl, where the dose rate is orders of magnitude smaller. LNT does not consider dose rate and is a one-size-fits-all approach based solely on total absorbed dose, which has not yet been verified in other settings. It has also been pointed out that bomb survivors inhaled carcinogenic benzopyrene from the burning cities, yet this is not factored in.

The association of exposure to radiation with cancer had been observed as early as 1902, six years after the discovery of X-rays by Wilhelm Röntgen and radioactivity by Henri Becquerel. In 1927, Hermann Muller demonstrated that radiation may cause genetic mutation. He also suggested mutation as a cause of cancer. Gilbert N. Lewis and Alex Olson, building on Muller's discovery of the effect of radiation on mutation, proposed a mechanism for biological evolution in 1928, suggesting that genomic mutation was induced by cosmic and terrestrial radiation, and first introduced the idea that such mutation may occur proportionally to the dose of radiation. Various laboratories, including Muller's, then demonstrated the apparent linear dose response of mutation frequency. Muller, who received a Nobel Prize for his work on the mutagenic effect of radiation in 1946, asserted in his Nobel lecture, "The Production of Mutations", that mutation frequency is "directly and simply proportional to the dose of irradiation applied" and that there is "no threshold dose".

The early studies were based on higher levels of radiation, which made it hard to establish the safety of low levels of radiation. Indeed, many early scientists believed that there may be a tolerance level, and that low doses of radiation may not be harmful. A 1955 study on mice exposed to low doses of radiation suggested that they may outlive control animals. The interest in the effects of radiation intensified after the dropping of atomic bombs on Hiroshima and Nagasaki, and studies were conducted on the survivors. Although compelling evidence on the effects of low doses of radiation was hard to come by, by the late 1940s the idea of LNT had become more popular due to its mathematical simplicity. In 1954, the National Council on Radiation Protection and Measurements (NCRP) introduced the concept of maximum permissible dose. In 1958, the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) assessed the LNT model and a threshold model, but noted the difficulty of acquiring "reliable information about the correlation between small doses and their effects either in individuals or in large populations". The United States Congress Joint Committee on Atomic Energy (JCAE) similarly could not establish whether there is a threshold or "safe" level of exposure; nevertheless, it introduced the concept of "As Low As Reasonably Achievable" (ALARA). ALARA would become a fundamental principle in radiation protection policy that implicitly accepts the validity of LNT. In 1959, the United States Federal Radiation Council (FRC) supported the concept of LNT extrapolation down to the low dose region in its first report.

By the 1970s, the LNT model had become accepted as the standard in radiation protection practice by a number of bodies. In 1972, the first report of National Academy of Sciences (NAS) Biological Effects of Ionizing Radiation (BEIR), an expert panel who reviewed available peer reviewed literature, supported the LNT model on pragmatic grounds, noting that while "dose-effect relationship for x rays and gamma rays may not be a linear function", the "use of linear extrapolation ... may be justified on pragmatic grounds as a basis for risk estimation." In its seventh report of 2006, NAS BEIR VII writes, "the committee concludes that the preponderance of information indicates that there will be some risk, even at low doses".

The Health Physics Society (in the United States) has published a documentary series on the origins of the LNT model.

Radiation precautions and public policy

Radiation precautions have led to sunlight being listed as a carcinogen at all sun exposure rates, due to the ultraviolet component of sunlight, with no safe level of sunlight exposure being suggested, following the precautionary LNT model. According to a 2007 study submitted by the University of Ottawa to the Department of Health and Human Services in Washington, D.C., there is not enough information to determine a safe level of sun exposure.

The linear no-threshold model is used to extrapolate the expected number of extra deaths caused by exposure to environmental radiation, and it therefore has a great impact on public policy. The model is used to translate any radiation release into a number of lives lost, while any reduction in radiation exposure, for example as a consequence of radon detection, is translated into a number of lives saved. When the doses are very low, the model predicts new cancers only in a very small fraction of the population, but for a large population the number of lives is extrapolated into hundreds or thousands.
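The extrapolation itself is simple multiplication of a collective dose by a risk coefficient. In the sketch below the coefficient is an assumption for illustration, of the same order as the roughly 5% per sievert nominal figure often cited for whole populations; whether such multiplication is legitimate for very small doses is exactly what is disputed (see § Controversy):

    # Collective-dose extrapolation under LNT; the coefficient is assumed.
    RISK_PER_SIEVERT = 0.055   # assumed excess lifetime cancer risk per Sv

    def predicted_excess_cases(population, dose_sv_per_person):
        collective_dose = population * dose_sv_per_person  # person-sieverts
        return collective_dose * RISK_PER_SIEVERT

    # Ten million people each receiving an extra 0.1 mSv (0.0001 Sv):
    print(predicted_excess_cases(10_000_000, 0.0001))  # -> 55.0 predicted cases

This is precisely the aggregation of very low doses over large populations that UNSCEAR and the ICRP caution against.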

A linear model has long been used in health physics to set maximum acceptable radiation exposures.

In 2025, Donald Trump issued an executive order that proposed implementing "determinate radiation limits" to replace the linear no-threshold model and ALARA principle. These changes were proposed to ease the licensing requirements on new nuclear power plants in the United States.

Controversy

The LNT model has been contested by a number of scientists. It has been claimed that the early proponent of the model Hermann Joseph Muller intentionally ignored an early study that did not support the LNT model when he gave his 1946 Nobel Prize Lecture advocating the model.

In very high-dose radiation therapy, it was known at the time that radiation can cause a physiological increase in the rate of pregnancy anomalies; however, human exposure data and animal testing suggest that the "malformation of organs appears to be a deterministic effect with a threshold dose", below which no rate increase is observed. A 1999 review of the link between the Chernobyl accident and teratology (birth defects) concluded that "there is no substantive proof regarding radiation-induced teratogenic effects from the Chernobyl accident". It is argued that the human body has defense mechanisms, such as DNA repair and programmed cell death, that would protect it against carcinogenesis due to low-dose exposures to carcinogens. However, these repair mechanisms are known to be error-prone.

A 2011 study of cellular repair mechanisms supports the evidence against the linear no-threshold model. According to its authors, this study, published in the Proceedings of the National Academy of Sciences of the United States of America, "casts considerable doubt on the general assumption that risk to ionizing radiation is proportional to dose".

A 2011 review of studies addressing childhood leukaemia following exposure to ionizing radiation, including both diagnostic exposure and natural background exposure from radon, concluded that the existing risk factor, excess relative risk per sievert (ERR/Sv), is "broadly applicable" to low dose or low dose-rate exposure, "although the uncertainties associated with this estimate are considerable". The study also notes that "epidemiological studies have been unable, in general, to detect the influence of natural background radiation upon the risk of childhood leukaemia".

Many expert scientific panels have been convened on the risks of ionizing radiation. Most explicitly support the LNT model and none have concluded that evidence exists for a threshold, with the exception of the French Academy of Sciences in a 2005 report. Considering the uncertainty of health effects at low doses, several organizations caution against estimating health effects below certain doses, generally below natural background, as noted below:

  • The US Nuclear Regulatory Commission upheld the LNT model in 2021 as a "sound regulatory basis for minimizing the risk of unnecessary radiation exposure to both members of the public and radiation workers" following challenges to the dose limit requirements contained in its regulations.

    Based upon the current state of science, the NRC concludes that the actual level of risk associated with low doses of radiation remains uncertain and some studies, such as the INWORKS study, show there is at least some risk from low doses of radiation. Moreover, the current state of science does not provide compelling evidence of a threshold, as highlighted by the fact that no national or international authoritative scientific advisory bodies have concluded that such evidence exists. Therefore, based upon the stated positions of the aforementioned advisory bodies; the comments and recommendations of NCI, NIOSH, and the EPA; the October 28, 2015, recommendation of the ACMUI; and its own professional and technical judgment, the NRC has determined that the LNT model continues to provide a sound regulatory basis for minimizing the risk of unnecessary radiation exposure to both members of the public and occupational workers. Consequently, the NRC will retain the dose limits for occupational workers and members of the public in 10 CFR part 20 radiation protection regulations.

  • In 2005 the United States National Academies' National Research Council published its comprehensive meta-analysis of low-dose radiation research BEIR VII, Phase 2. In its press release the Academies stated:

    The scientific research base shows that there is no threshold of exposure below which low levels of ionizing radiation can be demonstrated to be harmless or beneficial.

  • In a 2005 report, the International Commission on Radiological Protection stated: "The report concludes that while existence of a low-dose threshold does not seem to be unlikely for radiation-related cancers of certain tissues, the evidence does not favour the existence of a universal threshold. The LNT hypothesis, combined with an uncertain DDREF for extrapolation from high doses, remains a prudent basis for radiation protection at low doses and low dose rates." In a 2007 report, ICRP noted that collective dose is effective for optimization, but aggregation of very low doses to estimate excess cancers is inappropriate because of large uncertainties.
  • The National Council on Radiation Protection and Measurements (a body commissioned by the United States Congress), in a 2018 report, "concludes that the recent epidemiological studies support the continued use of LNT model (with the steepness of the dose-response slope perhaps reduced by a DDREF factor) for radiation protection. This is in accord with judgments by other national and international scientific committees, based on somewhat older data, that no alternative dose-response relationship appears more pragmatic or prudent for radiation protection purposes than the LNT model."
  • The United States Environmental Protection Agency endorses the LNT model in its 2011 report on radiogenic cancer risk:

Underlying the risk models is a large body of epidemiological and radiobiological data. In general, results from both lines of research are consistent with a linear, no-threshold dose (LNT) response model in which the risk of inducing a cancer in an irradiated tissue by low doses of radiation is proportional to the dose to that tissue.

  • UNSCEAR stated in Appendix C of its 2020/2021 report:

    The Committee concluded that there remains good justification for the use of a non-threshold model for risk inference given the robust knowledge on the role of mutation and chromosomal aberrations in carcinogenesis. That said, there are ways that radiation could act that might lead to a re-evaluation of the use of a linear dose-response model to infer radiation cancer risks.

A number of organisations caution against using the linear no-threshold model to estimate risk from radiation exposure below a certain level:

  • The French Academy of Sciences (Académie des sciences) and the National Academy of Medicine (Académie nationale de médecine) published a report in 2005 (at the same time as BEIR VII report in the United States) that rejected the linear no-threshold model in favor of a threshold dose response and a significantly reduced risk at low radiation exposure:

    In conclusion, this report raises doubts on the validity of using LNT for evaluating the carcinogenic risk of low doses (< 100 mSv) and even more for very low doses (< 10 mSv). The LNT concept can be a useful pragmatic tool for assessing rules in radioprotection for doses above 10 mSv; however since it is not based on biological concepts of our current knowledge, it should not be used without precaution for assessing by extrapolation the risks associated with low and even more so, with very low doses (< 10 mSv), especially for benefit-risk assessments imposed on radiologists by the European directive 97-43.

  • The Health Physics Society's position statement first adopted in January 1996, last revised in February 2019, states:

    The Health Physics Society advises against estimating health risks to people from exposures to ionizing radiation that are near or less than natural background levels because statistical uncertainties at these low levels are great.

  • The American Nuclear Society states that the LNT model may not adequately describe the relationship between harm and exposure and notes the recommendation in ICRP-103 "that the LNT model not be used for estimating the health effects of trivial exposures received by large populations over long periods of time…" It further recommends additional research.
  • UNSCEAR stated in its 2012 report:

    The Scientific Committee does not recommend multiplying very low doses by large numbers of individuals to estimate numbers of radiation-induced health effects within a population exposed to incremental doses at levels equivalent to or lower than natural background levels.

Mental health effects

It has been argued that the LNT model has caused an irrational fear of radiation whose observable effects are much more significant than the non-observable effects postulated by LNT. In the wake of the 1986 Chernobyl accident in Ukraine, anxieties were fomented in pregnant mothers across Europe over the perception, reinforced by the LNT model, that their children would be born with a higher rate of mutations. As far afield as Switzerland, hundreds of excess induced abortions were performed on the healthy unborn out of this no-threshold fear. Following the accident, however, studies of data sets approaching a million births in the EUROCAT database, divided into "exposed" and control groups, were assessed in 1999. As no Chernobyl impacts were detected, the researchers concluded that "in retrospect the widespread fear in the population about the possible effects of exposure on the unborn was not justified". Despite studies from Germany and Turkey, the only robust evidence of negative pregnancy outcomes after the accident were these elective-abortion indirect effects, in Greece, Denmark, Italy and elsewhere, due to the anxieties created.

The consequences of low-level radiation are often more psychological than radiological. Because damage from very-low-level radiation cannot be detected, people exposed to it are left in anguished uncertainty about what will happen to them. Many believe they have been fundamentally contaminated for life and may refuse to have children for fear of birth defects. They may be shunned by others in their community who fear a sort of mysterious contagion.

Forced evacuation from a radiation or nuclear accident may lead to social isolation, anxiety, depression, psychosomatic medical problems, reckless behavior, or suicide. Such was the outcome of the 1986 Chernobyl nuclear disaster in Ukraine. A comprehensive 2005 study concluded that "the mental health impact of Chernobyl is the largest public health problem unleashed by the accident to date". Frank N. von Hippel, a U.S. scientist, commented on the 2011 Fukushima nuclear disaster, saying that "fear of ionizing radiation could have long-term psychological effects on a large portion of the population in the contaminated areas".

Such great psychological danger does not accompany other materials that put people at risk of cancer and other deadly illnesses. Visceral fear is not widely aroused by, for example, the daily emissions from coal burning, although, as a National Academy of Sciences study found, these emissions cause 10,000 premature deaths a year in the US. It is "only nuclear radiation that bears a huge psychological burden – for it carries a unique historical legacy".

Thalidomide scandal

From Wikipedia, the free encyclopedia
Feet of a baby born to a mother who had taken thalidomide while pregnant

In the late 1950s and early 1960s, thalidomide was prescribed in 46 countries to women who were pregnant or who subsequently became pregnant. The result was the "biggest anthropogenic medical disaster ever": more than 10,000 children born with a range of severe deformities, such as phocomelia, as well as thousands of miscarriages.

Thalidomide was introduced in 1957 as a tranquilizer and was later marketed by the West German pharmaceutical company Chemie Grünenthal under the trade name Contergan as a medication for anxiety, trouble sleeping, tension, and morning sickness. It was sold as a sedative and morning-sickness medication without ever having been tested on pregnant women. While initially deemed safe in pregnancy, concerns regarding birth defects were raised in 1961, and the medication was withdrawn from the European market that year.

Development of thalidomide

Thalidomide was first developed as a tranquilizer by the Swiss pharmaceutical company Ciba in 1953. In 1954, Ciba abandoned the product, and it was acquired by the German pharmaceutical company Chemie Grünenthal. The company had been established after World War II by Hermann Wirtz Sr., a Nazi Party member, as a subsidiary of the family's Mäurer & Wirtz company. Its initial aim was to develop antibiotics, for which there was an urgent market need. Wirtz brought many former Nazi associates into the company.

Birth defect crisis

The total number of embryos affected by the use of thalidomide during pregnancy is estimated at more than 10,000, and potentially up to 20,000; of these, approximately 40 percent died at or shortly after the time of birth. Those who survived had limb, eye, urinary tract, and heart defects. Its initial entry into the U.S. market was prevented by Frances Oldham Kelsey at the U.S. Food and Drug Administration (FDA). The birth defects of thalidomide led to the development of greater drug regulation and monitoring in many countries.

The severity and location of the deformities depended on how many days into the pregnancy the mother was when treatment began: thalidomide taken on the 20th day of pregnancy caused central brain damage, on day 21 it damaged the eyes, on day 22 the ears and face, on day 24 the arms, and leg damage occurred if it was taken up to day 28. Thalidomide did not damage the fetus if taken after 42 days' gestation.
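
The timing relationship described above amounts to a simple lookup table. The sketch below encodes the day-of-gestation windows from this paragraph as data; the exact boundaries between the named days are an assumption, and the function is an illustration of the data, not a clinical tool.

```python
# Illustrative encoding of the gestation-day/defect relationship described
# above. Windows come from the text; boundary handling is an assumption.
TIMING_EFFECTS = [
    (20, 20, "central brain damage"),
    (21, 21, "eye damage"),
    (22, 22, "ear and face damage"),
    (24, 24, "arm damage"),
    (25, 28, "leg damage"),
]

def effect_of_exposure(day: int) -> str:
    """Return the defect the text associates with exposure on a given day."""
    if day > 42:
        return "no fetal damage reported"
    for start, end, effect in TIMING_EFFECTS:
        if start <= day <= end:
            return effect
    return "window not described in the text"

print(effect_of_exposure(22))  # ear and face damage
print(effect_of_exposure(50))  # no fetal damage reported
```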

United Kingdom

Artificial limbs made for an affected child in the 1960s by the Department of Health and Social Security's Limb Fitting Centre in Roehampton, London

In the UK, the drug was licensed in 1958 and withdrawn in 1961. Of the approximately 2,000 babies born with defects, around half died within a few months and 466 survived to at least 2010. In 1968, after a long campaign by The Sunday Times, a compensation settlement for the UK victims was reached with Distillers Company (now part of Diageo), which had distributed the drug in the UK. Distillers Biochemicals paid out approximately £28m in compensation following a legal battle.

The British Thalidomide Children's Trust was set up in 1973 as part of a £20 million legal settlement between Distillers Company and 429 children with thalidomide-related disabilities. In 1997, Diageo (formed by a merger between Grand Metropolitan and Guinness, who had taken over Distillers in 1990) made a long-term financial commitment to support the Thalidomide Trust and its beneficiaries. The UK government gave survivors a grant of £20 million, to be distributed through the Thalidomide Trust, in December 2009.

Spain

In Spain, thalidomide was widely available throughout the 1970s, and perhaps even into the 1980s. There were two reasons for this. First, state controls and safeguards were poor; it was not until 2008 that the government even admitted the country had ever imported thalidomide. Second, Grünenthal failed to insist that its sister company in Madrid warn Spanish doctors of the drug's defects. The Spanish advocacy group for victims of thalidomide estimated that in 2015 there were 250–300 living victims of thalidomide in Spain.

Australia and New Zealand

Australian obstetrician William McBride raised concern about thalidomide after a midwife called Sister Pat Sparrow first suspected the drug was causing birth defects in the babies of patients under McBride's care at Crown Street Women's Hospital in Sydney. German paediatrician Widukind Lenz, who also suspected the link, is credited with conducting the scientific research that proved thalidomide was causing birth defects in 1961. Further animal tests were conducted by George Somers, Chief Pharmacologist of Distillers Company in Britain, which showed fetal abnormalities in rabbits. Similar results were also published showing these effects in rats and other species.

Lynette Rowe, who was born without limbs, led an Australian class action lawsuit against the drug's manufacturer, Grünenthal, which fought to have the case heard in Germany. The Supreme Court of Victoria dismissed Grünenthal's application in 2012, and the case was heard in Australia. On 17 July 2012, Rowe was awarded an out-of-court settlement, believed to be in the millions of dollars, setting a precedent for other class action members to receive further compensation. In February 2014, the Supreme Court of Victoria endorsed an A$89 million settlement for 107 victims of the drug in Australia and New Zealand.

Germany

In East Germany, thalidomide was rejected by the Central Committee of Experts for the Drug Traffic in the GDR and was never approved for use; no thalidomide children are known to have been born there. In West Germany, by contrast, it took some time before the increase in dysmelia at the end of the 1950s was connected with thalidomide. In 1958, Karl Beck, a former pediatric doctor in Bayreuth, wrote an article in a local newspaper claiming a relationship between nuclear weapons testing and cases of dysmelia in children. Based on this, FDP leader Erich Mende requested an official statement from the federal government. By statistical chance, the main data series used to research dysmelia cases began at the same time as thalidomide's approval date. Because the Nazi regime had used mandatory statistical monitoring under its Law for the Prevention of Hereditarily Diseased Offspring to commit various crimes, West Germany had been very reluctant to monitor congenital disorders in a similarly strict way. The parliamentary report rejected any relationship between radioactivity and the abnormal increase in dysmelia, and the DFG research project set up after Mende's request was not helpful either: it was led by the pathologist Franz Büchner, who used it to propagate his teratological theory that poor maternal nutrition and behavior mattered more than genetic causes. Furthermore, it took a while for Germany to appoint a Surgeon General equivalent; the Federal Ministry of Health was not founded until 1962, some months after thalidomide was banned from the market. In West Germany, approximately 2,500 children were born with thalidomide-induced birth defects.

Canada

Despite its severe side effects, thalidomide remained on sale in Canadian pharmacies until 1962. Its effects heightened fears about the safety of pharmaceutical drugs, and the Society of Toxicology of Canada was formed after those effects were made public, treating toxicology as a discipline separate from pharmacology. The need to test pharmaceutical drugs for toxicity before approval became more widely accepted after the disaster. The Society of Toxicology of Canada is responsible for the Conservation Environment Protection Act, focusing on research into the impact of chemical substances on human health. Thalidomide brought on changes in how drugs are tested and in which drugs are used during pregnancy, and it increased awareness of the potential side effects of drugs.

According to Canadian news magazine programme W5, most, but not all, victims of thalidomide receive annual benefits as compensation from the Government of Canada. Excluded are those who cannot provide the documentation the government requires.

A group of 120 Canadian survivors formed the Thalidomide Victims Association of Canada, whose goal is to prevent the approval of drugs that could harm pregnant individuals and babies. Members of the association were involved in the STEPS programme, which aimed to prevent teratogenicity.

United States

1962: FDA pharmacologist Frances Oldham Kelsey receives the President's Award for Distinguished Federal Civilian Service from President John F. Kennedy for blocking sale of thalidomide in the United States.

In the U.S., the FDA refused approval to market thalidomide, saying further studies were needed, which limited the drug's impact on American patients. The refusal was largely due to the pharmacologist Frances Oldham Kelsey, who withstood pressure from the Richardson-Merrell Pharmaceuticals Co. Although thalidomide was not approved for sale in the United States at the time, over 2.5 million tablets had been distributed to over 1,000 physicians during a clinical testing programme. It is estimated that nearly 20,000 patients, several hundred of whom were pregnant, were given the drug as a sedative or to help alleviate morning sickness, and at least 17 children were consequently born in the United States with thalidomide-associated deformities. While pregnant, the children's television host Sherri Finkbine took thalidomide that her husband had purchased over the counter in Europe. When she learned that thalidomide was causing fetal deformities, she wanted to abort her pregnancy, but Arizona law allowed abortion only if the mother's life was in danger, so she traveled to Sweden for the procedure. The fetus was found to have been deformed by the drug.

For denying the application despite the pressure from Richardson-Merrell Pharmaceuticals Co., Kelsey eventually received the President's Award for Distinguished Federal Civilian Service at a 1962 ceremony with President John F. Kennedy. In September 2010, the FDA honored Kelsey with the first Kelsey award, given annually to an FDA staff member. This came 50 years after Kelsey, then a new medical officer at the agency, first reviewed the application from the William S. Merrell Pharmaceuticals Company of Cincinnati.

Cardiologist Helen B. Taussig learned of the damaging effects of thalidomide on newborns and, in 1967, testified before Congress on the matter after a trip to Germany, where she worked with infants with phocomelia (severe limb deformities). As a result of her efforts, thalidomide was banned in the United States and Europe.

Austria

Ingeborg Eichler, a member of the Austrian pharmaceutical admission conference, enforced restrictions on the sale of thalidomide (tradename Softenon) under the rules of prescription medication and as a result relatively few affected children were born in Austria and Switzerland.

Japan

In Japan, there are 300 victims of this drug.

Aftermath of scandal

Thalidomide Memorial in Cardiff, Wales

The numerous reports of malformations in babies raised awareness of the drug's side effects on pregnant women. The birth defects caused by thalidomide range from moderate malformation to more severe forms, including phocomelia, dysmelia, amelia, bone hypoplasticity, and other congenital defects affecting the ear, heart, or internal organs. Franks et al. examined how the drug affected newborn babies and the severity of their deformities, and reviewed the drug in its early years; Webb in 1963 also reviewed the history of the drug and the different forms of birth defects it had caused. "The most common form of birth defects from thalidomide is shortened limbs, with the arms being more frequently affected. This syndrome is the presence of deformities of the long bones of the limbs resulting in shortening and other abnormalities."

Grünenthal criminal trial

In 1968, a large criminal trial began in West Germany, charging several Grünenthal officials with negligent homicide and injury. After Grünenthal settled with the victims in April 1970, the trial ended in December 1970 with no finding of guilt. As part of the settlement, Grünenthal paid 100 million DM into a special foundation; the West German government added 320 million DM. The foundation paid victims a one-time sum of 2,500–25,000 DM (depending on severity of disability) and a monthly stipend of 100–450 DM. The monthly stipends have since been raised substantially and are now paid entirely by the government (as the foundation had run out of money). Grünenthal paid another €50 million into the foundation in 2008.

On 31 August 2012, Grünenthal chief executive Harald F. Stock, who served as chief executive officer of Grünenthal GmbH from January 2009 to 28 May 2013, apologized for the first time for producing the drug and remaining silent about the birth defects. At a ceremony, Stock unveiled a statue of a disabled child to symbolize those harmed by thalidomide and apologized for the company's failure to reach out to victims for over 50 years. At the time of the apology, between 5,000 and 6,000 people were still living with thalidomide-related birth defects. Victim advocates called the apology "insulting" and "too little, too late", and criticized the company for not compensating victims and for claiming that no one could have known the harm the drug caused, arguing that there had been plenty of red flags at the time.

Australian National Memorial

On 13 November 2023, the Australian Government announced its intention to make a formal apology to people affected by thalidomide with the unveiling of a national memorial site. Prime Minister Anthony Albanese described the thalidomide tragedy as a "dark chapter" in Australian history, and Health Minister Mark Butler said, "While we cannot change the past or end the physical suffering, I hope these important next steps of recognition and apology will help heal some of the emotional wounds."

Notable cases

Niko von Glasow, German filmmaker
  • Mercédes Benegbi, born with phocomelia of both arms, drove the successful campaign for compensation from her government for Canadians who were affected by thalidomide.
  • Mat Fraser, born with phocomelia of both arms, is an English rock musician, actor, writer and performance artist. He produced a 2002 television documentary, Born Freak, which looked at the historical tradition of the freak show and its relevance to modern disabled performers. This work has become the subject of academic analysis in the field of disability studies.
  • Niko von Glasow, a thalidomide survivor, produced a documentary called NoBody's Perfect, based on the lives of 12 people affected by the drug, which was released in 2008.
  • Josée Lake is a Canadian Paralympic gold medallist swimmer, thalidomide survivor, and president of the Thalidomide Victims Association of Canada.
  • Lorraine Mercer MBE of the United Kingdom, born with phocomelia of both arms and legs, is the only thalidomide survivor to carry the Olympic Torch.
  • Thomas Quasthoff, an internationally acclaimed bass-baritone, who describes himself: "1.34 meters tall, short arms, seven fingers — four right, three left — large, relatively well-formed head, brown eyes, distinctive lips; profession: singer".
  • Alvin Law, Canadian motivational speaker and former radio broadcaster.

Change in drug regulations

The disaster prompted many countries to introduce tougher rules for the testing and licensing of drugs, such as the Kefauver Harris Amendment (US), Directive 65/65/EEC (EU), and the Medicines Act 1968 (UK). In the United States, the new regulations strengthened the FDA by, among other things, requiring applicants to prove efficacy and to disclose all side effects encountered in testing. The FDA subsequently initiated the Drug Efficacy Study Implementation to reclassify drugs already on the market.

Superintelligence

From Wikipedia, the free encyclopedia

A superintelligence is a hypothetical agent that possesses intelligence surpassing that of the most gifted human minds. Philosopher Nick Bostrom defines superintelligence as "any intellect that greatly exceeds the cognitive performance of humans in virtually all domains of interest".

Technological researchers disagree about how likely present-day human intelligence is to be surpassed. Some argue that advances in artificial intelligence (AI) will probably result in general reasoning systems that lack human cognitive limitations. Others believe that humans will evolve or directly modify their biology to achieve radically greater intelligence. Several futures studies scenarios combine elements of both possibilities, suggesting that humans are likely to interface with computers, or upload their minds to computers, in a way that enables substantial intelligence amplification. The hypothetical creation of the first superintelligence may or may not result from an intelligence explosion or a technological singularity.

Some researchers believe that superintelligence will likely follow shortly after the development of artificial general intelligence. The first generally intelligent machines are likely to hold an enormous immediate advantage in at least some forms of mental capability, including the capacity for perfect recall, a vastly superior knowledge base, and the ability to multitask in ways not possible for biological entities.

Several scientists and forecasters have been arguing for prioritizing early research into the possible benefits and risks of human and machine cognitive enhancement, because of the potential social impact of such technologies.

Feasibility of artificial superintelligence

Artificial intelligence, especially foundation models, has made rapid progress, surpassing human capabilities in various benchmarks.

The creation of artificial superintelligence (ASI) has been a topic of increasing discussion in recent years, particularly with the rapid advancements in artificial intelligence (AI) technologies.

Progress in AI and claims of AGI

Recent developments in AI, particularly in large language models (LLMs) based on the transformer architecture, have led to significant improvements in various tasks. Models like GPT-3, GPT-4, GPT-5, Claude 3.5 and others have demonstrated capabilities that some researchers argue approach or even exhibit aspects of artificial general intelligence (AGI).

However, the claim that current LLMs constitute AGI is controversial. Critics argue that these models, while impressive, still lack true understanding and rely primarily on memorization.

Pathways to superintelligence

Philosopher David Chalmers argues that AGI is a likely path to ASI. He posits that AI can achieve equivalence to human intelligence, be extended to surpass it, and then be amplified to dominate humans across arbitrary tasks.

More recent research has explored various potential pathways to superintelligence:

  1. Scaling current AI systems – Some researchers argue that continued scaling of existing AI architectures, particularly transformer-based models, could lead to AGI and potentially ASI.
  2. Novel architectures – Others suggest that new AI architectures, potentially inspired by neuroscience, may be necessary to achieve AGI and ASI.
  3. Hybrid systems – Combining different AI approaches, including symbolic AI and neural networks, could potentially lead to more robust and capable systems.

Computational advantages

Artificial systems have several potential advantages over biological intelligence:

  1. Speed – Computer components operate much faster than biological neurons. Modern microprocessors (~2 GHz) are seven orders of magnitude faster than neurons (~200 Hz); a back-of-the-envelope check appears after this list.
  2. Scalability – AI systems can potentially be scaled up in size and computational capacity more easily than biological brains.
  3. Modularity – Different components of AI systems can be improved or replaced independently.
  4. Memory – AI systems can have perfect recall and vast knowledge bases, and they are far less constrained than humans in working memory.
  5. Multitasking – AI can perform multiple tasks simultaneously in ways not possible for biological entities.
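
As a back-of-the-envelope check on the speed figure in item 1, the ratio of the two clock rates can be computed directly. This is purely illustrative: a processor cycle and a neural spike are not equivalent units of computation.

```python
import math

cpu_hz = 2e9      # ~2 GHz modern microprocessor (figure from the list above)
neuron_hz = 200   # ~200 Hz biological neuron firing rate

ratio = cpu_hz / neuron_hz
print(f"ratio: {ratio:.0e} (~{math.log10(ratio):.0f} orders of magnitude)")
# ratio: 1e+07 (~7 orders of magnitude), matching the claim in the list.
```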

Potential path through transformer models

Recent advancements in transformer-based models have led some researchers to speculate that the path to ASI might lie in scaling up and improving these architectures. This view suggests that continued improvements in transformer models or similar architectures could lead directly to ASI.

Some experts even argue that current large language models like GPT-5 may already exhibit early signs of AGI or ASI capabilities. This perspective suggests that the transition from current AI to ASI might be more continuous and rapid than previously thought, blurring the lines between narrow AI, AGI, and ASI.

However, this view remains controversial. Critics argue that current models, while impressive, still lack crucial aspects of general intelligence such as true understanding, reasoning, and adaptability across diverse domains.

The debate over whether the path to ASI will involve a distinct AGI phase or a more direct scaling of current technologies is ongoing, with significant implications for AI development strategies and safety considerations.

Challenges and uncertainties

Despite these potential advantages, there are significant challenges and uncertainties in achieving ASI:

  1. Ethical and safety concerns – The development of ASI raises numerous ethical questions and potential risks that need to be addressed.
  2. Computational requirements – The computational resources required for ASI might be far beyond current capabilities.
  3. Fundamental limitations – There may be fundamental limitations to intelligence that apply to both artificial and biological systems.
  4. Unpredictability – The path to ASI and its consequences are highly uncertain and difficult to predict.

As research in AI continues to advance rapidly, the question of the feasibility of ASI remains a topic of intense debate and study in the scientific community.

Feasibility of biological superintelligence

Carl Sagan suggested that the advent of Caesarean sections and in vitro fertilization may permit humans to evolve larger heads, resulting in improvements via natural selection in the heritable component of human intelligence. By contrast, Gerald Crabtree has argued that decreased selection pressure is resulting in a slow, centuries-long reduction in human intelligence, and that this process is instead likely to continue. There is no scientific consensus concerning either possibility, and in both cases the biological change would be slow, especially relative to rates of cultural change.

Selective breeding, nootropics, epigenetic modulation, and genetic engineering could improve human intelligence more rapidly. Bostrom writes that if we come to understand the genetic component of intelligence, pre-implantation genetic diagnosis could be used to select for embryos with as much as 4 points of IQ gain (if one embryo is selected out of two), or with larger gains (e.g., up to 24.3 IQ points gained if one embryo is selected out of 1000). If this process is iterated over many generations, the gains could be an order of magnitude improvement. Bostrom suggests that deriving new gametes from embryonic stem cells could be used to iterate the selection process rapidly. A well-organized society of high-intelligence humans of this sort could potentially achieve collective superintelligence.
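
Bostrom's selection figures can be reproduced with a simple order-statistics model: treat each embryo's additive genetic contribution to IQ as a draw from a normal distribution and keep the maximum of n draws. The Monte Carlo sketch below assumes a standard deviation of about 7.5 IQ points among embryos of the same parents; that figure is an assumption consistent with the quoted numbers, not a value stated in this article.

```python
import numpy as np

rng = np.random.default_rng(0)
SIBLING_SD = 7.5  # assumed SD (in IQ points) of the additive genetic
                  # component among embryos of the same parents

def expected_gain(n_embryos: int, trials: int = 10_000) -> float:
    """Expected IQ gain from selecting the best of n embryos (Monte Carlo)."""
    draws = rng.normal(0.0, SIBLING_SD, size=(trials, n_embryos))
    return draws.max(axis=1).mean()

for n in (2, 10, 100, 1000):
    print(f"best of {n:>4} embryos: ~{expected_gain(n):4.1f} IQ points")
# Approximate output: 4.2, 11.5, 18.8 and 24.3 points, matching the
# figures quoted in the paragraph above.
```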

Alternatively, collective intelligence might be constructed by better organizing humans at present levels of individual intelligence. Several writers have suggested that human civilization, or some aspect of it (e.g., the Internet, or the economy), is coming to function like a global brain with capacities far exceeding its component agents. A prediction market is sometimes considered as an example of a working collective intelligence system, consisting of humans only (assuming algorithms are not used to inform decisions).
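
One concrete mechanism by which a prediction market turns individual bets into a collective estimate is Hanson's logarithmic market scoring rule (LMSR), under which the market maker's prices behave like consensus probabilities. The article does not name a specific market design, so the sketch below is only a minimal illustration; the liquidity parameter b and the example trade are arbitrary assumptions.

```python
import math

class LMSRMarket:
    """Minimal logarithmic market scoring rule (LMSR) market maker."""

    def __init__(self, outcomes, b=100.0):
        self.b = b                           # liquidity parameter (assumed)
        self.q = {o: 0.0 for o in outcomes}  # outstanding shares per outcome

    def _cost(self):
        return self.b * math.log(sum(math.exp(v / self.b) for v in self.q.values()))

    def price(self, outcome):
        """Current price, interpretable as the consensus probability."""
        total = sum(math.exp(v / self.b) for v in self.q.values())
        return math.exp(self.q[outcome] / self.b) / total

    def buy(self, outcome, shares):
        """Charge a trader the cost difference for buying shares."""
        before = self._cost()
        self.q[outcome] += shares
        return self._cost() - before

market = LMSRMarket(["yes", "no"])
print(f"initial P(yes) = {market.price('yes'):.2f}")   # 0.50
paid = market.buy("yes", 50)                           # one trader backs "yes"
print(f"after trade P(yes) = {market.price('yes'):.2f}, cost = {paid:.2f}")
# The price moves from 0.50 to about 0.62: the market has absorbed the
# trader's information into its collective estimate.
```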

A final method of intelligence amplification would be to directly enhance individual humans, as opposed to enhancing their social or reproductive dynamics. This could be achieved using nootropics, somatic gene therapy, or brain-computer interfaces. However, Bostrom expresses skepticism about the scalability of the first two approaches and argues that designing a superintelligent cyborg interface is an AI-complete problem.

Forecasts

Most surveyed AI researchers expect machines to eventually be able to rival humans in intelligence, though there is little consensus on when this will likely happen. At the 2006 AI@50 conference, 18% of attendees reported expecting machines to be able "to simulate learning and every other aspect of human intelligence" by 2056; 41% of attendees expected this to happen sometime after 2056; and 41% expected machines to never reach that milestone.

In a survey of the 100 most cited authors in AI (as of May 2013, according to Microsoft academic search), the median year by which respondents expected machines "that can carry out most human professions at least as well as a typical human" (assuming no global catastrophe occurs) with 10% confidence is 2024 (mean 2034, standard deviation 33 years), with 50% confidence is 2050 (mean 2072, st. dev. 110 years), and with 90% confidence is 2070 (mean 2168, st. dev. 342 years). These estimates exclude the 1.2% of respondents who said no year would ever reach 10% confidence, the 4.1% who said 'never' for 50% confidence, and the 16.5% who said 'never' for 90% confidence. Respondents assigned a median 50% probability to the possibility that machine superintelligence will be invented within 30 years of the invention of approximately human-level machine intelligence.

In a 2022 survey, the median year by which respondents expected "High-level machine intelligence" with 50% confidence is 2061. The survey defined the achievement of high-level machine intelligence as when unaided machines can accomplish every task better and more cheaply than human workers.

In 2023, OpenAI leaders Sam Altman, Greg Brockman and Ilya Sutskever published recommendations for the governance of superintelligence, which they believe may happen in less than 10 years.

In 2024, Ilya Sutskever left OpenAI to cofound the startup Safe Superintelligence, which focuses solely on creating a superintelligence that is safe by design, while avoiding "distraction by management overhead or product cycles". Despite still offering no product, the startup became valued at $30 billion in February 2025.

In 2025, the forecast scenario "AI 2027" led by Daniel Kokotajlo predicted rapid progress in the automation of coding and AI research, followed by ASI. In September 2025, a review of surveys of scientists and industry experts from the last 15 years reported that most agreed that artificial general intelligence (AGI), a level well below a technological singularity, will occur before the year 2100. A more recent analysis by AIMultiple reported that "current surveys of AI researchers are predicting AGI around 2040".

Design considerations

The design of superintelligent AI systems raises critical questions about what values and goals these systems should have. Several proposals have been put forward:

Value alignment proposals

  • Coherent extrapolated volition (CEV) – The AI should have the values upon which humans would converge if they were more knowledgeable and rational.
  • Moral rightness (MR) – The AI should be programmed to do what is morally right, relying on its superior cognitive abilities to determine ethical actions.
  • Moral permissibility (MP) – The AI should stay within the bounds of moral permissibility while otherwise pursuing goals aligned with human values (similar to CEV).

Bostrom elaborates on these concepts:

instead of implementing humanity's coherent extrapolated volition, one could try to build an AI to do what is morally right, relying on the AI's superior cognitive capacities to figure out just which actions fit that description. We can call this proposal "moral rightness" (MR) ...

MR would also appear to have some disadvantages. It relies on the notion of "morally right", a notoriously difficult concept, one with which philosophers have grappled since antiquity without yet attaining consensus as to its analysis. Picking an erroneous explication of "moral rightness" could result in outcomes that would be morally very wrong ...

One might try to preserve the basic idea of the MR model while reducing its demandingness by focusing on moral permissibility: the idea being that we could let the AI pursue humanity's CEV so long as it did not act in morally impermissible ways.

Recent developments

Since Bostrom's analysis, new approaches to AI value alignment have emerged:

  • Inverse Reinforcement Learning (IRL) – This technique aims to infer human preferences from observed behavior, potentially offering a more robust approach to value alignment (a toy sketch follows this list).
  • Constitutional AI – Proposed by Anthropic, this involves training AI systems with explicit ethical principles and constraints.
  • Debate and amplification – These techniques, explored by OpenAI, use AI-assisted debate and iterative processes to better understand and align with human values.
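
As a toy illustration of the first technique, the sketch below infers a hidden reward weight from observed choices under a Boltzmann-rationality model, the statistical core of many IRL methods. The feature vectors, the "true" weights, and the learning rate are all invented for the example; this is not any particular published algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: three actions described by 2-D feature vectors (assumed).
features = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
true_theta = np.array([2.0, -1.0])  # hidden human reward weights (assumed)

def action_probs(theta):
    """Boltzmann-rational choice distribution over the actions."""
    logits = features @ theta
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Simulate observed human choices, then fit theta by maximum likelihood.
demos = rng.choice(len(features), size=500, p=action_probs(true_theta))
observed_mean = features[demos].mean(axis=0)

theta = np.zeros(2)
for _ in range(2000):
    # Gradient of the log-likelihood: observed minus expected features.
    theta += 0.1 * (observed_mean - action_probs(theta) @ features)

print("recovered weights:", np.round(theta, 2))
# Recovers true_theta up to an additive shift: only reward *differences*
# are identifiable from choice behavior.
```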

Transformer LLMs and ASI

The rapid advancement of transformer-based LLMs has led to speculation about their potential path to ASI. Some researchers argue that scaled-up versions of these models could exhibit ASI-like capabilities:

  • Emergent abilities – As LLMs increase in size and complexity, they demonstrate unexpected capabilities not present in smaller models.
  • In-context learning – LLMs show the ability to adapt to new tasks without fine-tuning, potentially mimicking general intelligence.
  • Multi-modal integration – Recent models can process and generate various types of data, including text, images, and audio.

However, critics argue that current LLMs lack true understanding and are merely sophisticated pattern matchers, raising questions about their suitability as a path to ASI.

Other perspectives on artificial superintelligence

Additional viewpoints on the development and implications of superintelligence include:

  • Recursive self-improvement – I. J. Good proposed the concept of an "intelligence explosion", where an AI system could rapidly improve its own intelligence, potentially leading to superintelligence.
  • Orthogonality thesis – Bostrom argues that an AI's level of intelligence is orthogonal to its final goals, meaning a superintelligent AI could have any set of motivations.
  • Instrumental convergence – Certain instrumental goals (e.g., self-preservation, resource acquisition) might be pursued by a wide range of AI systems, regardless of their final goals.

Challenges and ongoing research

The pursuit of value-aligned AI faces several challenges:

  • Philosophical uncertainty in defining concepts like "moral rightness"
  • Technical complexity in translating ethical principles into precise algorithms
  • Potential for unintended consequences even with well-intentioned approaches

Current research directions include multi-stakeholder approaches to incorporate diverse perspectives, developing methods for scalable oversight of AI systems, and improving techniques for robust value learning.

AI research is progressing rapidly, and addressing these design challenges remains crucial for creating ASI systems that are both powerful and aligned with human interests.

Potential threat to humanity

The development of artificial superintelligence (ASI) has raised concerns about potential existential risks to humanity. Researchers have proposed various scenarios in which an ASI could pose a significant threat:

Intelligence explosion and control problem

Some researchers argue that through recursive self-improvement, an ASI could rapidly become so powerful as to be beyond human control. This concept, known as an "intelligence explosion", was first proposed by I. J. Good in 1965:

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion,' and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control.
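
Good's feedback argument is often caricatured with a toy growth model: if the rate of improvement scales with current capability raised to a power p, growth is merely exponential at p = 1 but runs away in finite time for p > 1. The recurrence below is a purely illustrative assumption, not a prediction; the exponents and step size are arbitrary.

```python
# Toy recursive self-improvement model (illustrative only): intelligence I
# grows at a rate proportional to I**p.

def trajectory(p, steps=60, dt=0.1, cap=1e12):
    i, history = 1.0, [1.0]
    for _ in range(steps):
        i += dt * i**p      # each step's gain depends on current capability
        history.append(i)
        if i > cap:         # stop once growth has clearly run away
            break
    return history

for p in (0.5, 1.0, 1.5):
    h = trajectory(p)
    print(f"p={p}: I = {h[-1]:.3g} after {len(h) - 1} steps")
# p=0.5 grows roughly quadratically, p=1.0 exponentially, and p=1.5
# blows past the cap well before 60 steps -- the "explosion" regime.
```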

This scenario presents the AI control problem: how to create an ASI that will benefit humanity while avoiding unintended harmful consequences. Eliezer Yudkowsky argues that solving this problem is crucial before ASI is developed, as a superintelligent system might be able to thwart any subsequent attempts at control.

Unintended consequences and goal misalignment

Even with benign intentions, an ASI could potentially cause harm due to misaligned goals or unexpected interpretations of its objectives. Nick Bostrom provides a stark example of this risk:

When we create the first superintelligent entity, we might make a mistake and give it goals that lead it to annihilate humankind, assuming its enormous intellectual advantage gives it the power to do so. For example, we could mistakenly elevate a subgoal to the status of a supergoal. We tell it to solve a mathematical problem, and it complies by turning all the matter in the solar system into a giant calculating device, in the process killing the person who asked the question.

Stuart Russell offers another illustrative scenario:

A system given the objective of maximizing human happiness might find it easier to rewire human neurology so that humans are always happy regardless of their circumstances, rather than to improve the external world.

These examples highlight the potential for catastrophic outcomes even when an ASI is not explicitly designed to be harmful, underscoring the critical importance of precise goal specification and alignment.

Potential mitigation strategies

Researchers have proposed various approaches to mitigate risks associated with ASI:

  • Capability control – Limiting an ASI's ability to influence the world, such as through physical isolation or restricted access to resources.
  • Motivational control – Designing ASIs with goals that are fundamentally aligned with human values.
  • Ethical AI – Incorporating ethical principles and decision-making frameworks into ASI systems.
  • Oversight and governance – Developing robust international frameworks for the development and deployment of ASI technologies.

Despite these proposed strategies, some experts, such as Roman Yampolskiy, argue that the challenge of controlling a superintelligent AI might be fundamentally unsolvable, emphasizing the need for extreme caution in ASI development.

Debate and skepticism

Not all researchers agree on the likelihood or severity of ASI-related existential risks. Some, like Rodney Brooks, argue that fears of superintelligent AI are overblown and based on unrealistic assumptions about the nature of intelligence and technological progress. Others, such as Joanna Bryson, contend that anthropomorphizing AI systems leads to misplaced concerns about their potential threats.

Recent developments and current perspectives

The rapid advancement of LLMs and other AI technologies has intensified debates about the proximity and potential risks of ASI. While there is no scientific consensus, some researchers and AI practitioners argue that current AI systems may already be approaching AGI or even ASI capabilities.

  • LLM capabilities – Recent LLMs like GPT-4 have demonstrated unexpected abilities in areas such as reasoning, problem-solving, and multi-modal understanding, leading some to speculate about their potential path to ASI.
  • Emergent behaviors – Studies have shown that as AI models increase in size and complexity, they can exhibit emergent capabilities not present in smaller models, potentially indicating a trend towards more general intelligence.
  • Rapid progress – The pace of AI advancement has led some to argue that we may be closer to ASI than previously thought, with potential implications for existential risk.

As of 2024, AI skeptics such as Gary Marcus caution against premature claims of AGI or ASI, arguing that current AI systems, despite their impressive capabilities, still lack true understanding and general intelligence. They emphasize the significant challenges that remain in achieving human-level intelligence, let alone superintelligence.

The debate surrounding the current state and trajectory of AI development underscores the importance of continued research into AI safety and ethics, as well as the need for robust governance frameworks to manage potential risks as AI capabilities continue to advance.

Russo-Ukrainian war (2022–present)

From Wikipedia, the free encyclopedia