When used in the workplace, AI also presents the possibility of new hazards. These may arise from machine learning techniques leading to unpredictable behavior and inscrutability in their decision-making, or from cybersecurity and information privacy issues. Many hazards of AI are psychosocial due to its potential to cause changes in work organization. These include changes in the skills required of workers, increased monitoring leading to micromanagement, algorithms unintentionally or intentionally mimicking undesirable human biases, and assigning blame for machine errors to the human operator instead. AI may also lead to physical hazards in the form of human–robot collisions, and ergonomic
risks of control interfaces and human–machine interactions. Hazard
controls include cybersecurity and information privacy measures,
communication and transparency with workers about data usage, and
limitations on collaborative robots.
From a workplace safety and health perspective, only "weak" or "narrow" AI
that is tailored to a specific task is relevant, as there are many
examples that are currently in use or expected to come into use in the
near future. "Strong" or "general" AI is not expected to be feasible in the near future, and discussion of its risks is within the purview of futurists and philosophers rather than industrial hygienists.
Certain digital technologies are predicted to result in job
losses. Starting in the 2020s, the adoption of modern robotics has led
to net employment growth. However, many businesses anticipate that
automation, or the employment of robots, would result in job losses in the future. This is especially true for companies in Central and Eastern Europe. Other digital technologies, such as platforms or big data, are projected to have a more neutral impact on employment. A large number of tech workers have been laid off since 2023; many of these job cuts have been attributed to artificial intelligence.
The long-term predicted impact of AI on the workplace remains
highly contested. Various academic studies have theorised the impact of
AI on the workplace. A 2025 investigation based on users' interactions with Microsoft's AI chatbot, Copilot,
identified forty jobs that had high overlaps with the capabilities of
AI. The report concluded that these jobs, which included Interpreters and Translators, Historians, Passenger Attendants, Sales Assistants, and Writers, would therefore experience significant AI-driven transformation in the workplace. The report garnered high levels of attention in the media, with some outlets claiming these jobs would become obsolete. However, practitioners of some of the listed professions criticised the report,
suggesting that it had misrepresented their typical workplace activities
in order to overstate AI's current capabilities. The historian Chris Campbell argued that the "report's methods, deliberately or otherwise, de-skill historians away from a job that requires high-level and deeply human analytical skills to one that is tasked solely with the retention and provision of knowledge. Under that flawed rubric, it is little wonder that historians have a high AI applicability score."
Health and safety applications
Any potential AI health and safety application must be accepted by both managers and workers in order to be adopted. For example, worker acceptance may be diminished by concerns about information privacy, or by a lack of trust in and acceptance of the new technology, which may arise from inadequate transparency or training. Alternatively, managers may emphasize increases in economic productivity rather than gains in worker safety and health when implementing AI-based systems.
Eliminating hazardous tasks
AI may increase the scope of work tasks where a worker can be removed from a situation that carries risk. In a sense, while traditional automation can replace the functions of a
worker's body with a robot, AI effectively replaces the functions of
their brain with a computer. Hazards that can be avoided include
stress, overwork, musculoskeletal injuries, and boredom.
This can expand the range of affected job sectors into white-collar and service-sector jobs such as medicine, finance, and information technology. As an example, call center workers face extensive health and safety risks due to the repetitive and demanding nature of the work and its high rates of micro-surveillance. AI-enabled chatbots lower the need for humans to perform the most basic call center tasks.
Analytics to reduce risk
The NIOSH lifting equation is calibrated for a typical healthy worker to avoid back injuries, but AI-based methods may instead allow real-time, personalized calculation of risk.
Machine learning is used for people analytics
to make predictions about worker behavior to assist management
decision-making, such as hiring and performance assessment. These could
also be used to improve worker health. The analytics may be based on
inputs such as online activities, monitoring of communications, location
tracking, and voice and body-language analysis of filmed interviews. For example, sentiment analysis may be used to spot fatigue in order to prevent overwork. Decision support systems can similarly be used to, for example, prevent industrial disasters or make disaster response more efficient.
For manual material handling workers, predictive analytics and artificial intelligence may be used to reduce musculoskeletal injury. Traditional guidelines are based on statistical averages and are geared towards anthropometrically typical humans. The analysis of large amounts of data from wearable sensors may allow real-time, personalized calculation of ergonomic risk and fatigue management, as well as better analysis of the risk associated with specific job roles.
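The NIOSH lifting equation mentioned above computes a recommended weight limit (RWL) from a load constant and a set of task multipliers, which AI-based systems could in principle evaluate continuously from sensor data. A minimal sketch of the equation's metric form (the function name is illustrative, and the frequency and coupling multipliers, normally read from published tables, are taken as inputs here):

```python
def niosh_rwl(h_cm, v_cm, d_cm, a_deg, fm=1.0, cm=1.0):
    """Revised NIOSH lifting equation (metric form).

    h_cm:  horizontal distance of the hands from the midpoint between the ankles
    v_cm:  vertical height of the hands at the lift origin
    d_cm:  vertical travel distance of the lift
    a_deg: asymmetry (trunk twist) angle in degrees
    fm, cm: frequency and coupling multipliers (from published tables)
    """
    LC = 23.0                               # load constant, kg
    HM = min(1.0, 25.0 / h_cm)              # horizontal multiplier
    VM = 1.0 - 0.003 * abs(v_cm - 75.0)     # vertical multiplier
    DM = 0.82 + 4.5 / d_cm                  # distance multiplier
    AM = 1.0 - 0.0032 * a_deg               # asymmetry multiplier
    return LC * HM * VM * DM * AM * fm * cm

# An ideal lift (H=25 cm, V=75 cm, D=25 cm, no twist) yields the full 23 kg:
print(round(niosh_rwl(25, 75, 25, 0), 1))   # 23.0
```

A static guideline like this assumes an anthropometrically typical worker; the personalized approaches described above would instead adjust such limits per individual and in real time.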
Wearable sensors
may also enable earlier intervention against exposure to toxic
substances than is possible with area or breathing zone testing on a
periodic basis. Furthermore, the large data sets generated could improve
workplace health surveillance, risk assessment, and research.
Streamlining safety and health workflows
AI has also been used to attempt to make the workplace safety and health workflow more efficient. One example is coding of workers' compensation
claims, which are submitted in prose narrative form and must be manually assigned standardized codes. AI is being investigated to perform this task faster, more cheaply, and with fewer errors.
AI‐enabled virtual reality systems may be useful for safety training for hazard recognition.
Artificial intelligence may be used to more efficiently detect near misses.
Reporting and analysis of near misses are important in reducing
accident rates, but they are often underreported because they are not
noticed by humans, or are not reported by workers due to social factors.
Hazards
Some machine learning training methods are prone to unpredictability and inscrutability
in their decision-making, which can lead to hazards if managers or
workers cannot predict or understand an AI-based system's behavior.
There are several broad aspects of AI that may give rise to specific
hazards. The risks depend on implementation rather than the mere
presence of AI.
Systems using sub-symbolic AI such as machine learning may behave unpredictably and are more prone to inscrutability in their decision-making. This is especially true if a situation is encountered that was not part of the AI's training dataset, and is exacerbated in environments that are less structured. Undesired behavior may also arise from flaws in the system's perception (arising either from within the software or from sensor degradation), knowledge representation and reasoning, or from software bugs. Problems may also arise from improper training, such as a user applying the same algorithm to two problems that do not have the same requirements. Machine learning applied during the design phase may have different implications than machine learning applied at runtime. Systems using symbolic AI are less prone to unpredictable behavior.
The use of AI also increases cybersecurity risks relative to platforms that do not use AI, and information privacy concerns about collected data may pose a hazard to workers.
Psychosocial hazards
are those that arise from the way work is designed, organized, and
managed, or its economic and social contexts, rather than arising from a
physical substance or object. They can cause not only psychiatric and psychological outcomes such as occupational burnout, anxiety disorders, and depression, but also physical injury or illness such as cardiovascular disease or musculoskeletal injury. Many hazards of AI are psychosocial in nature due to its potential to
cause changes in work organization, in terms of increasing complexity
and interaction between different organizational factors. However,
psychosocial risks are often overlooked by designers of advanced
manufacturing systems.
Einola and Khoreva explore how different organizational groups perceive and interact with AI technologies. Their research shows that successful AI integration depends on human
ownership and contextual understanding. They caution against blind
technological optimism and stress the importance of tailoring AI use to
specific workplace ecosystems. This perspective reinforces the need for
inclusive design and transparent implementation strategies.
Changes in work practices
AI is expected to lead to changes in the skills required of workers, requiring training of existing workers, flexibility, and openness to change. The requirement for combining conventional expertise with computer skills may be challenging for existing workers. Over-reliance on AI tools may lead to deskilling of some professions.
While AI offers convenience and judgement-free interaction,
increased reliance, particularly among Generation Z, may reduce
interpersonal communication in the workplace and affect social cohesion. As AI becomes a substitute for traditional peer collaboration and
mentorship, there is a risk of diminishing opportunities for
interpersonal skill development and team-based learning. This shift could contribute to workplace isolation and changes in team dynamics.
Increased monitoring may lead to micromanagement and thus to stress and anxiety. A perception of surveillance
may also lead to stress. Controls for these include consultation with
worker groups, extensive testing, and attention to introduced bias. Wearable sensors, activity trackers, and augmented reality may also lead to stress from micromanagement, both for assembly line workers and gig workers. Gig workers also lack the legal protections and rights of formal workers.
AI is not merely a technical tool but a transformative force that reshapes workplace structures and decision-making processes. Newell and Marabelli argue that AI alters power dynamics and employee
autonomy, requiring a more nuanced understanding of its social and
organizational implications. Their study calls for thoughtful
integration of AI that considers its broader impact on work culture and
human roles.
There is also the risk of people being forced to work at a robot's pace, or to monitor robot performance at nonstandard hours.
Algorithms trained on past decisions may mimic undesirable human biases, for example, past discriminatory hiring and firing practices. Information asymmetry
between management and workers may lead to stress, if workers do not
have access to the data or algorithms that are the basis for
decision-making.
In addition to building a model with inadvertently discriminatory
features, intentional discrimination may occur through designing
metrics that covertly result in discrimination through correlated variables in a non-obvious way.
In complex human‐machine interactions, some approaches to accident analysis may be biased to safeguard a technological system and its developers by assigning blame to the individual human operator instead.
Physical
Physical hazards in the form of human–robot collisions may arise from robots using AI, especially collaborative robots (cobots). Cobots are intended to operate in close proximity to humans, which rules out the common hazard control, widely used for traditional industrial robots, of isolating the robot behind fences or other barriers. Automated guided vehicles are a type of cobot that as of 2019 are in common use, often as forklifts or pallet jacks in warehouses or factories.
For cobots, sensor malfunctions or unexpected work environment
conditions can lead to unpredictable robot behavior and thus to
human–robot collisions.
Self-driving cars are another example of AI-enabled robots. In addition, the ergonomics of control interfaces and human–machine interactions may give rise to hazards.
Hazard controls
AI, in common with other computational technologies, requires cybersecurity measures to stop software breaches and intrusions, as well as information privacy measures. Communication and transparency with workers about data usage is a
control for psychosocial hazards arising from security and privacy
issues. Proposed best practices for employer‐sponsored worker monitoring
programs include using only validated sensor technologies; ensuring
voluntary worker participation; ceasing data collection outside the
workplace; disclosing all data uses; and ensuring secure data storage.
For industrial cobots equipped with AI‐enabled sensors, the International Organization for Standardization
(ISO) recommended: (a) safety‐related monitored stopping controls; (b)
human hand guiding of the cobot; (c) speed and separation monitoring
controls; and (d) power and force limitations. Networked AI-enabled
cobots may share safety improvements with each other. Human oversight is another general hazard control for AI.
Workplace health surveillance,
the collection and analysis of health data on workers, is challenging
for AI because labor data are often reported in aggregate and do not provide breakdowns between different types of work, and are focused on economic data such as wages and employment rates rather than the skill content of jobs. Proxies for skill content include educational
requirements and classifications of routine versus non-routine, and
cognitive versus physical jobs. However, these may still not be
specific enough to distinguish specific occupations that have distinct
impacts from AI. The United States Department of Labor's Occupational Information Network
is an example of a database with a detailed taxonomy of skills.
Additionally, data are often reported on a national level, while there
is much geographical variation, especially between urban and rural
areas.
AI systems in the workplace raise ethical concerns related to
privacy, fairness, human dignity, and transparency. According to the
OECD, these risks must be addressed through robust governance frameworks
and accountability mechanisms. Ethical deployment of AI requires clear
policies on data usage, explainability of algorithms, and safeguards
against discrimination and surveillance.
As of 2019, ISO was developing a standard on the use of metrics and dashboards (information displays presenting company metrics for managers) in workplaces. The standard is planned to include guidelines for both gathering data and displaying it in a viewable and useful manner.
Environmental impact
Estimating AI's environmental effects can be difficult because results depend on how impacts are measured, including whether accounting includes only model computation or also data-centre overhead, idle capacity, hardware manufacture, and local electricity supply.
As these issues have received greater attention, governments and
regulators have increasingly considered data-centre reporting
requirements, energy-efficiency standards, and broader transparency
measures for AI-related resource use.
Carbon footprint and energy use
AI-related energy use arises at multiple stages, including model training, fine-tuning, inference, storage, networking, and supporting infrastructure such as cooling and power conversion.
Individual level
According
to research institute Epoch AI, energy consumption per typical ChatGPT
query (0.3 watt-hours) is small compared to the average U.S. household
consumption per minute (almost 20 watt-hours).
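The per-minute household figure can be sanity-checked from annual consumption. A rough arithmetic sketch (the ~10,600 kWh/year average U.S. household consumption is an assumption used here for illustration):

```python
# Rough sanity check of the per-minute household figure cited above.
# Assumed average U.S. household consumption (illustrative): ~10,600 kWh/year.
annual_kwh = 10_600
wh_per_minute = annual_kwh * 1000 / (365 * 24 * 60)
print(round(wh_per_minute, 1))            # 20.2 Wh per minute

# One 0.3 Wh query is therefore roughly 1/67 of a household-minute.
print(round(wh_per_minute / 0.3))         # 67
```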
Published estimates of energy use per AI request vary widely across models, tasks and measurement methods. A benchmark study presented at the 2024 ACM Conference on Fairness, Accountability, and Transparency found substantial differences between task types, with lower energy use for some text tasks and much higher energy use for image generation in the study's test conditions. In that benchmark, simple classification tasks consumed about 0.002–0.007 Wh per prompt on average (about 9% of a smartphone
charge for 1,000 prompts), while text generation and text summarisation
each used about 0.05 Wh per prompt; image generation averaged 2.91 Wh
per prompt, and the least efficient image model in the study used
11.49 Wh per image (roughly equivalent to half a smartphone charge).
First-party measurements in production environments have also been published. A 2025 Google study on Gemini
assistant serving reported median per-prompt energy, emissions, and
water-use estimates under the authors' accounting framework, while
noting that different system boundaries can produce substantially
different results. The study reported a median text-prompt estimate of about 0.24 Wh,
which is roughly as much energy as watching nine seconds of television.
The study also stated that software and infrastructure improvements
reduced energy use by a factor of 33 and carbon emissions by a factor of
44 for a typical prompt over one year within the authors' framework.
Researchers at the University of Michigan measured the energy consumption of various Meta Llama 3.1 models
released in 2024 and found that smaller language models (8 billion
parameters) use about 114 joules (0.03167 Wh) per response, while larger
models (405 billion parameters) require up to 6,700 joules (1.861 Wh)
per response. This corresponds to the energy needed to run a microwave oven for roughly one-tenth of a second and eight seconds, respectively.
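The unit conversion behind the per-response figures is straightforward (1 Wh = 3,600 J):

```python
# Convert the reported per-response energy of the Llama 3.1 models
# from joules to watt-hours (1 Wh = 3600 J).
def joules_to_wh(joules):
    return joules / 3600.0

print(round(joules_to_wh(114), 5))    # 8B-parameter model: 0.03167 Wh
print(round(joules_to_wh(6700), 3))   # 405B-parameter model: 1.861 Wh
```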
Comparisons between AI systems and human labour for specific
tasks have produced mixed results and remain sensitive to assumptions
about output quality, workload and system boundaries. A 2024 study in Scientific Reports
reported 130 to 2900 times lower estimated carbon emissions for
selected AI systems than for human writers and illustrators under its
assumptions. A later Scientific Reports
paper reported a counterexample for programming tasks under its
assumptions, finding 5 to 19 times higher estimated emissions for the
evaluated AI system than for human programmers on the benchmark used in
that study.
System level
Energy use and efficiency
Fueled by growth in artificial intelligence, data centres' demand for power increased in the 2020s.
According to the International Energy Agency, data centres are expected to account for a relatively small share of global electricity demand growth by 2030.
Efficiency improvement of AI-related computer chips, 2008–2023. Index of energy intensity of AI computer chips (2008 = 100, log scale).
AI electricity intensity depends not only on model architecture but
also on hardware and facility efficiency. Data-centre operators commonly
report Power usage effectiveness (PUE), which measures the ratio of total facility energy to IT equipment energy; a lower PUE indicates less overhead energy for cooling and other supporting infrastructure.
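PUE as defined above is a simple ratio; a minimal sketch (the example figures are illustrative):

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """Power usage effectiveness: total facility energy divided by
    IT equipment energy. A PUE of 1.0 would mean zero overhead energy
    for cooling, power conversion, and other supporting infrastructure."""
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1.2 MWh in total for every 1.0 MWh of IT load:
print(pue(1.2, 1.0))   # 1.2, i.e. 20% overhead
```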
Operators may also publish metrics and case studies on hardware
efficiency, cooling systems and power sourcing. In its 2024
environmental report, Google stated that its 2023 total greenhouse gas
emissions increased 13% year over year, primarily because of increased
data-centre energy consumption and supply-chain emissions, while also reporting lower PUE than industry averages for its own facilities.
The International Energy Agency has also reported that data
centres remain a relatively small share of global electricity use
overall, but that their local effects can be much more pronounced
because demand is geographically concentrated.
Carbon footprint
At system level, AI contributes to rising electricity demand in data centres and related infrastructure. The International Energy Agency
estimated that data centres used about 415 TWh of electricity in 2024,
or around 1.5% of global electricity consumption, and projected that
data-centre electricity use could rise to about 945 TWh by 2030, with AI
identified as the main driver of that growth alongside other digital
services.
The carbon footprint of AI systems depends strongly on
electricity sources, hardware efficiency, utilisation rates, and what
stages are included in the accounting. Training large models can require
substantial electricity, while total lifecycle impacts also depend on
deployment scale and the amount of inference performed after training.
Early analyses of frontier-model development reported rapid
historical growth in training compute for selected systems, although
later trends have depended on changes in model design, hardware and
efficiency gains.
Accounting methods that include upstream or embodied impacts,
such as hardware manufacture and facilities construction, can materially
affect estimates of AI-related emissions.
Decisions and strategies by individual companies
Large technology companies have reported that the expansion of AI and cloud infrastructure
affects their sustainability targets, electricity demand, and resource
use. Google, for example, attributed part of its emissions growth in
2023 to increased data-centre energy consumption and supply-chain
emissions in its 2024 environmental report.
Cloud and AI companies have also announced measures intended to
reduce environmental impacts, including investment in more efficient
hardware, low-carbon electricity procurement, alternative cooling
systems, and water stewardship programmes. The extent, comparability, and third-party verification of such disclosures vary between firms and jurisdictions.
Water usage
Data centres can use water directly for cooling and indirectly through the water used in electricity generation, depending on the local energy mix. Public reporting on data-centre water use has often been inconsistent,
making comparisons between operators and regions difficult.
To standardise operational reporting, The Green Grid proposed the metric water usage effectiveness (WUE), defined as annual site water use divided by IT equipment energy use. WUE does not by itself measure local water stress, source sustainability, or all upstream water impacts. Studies of AI water use also distinguish between water withdrawal and water consumption.
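WUE as defined by The Green Grid is likewise a simple ratio; a minimal sketch (the example figures are illustrative):

```python
def wue(annual_site_water_litres, it_equipment_kwh):
    """Water usage effectiveness: annual site water use (litres) divided
    by IT equipment energy (kWh), per The Green Grid's definition.
    Note: WUE alone says nothing about local water stress or
    upstream (electricity-generation) water use."""
    return annual_site_water_litres / it_equipment_kwh

# e.g. 1.8 million litres per year against 1 GWh of IT energy:
print(wue(1_800_000, 1_000_000))   # 1.8 L/kWh
```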
Research on AI-specific water use has argued that the water
footprint of AI systems can be difficult to observe and may vary
substantially by location, cooling design, and electricity source. A
2025 Communications of the ACM
article summarised methods for estimating AI water footprints and
emphasised the distinction between water withdrawal and water
consumption.
Li and colleagues estimated that global AI water withdrawal could
reach 4.2–6.6 billion cubic metres in 2027 under the scenarios examined
in their article. Using GPT-3, released by OpenAI
in 2020, as an example, they estimated that training the model in
Microsoft's U.S. data centres could consume about 700,000 litres of
onsite water and about 5.4 million litres in total when offsite
electricity-related water use was included; they also estimated that
10–50 medium-length GPT-3 responses could consume about 500 mL of water,
depending on when and where the model was deployed. Published prompt-level estimates have also varied by system and
accounting framework: the 2025 Google study on Gemini assistant serving
reported a median text-prompt estimate of about 0.26 mL under its
framework.
Location can materially affect the significance of data-centre
water use. Research on U.S. data centres found that one-fifth of
servers' direct water footprint came from moderately to highly
water-stressed watersheds, while nearly half of servers were fully or
partially powered by plants located in water-stressed regions. A 2025 Reuters report, citing data from Verisk Maplecroft and
NatureFinance, said that an average mid-sized data centre uses about 1.4
million litres of water per day for cooling and that Phoenix would
experience a 32% increase in annual water stress if currently planned
data centres come online.
Electronic waste
AI systems depend on specialised computing hardware, and rapid
turnover in servers and accelerators may contribute to rising e-waste. According to the Global E-waste Monitor 2024,
the world generated an estimated 62 million tonnes of e-waste in 2022,
and the total was projected to rise to 82 million tonnes by 2030 under
its scenarios. The World Health Organization has also identified e-waste as a growing environmental and public-health issue.
A 2024 study in Nature Computational Science estimated
that generative AI could add between 1.2 and 5 million tonnes of e-waste
by 2030 under the scenarios examined by the authors. In the study's higher-end scenarios, this would represent up to 12% of projected global e-waste by 2030. The authors also estimated that circular-economy strategies along the
generative-AI value chain could reduce AI-related e-waste generation by
16–86%.
Mining
AI hardware depends on complex supply chains for metals, minerals and manufactured components. UNCTAD
has reported that the expansion of digital infrastructure increases
demand for raw materials and raises environmental and distributional
concerns linked to extraction, processing and manufacturing.
Specialised chips used in AI systems can depend on supply chains
involving critical minerals and other materials whose extraction and
processing may have significant environmental and social effects. These
impacts are not unique to AI, but may increase as demand for AI-related
hardware grows.
Social impact and environmental justice
The environmental effects of AI-related infrastructure are not
distributed evenly. Research on U.S. data centres has found that their
environmental footprints vary by region and may intersect with local
electricity systems, water availability and existing environmental
burdens. In that study, one-fifth of servers' direct water footprint came from
moderately to highly water-stressed watersheds, while nearly half of
servers were fully or partially powered by plants located in
water-stressed regions.
Concerns have also been raised about local air pollution,
permitting and grid stress in communities hosting AI-related facilities
and associated power infrastructure. In 2025, civil-rights and
environmental groups challenged permits connected to an xAI
facility in the Memphis area, arguing that air-pollution burdens could
fall disproportionately on historically overburdened neighbourhoods. The
dispute has been the subject of regulatory and legal proceedings.
Climate solutions
Despite concerns about its environmental footprint, AI has been used
in environmental and climate-related applications, including weather forecasting, Earth observation, and optimisation in transport and energy systems.
In weather forecasting, peer-reviewed studies have reported
strong results for some AI-based forecasting systems under specific
evaluation frameworks. A 2023 Nature paper on Pangu-Weather
reported strong medium-range forecasting performance relative to a
leading numerical weather prediction system in the study's evaluation. AI has also been used in research on extreme weather and climate-event modelling.
AI has also been proposed for mitigation-oriented optimisation.
Google's Green Light project, for example, uses traffic data and machine
learning to recommend traffic-signal timing adjustments intended to
reduce stop-and-go traffic and associated emissions at intersections.
Whether AI produces net environmental benefits at large scale
remains an open question, because outcomes depend on deployment choices,
rebound effects, additional infrastructure demand and the extent to which electricity and cooling systems are decarbonised.
Conflict on the use of AI for environmental research
There is ongoing debate over the balance between the possible
environmental benefits of AI applications and the environmental costs of
scaling AI systems. This includes discussion of transparency,
efficiency, rebound effects, and the extent to which AI-related
infrastructure growth may offset environmental gains from specific
applications.
Policy and regulation
United States
In the United States, proposals have been introduced to study and standardise reporting on AI's environmental impacts. The Artificial Intelligence Environmental Impacts Act of 2024
(S. 3732), introduced in the Senate in February 2024, would require a
federal study on the environmental impacts of AI, direct the National Institute of Standards and Technology to convene a consortium on measurement and standards, and establish a voluntary reporting system.
European Union
In the European Union, the Energy Efficiency Directive introduced reporting obligations for large data centres. The European Commission has stated that a European database collects information relevant to the energy performance and water footprint
of data centres, and that a delegated regulation sets out the
information and key performance indicators for the reporting scheme.
EU member states
also maintain national AI strategies, some of which include references
to sustainability, energy efficiency, or environmental applications of
AI.
France
France's
AI strategy documents have discussed AI in relation to ecological
transition and environmental applications, including the use of digital
infrastructure and data for environmental policy.
Germany
Germany's
national AI strategy includes sections on the environmental impacts of
AI and on research into energy-efficient and sustainable AI
applications.
Italy
Italy's
national AI strategy documents include sustainability-related
priorities and discuss AI applications in areas such as environment,
infrastructure, and sustainable development goals.
Environmental impact of bitcoin
The environmental impact of bitcoin has been characterized in
the literature as significant, particularly due to its energy use,
greenhouse gas emissions, and electronic waste. Bitcoin mining, the process by which bitcoins are created and transactions are finalized, is energy-consuming and results in carbon emissions, as 48% of the electricity used in 2025 was generated through fossil fuels while 52% was generated through sustainable energy sources. Moreover, bitcoins are mined on specialized computer hardware resulting in electronic waste. Scholars argue that bitcoin mining could support renewable energy development by utilizing surplus electricity from wind and solar. As of 2025,
several empirical studies report an association between higher
bitcoin-mining electricity use and worse environmental-sustainability
indicators. Bitcoin's environmental impact has attracted the attention of regulators, leading to incentives or restrictions in various jurisdictions.
Greenhouse gas emissions
Mining as an electricity-intensive process
Electricity
consumption of the bitcoin network since 2016 (annualized). The upper
and lower bounds are based on worst-case and best-case scenario
assumptions, respectively. The red trace indicates an intermediate
best-guess estimate.
Bitcoin mining is a highly electricity-intensive proof-of-work process. Miners run dedicated software to compete to be the first to solve the current 10-minute block, yielding them a reward in bitcoins. A transition to the proof-of-stake protocol, which has better energy efficiency, has been described as a sustainable alternative to bitcoin's scheme and as a potential solution to its environmental issues. Bitcoin advocates oppose such a change, arguing that proof of work is needed to secure the network.
Bitcoin mining's distribution makes it difficult for researchers
to identify the location of miners and electricity use. It is therefore
difficult to translate energy consumption into carbon emissions. As of 2025, a non-peer-reviewed study by the Cambridge Centre for Alternative Finance (CCAF) estimated that bitcoin consumed 138 TWh (500 PJ) annually, representing 0.5% of the world's electricity consumption and resulting in annual greenhouse gas emissions of 39.8 Mt CO2, representing 0.08% of global emissions and comparable to Slovakia's emissions.
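The CCAF figures above imply an average carbon intensity for the network's electricity mix. A minimal back-of-the-envelope check, using only the numbers quoted in the text (138 TWh and 39.8 Mt CO2 per year):

```python
# Back-of-the-envelope check of the CCAF figures quoted above
# (138 TWh/year and 39.8 Mt CO2/year); both numbers come from the text.

ANNUAL_ENERGY_TWH = 138.0      # estimated network electricity consumption
ANNUAL_EMISSIONS_MT = 39.8     # estimated emissions, Mt CO2

def implied_carbon_intensity(energy_twh: float, emissions_mt: float) -> float:
    """Return the implied grams of CO2 emitted per kWh consumed."""
    kwh = energy_twh * 1e9      # 1 TWh = 1e9 kWh
    grams = emissions_mt * 1e12 # 1 Mt = 1e12 g
    return grams / kwh

intensity = implied_carbon_intensity(ANNUAL_ENERGY_TWH, ANNUAL_EMISSIONS_MT)
print(f"Implied network carbon intensity: {intensity:.0f} g CO2/kWh")
```

The result, roughly 290 g CO2/kWh, sits between typical grid intensities for gas-heavy and renewables-heavy mixes, consistent with the mixed energy sources described in the next section.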
Bitcoin mining energy mix
Until 2021, most bitcoin mining was done in China. Chinese miners relied on cheap coal power in Xinjiang and Inner Mongolia during late autumn, winter and spring, migrating to regions with overcapacities in low-cost hydropower (like Sichuan and Yunnan) between May and October. After China banned bitcoin mining in June 2021, its mining operations moved to other countries. By August 2021, mining was concentrated in the U.S. (35%), Kazakhstan (18%), and Russia (11%) instead. The shift from coal resources in China to coal resources in Kazakhstan increased bitcoin's carbon footprint, as Kazakhstani coal plants use hard coal, which has the highest carbon content of all coal types. Despite the ban, covert mining operations gradually came back to China, reaching 21% of global hashrate as of 2022.
As of 2025, a CCAF
report based on a survey of 49 bitcoin-mining firms (about 48% of
network hashrate at the time of data collection) reported their
electricity mix as renewables (43%), natural gas (38%), nuclear (10%), and coal (9%). Research by the nonprofit tech company WattTime estimated that 54% of the power consumed by US miners came from fossil fuels. In 2023, Jamie Coutts, a crypto analyst writing for Bloomberg Terminal, said that renewables represented about half of global bitcoin mining sources.
Environmental effects of electricity use
A study in Scientific Reports found that from 2016 to 2021, each US dollar worth of mined bitcoin caused 35 cents worth of climate damage, compared to 95 cents for coal, 41 cents for gasoline, 33 cents for beef, and 4 cents for gold mining. A 2025 paper published in Nature Communications
found that the 34 largest U.S. bitcoin mines consumed 32.3 TWh of
electricity from August 2022 to July 2023, 33% more than the city of
Los Angeles used over the same period. Fossil fuel power plants
generated 85% of the increased electricity demand from these mines.
A 2025 study in Scientific Reports of ten major cryptocurrency-producing countries (2019–2022) found that Bitcoin mining's electricity use was linked to worse environmental sustainability. A larger share of renewables softened but did not eliminate these effects during the study period, and the impact on water use was limited. A 2025 peer-reviewed study in Sustainable Development
that used monthly data from 2015–2023 and DARDL/KRLS methods reported
an association between higher Bitcoin-mining electricity use and worse environmental sustainability in an SDG-framed measure; the authors characterized this as a risk factor for sustainability goals. A 2025 life-cycle assessment in ACS Sustainable Chemistry & Engineering
quantified Bitcoin's carbon, water, and land footprints, concluding
that the network's resource consumption poses sustainability challenges
and highlighting the need for technological advances and cleaner energy
sources.
Proposed mitigation strategies and debate
Reducing the environmental impact of bitcoin is possible by mining only with clean electricity sources. Bitcoin mining representatives argue that their industry creates opportunities for wind and solar companies, leading to a debate on whether bitcoin could be an ESG investment.
According to a 2023 ACS Sustainable Chemistry & Engineering
paper, bitcoin mining may offer opportunities to support greenhouse-gas
reduction and the renewable-energy transition, by using otherwise-curtailed renewable-energy and acting as a flexible electricity load. A 2023 review published in Resource and Energy Economics
also concluded that bitcoin mining could increase renewable capacity
but that it might increase carbon emissions and that mining bitcoin to
provide demand response largely mitigated its environmental impact. Two studies from 2023 and 2024 led by Fengqi You concluded that mining bitcoin off-grid during the precommercial phase (when a wind or solar farm is generating electricity but not yet integrated into the grid) could bring additional profits and therefore support renewable energy development and mitigate climate change. Another 2024 study by Fengqi You, published in the Proceedings of the National Academy of Sciences of the United States of America, showed that pairing green hydrogen infrastructure with bitcoin mining can accelerate the deployment of solar and wind power capacities. A 2024 study published in Heliyon simulated that a solar-powered bitcoin mining system could achieve a return on investment in 3.5 years, compared to 8.1 years for selling electricity to the grid, while preventing 50,000 tons of CO2 emissions annually. The authors note that proof-of-stake cryptocurrencies cannot provide these incentives.
Bitcoin has been mined via electricity generated through the combustion of associated petroleum gas (APG), which is a methane-rich byproduct of crude oil drilling that is sometimes flared or released into the atmosphere. Methane is a greenhouse gas with a global warming potential 28 to 36 times greater than CO2. By converting more of the methane to CO2
than flaring alone would, using APG generators reduces the APG's
contribution to the greenhouse effect, but this practice still harms the
environment. In places where flaring is prohibited, this practice has allowed more oil drills to operate by offsetting costs, delaying the fossil fuel phase-out. Commenting on one pilot project with ExxonMobil, political scientist Paasha Mahdavi noted in 2022 that this process could allow oil companies to report lower emissions by selling leaked gas, shifting responsibility to buyers and avoiding a real commitment to reductions. According to a 2024 paper published in the Journal of Cleaner Production, bitcoin mining can finance methane mitigation of landfill gases.
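The comparison between venting and combusting methane can be made concrete with the numbers quoted above. A rough sketch, using the GWP range of 28 to 36 from the text and the stoichiometric fact that complete combustion of 1 tonne of CH4 yields 2.75 tonnes of CO2 (molar masses: CH4 = 16 g/mol, CO2 = 44 g/mol); real flares and generators fall short of complete combustion, so this is an upper bound on the benefit:

```python
# Rough CO2-equivalent comparison of venting vs. fully combusting one
# tonne of methane (APG). GWP range comes from the text; 2.75 is the
# stoichiometric CO2 yield of complete methane combustion (44/16).

GWP_METHANE_LOW, GWP_METHANE_HIGH = 28, 36
CO2_PER_TONNE_CH4 = 44 / 16   # = 2.75 t CO2 per t CH4 combusted

def co2e_vented(tonnes_ch4: float, gwp: float) -> float:
    """CO2-equivalent of releasing methane directly to the atmosphere."""
    return tonnes_ch4 * gwp

def co2e_combusted(tonnes_ch4: float) -> float:
    """CO2 produced by fully combusting the same methane."""
    return tonnes_ch4 * CO2_PER_TONNE_CH4

for gwp in (GWP_METHANE_LOW, GWP_METHANE_HIGH):
    ratio = co2e_vented(1.0, gwp) / co2e_combusted(1.0)
    print(f"GWP {gwp}: venting is ~{ratio:.1f}x worse than full combustion")
```

This is why converting more of the methane to CO2 reduces APG's contribution to the greenhouse effect even though the practice still emits CO2.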
Comparison to other payment systems
In 2018 Nature Climate Change published a study on projections of Bitcoin growth authored by Camilo Mora and fellow researchers from the University of Hawaiʻi at Mānoa. The paper considered the potential effects on global CO2
emissions should Bitcoin eventually replace other cashless
transactions, finding that the associated energy consumption of Bitcoin
usage could potentially produce enough CO2 emissions to lead to a 2°C increase in global mean average temperature within 30 years under certain assumptions.[34][35] Several subsequent papers contested the researchers' assumptions. Mora and co-authors of the original article defended their paper.
In a 2023 study published in Ecological Economics, researchers from the International Monetary Fund estimated that the global payment system represented about 0.2% of global electricity consumption, comparable to the consumption of Portugal or Bangladesh. For bitcoin, energy used is estimated around 500 kWh per transaction, compared to 0.001 kWh for credit cards (not including consumption from the merchant's bank, which receives the payment). However, bitcoin's energy expenditure is not directly linked to the number of transactions. Layer 2 solutions, like the Lightning Network, and batching, allow bitcoin to process more payments than the number of on-chain transactions suggests. For instance, in 2022, bitcoin processed 100 million transactions per year, representing 250 million payments.
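The point that bitcoin's energy use is not directly tied to transaction count can be illustrated with the figures quoted above: if 100 million on-chain transactions carry 250 million payments, the per-payment energy is lower than the per-transaction figure suggests.

```python
# Illustrative per-payment energy comparison using only the figures
# quoted above: ~500 kWh per on-chain bitcoin transaction, ~0.001 kWh
# per card payment, and 250 million payments carried by 100 million
# on-chain transactions in 2022 (via batching and Layer 2).

BTC_KWH_PER_TX = 500.0
CARD_KWH_PER_PAYMENT = 0.001
ONCHAIN_TX = 100e6
TOTAL_PAYMENTS = 250e6

def kwh_per_payment(kwh_per_tx: float, tx: float, payments: float) -> float:
    # Batching and Layer 2 spread each transaction's energy over
    # several payments.
    return kwh_per_tx * tx / payments

per_payment = kwh_per_payment(BTC_KWH_PER_TX, ONCHAIN_TX, TOTAL_PAYMENTS)
print(f"Bitcoin: ~{per_payment:.0f} kWh per payment")
print(f"Card:    {CARD_KWH_PER_PAYMENT} kWh per payment")
```

Even so, the gap to card payments remains several orders of magnitude, since bitcoin's energy expenditure is driven by mining rather than by payment volume.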
OECD
notes that a direct comparison between blockchains, which are an
infrastructure technology, and the energy consumption of financial
sector activity may not be an appropriate comparison.
Figure: total active mining equipment in the bitcoin network and the related electronic waste generation, July 2014 to July 2021.
Bitcoins are usually mined on specialized computing hardware, called application-specific integrated circuits, with no alternative use beyond bitcoin mining. Due to the consistent increase of the bitcoin network's hashrate, one 2021 study estimated that mining devices had an average lifespan of 1.3 years until they became unprofitable and had to be replaced, resulting in significant electronic waste. This study estimated bitcoin's annual e-waste to be over 30,000 tonnes (comparable to the small IT equipment waste produced by the Netherlands) and each transaction to result in 272 g (9.6 oz) of e-waste. A 2024 systematic review
criticized this estimate and argued, based on market sales and IPO
data, that bitcoin mining hardware lifespan was closer to 4–5 years. According to the CCAF, e-waste is significantly lower, estimated at 2,300 tonnes in 2024 as 87% of hardware is recycled, sold or repurposed.
Noise pollution
Field measurements around several large U.S. bitcoin mines show
steady background sound in nearby residential areas commonly in the
mid-30s to low-50s dBA, with higher levels closer to the mining
facility. A 2024 consultant study commissioned by Hood County, Texas,
measured background levels ranging from 35–53 dBA and recorded a maximum
around 59 dBA at two neighborhood locations, while measurements near
the site ranged 60–65 dBA. Separately, a 2022 Washington Post
investigation that logged ~19,750 one-minute readings outside homes
near a North Carolina cryptomine found sound levels above 55 dBA in 98%
of readings and above 60 dBA in over 30% of readings. Mitigation approaches reported by operators and consultants include
acoustic barriers, equipment enclosures, optimized fan controls, and
immersion cooling; however, effectiveness and adoption vary by site.
Water footprint
According to a 2023 non-peer-reviewed commentary, bitcoin's water footprint reached 1,600 gigalitres (5.7×10^10 cu ft) in 2021, due to direct water consumption on site and indirect consumption from electricity generation. The author notes that this water footprint could be mitigated by using immersion cooling and power sources that do not require freshwater such as wind, solar, and thermoelectric power generation with dry cooling.
As of 2025, an investigation by The Texas Observer
reported that a bitcoin mining facility in Corpus Christi, Texas, used
approximately 127,500 gallons of fresh water per day, based on municipal
billing data.
Land footprint
A 2023 study in Earth's Future estimated the global land-use footprint attributable to bitcoin mining in 2020–2021 at 1,870 km2 (720 mi2), about 1.4 times the area of Los Angeles.
Health and local air pollution
A 2025 study in Nature Communications found that the demand from 34 large U.S. bitcoin mines increased PM2.5 pollution and exposed about 1.9 million people to ≥0.1 μg/m3 of additional PM2.5, sometimes far from the mines. A 2024 review in Environmental Research linked proof-of-work mining to higher air pollution and potential health risks, and called for mitigation and better data. A 2024 JAMA
Viewpoint described potential community hazards from cryptocurrency
mining, including air and noise pollution, and recommended protections
for vulnerable groups. One 2020 paper found that, in 2018, each US$1 of bitcoin created was
associated with about US$0.49 in combined health and climate damages in
the United States (US$0.37 in China). A 2024 analysis estimated the climate and health damages from U.S.
mining during 2019–2021 exceeded the value of coins generated in many
geographic hotspots.
Regulatory responses
China's 2021 bitcoin mining ban was partly motivated by its role in illegal coal mining and environmental concerns.
In Canada, due to high demand from the industry and concerns
that their renewable electricity could be better used, the provinces of Manitoba and British Columbia paused new connections of bitcoin mining facilities to the hydroelectric grid for 18 months starting in late 2022, while Hydro-Québec increased prices and capped usage for bitcoin miners.
In October 2022, due to the global energy crisis, the European Commission invited member states to lower the electricity consumption of crypto-asset miners and end tax breaks and other incentives benefiting them.
Green computing, green IT (information technology), or Information and Communication Technology Sustainability, is the study and practice of environmentally sustainable computing or IT.
The goals of green computing include optimising energy efficiency during the product's lifecycle; leveraging greener energy sources to power the product and its network; improving the reusability, maintainability, and repairability of the product to extend its lifecycle; improving the recyclability or biodegradability of e-waste to support circular economy
ambitions; and aligning the manufacture and use of IT systems with
environmental and social goals. Green computing is important for all
classes of systems, ranging from handheld systems to large-scale data
centers. According to the International Energy Agency, data centres
accounted for about 1.5% of global electricity consumption in 2024
(~415 TWh), and under its central scenario, demand could roughly double
to ~945 TWh by 2030, with AI workloads a major driver of growth.
Sustainable development is a concept that redefines the notion of the
general interest by integrating environmental, social, and economic
considerations. Many corporate IT departments have green computing
initiatives to reduce the environmental effect of their IT operations. Yet it is also clear that the environmental footprint of the sector is
significant, estimated at 5–9% of the world's total electricity use and
more than 2% of all emissions. Data centers and telecommunications networks will need to become more energy efficient,
reuse waste energy, use more renewable energy sources, and use less
water for cooling to stay competitive. In the European Union, policy
efforts and industry initiatives aim for climate-neutral data centers by
2030.
Green computing can involve complex trade-offs. It can be useful
to distinguish between IT for environmental sustainability and the
environmental sustainability of IT. Although green IT focuses on the
environmental sustainability of IT, in practice these two aspects are
often interconnected. For example, launching an online shopping
platform may increase the carbon footprint of a company's own IT
operations, while at the same time helping customers to purchase
products remotely, without requiring them to drive, in turn reducing
greenhouse gas emission related to travel. The company might be able to take credit for these decarbonisation benefits under its Scope 3 emissions reporting, which includes emissions from across the entire value chain.
Origins
Energy Star logo
In 1992, the U.S. Environmental Protection Agency launched Energy Star, a voluntary labeling program that is designed to promote and recognize the energy efficiency in monitors, climate control equipment, and other technologies. This resulted in the widespread adoption of sleep mode among consumer electronics. Concurrently, the Swedish organization TCO Development launched the TCO Certified program to promote low magnetic and electrical emissions from CRT-based computer displays; this program was later expanded to include criteria on energy consumption, ergonomics, and the use of hazardous materials in construction.
Regulations and industry initiatives
In 2009 the Organisation for Economic Co-operation and Development
(OECD) published a survey of over 90 government and industry
initiatives on "Green ICTs" (Information and Communication
Technologies), the environment and climate change. The report concluded
that initiatives tended to concentrate on greening ICTs themselves,
rather than on their actual implementation to reduce global warming and environmental degradation.
In general, only 20% of initiatives had measurable targets, with
government programs tending to include targets more frequently than
business associations.
Government
Many governmental agencies have continued to implement standards and regulations that encourage green computing. The Energy Star
program was revised in October 2006 to include stricter efficiency
requirements for computer equipment, along with a tiered ranking system
for approved products.
By 2008, 26 US states had established statewide recycling programs for obsolete computers and consumer electronics equipment. The statutes either impose an "advance recovery fee" for each unit sold
at retail or require the manufacturers to reclaim the equipment at
disposal.
In 2009, the American Recovery and Reinvestment Act
(ARRA) was signed into law by President Obama. The bill
allocated over $90 billion to be invested in green initiatives
(renewable energy, smart grids, energy efficiency, etc.) In January
2010, the U.S. Energy Department granted $47 million of the ARRA money
towards projects to improve the energy efficiency of data centers. The
projects funded research to optimize data center hardware and
software, improve the power supply chain, and develop data center cooling technologies.
Green Digital Governance
Green digital governance refers to the use of information and communication technology
(ICT) to support environmentally sustainable policies and practices. It
describes a strategy with which an organisation strives to align its
information and communications technology with sustainability goals. This can include using digital tools and platforms to monitor and
regulate environmental impact, as well as promoting the development and
use of clean and renewable energy sources
in the technology sector. The goal of green digital governance is to
reduce the carbon footprint of the digital economy and to support the
transition to a more sustainable and resilient society.
Both the green and the digital transitions are on the agenda for
most European countries, as well as the EU as a whole. Documents and
goals such as the European Green Deal and the Sustainable Development Goals, fit for 55, Digital Europe
and others have begun the transitions. These two transitions often
contradict each other, as digital technologies have substantial
environmental footprints that go against the targets of the green
transition.
The European Union sees digitalisation and the adoption of ICT
(Information and Communications Technology) solutions as an important
tool for creating greener solutions, while also acknowledging that in
order to achieve the desired positive environmental impact, the tools
themselves must be environmentally sustainable. The green transition may accelerate innovation and adoption of digital
solutions offering the ICT sector new opportunities for becoming more
competitive. The synergy created as a result of the green transition and
digitalisation brings social, economic and environmental benefits,
which is a goal of environmentally friendly digital governments and the
creation of green ICT solutions in general.
The digital component is expected to also be used to reach the ambitions of the European Green Deal and Sustainable Development Goals.
As powerful enablers for the sustainability transition, digital
solutions can advance the circular economy, support the decarbonisation
of all sectors and reduce the environmental and social footprint of
products placed on the EU market. For example, key sectors such as
precision agriculture, transport and energy can benefit from digital
solutions in pursuing the sustainability objectives of the European
Green Deal.
E-government services can provide solutions to the environmental problem. The possibility for a citizen to fully request and get a service online
would render, in addition to cost savings for the public authorities
and increased citizen satisfaction, reductions of carbon emissions and
paper consumption.
Industry
The iMasons Climate Accord (ICA), founded in 2022, is a
cooperative of companies committed to reducing carbon in digital
infrastructure materials, products, and power.
Climate Savers Computing Initiative (CSCI) is an effort to reduce the electric power consumption of PCs in active and inactive states. The CSCI provides a catalog of green products from its member
organizations, and information for reducing PC power consumption. It was
started on June 12, 2007. The name stems from the World Wildlife Fund's Climate Savers program, which began in 1999. The WWF is also a member of the initiative.
The Green Electronics Council offers the Electronic Product Environmental Assessment Tool
(EPEAT) to assist in the purchase of "greener" computing systems. The
Council evaluates computing equipment on 51 criteria – 23 required and
28 optional – that measure a product's efficiency and sustainability
attributes. Products are rated Gold, Silver, or Bronze, depending on how
many optional criteria they meet. On January 24, 2007, President George W. Bush issued Executive Order 13423, which requires all United States Federal agencies to use EPEAT when purchasing computer systems.
The Green Grid
is a global consortium dedicated to advancing energy efficiency in data
centers and business computing ecosystems. It was founded in February
2007 by several key companies in the industry – AMD, APC, Dell, HP, IBM, Intel, Microsoft, Rackable Systems, SprayCool (purchased in 2010 by Parker), Sun Microsystems and VMware. The Green Grid has since grown to hundreds of members, including end-users and government organizations focused on improving data center infrastructure efficiency (DCIE).
The Green500 list rates supercomputers by energy efficiency (megaflops/watt), encouraging a focus on efficiency rather than absolute performance.
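The Green500 ranking reduces to a single metric: sustained performance divided by power draw. A minimal sketch of the ranking logic, with system names and numbers invented for illustration:

```python
# Sketch of the performance-per-watt metric used by the Green500.
# System names and figures below are hypothetical, for illustration only.

def gflops_per_watt(gflops: float, watts: float) -> float:
    """Energy efficiency: sustained GFLOPS per watt of power draw."""
    return gflops / watts

systems = {
    "system_a": (1_000_000, 20_000),   # 1 PFLOPS at 20 kW  -> 50 GFLOPS/W
    "system_b": (5_000_000, 250_000),  # 5 PFLOPS at 250 kW -> 20 GFLOPS/W
}

# Rank by efficiency, most efficient first. Note system_b is the faster
# machine in absolute terms but ranks lower on efficiency.
ranked = sorted(systems, key=lambda s: gflops_per_watt(*systems[s]),
                reverse=True)
print(ranked)
```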
Green Comm Challenge is an organization that promotes the development of energy conservation technology and practices in the field of ICT.
The Transaction Processing Performance Council
(TPC) Energy specification augments existing TPC benchmarks by allowing
optional publications of energy metrics alongside performance results.
SPECpower
is the first industry standard benchmark that measures power
consumption in relation to performance for server-class computers. Other
benchmarks which measure energy efficiency include SPECweb, SPECvirt, and VMmark.
Approaches
Modern IT
systems rely on a complicated mix of people, networks, and hardware; as
such, a green computing initiative ideally covers these areas. A
solution may also need to address end user satisfaction, management
restructuring, regulatory compliance, and return on investment (ROI).
There are also fiscal motivations for companies to take control of their
own power consumption; "of the power management tools available, one of
the most powerful may still be simple, plain, common sense."
Product longevity
Gartner maintains that the PC manufacturing process accounts for 70% of the natural resources used in the life cycle of a PC. In 2011, Fujitsu released a life-cycle assessment
(LCA) of a desktop showing that manufacturing and end of life
account for the majority of the desktop's ecological footprint.
prolong the equipment's lifetime. A recent life-cycle assessment
comparing a desktop and a laptop for a four-year use case with similar
performance found total carbon footprints of 679.1 kg CO2e for the
desktop versus 286.1 kg CO2e for the laptop; for both systems,
manufacturing was the largest contributor, followed by the use phase.
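The gap between the two form factors in that assessment is easy to quantify from the totals quoted above:

```python
# Ratio of the LCA totals quoted above (kg CO2e over a four-year use
# case); both figures come from the text.

DESKTOP_KG_CO2E = 679.1
LAPTOP_KG_CO2E = 286.1

ratio = DESKTOP_KG_CO2E / LAPTOP_KG_CO2E
print(f"The desktop's footprint is ~{ratio:.1f}x the laptop's")
```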
Another report from Gartner recommends: "Look for product longevity, including upgradability and modularity." For instance, manufacturing a new PC makes a far bigger ecological footprint than manufacturing a new RAM module to upgrade an existing one.
Data center design
Data center facilities are heavy consumers of energy, accounting for
between 1.1% and 1.5% of the world's total energy use in 2010. The U.S. Department of Energy estimates that data center facilities
consume 100 to 200 times as much energy as standard office
buildings.
Energy efficient data center design should address all of the
energy use aspects included in a data center: from the IT equipment to
the HVAC (Heating, ventilation and air conditioning) equipment to the
actual location, configuration and construction of the building.
The U.S. Department of Energy specifies five primary areas on which to focus energy efficient data center design best practices:
Information technology (IT) systems
Environmental conditions
Air management
Cooling systems
Electrical systems
Additional energy efficient design opportunities specified by the
U.S. Department of Energy include on-site electrical generation and
recycling of waste heat.
Energy efficient data center design should help to better use a data center's space, and increase performance and efficiency.
The efficiency of algorithms affects the amount of computer resources
required for any given computing function and there are many efficiency
trade-offs in writing programs. Algorithm changes, such as switching
from a slow (e.g. linear) search algorithm
to a fast (e.g. hashed or indexed) search algorithm can reduce resource
usage for a given task from substantial to close to zero. In 2009, a
study by a physicist at Harvard estimated that the average Google search released 7 grams of carbon dioxide (CO2). However, Google disputed this figure, arguing that a typical search produced only 0.2 grams of CO2. Similarly, the environmental footprint of distributed computing is
heavily dependent on the algorithmic efficiency of its underlying
consensus mechanisms. Mathematical consumption models evaluating Sybil
attack resistance schemes indicate that ledger architectures utilizing
directed acyclic graphs (DAG) to achieve consensus via virtual voting
present lower energy consumption per transaction when compared to
traditional proof-of-work systems and standard proof-of-stake
blockchains.
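The linear-to-hashed search change mentioned above can be sketched directly: replacing an O(n) scan with an O(1) hash lookup cuts the work per query by orders of magnitude, which translates into fewer CPU cycles and less energy for the same task.

```python
# Sketch of the algorithmic-efficiency point above: a linear scan does
# O(n) comparisons per query, while a hash-based lookup does O(1)
# expected work once the index is built.

def linear_search(items: list, target) -> bool:
    for item in items:          # O(n) comparisons per query
        if item == target:
            return True
    return False

def hashed_search(index: set, target) -> bool:
    return target in index      # O(1) expected per query

data = list(range(1_000_000))
index = set(data)               # built once, then reused for every query

# Both approaches give the same answer; only the work done differs.
assert linear_search(data, 999_999) == hashed_search(index, 999_999) == True
```

For a workload of millions of queries, the one-time cost of building the index is quickly repaid.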
Algorithms can also be used to route data to data centers where
electricity is less expensive. Researchers from MIT, Carnegie Mellon
University, and Akamai have tested an energy allocation algorithm that
routes traffic to the location with the lowest energy costs. The
researchers project up to 40 percent savings on energy costs if their
proposed algorithm were to be deployed. However, this approach does not
actually reduce the amount of energy being used; it reduces only the
cost to the company using it. Nonetheless, a similar strategy could be
used to direct traffic to rely on energy that is produced in a more
environmentally friendly or efficient way. A similar approach has also
been used to cut energy usage by routing traffic away from data centers
experiencing warm weather; this allows computers to be shut down to
avoid using air conditioning.
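The routing idea described above can be sketched as a greedy assignment: send traffic to the data center with the cheapest current electricity, subject to capacity. Prices and capacities below are invented for illustration; the actual MIT/CMU/Akamai algorithm is more involved.

```python
# Minimal sketch of cost-aware traffic routing, under assumed
# (hypothetical) electricity prices and capacities. The same structure
# works if "price" is replaced by carbon intensity to prefer cleaner
# energy rather than cheaper energy.

def route(requests: int, centers: dict) -> dict:
    """centers maps name -> (price_per_kwh, remaining_capacity)."""
    assignment = {}
    # Greedily fill the cheapest centers first.
    for name in sorted(centers, key=lambda n: centers[n][0]):
        price, capacity = centers[name]
        take = min(requests, capacity)
        if take:
            assignment[name] = take
            requests -= take
        if requests == 0:
            break
    return assignment

print(route(150, {"us_east": (0.12, 100), "us_west": (0.08, 100)}))
# The cheaper center fills up first; overflow spills to the next one.
```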
Larger server centers are sometimes located where energy and land
are inexpensive and readily available. Local availability of renewable
energy, climate that allows outside air to be used for cooling, or
locating them where the heat they produce may be used for other purposes
could be factors in green siting decisions.
Approaches to actually reduce the energy consumption of network
devices by proper network/device management techniques have been
surveyed by Bianzino et al. The authors grouped the approaches into four main strategies, namely (i)
Adaptive Link Rate (ALR), (ii) Interface Proxying, (iii) Energy Aware
Infrastructure, and (iv) Maximum Energy Aware Applications.
Computer virtualization refers to the abstraction of computer
resources, such as the process of running two or more logical computer
systems on one set of physical hardware. The concept originated with the
IBM mainframe operating systems of the 1960s, and was commercialized for x86-compatible
computers, and other computer systems, in the 1990s. With
virtualization, a system administrator can combine several formerly
physical systems as virtual machines on one powerful system, thereby
conserving resources by removing need for some of the original hardware
and reducing power and cooling consumption. Virtualization can assist in
distributing work so that servers are either busy or put in a low-power
sleep state. Several commercial companies and open-source projects now
offer software packages to enable a transition to virtual computing. Intel Corporation and AMD have also built proprietary virtualization enhancements to the x86 instruction set into each of their CPU product lines, in order to facilitate virtual computing.
New virtual technologies, such as operating system-level virtualization
can also be used to reduce energy consumption. These technologies make a
more efficient use of resources, thus reducing energy consumption by
design. Also, consolidation with operating system-level virtualization
is more efficient than with full virtual machines, so more services can be deployed on the same physical machine, reducing the amount of hardware needed.
Terminal servers have also been used in green computing. When using
the system, users at a terminal connect to a central server; all of the
actual computing is done on the server, but the end user experiences the
system operating as if it were on the terminal. These can be combined
with thin clients, which use up to 1/8 the amount of energy of a normal workstation, resulting in a decrease of energy costs and consumption. There has been an increase in using terminal services with thin clients
to create virtual labs. Examples of terminal server software include Terminal Services for Windows and the Linux Terminal Server Project (LTSP) for the Linux operating system. Software-based remote desktop clients such as Windows Remote Desktop and RealVNC can provide similar thin-client functions when run on low power hardware that connects to a server.
Data compression, which involves using fewer bits to encode
information, may also be used in green computing depending on the
structure of the data. Since it is highly data specific, data
compression strategies may result in using more energy or resources than
necessary in some cases. However, choosing a well-suited compression
algorithm for the dataset can yield greater power efficiency and reduce
network and storage requirements. There is a tradeoff between compression ratio and energy consumption.
Deciding whether or not this is worthwhile depends on the dataset's
compressibility. Compression improves energy efficiency for data with a
compression ratio much less than roughly 0.3, and hurts for data with
higher compression ratios.
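The compress-or-not decision described above can be sketched as a simple ratio test. This uses the rough 0.3 threshold from the text (ratio = compressed size / original size); in practice the break-even point depends on the CPU, network, and storage costs of the specific system.

```python
# Sketch of the compression tradeoff described above: compress only
# when the achieved ratio is below a threshold (0.3 per the text;
# workload-dependent in practice).
import os
import zlib

RATIO_THRESHOLD = 0.3

def worth_compressing(data: bytes, threshold: float = RATIO_THRESHOLD) -> bool:
    compressed = zlib.compress(data)
    ratio = len(compressed) / len(data)
    return ratio < threshold

highly_redundant = b"abc" * 10_000    # compresses extremely well
already_random = os.urandom(30_000)   # essentially incompressible

print(worth_compressing(highly_redundant))  # redundant data: compress
print(worth_compressing(already_random))    # random data: skip
```

In a real pipeline the ratio would be estimated from a sample of the data rather than by compressing the full payload first.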
The Advanced Configuration and Power Interface
(ACPI), an open industry standard, allows an operating system to
directly control the power-saving aspects of its underlying hardware.
This allows a system to automatically turn off components such as monitors and hard drives after set periods of inactivity. In addition, a system may hibernate, when most components (including the CPU and the system RAM) are turned off. ACPI is a successor to an earlier Intel-Microsoft standard called Advanced Power Management, which allows a computer's BIOS to control power management functions.
Some programs allow the user to manually adjust the voltages
supplied to the CPU, which reduces both the amount of heat produced and
electricity consumed. This process is called undervolting. Some CPUs can automatically undervolt the processor, depending on the workload; this technology is called "SpeedStep" on Intel processors, "PowerNow!"/"Cool'n'Quiet" on AMD chips, LongHaul on VIA CPUs, and LongRun with Transmeta processors.
Data center power
Data centers, which have been criticized for their extraordinarily
high energy demand, are a primary focus for proponents of green
computing. According to a Greenpeace study, data centers represent 21% of the electricity consumed by the IT sector, which is about 382 billion kWh a year.
Data centers can potentially improve their energy and space
efficiency through techniques such as storage consolidation and
virtualization. Many organizations are aiming to eliminate underused
servers, resulting in lower energy usage. The U.S. federal government set a minimum 10% reduction target for data center energy usage by 2011. With the aid of a self-styled ultra-efficient evaporative cooling technology, Google Inc. claims to have reduced its energy consumption to 50% of the industry average.
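A standard way to express data center efficiency is Power Usage Effectiveness (PUE): total facility energy divided by the energy delivered to IT equipment. A minimal sketch with illustrative figures:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by
    the energy delivered to IT equipment. 1.0 is the ideal."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative figures: a facility drawing 1.5 MWh for every 1 MWh of
# IT load has a PUE of 1.5; highly optimized hyperscale sites report
# values closer to 1.1.
print(pue(1500, 1000))  # 1.5
```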
Cryptocurrency mining, particularly for proof-of-work currencies like Bitcoin, also uses significant amounts of energy globally. Advocates have argued that cryptocurrency can help to drive investment in green energy.
Operating system support
Microsoft Windows has included limited PC power management features since Windows 95. These initially provided for stand-by (suspend-to-RAM) and a monitor
low power state. Further iterations of Windows added hibernate
(suspend-to-disk) and support for the ACPI standard. Windows 2000
was the first NT-based operating system to include power management.
This required major changes to the underlying operating system
architecture and a new hardware driver model. Windows 2000 also
introduced Group Policy,
a technology that allowed administrators to centrally configure most
Windows features. However, power management was not one of those
features. This is probably because the power management settings design
relied upon a connected set of per-user and per-machine binary registry
values, effectively leaving it up to each user to configure their own power management settings.
This approach, which is not compatible with Windows Group Policy, was repeated in Windows XP. The reasons for this design decision by Microsoft are not known, and it has resulted in heavy criticism. Microsoft significantly improved this in Windows Vista by redesigning the power management system to allow basic configuration
by Group Policy. The support offered is limited to a single
per-computer policy. Windows 7 retains these limitations but includes
refinements for timer coalescing, processor power management, and display panel brightness. The most significant change in Windows 7
is in the user experience. The prominence of the default High
Performance power plan has been reduced with the aim of encouraging
users to save power.
Third-party PC power management software adds features beyond those built into the Windows operating system. Most products offer Active Directory
integration and per-user/per-machine settings with the more advanced
offering multiple power plans, scheduled power plans, anti-insomnia
features and enterprise power usage reporting.
Linux systems began to provide laptop-optimized power management in 2005, and power management options have been mainstream since 2009.
Power supply
Desktop computer power supplies are in general 70–75% efficient, dissipating the remaining energy as heat. A certification program called 80 Plus
certifies PSUs that are at least 80% efficient; typically these models
are drop-in replacements for older, less efficient PSUs of the same form
factor. As of July 20, 2007, all new Energy Star 4.0-certified desktop
PSUs must be at least 80% efficient.
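The practical effect of PSU efficiency is easy to quantify: the wall draw is the DC load divided by the efficiency, and the difference is dissipated as heat. A short sketch with illustrative numbers:

```python
def wall_power(dc_load_watts: float, efficiency: float) -> float:
    """Power drawn from the wall for a given DC load; the remainder
    is dissipated as heat inside the power supply."""
    return dc_load_watts / efficiency

# A 300 W load on a 75%-efficient PSU pulls 400 W from the wall
# (100 W of waste heat); an 80 Plus unit pulls at most 375 W.
print(wall_power(300, 0.75))  # 400.0
print(wall_power(300, 0.80))  # 375.0
```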
Storage
Smaller form factor (e.g., 2.5 inch) hard disk drives often consume less power per gigabyte than physically larger drives. Unlike hard disk drives, solid-state drives store data in flash memory or DRAM. With no moving parts, power consumption may be reduced somewhat for low-capacity flash-based devices.
As hard drive prices have fallen, storage farms have tended to
increase in capacity to make more data available online. This includes
archival and backup data that would formerly have been saved on tape or
other offline storage. The increase in online storage has increased
power consumption. Reducing the power consumed by large storage arrays,
while still providing the benefits of online storage, is a subject of
ongoing research.
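A simple way to compare drives for this purpose is power per unit of capacity. The drive figures below are hypothetical, chosen only to illustrate the per-gigabyte comparison described above:

```python
def watts_per_terabyte(idle_watts: float, capacity_tb: float) -> float:
    """Rough storage-efficiency index: idle power per terabyte."""
    return idle_watts / capacity_tb

# Hypothetical figures: a 3.5" drive idling at 8 W vs. a 2.5" drive
# idling at 2 W, both with 2 TB of capacity.
print(watts_per_terabyte(8.0, 2.0))  # 4.0 W/TB for the 3.5" drive
print(watts_per_terabyte(2.0, 2.0))  # 1.0 W/TB for the 2.5" drive
```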
Video card
A fast GPU may be the largest power consumer in a computer.
Display
Unlike other display technologies, electronic paper does not use any power while displaying an image. CRT monitors typically use more power than LCD monitors. They also contain significant amounts of lead. LCD monitors typically use a cold-cathode fluorescent bulb to provide light for the display. Most newer displays use an array of light-emitting diodes (LEDs) in place of the fluorescent bulb, which further reduces the amount of electricity used by the display. Fluorescent back-lights also contain mercury, whereas LED back-lights do not.
A light-on-dark color scheme, also called dark mode, requires less energy to display on newer display technologies such as OLED. This positively impacts battery life and energy consumption. While an
OLED will consume around 40% of the power of an LCD displaying an image
that is primarily black, it can use more than three times as much power
to display an image with a white background, such as a document or web
site. This can lead to reduced battery life and increased energy use, unless a light-on-dark color scheme is used. A 2018 article in Popular Science suggests that "Dark mode is easier on the eyes and battery" and displaying white on full brightness uses roughly six times as much
power as pure black on a Google Pixel, which has an OLED display. Apple's iOS 13 and iPadOS 13 both feature a light-on-dark mode, which allows third-party developers to implement their own dark themes. Google's Android 10 features a system-level dark mode.
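Because each OLED pixel emits its own light, panel power tracks the average brightness of what is shown. The following is a very rough model with made-up constants, sketched only to show why a mostly-dark page draws far less power than a mostly-white one:

```python
def relative_oled_power(pixels: list[float], base: float = 0.1) -> float:
    """Rough model: OLED power scales with mean pixel luminance
    (0.0 = black, 1.0 = white) plus a fixed baseline for the panel's
    drive electronics. The constants are illustrative, not measured."""
    mean_luminance = sum(pixels) / len(pixels)
    return base + (1 - base) * mean_luminance

light_theme = [0.9] * 100  # mostly white page
dark_theme = [0.1] * 100   # mostly black page
print(relative_oled_power(light_theme) / relative_oled_power(dark_theme))  # roughly 4.8x
```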
Recycling
Recycling computing equipment can keep harmful materials such as lead, mercury, and hexavalent chromium out of landfills,
and can replace equipment that otherwise would need to be manufactured,
saving further energy and emissions. Computer systems that have
outlived their original function can be re-purposed, or donated to
various charities and non-profit organizations. However, many charities have recently imposed minimum system requirements for donated equipment. Additionally, parts from outdated systems may be salvaged and recycled through certain retail outlets and municipal or private recycling centers. Computing supplies, such as printer cartridges, paper, and batteries may be recycled as well.
A drawback to many of these schemes is that computers gathered through recycling drives are often shipped to developing countries where environmental standards are less strict than in North America and Europe. The Silicon Valley Toxics Coalition has estimated that 80% of the post-consumer e-waste collected for recycling is shipped abroad to countries such as China and India.
In 2011, the collection rate of e-waste remained low, even in the
most ecology-responsible countries like France. In the U.S., the ratio of e-waste collected to electronic equipment sold was about 14% annually from 2006 to 2009.
The recycling of old computers raises a privacy issue. The old
storage devices still hold private information, such as emails,
passwords, and credit card numbers, which can be recovered simply by
using software available freely on the Internet. Deletion of a file does
not actually remove the file from the hard drive. Before recycling a computer, users should remove the hard drive or drives and either physically destroy them or store them somewhere safe.
There are some authorized hardware recycling companies to whom the
computer may be given for recycling, and they typically sign a
non-disclosure agreement.
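Because deleting a file only removes its directory entry, one common mitigation is overwriting the contents before deletion. The sketch below is illustrative only: on SSDs, wear leveling means overwritten blocks may survive, so full-disk erasure tools or physical destruction remain safer.

```python
import os

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file's contents with random bytes before deleting it.
    Illustrative only: on SSDs, wear leveling may leave old blocks
    intact, so this is not a substitute for full-disk erasure."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())
    os.remove(path)

# Usage: create a throwaway file, then wipe it.
with open("secret.txt", "wb") as f:
    f.write(b"credit card: 1234")
overwrite_and_delete("secret.txt")
print(os.path.exists("secret.txt"))  # False
```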
Cloud computing
Cloud computing may help to address two major ICT challenges related to green computing: energy usage and embodied carbon. Hyperscale data centers such as those operated by AWS, Azure, and GCP benefit from economies of scale, while virtualization, dynamic provisioning, multi-tenancy, and green data center approaches can enable more efficient resource allocation. Organizations
may be able to reduce their direct energy consumption and carbon
emissions by up to 30% and 90% respectively by moving certain
on-premises applications into the public cloud.
However, critics point to shortcomings in the carbon tracking and management tools provided by major cloud providers. GreenOps, also known as DevGreenOps, DevSusOps or DevSustainableOps, is
emerging as a framework to include sustainability into cloud
management. Carbon-aware computing and grid-aware computing can form part of a
GreenOps approach. This includes techniques like demand shifting, which
means moving computational workloads to locations or times of day with
cleaner energy in the grid. Demand shaping is a similar technique, which focuses on adjusting
workloads according to the amount of clean energy currently available.
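At its core, demand shifting reduces to choosing the place (or time) with the cleanest grid. A minimal sketch; the region names and carbon-intensity readings are made up, and a real scheduler would query a grid-data API rather than a hard-coded dictionary:

```python
def greenest_region(carbon_intensity: dict[str, float]) -> str:
    """Pick the region whose grid currently has the lowest carbon
    intensity (grams of CO2-equivalent per kWh)."""
    return min(carbon_intensity, key=carbon_intensity.get)

# Hypothetical readings at the moment of scheduling.
readings = {"eu-north": 45.0, "us-east": 390.0, "ap-south": 630.0}
print(greenest_region(readings))  # eu-north
```

Demand shaping would instead keep the workload where it is and scale it up or down as the local reading changes.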
New technologies such as edge and fog computing offer a way to reduce energy consumption. These technologies allow computation to be redistributed close to where it is used, reducing energy costs in the network. Furthermore, smaller data centers reduce the energy used in operations such as cooling and maintenance.
Remote work
Remote work using teleconference and telepresence
technologies is often implemented in green computing initiatives. The
advantages include increased worker satisfaction, reduction of greenhouse gas emissions related to travel, and increased profit margins as a result of lower overhead costs for office space, heat, lighting, etc. The average annual energy consumption for U.S. office buildings is over
23 kilowatt hours per square foot, with heat, air conditioning and
lighting accounting for 70% of all energy consumed. Other related initiatives, such as Hoteling, reduce the square footage per employee as workers reserve space only when needed. Many types of jobs, such as sales, consulting, and field service, integrate well with this technique.
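The savings from reduced office footprint follow directly from the figure above. A short sketch using the ~23 kWh per square foot average cited in the text; the 150 sq ft workstation size is a hypothetical example:

```python
def annual_office_kwh(square_feet: float, kwh_per_sqft: float = 23.0) -> float:
    """Annual building energy attributable to a given floor area, using
    the ~23 kWh/sq ft average for U.S. office buildings cited above."""
    return square_feet * kwh_per_sqft

# If hoteling halves a hypothetical 150 sq ft dedicated workstation,
# the avoided building energy is roughly:
saved = annual_office_kwh(150) - annual_office_kwh(75)
print(saved)  # 1725.0 kWh per year
```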
Voice over IP (VoIP) reduces the telephony wiring infrastructure by sharing the existing Ethernet copper. VoIP and phone extension mobility have also made hot desking more practical. Wi-Fi consumes 4 to 10 times less energy than 4G.
Telecommunication network devices energy indices
In 2013 ICT energy consumption, in the US and worldwide, was
estimated respectively at 9.4% and 5.3% of the total electricity
produced. The energy consumption of ICT is significant today even when compared with other industries. Some studies have tried to identify the key energy indices that allow a relevant comparison between different devices (network elements). This analysis focused on how to optimise device and network consumption for carrier telecommunications. The target was to allow an immediate perception of the relationship between the network technology and the environmental effect. These studies are at an early stage, and further research will be necessary.
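One widely used index of this kind is energy per bit carried: device power divided by throughput. A minimal sketch; the two devices and their figures are hypothetical:

```python
def joules_per_bit(power_watts: float, throughput_bps: float) -> float:
    """A common network-element energy index: power divided by
    throughput, i.e. energy spent per bit carried."""
    return power_watts / throughput_bps

# Hypothetical devices: a 500 W core router forwarding 100 Gbit/s vs.
# a 40 W access switch forwarding 1 Gbit/s.
print(joules_per_bit(500, 100e9))  # 5e-09 J/bit
print(joules_per_bit(40, 1e9))     # 4e-08 J/bit
```

On this index the high-capacity router is the more efficient device per bit, even though its absolute power draw is far higher.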
Supercomputers
The Green500
list was first announced on November 15, 2007, at SC|07. As a
complement to the TOP500, the listing of the Green500 began a new era
where supercomputers can be compared by performance-per-watt. As of 2019, two Japanese supercomputers topped the Green500 energy
efficiency ranking with performance exceeding 16 GFLOPS/watt, and two
IBM AC922 systems followed with performance exceeding 15 GFLOPS/watt.
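The Green500 metric itself is simple: sustained LINPACK performance divided by total system power. A sketch ranking two hypothetical systems (the names and figures are made up, not actual Green500 entries):

```python
def gflops_per_watt(rmax_gflops: float, power_watts: float) -> float:
    """The Green500 metric: sustained LINPACK performance (Rmax)
    divided by total system power."""
    return rmax_gflops / power_watts

# Hypothetical entries: (Rmax in GFLOPS, power in watts).
systems = {"A": (1.6e6, 100_000), "B": (3.0e6, 250_000)}
ranked = sorted(systems, key=lambda s: gflops_per_watt(*systems[s]), reverse=True)
print(ranked)  # ['A', 'B'] -- A achieves 16 GFLOPS/W vs. B's 12
```

Note that the smaller machine wins: the ranking rewards efficiency, not raw performance, which is the point of complementing the TOP500.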
Education and certification
Green computing programs
Degree and postgraduate programs provide training in a range of
information technology concentrations along with sustainable strategies
to educate students on how to build and maintain systems while reducing
their harm to the environment. The Australian National University (ANU) offers "ICT Sustainability" as part of its information technology and engineering masters programs. Athabasca University offers a similar course "Green ICT Strategies", adapted from the ANU course notes by Tom Worthington. In the UK, Leeds Beckett University offers an MSc Sustainable Computing program in both full- and part-time access modes.
Green computing certifications
Some certifications demonstrate that an individual has specific green computing knowledge, including:
Green Computing Initiative – GCI offers the Certified Green
Computing User Specialist (CGCUS), Certified Green Computing Architect
(CGCA) and Certified Green Computing Professional (CGCP) certifications.
Information Systems Examination Board
(ISEB) Foundation Certificate in Green IT is appropriate for showing an
overall understanding and awareness of green computing and where its
implementation can be beneficial.
Singapore Infocomm Technology Federation (SiTF) Singapore Certified
Green IT Professional is an industry endorsed professional level
certification offered with SiTF authorized training partners.
Certification requires completion of a four-day instructor-led core
course, plus a one-day elective from an authorized vendor.
Australian Computer Society
(ACS) The ACS offers a certificate for "Green Technology Strategies" as
part of the Computer Professional Education Program (CPEP). Award of a
certificate requires completion of a 12-week e-learning course designed
by Tom Worthington, with written assignments.
Ratings
Since 2010, Greenpeace
has maintained a list of ratings of prominent technology companies in
several countries based on how clean the energy used by that company is,
ranging from A (the best) to F (the worst).
ICT and energy demand
Digitalization has brought additional energy consumption;
energy-increasing effects have been greater than the energy-reducing
effects. Four effects that increase energy consumption are:
Direct effect – Strong increases in (technical) energy efficiency in ICT are countered by the growth of the sector.
Efficiency and rebound effects – Rebound effects are high for ICT, and increased productivity often leads to new behaviors that are more energy intensive.
Economic growth – Digitalization has a positive effect on economic growth.
Sectoral change – Growth of ICT services tends not to replace existing services but to come on top of them.
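The rebound effect in the list above can be made concrete with a toy model: an efficiency gain reduces energy use, but a rebound fraction of the expected savings is eaten by increased demand. All figures below are illustrative:

```python
def net_energy(baseline_kwh: float, efficiency_gain: float, rebound: float) -> float:
    """Toy rebound model: an efficiency gain reduces energy use, but a
    rebound fraction of the expected savings is consumed by increased
    demand. All figures are illustrative."""
    expected_savings = baseline_kwh * efficiency_gain
    return baseline_kwh - expected_savings * (1 - rebound)

# A 30% efficiency gain with a 50% rebound only saves 15% net.
print(net_energy(1000, 0.30, 0.50))  # 850.0
```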