
Wednesday, December 9, 2020

Smart grid

From Wikipedia, the free encyclopedia

Figure: Characteristics of a traditional system (left) versus the smart grid (right)

A smart grid is an electrical grid that includes a variety of operation and energy measures, such as smart meters, smart appliances, renewable energy resources, and energy-efficient resources. Electronic power conditioning and control of the production and distribution of electricity are important aspects of the smart grid.

Smart grid policy is organized in Europe as Smart Grid European Technology Platform. Policy in the United States is described in 42 U.S.C. ch. 152, subch. IX § 17381.

Roll-out of smart grid technology also implies a fundamental re-engineering of the electricity services industry, although typical usage of the term is focused on the technical infrastructure.

Background

Historical development of the electricity grid

The first alternating current power grid system was installed in 1886 in Great Barrington, Massachusetts. At that time, the grid was a centralized unidirectional system of electric power transmission, electricity distribution, and demand-driven control.

In the 20th century local grids grew over time, and were eventually interconnected for economic and reliability reasons. By the 1960s, the electric grids of developed countries had become very large, mature and highly interconnected, with thousands of 'central' generation power stations delivering power to major load centres via high capacity power lines which were then branched and divided to provide power to smaller industrial and domestic users over the entire supply area. The topology of the 1960s grid was a result of the strong economies of scale: large coal-, gas- and oil-fired power stations in the 1 GW (1000 MW) to 3 GW scale are still found to be cost-effective, due to efficiency-boosting features that can be cost effective only when the stations become very large.

Power stations were located strategically to be close to fossil fuel reserves (either the mines or wells themselves, or else close to rail, road or port supply lines). Siting of hydro-electric dams in mountain areas also strongly influenced the structure of the emerging grid. Nuclear power plants were sited for availability of cooling water. Finally, fossil fuel-fired power stations were initially very polluting and were sited as far as economically possible from population centres once electricity distribution networks permitted it. By the late 1960s, the electricity grid reached the overwhelming majority of the population of developed countries, with only outlying regional areas remaining 'off-grid'.

Metering of electricity consumption was necessary on a per-user basis in order to allow appropriate billing according to the (highly variable) level of consumption of different users. Because of limited data collection and processing capability during the period of growth of the grid, fixed-tariff arrangements were commonly put in place, as well as dual-tariff arrangements where night-time power was charged at a lower rate than daytime power. The motivation for dual-tariff arrangements was the lower night-time demand. Dual tariffs made possible the use of low-cost night-time electrical power in applications such as the maintaining of 'heat banks', which served to 'smooth out' the daily demand and reduce the number of turbines that needed to be turned off overnight, thereby improving the utilisation and profitability of the generation and transmission facilities. The limited metering capabilities of the 1960s grid constrained the degree to which price signals could be propagated through the system.

From the 1970s to the 1990s, growing demand led to increasing numbers of power stations. In some areas, supply of electricity, especially at peak times, could not keep up with this demand, resulting in poor power quality, including blackouts, power cuts, and brownouts. Industry, heating, communication, lighting, and entertainment increasingly depended on electricity, and consumers demanded ever higher levels of reliability.

Towards the end of the 20th century, electricity demand patterns were established: domestic heating and air-conditioning led to daily peaks in demand that were met by an array of 'peaking power generators' that would only be turned on for short periods each day. The relatively low utilisation of these peaking generators (commonly, gas turbines were used due to their relatively lower capital cost and faster start-up times), together with the necessary redundancy in the electricity grid, resulted in high costs to the electricity companies, which were passed on in the form of increased tariffs.

In the 21st century, some developing countries like China, India, and Brazil were seen as pioneers of smart grid deployment.

Modernization opportunities

Since the early 21st century, opportunities to take advantage of improvements in electronic communication technology to resolve the limitations and costs of the electrical grid have become apparent. Technological limitations on metering no longer force peak power prices to be averaged out and passed on to all consumers equally. In parallel, growing concern over environmental damage from fossil-fired power stations has led to a desire to use large amounts of renewable energy. Dominant forms such as wind power and solar power are highly variable, so the need for more sophisticated control systems became apparent to facilitate the connection of these sources to the otherwise highly controllable grid. Power from photovoltaic cells (and to a lesser extent wind turbines) has also, significantly, called into question the imperative for large, centralised power stations. Rapidly falling costs point to a major change from the centralised grid topology to one that is highly distributed, with power being both generated and consumed right at the limits of the grid. Finally, growing concern over terrorist attacks in some countries has led to calls for a more robust energy grid that is less dependent on centralised power stations, which are perceived to be potential attack targets.

Definition of "smart grid"

The first official definition of the smart grid was provided by the Energy Independence and Security Act of 2007 (EISA-2007), which was approved by the US Congress and signed into law by President George W. Bush in December 2007. Title XIII of this bill provides a description, with ten characteristics, that can be considered a definition of the smart grid, as follows:

"It is the policy of the United States to support the modernization of the Nation's electricity transmission and distribution system to maintain a reliable and secure electricity infrastructure that can meet future demand growth and to achieve each of the following, which together characterize a Smart Grid: (1) Increased use of digital information and controls technology to improve reliability, security, and efficiency of the electric grid. (2) Dynamic optimization of grid operations and resources, with full cyber-security. (3) Deployment and integration of distributed resources and generation, including renewable resources. (4) Development and incorporation of demand response, demand-side resources, and energy-efficiency resources. (5) Deployment of 'smart' technologies (real-time, automated, interactive technologies that optimize the physical operation of appliances and consumer devices) for metering, communications concerning grid operations and status, and distribution automation. (6) Integration of 'smart' appliances and consumer devices. (7) Deployment and integration of advanced electricity storage and peak-shaving technologies, including plug-in electric and hybrid electric vehicles, and thermal storage air conditioning. (8) Provision to consumers of timely information and control options. (9) Development of standards for communication and interoperability of appliances and equipment connected to the electric grid, including the infrastructure serving the grid. (10) Identification and lowering of unreasonable or unnecessary barriers to adoption of smart grid technologies, practices, and services."

The European Union Commission Task Force for Smart Grids also provides a smart grid definition:

"A Smart Grid is an electricity network that can cost efficiently integrate the behaviour and actions of all users connected to it – generators, consumers and those that do both – in order to ensure economically efficient, sustainable power system with low losses and high levels of quality and security of supply and safety. A smart grid employs innovative products and services together with intelligent monitoring, control, communication, and self-healing technologies in order to:

  1. Better facilitate the connection and operation of generators of all sizes and technologies.
  2. Allow consumers to play a part in optimising the operation of the system.
  3. Provide consumers with greater information and options for how they use their supply.
  4. Significantly reduce the environmental impact of the whole electricity supply system.
  5. Maintain or even improve the existing high levels of system reliability, quality and security of supply.
  6. Maintain and improve the existing services efficiently."

A common element to most definitions is the application of digital processing and communications to the power grid, making data flow and information management central to the smart grid. Various capabilities result from the deeply integrated use of digital technology with power grids. Integration of the new grid information is one of the key issues in the design of smart grids. Electric utilities now find themselves making three classes of transformations: improvement of infrastructure, called the strong grid in China; addition of the digital layer, which is the essence of the smart grid; and business process transformation, necessary to capitalize on the investments in smart technology. Much of the work that has been going on in electric grid modernization, especially substation and distribution automation, is now included in the general concept of the smart grid.

Early technological innovations

Smart grid technologies emerged from earlier attempts at using electronic control, metering, and monitoring. In the 1980s, automatic meter reading was used for monitoring loads from large customers, and evolved into the Advanced Metering Infrastructure of the 1990s, whose meters could store how electricity was used at different times of the day. Smart meters add continuous communications so that monitoring can be done in real time, and can be used as a gateway to demand response-aware devices and "smart sockets" in the home. Early forms of such demand side management technologies were dynamic demand aware devices that passively sensed the load on the grid by monitoring changes in the power supply frequency. Devices such as industrial and domestic air conditioners, refrigerators and heaters adjusted their duty cycle to avoid activation during times the grid was suffering a peak condition. Beginning in 2000, Italy's Telegestore Project was the first to network large numbers (27 million) of homes using smart meters connected via low bandwidth power line communication. Some experiments used the term broadband over power lines (BPL), while others used wireless technologies such as mesh networking promoted for more reliable connections to disparate devices in the home as well as supporting metering of other utilities such as gas and water.

Monitoring and synchronization of wide area networks were revolutionized in the early 1990s when the Bonneville Power Administration expanded its smart grid research with prototype sensors capable of very rapid analysis of anomalies in electricity quality over very large geographic areas. The culmination of this work was the first operational Wide Area Measurement System (WAMS) in 2000. Other countries rapidly integrated this technology; China completed a comprehensive national WAMS by the end of its five-year economic plan in 2012.

The earliest deployments of smart grids include the Italian system Telegestore (2005), the mesh network of Austin, Texas (since 2003), and the smart grid in Boulder, Colorado (2008). See Deployments and attempted deployments below.

Features of the smart grid

The smart grid represents the full suite of current and proposed responses to the challenges of electricity supply. Because of the diverse range of factors there are numerous competing taxonomies and no agreement on a universal definition. Nevertheless, one possible categorization is given here.

Reliability

The smart grid makes use of technologies, such as state estimation, that improve fault detection and allow self-healing of the network without the intervention of technicians. This should ensure a more reliable supply of electricity and reduced vulnerability to natural disasters or attack.

Although multiple routes are touted as a feature of the smart grid, the old grid also featured multiple routes. Initial power lines in the grid were built using a radial model; later, connectivity was guaranteed via multiple routes, referred to as a network structure. However, this created a new problem: if the current flow or related effects across the network exceed the limits of any particular network element, that element can fail, and the current is then shunted to other network elements, which may eventually fail in turn, causing a domino effect. See power outage. A technique to prevent this is load shedding by rolling blackout or voltage reduction (brownout).

Flexibility in network topology

Next-generation transmission and distribution infrastructure will be better able to handle possible bidirectional energy flows, allowing for distributed generation such as from photovoltaic panels on building roofs, but also charging to/from the batteries of electric cars, wind turbines, pumped hydroelectric power, the use of fuel cells, and other sources.

Classic grids were designed for one-way flow of electricity, but if a local sub-network generates more power than it is consuming, the reverse flow can raise safety and reliability issues. A smart grid aims to manage these situations.

Efficiency

Numerous contributions to the overall improvement of energy-infrastructure efficiency are anticipated from the deployment of smart grid technology, in particular demand-side management: for example, turning off air conditioners during short-term spikes in electricity price, reducing voltage when possible on distribution lines through Voltage/VAR Optimization (VVO), eliminating truck-rolls for meter reading, and reducing truck-rolls through improved outage management using data from Advanced Metering Infrastructure systems. The overall effect is less redundancy in transmission and distribution lines and greater utilization of generators, leading to lower power prices.

Load adjustment/Load balancing

The total load connected to the power grid can vary significantly over time. Although the total load is the sum of many individual choices of the clients, the overall load is not necessarily stable or slowly varying. For example, if a popular television program starts, millions of televisions will start to draw current instantly. Traditionally, to respond to a rapid increase in power consumption faster than the start-up time of a large generator, some spare generators are put on a dissipative standby mode. A smart grid may warn all individual television sets, or another larger customer, to reduce the load temporarily (to allow time to start up a larger generator) or continuously (in the case of limited resources). Using mathematical prediction algorithms, it is possible to predict how many standby generators need to be used to reach a certain failure rate. In the traditional grid, the failure rate can only be reduced at the cost of more standby generators. In a smart grid, load reduction by even a small portion of the clients may eliminate the problem.
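
As a rough, hypothetical illustration of such a prediction (not part of the original article): if aggregate demand is approximated as normally distributed, the number of standby units needed to keep the loss-of-load probability below a target failure rate can be estimated in a few lines of Python. All figures below are invented for the sketch.

  # Sketch: estimate how many standby units keep the chance of unserved
  # load below a target failure rate. All numbers are illustrative.
  from math import erf, sqrt

  def loss_of_load_probability(capacity_mw, mean_mw, std_mw):
      """P(demand > capacity) for normally distributed demand."""
      z = (capacity_mw - mean_mw) / std_mw
      return 1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0)))

  mean_demand = 9000.0    # MW, assumed average system load
  std_demand = 600.0      # MW, assumed load variability
  base_capacity = 9500.0  # MW, generation already committed
  unit_size = 200.0       # MW per standby generator
  target = 1e-4           # acceptable failure rate

  units = 0
  while loss_of_load_probability(base_capacity + units * unit_size,
                                 mean_demand, std_demand) > target:
      units += 1
  print(f"standby units required: {units}")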

While traditionally load balancing strategies have been designed to change consumers' consumption patterns to make demand more uniform, developments in energy storage and individual renewable energy generation have provided opportunities to devise balanced power grids without affecting consumers' behavior. Typically, storing energy during off-peak times eases high demand supply during peak hours. Dynamic game-theoretic frameworks have proved particularly efficient at storage scheduling by optimizing energy cost using their Nash equilibrium.
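
The game-theoretic schedulers in the literature are considerably more elaborate, but the core idea of shifting stored energy from off-peak to peak hours can be sketched greedily. A minimal sketch, with an invented load curve and storage size (losses ignored):

  # Sketch: flatten a daily load curve with storage that charges in the
  # cheapest (lowest-demand) hours and discharges at the peak.
  demand = [3.2, 3.0, 2.9, 3.1, 3.6, 4.5, 5.8, 6.4,
            6.0, 5.5, 5.3, 5.6, 5.9, 6.2, 6.8, 7.4,
            7.9, 8.1, 7.6, 6.9, 5.8, 4.9, 4.1, 3.6]  # GW, hourly (invented)
  storage_energy = 4.0  # GWh usable capacity (assumed)
  rate = 1.0            # GW maximum charge/discharge rate (assumed)

  hours_by_demand = sorted(range(24), key=lambda h: demand[h])
  net = demand[:]
  stored = 0.0
  for h in hours_by_demand:            # charge during off-peak hours
      if stored >= storage_energy:
          break
      add = min(rate, storage_energy - stored)
      net[h] += add
      stored += add
  for h in reversed(hours_by_demand):  # discharge during peak hours
      if stored <= 0:
          break
      take = min(rate, stored)
      net[h] -= take
      stored -= take
  print(f"peak before: {max(demand):.1f} GW, after: {max(net):.1f} GW")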

Peak curtailment/leveling and time of use pricing

Figure: Peak load avoidance by smart charging of electric vehicles

To reduce demand during high-cost peak usage periods, communications and metering technologies inform smart devices in the home and business when energy demand is high, and track how much electricity is used and when. Utilities also gain the ability to reduce consumption by communicating with devices directly in order to prevent system overloads; examples would be a utility reducing the usage of a group of electric vehicle charging stations or shifting the temperature set points of air conditioners in a city. To motivate consumers to cut back use and perform what is called peak curtailment or peak leveling, prices of electricity are increased during high demand periods and decreased during low demand periods. Consumers and businesses are expected to consume less during high demand periods if they and their devices are aware of the high price premium for using electricity at peak periods. This could mean making trade-offs such as cycling on/off air conditioners or running dishwashers at 9 pm instead of 5 pm. When businesses and consumers see a direct economic benefit of using energy at off-peak times, the theory is that they will include the energy cost of operation in their consumer-device and building-construction decisions and hence become more energy efficient.
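
A hypothetical worked example of that trade-off (the tariff figures below are assumptions, not real rates): shifting a single dishwasher cycle from a peak window to an off-peak window under a two-rate time-of-use tariff.

  # Sketch: one 1.5 kWh dishwasher cycle under an invented two-rate
  # time-of-use tariff.
  peak_price = 0.32      # $/kWh, 4 pm - 9 pm (assumed)
  off_peak_price = 0.11  # $/kWh, all other hours (assumed)
  cycle_kwh = 1.5

  cost_5pm = cycle_kwh * peak_price
  cost_9pm = cycle_kwh * off_peak_price
  print(f"5 pm run: ${cost_5pm:.2f}, 9 pm run: ${cost_9pm:.2f}, "
        f"saved: ${cost_5pm - cost_9pm:.2f}")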

Sustainability

The improved flexibility of the smart grid permits greater penetration of highly variable renewable energy sources such as solar power and wind power, even without the addition of energy storage. Current network infrastructure is not built to allow for many distributed feed-in points, and typically even if some feed-in is allowed at the local (distribution) level, the transmission-level infrastructure cannot accommodate it. Rapid fluctuations in distributed generation, such as due to cloudy or gusty weather, present significant challenges to power engineers who need to ensure stable power levels through varying the output of the more controllable generators such as gas turbines and hydroelectric generators. Smart grid technology is a necessary condition for very large amounts of renewable electricity on the grid for this reason. There is also support for vehicle-to-grid.

Market-enabling

The smart grid allows for systematic communication between suppliers (their energy price) and consumers (their willingness-to-pay), and permits both suppliers and consumers to be more flexible and sophisticated in their operational strategies. Only the critical loads will need to pay the peak energy prices, and consumers will be able to be more strategic about when they use energy. Generators with greater flexibility will be able to sell energy strategically for maximum profit, whereas inflexible generators such as base-load steam turbines and wind turbines will receive a varying tariff based on the level of demand and the status of the other generators currently operating. The overall effect is a signal that rewards energy efficiency and energy consumption that is sensitive to the time-varying limitations of the supply. At the domestic level, appliances with a degree of energy storage or thermal mass (such as refrigerators, heat banks, and heat pumps) will be well placed to 'play' the market and seek to minimise energy cost by adapting demand to the lower-cost energy support periods. This is an extension of the dual-tariff energy pricing mentioned above.

Demand response support

Demand response support allows generators and loads to interact in an automated fashion in real time, coordinating demand to flatten spikes. Eliminating the fraction of demand that occurs in these spikes eliminates the cost of adding reserve generators, cuts wear and tear and extends the life of equipment, and allows users to cut their energy bills by telling low priority devices to use energy only when it is cheapest.

Currently, power grid systems have varying degrees of communication within control systems for their high-value assets, such as generating plants, transmission lines, substations and major energy users. In general, information flows one way, from the users and the loads they control back to the utilities. The utilities attempt to meet the demand and succeed or fail to varying degrees (brownouts, rolling blackouts, uncontrolled blackouts). The total amount of power demanded by the users can have a very wide probability distribution, which requires spare generating plants in standby mode to respond to the rapidly changing power usage. This one-way flow of information is expensive; the last 10% of generating capacity may be required as little as 1% of the time, and brownouts and outages can be costly to consumers.

Demand response can be provided by commercial, residential, and industrial loads. For example, Alcoa's Warrick Operation participates in MISO as a qualified Demand Response Resource, and Trimet Aluminium uses its smelter as a short-term mega-battery.

Latency of the data flow is a major concern, with some early smart meter architectures allowing delays of as long as 24 hours in receiving the data, preventing any possible reaction by either supplying or demanding devices.

Platform for advanced services

As with other industries, use of robust two-way communications, advanced sensors, and distributed computing technology will improve the efficiency, reliability and safety of power delivery and use. It also opens up the potential for entirely new services or improvements on existing ones, such as fire monitoring and alarms that can shut off power, make phone calls to emergency services, etc.

Provision megabits, control power with kilobits, sell the rest

The amount of data required to perform monitoring and to switch one's appliances off automatically is very small compared with that already reaching even remote homes to support voice, security, Internet and TV services. Many smart grid bandwidth upgrades are paid for by over-provisioning to also support consumer services, either subsidizing the communications with energy-related services or subsidizing the energy-related services (such as higher rates during peak hours) with communications. This is particularly true where governments run both sets of services as a public monopoly. Because power and communications companies are generally separate commercial enterprises in North America and Europe, it has required considerable government and large-vendor effort to encourage the various enterprises to cooperate. Some, like Cisco, see opportunity in providing devices to consumers very similar to those they have long been providing to industry. Others, such as Silver Spring Networks or Google, are data integrators rather than vendors of equipment. While the AC power control standards suggest powerline networking would be the primary means of communication among smart grid and home devices, the bits may not reach the home via Broadband over Power Lines (BPL) initially but by fixed wireless.

Technology

The bulk of smart grid technologies are already used in other applications such as manufacturing and telecommunications and are being adapted for use in grid operations.

  • Integrated communications: Areas for improvement include: substation automation, demand response, distribution automation, supervisory control and data acquisition (SCADA), energy management systems, wireless mesh networks and other technologies, power-line carrier communications, and fiber-optics. Integrated communications will allow for real-time control, information and data exchange to optimize system reliability, asset utilization, and security.
  • Sensing and measurement: core duties are evaluating congestion and grid stability, monitoring equipment health, preventing energy theft, and supporting control strategies. Technologies include: advanced microprocessor meters (smart meters) and meter reading equipment, wide-area monitoring systems (typically based on online readings from distributed temperature sensing combined with real-time thermal rating (RTTR) systems), electromagnetic signature measurement/analysis, time-of-use and real-time pricing tools, advanced switches and cables, backscatter radio technology, and digital protective relays.
  • Smart meters.
  • Phasor measurement units. Many in the power systems engineering community believe that the Northeast blackout of 2003 could have been contained to a much smaller area if a wide area phasor measurement network had been in place; a minimal sketch of the phasor estimation at the core of such units follows this list.
  • Distributed power flow control: power flow control devices clamp onto existing transmission lines to control the flow of power within. Transmission lines enabled with such devices support greater use of renewable energy by providing more consistent, real-time control over how that energy is routed within the grid. This technology enables the grid to more effectively store intermittent energy from renewables for later use.
  • Smart power generation using advanced components: smart power generation is a concept of matching electricity generation with demand using multiple identical generators which can start, stop and operate efficiently at a chosen load, independently of the others, making them suitable for base load and peaking power generation. Matching supply and demand, called load balancing, is essential for a stable and reliable supply of electricity. Short-term deviations in the balance lead to frequency variations and a prolonged mismatch results in blackouts. Operators of power transmission systems are charged with the balancing task, matching the power output of all the generators to the load of their electrical grid. The load balancing task has become much more challenging as increasingly intermittent and variable generators such as wind turbines and solar cells are added to the grid, forcing other producers to adapt their output much more frequently than has been required in the past. The first two dynamic grid stability power plants utilizing the concept have been ordered by Elering and will be built by Wärtsilä in Kiisa, Estonia (Kiisa Power Plant). Their purpose is to "provide dynamic generation capacity to meet sudden and unexpected drops in the electricity supply." They are scheduled to be ready during 2013 and 2014, and their total output will be 250 MW.
  • Power system automation enables rapid diagnosis of and precise solutions to specific grid disruptions or outages. These technologies rely on and contribute to each of the other four key areas. Three technology categories for advanced control methods are: distributed intelligent agents (control systems), analytical tools (software algorithms and high-speed computers), and operational applications (SCADA, substation automation, demand response, etc.). Using artificial intelligence programming techniques, Fujian power grid in China created a wide area protection system that is rapidly able to accurately calculate a control strategy and execute it. The Voltage Stability Monitoring & Control (VSMC) software uses a sensitivity-based successive linear programming method to reliably determine the optimal control solution.
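
A minimal sketch of the phasor estimation mentioned in the phasor measurement unit item above: a single-bin DFT over one cycle of a sampled waveform recovers its magnitude and phase. The sample rate and test signal are assumptions for illustration.

  # Sketch: recover the phasor of one cycle of a sampled 60 Hz waveform
  # with a single-bin DFT, the core computation inside a PMU.
  import cmath, math

  f0, fs = 60.0, 1920.0              # grid frequency, sample rate (assumed)
  n = int(fs / f0)                   # samples per cycle (32 here)
  amp, phase = 120.0, 0.5            # synthetic test signal
  samples = [amp * math.cos(2 * math.pi * f0 * k / fs + phase)
             for k in range(n)]

  # Fundamental-frequency DFT bin, scaled so |ph| is the peak magnitude.
  ph = (2.0 / n) * sum(samples[k] * cmath.exp(-2j * math.pi * k / n)
                       for k in range(n))
  print(f"magnitude = {abs(ph):.2f} V, phase = {cmath.phase(ph):.3f} rad")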

IT companies disrupting the energy market

Smart grids provide IT-based solutions which the traditional power grid lacks. These new solutions pave the way for new entrants that were traditionally not related to the energy grid. Technology companies are disrupting the traditional energy market players in several ways. They develop complex distribution systems to meet the more decentralized power generation due to microgrids. Additionally, the increase in data collection brings many new possibilities for technology companies, such as deploying transmission grid sensors at the user level and balancing system reserves. The technology in microgrids makes energy consumption cheaper for households than buying from utilities, and residents can manage their energy consumption more easily and effectively with the connection to smart meters. However, the performance and reliability of microgrids strongly depend on the continuous interaction between power generation, storage and load requirements, so a hybrid offering that combines renewable energy sources with dispatchable sources such as coal and gas can be more dependable than a microgrid serving alone.

Consequences

As a consequence of the entrance of technology companies into the energy market, utilities and DSOs (distribution system operators) need to create new business models to keep current customers and to attract new ones.

Focus on a customer engagement strategy

DSOs can focus on creating good customer engagement strategies to build loyalty and trust with the customer. To retain and attract customers who decide to produce their own energy through microgrids, DSOs can offer purchase agreements for the sale of surplus energy that the consumer produces. Unlike the IT companies, both DSOs and utilities can use their market experience to give consumers energy-use advice and efficiency upgrades to create excellent customer service.

Create alliances with newly entered technology companies

Instead of trying to compete against IT companies in their areas of expertise, both utilities and DSOs can try to create alliances with IT companies to create good solutions together. The French utility company Engie did this by buying the service providers Ecova and OpTerra Energy Services.

Renewable energy sources

The generation of renewable energy can often be connected at the distribution level, instead of at the transmission grid, which means that DSOs can manage the flows and distribute power locally. This brings a new opportunity for DSOs to expand their market by selling energy directly to the consumer. Simultaneously, this challenges the utilities producing fossil fuels, which are already trapped by the high costs of aging assets. Stricter government regulations on producing traditional energy resources make it harder to stay in business and increase the pressure on traditional energy companies to shift to renewable energy sources. An example of a utility changing its business model to produce more renewable energy is the Norway-based company Equinor, a formerly state-owned oil company that is now investing heavily in renewable energy.

Research

Major programs

IntelliGrid – Created by the Electric Power Research Institute (EPRI), IntelliGrid architecture provides methodology, tools, and recommendations for standards and technologies for utility use in planning, specifying, and procuring IT-based systems, such as advanced metering, distribution automation, and demand response. The architecture also provides a living laboratory for assessing devices, systems, and technology. Several utilities have applied IntelliGrid architecture including Southern California Edison, Long Island Power Authority, Salt River Project, and TXU Electric Delivery. The IntelliGrid Consortium is a public/private partnership that integrates and optimizes global research efforts, funds technology R&D, works to integrate technologies, and disseminates technical information.

Grid 2030 – Grid 2030 is a joint vision statement for the U.S. electrical system developed by the electric utility industry, equipment manufacturers, information technology providers, federal and state government agencies, interest groups, universities, and national laboratories. It covers generation, transmission, distribution, storage, and end-use. The National Electric Delivery Technologies Roadmap is the implementation document for the Grid 2030 vision. The Roadmap outlines the key issues and challenges for modernizing the grid and suggests paths that government and industry can take to build America's future electric delivery system.

Modern Grid Initiative (MGI) is a collaborative effort between the U.S. Department of Energy (DOE), the National Energy Technology Laboratory (NETL), utilities, consumers, researchers, and other grid stakeholders to modernize and integrate the U.S. electrical grid. DOE's Office of Electricity Delivery and Energy Reliability (OE) sponsors the initiative, which builds upon Grid 2030 and the National Electricity Delivery Technologies Roadmap and is aligned with other programs such as GridWise and GridWorks.

GridWise – A DOE OE program focused on developing information technology to modernize the U.S. electrical grid. Working with the GridWise Alliance, the program invests in communications architecture and standards; simulation and analysis tools; smart technologies; test beds and demonstration projects; and new regulatory, institutional, and market frameworks. The GridWise Alliance is a consortium of public and private electricity sector stakeholders, providing a forum for idea exchanges, cooperative efforts, and meetings with policy makers at federal and state levels.

GridWise Architecture Council (GWAC) was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the nation's electric power system. The GWAC members are a balanced and respected team representing the many constituencies of the electricity supply chain and users. The GWAC provides industry guidance and tools to articulate the goal of interoperability across the electric system, identify the concepts and architectures needed to make interoperability possible, and develop actionable steps to facilitate the inter operation of the systems, devices, and institutions that encompass the nation's electric system. The GridWise Architecture Council Interoperability Context Setting Framework, V 1.1 defines necessary guidelines and principles.

GridWorks – A DOE OE program focused on improving the reliability of the electric system through modernizing key grid components such as cables and conductors, substations and protective systems, and power electronics. The program's focus includes coordinating efforts on high temperature superconducting systems, transmission reliability technologies, electric distribution technologies, energy storage devices, and GridWise systems.

Pacific Northwest Smart Grid Demonstration Project – This project is a demonstration across five Pacific Northwest states: Idaho, Montana, Oregon, Washington, and Wyoming. It involves about 60,000 metered customers and contains many key functions of the future smart grid.

Solar Cities - In Australia, the Solar Cities programme included close collaboration with energy companies to trial smart meters, peak and off-peak pricing, remote switching and related efforts. It also provided some limited funding for grid upgrades.

Smart Grid Energy Research Center (SMERC) – Located at the University of California, Los Angeles, SMERC has dedicated its efforts to large-scale testing of its smart EV charging network technology, WINSmartEV™. It created another platform, WINSmartGrid™, a Smart Grid architecture enabling bidirectional flow of information between a utility and consumer end-devices. SMERC has also developed a demand response (DR) test bed that comprises a Control Center, a Demand Response Automation Server (DRAS), a Home Area Network (HAN), a Battery Energy Storage System (BESS), and photovoltaic (PV) panels. These technologies are installed within the Los Angeles Department of Water and Power and Southern California Edison territories as a network of EV chargers, battery energy storage systems, solar panels, DC fast chargers, and Vehicle-to-Grid (V2G) units. These platforms, communications and control networks enable UCLA-led projects within the greater Los Angeles area to be researched, advanced and tested in partnership with the two key local utilities, SCE and LADWP.

Smart grid modelling

Many different concepts have been used to model intelligent power grids. They are generally studied within the framework of complex systems. In a recent brainstorming session, the power grid was considered within the context of optimal control, ecology, human cognition, glassy dynamics, information theory, microphysics of clouds, and many others. Here is a selection of the types of analyses that have appeared in recent years.

Protection systems that verify and supervise themselves

Pelqim Spahiu and Ian R. Evans in their study introduced the concept of a substation-based smart protection and hybrid inspection unit.

Kuramoto oscillators

The Kuramoto model is a well-studied system. The power grid has been described in this context as well. The goal is to keep the system in balance, or to maintain phase synchronization (also known as phase locking). Non-uniform oscillators also help to model different technologies, different types of power generators, patterns of consumption, and so on. The model has also been used to describe the synchronization patterns in the blinking of fireflies.
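
A minimal sketch of the Kuramoto model in its mean-field form (parameters invented, not from the article): each oscillator's phase is pulled toward the population's mean phase, and the order parameter r approaches 1 as the system phase-locks.

  # Sketch: mean-field Kuramoto dynamics; r near 1 means phase locking.
  import cmath, math, random

  random.seed(1)
  N, K, dt, steps = 50, 2.0, 0.01, 5000
  omega = [random.gauss(0.0, 0.5) for _ in range(N)]       # natural freqs
  theta = [random.uniform(0.0, 2 * math.pi) for _ in range(N)]

  for _ in range(steps):
      mean_field = sum(cmath.exp(1j * t) for t in theta) / N
      r, psi = abs(mean_field), cmath.phase(mean_field)
      # d(theta_i)/dt = omega_i + K * r * sin(psi - theta_i)
      theta = [t + dt * (w + K * r * math.sin(psi - t))
               for t, w in zip(theta, omega)]

  r_final = abs(sum(cmath.exp(1j * t) for t in theta) / N)
  print(f"order parameter r = {r_final:.3f}")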

Bio-systems

Power grids have been related to complex biological systems in many other contexts. In one study, power grids were compared to the dolphin social network. These creatures streamline or intensify communication in case of an unusual situation. The intercommunications that enable them to survive are highly complex.

Random fuse networks

In percolation theory, random fuse networks have been studied. The current density might be too low in some areas and too high in others. The analysis can therefore be used to smooth out potential problems in the network. For instance, high-speed computer analysis can predict blown fuses and correct for them, or analyze patterns that might lead to a power outage. It is difficult for humans to predict the long-term patterns in complex networks, so fuse or diode networks are used instead.
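
A toy sketch of the cascading-failure behaviour such models capture; for brevity the network is reduced to a bundle of parallel fuses with random thresholds, an assumption made purely for illustration.

  # Sketch: parallel fuses share a total current equally; each blows when
  # its share exceeds its random threshold, redistributing the load.
  import random

  random.seed(7)
  n_fuses = 100
  thresholds = [random.uniform(0.8, 1.2) for _ in range(n_fuses)]

  def survivors(total_current):
      alive = list(thresholds)
      while alive:
          share = total_current / len(alive)
          remaining = [t for t in alive if t >= share]
          if len(remaining) == len(alive):  # no new failures: stable
              return len(alive)
          alive = remaining
      return 0                              # complete cascade (outage)

  for current in (70, 80, 90):
      print(f"total current {current}: {survivors(current)} fuses survive")

Note the sharp transition: slightly past the critical load, the redistribution itself overloads the remaining fuses and the whole bundle fails, the domino effect described under Reliability above.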

Smart Grid Communication Network

Network Simulators are used to simulate/emulate network communication effects. This typically involves setting up a lab with the smart grid devices, applications etc. with the virtual network being provided by the network simulator.

Neural networks

Neural networks have been considered for power grid management as well. Electric power systems can be classified in multiple different ways: non-linear, dynamic, discrete, or random. Artificial Neural Networks (ANNs) attempt to solve the most difficult of these problems, the non-linear problems.

Demand Forecasting

One application of ANNs is in demand forecasting. In order for grids to operate economically and reliably, demand forecasting is essential, because it is used to predict the amount of power that will be consumed by the load. Demand depends on weather conditions, type of day, random events, incidents, etc. For non-linear loads, though, the load profile isn't smooth or as predictable, resulting in higher uncertainty and less accuracy with traditional Artificial Intelligence models. Factors that ANNs consider when developing these sorts of models include: classification of load profiles of different customer classes based on electricity consumption; increased responsiveness of demand to predict real-time electricity prices as compared to conventional grids; the need to input past demand as different components, such as peak load, base load, valley load, and average load, instead of joining them into a single input; and, lastly, the dependence of the model on specific input variables. An example of the last case would be the type of day, whether a weekday or weekend: it wouldn't have much of an effect on hospital grids, but it would be a big factor in the load profile of residential grids.
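
A minimal sketch of such a forecaster, assuming synthetic data and a small scikit-learn multilayer perceptron; the feature set and the "true" load model below are invented for illustration.

  # Sketch: a small neural network forecasting hourly demand from
  # hour-of-day, day type, and temperature. All data are synthetic.
  import numpy as np
  from sklearn.neural_network import MLPRegressor

  rng = np.random.default_rng(0)
  n = 2000
  hour = rng.integers(0, 24, n)
  weekend = rng.integers(0, 2, n)
  temp = rng.normal(15.0, 8.0, n)
  # Invented "true" load: daily shape + weekend dip + heating/cooling.
  load = (50 + 20 * np.sin((hour - 7) * np.pi / 12)
          - 8 * weekend + 0.6 * np.abs(temp - 18)
          + rng.normal(0.0, 2.0, n))

  X = np.column_stack([hour, weekend, temp])
  model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                       random_state=0).fit(X[:1500], load[:1500])
  print("R^2 on held-out data:", round(model.score(X[1500:], load[1500:]), 3))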

Markov processes

As wind power continues to gain popularity, it becomes a necessary ingredient in realistic power grid studies. Off-line storage, wind variability, supply, demand, pricing, and other factors can be modelled as a mathematical game. Here the goal is to develop a winning strategy. Markov processes have been used to model and study this type of system.
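
A minimal sketch of a Markov treatment of wind variability, with an invented three-state transition matrix standing in for statistics that would normally be fitted to measured wind data.

  # Sketch: hourly wind output as a three-state Markov chain with an
  # invented transition matrix (rows sum to 1).
  import random

  random.seed(3)
  states = ["low", "medium", "high"]
  output_mw = {"low": 50, "medium": 300, "high": 600}
  P = {"low":    [0.80, 0.18, 0.02],
       "medium": [0.15, 0.70, 0.15],
       "high":   [0.05, 0.25, 0.70]}

  state, total_mwh = "medium", 0
  for hour in range(24):
      total_mwh += output_mw[state]
      state = random.choices(states, weights=P[state])[0]
  print(f"simulated wind energy over 24 h: {total_mwh} MWh")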

Maximum entropy

All of these methods are, in one way or another, maximum entropy methods, which is an active area of research. This goes back to the ideas of Shannon, and many other researchers who studied communication networks. Continuing along similar lines today, modern wireless network research often considers the problem of network congestion, and many algorithms are being proposed to minimize it, including game theory, innovative combinations of FDMA, TDMA, and others.

Economics

Market outlook

In 2009, the US smart grid industry was valued at about $21.4 billion, and it was projected to exceed $42.8 billion by 2014. Given the success of smart grids in the U.S., the world market was expected to grow at a faster rate, surging from $69.3 billion in 2009 to $171.4 billion by 2014. The segments set to benefit the most were smart metering hardware sellers and makers of software used to transmit and organize the massive amount of data collected by meters.

The smart grid market was valued at over US$30 billion in 2017 and is set to expand at over 11% CAGR to hit US$70 billion by 2024. The growing need to digitalize the power sector, driven by ageing electrical grid infrastructure, will stimulate the global market size. The industry is primarily driven by favorable government regulations and mandates, along with the rising share of renewables in the global energy mix. According to the International Energy Agency (IEA), global investment in digital electricity infrastructure was over US$50 billion in 2017.

A 2011 study from the Electric Power Research Institute concludes that investment in a U.S. smart grid will cost up to $476 billion over 20 years but will provide up to $2 trillion in customer benefits over that time. In 2015, the World Economic Forum reported that a transformational investment of more than $7.6 trillion by members of the OECD is needed over the next 25 years (or $300 billion per year) to modernize, expand, and decentralize the electricity infrastructure, with technical innovation as key to the transformation. A 2019 study from the International Energy Agency estimates that the current (depreciated) value of the US electric grid is more than USD 1 trillion. The total cost of replacing it with a smart grid is estimated to be more than USD 4 trillion. If smart grids are deployed fully across the US, the country is expected to save USD 130 billion annually.

General economics developments

As customers become able to choose their electricity suppliers based on their different tariff methods, the focus on transportation costs will increase. Reduction of maintenance and replacement costs will stimulate more advanced control.

A smart grid can precisely limit electrical power down to the residential level, network small-scale distributed energy generation and storage devices, communicate information on operating status and needs, collect information on prices and grid conditions, and move the grid beyond central control to a collaborative network.

US and UK savings estimates and concerns

A 2003 United States Department of Energy study calculated that internal modernization of US grids with smart grid capabilities would save between 46 and 117 billion dollars over the next 20 years if implemented within a few years of the study. As well as these industrial modernization benefits, smart grid features could expand energy efficiency beyond the grid into the home by coordinating low priority home devices such as water heaters so that their use of power takes advantage of the most desirable energy sources. Smart grids can also coordinate the production of power from large numbers of small power producers such as owners of rooftop solar panels — an arrangement that would otherwise prove problematic for power systems operators at local utilities.

One important question is whether consumers will act in response to market signals. The U.S. Department of Energy (DOE) as part of the American Recovery and Reinvestment Act Smart Grid Investment Grant and Demonstrations Program funded special consumer behavior studies to examine the acceptance, retention, and response of consumers subscribed to time-based utility rate programs that involve advanced metering infrastructure and customer systems such as in-home displays and programmable communicating thermostats.

Another concern is that the cost of telecommunications to fully support smart grids may be prohibitive. A less expensive communication mechanism is proposed using a form of "dynamic demand management" where devices shave peaks by shifting their loads in reaction to grid frequency. Grid frequency could be used to communicate load information without the need of an additional telecommunication network, but it would not support economic bargaining or quantification of contributions.
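
A minimal sketch of such a frequency-responsive device, assuming a 50 Hz system and invented thresholds, with hysteresis so the load does not chatter on and off:

  # Sketch: a low-priority appliance that defers its load when measured
  # grid frequency sags, assuming a 50 Hz system and invented thresholds.
  DEFER_BELOW = 49.90   # under-frequency: grid stressed, shed load
  RESUME_ABOVE = 49.97  # hysteresis so the device does not chatter

  def controller(freq_hz, running):
      if running and freq_hz < DEFER_BELOW:
          return False          # pause the low-priority load
      if not running and freq_hz > RESUME_ABOVE:
          return True           # grid has recovered, resume
      return running

  running = True
  for f in [50.01, 49.95, 49.88, 49.91, 49.96, 49.99, 50.02]:
      running = controller(f, running)
      print(f"{f:.2f} Hz -> {'on' if running else 'off'}")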

Although there are specific and proven smart grid technologies in use, smart grid is an aggregate term for a set of related technologies on which a specification is generally agreed, rather than a name for a specific technology. Some of the benefits of such a modernized electricity network include the ability to reduce power consumption at the consumer side during peak hours, called demand side management; enabling grid connection of distributed generation power (with photovoltaic arrays, small wind turbines, micro hydro, or even combined heat power generators in buildings); incorporating grid energy storage for distributed generation load balancing; and eliminating or containing failures such as widespread power grid cascading failures. The increased efficiency and reliability of the smart grid is expected to save consumers money and help reduce CO2 emissions.

Oppositions and concerns

Most opposition and concerns have centered on smart meters and the items (such as remote control, remote disconnect, and variable rate pricing) enabled by them. Where opposition to smart meters is encountered, they are often marketed as "smart grid" which connects smart grid to smart meters in the eyes of opponents. Specific points of opposition or concern include:

  • consumer concerns over privacy, e.g. use of usage data by law enforcement
  • social concerns over "fair" availability of electricity
  • concern that complex rate systems (e.g. variable rates) remove clarity and accountability, allowing the supplier to take advantage of the customer
  • concern over remotely controllable "kill switch" incorporated into most smart meters
  • social concerns over Enron style abuses of information leverage
  • concerns over giving the government mechanisms to control the use of all power using activities
  • concerns over RF emissions from smart meters

Security

While modernization of electrical grids into smart grids allows for optimization of everyday processes, a smart grid, being online, can be vulnerable to cyberattacks. Transformers which increase the voltage of electricity created at power plants for long-distance travel, transmission lines themselves, and distribution lines which deliver the electricity to its consumers are particularly susceptible. These systems rely on sensors which gather information from the field and then deliver it to control centers, where algorithms automate analysis and decision-making processes. The decisions are sent back to the field, where existing equipment executes them. Hackers have the potential to disrupt these automated control systems, severing the channels which allow generated electricity to be utilized; this is called a denial-of-service or DoS attack. They can also launch integrity attacks which corrupt information being transmitted along the system, as well as desynchronization attacks which affect when such information is delivered to the appropriate location. Additionally, intruders can gain access via renewable energy generation systems and smart meters connected to the grid, taking advantage of more specialized weaknesses or ones whose security has not been prioritized. Because a smart grid has a large number of access points, like smart meters, defending all of its weak points can prove difficult.

Concerns chiefly center around the communications technology at the heart of the smart grid. Because it is designed to allow real-time contact between utilities and meters in customers' homes and businesses, there is a risk that these capabilities could be exploited for criminal or even terrorist actions. One of the key capabilities of this connectivity is the ability to remotely switch off power supplies, enabling utilities to quickly and easily cease or modify supplies to customers who default on payment. This is undoubtedly a massive boon for energy providers, but it also raises some significant security issues. Cybercriminals have infiltrated the U.S. electric grid on numerous occasions. Aside from computer infiltration, there are also concerns that computer malware like Stuxnet, which targeted the SCADA systems widely used in industry, could be used to attack a smart grid network.

Electricity theft is a concern in the U.S., where the smart meters being deployed use RF technology to communicate with the electricity transmission network. People with knowledge of electronics can devise interference devices that cause a smart meter to report lower than actual usage. Similarly, the same technology can be employed to make it appear that the energy one consumer uses is being used by another customer, increasing the latter's bill.

The damage from a well-executed, sizable cyberattack could be extensive and long-lasting. One incapacitated substation could take from nine days to over a year to repair, depending on the nature of the attack, and could also cause an hours-long outage in a small radius. An attack could have an immediate effect on transportation infrastructure, as traffic lights and other routing mechanisms, as well as ventilation equipment for underground roadways, are reliant on electricity. Additionally, infrastructure which relies on the electric grid, including wastewater treatment facilities, the information technology sector, and communications systems, could be impacted.

The December 2015 Ukraine power grid cyberattack, the first recorded of its kind, disrupted services to nearly a quarter of a million people by bringing substations offline. The Council on Foreign Relations has noted that states are most likely to be the perpetrators of such an attack as they have access to the resources to carry one out despite the high level of difficulty of doing so. Cyber intrusions can be used as portions of a larger offensive, military or otherwise. Some security experts warn that this type of event is easily scalable to grids elsewhere. Insurance company Lloyd's of London has already modeled the outcome of a cyberattack on the Eastern Interconnection, which has the potential to impact 15 states, put 93 million people in the dark, and cost the country's economy anywhere from $243 billion to $1 trillion in various damages.

According to the U.S. House of Representatives Subcommittee on Economic Development, Public Buildings, and Emergency Management, the electric grid has already seen a sizable number of cyber intrusions, with two in every five aiming to incapacitate it. As such, the U.S. Department of Energy has prioritized research and development to decrease the electric grid's vulnerability to cyberattacks, citing them as an "imminent danger" in its 2017 Quadrennial Energy Review. The Department of Energy has also identified both attack resistance and self-healing as major keys to ensuring that today's smart grid is future-proof. While there are regulations already in place, namely the Critical Infrastructure Protection standards introduced by the North American Electric Reliability Corporation, a significant number of them are suggestions rather than mandates. Most electricity generation, transmission, and distribution facilities and equipment are owned by private stakeholders, further complicating the task of assessing adherence to such standards. Additionally, even if utilities want to fully comply, they may find it too expensive to do so.

Some experts argue that the first step to increasing the cyber defenses of the smart electric grid is completing a comprehensive risk analysis of existing infrastructure, including research of software, hardware, and communication processes. Additionally, as intrusions themselves can provide valuable information, it could be useful to analyze system logs and other records of their nature and timing. Common weaknesses already identified using such methods by the Department of Homeland Security include poor code quality, improper authentication, and weak firewall rules. Once this step is completed, some suggest that it makes sense to then complete an analysis of the potential consequences of the aforementioned failures or shortcomings. This includes both immediate consequences as well as second- and third-order cascading effects on parallel systems. Finally, risk mitigation solutions, which may include simple remediation of infrastructure inadequacies or novel strategies, can be deployed to address the situation. Some such measures include recoding of control system algorithms to make them more able to resist and recover from cyberattacks or preventive techniques that allow more efficient detection of unusual or unauthorized changes to data. Strategies to account for human error which can compromise systems include educating those who work in the field to be wary of strange USB drives, which can introduce malware if inserted, even if just to check their contents.

Other solutions include utilizing transmission substations, constrained SCADA networks, policy based data sharing, and attestation for constrained smart meters.

Transmission substations utilize one-time signature authentication technologies and one-way hash chain constructs. The performance constraints of early schemes have since been remedied with the creation of fast-signing and verification technology and buffering-free data processing.

A similar solution has been constructed for constrained SCADA networks. This involves applying a Hash-Based Message Authentication Code to byte streams, converting the random-error detection available on legacy systems to a mechanism that guarantees data authenticity.
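
A minimal sketch of the idea, using Python's standard hmac module (the key and frame bytes below are placeholders): the receiver recomputes the tag over the received bytes, so any tampering, not merely random bit errors, is detected.

  # Sketch: append an HMAC-SHA256 tag to a SCADA-style byte stream so the
  # receiver can verify authenticity, not merely detect random errors.
  import hashlib, hmac

  key = b"shared-secret-provisioned-offline"
  frame = b"\x01\x10\x00\x20\x00\x02\x04\x00\x0a\x01\x02"  # example payload

  tag = hmac.new(key, frame, hashlib.sha256).digest()
  wire = frame + tag                     # transmit payload || 32-byte tag

  # Receiver: recompute the tag and compare in constant time.
  rx_frame, rx_tag = wire[:-32], wire[-32:]
  ok = hmac.compare_digest(
      hmac.new(key, rx_frame, hashlib.sha256).digest(), rx_tag)
  print("authentic:", ok)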

Policy-based data sharing utilizes GPS-clock-synchronized, fine-grained power grid measurements to provide increased grid stability and reliability. It does this through synchrophasor measurements gathered by PMUs.

Attestation for constrained smart meters faces a slightly different challenge, however. One of the biggest issues with attestation for constrained smart meters is that in order to prevent energy theft, and similar attacks, cyber security providers have to make sure that the devices’ software is authentic. To combat this problem, an architecture for constrained smart networks has been created and implemented at a low level in the embedded system.

Other challenges to adoption

Before a utility installs an advanced metering system, or any type of smart system, it must make a business case for the investment. Some components, like the power system stabilizers (PSS) installed on generators are very expensive, require complex integration in the grid's control system, are needed only during emergencies, and are only effective if other suppliers on the network have them. Without any incentive to install them, power suppliers don't. Most utilities find it difficult to justify installing a communications infrastructure for a single application (e.g. meter reading). Because of this, a utility must typically identify several applications that will use the same communications infrastructure – for example, reading a meter, monitoring power quality, remote connection and disconnection of customers, enabling demand response, etc. Ideally, the communications infrastructure will not only support near-term applications, but unanticipated applications that will arise in the future. Regulatory or legislative actions can also drive utilities to implement pieces of a smart grid puzzle. Each utility has a unique set of business, regulatory, and legislative drivers that guide its investments. This means that each utility will take a different path to creating their smart grid and that different utilities will create smart grids at different adoption rates.

Some features of smart grids draw opposition from industries that currently provide, or hope to provide, similar services. An example is competition with cable and DSL Internet providers from broadband over power line internet access. Providers of SCADA control systems for grids have intentionally designed proprietary hardware, protocols and software so that they cannot interoperate with other systems, in order to tie customers to the vendor.

The incorporation of digital communications and computer infrastructure with the grid's existing physical infrastructure poses challenges and inherent vulnerabilities. According to IEEE Security and Privacy Magazine, the smart grid will require that people develop and use large computer and communication infrastructures that support a greater degree of situational awareness and allow for more specific command and control operations. This process is necessary to support major systems such as demand response, wide-area measurement and control, storage and transportation of electricity, and the automation of electric distribution.

Power Theft / Power Loss

Various "smart grid" systems have dual functions. These include Advanced Metering Infrastructure systems which, when used with various software, can detect power theft and, by process of elimination, detect where equipment failures have taken place. These are in addition to their primary functions of eliminating the need for human meter reading and measuring the time-of-use of electricity.
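
A toy version of the process-of-elimination idea (the meter names, readings, and loss threshold are invented; production AMI analytics are far more involved): compare the energy delivered at the feeder head with the sum of the downstream customer meters, and flag any excess beyond expected technical losses:

```python
def audit_feeder(feeder_kwh: float, customer_kwh: dict[str, float],
                 expected_loss_fraction: float = 0.05) -> None:
    """Flag unexplained energy loss on a distribution feeder."""
    billed = sum(customer_kwh.values())
    technical_loss = feeder_kwh * expected_loss_fraction
    unexplained = feeder_kwh - billed - technical_loss
    if unexplained > 0:
        print(f"{unexplained:.1f} kWh unaccounted for: possible theft "
              "or failing equipment on this feeder")
    else:
        print("feeder balances within expected technical losses")

# Feeder delivered 1050 kWh; customers were billed for 950 kWh.
audit_feeder(1050.0, {"meter_A": 400.0, "meter_B": 350.0, "meter_C": 200.0})
```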

The worldwide power loss, including theft, is estimated at approximately US$200 billion annually.

Electricity theft also represents a major challenge when providing reliable electrical service in developing countries.

Deployments and attempted deployments

Enel. The earliest, and one of the largest, examples of a smart grid is the Italian system installed by Enel S.p.A. of Italy. Completed in 2005, the Telegestore project was highly unusual in the utility world because the company designed and manufactured its own meters, acted as its own system integrator, and developed its own system software. The Telegestore project is widely regarded as the first commercial-scale use of smart grid technology in the home, and delivers annual savings of 500 million euro at a project cost of 2.1 billion euro.

US Dept. of Energy - ARRA Smart Grid Project: One of the largest deployment programs in the world to date is the U.S. Dept. of Energy's Smart Grid Program funded by the American Recovery and Reinvestment Act of 2009. This program required matching funding from individual utilities. A total of over $9 billion in public/private funds was invested as part of this program. Technologies included Advanced Metering Infrastructure, including over 65 million advanced "smart" meters, customer interface systems, distribution and substation automation, volt/VAR optimization systems, over 1,000 synchrophasors, dynamic line rating, cyber security projects, advanced distribution management systems, energy storage systems, and renewable energy integration projects. The program consisted of investment grants (matching), demonstration projects, consumer acceptance studies, and workforce education programs. Reports from all individual utility programs, as well as overall impact reports, were scheduled for completion by the second quarter of 2015.

Austin, Texas. In the US, the city of Austin, Texas has been working on building its smart grid since 2003, when its utility first replaced one-third of its manual meters with smart meters that communicate via a wireless mesh network. It currently manages 200,000 devices in real time (smart meters, smart thermostats, and sensors across its service area), and expects to support 500,000 devices in real time in 2009, servicing 1 million consumers and 43,000 businesses.

Boulder, Colorado completed the first phase of its smart grid project in August 2008. Both systems use the smart meter as a gateway to the home automation network (HAN) that controls smart sockets and devices. Some HAN designers favor decoupling control functions from the meter, out of concern over future mismatches with new standards and technologies available from the fast-moving business segment of home electronic devices.

Hydro One, in Ontario, Canada is in the midst of a large-scale Smart Grid initiative, deploying a standards-compliant communications infrastructure from Trilliant. By the end of 2010, the system will serve 1.3 million customers in the province of Ontario. The initiative won the "Best AMR Initiative in North America" award from the Utility Planning Network.

The City of Mannheim in Germany is using realtime Broadband Powerline (BPL) communications in its Model City Mannheim "MoMa" project.

Adelaide in Australia also plans to implement a localised green Smart Grid electricity network in the Tonsley Park redevelopment.

Sydney, also in Australia, in partnership with the Australian Government, implemented the Smart Grid, Smart City program.

Évora. InovGrid is an innovative project in Évora, Portugal that aims to equip the electricity grid with information and devices to automate grid management, improve service quality, reduce operating costs, promote energy efficiency and environmental sustainability, and increase the penetration of renewable energies and electric vehicles. It will be possible to control and manage the state of the entire electricity distribution grid at any given instant, allowing suppliers and energy services companies to use this technological platform to offer consumers information and added-value energy products and services. This project to install an intelligent energy grid places Portugal and EDP at the cutting edge of technological innovation and service provision in Europe.

E-Energy - In the so-called E-Energy projects, several German utilities are creating the first nuclei of smart grids in six independent model regions. A technology competition identified these model regions to carry out research and development activities with the main objective of creating an "Internet of Energy".

Massachusetts. One of the first attempted deployments of "smart grid" technologies in the United States was rejected in 2009 by electricity regulators in the Commonwealth of Massachusetts.

According to an article in the Boston Globe, Northeast Utilities' Western Massachusetts Electric Co. subsidiary attempted to create a "smart grid" program using public subsidies that would switch low-income customers from post-pay to pre-pay billing (using "smart cards"), in addition to hiked "premium" rates for electricity used above a predetermined amount. This plan was rejected by regulators as it "eroded important protections for low-income customers against shutoffs". According to the Boston Globe, the plan "unfairly targeted low-income customers and circumvented Massachusetts laws meant to help struggling consumers keep the lights on". A spokesman for an environmental group supportive of smart grid plans, and of Western Massachusetts Electric's "smart grid" plan in particular, stated: "If used properly, smart grid technology has a lot of potential for reducing peak demand, which would allow us to shut down some of the oldest, dirtiest power plants... It's a tool."

The eEnergy Vermont consortium is a US statewide initiative in Vermont, funded in part through the American Recovery and Reinvestment Act of 2009, in which all of the electric utilities in the state have rapidly adopted a variety of Smart Grid technologies, including about 90% Advanced Metering Infrastructure deployment, and are presently evaluating a variety of dynamic rate structures.

In the Netherlands a large-scale project (>5000 connections, >20 partners) was initiated to demonstrate integrated smart grid technologies, services and business cases.

LIFE Factory Microgrid (LIFE13 ENV / ES / 000700) is a demonstrative project that is part of the LIFE+ 2013 program (European Commission), whose main objective is to demonstrate, through the implementation of a full-scale industrial smart grid, that microgrids can become one of the most suitable solutions for energy generation and management in factories that want to minimize their environmental impact.

EPB in Chattanooga, TN is a municipally-owned electric utility that started construction of a smart grid in 2008, receiving a $111,567,606 grant from the US DOE in 2009 to expedite construction and implementation (for a total budget of $232,219,350). Deployment of power-line interrupters (1170 units) was completed in April 2012, and deployment of smart meters (172,079 units) was completed in 2013. The smart grid's backbone fiber-optic system was also used to provide the first gigabit-speed internet connection to residential customers in the US through the Fiber to the Home initiative, and now speeds of up to 10 gigabits per second are available to residents. The smart grid is estimated to have reduced power outages by an average of 60%, saving the city about 60 million dollars annually. It has also reduced the need for "truck rolls" to scout and troubleshoot faults, resulting in an estimated reduction of 630,000 truck driving miles, and 4.7 million pounds of carbon emissions. In January 2016, EPB became the first major power distribution system to earn Performance Excellence in Electricity Renewal (PEER) certification.

OpenADR Implementations

Certain deployments utilize the OpenADR standard for load shedding and demand reduction during higher demand periods.

China

The smart grid market in China is estimated to be $22.3 billion with a projected growth to $61.4 billion by 2015. Honeywell is developing a demand response pilot and feasibility study for China with the State Grid Corp. of China using the OpenADR demand response standard. The State Grid Corp., the Chinese Academy of Science, and General Electric intend to work together to develop standards for China's smart grid rollout.

United Kingdom

The OpenADR standard was demonstrated in Bracknell, England, where peak use in commercial buildings was reduced by 45 percent. As a result of the pilot, Scottish and Southern Energy (SSE) said it would connect up to 30 commercial and industrial buildings in the Thames Valley, west of London, to a demand response program.

United States

In 2009, the US Department of Energy awarded an $11 million grant to Southern California Edison and Honeywell for a demand response program that automatically turns down energy use during peak hours for participating industrial customers. The Department of Energy awarded an $11.4 million grant to Honeywell to implement the program using the OpenADR standard.

Hawaiian Electric Co. (HECO) is implementing a two-year pilot project to test the ability of an ADR program to respond to the intermittence of wind power. Hawaii has a goal to obtain 70 percent of its power from renewable sources by 2030. HECO will give customers incentives for reducing power consumption within 10 minutes of a notice.

Guidelines, standards and user groups

Part of the IEEE Smart Grid Initiative, IEEE 2030.2 represents an extension of the work aimed at utility storage systems for transmission and distribution networks. The IEEE P2030 group expects to deliver in early 2011 an overarching set of guidelines on smart grid interfaces. The new guidelines will cover areas including batteries and supercapacitors as well as flywheels. The group has also spun out a 2030.1 effort drafting guidelines for integrating electric vehicles into the smart grid.

IEC TC 57 has created a family of international standards that can be used as part of the smart grid. These standards include IEC 61850 which is an architecture for substation automation, and IEC 61970/61968 – the Common Information Model (CIM). The CIM provides for common semantics to be used for turning data into information.

OpenADR is an open-source smart grid communications standard used for demand response applications. It is typically used to send information and signals to cause electrical power-using devices to be turned off during periods of higher demand.
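
As a schematic sketch only (the event fields and levels below are invented for illustration, not the actual OpenADR schema, which uses richer XML payloads), a demand-response client might map event signals to load-shedding actions like this:

```python
# Hypothetical, simplified demand-response event (not real OpenADR XML).
event = {"signal_level": 2, "start": "2020-12-09T17:00:00Z", "duration_min": 60}

def respond(event: dict) -> str:
    """Map a demand-response signal level to a local load-shedding action."""
    level = event["signal_level"]
    if level >= 2:
        return "shed non-critical loads (HVAC setback, defer pumps)"
    if level == 1:
        return "moderate curtailment (dim lighting, raise setpoints)"
    return "normal operation"

print(respond(event))  # -> shed non-critical loads (HVAC setback, defer pumps)
```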

MultiSpeak has created a specification that supports distribution functionality of the smart grid. MultiSpeak has a robust set of integration definitions that supports nearly all of the software interfaces necessary for a distribution utility or for the distribution portion of a vertically integrated utility. MultiSpeak integration is defined using extensible markup language (XML) and web services.

The IEEE has created a standard to support synchrophasors – C37.118.

The UCA International User Group discusses and supports real world experience of the standards used in smart grids.

A utility task group within LonMark International deals with smart grid related issues.

There is a growing trend towards the use of TCP/IP technology as a common communication platform for smart meter applications, so that utilities can deploy multiple communication systems, while using IP technology as a common management platform.

IEEE P2030 is an IEEE project developing a "Draft Guide for Smart Grid Interoperability of Energy Technology and Information Technology Operation with the Electric Power System (EPS), and End-Use Applications and Loads".

NIST has included ITU-T G.hn as one of the "Standards Identified for Implementation" for the Smart Grid "for which it believed there was strong stakeholder consensus". G.hn is a standard for high-speed communications over power lines, phone lines and coaxial cables.

OASIS EnergyInterop – An OASIS technical committee developing XML standards for energy interoperation. Its starting point is the California OpenADR standard.

Under the Energy Independence and Security Act of 2007 (EISA), NIST is charged with overseeing the identification and selection of hundreds of standards that will be required to implement the Smart Grid in the U.S. These standards will be referred by NIST to the Federal Energy Regulatory Commission (FERC). This work has begun, and the first standards have already been selected for inclusion in NIST's Smart Grid catalog. However, some commentators have suggested that the benefits that could be realized from Smart Grid standardization could be threatened by a growing number of patents that cover Smart Grid architecture and technologies. If patents that cover standardized Smart Grid elements are not revealed until technology is broadly distributed throughout the network ("locked-in"), significant disruption could occur when patent holders seek to collect unanticipated rents from large segments of the market.

GridWise Alliance rankings

In November 2017 the non-profit GridWise Alliance along with Clean Edge Inc., a clean energy group, released rankings for all 50 states in their efforts to modernize the electric grid. California was ranked number one. The other top states were Illinois, Texas, Maryland, Oregon, Arizona, the District of Columbia, New York, Nevada and Delaware. "The 30-plus page report from the GridWise Alliance, which represents stakeholders that design, build and operate the electric grid, takes a deep dive into grid modernization efforts across the country and ranks them by state."

Tuesday, December 8, 2020

Kirchhoff's law of thermal radiation

Gustav Kirchhoff (1824–1887)

In heat transfer, Kirchhoff's law of thermal radiation refers to wavelength-specific radiative emission and absorption by a material body in thermodynamic equilibrium, including radiative exchange equilibrium.

A body at temperature T radiates electromagnetic energy. A perfect black body in thermodynamic equilibrium absorbs all light that strikes it, and radiates energy according to a unique law of radiative emissive power for temperature T, universal for all perfect black bodies. Kirchhoff's law states that:

For a body of any arbitrary material emitting and absorbing thermal electromagnetic radiation at every wavelength in thermodynamic equilibrium, the ratio of its emissive power to its dimensionless coefficient of absorption is equal to a universal function only of radiative wavelength and temperature. That universal function describes the perfect black-body emissive power.

Here, the dimensionless coefficient of absorption (or the absorptivity) is the fraction of incident light (power) that is absorbed by the body when it is radiating and absorbing in thermodynamic equilibrium.

In slightly different terms, the emissive power of an arbitrary opaque body of fixed size and shape at a definite temperature can be described by a dimensionless ratio, sometimes called the emissivity: the ratio of the emissive power of the body to the emissive power of a black body of the same size and shape at the same fixed temperature. With this definition, Kirchhoff's law states, in simpler language:

For an arbitrary body emitting and absorbing thermal radiation in thermodynamic equilibrium, the emissivity is equal to the absorptivity.

In some cases, emissive power and absorptivity may be defined to depend on angle, as described below. The condition of thermodynamic equilibrium is necessary in the statement, because the equality of emissivity and absorptivity often does not hold when the material of the body is not in thermodynamic equilibrium.

Kirchhoff's law has another corollary: the emissivity cannot exceed one (because the absorptivity cannot, by conservation of energy), so it is not possible to thermally radiate more energy than a black body, at equilibrium. In negative luminescence, the angle- and wavelength-integrated absorption exceeds the material's emission; however, such systems are powered by an external source and are therefore not in thermodynamic equilibrium.

History

Before Kirchhoff's law was recognized, it had been experimentally established that a good absorber is a good emitter, and a poor absorber is a poor emitter. Naturally, a good reflector must be a poor absorber. This is why, for example, lightweight emergency thermal blankets are based on reflective metallic coatings: they lose little heat by radiation.

Kirchhoff's great insight was to recognize the universality and uniqueness of the function that describes the black body emissive power. But he did not know the precise form or character of that universal function. Attempts were made by Lord Rayleigh and Sir James Jeans in 1900–1905 to describe it in classical terms, resulting in the Rayleigh–Jeans law. This law turned out to be inconsistent with observation, yielding the ultraviolet catastrophe. The correct form of the law was found by Max Planck in 1900, assuming quantized emission of radiation, and is termed Planck's law. This marks the advent of quantum mechanics.

Theory

In a blackbody enclosure that contains electromagnetic radiation with a certain amount of energy at thermodynamic equilibrium, this "photon gas" will have a Planck distribution of energies.

One may suppose a second system, a cavity with walls that are opaque, rigid, and not perfectly reflective to any wavelength, to be brought into connection, through an optical filter, with the blackbody enclosure, both at the same temperature. Radiation can pass from one system to the other. For example, suppose in the second system the density of photons in a narrow frequency band around a wavelength $\lambda$ were higher than that of the first system. If the optical filter passed only that frequency band, then there would be a net transfer of photons, and their energy, from the second system to the first. This is in violation of the second law of thermodynamics, which requires that there can be no net transfer of heat between two bodies at the same temperature.

In the second system, therefore, at each frequency, the walls must absorb and emit energy in such a way as to maintain the black body distribution. Hence absorptivity and emissivity must be equal. The absorptivity $\alpha_\lambda$ of the wall is the ratio of the energy absorbed by the wall to the energy incident on the wall, for a particular wavelength. Thus the absorbed energy is $\alpha_\lambda E_{b\lambda}(\lambda, T)$, where $E_{b\lambda}(\lambda, T)$ is the intensity of black body radiation at wavelength $\lambda$ and temperature $T$. Independent of the condition of thermal equilibrium, the emissivity $\varepsilon_\lambda$ of the wall is defined as the ratio of emitted energy to the amount that would be radiated if the wall were a perfect black body. The emitted energy is thus $\varepsilon_\lambda E_{b\lambda}(\lambda, T)$, where $\varepsilon_\lambda$ is the emissivity at wavelength $\lambda$. For the maintenance of thermal equilibrium, these two quantities must be equal, or else the distribution of photon energies in the cavity will deviate from that of a black body. This yields Kirchhoff's law:

$$\alpha_\lambda = \varepsilon_\lambda$$

By a similar, but more complicated argument, it can be shown that, since black body radiation is equal in every direction (isotropic), the emissivity and the absorptivity, if they happen to be dependent on direction, must again be equal for any given direction.

Average and overall absorptivity and emissivity data are often given for materials with values which differ from each other. For example, white paint is quoted as having an absorptivity of 0.16, while having an emissivity of 0.93. This is because the absorptivity is averaged with weighting for the solar spectrum, while the emissivity is weighted for the emission of the paint itself at normal ambient temperatures. The absorptivity quoted in such cases is calculated by:

$$\alpha = \frac{\int \alpha(\lambda)\, I_{\mathrm{sun}}(\lambda)\, d\lambda}{\int I_{\mathrm{sun}}(\lambda)\, d\lambda}$$

while the average emissivity is given by:

$$\varepsilon = \frac{\int \varepsilon(\lambda)\, I_{\mathrm{paint}}(\lambda)\, d\lambda}{\int I_{\mathrm{paint}}(\lambda)\, d\lambda}$$

where $I_{\mathrm{sun}}(\lambda)$ is the emission spectrum of the sun and $I_{\mathrm{paint}}(\lambda)$ is the emission spectrum of the paint. Although, by Kirchhoff's law, $\varepsilon(\lambda) = \alpha(\lambda)$ in the above equations, the averages $\alpha$ and $\varepsilon$ are not generally equal to each other. The white paint will serve as a very good insulator against solar radiation, because it is very reflective of the solar radiation, and although it therefore emits poorly in the solar band, its temperature will be around room temperature, and it will emit whatever radiation it has absorbed in the infrared, where its emission coefficient is high.
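
As an idealized worked example (assumptions: a flat, sun-facing surface receiving solar irradiance $S = 1000\ \mathrm{W/m^2}$, with convection and atmospheric back-radiation ignored), balancing absorbed solar power against emitted thermal power gives the steady-state temperature:

$$\alpha S = \varepsilon \sigma T^{4} \quad\Longrightarrow\quad T = \left(\frac{\alpha S}{\varepsilon \sigma}\right)^{1/4} = \left(\frac{0.16 \times 1000}{0.93 \times 5.67\times 10^{-8}}\right)^{1/4} \approx 235\ \mathrm{K}.$$

For a gray surface with $\alpha = \varepsilon$, the same balance gives $T = (S/\sigma)^{1/4} \approx 364\ \mathrm{K}$, which is why the low solar absorptivity and high infrared emissivity of white paint keep it so much cooler; in reality, convection and atmospheric radiation keep both surfaces much closer to ambient temperature.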

Black bodies

Near-black materials

It has long been known that a lamp-black coating will make a body nearly black. Some other materials are nearly black in particular wavelength bands. However, not all such materials survive the very high temperatures that are of interest.

An improvement on lamp-black is found in manufactured carbon nanotubes. Nano-porous materials can achieve refractive indices close to that of vacuum, in one case obtaining an average reflectance of 0.045%.

Opaque bodies

Bodies that are opaque to thermal radiation that falls on them are valuable in the study of heat radiation. Planck analyzed such bodies with the approximation that they be considered topologically to have an interior and to share an interface. They share the interface with their contiguous medium, which may be rarefied material such as air, or transparent material, through which observations can be made. The interface is not a material body and can neither emit nor absorb. It is a mathematical surface belonging jointly to the two media that touch it. It is the site of refraction of radiation that penetrates it and of reflection of radiation that does not. As such it obeys the Helmholtz reciprocity principle. The opaque body is considered to have a material interior that absorbs all and scatters or transmits none of the radiation that reaches it through refraction at the interface. In this sense the material of the opaque body is black to radiation that reaches it, while the whole phenomenon, including the interior and the interface, does not show perfect blackness. In Planck's model, perfectly black bodies, which he noted do not exist in nature, besides their opaque interior, have interfaces that are perfectly transmitting and non-reflective.

Cavity radiation

The walls of a cavity can be made of opaque materials that absorb significant amounts of radiation at all wavelengths. It is not necessary that every part of the interior walls be a good absorber at every wavelength. The effective range of absorbing wavelengths can be extended by the use of patches of several differently absorbing materials in parts of the interior walls of the cavity. In thermodynamic equilibrium the cavity radiation will precisely obey Planck's law. In this sense, thermodynamic equilibrium cavity radiation may be regarded as thermodynamic equilibrium black-body radiation to which Kirchhoff's law applies exactly, though no perfectly black body in Kirchhoff's sense is present.

A theoretical model considered by Planck consists of a cavity with perfectly reflecting walls, initially with no material contents, into which is then put a small piece of carbon. Without the small piece of carbon, there is no way for non-equilibrium radiation initially in the cavity to drift towards thermodynamic equilibrium. When the small piece of carbon is put in, it transduces amongst radiation frequencies so that the cavity radiation comes to thermodynamic equilibrium.

A hole in the wall of a cavity

For experimental purposes, a hole in a cavity can be devised to provide a good approximation to a black surface, but will not be perfectly Lambertian, and must be viewed from nearly right angles to get the best properties. The construction of such devices was an important step in the empirical measurements that led to the precise mathematical identification of Kirchhoff's universal function, now known as Planck's law.

Kirchhoff's perfect black bodies

Planck also noted that the perfect black bodies of Kirchhoff do not occur in physical reality. They are theoretical fictions. Kirchhoff's perfect black bodies absorb all the radiation that falls on them, right in an infinitely thin surface layer, with no reflection and no scattering. They emit radiation in perfect accord with Lambert's cosine law.

Original statements

Gustav Kirchhoff stated his law in several papers in 1859 and 1860, and then in 1862 in an appendix to his collected reprints of those and some related papers.

Prior to Kirchhoff's studies, it was known that for total heat radiation, the ratio of emissive power to absorptive ratio was the same for all bodies emitting and absorbing thermal radiation in thermodynamic equilibrium. This means that a good absorber is a good emitter. Naturally, a good reflector is a poor absorber. For wavelength specificity, prior to Kirchhoff, the ratio was shown experimentally by Balfour Stewart to be the same for all bodies, but the universal value of the ratio had not been explicitly considered in its own right as a function of wavelength and temperature.

Kirchhoff's original contribution to the physics of thermal radiation was his postulate of a perfect black body radiating and absorbing thermal radiation in an enclosure opaque to thermal radiation and with walls that absorb at all wavelengths. Kirchhoff's perfect black body absorbs all the radiation that falls upon it.

Every such black body emits from its surface with a spectral radiance that Kirchhoff labeled I (for specific intensity, the traditional name for spectral radiance).

Kirchhoff's postulated spectral radiance I was a universal function, one and the same for all black bodies, only depending on wavelength and temperature.

The precise mathematical expression for that universal function I was very much unknown to Kirchhoff, and it was just postulated to exist, until its precise mathematical expression was found in 1900 by Max Planck. It is nowadays referred to as Planck's law.

Then, at each wavelength, for thermodynamic equilibrium in an enclosure, opaque to heat rays, with walls that absorb some radiation at every wavelength:

For an arbitrary body emitting and absorbing thermal radiation, the ratio E / A between the emissive spectral radiance, E, and the dimensionless absorptive ratio, A, is one and the same for all bodies at a given temperature. That ratio E / A is equal to the emissive spectral radiance I of a perfect black body, a universal function only of wavelength and temperature.

 

Albedo

From Wikipedia, the free encyclopedia

The percentage of diffusely reflected sunlight relative to various surface conditions

Albedo (/ælˈbiːdoʊ/) (Latin: albedo, meaning 'whiteness') is the measure of the diffuse reflection of solar radiation out of the total solar radiation, measured on a scale from 0, corresponding to a black body that absorbs all incident radiation, to 1, corresponding to a body that reflects all incident radiation.

Surface albedo is defined as the ratio of radiosity to the irradiance (flux per unit area) received by a surface. The proportion reflected is not only determined by properties of the surface itself, but also by the spectral and angular distribution of solar radiation reaching the Earth's surface. These factors vary with atmospheric composition, geographic location and time (see position of the Sun). While bi-hemispherical reflectance is calculated for a single angle of incidence (i.e., for a given position of the Sun), albedo is the directional integration of reflectance over all solar angles in a given period. The temporal resolution may range from seconds (as obtained from flux measurements) to daily, monthly, or annual averages.

Unless given for a specific wavelength (spectral albedo), albedo refers to the entire spectrum of solar radiation. Due to measurement constraints, it is often given for the spectrum in which most solar energy reaches the surface (between 0.3 and 3 μm). This spectrum includes visible light (0.4–0.7 μm), which explains why surfaces with a low albedo appear dark (e.g., trees absorb most radiation), whereas surfaces with a high albedo appear bright (e.g., snow reflects most radiation).

Albedo is an important concept in climatology, astronomy, and environmental management (e.g., as part of the Leadership in Energy and Environmental Design (LEED) program for sustainable rating of buildings). The average albedo of the Earth from the upper atmosphere, its planetary albedo, is 30–35% because of cloud cover, but it varies widely locally across the surface because of different geological and environmental features.

The term albedo was introduced into optics by Johann Heinrich Lambert in his 1760 work Photometria.

Terrestrial albedo

Any albedo in visible light falls within a range of about 0.9 for fresh snow to about 0.04 for charcoal, one of the darkest substances. Deeply shadowed cavities can achieve an effective albedo approaching the zero of a black body. When seen from a distance, the ocean surface has a low albedo, as do most forests, whereas desert areas have some of the highest albedos among landforms. Most land areas are in an albedo range of 0.1 to 0.4. The average albedo of Earth is about 0.3. This is far higher than for the ocean primarily because of the contribution of clouds.

2003–2004 mean annual clear-sky and total-sky albedo

Earth's surface albedo is regularly estimated via Earth observation satellite sensors such as NASA's MODIS instruments on board the Terra and Aqua satellites, and the CERES instrument on the Suomi NPP and JPSS satellites. As the amount of reflected radiation is measured by satellite for only a single direction, not all directions, a mathematical model is used to translate a sample set of satellite reflectance measurements into estimates of directional-hemispherical reflectance and bi-hemispherical reflectance. These calculations are based on the bidirectional reflectance distribution function (BRDF), which describes how the reflectance of a given surface depends on the view angle of the observer and the solar angle. The BRDF can facilitate translations of observations of reflectance into albedo.

Earth's average surface temperature due to its albedo and the greenhouse effect is currently about 15 °C. If Earth were frozen entirely (and hence more reflective), the average temperature of the planet would drop below −40 °C. If only the continental land masses became covered by glaciers, the mean temperature of the planet would drop to about 0 °C. In contrast, if the entire Earth were covered by water – a so-called ocean planet – the average temperature on the planet would rise to almost 27 °C.

White-sky, black-sky, and blue-sky albedo

For land surfaces, it has been shown that the albedo at a particular solar zenith angle $\theta_i$ can be approximated by the proportionate sum of two terms: the directional-hemispherical reflectance at that solar zenith angle, $\bar{\alpha}(\theta_i)$ (the black-sky albedo), and the bi-hemispherical reflectance, $\bar{\bar{\alpha}}$ (the white-sky albedo).

With $1 - D$ being the proportion of direct radiation from a given solar angle, and $D$ being the proportion of diffuse illumination, the actual albedo $\alpha$ (also called blue-sky albedo) can then be given as:

$$\alpha = (1 - D)\,\bar{\alpha}(\theta_i) + D\,\bar{\bar{\alpha}}$$

This formula is important because it allows the albedo to be calculated for any given illumination conditions from a knowledge of the intrinsic properties of the surface.
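
A minimal sketch of that interpolation (the variable names and example values are illustrative):

```python
def blue_sky_albedo(black_sky: float, white_sky: float, diffuse_fraction: float) -> float:
    """Blend black-sky and white-sky albedo by the diffuse fraction D."""
    d = diffuse_fraction
    return (1.0 - d) * black_sky + d * white_sky

# Example: mostly direct sunlight (D = 0.2) over a surface with
# black-sky albedo 0.23 at the current solar zenith angle and
# white-sky albedo 0.25.
print(blue_sky_albedo(0.23, 0.25, 0.2))  # -> 0.234
```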

Astronomical albedo

The albedos of planets, satellites and minor planets such as asteroids can be used to infer much about their properties. The study of albedos, their dependence on wavelength, lighting angle ("phase angle"), and variation in time composes a major part of the astronomical field of photometry. For small and far objects that cannot be resolved by telescopes, much of what we know comes from the study of their albedos. For example, the absolute albedo can indicate the surface ice content of outer Solar System objects, the variation of albedo with phase angle gives information about regolith properties, whereas unusually high radar albedo is indicative of high metal content in asteroids.

Enceladus, a moon of Saturn, has one of the highest known albedos of any body in the Solar System, with an albedo of 0.99. Another notable high-albedo body is Eris, with an albedo of 0.96. Many small objects in the outer Solar System and asteroid belt have low albedos down to about 0.05. A typical comet nucleus has an albedo of 0.04. Such a dark surface is thought to be indicative of a primitive and heavily space weathered surface containing some organic compounds.

The overall albedo of the Moon is measured to be around 0.14, but it is strongly directional and non-Lambertian, displaying also a strong opposition effect. Although such reflectance properties are different from those of any terrestrial terrains, they are typical of the regolith surfaces of airless Solar System bodies.

Two common albedos that are used in astronomy are the (V-band) geometric albedo (measuring brightness when illumination comes from directly behind the observer) and the Bond albedo (measuring total proportion of electromagnetic energy reflected). Their values can differ significantly, which is a common source of confusion.

In detailed studies, the directional reflectance properties of astronomical bodies are often expressed in terms of the five Hapke parameters which semi-empirically describe the variation of albedo with phase angle, including a characterization of the opposition effect of regolith surfaces.

The correlation between astronomical (geometric) albedo, absolute magnitude and diameter is:

$$A = \left(\frac{1329 \times 10^{-H/5}}{D}\right)^{2},$$

where $A$ is the astronomical albedo, $D$ is the diameter in kilometers, and $H$ is the absolute magnitude.
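
The relation is easy to transcribe directly; in the sketch below the example values are approximate published figures for Ceres:

```python
def geometric_albedo(abs_magnitude: float, diameter_km: float) -> float:
    """Geometric (V-band) albedo from absolute magnitude H and diameter D (km)."""
    return (1329.0 * 10 ** (-abs_magnitude / 5.0) / diameter_km) ** 2

# Ceres: H ~ 3.34, D ~ 939 km  ->  albedo ~ 0.09
print(round(geometric_albedo(3.34, 939.0), 3))
```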

Examples of terrestrial albedo effects

Illumination

Albedo is not directly dependent on illumination because changing the amount of incoming light proportionally changes the amount of reflected light, except in circumstances where a change in illumination induces a change in the Earth's surface at that location (e.g. through melting of reflective ice). That said, albedo and illumination both vary by latitude. Albedo is highest near the poles and lowest in the subtropics, with a local maximum in the tropics.

Insolation effects

The intensity of albedo temperature effects depends on the albedo and the level of local insolation (solar irradiance); high-albedo areas in the arctic and antarctic regions are cold due to low insolation, whereas areas such as the Sahara Desert, which also have a relatively high albedo, are hotter due to high insolation. Tropical and sub-tropical rainforest areas have low albedo, and are much hotter than their temperate forest counterparts, which have lower insolation. Because insolation plays such a big role in the heating and cooling effects of albedo, high-insolation areas like the tropics will tend to show a more pronounced fluctuation in local temperature when local albedo changes.

Arctic regions notably release more heat back into space than they absorb, effectively cooling the Earth. This has been a concern because arctic ice and snow have been melting at higher rates due to higher temperatures, creating regions in the Arctic that are notably darker (open water or bare ground, which are darker in color) and reflect less heat back into space. This feedback loop results in a reduced albedo effect.

Climate and weather

Albedo affects climate by determining how much radiation a planet absorbs. The uneven heating of Earth from albedo variations between land, ice, or ocean surfaces can drive weather.

Albedo–temperature feedback

When an area's albedo changes due to snowfall, a snow–temperature feedback results. A layer of snowfall increases local albedo, reflecting away sunlight, leading to local cooling. In principle, if no outside temperature change affects this area (e.g., a warm air mass), the raised albedo and lower temperature would maintain the current snow and invite further snowfall, deepening the snow–temperature feedback. However, because local weather is dynamic due to the change of seasons, eventually warm air masses and a more direct angle of sunlight (higher insolation) cause melting. When the melted area reveals surfaces with lower albedo, such as grass or soil, the effect is reversed: the darkening surface lowers albedo, increasing local temperatures, which induces more melting and thus reducing the albedo further, resulting in still more heating.

Snow

Snow albedo is highly variable, ranging from as high as 0.9 for freshly fallen snow, to about 0.4 for melting snow, and as low as 0.2 for dirty snow. Over Antarctica snow albedo averages a little more than 0.8. If a marginally snow-covered area warms, snow tends to melt, lowering the albedo, and hence leading to more snowmelt because more radiation is being absorbed by the snowpack (the ice–albedo positive feedback).

Just as fresh snow has a higher albedo than does dirty snow, the albedo of snow-covered sea ice is far higher than that of sea water. Sea water absorbs more solar radiation than would the same surface covered with reflective snow. When sea ice melts, either due to a rise in sea temperature or in response to increased solar radiation from above, the snow-covered surface is reduced, and more surface of sea water is exposed, so the rate of energy absorption increases. The extra absorbed energy heats the sea water, which in turn increases the rate at which sea ice melts. As with the preceding example of snowmelt, the process of melting of sea ice is thus another example of a positive feedback. Both positive feedback loops have long been recognized as important for global warming.
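
To make the feedback concrete, here is a toy zero-dimensional energy-balance sketch (all numbers are schematic choices for illustration, not a calibrated climate model): albedo is made a decreasing function of temperature near the freezing point, so warming lowers the albedo, which raises the absorbed flux, which warms further:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
EPS = 0.61        # effective emissivity, a crude stand-in for the greenhouse effect

def albedo(t):
    """Schematic ramp: icy (0.6) below 260 K, dark (0.3) above 290 K."""
    if t <= 260.0:
        return 0.6
    if t >= 290.0:
        return 0.3
    return 0.6 - 0.3 * (t - 260.0) / 30.0

def equilibrium(s0, t=250.0, fixed_albedo=None):
    """Iterate emitted = absorbed flux to a steady state."""
    for _ in range(500):
        a = fixed_albedo if fixed_albedo is not None else albedo(t)
        t = ((1.0 - a) * s0 / (EPS * SIGMA)) ** 0.25
    return t

print(equilibrium(342.0))                    # ~251 K baseline
print(equilibrium(410.4))                    # ~302 K: feedback amplifies +20% forcing
print(equilibrium(410.4, fixed_albedo=0.6))  # ~262 K with the feedback switched off
```

With the temperature-dependent albedo, the same increase in forcing produces roughly four times the warming obtained with the albedo held fixed.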

Cryoconite, powdery windblown dust containing soot, sometimes reduces albedo on glaciers and ice sheets.

The dynamical nature of albedo in response to positive feedback, together with the effects of small errors in the measurement of albedo, can lead to large errors in energy estimates. Because of this, in order to reduce the error of energy estimates, it is important to measure the albedo of snow-covered areas through remote sensing techniques rather than applying a single value for albedo over broad regions.

Small-scale effects

Albedo works on a smaller scale, too. In sunlight, dark clothes absorb more heat and light-coloured clothes reflect it better, thus allowing some control over body temperature by exploiting the albedo effect of the colour of external clothing.

Solar photovoltaic effects

Albedo can affect the electrical energy output of solar photovoltaic devices. For example, the effects of a spectrally responsive albedo are illustrated by the differences between the spectrally weighted albedos for photovoltaic technologies based on hydrogenated amorphous silicon (a-Si:H) and crystalline silicon (c-Si) and traditional spectrally integrated albedo predictions. Research showed impacts of over 10%. More recently, the analysis was extended to the effects of spectral bias due to the specular reflectivity of 22 commonly occurring surface materials (both human-made and natural) and analyzed the albedo effects on the performance of seven photovoltaic materials covering three common photovoltaic system topologies: industrial (solar farms), commercial flat rooftops, and residential pitched-roof applications.

Trees

Because forests generally have a low albedo (the majority of the ultraviolet and visible spectrum is absorbed through photosynthesis), some scientists have suggested that greater heat absorption by trees could offset some of the carbon benefits of afforestation (or offset the negative climate impacts of deforestation). In the case of evergreen forests with seasonal snow cover, albedo reduction may be great enough for deforestation to cause a net cooling effect. Trees also impact climate in extremely complicated ways through evapotranspiration. The water vapor causes cooling on the land surface, causes heating where it condenses, acts as a strong greenhouse gas, and can increase albedo when it condenses into clouds. Scientists generally treat evapotranspiration as a net cooling impact, and the net climate impact of albedo and evapotranspiration changes from deforestation depends greatly on local climate.

In seasonally snow-covered zones, winter albedos of treeless areas are 10% to 50% higher than nearby forested areas because snow does not cover the trees as readily. Deciduous trees have an albedo value of about 0.15 to 0.18 whereas coniferous trees have a value of about 0.09 to 0.15. Variation in summer albedo across both forest types is correlated with maximum rates of photosynthesis because plants with high growth capacity display a greater fraction of their foliage for direct interception of incoming radiation in the upper canopy. The result is that wavelengths of light not used in photosynthesis are more likely to be reflected back to space rather than being absorbed by other surfaces lower in the canopy.

Studies by the Hadley Centre have investigated the relative (generally warming) effect of albedo change and (cooling) effect of carbon sequestration on planting forests. They found that new forests in tropical and midlatitude areas tended to cool; new forests in high latitudes (e.g., Siberia) were neutral or perhaps warming.

Water

Reflectivity of smooth water at 20 °C (refractive index=1.333)

Water reflects light very differently from typical terrestrial materials. The reflectivity of a water surface is calculated using the Fresnel equations (see graph).
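
The Fresnel equations are straightforward to evaluate numerically; the sketch below (assuming unpolarized light at an air-to-water interface with n = 1.333, the conditions of the graph) prints the reflectivity at several angles of incidence:

```python
import math

def fresnel_reflectance(theta_i_deg, n1=1.0, n2=1.333):
    """Unpolarized reflectance at a smooth dielectric interface (air to water)."""
    theta_i = math.radians(theta_i_deg)
    sin_t = n1 / n2 * math.sin(theta_i)
    if sin_t >= 1.0:          # total internal reflection (cannot occur air-to-water)
        return 1.0
    theta_t = math.asin(sin_t)
    ci, ct = math.cos(theta_i), math.cos(theta_t)
    rs = ((n1 * ci - n2 * ct) / (n1 * ci + n2 * ct)) ** 2   # s-polarization
    rp = ((n1 * ct - n2 * ci) / (n1 * ct + n2 * ci)) ** 2   # p-polarization
    return 0.5 * (rs + rp)    # unpolarized light: average of the two

# ~0.02 at normal incidence, rising steeply toward 1 at grazing incidence.
for angle in (0, 30, 60, 80, 89):
    print(f"{angle:2d} deg: {fresnel_reflectance(angle):.3f}")
```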

At the scale of the wavelength of light, even wavy water is always smooth, so the light is reflected in a locally specular manner (not diffusely). The glint of light off water is a commonplace effect of this. At small angles of incident light, waviness results in reduced reflectivity because of the steepness of the reflectivity-vs.-incident-angle curve and a locally increased average incident angle.

Although the reflectivity of water is very low at low and medium angles of incident light, it becomes very high at high angles of incident light such as those that occur on the illuminated side of Earth near the terminator (early morning, late afternoon, and near the poles). However, as mentioned above, waviness causes an appreciable reduction. Because light specularly reflected from water does not usually reach the viewer, water is usually considered to have a very low albedo in spite of its high reflectivity at high angles of incident light.

Note that white caps on waves look white (and have high albedo) because the water is foamed up, so there are many superimposed bubble surfaces which reflect, adding up their reflectivities. Fresh 'black' ice exhibits Fresnel reflection. Snow on top of this sea ice increases the albedo to 0.9.

Clouds

Cloud albedo has substantial influence over atmospheric temperatures. Different types of clouds exhibit different reflectivity, theoretically ranging in albedo from a minimum of near 0 to a maximum approaching 0.8. "On any given day, about half of Earth is covered by clouds, which reflect more sunlight than land and water. Clouds keep Earth cool by reflecting sunlight, but they can also serve as blankets to trap warmth."

Albedo and climate in some areas are affected by artificial clouds, such as those created by the contrails of heavy commercial airliner traffic. A study following the burning of the Kuwaiti oil fields during Iraqi occupation showed that temperatures under the burning oil fires were as much as 10 °C colder than temperatures several miles away under clear skies.

Aerosol effects

Aerosols (very fine particles/droplets in the atmosphere) have both direct and indirect effects on Earth's radiative balance. The direct (albedo) effect is generally to cool the planet; the indirect effect (the particles act as cloud condensation nuclei and thereby change cloud properties) is less certain. As per Spracklen et al. the effects are:

  • Aerosol direct effect. Aerosols directly scatter and absorb radiation. The scattering of radiation causes atmospheric cooling, whereas absorption can cause atmospheric warming.
  • Aerosol indirect effect. Aerosols modify the properties of clouds through a subset of the aerosol population called cloud condensation nuclei. Increased nuclei concentrations lead to increased cloud droplet number concentrations, which in turn leads to increased cloud albedo, increased light scattering and radiative cooling (first indirect effect), but also leads to reduced precipitation efficiency and increased lifetime of the cloud (second indirect effect).

Black carbon

Another albedo-related effect on the climate is from black carbon particles. The size of this effect is difficult to quantify: the Intergovernmental Panel on Climate Change estimates that the global mean radiative forcing for black carbon aerosols from fossil fuels is +0.2 W m−2, with a range +0.1 to +0.4 W m−2. Black carbon is a bigger cause of the melting of the polar ice cap in the Arctic than carbon dioxide due to its effect on the albedo.

Human activities

Human activities (e.g., deforestation, farming, and urbanization) change the albedo of various areas around the globe. However, quantification of this effect on the global scale is difficult; further study is required to determine anthropogenic effects.

Other types of albedo

Single-scattering albedo is used to define scattering of electromagnetic waves on small particles. It depends on properties of the material (refractive index); the size of the particle or particles; and the wavelength of the incoming radiation.

 

Lie point symmetry

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Lie_point_symmetry     ...