Wednesday, October 12, 2022

Universal design

From Wikipedia, the free encyclopedia

Universal design is the design of buildings, products or environments to make them accessible to all people, regardless of age, disability or other factors.

The term "universal design" was coined by the architect Ronald Mace to describe the concept of designing all products and the built environment to be aesthetic and usable to the greatest extent possible by everyone, regardless of their age, ability, or status in life.

Universal design emerged from slightly earlier barrier-free concepts, the broader accessibility movement, and adaptive and assistive technology, and also seeks to blend aesthetics into these core considerations. As life expectancy rises and modern medicine increases the survival rate of those with significant injuries, illnesses, and birth defects, there is a growing interest in universal design. Universal design has achieved strong market penetration in many industries, but in many others it has not yet been adopted to any great extent. Universal design is also being applied to the design of technology, instruction, services, and other products and environments.

However, it was the work of Selwyn Goldsmith, author of Designing for the Disabled (1963), who really pioneered the concept of free access for people with disabilities. His most significant achievement was the creation of the dropped curb – now a standard feature of the built environment.

Curb cuts or sidewalk ramps, essential for people in wheelchairs but also used by all, are a common example. Color-contrast dishware with steep sides, which assists those with visual or dexterity problems, is another. There are also cabinets with pull-out shelves, kitchen counters at several heights to accommodate different tasks and postures, and, in many of the world's public transit systems, low-floor buses that "kneel" (bring their front end to ground level to eliminate a gap) and/or are equipped with ramps rather than on-board lifts.

Principles

The Center for Universal Design at North Carolina State University expounded the following principles:

  1. Equitable use
  2. Flexibility in use
  3. Simple and intuitive
  4. Perceptible information
  5. Tolerance for error
  6. Low physical effort
  7. Size and space for approach and use

Each principle above is succinctly defined and contains a few brief guidelines that can be applied to design processes in any realm: physical or digital.

These principles are broader than those of accessible design and barrier-free design.

Goals

In 2012, the Center for Inclusive Design and Environmental Access at the University at Buffalo expanded the definition of the principles of universal design to include social participation and health and wellness. Rooted in evidence-based design, eight goals of universal design were also developed.

  1. Body Fit
  2. Comfort
  3. Awareness
  4. Understanding
  5. Wellness
  6. Social Integration
  7. Personalization
  8. Cultural Appropriateness

The first four goals are oriented to human performance: anthropometry, biomechanics, perception, and cognition. Wellness bridges human performance and social participation. The last three goals address social participation outcomes. The definition and the goals are expanded upon in the textbook "Universal Design: Creating Inclusive Environments."

Examples

  • Smooth, ground level entrances without stairs
  • Surface textures that require low force to traverse on level, less than 5 pounds force per 120 pounds rolling force
  • Surfaces that are stable, firm, and slip resistant per ASTM 2047
  • Wide interior doors (3'0"), hallways, and alcoves with 60" × 60" turning space at doors and dead-ends
  • Functional clearances for approach and use of elements and components
  • Lever handles for opening doors rather than twisting knobs
  • Single-hand operation with closed fist for operable components including fire alarm pull stations
  • Components that do not require tight grasping, pinching or twisting of the wrist
  • Components that require less than 5 pounds of force to operate
  • Cash
  • Light switches with large flat panels rather than small toggle switches
  • Buttons and other controls that can be distinguished by touch
  • Bright and appropriate lighting, particularly task lighting
  • Auditory output redundant with information on visual displays
  • Visual output redundant with information in auditory output
  • Contrast controls on visual output
  • Use of meaningful icons with text labels
  • Clear lines of sight to reduce dependence on sound
  • Volume controls on auditory output
  • Speed controls on auditory output
  • Choice of language on speech output
  • Ramp access in swimming pools
  • Closed captioning on television networks
  • Signs with light-on-dark visual contrast
  • Web pages that provide alternative text to describe images
  • Instruction that presents material both orally and visually
  • Labels in large print on equipment control buttons
  • A museum that allows visitors to choose to listen to or read descriptions
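Several of the examples above carry over directly to digital design. As a minimal sketch (not from the article, and simplified relative to real accessibility audits, where an empty alt attribute is legitimately used for purely decorative images), a page can be scanned for images that lack alternative text using only Python's standard library:

```python
# Sketch: flag <img> tags with missing or empty alt text,
# using only Python's standard-library HTML parser.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects the src of every <img> whose alt attribute is absent or empty."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_dict = dict(attrs)
            if not attr_dict.get("alt"):
                self.missing_alt.append(attr_dict.get("src", "<no src>"))

def images_missing_alt(html: str) -> list:
    checker = AltTextChecker()
    checker.feed(html)
    return checker.missing_alt

# Example: the second image has no alt text and is flagged.
page = '<img src="ramp.jpg" alt="Curb ramp at a crossing"><img src="logo.png">'
print(images_missing_alt(page))  # ['logo.png']
```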

Design standards

In 1960, specifications for barrier-free design were published, a compendium of over 11 years of disability ergonomics research. In 1961, the specifications were published as the first barrier-free design standard, American National Standard A117.1. It was the first standard to present criteria for designing facilities and programs for use by disabled individuals. The research started in 1949 at the University of Illinois Urbana-Champaign and continues to this day. The principal investigator was Dr. Timothy Nugent (his name is listed in the front of the 1961, 1971, and 1980 standards). In 1949, Dr. Nugent also started the National Wheelchair Basketball Association. This ANSI A117.1 standard was adopted by the US federal government's General Services Administration under 35 FR 4814 (3/20/70), 39 FR 23214 (6/27/74), 43 FR 16478 ABA (4/19/78), 44 FR 39393 (7/6/79), and 46 FR 39436 (8/3/81); in 1984 for UFAS; and then in 1990 for the ADA. The archived research documents are at the International Code Council (ICC), ANSI A117.1 division. Dr. Nugent made presentations around the globe in the late 1950s and 1960s, presenting the concept of independent functional participation for individuals with disabilities through program options and architectural design.

Another comprehensive publication is Designing for the Disabled by Selwyn Goldsmith (UK), of which the Royal Institute of British Architects published four editions (1963, 1967, 1976, and 1997). These publications contain valuable empirical data and studies of individuals with disabilities. Both standards are excellent resources for the designer and builder.

Disability ergonomics should be taught to designers, engineers, and non-profit executives to further the understanding of what makes an environment wholly tenable and functional for individuals with disabilities.

In October 2003, representatives from China, Japan, and South Korea met in Beijing and agreed to set up a committee to define common design standards for a wide range of products and services that are easy to understand and use. Their goal is to publish a standard in 2004 which covers, among other areas, standards on containers and wrappings of household goods (based on a proposal from experts in Japan), and standardization of signs for public facilities, a subject which was of particular interest to China as it prepared to host the 2008 Summer Olympics.

The International Organization for Standardization, the European Committee for Electrotechnical Standardization, and the International Electrotechnical Commission have developed:

  • CEN/CENELEC Guide 6 – Guidelines for standards developers to address the needs of older persons and persons with disabilities (Identical to ISO/IEC Guide 71, but free for download)
  • ISO 21542:2021 – Building construction — Accessibility and usability of the built environment (available in English and French)
  • ISO 20282-1:2006 – Ease of operation of everyday products — Part 1: Context of use and user characteristics
  • ISO/TS 20282-2:2013 – Usability of consumer products and products for public use — Part 2: Summative test method, published 1 August 2013.

Design for All

The term Design for All (DfA) is used to describe a design philosophy targeting the use of products, services and systems by as many people as possible without the need for adaptation. "Design for All is design for human diversity, social inclusion and equality" (EIDD Stockholm Declaration, 2004). According to the European Commission, it "encourages manufacturers and service providers to produce new technologies for everyone: technologies that are suitable for the elderly and people with disabilities, as much as the teenage techno wizard." The origin of Design for All lies in the field of barrier free accessibility for people with disabilities and the broader notion of universal design.

Background

Design for All has been highlighted in Europe by the European Commission in seeking a more user-friendly society in Europe. Design for All is about ensuring that environments, products, services and interfaces work for people of all ages and abilities in different situations and under various circumstances.

Design for All has become a mainstream issue because of the aging of the population and its increasingly multi-ethnic composition. It follows a market approach and can reach out to a broader market. Easy-to-use, accessible, affordable products and services improve the quality of life of all citizens. Design for All permits access to the built environment, access to services, and user-friendly products, which are not just a quality factor but a necessity for many aging or disabled persons. Including Design for All early in the design process is more cost-effective than making alterations after solutions are already in the market. This is best achieved by identifying and involving users ("stakeholders") in the decision-making processes that lead to drawing up the design brief, and by educating public- and private-sector decision-makers about the benefits to be gained from making coherent use of Design for All in a wide range of socio-economic situations.

Examples

The following examples of Designs for All were presented in the book Diseños para todos/Designs for All published in 2008 by Optimastudio with the support of Spain's Ministry of Education, Social Affairs and Sports (IMSERSO) and CEAPAT:

Other useful items for those with mobility limitations:

  • Washlet
  • Wireless remote controlled power sockets
  • Wireless remote controlled window shades

In information and communication technology (ICT)

Design for All criteria are aimed at ensuring that everyone can participate in the Information society. The European Union refers to this under the terms eInclusion and eAccessibility. A three-way approach is proposed: goods which can be accessed by nearly all potential users without modification or, failing that, products being easy to adapt according to different needs, or using standardized interfaces that can be accessed simply by using assistive technology. To this end, manufacturers and service providers, especially, but not exclusively, in the Information and Communication Technologies (ICT), produce new technologies, products, services and applications for everyone.

European organizational networks

In Europe, people have joined in networks to promote and develop Design for All:

  • The European Design for All eAccessibility Network (EDeAN) was launched under the lead of the European Commission and the European Member States in 2002. It fosters Design for All for eInclusion, that is, creating an information society for all. It has national contact centres (NCCs) in almost all EU countries and more than 160 network members in national networks.
  • EIDD - Design for All Europe is an NGO and a 100% self-financed European organization that covers the entire area of theory and practice of Design for All, from the built environment and tangible products to communication, service, and system design. Originally set up in 1993 as the European Institute for Design and Disability (EIDD) to enhance the quality of life through Design for All, it changed its name in 2006 to bring it into line with its core business. EIDD - Design for All Europe disseminates the application of Design for All to business and administration communities previously unaware of its benefits and currently (2016) has 31 member organizations in 20 European countries.
  • EuCAN - The European Concept for Accessibility Network started in 1984 as an open network of experts and advocates from all over Europe in order to promote and support the Design for All approach. The coordination work of EuCAN and the functioning of the network are mainly voluntary work. In 1999 the Luxembourg Disability Information and Meeting Centre (better known by its acronym “Info-Handicap”) took over the coordination of the steering group, together with the implicit responsibility for the follow-up of the European Concept for Accessibility (ECA). The EuCAN publications - like ECA - aim to provide practical guidance. They are neither academic nor policy documents.

The "barrier-free" concept

Barrier-free (バリアフリー, bariafurii) building modification consists of modifying buildings or facilities so that they can be used by people who are disabled or have physical impairments. The term is used primarily in Japan and non-English speaking countries (e.g. German: Barrierefreiheit; Finnish: Esteettömyys), while in English-speaking countries, terms such as "accessibility" and "handicapped accessible" dominate in regular everyday use. An example of barrier-free design would be installing a ramp for wheelchairs alongside or in place of steps. In the late 1990s, any element which could make the use of the environment inconvenient was considered a barrier, for example poor public street lighting. In the case of new buildings, however, the idea of barrier-free modification has largely been superseded by the concept of universal design, which seeks to design things from the outset to support easy access.

Freeing a building of barriers means:

  • Recognizing the features that could form barriers for some people
  • Thinking inclusively about the whole range of impairments
  • Reviewing everything - from structure to smallest detail
  • Seeking feedback from users and learning from mistakes

Barrier-free is also a term applied to handicap accessibility in situations not specified by legal codes such as the Americans with Disabilities Act of 1990 guidelines, a law focusing on all building aspects, products, and design that is based on the concept of respecting human rights.

An example of a country that has sought to implement barrier-free accessibility in housing estates is Singapore. Within five years, all 7,800 blocks of apartments in the country's public housing estates benefited from the program.

National legislation

Laws and policies related to accessibility or universal design

Funding agencies

The Rehabilitation Engineering Research Center (RERC) on Universal Design in the Built Environment, funded by what is now the National Institute on Disability, Independent Living, and Rehabilitation Research, completed its activities on September 29, 2021. Twenty RERCs are currently funded. The Center for Inclusive Design and Environmental Access at the University at Buffalo is a current recipient.

Tuesday, October 11, 2022

Virtual currency

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Virtual_currency

Virtual currency, or virtual money, is a digital currency that is largely unregulated, issued and usually controlled by its developers, and used and accepted electronically among the members of a specific virtual community. In 2014, the European Banking Authority defined virtual currency as "a digital representation of value that is neither issued by a central bank or a public authority, nor necessarily attached to a fiat currency, but is accepted by natural or legal persons as a means of payment and can be transferred, stored or traded electronically." A digital currency issued by a central bank is referred to as a central bank digital currency.

Definitions

In 2012, the European Central Bank (ECB) defined virtual currency as "a type of unregulated, digital money, which is issued and usually controlled by its developers, and used and accepted among the members of a specific virtual community".

In 2013, US Financial Crimes Enforcement Network (FinCEN), a bureau of the US Treasury, in contrast to its regulations defining currency as "the coin and paper money of the United States or of any other country that [i] is designated as legal tender and that [ii] circulates and [iii] is customarily used and accepted as a medium of exchange in the country of issuance", also called "real currency" by FinCEN, defined virtual currency as "a medium of exchange that operates like a currency in some environments, but does not have all the attributes of real currency". In particular, virtual currency does not have legal tender status in any jurisdiction.

In 2014, the European Banking Authority defined virtual currency as "a digital representation of value that is neither issued by a central bank or a public authority, nor necessarily attached to a fiat currency, but is accepted by natural or legal persons as a means of payment and can be transferred, stored or traded electronically".

In 2018, Directive (EU) 2018/843 of the European Parliament and of the Council entered into force. The Directive defines the term "virtual currencies" to mean "a digital representation of value that is not issued or guaranteed by a central bank or a public authority, is not necessarily attached to a legally established currency and does not possess a legal status of currency or money, but is accepted by natural or legal persons as a means of exchange and which can be transferred, stored and traded electronically".

History of the term

In a 2013 congressional hearing on virtual currencies, Ben Bernanke said they "have been viewed as a form of 'electronic money' or area of payment system technology that has been evolving over the past 20 years", referencing a 1995 congressional hearing on the Future of Money before the Committee on Banking and Financial Services. The Internet currency Flooz was created in 1999. The term "virtual currency" appears to have been coined around 2009, paralleling the development of digital currencies and social gaming.

Although the correct classification is "digital currency", the US government prefers and has uniformly adopted the term "virtual currency". FinCEN was first, followed by the FBI in 2012 and the General Accounting Office in 2013, as well as the government agencies testifying at the November 2013 US Senate hearing on bitcoin, including the Department of Homeland Security, the US Securities and Exchange Commission, and the Office of the Attorney General.

Limits on being currency

As defined in 2011 in the Code of Federal Regulations, the attributes of a real currency, such as real paper money and real coins, are simply that it acts as legal tender and circulates "customarily".

The IRS decided in March 2014 to treat bitcoin and other virtual currencies as property for tax purposes, not as currency. Some have suggested that this makes bitcoin non-fungible (one bitcoin is not identical to another bitcoin, unlike one gallon of crude oil, which is identical to another gallon of crude oil) and therefore unworkable as a currency. Others have stated that a measure like accounting on an average-cost basis would restore fungibility to the currency.

Categorization by currency flow

Closed virtual currencies

Virtual currencies have been called "closed" or "fictional currency" when they have no official connection to the real economy, for example, currencies in massively multiplayer online role-playing games such as World of Warcraft. While there may be a grey market for exchanging such currencies or other virtual assets for real-world assets, this is usually forbidden by the games' terms of service.

Virtual currencies with currency flow into one direction

This type of currency, units of which may also be circulated as (printed) coupons, stamps or reward points, has been known for a long time in the form of customer incentive programs or loyalty programs. A coupon loses its face value when redeemed for an eligible asset or service (hence: flow in one direction), may be valid for only a limited time and subject to other restrictions set by the issuer. The business issuing the coupon functions as a central authority. Coupons remained unchanged for 100 years until new technology enabling credit cards became more common in the 1980s, and credit card rewards were invented. The latest incarnation drives the increase of internet commerce, online services, development of online communities and games. Here virtual or game currency can be bought, but not exchanged back into real money. The virtual currency is akin to a coupon. Examples are frequent flyer programs by various airlines, Microsoft Points, Nintendo Points, Facebook Credits and Amazon Coin.
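The one-directional flow described above can be sketched as a toy ledger (hypothetical names and exchange rate, not any real program's API): real money buys points, and points can only be redeemed against the issuer, never converted back into cash.

```python
class LoyaltyProgram:
    """Toy model of a one-directional virtual currency (illustrative only).

    Real money flows in (purchase -> points); points flow out only as
    redemptions against the issuing business. There is deliberately no
    method to convert points back into real money.
    """
    POINTS_PER_DOLLAR = 10  # made-up exchange rate

    def __init__(self):
        self.balances = {}

    def purchase(self, member: str, dollars: float) -> int:
        """Real currency enters; virtual currency is credited."""
        points = int(dollars * self.POINTS_PER_DOLLAR)
        self.balances[member] = self.balances.get(member, 0) + points
        return points

    def redeem(self, member: str, points: int) -> None:
        """Virtual currency leaves only as goods/services from the issuer."""
        if self.balances.get(member, 0) < points:
            raise ValueError("insufficient points")
        self.balances[member] -= points

program = LoyaltyProgram()
program.purchase("alice", 25.0)   # $25 in -> 250 points
program.redeem("alice", 100)      # points redeemed, never cashed out
print(program.balances["alice"])  # 150
```

The issuer acts as the central authority: it alone sets the exchange rate, credits balances, and accepts redemptions, which is exactly the coupon-like structure the paragraph describes.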

Convertible virtual currencies

A virtual currency that can be bought with, and sold back for, real money is called a convertible currency. A virtual currency can be decentralized, as for example bitcoin, a cryptocurrency. Transacting in, or even holding, convertible virtual currency may be illegal in particular jurisdictions, for particular national citizens, or at particular times, and may leave the transactor, recipient, or facilitator liable to prosecution by the state.

Centralized versus decentralized

FinCEN defined centralized virtual currencies in 2013 as virtual currencies that have a "centralized repository", similar to a central bank, and a "central administrator".

A decentralized currency was defined by the US Department of Treasury as a "currency (1) that has no central repository and no single administrator, and (2) that persons may obtain by their own computing or manufacturing effort". Rather than relying on confidence in a central authority, it depends instead on a distributed system of trust.

The money matrix

Digital currency is a particular form of currency which is electronically transferred and stored, i.e., distinct from physical currency, such as coins or banknotes. According to the European Central Bank, virtual currencies are "generally digital", although their enduring precursor, the coupon, for example, is physical.

A cryptocurrency is a digital currency using cryptography to secure transactions and to control the creation of new currency units. Since not all virtual currencies use cryptography, not all virtual currencies are cryptocurrencies.
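As a rough illustration of how cryptography can "secure transactions" (a sketch of a hash-linked ledger, not the protocol of any actual cryptocurrency), each record can embed a hash of its predecessor, so tampering with history invalidates every later link:

```python
import hashlib
import json

def make_block(transactions, prev_hash: str) -> dict:
    """Create a record whose identity depends on its contents AND its predecessor."""
    body = {"transactions": transactions, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def chain_is_valid(chain) -> bool:
    """Recompute each hash and check each back-link; any tampering breaks them."""
    for i, block in enumerate(chain):
        body = {"transactions": block["transactions"], "prev_hash": block["prev_hash"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block(["alice pays bob 5"], prev_hash="0" * 64)
second = make_block(["bob pays carol 2"], prev_hash=genesis["hash"])
chain = [genesis, second]
print(chain_is_valid(chain))                       # True
genesis["transactions"] = ["alice pays bob 500"]   # tamper with history
print(chain_is_valid(chain))                       # False
```

Real cryptocurrencies add digital signatures, consensus rules, and controlled issuance on top of this hash-linking idea; the sketch only shows why cryptographic hashes make recorded transactions hard to alter silently.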

Cryptocurrencies are not always legal tender, but some countries have moved to regulate cryptocurrency-related services as they would financial institutions. Ecuador is the first country to attempt a government-run, cryptography-free digital currency; during the introductory phase, from Christmas Eve 2014 until mid-February 2015, people could open accounts and change passwords, and from the end of February 2015, transactions of electronic money were to become possible.

Estonia has been exploring various possibilities for blockchain technology, such as the use of crypto tokens within its “e-residency” program, which gives both Estonians and foreigners a digital form of identification.

The money matrix
(adapted from an ECB work, Virtual Currency Schemes)

The matrix classifies money by format (physical, or digital with and without cryptography) and by legal status (unregulated or regulated):

  • Unregulated, centralized: coupons and local currencies (physical); internet coupons and mobile coupons (digital, cryptography-free); centralized virtual currencies (digital, cryptography-based)
  • Unregulated, decentralized: physical commodity money (physical); digital currency such as Ripple and Stellar (digital, cryptography-free); decentralized cryptocurrencies (digital, cryptography-based)
  • Regulated: banknotes, coins, and cash (physical); e-money and commercial bank money (deposits) (digital)

Regulation

Virtual currencies pose challenges for central banks, financial regulators, departments or ministries of finance, and fiscal and statistical authorities. Gareth Murphy of the Central Bank of Ireland has described the regulatory challenges posed by virtual currencies.

United States of America

US Commodity Futures Trading Commission guidance

In 2015, the US Commodity Futures Trading Commission (CFTC) determined that virtual currencies are properly defined as commodities. The CFTC has warned investors against pump-and-dump schemes that use virtual currencies.

US Internal Revenue Service guidance

The US Internal Revenue Service (IRS) ruling Notice 2014-21 defines any virtual currency, cryptocurrency and digital currency as property; gains and losses are taxable within standard property policies.

US Treasury guidance

On 20 March 2013, the Financial Crimes Enforcement Network issued a guidance to clarify how the US Bank Secrecy Act applied to persons creating, exchanging and transmitting virtual currencies.

US Securities and Exchange Commission guidance

In May 2014 the US Securities and Exchange Commission (SEC) "warned about the hazards of bitcoin and other virtual currencies".

State Regulations

New York state regulation

In July 2014, the New York State Department of Financial Services proposed the most comprehensive regulation of virtual currencies to date, commonly referred to as BitLicense. Unlike the US federal regulators, it gathered input from bitcoin supporters and the financial industry through public hearings and a comment period that ran until October 21, 2014, to customize the rules. The proposal, per the NY DFS press release, "… sought to strike an appropriate balance that helps protect consumers and root out illegal activity". Smaller companies have criticized it for favoring established institutions, and Chinese bitcoin exchanges have complained that the rules are "overly broad in its application outside the United States".

European Union

European Central Bank guidance

In February 2015, the ECB concluded that "Virtual currency schemes, such as Bitcoin, are not full forms of money as usually defined in economic literature, nor are virtual currencies money or currency from a legal perspective. Nevertheless, Virtual currency may substitute [for] banknotes and coins, scriptural money and e-money in certain payment situations". In a May 2019 report, the ECB expressed concerns that "crypto assets provide opportunity for anonymous participation in illegal activities of all sorts".

Legal classification in the EU

In the European Union, a legal definition of cryptocurrency was introduced in the 5th Anti-Money Laundering Directive, to be broadly regarded as "a digital representation of value that can be digitally transferred, stored or traded and is accepted… as a medium of exchange". This also means that, within the European Union, cryptocurrencies and cryptocurrency exchanges are considered "obliged entities" subject to the European Union's Anti-Money Laundering Directives and face the same CFT/AML regulations. As of July 20, 2021, the European Commission has proposed replacing the previous Directive 2015/849/EU with provisions from the 6th Anti-Money Laundering Directive.

Virtual currencies are defined as "a digital representation of value that is not issued or guaranteed by a central bank or a public authority, is not necessarily attached to a legally established currency and does not possess a legal status of currency or money, but is accepted by natural or legal persons as a means of exchange and which can be transferred, stored and traded electronically".

The fact that European Union lawmakers regard Bitcoin as the archetypal example of virtual currencies and that Bitcoin therefore fulfills all elements of the legal definition can serve as an anchor point for interpretation. Basically, the definition consists of six elements:

  1. Virtual currencies are digital representations of value. Thus, digital assets must have a certain value in business transactions in order to be considered virtual currencies under EU law.
  2. Virtual currencies are not issued or guaranteed by a central bank or public authority. Issuing is the first placement of a digital asset in the market. Guaranteeing is the assumption of third-party or own liabilities. If digital assets are issued or guaranteed by a central bank or public body, they are not virtual currencies.
  3. Virtual currencies can be attached to a legal currency. Attachment is a legal or economic mechanism that links the value of the digital asset to a legal currency.
  4. Virtual currencies do not have the legal status of a currency or money. This depends on the status of a digital asset in the EU or a Member State.
  5. Virtual currencies are accepted by natural or legal persons as a means of exchange. This is the core element of the legal definition: The term "medium of exchange" is best understood in negative terms and requires that a digital asset is neither e-money as defined by the EU E-Money Directive, nor a payment service or payment instrument as defined by EU Payment Services Directive II, nor any other means of payment as defined by EU Capital Requirements Directive IV. The concept of acceptance requires a certain minimum of actual demand for the digital asset on the market to be considered a virtual currency.
  6. Virtual currencies can be transferred, stored and traded electronically. Only digital assets that can be transferred electronically to a person (transfer), whereby the owner also has the option of preventing transfers without his intervention (storage), fulfill this concept.

The authors of the legal definition under EU law primarily had blockchain technology in mind – and Bitcoin as an archetypal form. Against this background, it is remarkable that the definition does not contain any elements tailored to the use of a particular technology. On the contrary, the legal definition is strikingly technology neutral.

In a CNN interview, the financial crime expert Veit Buetterlin explained that the rise of the cryptocurrency market opened creative channels for terror groups to finance themselves.

Climate change scenario

From Wikipedia, the free encyclopedia
 
Global CO2 emissions and probabilistic temperature outcomes of different policies

Climate change scenarios or socioeconomic scenarios are projections of future greenhouse gas (GHG) emissions used by analysts to assess future vulnerability to climate change. Scientists create scenarios and pathways to survey possible long-term routes and explore the effectiveness of mitigation; they help us understand what the future may hold and allow us to envision the future of the human-environment system. Producing scenarios requires estimates of future population levels, economic activity, the structure of governance, social values, and patterns of technological change. Economic and energy modelling (such as the World3 or the POLES models) can be used to analyze and quantify the effects of such drivers.
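The drivers listed above (population, economic activity, technology) are often combined multiplicatively when emissions are quantified. The toy calculation below, in the style of the Kaya identity, uses entirely made-up numbers and is far simpler than models like World3 or POLES; it only shows how an emissions projection falls out of such estimates:

```python
def projected_emissions(population, gdp_per_capita, energy_intensity, carbon_intensity):
    """Kaya-style decomposition: CO2 = P * (GDP/P) * (E/GDP) * (CO2/E).

    Units assumed here: people, $ per person, MJ per $, tonnes CO2 per MJ.
    All input values below are illustrative, not real projections.
    """
    return population * gdp_per_capita * energy_intensity * carbon_intensity

# A made-up baseline: 8e9 people, $12k GDP per capita,
# 5 MJ per $ of GDP, 6e-5 tonnes CO2 per MJ.
baseline = projected_emissions(8e9, 12_000, 5.0, 6e-5)

# A mitigation scenario: same population and income, but energy
# intensity and carbon intensity each improve by 30%.
mitigated = projected_emissions(8e9, 12_000, 5.0 * 0.7, 6e-5 * 0.7)

print(f"baseline:  {baseline / 1e9:.1f} Gt CO2/yr")   # 28.8
print(f"mitigated: {mitigated / 1e9:.1f} Gt CO2/yr")  # 14.1
```

Because the factors multiply, improving either energy intensity or carbon intensity scales emissions proportionally, which is why scenario studies track these drivers separately.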

Scientists can develop separate international, regional and national climate change scenarios. These scenarios are designed to help stakeholders understand what kinds of decisions will have meaningful effects on climate change mitigation or adaptation. Most countries developing adaptation plans or Nationally Determined Contributions will commission scenario studies in order to better understand the decisions available to them.

International goals for mitigating climate change through international processes like the Intergovernmental Panel on Climate Change (IPCC), the Paris Agreement and Sustainable Development Goal 13 ("Take urgent action to combat climate change and its impacts") are based on reviews of these scenarios. For example, the Special Report on Global Warming of 1.5 °C was released in 2018 in order to reflect more up-to-date models of emissions, Nationally Determined Contributions, and impacts of climate change than its predecessor, the IPCC Fifth Assessment Report, published in 2014 before the Paris Agreement.

Emissions scenarios

Global futures scenarios

These scenarios can be thought of as stories of possible futures. They allow the description of factors that are difficult to quantify, such as governance, social structures, and institutions. Morita et al. assessed the literature on global futures scenarios. They found considerable variety among scenarios, ranging from variants of sustainable development, to the collapse of social, economic, and environmental systems. In the majority of studies, the following relationships were found:

  • Rising GHGs: This was associated with scenarios having a growing, post-industrial economy with globalization, mostly with low government intervention and generally high levels of competition. Income equality declined within nations, but there was no clear pattern in social equity or international income equality.
  • Falling GHGs: In some of these scenarios, GDP rose. Other scenarios showed economic activity limited at an ecologically sustainable level. Scenarios with falling emissions had a high level of government intervention in the economy. The majority of scenarios showed increased social equity and income equality within and among nations.

Morita et al. (2001) noted that these relationships were not proof of causation.

No strong patterns were found in the relationship between economic activity and GHG emissions. Economic growth was found to be compatible with increasing or decreasing GHG emissions. In the latter case, emissions growth is mediated by increased energy efficiency, shifts to non-fossil energy sources, and/or shifts to a post-industrial (service-based) economy.

Factors affecting emissions growth

Development trends

In producing scenarios, an important consideration is how social and economic development will progress in developing countries. If, for example, developing countries were to follow a development pathway similar to that of the current industrialized countries, it could lead to a very large increase in emissions. Emissions do not depend only on the growth rate of the economy. Other factors include structural changes in the production system, technological patterns in sectors such as energy, the geographical distribution of human settlements and urban structures (which affects, for example, transportation requirements), consumption patterns (e.g., housing patterns, leisure activities, etc.), and trade patterns: the degree of protectionism and the creation of regional trading blocs can affect access to technology.

Baseline scenarios

A baseline scenario is used as a reference for comparison against an alternative scenario, e.g., a mitigation scenario. In their assessment of the baseline scenario literature, Fisher et al. (2007) found that baseline CO2 emission projections covered a large range. In the United States, electric power plants emit about 2.4 billion tons of carbon dioxide (CO2) each year, roughly 40 percent of the nation's total emissions. The EPA has taken first steps by setting standards that will cut carbon pollution from automobiles and trucks nearly in half by 2025 and by proposing standards to limit carbon pollution from new power plants.

Factors affecting these emission projections are:

  • Population projections: All other factors being equal, lower population projections result in lower emissions projections.
  • Economic development: Economic activity is a dominant driver of energy demand and thus of GHG emissions.
  • Energy use: Future changes in energy systems are a fundamental determinant of future GHG emissions.
    • Energy intensity: This is the total primary energy supply (TPES) per unit of GDP. In all of the baseline scenarios assessments, energy intensity was projected to improve significantly over the 21st century. The uncertainty range in projected energy intensity was large (Fisher et al. 2007).
    • Carbon intensity: This is the CO2 emissions per unit of TPES. Compared with other scenarios, Fisher et al. (2007) found that the carbon intensity was more constant in scenarios where no climate policy had been assumed. The uncertainty range in projected carbon intensity was large. At the high end of the range, some scenarios contained the projection that energy technologies without CO2 emissions would become competitive without climate policy. These projections were based on the assumption of increasing fossil fuel prices and rapid technological progress in carbon-free technologies. Scenarios with a low improvement in carbon intensity coincided with scenarios that had a large fossil fuel base, less resistance to coal consumption, or lower technology development rates for fossil-free technologies.
  • Land-use change: Land-use change plays an important role in climate change, impacting on emissions, sequestration and albedo. One of the dominant drivers in land-use change is food demand. Population and economic growth are the most significant drivers of food demand.
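The drivers listed above (population, economic activity, energy intensity, carbon intensity) are often combined in the Kaya identity, which decomposes CO2 emissions into four multiplicative factors. A minimal sketch in Python, using round, roughly world-scale illustrative numbers rather than values from any specific scenario:

```python
# Kaya identity: CO2 = population * (GDP/person) * (energy/GDP) * (CO2/energy)
def kaya_emissions(population, gdp_per_capita, energy_intensity, carbon_intensity):
    """Annual CO2 emissions (t CO2/yr) from the four Kaya factors.

    population       : people
    gdp_per_capita   : $ per person per year
    energy_intensity : MJ of primary energy per $ of GDP (TPES/GDP)
    carbon_intensity : t CO2 per MJ of primary energy (CO2/TPES)
    """
    return population * gdp_per_capita * energy_intensity * carbon_intensity

# Illustrative round numbers (assumptions, not scenario data):
baseline = kaya_emissions(8e9, 12_000, 5.0, 7e-5)    # ~33.6 Gt CO2/yr
# A scenario that halves both energy intensity and carbon intensity cuts
# emissions substantially even while population and income keep growing:
mitigated = kaya_emissions(9e9, 15_000, 2.5, 3.5e-5)
print(baseline / 1e9, mitigated / 1e9)  # Gt CO2/yr
```

This is why the baseline-scenario literature treats improvements in energy and carbon intensity as the main counterweights to population and GDP growth.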

Quantitative emissions projections

A wide range of quantitative projections of greenhouse gas emissions has been produced. The "SRES" scenarios are "baseline" emissions scenarios (i.e., they assume that no future efforts are made to limit emissions) and have been frequently used in the scientific literature (see Special Report on Emissions Scenarios for details). The Greenhouse gas article summarizes projections out to 2030, as assessed by Rogner et al. Other studies are presented here.

Individual studies

In the reference scenario of World Energy Outlook 2004, the International Energy Agency projected future energy-related CO2 emissions. Emissions were projected to increase by 62% between the years 2002 and 2030. This lies between the SRES A1 and B2 scenario estimates of +101% and +55%, respectively. As part of the IPCC Fourth Assessment Report, Sims et al. (2007) compared several baseline and mitigation scenarios out to the year 2030. The baseline scenarios included the reference scenario of IEA's World Energy Outlook 2006 (WEO 2006), SRES A1, SRES B2, and the ABARE reference scenario. Mitigation scenarios included the WEO 2006 Alternative policy, ABARE Global Technology and ABARE Global Technology + CCS. Projected total energy-related emissions in 2030 (measured in GtCO2-eq) were 40.4 for the IEA WEO 2006 reference scenario, 58.3 for the ABARE reference scenario, 52.6 for the SRES A1 scenario, and 37.5 for the SRES B2 scenario. Emissions for the mitigation scenarios were 34.1 for the IEA WEO 2006 Alternative Policy scenario, 51.7 for the ABARE Global Technology scenario, and 49.5 for the ABARE Global Technology + CCS scenario.

Garnaut et al. (2008) made a projection of fossil-fuel CO2 emissions for the period 2005-2030. Their "business-as-usual" annual projected growth rate was 3.1% for this period. This compares to 2.5% for the fossil-fuel intensive SRES A1FI emissions scenario, 2.0% for the SRES median scenario (defined by Garnaut et al. (2008) as the median for each variable and each decade of the four SRES marker scenarios), and 1.6% for the SRES B1 scenario. Garnaut et al. (2008) also cited projections over the same period from the US Climate Change Science Program (2.7% max, 2.0% mean), the International Monetary Fund's 2007 World Economic Outlook (2.5%), the Energy Modelling Forum (2.4% max, 1.7% mean), the US Energy Information Administration (2.2% high, 1.8% medium, 1.4% low), the IEA's World Energy Outlook 2007 (2.1% high, 1.8% base case), and the base case of the Nordhaus model (1.3%).
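The growth rates above compound over the 25-year window, so seemingly small differences in annual rates imply very different 2030 emission levels. A quick illustration, using a hypothetical round base level (the rates are from the text, the 30 Gt starting point is an assumption for illustration only):

```python
def emissions_in_2030(base_2005, annual_growth_pct, years=25):
    """Project emissions by compounding a constant annual growth rate."""
    return base_2005 * (1 + annual_growth_pct / 100) ** years

base = 30.0  # Gt CO2 in 2005 (illustrative round number)
for label, rate in [("Garnaut BAU", 3.1), ("SRES A1FI", 2.5),
                    ("SRES median", 2.0), ("SRES B1", 1.6)]:
    print(f"{label}: {emissions_in_2030(base, rate):.1f} Gt CO2 in 2030")
```

At 3.1% per year, emissions more than double over the period, while at 1.6% they grow by roughly half, illustrating why the choice among these scenarios matters so much for 2030 outcomes.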

The central scenario of the International Energy Agency publication World Energy Outlook 2011 projects a continued increase in global energy-related CO2 emissions, with emissions reaching 36.4 Gt in the year 2035. This is a 20% increase in emissions relative to the 2010 level.

UNEP 2011 synthesis report

The United Nations Environment Programme (UNEP, 2011) looked at how world emissions might develop out to the year 2020 depending on different policy decisions. To produce their report, UNEP (2011) convened 55 scientists and experts from 28 scientific groups across 15 countries.

Projections assuming no new efforts to reduce emissions (the "business-as-usual" trend) suggested global emissions in 2020 of 56 gigatonnes CO2-equivalent (GtCO2-eq), with a range of 55-59 GtCO2-eq. Under a different baseline, in which the pledges to the Copenhagen Accord were met in their most ambitious form, projected global emissions by 2020 would still reach 50 GtCO2-eq. Continuing on the current trend, particularly in its low-ambition form, is expected to produce a 3 °C temperature increase by the end of the century, which is estimated to bring severe environmental, economic, and social consequences. For instance, warmer air temperatures and the resulting evapotranspiration can lead to larger thunderstorms and greater risk of flash flooding.

Other projections considered the effect on emissions of policies put forward by UNFCCC Parties to address climate change. More stringent efforts to limit emissions were projected to lead to global emissions in 2020 of between 49 and 52 GtCO2-eq, with a median estimate of 51 GtCO2-eq. Less stringent efforts were projected to lead to global emissions in 2020 of between 53 and 57 GtCO2-eq, with a median estimate of 55 GtCO2-eq.

National climate (change) projections

National climate (change) projections (also termed "national climate scenarios" or "national climate assessments") are specialized regional climate projections, typically produced for and by individual countries. What distinguishes national climate projections from other climate projections is that they are officially signed off by the national government, thereby being the relevant national basis for adaptation planning. Climate projections are commonly produced over several years by countries' national meteorological services or academic institutions working on climate change.

Typically distributed as a single product, climate projections condense information from multiple climate models, using multiple greenhouse gas emission pathways (e.g. Representative Concentration Pathways) to characterize different yet coherent climate futures. Such a product highlights plausible climatic changes through the use of narratives, graphs, maps, and perhaps raw data. Climate projections are often publicly available for policy-makers, public and private decision makers, as well as researchers to undertake further climate impact studies, risk assessments, and climate change adaptation research. The projections are updated every few years, in order to incorporate new scientific insights and improved climate models.

Aims

National climate projections illustrate plausible changes to a country's climate in the future. By using multiple emission scenarios, these projections highlight the impact different global mitigation efforts have on variables, including temperature, precipitation, and sunshine hours. Climate scientists strongly recommend the use of multiple emission scenarios in order to ensure that decisions are robust to a range of climatic changes. National climate projections form the basis of national climate adaptation and climate resilience plans, which are reported to UNFCCC and used in IPCC assessments.

Design

To explore a wide range of plausible climatic outcomes and to enhance confidence in the projections, national climate change projections are often generated from multiple general circulation models (GCMs). Such climate ensembles can take the form of perturbed physics ensembles (PPE), multi-model ensembles (MME), or initial condition ensembles (ICE). As the spatial resolution of the underlying GCMs is typically quite coarse, the projections are often downscaled, either dynamically using regional climate models (RCMs), or statistically. Some projections include data from areas which are larger than the national boundaries, e.g. to more fully evaluate catchment areas of transboundary rivers. Some countries have also produced more localized projections for smaller administrative areas, e.g. States in the United States, and Länder in Germany.
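One common statistical downscaling approach (not specific to any national product) is the "delta change" method: the change the coarse GCM projects between its historical and future periods is applied to a finer-resolution observed climatology, so that the GCM's local biases cancel out. A minimal sketch with made-up numbers:

```python
def delta_change(obs_climatology, gcm_hist, gcm_future):
    """Delta-change downscaling: apply the coarse GCM anomaly to local observations.

    obs_climatology : observed local monthly mean temperatures (deg C)
    gcm_hist        : GCM monthly means over the historical period (deg C)
    gcm_future      : GCM monthly means over the future period (deg C)
    """
    return [obs + (fut - hist)
            for obs, hist, fut in zip(obs_climatology, gcm_hist, gcm_future)]

# Hypothetical station climatology (three months shown) and coarse GCM output:
obs = [2.0, 5.5, 10.1]
hist = [1.0, 4.0, 9.0]     # the GCM runs cold at this location
future = [3.0, 6.5, 11.0]  # but projects +2.0 / +2.5 / +2.0 deg C of warming
print(delta_change(obs, hist, future))  # the bias cancels; only the change is used
```

Dynamical downscaling with a regional climate model serves the same purpose but resolves local physics explicitly instead of transferring anomalies.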

Various countries have produced their national climate projections with feedback from and/or interaction with stakeholders. Such engagement efforts have helped tailor the climate information to stakeholders' needs, including the provision of sector-specific climate indicators such as heating degree days. In the past, engagement formats have included surveys, interviews, presentations, workshops, and use cases. Such interactions have not only enhanced the usability of the climate information but also fostered discussions on how to use climate information in adaptation projects. Interestingly, a comparison of the British, Dutch, and Swiss climate projections revealed distinct national preferences in the way stakeholders were engaged, as well as in how the climate model outputs were condensed and communicated.

Examples

Over 30 countries have reported national climate projections or scenarios in their most recent National Communications to the United Nations Framework Convention on Climate Change. Many European governments have also funded national information portals on climate change.

For countries which lack adequate resources to develop their own climate change projections, organisations such as UNDP or FAO have sponsored development of projections and national adaptation programmes (NAPAs).

Applications

National climate projections are widely used to predict climate change impacts in a wide range of economic sectors, and also to inform climate change adaptation studies and decisions. Some examples include:

Comparisons

A detailed comparison of some national climate projections has been carried out.

Global long-term scenarios

In 2021, researchers argued that projecting the effects of greenhouse gas emissions only up to 2100, as is widely practiced in research and policy-making, is short-sighted, and modeled RCP climate change scenarios and their effects out to 2500.

Projections for crop suitability to 2100 and 2500 under the moderate–high RCP6.0 emission scenario
 
Global mean near-surface air temperature and thermosteric sea-level rise anomalies relative to the 2000–2019 mean for RCP scenarios
 
Mean number of months per year where heat stress exceeds 38°C (UTCI scale) in present (2020) and future climates

Reducing greenhouse gas emissions will require several major transitions: a massive reduction in fossil fuel use, the production and distribution of low-emission energy sources, a shift to other energy carriers, and, perhaps most important, energy conservation and efficiency. If fossil fuels continue to be burned and their emissions vented to the atmosphere, GHG emissions will be very hard to reduce.

Mitigation scenarios

Scenarios of global greenhouse gas emissions. If all countries achieve their current Paris Agreement pledges, average warming by 2100 will go far beyond the target of the Paris Agreement to keep warming "well below 2°C".

Climate change mitigation scenarios are possible futures in which global warming is reduced by deliberate actions, such as a comprehensive switch to energy sources other than fossil fuels. These actions minimize emissions so that atmospheric greenhouse gas concentrations are stabilized at levels that restrict the adverse consequences of climate change. Using these scenarios, the impacts of different carbon prices on an economy can be examined within the framework of different levels of global ambition.

A typical mitigation scenario is constructed by selecting a long-range target, such as a desired atmospheric concentration of carbon dioxide (CO2), and then fitting the actions to the target, for example by placing a cap on net global and national emissions of greenhouse gases.
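The "fit actions to the target" step can be made concrete with a back-of-the-envelope carbon budget: a chosen concentration target implies roughly how much more CO2 can be emitted, given that about 1 ppm of atmospheric CO2 corresponds to roughly 7.8 Gt of CO2 and that only part of emissions (the airborne fraction, historically around 45%) remains in the atmosphere. All numbers here are rough illustrative assumptions, not scenario outputs:

```python
GT_CO2_PER_PPM = 7.8      # approx. mass of CO2 per ppm of atmospheric concentration
AIRBORNE_FRACTION = 0.45  # approx. share of emitted CO2 that stays airborne

def remaining_budget(current_ppm, target_ppm):
    """Very rough cumulative CO2 emissions (Gt) compatible with a ppm target."""
    return (target_ppm - current_ppm) * GT_CO2_PER_PPM / AIRBORNE_FRACTION

# e.g. stabilizing at 450 ppm starting from roughly 415 ppm:
print(f"{remaining_budget(415, 450):.0f} Gt CO2")  # about 15 years at ~40 Gt/yr
```

A cap on net emissions, as described above, is then a way of rationing such a budget over time and across countries.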

An increase in global temperature of more than 2 °C has come to be the most widely accepted definition of what would constitute intolerably dangerous climate change, with the Paris Agreement seeking to limit the temperature increase to 1.5 °C above pre-industrial levels. Some climate scientists are increasingly of the opinion that the goal should be a complete restoration of the atmosphere's preindustrial condition, on the grounds that too protracted a deviation from those conditions will produce irreversible changes.

Stabilization wedges

A stabilization wedge (or simply "wedge") is an action which incrementally reduces projected emissions. The name is derived from the triangular shape of the gap between the reduced and unreduced emissions trajectories when graphed over time. For example, a reduction in electricity demand due to increased efficiency means that less electricity needs to be generated, and thus fewer emissions are produced. The term originates in the Stabilization Wedge Game. As a reference unit, a stabilization wedge is equal to each of the following mitigation initiatives: deploying two hundred thousand 10 MW wind turbines; completely halting deforestation and planting 300 million hectares of trees; increasing the average energy efficiency of all the world's buildings by 25 percent; or installing carbon capture and storage facilities at 800 large coal-fired power plants. Pacala and Socolow proposed in their work, Stabilization Wedges, that seven wedges delivered by 2050, using current technologies, would make a significant impact on the mitigation of climate change. There are, however, sources that estimate the need for 14 wedges, because Pacala and Socolow's proposal would only stabilize carbon dioxide emissions at current levels, not the atmospheric concentration, which is increasing by more than 2 ppm/year. In 2011, Socolow revised the earlier estimate to nine wedges.
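Pacala and Socolow size a wedge at roughly 1 Gt of carbon avoided per year by 2050. A back-of-the-envelope check of the wind-turbine example above, in which the capacity factor and the emissions intensity of the displaced generation are illustrative assumptions rather than sourced figures:

```python
# Rough check that 200,000 ten-megawatt wind turbines are wedge-scale.
TURBINES = 200_000
MW_EACH = 10
CAPACITY_FACTOR = 0.25   # assumed average output / nameplate capacity
KG_CO2_PER_KWH = 0.8     # assumed intensity of the displaced fossil generation

kwh_per_year = TURBINES * MW_EACH * 1_000 * 8_760 * CAPACITY_FACTOR
gt_co2_avoided = kwh_per_year * KG_CO2_PER_KWH / 1e12  # kg -> Gt
gt_carbon = gt_co2_avoided * 12 / 44                   # Gt CO2 -> Gt C

print(f"{gt_co2_avoided:.1f} Gt CO2/yr avoided (~{gt_carbon:.1f} Gt C/yr)")
```

Under these assumptions the fleet avoids on the order of 1 Gt C per year, consistent with counting it as a single wedge; different capacity factors or displaced fuels shift the result accordingly.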

Target levels of CO2

Contributions to climate change, whether they cool or warm the Earth, are often described in terms of the radiative forcing or imbalance they introduce to the planet's energy budget. Now and in the future, anthropogenic carbon dioxide is believed to be the major component of this forcing, and the contribution of other components is often quantified in terms of "parts-per-million carbon dioxide equivalent" (ppm CO2e), or the increment/decrement in carbon dioxide concentrations which would create a radiative forcing of the same magnitude.
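For CO2 itself, the radiative forcing of a concentration change is often approximated by the simplified expression ΔF ≈ 5.35 ln(C/C0) W/m², which also underlies the CO2-equivalent conversion described above. A small illustration (the 278 ppm preindustrial baseline is a commonly used round value):

```python
import math

def co2_forcing(c_ppm, c0_ppm=278.0):
    """Approximate radiative forcing (W/m^2) of CO2 at c_ppm relative to a
    baseline c0_ppm, using the simplified expression 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

print(f"{co2_forcing(450):.2f} W/m^2")  # forcing at 450 ppm
print(f"{co2_forcing(556):.2f} W/m^2")  # forcing at a doubling of CO2
```

Because the forcing is logarithmic, each doubling of concentration adds the same increment (about 3.7 W/m²), which is why targets are often discussed in terms of doublings of the preindustrial level.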

450 ppm

The BLUE scenarios in the IEA's Energy Technology Perspectives publication of 2008 describe pathways to a long-range concentration of 450 ppm. Joseph Romm has sketched how to achieve this target through the application of 14 wedges.

The IEA's World Energy Outlook 2008 also describes a "450 Policy Scenario", in which extra energy investments to 2030 amount to $9.3 trillion over the Reference Scenario. The scenario also features, after 2020, the participation of major economies such as China and India in a global cap-and-trade scheme initially operating in OECD and European Union countries. The less conservative 450 ppm scenario also calls for extensive deployment of negative emissions, i.e. the removal of CO2 from the atmosphere. According to the International Energy Agency (IEA) and OECD, "Achieving lower concentration targets (450 ppm) depends significantly on the use of BECCS".

550 ppm

This is the target advocated (as an upper bound) in the Stern Review. As approximately a doubling of CO2 levels relative to preindustrial times, it implies a temperature increase of about three degrees, according to conventional estimates of climate sensitivity. Pacala and Socolow list 15 "wedges", any 7 of which in combination should suffice to keep CO2 levels below 550 ppm.
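The "about three degrees" figure follows from applying an equilibrium climate sensitivity of roughly 3 °C per CO2 doubling to the 550 ppm target: since forcing is logarithmic in concentration, equilibrium warming scales with the base-2 logarithm of the concentration ratio. A quick check, with the 278 ppm preindustrial baseline and the sensitivity value taken as assumptions:

```python
import math

def equilibrium_warming(c_ppm, sensitivity_per_doubling=3.0, c0_ppm=278.0):
    """Equilibrium warming (deg C) for a CO2 concentration, assuming warming is
    proportional to the number of doublings above the preindustrial baseline."""
    return sensitivity_per_doubling * math.log2(c_ppm / c0_ppm)

print(f"{equilibrium_warming(550):.1f} deg C")  # roughly the three degrees cited
```

A higher or lower sensitivity value scales the answer proportionally, which is one reason the temperature implied by any given concentration target remains uncertain.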

The International Energy Agency's World Energy Outlook report for 2008 describes a "Reference Scenario" for the world's energy future "which assumes no new government policies beyond those already adopted by mid-2008", and then a "550 Policy Scenario" in which further policies are adopted, a mixture of "cap-and-trade systems, sectoral agreements and national measures". In the Reference Scenario, between 2006 and 2030 the world invests $26.3 trillion in energy-supply infrastructure; in the 550 Policy Scenario, a further $4.1 trillion is spent in this period, mostly on efficiency increases which deliver fuel cost savings of over $7 trillion.

Other greenhouse gases

Greenhouse gas concentrations are aggregated in terms of carbon dioxide equivalent. Some multi-gas mitigation scenarios have been modeled by Meinshausen et al.

As a short-term focus

In a 2000 paper, Hansen argued that the 0.75 °C rise in average global temperatures over the last 100 years had been driven mainly by greenhouse gases other than carbon dioxide, since warming due to CO2 had been offset by cooling due to aerosols. This implies the viability of a strategy initially based on reducing emissions of non-CO2 greenhouse gases and of black carbon, focusing on CO2 only in the longer run.

This was also argued by Veerabhadran Ramanathan and Jessica Seddon Wallack in the September/October 2009 issue of Foreign Affairs.

Online machine learning

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Online_machine_learning In computer sci...