Saturday, November 5, 2022

Rain garden

From Wikipedia, the free encyclopedia
 
A rain garden during the winter

Rain gardens, also called bioretention facilities, are one of a variety of practices designed to increase rain runoff reabsorption by the soil. They can also be used to treat polluted stormwater runoff. Rain gardens are designed landscape sites that reduce the flow rate, total quantity, and pollutant load of runoff from impervious urban areas like roofs, driveways, walkways, parking lots, and compacted lawn areas. Rain gardens rely on plants and natural or engineered soil medium to retain stormwater and increase the lag time of infiltration, while remediating and filtering pollutants carried by urban runoff. Rain gardens provide a method to reuse and optimize any rain that falls, reducing or avoiding the need for additional irrigation. A benefit of planting rain gardens is the consequential decrease in ambient air and water temperature, a mitigation that is especially effective in urban areas containing an abundance of impervious surfaces that absorb heat in a phenomenon known as the heat-island effect.

Rain garden plantings commonly include wetland edge vegetation, such as wildflowers, sedges, rushes, ferns, shrubs and small trees. These plants take up nutrients and water that flow into the rain garden, and they release water vapor back to the atmosphere through the process of transpiration. Deep plant roots also create additional channels for stormwater to filter into the ground. Root systems enhance infiltration, maintain or even augment soil permeability, provide moisture redistribution, and sustain diverse microbial populations involved in biofiltration. Microbes help to break down organic compounds (including some pollutants) and remove nitrogen.

Rain gardens are beneficial for many reasons: they improve water quality by filtering runoff, provide localized flood control, create aesthetic landscaping sites, and offer diverse planting opportunities. They also encourage wildlife and biodiversity, and tie together buildings and their surrounding environments in integrated and environmentally advantageous ways. Rain gardens can improve water quality in nearby bodies of water and recharge depleted groundwater supplies. They also reduce the amount of polluted runoff entering the storm sewer system, which discharges directly to surface waters and causes erosion, water pollution and flooding. Finally, rain gardens reduce energy consumption by decreasing the load on conventional stormwater infrastructure.

History

The first rain gardens were created to mimic the natural water retention areas that developed before urbanization. Rain gardens for residential use were developed in 1990 in Prince George's County, Maryland, when Dick Brinker, a developer building a new housing subdivision, had the idea to replace the traditional best management practice (BMP) pond with a bioretention area. He approached Larry Coffman, an environmental engineer and the county's Associate Director for Programs and Planning in the Department of Environmental Resources, with the idea. The result was the extensive use of rain gardens in Somerset, a residential subdivision with a 300–400 sq ft (28–37 m2) rain garden on each house's property. This system proved to be highly cost-effective: instead of a system of curbs, sidewalks, and gutters, which would have cost nearly $400,000, the planted drainage swales cost $100,000 to install. This was also much more cost-effective than building BMP ponds sized for 2-, 10-, and 100-year storm events. Flow monitoring in later years showed that the rain gardens reduced stormwater runoff by 75–80% during a regular rainfall event.

Some de facto rain gardens predate their recognition by professionals as a significant LID (Low Impact Development) tool. Any shallow garden depression implemented to capture and filter rain water within the garden so as to avoid draining water offsite is at conception a rain garden—particularly if vegetation is planted and maintained with recognition of its role in this function. Vegetated roadside swales, now promoted as “bioswales”, remain the conventional runoff drainage system in many parts of the world from long before extensive networks of concrete sewers became the conventional engineering practice in the industrialized world. What is new about such technology is the emerging rigor of increasingly quantitative understanding of how such tools may make sustainable development possible. This is as true for developed communities retrofitting bioretention into existing stormwater management infrastructure as it is for developing communities seeking a faster and more sustainable development path.

Urban runoff mitigation

Effects of urban runoff

In developed urban areas, naturally occurring depressions where storm water would pool are typically covered by impermeable surfaces, such as asphalt, pavement, or concrete, and are leveled for automobile use. Stormwater is directed into storm drains which may cause overflows of combined sewer systems or pollution, erosion, or flooding of waterways receiving the storm water runoff. Redirected stormwater is often warmer than the groundwater normally feeding a stream, and has been linked to upset in some aquatic ecosystems primarily through the reduction of dissolved oxygen (DO). Stormwater runoff is also a source of a wide variety of pollutants washed off hard or compacted surfaces during rain events. These pollutants may include volatile organic compounds, pesticides, herbicides, hydrocarbons and trace metals.

Stormwater management systems

Stormwater management occurs on a watershed scale to prevent downstream impacts on urban water quality. A watershed is maintained through the cyclical accumulation, storage, and flow of groundwater. Naturally occurring watersheds are damaged when they are sealed by an impervious surface, which diverts pollutant-carrying stormwater runoff into streams. Urban watersheds are affected by greater quantities of pollutants due to the consequences of anthropogenic activities within urban environments. Rainfall on impermeable surfaces accumulates surface runoff containing oil, bacteria, and sediment that eventually makes its way to streams and groundwater. Stormwater control strategies such as infiltration gardens treat contaminated surface runoff and return processed water to the underlying soil, helping to restore the watershed system. The effectiveness of stormwater control systems is measured by the reduction of the amount of rainfall that becomes runoff (retention), and the lag time (rate of depletion) of the runoff. Even rain gardens with small capacities for daily infiltration can create a positive cumulative impact on mitigating urban runoff. Increasing the number of permeable surfaces by designing rain gardens reduces the amount of polluted stormwater that reaches natural bodies of water and recharges groundwater at a higher rate. Additionally, adding a rain garden to a site that experiences excessive rainwater runoff mitigates the water quantity load on public stormwater systems.

The bioretention approach to water treatment, and specifically rain gardens in this context, is two-fold: to utilize the natural processes within landscapes and soils to transport, store, and filter stormwater before it becomes runoff, and to reduce the overall amount of impervious surface that generates contaminated urban runoff. Rain gardens perform most effectively when they interact with the greater system of stormwater control. This integrated approach to water treatment is called the "stormwater chain", which consists of all associated techniques to prevent surface runoff, retain runoff for infiltration or evaporation, detain runoff and release it at a predetermined rate, and convey rainfall from where it lands to detention or retention facilities. Rain gardens have many reverberating effects on the greater hydrological system. In a bioretention system such as a rain garden, water filters through layers of soil and vegetation media, which treat the water before it enters the groundwater system or an underdrain. Any remaining runoff from a rain garden will have a lower temperature than runoff from an impervious surface, which reduces the thermal shock on receiving bodies of water.

Bioretention

The concept of LID (low-impact development) for stormwater management is based on bioretention: a landscape and water design practice that utilizes the chemical, biological, and physical properties of soils, microorganisms, and plants to control the quality and quantity of water flow within a site. Bioretention facilities are primarily designed for water management and can treat urban runoff, stormwater, groundwater, and, in special cases, wastewater. Carefully designed constructed wetlands are necessary for the bioretention of sewage water or grey water, which pose greater risks to human health than urban runoff and rainfall. Environmental benefits of bioretention sites include increased wildlife diversity and habitat production, and minimized energy use and pollution. Prioritizing water management through natural bioretention sites keeps that land from being covered with impermeable surfaces.

Water treatment process

Bioretention controls stormwater quantity through interception, infiltration, evaporation, and transpiration. First, rainfall is captured by plant tissue (leaves and stems) and in the soil micropores. Water then infiltrates (moves downward through the soil) and is stored in the soil until the substrate reaches its moisture capacity, at which point it begins to pool at the top of the bioretention feature. The pooled water, along with water on plant and soil surfaces, then evaporates into the atmosphere. Optimal design of bioretention sites aims for shallow pooled water, which evaporates at a higher rate. Water is also released through the leaves of plants by transpiration; together, evaporation and transpiration are known as evapotranspiration.
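The sequence above (capture, storage up to moisture capacity, surface pooling, then losses to infiltration and evapotranspiration) can be sketched as a simple one-bucket water balance. This is an illustrative toy model, not a design tool; all capacities and rates below are hypothetical values chosen for the example.

```python
# Minimal bucket model of the bioretention water balance described above.
# capacity_mm, infiltration_mm, and et_mm are illustrative assumptions.

def simulate_step(storage_mm, inflow_mm, capacity_mm=150.0,
                  infiltration_mm=10.0, et_mm=2.0):
    """Advance the rain garden water balance by one step (e.g. one day).

    storage_mm      -- water currently held in the soil column
    inflow_mm       -- rain plus run-on arriving this step
    capacity_mm     -- moisture capacity of the substrate
    infiltration_mm -- drainage to the native soil per step
    et_mm           -- evapotranspiration loss per step
    Returns (new_storage_mm, pooled_overflow_mm).
    """
    storage = storage_mm + inflow_mm
    pooled = max(0.0, storage - capacity_mm)   # excess pools at the surface
    storage = min(storage, capacity_mm)
    # losses: downward infiltration plus evapotranspiration
    storage = max(0.0, storage - infiltration_mm - et_mm)
    return storage, pooled

# A hypothetical storm series: the garden only overflows on the largest event.
storage, total_pooled = 0.0, 0.0
for rain in [40.0, 0.0, 120.0, 60.0, 0.0]:
    storage, overflow = simulate_step(storage, rain)
    total_pooled += overflow
```

The model captures the qualitative behaviour the text describes: moderate storms are absorbed entirely, while back-to-back large storms exceed the substrate's moisture capacity and produce surface pooling.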

Stormwater quality can be controlled by bioretention through settling, filtration, assimilation, adsorption, degradation, and decomposition. When water pools on top of a bioretention feature, suspended solids and large particles settle out. Dust particles, soil particles, and other small debris are filtered out of the water as it moves downward through the soil and interspersed plant roots. Plants take up some of the nutrients for use in their growth processes or for mineral storage. Dissolved chemical substances also bind to the surfaces of plant roots, soil particles, and other organic matter in the substrate, where they are rendered ineffective. Soil microorganisms break down the remaining chemicals and small organic matter, effectively decomposing pollutants within the soil.

Even though natural water purification is based on the design of planted areas, the key components of bioremediation are the soil quality and microorganism activity. These features are supported by plants, which create secondary pore space to increase soil permeability, prevent soil compaction through complex root structure growth, provide habitats for the microorganisms on the surfaces of their roots, and transport oxygen to the soil.

Design

A recently planted home rain garden

Stormwater garden design encompasses a wide range of features based on the principles of bioretention. These facilities are then organized into a sequence and incorporated into the landscape in the order that rainfall moves from buildings and permeable surfaces to gardens and, eventually, to bodies of water. A rain garden requires an area where water can collect and infiltrate, and plants can maintain infiltration rates, diverse microorganism communities, and water storage capacity. Because infiltration systems manage stormwater quantity by reducing runoff volumes and peak flows, rain garden design must begin with a site analysis and an assessment of the rainfall loads on the proposed bioretention system. This analysis yields site-specific knowledge, which informs the choice of plantings and substrate systems. At a minimum, rain gardens should be designed for the peak runoff rate during the most severe expected storm. The load applied on the system then determines the optimal design flow rate.
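One common first-pass way to estimate the peak runoff rate mentioned above is the rational method, Q = C·i·A. This is a generic hydrology formula, not a procedure prescribed by this article; the runoff coefficient and storm intensity below are illustrative assumptions that a real site analysis would replace with measured values.

```python
# Rational method sketch for estimating peak runoff (Q = C * i * A).
# Coefficients and the example inputs are illustrative assumptions.

def peak_runoff_cfs(c, intensity_in_hr, area_acres):
    """Estimated peak runoff in cubic feet per second.

    c               -- runoff coefficient (near 0 for pervious ground,
                       roughly 0.9 for roofs and pavement)
    intensity_in_hr -- design storm rainfall intensity, inches/hour
    area_acres      -- contributing drainage area, acres
    With these units the conversion factor is ~1.008 and is
    conventionally dropped.
    """
    return c * intensity_in_hr * area_acres

# e.g. a 0.25-acre lot, partly impervious (c ~= 0.6), 1.5 in/hr design storm:
q = peak_runoff_cfs(0.6, 1.5, 0.25)   # 0.225 cfs to size the garden for
```

A designer would compute this for the most severe expected storm and size the garden's inlet, ponding area, and substrate depth to handle that design flow.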

Existing gardens can be adapted to perform like rain gardens by adjusting the landscape so that downspouts and paved surfaces drain into existing planting areas. Even though existing gardens have loose soil and well-established plants, they may need to be augmented in size and/or with additional, diverse plantings to support a higher infiltration capacity. Also, many plants do not tolerate saturated roots for long and will not be able to handle the increased flow of water. Rain garden plant species should be selected to match the site conditions after the required location and storage capacity of the bioretention area are determined. In addition to mitigating urban runoff, the rain garden may contribute to urban habitats for native butterflies, birds, and beneficial insects.

Rain gardens are at times confused with bioswales. Swales slope to a destination, while rain gardens are level; however, a bioswale may end with a rain garden as part of a larger stormwater management system. Drainage ditches may be handled like bioswales and even include rain gardens in series, saving time and money on maintenance. A part of a garden that nearly always has standing water is a water garden, wetland, or pond, not a rain garden. Rain gardens also differ from retention basins, where water infiltrates the ground at a much slower rate; a rain garden is designed to drain within a day or two.

Soil and drainage

Collected water is filtered through the strata of soil or engineered growing soil, called substrate. After the soil reaches its saturation limit, excess water pools on the surface of the soil and eventually infiltrates the natural soil below. The bioretention soil mixture should typically contain 60% sand, 20% compost, and 20% topsoil. Soils with higher concentrations of compost have shown improved effects on filtering groundwater and rainwater. If non-permeable soil is used in the bioretention system, it needs to be removed and replaced periodically to maintain maximum performance and efficiency. The sandy bioretention mixture cannot be combined with a surrounding soil that has a lower sand content, because the clay particles will settle in between the sand particles and form a concrete-like substance that is not conducive to infiltration, according to a 1983 study. Compacted lawn soil cannot harbor groundwater nearly as well as sandy soils, because its micropores are not sufficient for retaining substantial runoff.
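The 60/20/20 mixture above translates directly into component volumes once the bed is dimensioned. A short sketch (the garden dimensions are hypothetical examples, not recommendations):

```python
# Component volumes for the 60% sand / 20% compost / 20% topsoil
# bioretention mix described above. Dimensions are example values.

MIX_FRACTIONS = {"sand": 0.60, "compost": 0.20, "topsoil": 0.20}

def mix_volumes(length_m, width_m, depth_m, fractions=MIX_FRACTIONS):
    """Return the volume of each soil component in cubic metres."""
    total = length_m * width_m * depth_m
    return {name: round(total * f, 3) for name, f in fractions.items()}

# A 4 m x 3 m garden excavated 0.6 m deep needs 7.2 m^3 of mix in total:
volumes = mix_volumes(4.0, 3.0, 0.6)
# -> sand 4.32 m^3, compost 1.44 m^3, topsoil 1.44 m^3
```

Working the volumes out up front also makes it easy to check that the delivered materials actually match the specified ratio.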

When an area's soils are not permeable enough to allow water to drain and filter at an appropriate rate, the soil should be replaced and an underdrain installed. Sometimes a drywell with a series of gravel layers near the lowest spot in the rain garden will help facilitate percolation and avoid clogging at the sedimentation basin. However, a drywell placed at the lowest spot can become clogged with silt prematurely, turning the garden into an infiltration basin and defeating its purpose as a bioretention system. The more polluted the runoff water, the longer it must be retained in the soil for purification. Capacity for a longer purification period is often achieved by installing several smaller rain garden basins with soil deeper than the seasonal high water table. In some cases lined bioretention cells with subsurface drainage are used to retain smaller amounts of water and filter larger amounts without letting water percolate as quickly. A five-year study by the U.S. Geological Survey indicates that rain gardens in urban clay soils can be effective without the use of underdrains or replacement of native soils with the bioretention mix, but it also indicates that pre-installation infiltration rates should be at least 0.25 in/hour. Type D soils require an underdrain paired with the sandy soil mix in order to drain properly.

Rain gardens are often located near a building's roof drainpipe (with or without rainwater tanks). Most rain gardens are designed to be an endpoint of a building's or urban site's drainage system, with a capacity to percolate all incoming water through a series of soil or gravel layers beneath the surface plantings. A French drain may be used to direct a portion of the rainwater to an overflow location for heavier rain events. If the bioretention site receives additional runoff from downspouts leading from the roof of a building, or if the existing soil has a filtration rate faster than 5 inches per hour, the substrate of the rain garden should include a layer of gravel or sand beneath the topsoil to meet that increased infiltration load. If a site was not originally designed to include a rain garden, downpipes from the roof can be disconnected and diverted to a rain garden for retrofit stormwater management. This reduces the water load on the conventional drainage system and instead directs water for infiltration and treatment through bioretention features. By reducing peak stormwater discharge, rain gardens extend hydraulic lag time, somewhat mimic the natural water cycle displaced by urban development, and allow for groundwater recharge. While rain gardens always allow for restored groundwater recharge and reduced stormwater volumes, they may not reduce pollutant loads unless remediation materials are included in the design of the filtration layers.

Vegetation

Typical rain garden plants are herbaceous perennials and grasses, which are chosen for their porous root structure and high growth rate. Trees and shrubs can also be planted to cover larger areas on the bioretention site. Although specific plants are selected and designed for respective soils and climates, plants that can tolerate both saturated and dry soil are typically used for the rain garden. They need to be maintained for maximum efficiency, and be compatible with adjacent land uses. Native and adapted plants are commonly selected for rain gardens because they are more tolerant of the local climate, soil, and water conditions; have deep and variable root systems for enhanced water infiltration and drought tolerance; increase habitat value, diversity for local ecological communities, and overall sustainability once established. Vegetation with dense and uniform root structure depth helps to maintain consistent infiltration throughout the bioretention system. There can be trade-offs associated with using native plants, including lack of availability for some species, late spring emergence, short blooming season, and relatively slow establishment.

It is important to plant a wide variety of species so the rain garden is functional during all climatic conditions. The garden will likely experience a gradient of moisture levels across its functional lifespan, so some drought-tolerant plantings are desirable. Four categories of moisture tolerance can be considered when choosing plants for a rain garden. Wet soil is constantly full of water, with long periods of pooling surface water; this category includes swamp and marsh sites. Moist soil is always slightly damp, and plants that thrive in this category can tolerate longer periods of flooding. Mesic soil is neither very wet nor very dry; plants that prefer this category can tolerate brief periods of flooding. Dry soil is ideal for plants that can withstand long dry periods. Plantings chosen for rain gardens must be able to thrive during both extreme wet and dry spells, since rain gardens periodically swing between these two states. A rain garden in a temperate climate is unlikely to dry out completely, but gardens in dry climates will need to sustain low soil moisture levels during periods of drought. On the other hand, rain gardens are unlikely to suffer from intense waterlogging, since the function of a rain garden is to drain excess water from the site. Plants typically found in rain gardens can soak up large amounts of rainfall during the year, which helps them persist through the dry season. Transpiration by growing plants accelerates soil drying between storms. Rain gardens perform best using plants that grow in regularly moist soils, because these plants can typically also survive in drier soils that are relatively fertile (nutrient-rich).

Chosen vegetation needs to respect site constraints and limitations, and especially should not impede the primary function of bioretention. Trees under power lines, trees that heave sidewalks when soils become moist, or trees whose roots seek out and clog drainage tiles can cause expensive damage. Trees generally contribute most to bioretention sites when they are located close enough to tap moisture in the rain garden depression, yet do not shade the garden so heavily that evaporation is suppressed. That said, shading open surface waters can reduce excessive heating of vegetative habitats. Plants tolerate inundation by warm water for less time than they tolerate cold water, because heat drives out dissolved oxygen; thus a plant tolerant of early spring flooding may not survive summer inundation.

Pollutant removal

Rain gardens are designed to capture the initial flow of stormwater and reduce the accumulation of toxins flowing directly into natural waterways through ground filtration. Natural remediation of contaminated stormwater is an effective, cost-free treatment process. Directing water to flow through soil and vegetation achieves particle pollutant capture, while atmospheric pollutants are captured in plant membranes and then trapped in soil, where most of them begin to break down. These approaches help to diffuse runoff, which allows contaminants to be distributed across the site instead of concentrated. The National Science Foundation, the United States Environmental Protection Agency, and a number of research institutions are presently studying the impact of augmenting rain gardens with materials capable of capture or chemical reduction of the pollutants to benign compounds.

The primary challenge of rain garden design is predicting the types and acceptable loads of pollutants the rain garden's filtration system can process during high-impact storm events. Contaminants may include organic material, such as animal waste and oil spills, as well as inorganic material, such as heavy metals and fertilizer nutrients. These pollutants are known to cause harmful over-promotion of plant and algal growth if they seep into streams and rivers. The challenge of predicting pollutant loads is especially acute when a rain event occurs after a long dry period, because the initial stormwater is often highly contaminated with the pollutants accumulated while it was dry. Rain garden designers have previously focused on finding robust native plants and encouraging adequate biofiltration, but have recently begun augmenting filtration layers with media capable of redox reactions that convert incoming pollutants into more benign compounds. Certain plant species are very effective at storing mineral nutrients, which are only released once the plant dies and decays; other species can absorb heavy metal contaminants. Cutting back and entirely removing these plants at the end of the growth cycle removes the stored contaminants from the site. This process of cleaning up polluted soils and stormwater is called phytoremediation.

Projects

Australia

  • The Healthy Waterways Raingardens Program promotes a simple and effective form of stormwater treatment, and aims to raise people's awareness of how good stormwater management contributes to healthy waterways. The program encourages people to build rain gardens at home, and has achieved its target of 10,000 rain gardens built across Melbourne by 2013.
  • Melbourne Water's database of Water Sensitive Urban Design projects, including 57 case studies relating to rain gardens/bioretention systems. Melbourne Water is the Victorian State Government agency responsible for managing Melbourne's water supply catchments.
  • Water By Design is a capacity building program that supports the uptake of Water Sensitive Urban Design, including rain gardens, in South East Queensland. It was established by the South East Queensland Healthy Waterways Partnership in 2005, as an integral component of the SEQ Healthy Waterways Strategy.

United Kingdom

  • The Wildfowl and Wetlands Trust's London Wetland Centre includes a rain garden designed by Nigel Dunnett.
  • Islington London Borough Council commissioned sustainable drainage consultants Robert Bray Associates to design a pilot rain garden in the Ashby Grove development, which was completed in 2011. This rain garden is fed from a typical modest domestic roof catchment area of 30 m² and is designed to demonstrate how simple and cost-effective domestic rain gardens are to install. Monitoring apparatus was built into the design to allow Middlesex University to monitor water volumes, water quality and soil moisture content. The rain garden basin is 300 mm deep and has a storage capacity of 2.17 m³, which is just over the volume required to store runoff from the roof catchment in a 1-in-100-year storm plus a 30% allowance for climate change.
  • The Day Brook Rain Garden Project has introduced a number of rain gardens into an existing residential street in Sherwood, Nottingham
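The Ashby Grove storage figure above (2.17 m³ for a 30 m² roof) can be sanity-checked with a short calculation: required storage is catchment area times design storm depth times the climate-change allowance. The design storm depth is not stated in the text; the ~56 mm used below is a hypothetical value that, with the 30% allowance, roughly reproduces the quoted capacity.

```python
# Sanity check of a rain garden basin's storage requirement:
# volume = catchment area x storm depth x climate allowance.
# The 56 mm storm depth is a hypothetical assumption, not from the source.

def required_storage_m3(catchment_m2, storm_depth_mm, climate_factor=1.30):
    """Runoff volume (m^3) the basin must hold for a given design storm."""
    return catchment_m2 * (storm_depth_mm / 1000.0) * climate_factor

needed = required_storage_m3(30.0, 56.0)
# ~2.18 m^3, close to the 2.17 m^3 capacity quoted for Ashby Grove
```

The same three-factor calculation scales to any roof catchment once a local 1-in-100-year storm depth is known.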

United States

  • The 12,000 rain garden campaign for Puget Sound is coordinating efforts to build 12,000 rain gardens in the Puget Sound Basin of Western Washington by 2016. The 12,000 rain gardens website provides information and resources for the general public, landscape professionals, municipal staff, and decision makers. By providing access to the best current guidance, easy-to-use materials, and a network of trained "Rain Garden Mentor" Master Gardeners, the campaign seeks to capture and cleanse over 200 million gallons of polluted runoff each year, thereby significantly improving Puget Sound's water quality.
  • Maplewood, Minnesota has implemented a policy of encouraging residents to install rain gardens. Many neighborhoods had swales added to each property, but installation of a garden at the swale was voluntary. The project was a partnership between the City of Maplewood, University of Minnesota Department of Landscape Architecture, and the Ramsey Washington Metro Watershed District. A focus group was held with residents and published so that other communities could use it as a resource when planning their own rain garden projects.
  • Some local governmental organizations offer grants for residents to install rain gardens. In Dakota County, Minnesota, the Dakota County Soil and Water Conservation District offers $250 grants and technical assistance through its Landscaping for Clean Water program (http://www.dakotaswcd.org/cleanwater_form.html) to encourage residents to install residential rain gardens.
  • In Seattle, a prototype project, used to develop a plan for the entire city, was constructed in 2003. Called SEA Street, for Street Edge Alternatives, it was a drastic facelift of a residential street. The street was changed from a typical linear path to a gentle curve, narrowed, with large rain gardens placed along most of the length of the street. The street has 11% less impervious surface than a regular street. There are 100 evergreen trees and 1100 shrubs along this 3-block stretch of road, and a 2-year study found that the amount of stormwater which leaves the street has been reduced by 99%.
  • 10,000 Rain Gardens is a public initiative in the Kansas City, Missouri metro area. Property owners are encouraged to create rain gardens, with an eventual goal of 10,000 individual gardens.
  • The West Michigan Environmental Action Council has established Rain Gardens of West Michigan as an outreach water quality program. Also in Michigan, the Southeastern Oakland County Water Authority has published a pamphlet to encourage residents to add a rain garden to their landscapes in order to improve the water quality in the Rouge River watershed. In Washtenaw County, homeowners can volunteer for the Water Resources Commissioner's Rain Garden program, in which volunteers are annually selected for free professional landscape design. The homeowners build the gardens themselves and pay for landscaping materials. Photos of the gardens, as well as design documents and drainage calculations, are available online. The Washtenaw County Water Resource Commissioner's office also offers yearly in-person and online Master Rain Gardener classes to help guide those interested in the rain garden design, building, and upkeep process.
  • The city of Portland, Oregon, has established a Clean River Rewards program, to encourage residents to disconnect downspouts from the city's combined sewer system and create rain gardens. Workshops, discounts on storm water bills, and web resources are offered.
  • In Delaware, several rain gardens have been created through the work of the University of Delaware Water Resources Agency, and environmental organizations, such as the Appoquinimink River Association.
  • In New Jersey, the Rutgers Cooperative Extension Water Resources Program has already installed over 125 demonstration rain gardens in suburban and urban areas. The Water Resources Program has begun to focus on using rain gardens as green infrastructure in urban areas, such as Camden and Newark to help prevent localized flooding, combined sewer overflows, and to improve water quality. The Water Resources Program has also revised and produced a rain garden manual in collaboration with The Native Plant Society of New Jersey.
  • According to the Massachusetts Department of Environmental Protection, rain gardens may remove 90% of total suspended solids, 50% of nitrogen, and 90% of phosphorus.
  • Dr. Allen P. Davis is an environmental and civil engineering professor at the University of Maryland, College Park. For the past 20 years, Davis and his team have been studying the effectiveness of rain gardens. For their research, they constructed two rain gardens on campus near the Anacostia River watershed in the fall of 2001. Much of the runoff from the University of Maryland campus, a member of the Anacostia Watershed Restoration Partnership, ends up in the Anacostia River, which feeds into the Chesapeake Bay. This research finds rain gardens to be a very effective method of water capture and filtration, encouraging others in the Chesapeake Bay Watershed to implement rain gardens.
    • Davis' research showed that rain gardens aid in the capturing and bio-degradation of pollutants such as suspended solids, bacteria, metals, oil, and grease.
    • Water quality analyzed at the University of Maryland showed a significant increase in water clarity after rain garden filtration.
    • There is a rain garden at the Center for Young Children (CYC) at University of Maryland designed by students from the Department of Plant Science and Landscape Agriculture. The rain garden allows teachers at the CYC to educate future students on sustainability.

China

  • At the University of Technology in Xi'an, China, a rain garden was built and studied over four years. During that period Xi'an experienced 28 large storm events; the rain garden retained the rainfall from the majority of them, and only 5 caused it to overflow.
  • Rain Gardens in this sub-humid loess region of Xi'an China, are Low Impact Developments (LID).
  • China plans to implement a "sponge city" program in response to urban flooding. This program will prioritize the natural environment and will include rain gardens, green roofs, wetlands and more permeable surfaces to slow down storm water retention.
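As a rough illustration of why a garden retains small storms but overflows in large ones, here is a minimal water-balance sketch. All parameters (garden size, ponding depth, infiltration rate, storm duration) are hypothetical and are not taken from the Xi'an study:

```python
# Simple water-balance sketch: does a design storm overflow the garden?
# All parameters are hypothetical; the Xi'an study relied on field
# monitoring, not this calculation.

def overflows(storm_mm: float, drainage_m2: float, garden_m2: float,
              ponding_mm: float, infil_mm_per_h: float, storm_h: float) -> bool:
    """True if runoff routed to the garden exceeds ponding + infiltration."""
    runoff_m3 = drainage_m2 * storm_mm / 1000                       # inflow volume
    capacity_m3 = garden_m2 * (ponding_mm + infil_mm_per_h * storm_h) / 1000
    return runoff_m3 > capacity_m3

# A garden sized at 15% of its drainage area handles a 25 mm storm...
print(overflows(25, 200, 30, 150, 15, 2))   # False
# ...but a 100 mm cloudburst overflows it.
print(overflows(100, 200, 30, 150, 15, 2))  # True
```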

Friday, November 4, 2022

Human–computer chess matches

From Wikipedia, the free encyclopedia

Chess computers were first able to beat strong chess players in the late 1980s. Their most famous success was the victory of Deep Blue over then World Chess Champion Garry Kasparov in 1997, but there was some controversy over whether the match conditions favored the computer.

In 2002–2003, three human–computer matches were drawn, but, whereas Deep Blue was a specialized machine, these were chess programs running on commercially available computers.

Chess programs running on commercially available desktop computers won decisive victories against human players in matches in 2005 and 2006. The second of these, against then world champion Vladimir Kramnik, is (as of 2019) the last major human–computer match.

Since that time, chess programs running on commercial hardware—more recently including mobile phones—have been able to defeat even the strongest human players.

MANIAC (1956)

In 1956 MANIAC, developed at Los Alamos Scientific Laboratory, became the first computer to defeat a human in a chess-like game. Playing with the simplified Los Alamos rules, it defeated a novice in 23 moves.

Mac Hack VI (1966–1968)

In 1966 MIT student Richard Greenblatt wrote the chess program Mac Hack VI using MIDAS macro assembly language on a Digital Equipment Corporation PDP-6 computer with 16K of memory. Mac Hack VI evaluated 10 positions per second.

In 1967, several MIT students and professors (organized by Seymour Papert) challenged Dr. Hubert Dreyfus to play a game of chess against Mac Hack VI. Dreyfus, a professor of philosophy at MIT, had written the book What Computers Can’t Do, questioning the computer's ability to serve as a model for the human brain. He also asserted that no computer program could defeat even a 10-year-old child at chess. Dreyfus accepted the challenge. Herbert A. Simon, an artificial intelligence pioneer, watched the game. He said, "it was a wonderful game—a real cliffhanger between two woodpushers with bursts of insights and fiendish plans ... great moments of drama and disaster that go in such games." The computer was beating Dreyfus when he found a move that could have captured the enemy queen. The only way the computer could escape was to keep Dreyfus in check with its own queen until it could fork the queen and king, and then exchange them. That is what the computer did. Soon, Dreyfus was losing. Finally, the computer checkmated Dreyfus in the middle of the board.

In the spring of 1967, Mac Hack VI played in the Boston Amateur championship, winning two games and drawing two games. Mac Hack VI beat a 1510 United States Chess Federation player. This was the first time a computer won a game in a human tournament. At the end of 1968, Mac Hack VI achieved a rating of 1529. The average rating in the USCF was near 1500.

Chess x.x (1968–1978)

In 1968, Northwestern University students Larry Atkin, David Slate and Keith Gorlen began work on the Northwestern University chess program, Chess. On 14 April 1970 an exhibition game was played against Australian champion Fred Flatow, with the program running on a Control Data Corporation 6600; Flatow won easily. On 25 July 1976, Chess 4.5 scored 5–0 in the Class B (1600–1799) section of the 4th Paul Masson chess tournament in Saratoga, California. This was the first time a computer won a human tournament. Chess 4.5 was rated 1722. Running on a Control Data Corporation CDC Cyber 175 supercomputer (2.1 megaflops), Chess 4.5 examined fewer than 1,500 positions per second. On 20 February 1977, Chess 4.5 won the 84th Minnesota Open Championship with 5 wins and 1 loss, defeating expert Charles Fenner, rated 2016. On 30 April 1978, Chess 4.6 scored 5–0 at the Twin Cities Open in Minneapolis. Chess 4.6 was rated 2040. International Master Edward Lasker stated that year, "My contention that computers cannot play like a master, I retract. They play absolutely alarmingly. I know, because I have lost games to 4.7."

David Levy's bet (1978)

For a long time in the 1970s and 1980s, it remained an open question whether any chess program would ever be able to defeat the expertise of top humans. In 1968, International Master David Levy made a famous bet that no chess computer would be able to beat him within ten years. He won his bet in 1978 by beating Chess 4.7 (the strongest computer at the time).

Cray Blitz (1981)

In 1981, Cray Blitz scored 5–0 in the Mississippi State Championship. In round 4, it defeated Joe Sentef (2262) to become the first computer to beat a master in tournament play and the first computer to gain a master rating (2258).

HiTech (1988)

In 1988, HiTech won the Pennsylvania State Chess Championship with a score of 4½–½. HiTech defeated International Master Ed Formanek (2485).

The Harvard Cup Man versus Computer Chess Challenge, organized by Harvard University, was held six times between 1989 and 1995, in Boston and New York City. In each event the humans scored more points than the computers, and the top individual scorer was always a human.

Year  Humans–Computers  Human pts  Computer pts  Winner                      Pts  Best program     Pts  Rank
1989  4–4               13½        2½            Boris Gulko, Michael Rohde  4    Deep Thought     1    5
1991  4–4               12         4             Maxim Dlugy                      Heuristic Alpha  2    5
1992  5–5               18         7             Michael Rohde               5    Socrates         3    3
1993  6–6               27         9             Joel Benjamin               5    Socrates         3    6
1994  6–8               29½        18½           Joel Benjamin                    WChess           5    4
1995  6–6               23½        12½           Joel Benjamin                    Virtual Chess    4

The Aegon Man–Machine Tournaments (1986–1997)

The 12 Aegon Man–Machine Tournaments were held annually from 1986 to 1997, organized by the Dutch Computer Chess Federation (CSVN) in The Hague, Netherlands, and hosted by the Aegon insurance company. An equal number of humans and computers played a 6-round Swiss tournament in which every game paired a human against a computer. The early tournaments drew mostly local players and anti-computer tactics specialists; later editions included masters and grandmasters. Humans won more games in the early tournaments, computers in the later ones.

One hundred players took part in the 1997 tournament. The computers scored 151½ points to the humans' 148½. Yona Kosashvili scored highest for the humans with 6 points out of 6 games; Kallisto scored highest for the computers with 4½.

Year  Players  Rounds  Human pts  Comp pts  Winner              Pts  Best program            Pts  Rank
1986  2*11     7                            Fred van der Vliet  6    Rebel                   5
1987  2*13     6                            Martin Voorn        6    Mephisto Dallas 16 bit  3
1988  2*16     6                            Lex Jongsma         6    Mephisto Mega 4         4    7
1989  2*16     6       57½        23        Ad van den Berg     5    Chess Challenger             8
1990  2*14     6       47         37        HiTech              5    HiTech                  5    1
1991  2*20     6                            John van der Wiel   6    MChess                  4    8
1992  2*24     6       84         60        David Bronstein     6    Mephisto 68030          4    8
1993  2*32     6                            David Bronstein          The King                5    3
1994  2*38     6       114        114       Larry Christiansen       Gideon                  5
1995  2*48     6       132        155       John van der Wiel        Hiarcs                  5    2
1996  2*50     6       137½       162½      Yasser Seirawan     6    Quest                   5
1997  2*50     6       148½       151½      Yona Kosashvili     6    Kallisto                4½

Deep Thought (1989)

In 1988, Deep Thought shared first place with Tony Miles in the Software Toolworks Championship, ahead of former world champion Mikhail Tal and several grandmasters, including Samuel Reshevsky, Walter Browne, and Mikhail Gurevich. It also defeated grandmaster Bent Larsen, making it the first computer to beat a grandmaster in a tournament. Its performance rating in this tournament was 2745 (USCF scale).
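Performance ratings like the 2745 figure are commonly estimated with the linear approximation perf ≈ average opponent rating + 400 × (wins − losses) / games. A short sketch, using placeholder opponent ratings rather than the 1988 tournament data:

```python
# Hedged sketch of the common linear approximation of a performance rating:
#   perf ≈ average opponent rating + 400 * (wins - losses) / games.
# The opponent ratings below are placeholders, not the 1988 tournament field.

def performance_rating(opp_ratings: list, wins: int, losses: int) -> float:
    games = len(opp_ratings)
    return sum(opp_ratings) / games + 400 * (wins - losses) / games

# e.g. scoring 7 wins, 1 loss, 2 draws against uniformly 2500-rated opposition:
print(performance_rating([2500] * 10, wins=7, losses=1))  # 2740.0
```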

In 1989, Levy was defeated by the computer Deep Thought in an exhibition match.

Deep Thought, however, was still considerably below world championship level, as the then reigning world chess champion Garry Kasparov demonstrated with two convincing wins in 1989.

Chess Genius (1994)

The "Chess Genius" program was entered into a Professional Chess Association rapid chess tournament in 1994. It defeated and eliminated world champion Kasparov, but lost to Viswanathan Anand in the next round. This was the first time a computer had defeated the world champion in an official game, albeit at rapid time controls.

Kasparov–Deep Blue (1996–1997)

1996

Deep Blue–Kasparov 1996, game 1 (final position): White Kh2, Qd5, Rh7, Ng5, pawns a3, b3, g3, h3; Black Kh6, Qf6, Re1, Nf2, pawns d4, f3.

Kasparov played a six-game match against IBM's Deep Blue in 1996. Kasparov lost the first game (Deep Blue–Kasparov, 1996, Game 1), the first time a reigning world champion had lost to a computer using regular time controls. However, Kasparov regrouped to win three and draw two of the remaining five games of the match, for a convincing 4–2 match victory.

1997

In May 1997, an updated version of Deep Blue defeated Kasparov 3½–2½ in a highly publicized six-game match. Kasparov won the first, lost the second, and drew the next three. The match was even after five games but Kasparov was crushed in Game 6. This was the first time a computer had ever defeated a world champion in match play. A documentary film was made about this famous match-up entitled Game Over: Kasparov and the Machine. In that film Kasparov casually says, "I have to tell you that, you know, game two was not just a single loss of a game. It was a loss of the match, because I couldn't recover."

In game 6, Kasparov blundered very early in the game. He cited tiredness and unhappiness with the IBM team's conduct as the main reasons.

Kasparov claimed that several factors weighed against him in this match. In particular, he was denied access to Deep Blue's recent games, whereas the computer's team could study hundreds of Kasparov's.

After the loss, Kasparov said that he sometimes saw deep intelligence and creativity in the machine's moves, suggesting that during the second game, human chess players, in contravention of the rules, intervened. IBM denied that it cheated, saying the only human intervention occurred between games. The rules provided for the developers to modify the program between games, an opportunity they said they used to shore up weaknesses in the computer's play revealed during the course of the match. Kasparov requested printouts of the machine's log files but IBM refused, although the company later published the logs on the Internet. Kasparov demanded a rematch, but IBM refused and dismantled Deep Blue.

Kasparov maintains that he was told the match was to be a scientific project but that it soon became apparent that IBM wanted only to beat him for the company's advertisement.

Anand–REBEL (1998)

With increasing processing power, chess programs running on regular workstations began to rival top-flight players. In 1998, Rebel 10 defeated Viswanathan Anand, then ranked second in the world, by a score of 5–3. However, most of those games were not played under normal time controls. Of the eight games, four were blitz games (five minutes plus a five-second Fischer delay (see time control) per move), which Rebel won 3–1. Two were semi-blitz games (fifteen minutes per side), which Rebel also won, 1½–½. Finally, two games were played at regular tournament time controls (forty moves in two hours, then one hour sudden death); here it was Anand who won, 1½–½. At least in fast games, computers played better than humans, but under classical time controls, at which a player's rating is determined, the advantage was not so clear.

Deep Junior at Dortmund (2000)

Deep Junior played 9 grandmasters at the Sparkassen Chess Meeting, a category 19 round-robin tournament held in Dortmund, Germany from 6–17 July 2000. Deep Junior scored 4½ points in 9 rounds, a performance rating of 2703.

Round  White         Elo   Black         Elo   Result  Moves  ECO
1      Bareev, E     2702  Deep Junior         ½–½     146    D46
2      Deep Junior         Huebner, R    2615  1–0     39     C04
3      Adams, M      2755  Deep Junior         ½–½     84     C68
4      Deep Junior         Khalifman, A  2667  ½–½     129    B08
5      Kramnik, V    2770  Deep Junior         1–0     65     D00
6      Deep Junior         Akopian, V    2660  ½–½     89     B00
7      Anand, V      2762  Deep Junior         ½–½     35     D05
8      Deep Junior         Piket, J      2649  0–1     68     B15
9      Leko, P       2740  Deep Junior         0–1     120    C48

Kramnik–Deep Fritz (2002)

In October 2002, Vladimir Kramnik (who had succeeded Kasparov as Classical World Chess Champion) and Deep Fritz competed in the eight-game Brains in Bahrain match, which ended in a 4–4 draw.

Kramnik was given several advantages in his match against Fritz when compared to most other human–computer matches, such as the one Kasparov lost against Deep Blue in 1997. The code of Fritz was frozen some time before the first match and Kramnik was given a copy of Fritz to practice with for several months. Another difference was that in games lasting more than 56 moves, Kramnik was allowed to adjourn until the following day, during which time he could use his copy of Fritz to aid him in his overnight analysis of the position.

Kramnik won games 2 and 3 by "conventional" anti-computer tactics: playing conservatively for a long-term advantage that the computer is not able to see in its game-tree search. Fritz, however, won game 5 after a severe blunder by Kramnik. Game 6 was described by the tournament commentators as "spectacular." Kramnik, in a better position in the early middlegame, tried a piece sacrifice to achieve a strong tactical attack, a strategy known to be highly risky against computers, which are at their strongest defending against such attacks. True to form, Fritz found a watertight defense and Kramnik's attack petered out, leaving him in a bad position. Kramnik resigned the game, believing the position lost. However, post-game human and computer analysis showed that the Fritz program was unlikely to have been able to force a win and that Kramnik had effectively sacrificed a drawn position. The final two games were draws. Given the circumstances, most commentators still rated Kramnik the stronger player in the match.

Kasparov–Deep Junior (2003)

In January 2003, Kasparov engaged in a six-game classical time control match with a $1 million prize fund which was billed as the FIDE "Man vs. Machine" World Championship, against Deep Junior. The engine evaluated three million positions per second. After one win each and three draws, it was all up to the final game. The final game of the match was televised on ESPN2 and was watched by an estimated 200–300 million people. After reaching a decent position Kasparov offered a draw, which was soon accepted by the Deep Junior team. Asked why he offered the draw, Kasparov said he feared making a blunder. Originally planned as an annual event, the match was not repeated.

Kasparov–X3D Fritz (2003)

In November 2003, Kasparov engaged in a four-game match against the computer program X3D Fritz (which was said to have an estimated rating of 2807), using a virtual board, 3D glasses and a speech recognition system. After two draws and one win apiece, the X3D Man–Machine match ended in a draw. Kasparov received $175,000 for the result and took home the golden trophy. Kasparov continued to criticize the blunder in the second game that cost him a crucial point. He felt that he had outplayed the machine overall and played well. "I only made one mistake but unfortunately that one mistake lost the game."

Man vs Machine World Team Championship (2004–2005)

The Man vs Machine World Team Championships were two chess tournaments held in Bilbao, Spain, between leading grandmasters and chess computers. Both were convincingly won by the computers. The tournaments were also known as the Human vs. Computers World Team Matches.

2004

In October 2004, Ruslan Ponomariov (then rated 2710), Veselin Topalov (2757) and Sergey Karjakin (2576) played against the computers Hydra, Fritz 8, and Deep Junior. Ponomariov and Topalov were FIDE world chess champions, and Karjakin, then 12, was the world's youngest grandmaster. Hydra ran on a special 16-processor machine located in Abu Dhabi, UAE; Deep Junior, the then reigning computer chess world champion, used a remote 4 × 2.8 GHz Xeon machine located at Intel UK (Swindon); and Fritz 8 ran on a Centrino 1.7 GHz notebook. The computers won 8½ to 3½. The humans won one game: Karjakin, the youngest and lowest-rated player, defeated Deep Junior.

  • Ponomariov–Hydra, 0–1
  • Fritz–Karjakin, 1–0
  • Deep Junior–Topalov, ½–½
  • Karjakin–Deep Junior, 1–0
  • Ponomariov–Fritz, ½–½
  • Topalov–Hydra, ½–½
  • Deep Junior–Ponomariov, ½–½
  • Hydra–Karjakin, 1–0
  • Fritz–Topalov, 1–0
  • Hydra–Ponomariov, 1–0
  • Karjakin–Fritz, 0–1
  • Topalov–Deep Junior, ½–½
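The Elo ratings quoted above imply expected scores via the standard logistic formula E = 1 / (1 + 10^((Rb − Ra) / 400)). A short sketch; since the engines had no official FIDE ratings, the numbers are illustrative only:

```python
# Standard Elo expected-score formula. The ratings below are illustrative;
# the engines in the Bilbao matches had no official FIDE ratings.

def expected_score(ra: float, rb: float) -> float:
    """Expected score of player A (rating ra) against player B (rating rb)."""
    return 1 / (1 + 10 ** ((rb - ra) / 400))

# A 200-point rating edge corresponds to roughly a 76% expected score:
print(round(expected_score(2776, 2576), 2))  # 0.76
```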

2005

In November 2005, three former FIDE world chess champions—Alexander Khalifman, Ruslan Ponomariov and Rustam Kasimdzhanov—played against the computers Hydra, Junior and Fritz. The computers won 8 to 4. The Ponomariov vs Fritz game on 21 November 2005 is the last known win by a human against a top-performing computer under normal chess tournament conditions.

  • Ponomariov–Junior, 0–1
  • Hydra–Kasimdzhanov, 1–0
  • Fritz–Khalifman, 1–0
  • Ponomariov–Fritz, 1–0
  • Kasimdzhanov–Junior, ½–½
  • Khalifman–Hydra, ½–½
  • Hydra–Ponomariov, 1–0
  • Fritz–Kasimdzhanov, ½–½
  • Junior–Khalifman, 1–0
  • Ponomariov–Junior, ½–½
  • Kasimdzhanov–Hydra, ½–½
  • Khalifman–Fritz, ½–½

Hydra–Adams (2005)

In 2005, Hydra, a dedicated chess computer with custom hardware and sixty-four processors and also winner of the 14th IPCCC in 2005, crushed seventh-ranked Michael Adams 5½–½ in a six-game match. While Adams was criticized for not preparing as well as Kasparov and Kramnik had, some commentators saw this as heralding the end of human–computer matches.

Kramnik–Deep Fritz (2006)

Kramnik, then still the World Champion, played a six-game match against the computer program Deep Fritz in Bonn, Germany from 25 November – 5 December 2006, losing 4–2 to the machine, with two losses and four draws. He received 500,000 euros for playing and would have received another 500,000 euros had he won the match. Deep Fritz version 10 ran on a computer containing two Intel Xeon CPUs (a Xeon DC 5160 3 GHz processor with a 1333 MHz FSB and a 4 MB L2 cache) and was able to evaluate eight million positions per second. Kramnik received a copy of the program in mid-October for testing, but the final version included an updated opening book. Except for limited updates to the opening book, the program was not allowed to be changed during the course of the match. The endgame tablebases used by the program were restricted to five pieces, even though a complete six-piece tablebase is widely available. While Deep Fritz was in its opening book, Kramnik was allowed to see Fritz's display, which showed the opening book moves, number of games, Elo performance, score from grandmaster games, and the move weighting.

In the first five games, Kramnik steered the game into a typical "anti-computer" positional contest. On 25 November, the first game ended in a draw at the 47th move. A number of commentators believed Kramnik missed a win. Two days later, the second game resulted in a victory for Deep Fritz when Kramnik made what Susan Polgar called the "blunder of the century", failing to defend against a threatened mate-in-one in an even position (see also Deep Fritz vs. Vladimir Kramnik blunder). The third, fourth and fifth games in the match ended in draws.

In the final game, in an attempt to draw the match, Kramnik played the more aggressive Sicilian Defence and was crushed, losing the match 4–2.

There was speculation that interest in human–computer chess competition would plummet as a result of the 2006 Kramnik–Deep Fritz match. According to McGill University computer science professor Monty Newborn, for example, "I don’t know what one could get out of it [a further match] at this point. The science is done." The prediction appears to have come true, with no further major human–computer matches as of 2019.

Rybka odds matches (2007–2008)

Since 2007, Rybka has played some odds matches against grandmasters. Jaan Ehlvest first lost a pawn-odds match, then later lost a match when given time, color, opening, and endgame advantages. Roman Dzindzichashvili then lost a match when given pawn and move odds.

In September 2008, Rybka played an odds match against Vadim Milov, its strongest opponent yet in such a match (Milov at the time had an Elo rating of 2705, 28th in the world). The result was a narrow victory for Milov: he won 1½–½ when given pawn-and-move, and 2½–1½ (1 win, 3 draws) when given exchange odds but playing black. In two standard games (Milov had white, no odds), Rybka won 1½–½.

Pocket Fritz 4 (2009)

In 2009, a chess engine running on a mobile phone reached grandmaster level: Pocket Fritz 4, running the Hiarcs 13 engine on a 528 MHz HTC Touch HD, won the Copa Mercosur tournament in Buenos Aires, Argentina (4–14 August 2009), a category 6 event, with 9 wins and 1 draw, for a performance rating of 2898. Pocket Fritz 4 searches fewer than 20,000 positions per second, according to Tsukrov, the author of the Pocket Fritz GUI. This is in contrast to supercomputers such as Deep Blue, which searched 200 million positions per second; yet Pocket Fritz 4 achieved a higher performance level than Deep Blue.
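The gap between the two search rates in the text is easy to quantify. Assuming a hypothetical 180 seconds of thinking per move (the actual time controls are not given in the source):

```python
# Back-of-envelope comparison of search budgets, using the figures in the
# text: Deep Blue at ~200 million positions/s, Pocket Fritz 4 under ~20,000/s.
# The 180 s per move is an assumed figure, not from the source.

DEEP_BLUE_NPS = 200_000_000
POCKET_FRITZ_NPS = 20_000
SECONDS_PER_MOVE = 180  # assumption for illustration

deep_blue_nodes = DEEP_BLUE_NPS * SECONDS_PER_MOVE        # 36,000,000,000
pocket_fritz_nodes = POCKET_FRITZ_NPS * SECONDS_PER_MOVE  # 3,600,000

# Pocket Fritz 4 searched roughly 10,000x fewer positions per move:
print(deep_blue_nodes // pocket_fritz_nodes)  # 10000
```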

Pocket Fritz 3 using version 12.1 of Hiarcs won the same event the previous year with six wins and four draws, running on a 624 MHz HP iPAQ hx2790. The 2008 Mercosur Cup was a category 7 tournament. Pocket Fritz 3 achieved a performance rating of 2690.

Komodo handicap matches (2015)

In 2015, the chess engine Komodo played a series of handicap games against GM Petr Neuman. Neuman won the match.

Komodo handicap matches (2020)

In 2020, the chess engine Komodo played a series of handicap games at knight odds against Australian GM David Smerdon. Smerdon won 5–1, although most commentators had favored Komodo to win. In November 2020, a version of Komodo trained with an NNUE reinforcement-learning algorithm played eight 15-minute games at double-pawn odds against top grandmaster Hikaru Nakamura. Nakamura lost the match, drawing 3, losing 5, and winning 0.

Bayesian inference

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Bayesian_inference Bayesian inference ( / ...