Monday, December 17, 2018

Plant breeding

From Wikipedia, the free encyclopedia

The Yecoro wheat (right) cultivar is sensitive to salinity; plants resulting from a hybrid cross with cultivar W4910 (left) show greater tolerance to high salinity

Plant breeding is the science of changing the traits of plants in order to produce desired characteristics. It has been used to improve the quality of nutrition in products for humans and animals. Plant breeding can be accomplished through many different techniques, ranging from simply selecting plants with desirable characteristics for propagation, to methods that make use of knowledge of genetics and chromosomes, to more complex molecular techniques. A plant's genes determine its qualitative and quantitative traits. Plant breeders strive to achieve specific outcomes and, potentially, new plant varieties.

Plant breeding has been practiced for thousands of years, since near the beginning of human civilization. It is practiced worldwide by individuals such as gardeners and farmers, and by professional plant breeders employed by organizations such as government institutions, universities, crop-specific industry associations or research centers. 

International development agencies believe that breeding new crops is important for ensuring food security by developing new varieties that are higher yielding, disease resistant, drought tolerant or regionally adapted to different environments and growing conditions.

History

Plant breeding started with sedentary agriculture and particularly the domestication of the first agricultural plants, a practice which is estimated to date back 9,000 to 11,000 years. Initially early farmers simply selected food plants with particular desirable characteristics, and employed these as progenitors for subsequent generations, resulting in an accumulation of valuable traits over time. 

Grafting technology had been practiced in China before 2000 BCE, and by 500 BCE grafting was well established and practiced.

Gregor Mendel (1822–84) is considered the "father of modern genetics". His experiments with plant hybridization led to his establishing laws of inheritance. Genetics stimulated research to improve crop production through plant breeding. 

Modern plant breeding is applied genetics, but its scientific basis is broader, covering molecular biology, cytology, systematics, physiology, pathology, entomology, chemistry, and statistics (biometrics). It has also developed its own technology.

Classical plant breeding

One major technique of plant breeding is selection, the process of selectively propagating plants with desirable characteristics and eliminating or "culling" those with less desirable characteristics.

Another technique is the deliberate interbreeding (crossing) of closely or distantly related individuals to produce new crop varieties or lines with desirable properties. Plants are crossbred to introduce traits/genes from one variety or line into a new genetic background. For example, a mildew-resistant pea may be crossed with a high-yielding but susceptible pea, the goal of the cross being to introduce mildew resistance without losing the high-yield characteristics. Progeny from the cross would then be crossed with the high-yielding parent to ensure that the progeny were most like the high-yielding parent (backcrossing). The progeny from that cross would then be tested for yield (selection, as described above) and mildew resistance, and high-yielding resistant plants would be further developed. Plants may also be crossed with themselves to produce inbred varieties for breeding. Pollinators may be excluded through the use of pollination bags.
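
As a rough illustration of why repeated backcrossing works, the following Python sketch (with a hypothetical locus count, ignoring linkage and any selection at the resistance locus itself) simulates how each backcross to the recurrent high-yielding parent halves the donor parent's expected share of the genome, approaching the theoretical fraction 1/2^(t+1) after t backcrosses:

import random

def simulate_backcross(n_loci=1000, generations=4, seed=42):
    # Toy model: the genome is n_loci independent loci. At each locus the
    # current plant carries the donor ("resistant") allele with some
    # probability; the recurrent high-yielding parent contributes only its
    # own alleles, so heterozygous loci are the only source of donor genome.
    random.seed(seed)
    donor_prob = [0.5] * n_loci  # F1 gametes carry the donor allele with p=0.5
    for gen in range(1, generations + 1):
        gamete = [random.random() < p for p in donor_prob]
        # Offspring of (current plant x recurrent parent): heterozygous
        # wherever the gamete carried the donor allele, so half the alleles
        # at that locus are donor alleles.
        donor_prob = [0.5 if g else 0.0 for g in gamete]
        frac = sum(donor_prob) / n_loci
        print("BC%d: donor genome ~ %.3f (theory %.3f)"
              % (gen, frac, 0.5 ** (gen + 1)))

simulate_backcross()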

Classical breeding relies largely on homologous recombination between chromosomes to generate genetic diversity. The classical plant breeder may also make use of a number of in vitro techniques such as protoplast fusion, embryo rescue or mutagenesis (see below) to generate diversity and produce hybrid plants that would not exist in nature.

Breeders have tried to incorporate a wide range of traits into crop plants, including improved yield and quality, and resistance to diseases, pests, and environmental stresses.

Before World War II

Garton's catalogue from 1902

Successful commercial plant breeding concerns were founded from the late 19th century. Gartons Agricultural Plant Breeders in England was established in the 1890s by John Garton, who was one of the first to commercialize new varieties of agricultural crops created through cross-pollination. The firm's first introduction was Abundance Oat, one of the first agricultural grain varieties bred from a controlled cross, introduced to commerce in 1892.

In the early 20th century, plant breeders realized that Mendel's findings on the non-random nature of inheritance could be applied to seedling populations produced through deliberate pollinations to predict the frequencies of different types. Wheat hybrids were bred to increase the crop production of Italy during the so-called "Battle for Grain" (1925–1940). Heterosis was explained by George Harrison Shull; it describes the tendency of the progeny of a specific cross to outperform both parents. The recognition of the usefulness of heterosis for plant breeding led to the development of inbred lines that reveal a heterotic yield advantage when they are crossed. Maize was the first species in which heterosis was widely used to produce hybrids.

Statistical methods were also developed to analyze gene action and distinguish heritable variation from variation caused by environment. In 1933 another important breeding technique, cytoplasmic male sterility (CMS), developed in maize, was described by Marcus Morton Rhoades. CMS is a maternally inherited trait that makes the plant produce sterile pollen. This enables the production of hybrids without the need for labor-intensive detasseling.

These early breeding techniques resulted in large yield increases in the United States in the early 20th century. Similar yield increases were not produced elsewhere until after World War II, when the Green Revolution increased crop production in the developing world in the 1960s.

After World War II

In vitro-culture of Vitis (grapevine), Geisenheim Grape Breeding Institute

Following World War II a number of techniques were developed that allowed plant breeders to hybridize distantly related species, and artificially induce genetic diversity. 

When distantly related species are crossed, plant breeders make use of a number of plant tissue culture techniques to produce progeny from otherwise fruitless matings. Interspecific and intergeneric hybrids are produced from a cross of related species or genera that do not normally sexually reproduce with each other. These crosses are referred to as wide crosses. For example, the cereal triticale is a wheat and rye hybrid. The cells of the first-generation plants derived from the cross contained an uneven number of chromosomes and as a result were sterile. The cell division inhibitor colchicine was used to double the number of chromosomes in the cell and thus allow the production of a fertile line.
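
The chromosome arithmetic behind triticale can be sketched as follows (a back-of-envelope illustration using the standard published chromosome counts; a bread wheat x rye cross of this kind yields octoploid triticale):

# Chromosome counts (standard published values).
wheat_2n = 42  # hexaploid bread wheat, 2n = 6x = 42
rye_2n = 14    # rye, 2n = 2x = 14

# Each parent contributes a gamete carrying half its chromosomes.
f1 = wheat_2n // 2 + rye_2n // 2   # 21 + 7 = 28
print("F1 hybrid: %d chromosomes, no homologous pairs -> sterile" % f1)

# Colchicine blocks spindle formation during cell division, doubling the
# chromosome complement and restoring pairable homologs.
print("After doubling: %d chromosomes -> fertile triticale" % (2 * f1))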

Failure to produce a hybrid may be due to pre- or post-fertilization incompatibility. If fertilization is possible between two species or genera, the hybrid embryo may abort before maturation. If this occurs, the embryo resulting from an interspecific or intergeneric cross can sometimes be rescued and cultured to produce a whole plant. Such a method is referred to as embryo rescue. This technique has been used to produce New Rice for Africa (NERICA), an interspecific cross of Asian rice (Oryza sativa) and African rice (Oryza glaberrima).

Hybrids may also be produced by a technique called protoplast fusion. In this case protoplasts are fused, usually in an electric field. Viable recombinants can be regenerated in culture. 

Chemical mutagens like EMS and DMS, radiation, and transposons are used to generate mutants with desirable traits to be bred with other cultivars – a process known as mutation breeding. Classical plant breeders also generate genetic diversity within a species by exploiting a process called somaclonal variation, which occurs in plants produced from tissue culture, particularly plants derived from callus. Induced polyploidy, and the addition or removal of chromosomes using a technique called chromosome engineering, may also be used.

Agricultural research on potato plants

When a desirable trait has been bred into a species, a number of crosses to the favored parent are made to make the new plant as similar to the favored parent as possible. Returning to the example of the mildew resistant pea being crossed with a high-yielding but susceptible pea, to make the mildew resistant progeny of the cross most like the high-yielding parent, the progeny will be crossed back to that parent for several generations. This process removes most of the genetic contribution of the mildew resistant parent. Classical breeding is therefore a cyclical process. 

With classical breeding techniques, the breeder does not know exactly what genes have been introduced to the new cultivars. Some scientists therefore argue that plants produced by classical breeding methods should undergo the same safety testing regime as genetically modified plants. There have been instances where plants bred using classical techniques have been unsuitable for human consumption, for example the poison solanine was unintentionally increased to unacceptable levels in certain varieties of potato through plant breeding. New potato varieties are often screened for solanine levels before reaching the marketplace.

Modern plant breeding

Modern plant breeding may use techniques of molecular biology to select, or in the case of genetic modification, to insert, desirable traits into plants. Application of biotechnology or molecular biology is also known as molecular breeding.

Modern facilities in molecular biology are now used in plant breeding.

Marker assisted selection

Sometimes many different genes can influence a desirable trait in plant breeding. The use of tools such as molecular markers or DNA fingerprinting can map thousands of genes. This allows plant breeders to screen large populations of plants for those that possess the trait of interest. The screening is based on the presence or absence of a certain gene as determined by laboratory procedures, rather than on the visual identification of the expressed trait in the plant. The purpose of marker assisted selection, or plant genome analysis, is to identify the location and function (phenotype) of various genes within the genome. If all of the genes are identified, the result is a genome sequence. All plants have genomes of varying sizes and lengths, with genes that code for different proteins, but many of these genes are shared. If a gene's location and function are identified in one plant species, a very similar gene can likely also be found in a similar location in another species' genome.
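
In practice, marker-based screening amounts to filtering a population on genotype calls rather than on the visible trait. A minimal Python sketch (the marker names and genotype codes here are hypothetical, not from any real marker panel):

# Keep or cull seedlings based on the genotype at a diagnostic marker.
seedlings = [
    {"id": "P-001", "markers": {"mildew_R_marker": "AA", "yield_qtl_3": "AB"}},
    {"id": "P-002", "markers": {"mildew_R_marker": "BB", "yield_qtl_3": "AA"}},
    {"id": "P-003", "markers": {"mildew_R_marker": "AB", "yield_qtl_3": "AA"}},
]

def carries_allele(plant, marker, allele):
    # True if the plant's genotype at `marker` contains the target allele.
    return allele in plant["markers"].get(marker, "")

# Keep only seedlings carrying at least one resistance-linked 'A' allele.
selected = [p for p in seedlings if carries_allele(p, "mildew_R_marker", "A")]
print([p["id"] for p in selected])   # ['P-001', 'P-003']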

Reverse breeding and doubled haploids (DH)

Homozygous plants with desirable traits can be produced from heterozygous starting plants, if a haploid cell with the alleles for those traits can be produced and then used to make a doubled haploid. The doubled haploid will be homozygous for the desired traits. Furthermore, two different homozygous plants created in that way can be used to produce a generation of F1 hybrid plants which have the advantages of heterozygosity and a greater range of possible traits. Thus, an individual heterozygous plant chosen for its desirable characteristics can be converted into a heterozygous variety (F1 hybrid) without the necessity of vegetative reproduction, as the result of the cross of two homozygous/doubled haploid lines derived from the originally selected plant. Plant tissue culture can produce haploid or doubled haploid plant lines and generations. This minimizes the genetic diversity of the material in order to select for desirable traits that will increase the fitness of the individuals. Using this method decreases the need for breeding multiple generations of plants to obtain a generation that is homozygous for the desired traits, thereby saving much time in the process. There are many plant tissue culturing techniques that can be used to obtain haploid plants, but microspore culture is currently the most promising for producing the largest numbers of them.
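
A doubled haploid is homozygous by construction: whatever allele the haploid cell carries at each locus simply gets copied. A minimal sketch with a hypothetical three-locus F1 plant:

import random

def make_doubled_haploid(parent, seed=None):
    # `parent` maps locus -> (allele1, allele2). A random gamete (one allele
    # per locus) stands in for a cultured microspore; "doubling" copies each
    # allele, so the result is homozygous at every locus by construction.
    rng = random.Random(seed)
    gamete = {locus: rng.choice(pair) for locus, pair in parent.items()}
    return {locus: (a, a) for locus, a in gamete.items()}  # chromosome doubling

# Hypothetical heterozygous F1 plant with three loci of interest.
f1 = {"locus1": ("A", "a"), "locus2": ("B", "b"), "locus3": ("C", "c")}
dh_line = make_doubled_haploid(f1, seed=1)
print(dh_line)                                   # e.g. {'locus1': ('A', 'A'), ...}
assert all(a == b for a, b in dh_line.values())  # homozygous at every locus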

Genetic modification

Genetic modification of plants is achieved by adding a specific gene or genes to a plant, or by knocking down a gene with RNAi, to produce a desirable phenotype. The plants resulting from adding a gene are often referred to as transgenic plants. If the genes used for genetic modification come from the same species or from a crossable plant, and are used under the control of their native promoter, the resulting plants are called cisgenic. Sometimes genetic modification can produce a plant with the desired trait or traits faster than classical breeding because the majority of the plant's genome is not altered.

To genetically modify a plant, a genetic construct must be designed so that the gene to be added or removed will be expressed by the plant. To do this, the gene or genes of interest must be introduced into the plant, together with a promoter to drive transcription and a termination sequence to stop transcription of the new gene. A marker for the selection of transformed plants is also included. In the laboratory, antibiotic resistance is a commonly used marker: plants that have been successfully transformed will grow on media containing antibiotics; plants that have not been transformed will die. In some instances markers for selection are removed by backcrossing with the parent plant prior to commercial release.
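
The parts list described above can be summarized as a simple data structure. This is only an illustrative sketch: the element names (the CaMV 35S promoter, the nptII kanamycin-resistance marker) are commonly cited real examples, but the class itself is not from any actual cloning toolkit.

from dataclasses import dataclass

@dataclass
class Construct:
    promoter: str            # drives transcription of the gene of interest
    gene_of_interest: str    # the trait gene being added
    terminator: str          # termination sequence that stops transcription
    selectable_marker: str   # lets transformed plants survive selection media

    def elements(self):
        # Order reflects a typical expression cassette layout.
        return [self.promoter, self.gene_of_interest,
                self.terminator, self.selectable_marker]

cassette = Construct(
    promoter="CaMV 35S",
    gene_of_interest="cry1Ac (Bt toxin)",
    terminator="NOS terminator",
    selectable_marker="nptII (kanamycin resistance)",
)
print(" -> ".join(cassette.elements()))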

The construct can be inserted into the plant genome by genetic recombination using the bacteria Agrobacterium tumefaciens or A. rhizogenes, or by direct methods like the gene gun or microinjection. Using plant viruses to insert genetic constructs into plants is also a possibility, but the technique is limited by the host range of the virus. For example, Cauliflower mosaic virus (CaMV) only infects cauliflower and related species. Another limitation of viral vectors is that the virus is not usually passed on to the progeny, so every plant has to be inoculated.

The majority of commercially released transgenic plants are currently limited to plants that have introduced resistance to insect pests and herbicides. Insect resistance is achieved through incorporation of a gene from Bacillus thuringiensis (Bt) that encodes a protein that is toxic to some insects. For example, when the cotton bollworm, a common cotton pest, feeds on Bt cotton, it ingests the toxin and dies. Herbicides usually work by binding to certain plant enzymes and inhibiting their action. The enzymes that the herbicide inhibits are known as the herbicide's target site. Herbicide resistance can be engineered into crops by expressing a version of the target site protein that is not inhibited by the herbicide. This is the method used to produce glyphosate-resistant crop plants.

Genetic modification can further increase yields by increasing stress tolerance to a given environment. Stresses such as temperature variation are signalled to the plant via a cascade of signalling molecules which activate a transcription factor to regulate gene expression. Overexpression of particular genes involved in cold acclimation has been shown to make plants more resistant to freezing, which is one common cause of yield loss.

Genetic modification of plants that can produce pharmaceuticals (and industrial chemicals), sometimes called pharming, is a rather radical new area of plant breeding.

Issues and concerns

Modern plant breeding, whether classical or through genetic engineering, comes with issues of concern, particularly with regard to food crops. The question of whether breeding can have a negative effect on nutritional value is central in this respect. Although relatively little direct research in this area has been done, there are scientific indications that, by favoring certain aspects of a plant's development, other aspects may be retarded. A study published in the Journal of the American College of Nutrition in 2004, entitled Changes in USDA Food Composition Data for 43 Garden Crops, 1950 to 1999, compared nutritional analysis of vegetables done in 1950 and in 1999, and found substantial decreases in six of 13 nutrients measured, including a 6% decline in protein and a 38% decline in riboflavin. Reductions in calcium, phosphorus, iron and ascorbic acid were also found. The study, conducted at the Biochemical Institute, University of Texas at Austin, concluded in summary: "We suggest that any real declines are generally most easily explained by changes in cultivated varieties between 1950 and 1999, in which there may be trade-offs between yield and nutrient content."

The debate surrounding genetically modified food during the 1990s peaked in 1999 in terms of media coverage and risk perception, and continues today – for example, "Germany has thrown its weight behind a growing European mutiny over genetically modified crops by banning the planting of a widely grown pest-resistant corn variety." The debate encompasses the ecological impact of genetically modified plants, the safety of genetically modified food and concepts used for safety evaluation like substantial equivalence. Such concerns are not new to plant breeding. Most countries have regulatory processes in place to help ensure that new crop varieties entering the marketplace are both safe and meet farmers' needs. Examples include variety registration, seed schemes, regulatory authorizations for GM plants, etc. 

Plant breeders' rights is also a major and controversial issue. Today, production of new varieties is dominated by commercial plant breeders, who seek to protect their work and collect royalties through national and international agreements based in intellectual property rights. The range of related issues is complex. In the simplest terms, critics of the increasingly restrictive regulations argue that, through a combination of technical and economic pressures, commercial breeders are reducing biodiversity and significantly constraining individuals (such as farmers) from developing and trading seed on a regional level. Efforts to strengthen breeders' rights, for example, by lengthening periods of variety protection, are ongoing.

When new plant breeds or cultivars are bred, they must be maintained and propagated. Some plants are propagated by asexual means while others are propagated by seeds. Seed-propagated cultivars require specific control over seed source and production procedures to maintain the integrity of the breed. Isolation is necessary to prevent cross-contamination with related plants or the mixing of seeds after harvesting. Isolation is normally accomplished by planting distance, but in certain crops plants are enclosed in greenhouses or cages (most commonly used when producing F1 hybrids).

Role of plant breeding in organic agriculture

Critics of organic agriculture claim it is too low-yielding to be a viable alternative to conventional agriculture. However, part of that poor performance may be the result of growing poorly adapted varieties. It is estimated that over 95% of organic agriculture is based on conventionally adapted varieties, even though the production environments found in organic vs. conventional farming systems are vastly different due to their distinctive management practices. Most notably, organic farmers have fewer inputs available than conventional growers to control their production environments. Breeding varieties specifically adapted to the unique conditions of organic agriculture is critical for this sector to realize its full potential. This requires selection for traits such as:
  • Water use efficiency
  • Nutrient use efficiency (particularly nitrogen and phosphorus)
  • Weed competitiveness
  • Tolerance of mechanical weed control
  • Pest/disease resistance
  • Early maturity (as a mechanism for avoidance of particular stresses)
  • Abiotic stress tolerance (e.g. drought, salinity)
Currently, few breeding programs are directed at organic agriculture, and until recently those that did address this sector generally relied on indirect selection (i.e. selection in conventional environments for traits considered important for organic agriculture). However, because the difference between organic and conventional environments is large, a given genotype may perform very differently in each environment due to an interaction between genes and the environment. If this interaction is severe enough, an important trait required for the organic environment may not be revealed in the conventional environment, which can result in the selection of poorly adapted individuals. To ensure the most adapted varieties are identified, advocates of organic breeding now promote the use of direct selection (i.e. selection in the target environment) for many agronomic traits.
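
A toy example of the genotype-by-environment interaction described above (the yield figures are invented for illustration): the line that ranks best in a conventional trial is not the best under organic management, so indirect selection would pick the wrong line.

# Hypothetical yields (t/ha) for three candidate lines in two environments.
yields = {
    "Line-1": {"conventional": 8.2, "organic": 4.1},
    "Line-2": {"conventional": 7.4, "organic": 5.6},  # weed-competitive line
    "Line-3": {"conventional": 7.9, "organic": 4.8},
}

def best(env):
    # The line with the highest yield in the given environment.
    return max(yields, key=lambda line: yields[line][env])

print("Indirect selection (conventional trial):", best("conventional"))  # Line-1
print("Direct selection (organic trial):      ", best("organic"))        # Line-2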

There are many classical and modern breeding techniques that can be utilized for crop improvement in organic agriculture despite the ban on genetically modified organisms. For instance, controlled crosses between individuals allow desirable genetic variation to be recombined and transferred to seed progeny via natural processes. Marker assisted selection can also be employed as a diagnostic tool to facilitate selection of progeny that possess the desired trait(s), greatly speeding up the breeding process. This technique has proven particularly useful for the introgression of resistance genes into new backgrounds, as well as the efficient selection of many resistance genes pyramided into a single individual. Unfortunately, molecular markers are not currently available for many important traits, especially complex ones controlled by many genes.

Addressing global food security through plant breeding

For future agriculture to thrive, changes must be made to address arising global issues: limited arable land, harsh cropping conditions, and food security, which involves being able to provide the world population with food containing sufficient nutrients. Crops also need to be able to mature in several environments to allow worldwide access, which involves traits such as drought tolerance. These global issues can be addressed through plant breeding, as it offers the ability to select specific genes, allowing crops to perform at a level which yields the desired results.

Increased yield without expansion

With an increasing population, the production of food needs to increase with it. It is estimated that a 70% increase in food production is needed by 2050 in order to meet the Declaration of the World Summit on Food Security. But with the degradation of agricultural land, simply planting more crops is no longer a viable option. New varieties of plants can in some cases be developed through plant breeding that generate an increase in yield without relying on an increase in land area. An example of this can be seen in Asia, where food production per capita has increased twofold. This has been achieved not only through the use of fertilisers, but through the use of better crops that have been specifically designed for the area.
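
As a back-of-envelope check, assuming the 70% target applies over roughly a 40-year horizon (about 2010 to 2050, an assumption for illustration), the implied compound annual growth in production is modest but must be sustained:

target_increase = 0.70   # 70% more food by 2050
years = 40               # assumed horizon, ~2010 to 2050

# Compound annual growth rate that reaches the target over the horizon.
annual_rate = (1 + target_increase) ** (1 / years) - 1
print("Required growth: %.2f%% per year" % (annual_rate * 100))  # ~1.34%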

Breeding for increased nutritional value

Plant breeding can contribute to global food security as it is a cost-effective tool for increasing the nutritional value of forage and crops. Improvements in the nutritional value of forage crops through the use of analytical chemistry and rumen fermentation technology have been recorded since 1960; this science and technology gave breeders the ability to screen thousands of samples within a small amount of time, meaning breeders could identify high-performing hybrids more quickly. The main area in which genetic gains were made was in vitro dry matter digestibility (IVDMD), with increases of 0.7–2.5%; a 1% increase in IVDMD corresponded to a 3.2% increase in the daily gains of beef cattle (Bos taurus). This improvement indicates that plant breeding is an essential tool in gearing future agriculture to perform at a more advanced level.

Breeding for tolerance

Plant breeding of hybrid crops has become extremely popular worldwide in an effort to combat harsh environments. With long periods of drought and lack of water or nitrogen, stress tolerance has become a significant part of agriculture. Plant breeders have focused on identifying crops which will perform under these conditions; one way to achieve this is to find strains of the crop that are resistant to drought conditions and low nitrogen. It is evident from this that plant breeding is vital for future agriculture to survive, as it enables farmers to produce stress-resistant crops, hence improving food security. In countries that experience harsh winters, such as Iceland, Germany and further east in Europe, plant breeders are involved in breeding for tolerance to frost, continuous snow cover, frost-drought (desiccation from wind and solar radiation under frost) and high moisture levels in soil in winter.

Participatory plant breeding

Participatory plant breeding (PPB) is when farmers are involved in a crop improvement programme with opportunities to make decisions and contribute to the research process at different stages. Participatory approaches to crop improvement can also be applied when plant biotechnologies are being used for crop improvement. Local agricultural systems and genetic diversity are developed and strengthened by crop improvement, in which participatory crop improvement (PCI) plays a large role. PPB is enhanced by farmers' knowledge of the quality required and their evaluation of the target environment, both of which affect its effectiveness.

Terms related to plant breeding

  • Mentor pollen: inactivated pollen that is compatible with the female plant, mixed with pollen that would normally be incompatible. The mentor pollen has the effect of guiding the foreign pollen to the ovules.
  • S1 generation: the product of self-fertilization.

History of biotechnology

From Wikipedia, the free encyclopedia

Brewing was an early example of biotechnology
 
Biotechnology is the application of scientific and engineering principles to the processing of materials by biological agents to provide goods and services. From its inception, biotechnology has maintained a close relationship with society. Although now most often associated with the development of drugs, historically biotechnology has been principally associated with food, addressing such issues as malnutrition and famine. The history of biotechnology begins with zymotechnology, which commenced with a focus on brewing techniques for beer. By World War I, however, zymotechnology would expand to tackle larger industrial issues, and the potential of industrial fermentation gave rise to biotechnology. However, both the single-cell protein and gasohol projects failed to progress due to varying issues including public resistance, a changing economic scene, and shifts in political power. 

Yet the formation of a new field, genetic engineering, would soon bring biotechnology to the forefront of science in society, and the intimate relationship between the scientific community, the public, and the government would ensue. These debates gained exposure in 1975 at the Asilomar Conference, where Joshua Lederberg was the most outspoken supporter for this emerging field in biotechnology. By as early as 1978, with the development of synthetic human insulin, Lederberg's claims would prove valid, and the biotechnology industry grew rapidly. Each new scientific advance became a media event designed to capture public support, and by the 1980s, biotechnology grew into a promising real industry. In 1988, only five proteins from genetically engineered cells had been approved as drugs by the United States Food and Drug Administration (FDA), but this number would skyrocket to over 125 by the end of the 1990s. 

The field of genetic engineering remains a heated topic of discussion in today's society with the advent of gene therapy, stem cell research, cloning, and genetically modified food. While it seems only natural nowadays to link pharmaceutical drugs as solutions to health and societal problems, this relationship of biotechnology serving social needs began centuries ago.

Origins of biotechnology

Biotechnology arose from the field of zymotechnology or zymurgy, which began as a search for a better understanding of industrial fermentation, particularly beer. Beer was an important industrial, and not just social, commodity. In late 19th-century Germany, brewing contributed as much to the gross national product as steel, and taxes on alcohol proved to be significant sources of revenue to the government. In the 1860s, institutes and remunerative consultancies were dedicated to the technology of brewing. The most famous was the private Carlsberg Institute, founded in 1875, which employed Emil Christian Hansen, who pioneered the pure yeast process for the reliable production of consistent beer. Less well known were private consultancies that advised the brewing industry. One of these, the Zymotechnic Institute, was established in Chicago by the German-born chemist John Ewald Siebel.

The heyday and expansion of zymotechnology came in World War I in response to industrial needs to support the war. Max Delbrück grew yeast on an immense scale during the war to meet 60 percent of Germany's animal feed needs. Compounds of another fermentation product, lactic acid, made up for a lack of hydraulic fluid, glycerol. On the Allied side the Russian chemist Chaim Weizmann used starch to eliminate Britain's shortage of acetone, a key raw material for cordite, by fermenting maize to acetone. The industrial potential of fermentation was outgrowing its traditional home in brewing, and "zymotechnology" soon gave way to "biotechnology." 

With food shortages spreading and resources fading, some dreamed of a new industrial solution. The Hungarian Károly Ereky coined the word "biotechnology" in Hungary during 1919 to describe a technology based on converting raw materials into a more useful product. He built a slaughterhouse for a thousand pigs and also a fattening farm with space for 50,000 pigs, raising over 100,000 pigs a year. The enterprise was enormous, becoming one of the largest and most profitable meat and fat operations in the world. In a book entitled Biotechnologie, Ereky further developed a theme that would be reiterated through the 20th century: biotechnology could provide solutions to societal crises, such as food and energy shortages. For Ereky, the term "biotechnologie" indicated the process by which raw materials could be biologically upgraded into socially useful products.

This catchword spread quickly after the First World War, as "biotechnology" entered German dictionaries and was taken up abroad by business-hungry private consultancies as far away as the United States. In Chicago, for example, the coming of prohibition at the end of World War I encouraged biological industries to create opportunities for new fermentation products, in particular a market for nonalcoholic drinks. Emil Siebel, the son of the founder of the Zymotechnic Institute, broke away from his father's company to establish his own called the "Bureau of Biotechnology," which specifically offered expertise in fermented nonalcoholic drinks.

The belief that the needs of an industrial society could be met by fermenting agricultural waste was an important ingredient of the "chemurgic movement." Fermentation-based processes generated products of ever-growing utility. In the 1940s, penicillin was the most dramatic. While it was discovered in England, it was produced industrially in the U.S. using a deep fermentation process originally developed in Peoria, Illinois. The enormous profits and the public expectations penicillin engendered caused a radical shift in the standing of the pharmaceutical industry. Doctors used the phrase "miracle drug", and the historian of its wartime use, David Adams, has suggested that to the public penicillin represented the perfect health that went together with the car and the dream house of wartime American advertising. Beginning in the 1950s, fermentation technology also became advanced enough to produce steroids on industrially significant scales. Of particular importance was the improved semisynthesis of cortisone which simplified the old 31 step synthesis to 11 steps. This advance was estimated to reduce the cost of the drug by 70%, making the medicine inexpensive and available. Today biotechnology still plays a central role in the production of these compounds and likely will for years to come.

Penicillin was viewed as a miracle drug that brought enormous profits and public expectations.

Single-cell protein and gasohol projects

Even greater expectations of biotechnology were raised during the 1960s by a process that grew single-cell protein. When the so-called protein gap threatened world hunger, producing food locally by growing it from waste seemed to offer a solution. It was the possibility of growing microorganisms on oil that captured the imagination of scientists, policy makers, and commerce. Major companies such as British Petroleum (BP) staked their futures on it. In 1962, BP built a pilot plant at Cap de Lavera in Southern France to publicize its product, Toprina. Initial research work at Lavera was done by Alfred Champagnat. In 1963, construction started on BP's second pilot plant at Grangemouth Oil Refinery in Britain.

As there was no well-accepted term to describe the new foods, in 1966 the term "single-cell protein" (SCP) was coined at MIT to provide an acceptable and exciting new title, avoiding the unpleasant connotations of microbial or bacterial.

The "food from oil" idea became quite popular by the 1970s, when facilities for growing yeast fed by n-paraffins were built in a number of countries. The Soviets were particularly enthusiastic, opening large "BVK" (belkovo-vitaminny kontsentrat, i.e., "protein-vitamin concentrate") plants next to their oil refineries in Kstovo (1973) and Kirishi (1974).

By the late 1970s, however, the cultural climate had completely changed, as the growth in SCP interest had taken place against a shifting economic and cultural scene. First, the price of oil rose catastrophically in 1974, so that its cost per barrel was five times greater than it had been two years earlier. Second, despite continuing hunger around the world, anticipated demand also began to shift from humans to animals. The program had begun with the vision of growing food for Third World people, yet the product was instead launched as an animal food for the developed world. The rapidly rising demand for animal feed made that market appear economically more attractive. The ultimate downfall of the SCP project, however, came from public resistance.

This was particularly vocal in Japan, where production came closest to fruition. For all their enthusiasm for innovation and traditional interest in microbiologically produced foods, the Japanese were the first to ban the production of single-cell proteins. The Japanese ultimately were unable to separate the idea of their new "natural" foods from the far from natural connotation of oil. These arguments were made against a background of suspicion of heavy industry in which anxiety over minute traces of petroleum was expressed. Thus, public resistance to an unnatural product led to the end of the SCP project as an attempt to solve world hunger. 

In 1989 in the USSR, public environmental concerns led the government to close down (or convert to different technologies) all eight of the paraffin-fed yeast plants that the Soviet Ministry of Microbiological Industry operated by that time.

In the late 1970s, biotechnology offered another possible solution to a societal crisis. The escalation in the price of oil in 1974 increased the cost of the Western world's energy tenfold. In response, the U.S. government promoted the production of gasohol, gasoline with 10 percent alcohol added, as an answer to the energy crisis. In 1979, when the Soviet Union sent troops to Afghanistan, the Carter administration cut off its supplies of agricultural produce in retaliation, creating an agricultural surplus in the U.S. As a result, fermenting the agricultural surpluses to synthesize fuel seemed to be an economical solution to the shortage of oil threatened by the Iran–Iraq War. Before the new direction could be taken, however, the political wind changed again: the Reagan administration came to power in January 1981 and, with the declining oil prices of the 1980s, ended support for the gasohol industry before it was born.

Biotechnology seemed to be the solution for major social problems, including world hunger and energy crises. In the 1960s, radical measures would be needed to address world starvation, and biotechnology seemed to provide an answer. However, the solutions proved to be too expensive and socially unacceptable, and solving world hunger through SCP food was dismissed. In the 1970s, the food crisis was succeeded by the energy crisis, and here too biotechnology seemed to provide an answer. But once again, costs proved prohibitive as oil prices slumped in the 1980s. Thus, in practice, the implications of biotechnology were not fully realized in these situations. But this would soon change with the rise of genetic engineering.

Genetic engineering

The origins of biotechnology culminated with the birth of genetic engineering. There were two key events that have come to be seen as scientific breakthroughs beginning the era that would unite genetics with biotechnology. One was the 1953 discovery of the structure of DNA, by Watson and Crick, and the other was the 1973 discovery by Cohen and Boyer of a recombinant DNA technique by which a section of DNA was cut from the plasmid of an E. coli bacterium and transferred into the DNA of another. This approach could, in principle, enable bacteria to adopt the genes and produce proteins of other organisms, including humans. Popularly referred to as "genetic engineering," it came to be defined as the basis of new biotechnology. 

Genetic engineering proved to be a topic that thrust biotechnology into the public scene, and the interaction between scientists, politicians, and the public defined the work that was accomplished in this area. Technical developments during this time were revolutionary and at times frightening. In December 1967, the first heart transplant by Christiaan Barnard reminded the public that the physical identity of a person was becoming increasingly problematic. While poetic imagination had always seen the heart at the center of the soul, now there was the prospect of individuals being defined by other people's hearts. During the same month, Arthur Kornberg announced that he had managed to biochemically replicate a viral gene. "Life had been synthesized," said the head of the National Institutes of Health. Genetic engineering was now on the scientific agenda, as it was becoming possible to identify genetic characteristics with diseases such as beta thalassemia and sickle-cell anemia.

Responses to scientific achievements were colored by cultural skepticism. Scientists and their expertise were looked upon with suspicion. In 1968, an immensely popular work, The Biological Time Bomb, was written by the British journalist Gordon Rattray Taylor. The author's preface saw Kornberg's discovery of replicating a viral gene as a route to lethal doomsday bugs. The publisher's blurb for the book warned that within ten years, "You may marry a semi-artificial man or woman…choose your children's sex…tune out pain…change your memories…and live to be 150 if the scientific revolution doesn’t destroy us first." The book ended with a chapter called "The Future – If Any." While it is rare for current science to be represented in the movies, in this period of "Star Trek", science fiction and science fact seemed to be converging. "Cloning" became a popular word in the media. Woody Allen satirized the cloning of a person from a nose in his 1973 movie Sleeper, and cloning Adolf Hitler from surviving cells was the theme of the 1976 novel by Ira Levin, The Boys from Brazil.

In response to these public concerns, scientists, industry, and governments increasingly linked the power of recombinant DNA to the immensely practical functions that biotechnology promised. One of the key scientific figures that attempted to highlight the promising aspects of genetic engineering was Joshua Lederberg, a Stanford professor and Nobel laureate. While in the 1960s "genetic engineering" described eugenics and work involving the manipulation of the human genome, Lederberg stressed research that would involve microbes instead. Lederberg emphasized the importance of focusing on curing living people. Lederberg's 1963 paper, "Biological Future of Man" suggested that, while molecular biology might one day make it possible to change the human genotype, "what we have overlooked is euphenics, the engineering of human development." Lederberg constructed the word "euphenics" to emphasize changing the phenotype after conception rather than the genotype which would affect future generations. 

With the discovery of recombinant DNA by Cohen and Boyer in 1973, the idea that genetic engineering would have major human and societal consequences was born. In July 1974, a group of eminent molecular biologists headed by Paul Berg wrote to Science suggesting that the consequences of this work were so potentially destructive that there should be a pause until its implications had been thought through. This suggestion was explored at a meeting in February 1975 at California's Monterey Peninsula, forever immortalized by the location, Asilomar. Its historic outcome was an unprecedented call for a halt in research until it could be regulated in such a way that the public need not be anxious, and it led to a 16-month moratorium until National Institutes of Health (NIH) guidelines were established. 

Joshua Lederberg was the leading exception in emphasizing, as he had for years, the potential benefits. At Asilomar, in an atmosphere favoring control and regulation, he circulated a paper countering the pessimism and fears of misuses with the benefits conferred by successful use. He described "an early chance for a technology of untold importance for diagnostic and therapeutic medicine: the ready production of an unlimited variety of human proteins. Analogous applications may be foreseen in fermentation process for cheaply manufacturing essential nutrients, and in the improvement of microbes for the production of antibiotics and of special industrial chemicals." In June 1976, the 16-month moratorium on research expired with the Director's Advisory Committee (DAC) publication of the NIH guidelines of good practice. They defined the risks of certain kinds of experiments and the appropriate physical conditions for their pursuit, as well as a list of things too dangerous to perform at all. Moreover, modified organisms were not to be tested outside the confines of a laboratory or allowed into the environment.

Synthetic insulin crystals synthesized using recombinant DNA technology

Atypical as Lederberg was at Asilomar, his optimistic vision of genetic engineering would soon lead to the development of the biotechnology industry. Over the next two years, as public concern over the dangers of recombinant DNA research grew, so too did interest in its technical and practical applications. Curing genetic diseases remained in the realms of science fiction, but it appeared that producing simple human proteins could be good business. Insulin, one of the smaller, best characterized and understood proteins, had been used in treating type 1 diabetes for a half century. It had been extracted from animals in a chemically slightly different form from the human product. Yet, if one could produce synthetic human insulin, one could meet an existing demand with a product whose approval would be relatively easy to obtain from regulators. In the period 1975 to 1977, synthetic "human" insulin represented the aspirations for new products that could be made with the new biotechnology. Microbial production of synthetic human insulin was finally announced in September 1978 by a startup company, Genentech. That company did not commercialize the product itself; instead, it licensed the production method to Eli Lilly and Company. 1978 also saw the first application for a patent on a gene, the gene which produces human growth hormone, by the University of California, thus introducing the legal principle that genes could be patented. Since that filing, almost 20% of the more than 20,000 genes in human DNA have been patented.

The radical shift in the connotation of "genetic engineering" from an emphasis on the inherited characteristics of people to the commercial production of proteins and therapeutic drugs was nurtured by Joshua Lederberg. His broad concerns since the 1960s had been stimulated by enthusiasm for science and its potential medical benefits. Countering calls for strict regulation, he expressed a vision of potential utility. Against a belief that new techniques would entail unmentionable and uncontrollable consequences for humanity and the environment, a growing consensus on the economic value of recombinant DNA emerged.

Biotechnology and industry

A Genentech-sponsored sign declaring South San Francisco to be "The Birthplace of Biotechnology."

With ancestral roots in industrial microbiology that date back centuries, the new biotechnology industry grew rapidly beginning in the mid-1970s. Each new scientific advance became a media event designed to capture investment confidence and public support. Although market expectations and social benefits of new products were frequently overstated, many people were prepared to see genetic engineering as the next great advance in technological progress. By the 1980s, biotechnology characterized a nascent real industry, providing titles for emerging trade organizations such as the Biotechnology Industry Organization (BIO). 

The main focus of attention after insulin was the potential profit makers in the pharmaceutical industry: human growth hormone and what promised to be a miraculous cure for viral diseases, interferon. Cancer was a central target in the 1970s because increasingly the disease was linked to viruses. By 1980, a new company, Biogen, had produced interferon through recombinant DNA. The emergence of interferon and the possibility of curing cancer raised money in the community for research and increased the enthusiasm of an otherwise uncertain and tentative society. Moreover, to the 1970s plight of cancer was added AIDS in the 1980s, offering an enormous potential market for a successful therapy, and more immediately, a market for diagnostic tests based on monoclonal antibodies. By 1988, only five proteins from genetically engineered cells had been approved as drugs by the United States Food and Drug Administration (FDA): synthetic insulin, human growth hormone, hepatitis B vaccine, alpha-interferon, and tissue plasminogen activator (tPA), for lysis of blood clots. By the end of the 1990s, however, 125 more genetically engineered drugs would be approved.

The 2007–2008 global financial crisis led to several changes in the way the biotechnology industry was financed and organized. First, it led to a decline in overall financial investment in the sector, globally; and second, in some countries like the UK it led to a shift from business strategies focused on going for an initial public offering (IPO) to seeking a trade sale instead. By 2011, financial investment in the biotechnology industry started to improve again and by 2014 the global market capitalization reached $1 trillion.

Genetic engineering also reached the agricultural front. There has been tremendous progress since the market introduction of the genetically engineered Flavr Savr tomato in 1994. Ernst & Young reported that in 1998, 30% of the U.S. soybean crop was expected to be from genetically engineered seeds. In 1998, about 30% of the US cotton and corn crops were also expected to be products of genetic engineering.

Genetic engineering in biotechnology stimulated hopes for both therapeutic proteins, drugs and biological organisms themselves, such as seeds, pesticides, engineered yeasts, and modified human cells for treating genetic diseases. From the perspective of its commercial promoters, scientific breakthroughs, industrial commitment, and official support were finally coming together, and biotechnology became a normal part of business. No longer were the proponents for the economic and technological significance of biotechnology the iconoclasts. Their message had finally become accepted and incorporated into the policies of governments and industry.

Global trends

According to Burrill and Company, an industry investment bank, over $350 billion has been invested in biotech since the emergence of the industry, and global revenues rose from $23 billion in 2000 to more than $50 billion in 2005. The greatest growth has been in Latin America but all regions of the world have shown strong growth trends. By 2007 and into 2008, though, a downturn in the fortunes of biotech emerged, at least in the United Kingdom, as the result of declining investment in the face of failure of biotech pipelines to deliver and a consequent downturn in return on investment.

Regenerative design

From Wikipedia, the free encyclopedia

Regenerative design is a process-oriented whole systems approach to design. The term "regenerative" describes processes that restore, renew or revitalize their own sources of energy and materials. Regenerative design uses whole systems thinking to create resilient and equitable systems that integrate the needs of society with the integrity of nature.
 
Designers use systems thinking, applied permaculture design principles, and community development processes to design human and ecological systems. The development of regenerative design has been influenced by approaches found in biomimicry, biophilic design, ecological economics, and circular economics, as well as by social movements such as permaculture, transition and the new economy. Regenerative design can also refer to the process of designing systems such as restorative justice, rewilding and regenerative agriculture.

Feedback loop used in regenerative design

A new generation of designers are applying ecologically inspired design to agriculture, architecture, community planning, cities, enterprises, economics and ecosystem regeneration. Many designers use the resilient models observed in systems ecology in their design process and recognize that ecosystems are resilient largely because they operate in closed-loop systems. Using this model, regenerative design seeks feedback at every stage of the design process. Feedback loops are integral to regenerative systems, as understood through processes used in restorative practice and community development.

Regenerative design is interconnected with the approaches of systems thinking and with the New Economy movement. The 'new economy' considers that the current economic system needs to be restructured. The theory is based on the assumption that people and the planet should come first, and that it is human well-being, not economic growth, which should be prioritized.

Whereas the highest aim of sustainable development is to satisfy fundamental human needs today without compromising the possibility of future generations to satisfy theirs, the goal of regenerative design is to develop restorative systems that are dynamic and emergent, and are beneficial for humans and other species. This regeneration process is participatory, iterative and individual to the community and environment it is applied to. This process intends to revitalize communities, human and natural resources, and for some, society as a whole. 

In recent years regenerative design has been made possible on a larger scale using open source socio-technical platforms and technological systems such as those used in SMART cities. It includes community and city development processes such as gathering feedback, participatory governance, sortition and participatory budgeting.

History

Permaculture

The term permaculture was developed and coined by David Holmgren, then a graduate student at the Tasmanian College of Advanced Education's Department of Environmental Design, and Bill Mollison, senior lecturer in Environmental Psychology at University of Tasmania, in 1978. The word permaculture originally referred to "permanent agriculture", but was expanded to stand also for "permanent culture", as it was understood that social aspects were integral to a truly sustainable system as inspired by Masanobu Fukuoka’s natural farming philosophy. Regenerative design is integral to permaculture design. 

In 1974 David Holmgren and Bill Mollison first started working together to develop the theory and practice of permaculture. They met when Mollison spoke at a seminar at the Department of Environmental Design and began to work together. During their first three years together Mollison worked at applying their ideas, and Holmgren wrote the manuscript for what would become Permaculture One: a perennial agricultural system for human settlements as he completed his Environmental Design studies, and submitted it as the major reference for his thesis. He then handed the manuscript to Mollison for editing and additions, before it was published in 1978.

Regenerative organic agriculture

Robert Rodale, son of American organic pioneer and Rodale Institute founder J.I. Rodale, coined the term ‘regenerative organic agriculture.’ The term distinguished a kind of farming that goes beyond simply ‘sustainable.’ Regenerative organic agriculture “takes advantage of the natural tendencies of ecosystems to regenerate when disturbed. In that primary sense it is distinguished from other types of agriculture that either oppose or ignore the value of those natural tendencies.” This type of farming is marked by "tendencies towards closed nutrient loops, greater diversity in the biological community, fewer annuals and more perennials, and greater reliance on internal rather than external resources."

John T. Lyle (1934–1998), a landscape architecture professor, saw the connection between concepts developed by Bob Rodale for regenerative agriculture and the opportunity to develop regenerative systems for all other aspects of the world. While regenerative agriculture focused solely on agriculture, Lyle expanded its concepts and use to all systems. Lyle understood that when developing other types of systems, more complicated ideas such as entropy and emergy must be taken into consideration.

Regenerative design in the built environment

In 1976, Lyle challenged his landscape architecture graduate students at California State Polytechnic University, Pomona to "envision a community in which daily activities were based on the value of living within the limits of available renewable resources without environmental degradation." Over the next few decades an eclectic group of students, professors and experts from around the world and crossing many disciplines developed designs for an institute to be built at Cal Poly Pomona. In 1994, the Lyle Center for Regenerative Studies opened after two years of construction. In that same year Lyle's book Regenerative Design for Sustainable Development was published by Wiley. In 1995 Lyle worked with William McDonough at Oberlin College on the design of the Adam Joseph Lewis Center for Environmental Studies, completed in 2000. In 2002 McDonough's more popular and successful book, Cradle to Cradle: Remaking the Way We Make Things, was published, reiterating the concepts developed by Lyle. Swiss architect Walter R. Stahel developed approaches similar to Lyle's in the late 1970s and coined the term cradle-to-cradle design, later made popular by McDonough and Michael Braungart.

Sim Van der Ryn is an architect, author, and educator with more than 40 years of experience integrating ecological principles into the built environment. The author of eight publications, his most influential book, Ecological Design (1996), provides a framework for integrating human design with living systems. The book challenges designers to push beyond "green building" to create buildings, infrastructure and landscapes that truly restore and regenerate the surrounding ecosystems.

Green vs. sustainable vs. regenerative

An important distinction should be made between the words ‘green’, ‘sustainable’ and ‘regenerative’, and how each influences design.

Green Design

In the article Transitioning from green to regenerative design, Raymond J. Cole explores the concept of regenerative design and what it means in relation to ‘green’ and ‘sustainable’ design. Cole identifies eight key attributes of green buildings:
  1. Reduces damage to natural or sensitive sites
  2. Reduces the need for new infrastructure
  3. Reduces the impacts on natural features and site ecology during construction
  4. Reduces the potential environmental damage from emissions and outflows
  5. Reduces the contributions to global environmental damage
  6. Reduces resource use – energy, water, materials
  7. Minimizes the discomfort of building occupants
  8. Minimizes harmful substances and irritants within building interiors
By these eight key attributes, ‘green’ design is accomplished by reducing the harmful, damaging and negative impacts to both the environment and humans that result from the construction of the built environment. Another distinguishing characteristic of ‘green’ design is that it aims at broad market transformation; green building assessment frameworks and tools are therefore typically generic in nature.

Sustainable Design

Sustainable design lies within a balance of economic, environmental and social responsibilities

‘Sustainable’ and ‘green’ are for the most part used interchangeably; however, there is a slight distinction between them. ‘Green’ design centers specifically on decreasing the environmental impacts of human development, whereas sustainability can be viewed through an environmental, economic or social lens. The implication is that sustainability can be incorporated into all three aspects of the Triple Bottom Line: people, planet, profit.

The definition of sustainable or sustainability has been widely accepted as the ability to meet the needs of the current generation without depleting the resources needed to meet the needs of future generations. It “promotes a bio-centric view that places the human presence within a larger natural context, and focuses on constraints and on fundamental values and behavioral change.” David Orr defines two approaches to sustainability in his book Ecological Literacy: “technological sustainability” and “ecological sustainability.” “Technological sustainability” emphasizes the anthropocentric view by focusing on making technological and engineering processes more efficient whereas “ecological sustainability" emphasizes the bio-centric view and focuses on enabling and maintaining the essential and natural functions of ecosystems.

The sustainability movement has gained momentum over the last two decades, with interest from all sectors increasing rapidly each year. In the book Regenerative Development and Design: A Framework for Evolving Sustainability, the Regenesis Group asserts that the sustainability “debate is shifting from whether we should work on sustainability to how we’re going to get it done.” Sustainability was first viewed as a “steady state of equilibrium” in which there was a balance between inputs and outputs, with the idea that sustainable practices meant future resources were not compromised by current processes. As this idea of sustainability and sustainable building has become more widely accepted and adopted, the ideas of “net-zero” and even “net-positive” building have become topics of interest. These relatively newer concepts focus on positively impacting the surrounding environment of a building rather than simply reducing its negative impacts.

Regenerative Design

J.T. Gibberd argued that “a building is an element set within wider human endeavors and is necessarily dependent on this context. Thus, a building can support sustainable patterns of living, but in and of itself cannot be sustainable.” Regenerative design goes a step further than sustainable design. In a regenerative system, feedback loops allow for adaptability, dynamism and emergence to create and develop resilient and flourishing ecosystems. Cole highlights that a key distinction of regenerative design is its recognition and emphasis of the “co-evolutionary, partnered relationship between human and natural systems,” and thus the importance of project location and place. Bruno Duarte Dias asserts that regenerative design goes beyond the traditional weighing and measuring of the various environmental, social and economic impacts of sustainable design and instead focuses on mapping relationships. Dias agrees with Cole, identifying three fundamental aspects of regenerative design: understanding place and its unique patterns, designing for harmony within place, and co-evolution.

Fundamental aspects of regenerative design

Co-evolution of humans & nature

Regenerative design is built on the idea that humans and the built environment exist within natural systems, and thus the built environment should be designed to co-evolve with the surrounding natural environment. Dias asserts that a building should serve as a “catalyst for positive change.” The project does not end with the completion of construction and the certificate of occupancy; instead, the building serves to enhance the relationships between people, the built environment and the surrounding natural systems over a long period of time.

Designing in context of place

Understanding the location of the project, the unique dynamics of the site and the relationship of the project to the living natural systems is a fundamental concept in the regenerative design process. In their article Designing from place: a regenerative framework and methodology, Pamela Mang and Bill Reed define place as a "unique, multilayered network of living systems within a geographic region that results from the complex interactions, through time, of the natural ecology (climate, mineral and other deposits, soil, vegetation, water and wildlife, etc.) and culture (distinctive customs, expressions of values, economic activities, forms of association, ideas for education, traditions, etc.)" A systems-based approach to design in which the design team looks at the building within the larger system is crucial.

The Gardener Analogy

Beatrice Benne and Pamela Mang emphasize the importance of the distinction between working with a place rather than working on a place within the regenerative design process. They use the analogy of a gardener to redefine the role of the designer in the building process. “A gardener does not ‘make’ a garden. Instead, a skilled gardener is one who has developed an understanding of the key processes operating in the garden,” and thus the gardener “makes judicious decisions on how and where to intervene to reestablish the flows of energy that are vital to the health of the garden.” In the same way, a designer does not create a thriving ecosystem; rather, they make decisions that indirectly influence whether the ecosystem degrades or flourishes over time. This requires designers to push beyond the prescriptive and narrow way of thinking they have been taught and to use complex systems thinking that will at times be ambiguous and overwhelming. This includes accepting that solutions do not lie exclusively in technological advancements but are instead a combination of sustainable technologies and an understanding of the natural flow of resources and underlying ecological processes. Benne and Mang identify these challenges and state that the most difficult will be shifting from a mechanistic to an ecological worldview. The tendency is to view a building in terms of the physical processes of its structure rather than the complex network of relationships the building has with the surrounding environment, including the natural systems and the human community.

Conservation vs. preservation

Regenerative design places more importance on conservation and biodiversity than on preservation, recognizing that humans are a part of natural ecosystems. To exclude people is to create dense areas that destroy pockets of existing ecosystems, while preserving other pockets without allowing them to change naturally over time.

Regenerative design frameworks

A few regenerative design frameworks have been developed in recent years. Unlike many green building rating systems, these frameworks are not prescriptive checklists; instead they are conceptual and meant to guide dialogue throughout the design process. They should not be used exclusively, but rather in conjunction with existing green building rating systems such as LEED, BREEAM or the Living Building Challenge.

SPeAR

Sustainable Project Appraisal Routine (SPeAR) is a decision-making tool developed by software and sustainability experts at Arup. The framework incorporates key categories including transportation, biodiversity, culture, employment and skills.

REGEN

The regenerative design framework REGEN was proposed by Berkebile Nelson Immenschuh McDowell (BNIM), a US architectural firm, for the US Green Building Council (USGBC). The tool was intended to be a web-based, data-rich framework to guide dialogue between professionals in the design and development process, as well as to "address the gap in information and integration of information." The framework has three components, sketched in the short example after this list:
  1. Framework - the framework encourages systems thinking and collaboration as well as linking individual strategies to the goals of the project as a whole
  2. Resources - the framework includes place-based data and information for project teams to use
  3. Projects - the framework includes examples of successful projects that have incorporated regenerative ideas into the design as models for project teams
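Purely as an illustration of how these three components fit together (REGEN was never released as working software, so every class and field name below is hypothetical), the structure might be modeled in Python as:

    from dataclasses import dataclass, field

    @dataclass
    class Resource:
        # Place-based data for a project team (hypothetical fields).
        name: str
        region: str
        notes: str = ""

    @dataclass
    class ExampleProject:
        # A successful project that incorporated regenerative ideas.
        name: str
        strategies: list[str] = field(default_factory=list)

    @dataclass
    class RegenFramework:
        # Links individual strategies to the goals of the project as a
        # whole, and bundles supporting resources and example projects.
        goals: list[str] = field(default_factory=list)
        strategy_to_goal: dict[str, str] = field(default_factory=dict)
        resources: list[Resource] = field(default_factory=list)
        projects: list[ExampleProject] = field(default_factory=list)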

LENSES

Living Environments in Natural, Social and Economic Systems (LENSES) was created by Colorado State University's Institute for the Built Environment. The framework is intended to be process-based rather than product-based. The goals of the framework include:
  • to direct the development of eco-regional guiding principles for living built environments
  • to illustrate connections and relationships between sustainability issues
  • to guide collaborative dialogue
  • to present complex concepts quickly and effectively to development teams and decision-makers
The framework consists of three "lenses": the Foundational Lens, the Aspects of Place Lens and the Flows Lens. The lenses work together to guide the design process, emphasizing guiding principles and core values, the delicate relationship between building and place, and the way elements flow through natural and human systems.
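As a minimal sketch of how the three lenses might structure a collaborative dialogue (the lens names come from LENSES itself; the guiding questions and the code are hypothetical):

    # Hypothetical mapping of each LENSES lens to the concern it
    # foregrounds; the questions paraphrase the description above.
    LENSES = {
        "Foundational": "What guiding principles and core values apply?",
        "Aspects of Place": "How do building and place relate?",
        "Flows": "How do elements flow through natural and human systems?",
    }

    def guide_dialogue(project: str) -> None:
        # Walk a development team through each lens in turn.
        for lens, question in LENSES.items():
            print(f"{project} - {lens} lens: {question}")

    guide_dialogue("Community centre retrofit")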

Perkins+Will

Perkins+Will is a global architecture and design firm with a strong focus on sustainability - by September 2012 the firm had completed over 150 LEED-certified projects. It was at the 2008 Healthcare Center for Excellence meeting in Vancouver, British Columbia that the decision was made to develop a regenerative design framework in an effort to generate broader conversation and inspirational ideas. Later that year, a regenerative design framework that could be used by all market sectors, including healthcare, education, commercial and residential, was developed by Perkins+Will in conjunction with the University of British Columbia. The framework had four primary objectives:
  1. to initiate a different and expanded dialogue between the design team members and with the client and users, moving beyond the immediate building and site boundaries
  2. to emphasize the opportunities of developed sites and buildings to relate to, maintain, and enhance the health of the ecological and human systems in the place in which they are situated
  3. to highlight the ecological and human benefits that accrue from regenerative approaches
  4. to facilitate the broader integration of allied design professionals - urban planners, landscape architects and engineers, together with other disciplines (ecologists, botanists, hydrologists, etc.) typically not involved in buildings - in an interdisciplinary design process
The structure of the framework consists of four primary themes:
  1. Representation of human and natural systems - the framework represents the interactions between humans and the natural environment and is built on the notion that human systems exist only within natural systems. Human needs are further divided into four categories: individual human health and well-being, social connectivity and local community, cultural vitality and sense of place, and healthy community.
  2. Representation of resource flows - the framework recognizes that human systems and natural systems are impacted by the way a building relates to the land and engages resource flows. These resource flows include energy, water and materials.
  3. Resource cycles - within the framework, resource flows illustrate how resources flow in and out of human and natural cycles, whereas resource cycles focus on how resources move through human systems. The four sub-cycles included in the framework are produce, use, recycle and replenish (see the sketch after this list).
  4. Direct and indirect engagement with flows - the framework distinguishes between the direct and indirect ways a building engages with resource flows. Direct engagement includes approaches and strategies that occur within the bounds of the project site. Indirect engagement extends beyond the boundaries of the project site and can thus be implemented on a much larger scale such as purchasing renewable energy credits.
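To make the four sub-cycles in theme 3 concrete, here is a minimal sketch; the phase names come from the framework, but treating them as a closed loop in the order produce, use, recycle, replenish is an assumption:

    from enum import Enum

    class CyclePhase(Enum):
        PRODUCE = "produce"
        USE = "use"
        RECYCLE = "recycle"
        REPLENISH = "replenish"

    # Assumed ordering of the closed loop; the source lists the four
    # sub-cycles without spelling out their sequence.
    NEXT = {
        CyclePhase.PRODUCE: CyclePhase.USE,
        CyclePhase.USE: CyclePhase.RECYCLE,
        CyclePhase.RECYCLE: CyclePhase.REPLENISH,
        CyclePhase.REPLENISH: CyclePhase.PRODUCE,
    }

    def trace(start: CyclePhase, steps: int) -> list[CyclePhase]:
        # Follow a resource (energy, water or materials) around the cycle.
        path = [start]
        for _ in range(steps):
            path.append(NEXT[path[-1]])
        return path

    print([p.value for p in trace(CyclePhase.PRODUCE, 4)])
    # ['produce', 'use', 'recycle', 'replenish', 'produce']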
Case study - Van Dusen Botanical Garden

The Visitor Centre at the Van Dusen Botanical Garden in Vancouver, British Columbia was designed in parallel with the regenerative design framework developed by Perkins+Will. The site of the new visitor centre was 17,575 m² and the building itself 1,784 m². A four-stage process was identified: education and project aspirations, goal setting, strategies and synergies, and whole-systems approaches. Each stage raises important questions that require the design team to define place and look at the project in a much larger context, identify key resource flows and understand the complex holistic systems, determine synergistic relationships, and identify approaches that provoke the coevolution of both human and ecological systems. The visitor centre was the first project on which Perkins+Will collaborated with an ecologist. Incorporating an ecologist on the project team allowed the team to approach the project at a larger scale and understand how the building and its specific design would interact with the surrounding ecosystem through its energy, water and environmental performance.

Regenerative design for retrofitting existing buildings

Importance and implications

It is estimated that the majority of the buildings that will exist in the year 2050 have already been built. Additionally, existing buildings account for roughly 40 percent of total energy consumption in the United States. This means that in order to meet climate change goals - such as those of the Paris Agreement - and reduce greenhouse gas emissions, existing buildings need to be updated to reflect sustainable and regenerative design strategies.

Strategies

Craft et al. attempted to create a regenerative design model that could be applied to retrofitting existing buildings, prompted by the large number of existing buildings projected to still be in use in 2050. Their retrofit model follows a ‘Levels of Work’ framework consisting of four levels said to be pertinent to increasing a building's “vitality, viability and capacity for evolution,” all of which require a deep understanding of place and of how the building interacts with natural systems. The four levels - regenerate, improve, maintain and operate - are each classified as either proactive or reactive.
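A minimal sketch of that classification follows; the level names are from Craft et al., but the passage above does not say which levels are proactive and which are reactive, so the assignment below (regenerate and improve as proactive, maintain and operate as reactive) is an assumption:

    from enum import Enum

    class Mode(Enum):
        PROACTIVE = "proactive"
        REACTIVE = "reactive"

    # Assumed split: the source only states that the four levels are
    # classified as either proactive or reactive.
    LEVELS_OF_WORK = {
        "regenerate": Mode.PROACTIVE,
        "improve": Mode.PROACTIVE,
        "maintain": Mode.REACTIVE,
        "operate": Mode.REACTIVE,
    }

    def levels_in_mode(mode: Mode) -> list[str]:
        # List the levels of work that fall under the given mode.
        return [lvl for lvl, m in LEVELS_OF_WORK.items() if m is mode]

    print(levels_in_mode(Mode.PROACTIVE))  # ['regenerate', 'improve']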

Case Study

University of New South Wales

Craft et al. present a case study in which the chemical science building at the University of New South Wales was retrofitted to incorporate these regenerative design principles. The strategy uses biophilia to improve occupants' health and wellbeing by strengthening their connection to nature. The facade acts as a “vertical ecosystem,” providing habitats for indigenous wildlife to increase biodiversity, and balconies were added to strengthen the connection between humans and nature.

Regenerative agriculture

Regenerative farming, or 'regenerative agriculture', calls for agricultural systems that produce food in a way that benefits both production and the ecology of the environment. It draws on the science of systems ecology and on design and application through permaculture. As understanding of its benefits to human biology and to the ecological systems that sustain us has increased, so has the demand for organic food. Organic food grown using regenerative and permaculture design increases biodiversity and is used to develop business models that regenerate communities. Food that is organic is not necessarily regenerative, however, since organic production alone does not explicitly seek to maximize the biodiversity and resilience of the environment and the workforce. Regenerative agriculture grows organic produce through ethical supply chains, zero-waste policies, fair wages, staff development and wellbeing, and in some cases cooperative and social enterprise models. It seeks to benefit staff along the supply chain, customers and ecosystems, with the outcome of human and ecological restoration and regeneration.

Size of regenerative systems

The size of a regenerative system affects the complexity of the design process: the smaller a system is, the more likely it is to be resilient and regenerative. Multiple small regenerative systems can be combined into larger regenerative systems, helping to supply multiple human-inclusive ecological systems.

Operator (computer programming)

From Wikipedia, the free encyclopedia