
Bill & Melinda Gates Foundation

From Wikipedia, the free encyclopedia
 
Abbreviation: BMGF
Formation: 2000
Founders: Bill Gates, Melinda Gates
Type: Private foundation
Purpose: Healthcare, education, ending poverty
Headquarters: Seattle, Washington, US
Area served: Worldwide
Method: Donations, grants
Key people: Bill and Melinda Gates (trustees), Warren Buffett (trustee), William H. Gates Sr. (co-chair), Susan Desmond-Hellmann (CEO)
Endowment: US$50.7 billion (2017)
Employees: 1,541 (2017)
Website: gatesfoundation.org
Formerly called: William H. Gates Foundation (1994–1999)

Bill & Melinda Gates Foundation (BMGF), also known as the Gates Foundation, is a private foundation founded by Bill and Melinda Gates. It was launched in 2000 and is said to be the largest private foundation in the United States, holding US$50.7 billion in assets. The primary aims of the foundation are, globally, to enhance healthcare and reduce extreme poverty, and, in the US, to expand educational opportunities and access to information technology. The foundation, based in Seattle, Washington, is controlled by its three trustees: Bill and Melinda Gates, and Warren Buffett. Other principal officers include Co-Chair William H. Gates, Sr. and Chief Executive Officer Susan Desmond-Hellmann.

It had an endowment of US$50.7 billion as of December 31, 2017. The scale of the foundation and the way it seeks to apply business techniques to giving makes it one of the leaders in venture philanthropy, though the foundation itself notes that the philanthropic role has limitations. In 2007, its founders were ranked as the second most generous philanthropists in the US, and Warren Buffett the first. As of May 16, 2013, Bill Gates had donated US$28 billion to the foundation. Since its founding, the foundation has endowed and supported a broad range of social, health, and education developments including the establishment of the Gates Cambridge Scholarships at Cambridge University.

History

Images: the foundation's campus as seen from the Space Needle, and its front and rear buildings.

In 1994, the foundation was formed as the William H. Gates Foundation. Over the following years, its funding grew to US$2 billion. On June 15, 2006, Gates announced his plan to transition out of a day-to-day role with Microsoft, effective July 31, 2008, to allow him to devote more time to working with the foundation.

In 2005, Bill and Melinda Gates, along with the musician Bono, were named by Time as Persons of the Year 2005 for their outstanding charitable work. In the case of Bill and Melinda Gates, the work referenced was that of this foundation.

In April 2010, Gates was invited to visit and speak at the Massachusetts Institute of Technology where he asked the students to take on the hard problems of the world in their futures. He also explained the nature and philosophy of his philanthropic endeavors.

In 2010, the foundation's founders started the Commission on Education of Health Professionals for the 21st Century, whose report was entitled "Transforming education to strengthen health systems in an interdependent world."

A 2011 survey of grantees found that many believed the foundation did not make its goals and strategies clear and sometimes did not understand those of the grantees; that the foundation's decision- and grantmaking procedures were too opaque; and that its communications could be more consistent and responsive. The foundation's response was to improve the clarity of its explanations, make "orientation calls" to grantees upon awarding grants, tell grantees who their foundation contact is, give timely feedback when they receive a grantee report, and establish a way for grantees to provide anonymous or attributed feedback to the foundation. The foundation also launched a podcast series.

In 2013, Hillary Clinton launched a partnership between the foundation and the Clinton Foundation to gather and study data on the progress of women and girls around the world since the 1995 United Nations Fourth World Conference On Women in Beijing. This is called "No Ceilings: The Full Participation Project."

Warren Buffett donation

On June 25, 2006, Warren Buffett (then the world's richest person, estimated worth of US$62 billion as of April 16, 2008) pledged to give the foundation approximately 10 million Berkshire Hathaway Class B shares spread over multiple years through annual contributions, with the first year's donation of 500,000 shares being worth approximately US$1.5 billion. Buffett set conditions so that these contributions do not simply increase the foundation's endowment, but effectively work as a matching contribution, doubling the Foundation's annual giving. Bloomberg News noted, "Buffett's gift came with three conditions for the Bill & Melinda Gates Foundation: Bill or Melinda Gates must be alive and active in its administration; it must continue to qualify as a charity; and each year it must give away an amount equal to the previous year's Berkshire gift, plus an additional amount equal to 5 percent of net assets. Buffett gave the foundation two years to abide by the third requirement." The Gates Foundation received 5% (500,000) of the shares in July 2006 and will receive 5% of the remaining earmarked shares in July of each following year (475,000 in 2007, 451,250 in 2008). In July 2018, Buffett announced another donation of his company's Class B stock, this time worth $2 billion, to the Gates Foundation.
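To make the arithmetic concrete, here is a minimal Python sketch (ours, not the foundation's; it assumes only the 10-million-share earmark quoted above) showing how a fixed 5% draw on the remaining earmarked shares yields exactly the declining annual gifts reported in the text:

```python
# Sketch of the pledge schedule: each July the foundation receives 5% of
# the shares still earmarked, so the annual gift shrinks geometrically.
# The 10-million-share earmark and 5% rate come from the paragraph above.

def pledge_schedule(earmarked: float = 10_000_000, years: int = 3):
    """Yield (year, shares_given, shares_remaining) for each annual gift."""
    remaining = earmarked
    for year in range(2006, 2006 + years):
        gift = remaining * 0.05          # 5% of what is still earmarked
        remaining -= gift
        yield year, gift, remaining

for year, gift, remaining in pledge_schedule():
    print(f"{year}: {gift:,.0f} shares given; {remaining:,.0f} still earmarked")
# 2006: 500,000 ... 2007: 475,000 ... 2008: 451,250  (matches the text)
```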

Activities

Program areas and grant database

To maintain its status as a charitable foundation, the Bill & Melinda Gates Foundation must donate funds equal to at least 5 percent of its assets each year. The foundation is organized into the following divisions under chief executive officer Susan Desmond-Hellmann, who "sets strategic priorities, monitors results, and facilitates relationships with key partners":
  • Global Development Division
  • Global Health Division
  • United States Division
  • Global Policy & Advocacy Division
  • Global Growth & Opportunity Division
The foundation maintains a publicly available online database of grants on its website, which includes, for each grant, the name of the grantee organization, the purpose of the grant, and the amount.

Open access policy

In November 2014, the Gates Foundation announced that it was adopting an open access (OA) policy for publications and data, "to enable the unrestricted access and reuse of all peer-reviewed published research funded by the foundation, including any underlying data sets". The move was widely applauded by those working in capacity building and knowledge sharing, and the policy's terms have been called the most stringent among similar OA policies. As of January 1, 2015, the policy applies to all new agreements. In March 2017, it was confirmed that Gates Open Research, the foundation's open access publishing platform, would be modeled on Wellcome Open Research, launched in 2016 by the Wellcome Trust in partnership with F1000 Research.

Funds for grants in developing countries

The following table lists the Bill & Melinda Gates Foundation's committed funding as recorded in their International Aid Transparency Initiative (IATI) publications. The Gates Foundation announced in October 2013 that it would join the IATI. The IATI publications only include a subset of Gates Foundation grants (mainly excluding grants to developed countries), and contain few grants before 2009 (which are excluded from the table). The Gates Foundation states on the IATI Registry site that "reporting starts from 2009 and excludes grants related to our US programs and grants that if published could harm our employees, grantees, partners, or the beneficiaries of our work".


Committed funding (US$ millions)
DAC 5-digit sector | 2009 | 2010 | 2011 | 2012 | 2013 | 2014 | 2015 | Sum
Infectious disease control | 256.9 | 720.3 | 462.8 | 528.7 | 1248.3 | 1271.8 | 1097.5 | 5586.4
Malaria control | 324.5 | 101.7 | 133.6 | 75.5 | 302.4 | 377.6 | 140.8 | 1456.1
STD control including HIV/AIDS | 175.5 | 26.9 | 291.4 | 199.7 | 184.4 | 264.4 | 165.7 | 1308.0
Tuberculosis control | 69.2 | 211.1 | 59.5 | 273.9 | 135.3 | 100.1 | 244.8 | 1094.0
Reproductive health care | 173.8 | 66.8 | 77.4 | 165.2 | 84.9 | 207.6 | 130.0 | 905.8
Agricultural research | 84.7 | 27.8 | 196.2 | 192.8 | 207.1 | 14.7 | 83.9 | 807.2
Family planning | 104.5 | 21.2 | 21.4 | 49.3 | 165.0 | 145.8 | 181.7 | 688.9
Health policy and administrative management | 119.3 | 14.3 | 145.7 | 75.5 | 61.1 | 113.4 | 130.3 | 659.5
Agricultural development | 5.2 | 30.0 | 0.0 | 35.0 | 0.0 | 325.1 | 86.1 | 481.3
Agricultural policy and administrative management | 72.9 | 30.0 | 77.5 | 77.1 | 86.2 | 19.7 | 96.9 | 460.3
Promotion of development awareness | 47.2 | 45.0 | 35.5 | 41.7 | 124.4 | 61.7 | 80.7 | 436.2
Basic health care | 22.3 | 23.9 | 43.7 | 73.2 | 1.7 | 45.6 | 206.3 | 416.7
Basic nutrition | 19.2 | 15.7 | 40.9 | 51.5 | 63.7 | 55.9 | 148.2 | 395.2
Basic sanitation | 10.1 | 34.9 | 82.9 | 74.9 | 59.1 | 48.7 | 64.9 | 375.5
Financial policy and administrative management | 29.0 | 18.4 | 9.8 | 8.9 | 70.1 | 32.9 | 53.4 | 222.5
Other | 487.5 | 273.8 | 2208.9 | 260.2 | 332.1 | 433.3 | 2195.7 | 6191.5
Total | 2001.9 | 1662.0 | 3887.2 | 2183.0 | 3125.9 | 3518.4 | 5106.6 | 21485.1

The following table lists the top receiving organizations to which the Bill & Melinda Gates Foundation has committed funding, between 2009 and 2015. The table again only includes grants recorded in the Gates Foundation's IATI publications.

Organization Amount (US$ millions)
GAVI Alliance 3,152.8
World Health Organization 1,535.1
The Global Fund to Fight AIDS, Tuberculosis and Malaria 777.6
PATH 635.2
United States Fund for UNICEF 461.1
The Rotary Foundation of Rotary International 400.1
International Bank for Reconstruction and Development 340.0
Global Alliance for TB Drug Development 338.4
Medicines for Malaria Venture 334.1
PATH Vaccine Solutions 333.4
UNICEF Headquarters 277.6
Johns Hopkins University 265.4
Aeras 227.6
Clinton Health Access Initiative Inc 199.5
International Development Association 174.7
CARE 166.2
World Health Organization Nigeria Country Office 166.1
Agence française de développement 165.0
Centro Internacional de Mejoramiento de Maíz y Trigo 153.1
Cornell University 146.7
Alliance for a Green Revolution in Africa 146.4
United Nations Foundation 143.0
University of Washington Foundation 138.2
Foundation for the National Institutes of Health 136.2
Emory University 123.2
University of California San Francisco 123.1
Population Services International 122.5
University of Oxford 117.8
International Food Policy Research Institute 110.7
International Institute of Tropical Agriculture 104.8

Financials

The foundation explains on its website that its trustees divided the organization into two entities: the Bill & Melinda Gates Foundation and the Bill & Melinda Gates Foundation Trust. The foundation, based in Seattle, US, "focuses on improving health and alleviating extreme poverty", and its trustees are Bill and Melinda Gates and Warren Buffett. The trust manages "the investment assets and transfer proceeds to the foundation as necessary to achieve the foundation's charitable goals"; it holds the assets of Bill and Melinda Gates, who are the sole trustees, and receives contributions from Buffett.

The foundation posts its audited financial statements and 990-PF forms on the "Financials" section of its website as they become available. At the end of 2012, the foundation registered a cash sum of US$4,998,000, down from US$10,810,000 at the end of 2011. Unrestricted net assets at the end of 2012 were worth US$31,950,613,000, while total assets were worth US$37,176,777,000.

Trust Investments

As of 2016, the foundation appeared to have the following stakes in investments: Arcos Dorados Holdings ~ 2.36% stake, AutoNation, Inc. ~ 1.56% stake, Berkshire Hathaway Class B stock ~ 6.59% stake, British Petroleum ~ 0.24% stake (US$372 million[45]), Canadian National Railway Co. ~ 2.06% stake, Caterpillar, Inc. ~ 1.77% stake, Coca-Cola Co. ~ 0.77% stake, Crown Castle International Corp. ~ 1.60% stake, Exxon Mobil ~ 0.19% stake, FedEx Corp. ~ 0.97% stake, FEMSA ~ 3.06% stake, Liberty Global ~ 2.12% stake, McDonald's Corp. ~ 1.09% stake, Republic Services, Inc. ~ 0.37% stake, Shell ~ US$5.5 million[45], Televisa ~ 2.9% stake, Wal-Mart ~ 0.36% stake[46], Waste Management, Inc. ~ 3.97% stake.

Global development division

Christopher Elias leads the foundation's efforts to combat extreme poverty through grants as president of the Global Development Program.

In March 2006, the foundation announced a US$5 million grant for the International Justice Mission (IJM), a human rights organization based in Washington, D.C., US, to work in the area of sex trafficking. The official announcement explained that the grant would allow the IJM to "create a replicable model for combating sex trafficking and slavery" that would involve the opening of an office, following research, in a region with high rates of sex trafficking. The office would operate for three years for the following purposes: "conducting undercover investigations, training law enforcement, rescuing victims, ensuring appropriate aftercare, and seeking perpetrator accountability".

The IJM used the grant money to found "Project Lantern" and established an office in the Philippines city of Cebu. In 2010, the results of the project were published, in which the IJM stated that Project Lantern had led to "an increase in law enforcement activity in sex trafficking cases, an increase in commitment to resolving sex trafficking cases among law enforcement officers trained through the project, and an increase in services – like shelter, counseling and career training – provided to trafficking survivors". At the time that the results were released, the IJM was exploring opportunities to replicate the model in other regions.

Gates Cambridge Scholarships

In October 2000, William Gates established the Gates Cambridge Scholarships, which allow students and scholars from the U.S. and around the world to study at the University of Cambridge, one of the top universities in the world. The Gates Cambridge Scholarship has often been compared to the Rhodes Scholarship, given its similarly international scope and substantial endowment. In 2000, the Gates Foundation endowed the scholarship trust with $210 million to help outstanding graduate students from outside the United Kingdom study at the University of Cambridge. The Gates Foundation has continued to contribute funds to expand the scholarship, making it one of the largest and best-endowed scholarships in the world. The scholarship remains extremely competitive, accepting fewer than 0.3% of applicants. Each year, approximately 100 new graduate students from around the world receive funding to study at Cambridge.

Financial assistance

  • Alliance for Financial Inclusion (AFI): A US$35 million grant for the AFI supports a coalition of countries from the developing world to create savings accounts, insurance, and other financial services that are made available to people living on less than $2 per day.
  • Financial Access Initiative: A US$5 million grant allows Financial Access Initiative to conduct field research and answer important questions about microfinance and financial access in impoverished countries around the world.
  • Pro Mujer: A five-year US$3.1 million grant to Pro Mujer—a microfinance network in Latin America combining financial services with healthcare for the poorest women entrepreneurs—will be used to research new opportunities for the poorest segment of the Latin American microfinance market.
  • Grameen Foundation: A US$1.5 million grant allows Grameen Foundation to approve more microloans that support Grameen's goal of helping five million additional families, and successfully freeing 50 percent of those families from poverty within five years.

Agricultural Development

Water, sanitation and hygiene

The "sanitation value chain" used by the Gates Foundation to illustrate their approach to sanitation, showing collection, transport, treatment and reuse.
 
The Gates Foundation created this video to advocate for increased innovation for toilets and everything they are connected to.
 
Example for technology innovation: The off-grid Nano Membrane Toilet of Cranfield University - prototype on display at Reinvent the Toilet Fair in Delhi, India.
 
The Water, Sanitation and Hygiene (WASH) program of the Bill & Melinda Gates Foundation was launched in mid-2005 as a "Learning Initiative", and became a full-fledged program under the Global Development Division in early 2010. The Foundation has since 2005 undertaken a wide range of efforts in the WASH sector involving research, experimentation, reflection, advocacy, and field implementation. In 2009, the Foundation decided to refocus its WASH effort mainly on sustainable sanitation services for the poor, using non-piped sanitation services (i.e. without the use of sewers), and less on water supply. This was because the sanitation sector was generally receiving less attention from other donors and from governments, and because the Foundation believed it had the potential to make a real difference through strategic investments.

In mid-2011, the Foundation announced in its new "Water, Sanitation, Hygiene Strategy Overview" that its funding now focuses primarily on sanitation, particularly in sub-Saharan Africa and South Asia, because access to improved sanitation is lowest in those regions. Since 2011, its grant-making has focused on sanitation science and technology ("transformative technologies"), delivery models at scale, urban sanitation markets, building demand for sanitation, measurement and evaluation, and policy, advocacy and communications.

In mid-2011, the foundation stated that it had committed more than US$265 million to the water, sanitation, and hygiene sector over the previous five years, i.e. since about 2006. For the period from about 2008 to mid-2015, grants awarded to water, sanitation and hygiene projects totaled around US$650 million, according to the publicly available grant database.

Example of low-tech toilet development being funded: the Earth Auger, a urine-diverting dry toilet from Ecuador/USA.

Improved sanitation in the developing world is a global need, but a neglected priority, as shown by data collected by the Joint Monitoring Programme for Water Supply and Sanitation (JMP) of UNICEF and WHO, which is tasked with monitoring progress towards the Millennium Development Goal (MDG) relating to drinking water and sanitation. About one billion people have no sanitation facility whatsoever and continue to defecate in gutters, behind bushes or in open water bodies, with no dignity or privacy. This practice, called open defecation, poses significant health risks; India has the highest number of people practicing it, around 600 million. The Foundation has been funding many sanitation research and demonstration projects in India since about 2011.

Technology Innovations

In 2011, the foundation launched a program called the "Reinvent the Toilet Challenge" to promote the development of innovations in toilet design to benefit the 2.5 billion people who do not have access to safe and effective sanitation. The program has generated significant interest from the mainstream media. It was complemented by a program called "Grand Challenges Explorations" (2011 to 2013, with some follow-up grants reaching until 2015), which involved grants of US$100,000 each in the first round. Both funding schemes explicitly excluded project ideas that relied on centralized sewerage systems or were not compatible with developing-country contexts.

Microbial fuel cell stack that converts urine into electricity (research by University of the West of England, UK)

Since the launch of the "Reinvent the Toilet Challenge", more than a dozen research teams, mainly at universities in the U.S., Europe, India, China and South Africa, have received grants to develop innovative on-site and off-site waste treatment solutions for the urban poor. The grants were on the order of US$400,000 for their first phase, typically followed by US$1–3 million for their second phase; many of the teams investigated resource recovery or processing technologies for excreta or fecal sludge.

The "Reinvent the Toilet Challenge" is focused on "reinventing the flush toilet". The aim was to create a toilet that not only removes pathogens from human excreta, but also recovers resources such as energy, clean water, and nutrients (a concept also known as reuse of excreta). It should operate "off-the-grid" without connections to water, sewer, or electrical networks. Finally, it should costs less than 5 US-cents per user per day.

High-tech toilets for tackling the growing public health problem of human waste are gaining increasing attention, but this focus on a "technology fix" has also been criticized by many in the sector. However, low-tech solutions may be more practical in poor countries, and research is also funded by the foundation for such toilets.

The Reinvent the Toilet Challenge is a long-term research and development effort to develop a hygienic, stand-alone toilet. This challenge is being complemented by another investment program to develop new technologies for improved pit latrine emptying (called by the foundation the "Omni-Ingestor") and fecal sludge processing (called "Omni-Processor"). The aim of the "Omni Processor" is to convert excreta (for example fecal sludge) into beneficial products such as energy and soil nutrients with the potential to develop local business and revenue.

Examples of transformative technologies research

  • About 200 sanitation projects in many different countries and at various scales (some with a technology focus, some focused on market development or policy and advocacy) have received funding from the foundation since 2008.
  • The University of KwaZulu-Natal in Durban, South Africa, was awarded US$1.6 million by the Gates Foundation in 2014 to act as a hub for sanitation researchers and product developers.
  • One example of an Omni-Processor is a combustion-based system designed to turn fecal sludge into energy and drinking water. The development of this particular prototype by U.S.-based company Janicki Bioenergy drew media attention to the sanitation crisis and the work of the foundation after Bill Gates drank water produced from the process.
  • Examples from the Reinvent the Toilet Challenge include: scientists at the University of Colorado Boulder were given funding of US$1.8 million to develop a prototype toilet that uses solar heat to treat fecal matter and produce biochar, and RTI International has received funding since 2012 to develop a toilet based on electrochemical disinfection and solid waste combustion.

Other Global Initiatives

Some examples include:

Global health division

Since 2011, the president of the Global Health Program has been Trevor Mundel.
  • The Global Fund to Fight AIDS, Tuberculosis and Malaria: The foundation has donated more than $6.6 billion for global health programs, including over US$1.3 billion donated as of 2012 on malaria alone, greatly increasing the dollars spent per year on malaria research. Before the Gates efforts on malaria, malaria drugmakers had largely given up on producing drugs to fight the disease, and the foundation is the world's largest donor to research on diseases of the poor. With the help of Gates-funded vaccination drives, deaths from measles in Africa have dropped by 90 percent since 2000.
The foundation has donated billions of dollars to help sufferers of AIDS, tuberculosis and malaria, protecting millions of children from death at the hands of preventable diseases. However, a 2007 investigation by The Los Angeles Times claimed there are three major unintended consequences with the foundation's allocation of aid. First, sub-Saharan Africa already suffered from a shortage of primary doctors before the arrival of the Gates Foundation, but "by pouring most contributions into the fight against such high-profile killers as AIDS, Gates grantees have increased the demand for specially trained, higher-paid clinicians, diverting staff from basic care" in sub-Saharan Africa. This "brain drain" adds to the existing doctor shortage and pulls away additional trained staff from children and those suffering from other common killers. Second, "the focus on a few diseases has shortchanged basic needs such as nutrition and transportation". Third, "Gates-funded vaccination programs have instructed caregivers to ignore – even discourage patients from discussing – ailments that the vaccinations cannot prevent".

In response, the Gates Foundation has said that African governments need to spend more of their budgets on public health than on wars, that the foundation has donated at least $70 million to help improve nutrition and agriculture in Africa, in addition to its disease-related initiatives and that it is studying ways to improve the delivery of health care in Africa.

Both insiders and external critics have suggested that there is too much deference to Bill Gates's personal views within the Gates Foundation, insufficient internal debate, and pervasive "groupthink". Critics also complain that Gates Foundation grants are often awarded based on social connections and ideological allegiances rather than on formal external review processes or technical competence.

Critics have suggested that Gates' approach to Global Health and Agriculture favors the interests of large pharmaceutical and agribusiness companies (in which Gates invests) over the interests of the people of developing countries.
The Global Health Program's other significant grants include:
  • Polio eradication: In 2006, the foundation provided US$86 million toward efforts attempting to eradicate poliomyelitis (polio).
  • The GAVI Alliance: The Foundation gave the GAVI Alliance (formerly the "Global Alliance for Vaccines and Immunization") a donation of US$750 million on January 25, 2005.
  • Children's Vaccine Program: The Children's Vaccine Program, run by the Program for Appropriate Technology in Health (PATH), received a donation of US$27 million to help vaccinate against Japanese encephalitis on December 9, 2003.
  • University of Washington Department of Global Health: The Foundation provided approximately US$30 million for the foundation of the new Department of Global Health at the University of Washington in Seattle, US. The donation promoted three of the foundation's target areas: education, Pacific Northwest and global health.
  • HIV Research: The foundation donated a total of US$287 million to various HIV/AIDS researchers. The money was split between 16 different research teams across the world, on the condition that the findings are shared amongst the teams.
  • Aeras Global TB Vaccine Foundation: The foundation gave the Aeras Global TB Vaccine Foundation more than US$280 million to develop and license an improved vaccine against tuberculosis (TB) for use in high-burden countries (HBCs).
  • Cheaper high-tech tuberculosis (TB) test: In August 2012, the Foundation, in partnership with PEPFAR (United States President's Emergency Plan for AIDS Relief), USAID (United States Agency for International Development) and UNITAID (an international drug purchasing facility hosted by WHO), announced they had finalized an agreement to reduce the cost of a commercial TB test (Cepheid's Xpert MTB/RIF run on the GeneXpert platform), from US$16.86 to US$9.98. This test can take the place of smear microscopy, a technique first developed in the 1880s by Robert Koch. Smear microscopy often does not show TB infection in persons who are also co-infected with HIV, whereas the GeneXpert system can show TB in the co-infected patient. In addition, the system can show whether the particular TB strain is resistant to the bactericidal antibiotic rifampicin, a widely accepted indicator of the presence of multidrug resistant tuberculosis.
  • Visceral leishmaniasis (VL) research: The Foundation awarded the Hebrew University of Jerusalem's Kuvin Center for the Study of Infectious and Tropical Diseases a US$5 million grant in 2009 for research into visceral leishmaniasis (VL), an emerging parasitic disease in Ethiopia, Africa, where it is frequently associated with HIV/AIDS, and is a leading cause of adult illness and death. The project, a collaborative effort with Addis Ababa University, will gather data for analysis—to identify the weak links in the transmission cycle—and devise methods for control of the disease. In 2005 the Foundation provided a US$30 million grant to The Institute for OneWorld Health to support the nonprofit pharmaceutical company's VL work in the rural communities of India, Bangladesh and Nepal. By September 2006, the company had received approval from the Indian body Drug-Controller General of India (DCGI) for the Paromomycin Intramuscular (IM) Injection, a drug that provides an effective cure for VL following a 21-day course. In 2010 Raj Shankar Ghosh, the Regional Director for the South Asia Institute for OneWorld Health, explained that the Foundation funded "the majority of our work" in the development of the drug.
  • Next-Generation Condom: The foundation gave US$100,000 each to 11 applicants in November 2013 to develop an improved condom, that is, one that "significantly preserves or enhances pleasure, in order to improve uptake and regular use", according to the Gates Foundation's Grand Challenges in Global Health website. Further grants of up to US$1 million will be given to projects that are successful.
  • Neglected tropical diseases (NTDs): Alongside WHO, the governments of the United States, United Kingdom and United Arab Emirates, and the World Bank, the Foundation endorsed the London Declaration on Neglected Tropical Diseases, "to eradicate, eliminate and intensify control of 17 selected diseases by 2015 and 2020", at a meeting on January 30, 2012, held at the Royal College of Physicians in London, UK. Gates was the principal organizer responsible for bringing together the heads of 13 of the world's largest pharmaceutical companies and the Foundation's monetary commitment to the Declaration was US$363 million over five years. On April 3, 2014, the two-year anniversary of the Declaration, Gates attended a meeting in Paris, France, at which participants reviewed the progress that had been made against 10 neglected tropical diseases (NTDs). The Foundation committed a further US$50 million, together with US$50 million from the Children's Investment Fund Foundation and US$120 million from the World Bank.
  • Coalition for Epidemic Preparedness Innovations (CEPI): A global group tasked with developing vaccines against infectious disease threats more quickly was launched on 8 January 2017 by a coalition of governments and nonprofit groups including the Bill & Melinda Gates Foundation. The Coalition for Epidemic Preparedness Innovations, funded with an initial investment of $460 million from Germany, Japan, Norway, the Wellcome Trust and the Gates Foundation, aims to develop vaccines against known infectious disease threats that could be deployed quickly to contain outbreaks before they become global health emergencies, the group said in a statement at the World Economic Forum in Davos, Switzerland.

United States division

Under President Allan Golston, the United States Program has made grants such as the following:

Donation to Planned Parenthood

Up to 2013, the Bill & Melinda Gates Foundation provided $71 million to Planned Parenthood and affiliated organizations. In 2014, Melinda Gates stated that the foundation "has decided not to fund abortion", focusing instead on family planning and contraception in order to avoid conflation of abortion and family planning. In response to questions about this decision, Gates stated in a June 2014 blog post that "[she], like everyone else, struggle[s] with the issue" and that "the emotional and personal debate about abortion is threatening to get in the way of the lifesaving consensus regarding basic family planning". Since then, the foundation's endeavors have shifted to a more global perspective, focusing on voluntary family planning and maternal and newborn health.

Libraries

In 1997, the charity introduced a U.S. Libraries initiative with the goal of "ensuring that if you can get to a public library, you can reach the internet". Only 35% of the world's population has access to the Internet. The foundation has given grants, installed computers and software, and provided training and technical support in partnership with public libraries nationwide in an effort to increase access and knowledge. By providing access and training for these resources, the foundation helps move public libraries into the digital age.

Most recently, the foundation gave a US$12.2 million grant to the Southeastern Library Network (SOLINET) to assist libraries in Louisiana and Mississippi on the Gulf Coast, many of which were damaged or destroyed by Hurricanes Katrina and Rita.

Education

A key aspect of the Gates Foundation's U.S. efforts involves an overhaul of the country's education policies at both the K-12 and college levels, including support for teacher evaluations and charter schools and opposition to seniority-based layoffs and other aspects of the education system that are typically backed by teachers' unions. It spent $373 million on education in 2009. It has also donated to the two largest national teachers' unions. The foundation was the biggest early backer of the Common Core State Standards Initiative. In October 2017 it was announced that the Bill and Melinda Gates Foundation would spend more than $1.7 billion over five years to pay for new initiatives in public education.

One of the foundation's goals is to lower poverty by increasing the number of college graduates in the United States, and the organization has funded "Reimagining Aid Design and Delivery" grants to think tanks and advocacy organizations to produce white papers on ideas for changing the current system of federal financial aid for college students, with a goal of increasing graduation rates. One of the ways the foundation has sought to increase the number of college graduates is to get them through college faster, but that idea has received some pushback from organizations of universities and colleges.

As part of its education-related initiatives, the foundation has funded journalists, think tanks, lobbying organizations and governments. Millions of dollars of grants to news organizations have funded reporting on education and higher education, including more than $1.4 million to the Education Writers Association to fund training for journalists who cover education. While some critics have accused the foundation of directing the conversation on education or pushing its point of view through news coverage, the foundation has said it lists all its grants publicly and does not enforce any rules for content among its grantees, who retain editorial independence. Union activists in Chicago have accused Gates Foundation grantee Teach Plus, which was founded by new teachers and advocates against seniority-based layoffs, of "astroturfing".

The K-12 and higher education reform programs of the Gates Foundation have been criticized by some education professionals, parents, and researchers because they have driven the conversation on education reform to such an extent that they may marginalize researchers who do not support Gates' predetermined policy preferences. Several Gates-backed policies such as small schools, charter schools, and increasing class sizes have been expensive and disruptive, but some studies indicate they have not improved educational outcomes and may have caused harm.

Examples of some of the K-12 reforms advocated by the foundation include closing ineffective neighborhood schools in favor of privately run charter schools; extensively using standardized test scores to evaluate the progress of students, teachers, and schools; and merit pay for teachers based on student test scores. Critics also believe that the Gates Foundation exerts too much influence over public education policy without being accountable to voters or taxpayers.

Some of the foundation's educational initiatives have included:
  • Gates Cambridge Scholarships: In 2000, the Gates Foundation donated $210 million to help outstanding graduate students from the U.S. and around the world study at the prestigious University of Cambridge. The Gates Cambridge Scholarship has often been compared to the Rhodes Scholarship, given its international scope and substantial endowment. The scholarship remains extremely competitive, with just 0.3% of applicants being selected. Each year, approximately 100 new graduate students from around the world receive funding to attend Cambridge. Several buildings at the University of Cambridge also bear the name of William and Melinda Gates after sizable contributions to their construction.
  • Cornell University: Faculty of Computing and Information Science received US$25 million from the Foundation for a new Information Science building, which will be named the "Bill and Melinda Gates Hall". The total cost of the building is expected to be US$60 million. Construction began in March 2012, and officially opened in January 2014.
  • Massachusetts Institute of Technology: Part of the Ray and Maria Stata Center is known as the "Gates Tower" in recognition of partial funding of the building.
  • Carnegie Mellon University: The foundation gave US$20 million to the Carnegie Mellon School of Computer Science for a new Computer Science building called the "Gates Center for Computer Science". It officially opened on September 22, 2009.
  • Smaller schools: The Gates Foundation claims that one in five students is unable to read and grasp the contents of what they read, and that African American and Latino students graduate from high school with the skills of a middle school student. The foundation has invested more than US$250 million in grants to create new small schools, reduce student-to-teacher ratios, and divide up large high schools through the schools-within-a-school model.
  • D.C. Achievers Scholarships: On March 22, 2007, the Gates Foundation announced a US$122 million initiative to send hundreds of the District of Columbia's poorest students to college.
  • Gates Millennium Scholars: Administered by the United Negro College Fund, the foundation donated US$1.5 billion for scholarships to high achieving minority students.
  • NewSchools Venture Fund: The foundation contributed US$30 million to help NewSchools to manage more charter schools, which aim to prepare students in historically underserved areas for college and careers.
  • Strong American Schools: On April 25, 2007, the Gates Foundation joined forces with the Eli and Edythe Broad Foundation pledging a joint US$60 million to create Strong American Schools, a nonprofit project responsible for running ED in 08, an initiative and information campaign aimed at encouraging 2008 presidential contenders to include education in their campaign policies.
  • Teaching Channel: The Gates Foundation announced in September 2011 a US$3.5 million initiative to launch a multi-platform service delivering professional development videos for teachers over the Internet, public television, cable and other digital outlets. To date, over 500,000 teachers and educators have joined the community to share ideas, lesson plans and teaching methods.
  • The Texas High School Project: The project was set out to increase and improve high school graduation rates across Texas. The foundation committed US$84.6 million to the project beginning in 2003. The project focuses its efforts on high-need schools and districts statewide, with an emphasis on urban areas and the Texas-Mexico border.
  • University Scholars Program: Donated US$20 million in 1998 to endow a scholarship program at Melinda Gates' alma mater, Duke University. The program provides full scholarships to about 10 members of each undergraduate class and one member in each class in each of the professional schools (schools of medicine, business, law, divinity, environment, nursing, and public policy), as well as to students in the Graduate School pursuing doctoral degrees in any discipline. Graduate and professional school scholars serve as mentors to the undergraduate scholars, who are chosen on the basis of financial need and potential for interdisciplinary academic interests. Scholars are chosen each spring from new applicants to Duke University's undergraduate, graduate, and professional school programs. The program features seminars to bring these scholars together for interdisciplinary discussions and an annual spring symposium organized by the scholars.
  • Washington State Achievers Scholarship: The Washington State Achievers program encourages schools to create cultures of high academic achievement while providing scholarship support to select college-bound students.
  • William H. Gates Public Service Law Program: This program awards five full scholarships annually to the University of Washington School of Law. Scholars commit to working in relatively low-paying public service legal positions for at least the first five years following graduation.
  • University of Texas at Austin: $30 million challenge grant to build the Bill & Melinda Gates Computer Science Complex.
  • STAND UP: a national campaign that seeks to positively impact the current crisis within the United States public education system by calling upon community leaders, parents, students and citizens to encourage change and STAND UP for better schools and the future of America's children. STAND UP was co-founded by the Eli Broad Foundation, and was launched in April 2006 on The Oprah Winfrey Show in a two-part feature.
  • Every Student Succeeds Act: Donated about $44 million to support implementation of the 2015 federal education law.

Pacific Northwest

  • Discovery Institute: Donated US$1 million in 2000 to the Discovery Institute and pledged US$9.35 million over 10 years in 2003, including US$50,000 of Bruce Chapman's US$141,000 annual salary. According to a Gates Foundation grant maker, this grant is "exclusive to the Cascadia project" on regional transportation, and it may not be used for the Institute's other activities, including promotion of intelligent design.
  • Rainier Scholars: Donated US$1 million.
  • Computer History Museum: Donated US$15 million to the museum in October 2005.

Criticism

Critics say the Bill & Melinda Gates Foundation has overlooked the links between poverty and poor academic achievement, and has unfairly demonized teachers for poor achievement by underprivileged students. They contend that the Gates Foundation should be embracing anti-poverty and living wage policies rather than pursuing untested and empirically unsupported education reforms.

Critics say that Gates-backed reforms such as increasing the use of technology in education may financially benefit Microsoft and the Gates family.

The foundation trust invests undistributed assets, with the exclusive goal of maximizing the return on investment. As a result, its investments include companies that have been criticized for worsening poverty in the same developing countries where the foundation is attempting to relieve poverty. These include companies that pollute heavily and pharmaceutical companies that do not sell into the developing world. In response to press criticism, the foundation announced in 2007 a review of its investments to assess social responsibility. It subsequently cancelled the review and stood by its policy of investing for maximum return, while using voting rights to influence company practices.

Critics have called on the Gates Foundation to end its investments in the GEO Group, the second largest private prison corporation in the United States. A large part of the company's work involves incarcerating and detaining migrants detained by the Obama administration and now the Trump administration. In spring 2014, the Gates Foundation acknowledged its $2.2 million investment in the prison corporation. It has more recently rebuffed critics' requests that it sever investment ties with the company, and has refused to comment on whether it is continuing its investments.

Lifespan

In October 2006, the Bill & Melinda Gates Foundation was split into two entities: the Bill & Melinda Gates Foundation Trust, which manages the endowment assets and the Bill & Melinda Gates Foundation, which "... conducts all operations and grantmaking work, and it is the entity from which all grants are made". Also announced was the decision to "... spend all of [the Trust's] resources within 20 years after Bill's and Melinda's deaths". This would close the Bill & Melinda Gates Foundation Trust and effectively end the Bill & Melinda Gates Foundation. In the same announcement it was reiterated that Warren Buffett "... has stipulated that the proceeds from the Berkshire Hathaway shares he still owns at death are to be used for philanthropic purposes within 10 years after his estate has been settled".

The plan to close the Foundation Trust is in contrast to most large charitable foundations that have no set closure date. This is intended to lower administrative costs over the years of the Foundation Trust's life and ensure that the Foundation Trust not fall into a situation where the vast majority of its expenditures are on administrative costs, including salaries, with only token amounts contributed to charitable causes.

Awards

Computer simulation

From Wikipedia, the free encyclopedia
 
Process of building a computer model, and the interplay between experiment, simulation, and theory.

Computer simulation is the reproduction of the behavior of a system using a computer to simulate the outcomes of a mathematical model associated with that system. Because they allow the reliability of chosen mathematical models to be checked, computer simulations have become a useful tool for the mathematical modeling of many natural systems in physics (computational physics), astrophysics, climatology, chemistry and biology; human systems in economics, psychology and social science; and engineering. Simulation of a system is represented as the running of the system's model. It can be used to explore and gain new insights into new technology and to estimate the performance of systems too complex for analytical solutions.

Computer simulations are realized by running computer programs that can be either small, running almost instantly on small devices, or large-scale programs that run for hours or days on network-based groups of computers. The scale of events being simulated by computer simulations has far exceeded anything possible (or perhaps even imaginable) using traditional paper-and-pencil mathematical modeling. Over 10 years ago, a desert-battle simulation of one force invading another involved the modeling of 66,239 tanks, trucks and other vehicles on simulated terrain around Kuwait, using multiple supercomputers in the DoD High Performance Computer Modernization Program. Other examples include a 1-billion-atom model of material deformation; a 2.64-million-atom model of the complex protein-producing organelle of all living organisms, the ribosome, in 2005; a complete simulation of the life cycle of Mycoplasma genitalium in 2012; and the Blue Brain project at EPFL (Switzerland), begun in May 2005 to create the first computer simulation of the entire human brain, right down to the molecular level.

Because of the computational cost of simulation, computer experiments are used to perform inference such as uncertainty quantification.

Simulation versus model

A computer model is the algorithms and equations used to capture the behavior of the system being modeled. By contrast, computer simulation is the actual running of the program that contains these equations or algorithms. Simulation, therefore, is the process of running a model. Thus one would not "build a simulation"; instead, one would "build a model", and then either "run the model" or equivalently "run a simulation".
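The distinction can be made concrete with a small Python sketch; the decay equation and all parameter values here are our own illustration, not from the source. The model is the equation; running it forward in time is the simulation:

```python
# The *model*: an equation capturing the system's behavior.
def decay_model(n: float, k: float) -> float:
    """Rate equation dN/dt = -k * N, with decay constant k."""
    return -k * n

# The *simulation*: actually running the model through time.
def run_simulation(n0: float, k: float, dt: float, steps: int) -> list:
    trajectory = [n0]
    for _ in range(steps):
        trajectory.append(trajectory[-1] + dt * decay_model(trajectory[-1], k))
    return trajectory

# "Run the model" / "run a simulation" with illustrative parameters:
print(run_simulation(n0=1000.0, k=0.1, dt=0.5, steps=4))
```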

History

Computer simulation developed hand-in-hand with the rapid growth of the computer, following its first large-scale deployment during the Manhattan Project in World War II to model the process of nuclear detonation. It was a simulation of 12 hard spheres using a Monte Carlo algorithm. Computer simulation is often used as an adjunct to, or substitute for, modeling systems for which simple closed form analytic solutions are not possible. There are many types of computer simulations; their common feature is the attempt to generate a sample of representative scenarios for a model in which a complete enumeration of all possible states of the model would be prohibitive or impossible.
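Monte Carlo methods are easiest to see in miniature. The following sketch estimates pi by random sampling; it illustrates the technique only and is not a reconstruction of the hard-sphere simulation mentioned above:

```python
# Monte Carlo in miniature: sample random points in the unit square and
# count how many land inside the quarter circle of radius 1. That
# fraction approximates pi / 4, improving as the sample count grows.
import random

def estimate_pi(samples: int) -> float:
    inside = sum(
        1 for _ in range(samples)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )
    return 4.0 * inside / samples

print(estimate_pi(1_000_000))  # ~3.14, within sampling error
```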

Data preparation

The external data requirements of simulations and models vary widely. For some, the input might be just a few numbers (for example, simulation of a waveform of AC electricity on a wire), while others might require terabytes of information (such as weather and climate models).
Input sources also vary widely:
  • Sensors and other physical devices connected to the model;
  • Control surfaces used to direct the progress of the simulation in some way;
  • Current or historical data entered by hand;
  • Values extracted as a by-product from other processes;
  • Values output for the purpose by other simulations, models, or processes.
Lastly, the time at which data is available varies (a brief sketch of the three cases follows this list):
  • "invariant" data is often built into the model code, either because the value is truly invariant (e.g., the value of π) or because the designers consider the value to be invariant for all cases of interest;
  • data can be entered into the simulation when it starts up, for example by reading one or more files, or by reading data from a preprocessor;
  • data can be provided during the simulation run, for example by a sensor network.
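A minimal sketch of these three availability times; all names and values here are hypothetical:

```python
# 1) "Invariant" data built into the model code: truly invariant values
#    (pi) or values the designers treat as fixed for all cases of interest.
import json
import math

PI = math.pi
DRAG_COEFF = 0.47   # assumed constant for all cases of interest

# 2) Startup data: read once when the simulation begins, e.g. from a file.
def load_startup_config(path: str) -> dict:
    with open(path) as f:
        return json.load(f)

# 3) Runtime data: values arriving while the simulation runs,
#    e.g. from a sensor network feeding the model.
def on_sensor_reading(state: dict, reading: float) -> None:
    state["latest_reading"] = reading
```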
Because of this variety, and because diverse simulation systems have many common elements, there are a large number of specialized simulation languages. The best-known may be Simula (sometimes called Simula-67, after the year 1967 when it was proposed). There are now many others.

Systems that accept data from external sources must be very careful in knowing what they are receiving. While it is easy for computers to read in values from text or binary files, what is much harder is knowing what the accuracy of the values is (compared to measurement resolution and precision). Such values are often expressed as "error bars", a minimum and maximum deviation from the stated value defining the range within which the true value is expected to lie. Because digital computer mathematics is not perfect, rounding and truncation errors multiply this error, so it is useful to perform an "error analysis" to confirm that values output by the simulation will still be usefully accurate.
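One simple form of such error analysis is interval arithmetic, where the error bars themselves are propagated through each operation. A small illustrative sketch, with made-up measurements:

```python
# Propagate error bars through a computation with interval arithmetic:
# every operation produces the widest possible bounds on the result.
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other: "Interval") -> "Interval":
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other: "Interval") -> "Interval":
        corners = [self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi]
        return Interval(min(corners), max(corners))

length = Interval(1.9, 2.1)   # a measurement of 2.0 with +/- 0.1 error bars
width = Interval(2.8, 3.2)    # a measurement of 3.0 with +/- 0.2 error bars
print(length * width)         # area bounds, roughly Interval(5.32, 6.72)
```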

Even small errors in the original data can accumulate into substantial error later in the simulation. While all computer analysis is subject to the "GIGO" (garbage in, garbage out) restriction, this is especially true of digital simulation. Indeed, observation of this inherent, cumulative error in digital systems was the main catalyst for the development of chaos theory.

Types

Computer models can be classified according to several independent pairs of attributes, including:
  • Stochastic or deterministic (and as a special case of deterministic, chaotic)
  • Steady-state or dynamic
  • Continuous or discrete (and as an important special case of discrete, discrete event or DE models)
  • Dynamic system simulation, e.g. electric systems, hydraulic systems or multi-body mechanical systems (described primarily by DAEs), or dynamics simulation of field problems, e.g. CFD or FEM simulations (described by PDEs).
  • Local or distributed.
Another way of categorizing models is to look at the underlying data structures. For time-stepped simulations, there are two main classes:
  • Simulations which store their data in regular grids and require only next-neighbor access are called stencil codes. Many CFD applications belong to this category (see the sketch after this list).
  • If the underlying graph is not a regular grid, the model may belong to the meshfree method class.
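As referenced above, here is a toy stencil code: a one-dimensional heat-diffusion update in which each cell on a regular grid reads only its immediate neighbors. The grid size and diffusion parameter are illustrative:

```python
# Three-point stencil on a regular 1-D grid: each interior cell is updated
# from itself and its two next neighbors; the end cells are held fixed.
def diffuse(temps, alpha=0.1, steps=100):
    t = list(temps)
    for _ in range(steps):
        t = [t[0]] + [
            t[i] + alpha * (t[i - 1] - 2 * t[i] + t[i + 1])
            for i in range(1, len(t) - 1)
        ] + [t[-1]]
    return t

rod = [0.0] * 20
rod[10] = 100.0          # a hot spot in the middle of the rod
print(diffuse(rod))      # heat spreads toward the fixed-temperature ends
```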
  • Steady-state models use equations that define the relationships between elements of the modeled system and attempt to find a state in which the system is in equilibrium. Such models are often used in simulating physical systems, as a simpler modeling case before dynamic simulation is attempted.
  • Dynamic simulations model changes in a system in response to (usually changing) input signals.
  • Stochastic models use random number generators to model chance or random events;
  • A discrete event simulation (DES) manages events in time. Most computer, logic-test and fault-tree simulations are of this type. In this type of simulation, the simulator maintains a queue of events sorted by the simulated time they should occur. The simulator reads the queue and triggers new events as each event is processed. It is not important to execute the simulation in real time; it is often more important to be able to access the data produced by the simulation and to discover logic defects in the design or the sequence of events. (A minimal event-queue sketch follows this list.)
  • A continuous dynamic simulation performs numerical solution of differential-algebraic equations or differential equations (either partial or ordinary). Periodically, the simulation program solves all the equations and uses the numbers to change the state and output of the simulation. Applications include flight simulators, construction and management simulation games, chemical process modeling, and simulations of electrical circuits. Originally, these kinds of simulations were actually implemented on analog computers, where the differential equations could be represented directly by various electrical components such as op-amps. By the late 1980s, however, most "analog" simulations were run on conventional digital computers that emulate the behavior of an analog computer. (A small time-stepping sketch also follows this list.)
  • A special type of discrete simulation that does not rely on a model with an underlying equation, but can nonetheless be represented formally, is agent-based simulation. In agent-based simulation, the individual entities (such as molecules, cells, trees or consumers) in the model are represented directly (rather than by their density or concentration) and possess an internal state and set of behaviors or rules that determine how the agent's state is updated from one time-step to the next. (A toy agent-based sketch follows this list as well.)
  • Distributed models run on a network of interconnected computers, possibly through the Internet. Simulations dispersed across multiple host computers like this are often referred to as "distributed simulations". There are several standards for distributed simulation, including Aggregate Level Simulation Protocol (ALSP), Distributed Interactive Simulation (DIS), the High Level Architecture (simulation) (HLA) and the Test and Training Enabling Architecture (TENA).
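First, the event-queue sketch promised above: a minimal discrete-event kernel that keeps a priority queue of (time, label) events, processes them in simulated-time order, and lets the handling of one event schedule others. The event labels and the five-unit service delay are hypothetical:

```python
# Minimal discrete-event simulation: a heap of (time, label) events,
# popped in simulated-time order; an "arrival" schedules a "departure".
import heapq

def run(events, until=100.0):
    heapq.heapify(events)
    while events:
        time, label = heapq.heappop(events)
        if time > until:
            break
        print(f"t={time:6.2f}  {label}")
        if label == "arrival":
            heapq.heappush(events, (time + 5.0, "departure"))

run([(0.0, "arrival"), (2.0, "arrival")])
```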
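Next, a small time-stepping sketch of a continuous dynamic simulation: the state of a mass on a spring obeys a differential equation, and the program repeatedly solves it over small steps to advance the state. The semi-implicit Euler scheme and all parameters are our choice, not from the source:

```python
# Continuous dynamic simulation: integrate m * x'' = -k * x in small
# time steps, updating velocity then position each step.
def simulate_oscillator(x=1.0, v=0.0, k=1.0, m=1.0, dt=0.01, steps=1000):
    for _ in range(steps):
        a = -(k / m) * x      # acceleration from the spring equation
        v += a * dt           # integrate acceleration into velocity
        x += v * dt           # integrate velocity into position
        yield x

positions = list(simulate_oscillator())
print(min(positions), max(positions))  # oscillates roughly between -1 and 1
```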
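Finally, a toy agent-based sketch: each agent is represented individually, carries internal state, and updates by a rule from one time step to the next. The infection rule and probabilities are invented purely for illustration:

```python
# Agent-based simulation in miniature: 50 agents, each with its own state,
# updated per time step by a (hypothetical) infection rule.
import random

agents = [{"infected": i == 0} for i in range(50)]   # one initial case

def step(agents, p_transmit=0.05):
    infected_now = sum(a["infected"] for a in agents)
    for agent in agents:
        if not agent["infected"]:
            # each infected agent independently transmits with prob. p
            if random.random() < 1 - (1 - p_transmit) ** infected_now:
                agent["infected"] = True

for t in range(10):
    step(agents)
    print(t, sum(a["infected"] for a in agents))
```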

Visualization

Formerly, the output data from a computer simulation was sometimes presented in a table or a matrix showing how data were affected by numerous changes in the simulation parameters. The use of the matrix format was related to traditional use of the matrix concept in mathematical models. However, psychologists and others noted that humans could quickly perceive trends by looking at graphs or even moving-images or motion-pictures generated from the data, as displayed by computer-generated-imagery (CGI) animation. Although observers could not necessarily read out numbers or quote math formulas, from observing a moving weather chart they might be able to predict events (and "see that rain was headed their way") much faster than by scanning tables of rain-cloud coordinates. Such intense graphical displays, which transcended the world of numbers and formulae, sometimes also led to output that lacked a coordinate grid or omitted timestamps, as if straying too far from numeric data displays. Today, weather forecasting models tend to balance the view of moving rain/snow clouds against a map that uses numeric coordinates and numeric timestamps of events.

Similarly, CGI computer simulations of CAT scans can simulate how a tumor might shrink or change during an extended period of medical treatment, presenting the passage of time as a spinning view of the visible human head, as the tumor changes.

Other applications of CGI computer simulations are being developed to graphically display large amounts of data, in motion, as changes occur during a simulation run.

Computer simulation in science

Computer simulation of the process of osmosis

Specific examples of computer simulations in science, each derived from an underlying mathematical description, include:
  • statistical simulations based upon an agglomeration of a large number of input profiles, such as the forecasting of equilibrium temperature of receiving waters, allowing the gamut of meteorological data to be input for a specific locale. This technique was developed for thermal pollution forecasting.
  • agent-based simulation has been used effectively in ecology, where it is often called "individual-based modeling" and is used in situations for which individual variability in the agents cannot be neglected, such as population dynamics of salmon and trout (most purely mathematical models assume all trout behave identically).
  • time-stepped dynamic models. In hydrology there are several such hydrology transport models, such as the SWMM and DSSAM models developed by the U.S. Environmental Protection Agency for river water quality forecasting.
  • computer simulations have also been used to formally model theories of human cognition and performance, e.g., ACT-R.
  • computer simulation using molecular modeling for drug discovery.
  • computer simulation to model viral infection in mammalian cells.
  • computer simulation for studying the selective sensitivity of bonds by mechanochemistry during grinding of organic molecules.
  • Computational fluid dynamics simulations are used to simulate the behaviour of flowing air, water and other fluids. One-, two- and three-dimensional models are used. A one-dimensional model might simulate the effects of water hammer in a pipe. A two-dimensional model might be used to simulate the drag forces on the cross-section of an aeroplane wing. A three-dimensional simulation might estimate the heating and cooling requirements of a large building. (A one-dimensional finite-difference sketch appears after this list.)
  • An understanding of statistical thermodynamic molecular theory is fundamental to the appreciation of molecular solutions. Development of the Potential Distribution Theorem (PDT) allows this complex subject to be simplified to down-to-earth presentations of molecular theory.
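
As a toy counterpart to the continuous and fluid-dynamics items above, the sketch below integrates the one-dimensional heat (diffusion) equation with an explicit finite-difference scheme in Python. The grid size, diffusivity, and boundary values are illustrative assumptions, not a production CFD setup.

    import numpy as np

    # Explicit finite-difference integration of u_t = alpha * u_xx
    # on a rod with fixed (zero) temperature at both ends.
    alpha, nx, dx, dt, steps = 0.01, 50, 0.02, 0.01, 500
    assert alpha * dt / dx**2 <= 0.5  # stability condition for this scheme

    u = np.zeros(nx)
    u[nx // 2] = 1.0  # initial hot spot in the middle of the rod

    for _ in range(steps):
        # update interior points; the endpoints stay at 0 (Dirichlet boundaries)
        u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])

    print(f"peak temperature after {steps} steps: {u.max():.4f}")
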
Notable, and sometimes controversial, computer simulations used in science include: Donella Meadows' World3 used in The Limits to Growth, James Lovelock's Daisyworld, and Thomas Ray's Tierra.

In social sciences, computer simulation is an integral component of the five angles of analysis fostered by the data percolation methodology, which also includes qualitative and quantitative methods, reviews of the literature (including scholarly), and interviews with experts, and which forms an extension of data triangulation.

Simulation environments for physics and engineering

Graphical environments to design simulations have been developed. Special care was taken to handle events (situations in which the simulation equations are not valid and have to be changed). The open project Open Source Physics was started to develop reusable libraries for simulations in Java, together with Easy Java Simulations, a complete graphical environment that generates code based on these libraries.

Simulation environments for linguistics

The Taiwanese Tone Group Parser is a simulator of Taiwanese tone sandhi acquisition. In practice, using linguistic theory to implement the Taiwanese tone group parser is a way of applying knowledge engineering techniques to build an experimental environment for the computer simulation of language acquisition. A work-in-progress version of the artificial tone group parser, which includes a knowledge base and an executable program file for the Microsoft Windows system (XP/Win7), can be downloaded for evaluation.

Computer simulation in practical contexts

Computer simulations are used in a wide variety of practical contexts.
The reliability of, and the trust people put in, computer simulations depends on the validity of the simulation model; verification and validation are therefore of crucial importance in the development of computer simulations. Another important aspect of computer simulations is reproducibility of the results, meaning that a simulation model should not give a different answer for each execution. Although this might seem obvious, it is a special point of attention in stochastic simulations, where the "random" numbers should actually be pseudo-random numbers drawn from a seeded generator. One exception to reproducibility is human-in-the-loop simulation, such as flight simulations and computer games: here a human is part of the simulation and thus influences the outcome in a way that is hard, if not impossible, to reproduce exactly.
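
A minimal illustration of the reproducibility point in Python: seeding the pseudo-random generator makes a stochastic run exactly repeatable. The simulate function is a hypothetical stand-in for a real stochastic model.

    import random

    def simulate(seed, n=5):
        rng = random.Random(seed)  # isolated, seeded pseudo-random generator
        return [round(rng.gauss(0, 1), 3) for _ in range(n)]

    assert simulate(42) == simulate(42)  # same seed: identical results
    assert simulate(42) != simulate(43)  # different seed: a different run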

Vehicle manufacturers make use of computer simulation to test safety features in new designs. By building a copy of the car in a physics simulation environment, they can save the hundreds of thousands of dollars that would otherwise be required to build and test a unique prototype. Engineers can step through the simulation milliseconds at a time to determine the exact stresses being put upon each section of the prototype.

Computer graphics can be used to display the results of a computer simulation. Animations can be used to experience a simulation in real-time, e.g., in training simulations. In some cases animations may also be useful in faster than real-time or even slower than real-time modes. For example, faster than real-time animations can be useful in visualizing the buildup of queues in the simulation of humans evacuating a building. Furthermore, simulation results are often aggregated into static images using various ways of scientific visualization.

In debugging, simulating a program execution under test (rather than executing natively) can detect far more errors than the hardware itself can detect and, at the same time, log useful debugging information such as instruction trace, memory alterations and instruction counts. This technique can also detect buffer overflow and similar "hard to detect" errors as well as produce performance information and tuning data.

Pitfalls

Although sometimes ignored in computer simulations, it is very important to perform a sensitivity analysis to ensure that the accuracy of the results is properly understood. For example, the probabilistic risk analysis of factors determining the success of an oilfield exploration program involves combining samples from a variety of statistical distributions using the Monte Carlo method. If, for instance, one of the key parameters (e.g., the net ratio of oil-bearing strata) is known to only one significant figure, then the result of the simulation might not be more precise than one significant figure, although it might (misleadingly) be presented as having four significant figures.
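
The sketch below illustrates this pitfall with a deliberately simplified Monte Carlo combination of sampled parameters. The distributions, parameter names, and volume formula are assumptions for illustration only, not the actual oilfield model.

    import random

    rng = random.Random(0)

    def one_trial():
        area = rng.uniform(10.0, 20.0)      # km^2, sampled
        thickness = rng.uniform(5.0, 15.0)  # m, sampled
        net_ratio = 0.3                     # known to only one significant figure
        return area * thickness * net_ratio

    trials = [one_trial() for _ in range(100_000)]
    mean = sum(trials) / len(trials)
    print(f"misleading precision: {mean:.6g}")              # six digits of false precision
    print(f"honest precision:     {float(f'{mean:.1g}')}")  # one significant figure, like the input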

Model calibration techniques

The following three steps should be used to produce accurate simulation models: calibration, verification, and validation. Computer simulations are good at portraying and comparing theoretical scenarios, but in order to model actual case studies accurately they have to match what is actually happening today. A base model should be created and calibrated so that it matches the area being studied. The calibrated model should then be verified to ensure that the model is operating as expected based on the inputs. Once the model has been verified, the final step is to validate the model by comparing the outputs to historical data from the study area. This can be done by using statistical techniques and ensuring an adequate R-squared value. Unless these techniques are employed, the simulation model created will produce inaccurate results and not be a useful prediction tool.

Model calibration is achieved by adjusting any available parameters in order to adjust how the model operates and simulates the process. For example, in traffic simulation, typical parameters include look-ahead distance, car-following sensitivity, discharge headway, and start-up lost time. These parameters influence driver behavior such as when and how long it takes a driver to change lanes, how much distance a driver leaves between his car and the car in front of it, and how quickly a driver starts to accelerate through an intersection. Adjusting these parameters has a direct effect on the amount of traffic volume that can traverse the modeled roadway network by making the drivers more or less aggressive. These are examples of calibration parameters that can be fine-tuned to match characteristics observed in the field at the study location. Most traffic models have typical default values, but they may need to be adjusted to better match the driver behavior at the specific location being studied.

Model verification is achieved by obtaining output data from the model and comparing them to what is expected from the input data. For example, in traffic simulation, traffic volume can be verified to ensure that actual volume throughput in the model is reasonably close to traffic volumes input into the model. Ten percent is a typical threshold used in traffic simulation to determine if output volumes are reasonably close to input volumes. Simulation models handle model inputs in different ways so traffic that enters the network, for example, may or may not reach its desired destination. Additionally, traffic that wants to enter the network may not be able to, if congestion exists. This is why model verification is a very important part of the modeling process.

The final step is to validate the model by comparing the results with what is expected based on historical data from the study area. Ideally, the model should produce similar results to what has happened historically. This is typically verified by nothing more than quoting the R-squared statistic from the fit. This statistic measures the fraction of variability that is accounted for by the model. A high R-squared value does not necessarily mean the model fits the data well. Another tool used to validate models is graphical residual analysis. If model output values drastically differ from historical values, it probably means there is an error in the model. Before using the model as a base to produce additional models, it is important to verify it for different scenarios to ensure that each one is accurate. If the outputs do not reasonably match historic values during the validation process, the model should be reviewed and updated to produce results more in line with expectations. It is an iterative process that helps to produce more realistic models.
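
As a concrete illustration, the following Python sketch computes the R-squared statistic and the residuals for a handful of assumed observed and modeled values; in practice the residuals would also be plotted against the modeled values as part of graphical residual analysis.

    import numpy as np

    observed = np.array([120.0, 135.0, 150.0, 160.0, 175.0, 190.0])  # historical data
    modeled = np.array([118.0, 140.0, 149.0, 158.0, 180.0, 186.0])   # model output

    residuals = observed - modeled
    ss_res = np.sum(residuals**2)                     # residual sum of squares
    ss_tot = np.sum((observed - observed.mean())**2)  # total sum of squares
    r_squared = 1.0 - ss_res / ss_tot

    print(f"R^2 = {r_squared:.3f}")
    print("residuals:", residuals)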

Validating traffic simulation models requires comparing traffic estimated by the model to observed traffic on the roadway and transit systems. Initial comparisons are for trip interchanges between quadrants, sectors, or other large areas of interest. The next step is to compare traffic estimated by the models to traffic counts, including transit ridership, crossing contrived barriers in the study area. These are typically called screenlines, cutlines, and cordon lines and may be imaginary or actual physical barriers. Cordon lines surround particular areas such as a city's central business district or other major activity centers. Transit ridership estimates are commonly validated by comparing them to actual patronage crossing cordon lines around the central business district.

Three sources of error can cause weak correlation during calibration: input error, model error, and parameter error. In general, input error and parameter error can be adjusted easily by the user. Model error, however, is caused by the methodology used in the model and may not be as easy to fix. Simulation models are typically built using several different modeling theories that can produce conflicting results. Some models are more generalized while others are more detailed. If model error occurs as a result, it may be necessary to adjust the model methodology to make results more consistent.

These steps, taken in order, are necessary to ensure that simulation models function properly and produce realistic results. Simulation models can be used as a tool to verify engineering theories, but they are only valid if calibrated properly. Once satisfactory estimates of the parameters for all models have been obtained, the models must be checked to ensure that they adequately perform their intended functions. The validation process establishes the credibility of the model by demonstrating its ability to replicate actual traffic patterns. The importance of model validation underscores the need for careful planning, thoroughness, and accuracy in the input data collection program undertaken for this purpose. Efforts should be made to ensure the collected data are consistent with expected values. For example, in traffic analysis it is typical for a traffic engineer to perform a site visit to verify traffic counts and become familiar with traffic patterns in the area. The resulting models and forecasts will be no better than the data used for model estimation and validation.

Metabolic network modelling

From Wikipedia, the free encyclopedia
Metabolic network showing interactions between enzymes and metabolites in the Arabidopsis thaliana citric acid cycle. Enzymes and metabolites are the red dots and interactions between them are the lines.
Metabolic network model for Escherichia coli.

Metabolic network reconstruction and simulation allows for an in-depth insight into the molecular mechanisms of a particular organism. In particular, these models correlate the genome with molecular physiology. A reconstruction breaks down metabolic pathways (such as glycolysis and the citric acid cycle) into their respective reactions and enzymes, and analyzes them within the perspective of the entire network. In simplified terms, a reconstruction collects all of the relevant metabolic information of an organism and compiles it in a mathematical model. Validation and analysis of reconstructions can allow identification of key features of metabolism such as growth yield, resource distribution, network robustness, and gene essentiality. This knowledge can then be applied to create novel biotechnology.

In general, the process to build a reconstruction is as follows:
  1. Draft a reconstruction
  2. Refine the model
  3. Convert model into a mathematical/computational representation
  4. Evaluate and debug model through experimentation

Genome-scale metabolic reconstruction

A metabolic reconstruction provides a highly mathematical, structured platform on which to understand the systems biology of metabolic pathways within an organism. The integration of biochemical metabolic pathways with rapidly available, annotated genome sequences has produced what are called genome-scale metabolic models. Simply put, these models relate metabolic genes to metabolic pathways. In general, the more information about physiology, biochemistry and genetics is available for the target organism, the better the predictive capacity of the reconstructed models. Mechanically speaking, the process of reconstructing prokaryotic and eukaryotic metabolic networks is essentially the same. Having said this, eukaryote reconstructions are typically more challenging because of the size of genomes, coverage of knowledge, and the multitude of cellular compartments. The first genome-scale metabolic model was generated in 1995 for Haemophilus influenzae. The first multicellular organism, C. elegans, was reconstructed in 1998. Since then, many reconstructions have been formed.

Organism                    Genes in Genome  Genes in Model  Reactions  Metabolites  Date of reconstruction
Haemophilus influenzae      1,775            296             488        343          June 1999
Escherichia coli            4,405            660             627        438          May 2000
Saccharomyces cerevisiae    6,183            708             1,175      584          February 2003
Mus musculus                28,287           473             1,220      872          January 2005
Homo sapiens                21,090           3,623           3,673      --           January 2007
Mycobacterium tuberculosis  4,402            661             939        828          June 2007
Bacillus subtilis           4,114            844             1,020      988          September 2007
Synechocystis sp. PCC6803   3,221            633             831        704          October 2008
Salmonella typhimurium      4,489            1,083           1,087      774          April 2009
Arabidopsis thaliana        27,379           1,419           1,567      1,748        February 2010

Drafting a reconstruction

Resources

Because the field is so young, most reconstructions to date have been built manually. There are, however, now quite a few resources that allow the semi-automatic assembly of reconstructions, and these are widely used because of the time and effort a reconstruction otherwise requires. An initial fast reconstruction can be developed automatically using resources like PathoLogic or ERGO in combination with encyclopedias like MetaCyc, and then manually refined using resources like Pathway Tools. These semi-automatic methods allow a fast draft to be created while still permitting the fine adjustments required as new experimental data are found. Only in this manner will the field of metabolic reconstruction keep up with the ever-increasing number of annotated genomes.

Databases

  • Kyoto Encyclopedia of Genes and Genomes (KEGG): a bioinformatics database containing information on genes, proteins, reactions, and pathways. The ‘KEGG Organisms’ section, which is divided into eukaryotes and prokaryotes, encompasses many organisms for which gene and DNA information can be searched by typing in the enzyme of choice.
  • BioCyc, EcoCyc, and MetaCyc: BioCyc is a collection of 3,000 pathway/genome databases (as of October 2013), with each database dedicated to one organism. For example, EcoCyc is a highly detailed bioinformatics database on the genome and metabolic reconstruction of Escherichia coli, including thorough descriptions of E. coli signaling pathways and regulatory network. The EcoCyc database can serve as a paradigm and model for any reconstruction. Additionally, MetaCyc, an encyclopedia of experimentally defined metabolic pathways and enzymes, contains 2,100 metabolic pathways and 11,400 metabolic reactions (as of October 2013).
  • ENZYME: An enzyme nomenclature database (part of the ExPASy proteomics server of the Swiss Institute of Bioinformatics). After a particular enzyme is searched for on the database, this resource gives the reaction that it catalyzes. ENZYME has direct links to other gene/enzyme/literature databases such as KEGG, BRENDA, and PUBMED.
  • BRENDA: A comprehensive enzyme database that allows for an enzyme to be searched by name, EC number, or organism.
  • BiGG: A knowledge base of biochemically, genetically, and genomically structured genome-scale metabolic network reconstructions.
  • metaTIGER: A collection of metabolic profiles and phylogenomic information on a taxonomically diverse range of eukaryotes, which provides novel facilities for viewing and comparing the metabolic profiles between organisms.
This table quickly compares the scope of each database.
Database  Enzymes  Genes  Reactions  Pathways  Metabolites
KEGG      X        X      X          X         X
BioCyc    X        X      X          X         X
MetaCyc   X               X          X         X
ENZYME    X               X                    X
BRENDA    X               X                    X
BiGG               X      X                    X

Tools for metabolic modeling

  • Pathway Tools: A bioinformatics software package that assists in the construction of pathway/genome databases such as EcoCyc. Developed by Peter Karp and associates at the SRI International Bioinformatics Research Group, Pathway Tools has several components. Its PathoLogic module takes an annotated genome for an organism and infers probable metabolic reactions and pathways to produce a new pathway/genome database. Its MetaFlux component can generate a quantitative metabolic model from that pathway/genome database using flux-balance analysis. Its Navigator component provides extensive query and visualization tools, such as visualization of metabolites, pathways, and the complete metabolic network.
  • ERGO: A subscription-based service developed by Integrated Genomics. It integrates data from every level including genomic, biochemical data, literature, and high-throughput analysis into a comprehensive user friendly network of metabolic and nonmetabolic pathways.
  • KEGGtranslator: an easy-to-use stand-alone application that can visualize and convert KEGG files (KGML formatted XML-files) into multiple output formats. Unlike other translators, KEGGtranslator supports a plethora of output formats, is able to augment the information in translated documents (e.g., MIRIAM annotations) beyond the scope of the KGML document, and amends missing components to fragmentary reactions within the pathway to allow simulations on those. KEGGtranslator converts these files to SBML, BioPAX, SIF, SBGN, SBML with qualitative modeling extension, GML, GraphML, JPG, GIF, LaTeX, etc.
  • ModelSEED: An online resource for the analysis, comparison, reconstruction, and curation of genome-scale metabolic models. Users can submit genome sequences to the RAST annotation system, and the resulting annotation can be automatically piped into the ModelSEED to produce a draft metabolic model. The ModelSEED automatically constructs a network of metabolic reactions, gene-protein-reaction associations for each reaction, and a biomass composition reaction for each genome to produce a model of microbial metabolism that can be simulated using Flux Balance Analysis.
  • MetaMerge: algorithm for semi-automatically reconciling a pair of existing metabolic network reconstructions into a single metabolic network model.
  • CoReCo: [21][22] An algorithm for the automatic reconstruction of metabolic models of related species. The first version of the software used KEGG as the reaction database to link with the EC number predictions from CoReCo. Its automatic gap filling, using atom maps of all the reactions, produces functional models ready for simulation.

Tools for literature

  • PUBMED: This is an online library developed by the National Center for Biotechnology Information, which contains a massive collection of medical journals. Using the link provided by ENZYME, the search can be directed towards the organism of interest, thus recovering literature on the enzyme and its use inside of the organism.

Methodology to draft a reconstruction

This is a visual representation of the metabolic network reconstruction process.

A reconstruction is built by compiling data from the resources above. Database tools such as KEGG and BioCyc can be used in conjunction with each other to find all the metabolic genes in the organism of interest. These genes will be compared to closely related organisms that have already developed reconstructions to find homologous genes and reactions. These homologous genes and reactions are carried over from the known reconstructions to form the draft reconstruction of the organism of interest. Tools such as ERGO, Pathway Tools and ModelSEED can compile data into pathways to form a network of metabolic and non-metabolic pathways. These networks are then verified and refined before being made into a mathematical simulation.

The predictive aspect of a metabolic reconstruction hinges on the ability to predict the biochemical reaction catalyzed by a protein using that protein's amino acid sequence as an input, and to infer the structure of a metabolic network based on the predicted set of reactions. A network of enzymes and metabolites is drafted to relate sequences and function. When an uncharacterized protein is found in the genome, its amino acid sequence is first compared to those of previously characterized proteins to search for homology. When a homologous protein is found, the proteins are considered to have a common ancestor and their functions are inferred as being similar. However, the quality of a reconstruction model is dependent on its ability to accurately infer phenotype directly from sequence, so this rough estimation of protein function will not be sufficient. A number of algorithms and bioinformatics resources have been developed for refinement of sequence homology-based assignments of protein functions:
  • InParanoid: Identifies eukaryotic orthologs by looking only at in-paralogs.
  • CDD: Resource for the annotation of functional units in proteins. Its collection of domain models utilizes 3D structure to provide insights into sequence/structure/function relationships.
  • InterPro: Provides functional analysis of proteins by classifying them into families and predicting domains and important sites.
  • STRING: Database of known and predicted protein interactions.
Once proteins have been established, more information about the enzyme structure, reactions catalyzed, substrates and products, mechanisms, and more can be acquired from databases such as KEGG, MetaCyc and NC-IUBMB. Accurate metabolic reconstructions require additional information about the reversibility and preferred physiological direction of an enzyme-catalyzed reaction, which can come from databases such as BRENDA or MetaCyc.

Model refinement

An initial metabolic reconstruction of a genome is typically far from perfect due to the high variability and diversity of microorganisms. Often, metabolic pathway databases such as KEGG and MetaCyc will have "holes", meaning that there is a conversion from a substrate to a product (i.e., an enzymatic activity) for which there is no known protein in the genome that encodes the enzyme that facilitates the catalysis. What can also happen in semi-automatically drafted reconstructions is that some pathways are falsely predicted and don't actually occur in the predicted manner. Because of this, a systematic verification is made in order to make sure no inconsistencies are present and that all the entries listed are correct and accurate. Furthermore, previous literature can be researched in order to support any information obtained from one of the many metabolic reaction and genome databases. This provides an added level of assurance for the reconstruction that the enzyme and the reaction it catalyzes do actually occur in the organism.

Enzyme promiscuity and spontaneous chemical reactions can damage metabolites. This metabolite damage, and its repair or pre-emption, create energy costs that need to be incorporated into models. It is likely that many genes of unknown function encode proteins that repair or pre-empt metabolite damage, but most genome-scale metabolic reconstructions only include a fraction of all genes.

Any new reaction not present in the databases needs to be added to the reconstruction. This is an iterative process that cycles between the experimental phase and the coding phase. As new information is found about the target organism, the model will be adjusted to predict the metabolic and phenotypic output of the cell. The presence or absence of certain reactions will affect the amount of reactants/products available to other reactions within the pathway, because the products of one reaction become the reactants for another: in the presence of different enzymes or catalysts they can combine with other compounds to form new ones.

Francke et al. provide an excellent example as to why the verification step of the project needs to be performed in significant detail. During a metabolic network reconstruction of Lactobacillus plantarum, the model showed that succinyl-CoA was one of the reactants for a reaction that was a part of the biosynthesis of methionine. However, an understanding of the physiology of the organism would have revealed that due to an incomplete tricarboxylic acid pathway, Lactobacillus plantarum does not actually produce succinyl-CoA, and the correct reactant for that part of the reaction was acetyl-CoA.

Therefore, systematic verification of the initial reconstruction will bring to light several inconsistencies that would adversely affect the final interpretation of the reconstruction, namely an accurate comprehension of the molecular mechanisms of the organism. Furthermore, the simulation step ensures that all the reactions present in the reconstruction are properly balanced. To sum up, a fully accurate reconstruction can lead to greater insight into the functioning of the organism of interest.

Metabolic network simulation

A metabolic network can be broken down into a stoichiometric matrix where the rows represent the compounds of the reactions, while the columns of the matrix correspond to the reactions themselves. Stoichiometry is a quantitative relationship between substrates of a chemical reaction. In order to deduce what the metabolic network suggests, recent research has centered on a few approaches, such as extreme pathways, elementary mode analysis, flux balance analysis, and a number of other constraint-based modeling methods.
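
A minimal sketch of the matrix form in Python, assuming a hypothetical three-reaction network (an uptake reaction, one internal conversion, and an export reaction): rows are compounds, columns are reactions, and the steady-state condition S v = 0 confines feasible flux vectors v to the null space of S.

    import numpy as np
    from scipy.linalg import null_space

    # Hypothetical network:  R_in: -> A,   R1: A -> B,   R_out: B ->
    # Negative entries consume a compound, positive entries produce it.
    S = np.array([
        [1, -1,  0],   # compound A
        [0,  1, -1],   # compound B
    ])

    # The null space is one-dimensional here: at steady state the flux
    # in, through, and out of the network must all be equal.
    print(null_space(S))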

Extreme pathways

Price, Reed, and Papin, from the Palsson lab, use a method of singular value decomposition (SVD) of extreme pathways in order to understand the regulation of human red blood cell metabolism. Extreme pathways are convex basis vectors that consist of steady-state functions of a metabolic network. For any particular metabolic network, there is always a unique set of extreme pathways available. Furthermore, Price, Reed, and Papin define a constraint-based approach in which, with the help of constraints like mass balance and maximum reaction rates, it is possible to develop a ‘solution space’ within which all the feasible options fall. Then, using a kinetic model approach, a single solution that falls within the extreme pathway solution space can be determined. In their study they thus use both constraint-based and kinetic approaches to understand human red blood cell metabolism. In conclusion, using extreme pathways, the regulatory mechanisms of a metabolic network can be studied in further detail.

Elementary mode analysis

Elementary mode analysis closely matches the approach used by extreme pathways. As with extreme pathways, there is always a unique set of elementary modes for a particular metabolic network. These are the smallest sub-networks that allow a metabolic reconstruction network to function in steady state. According to Stelling (2002), elementary modes can be used to understand cellular objectives for the overall metabolic network. Furthermore, elementary mode analysis takes into account stoichiometry and thermodynamics when evaluating whether a particular metabolic route or network is feasible and likely for a set of proteins/enzymes.

Minimal metabolic behaviors (MMBs)

In 2009, Larhlimi and Bockmayr presented a new approach called "minimal metabolic behaviors" for the analysis of metabolic networks. Like elementary modes or extreme pathways, these are uniquely determined by the network and yield a complete description of the flux cone. However, the new description is much more compact. In contrast with elementary modes and extreme pathways, which use an inner description based on generating vectors of the flux cone, MMBs use an outer description of the flux cone. This approach is based on sets of non-negativity constraints, which can be identified with irreversible reactions and thus have a direct biochemical interpretation. One can characterize a metabolic network by its MMBs and its reversible metabolic space.

Flux balance analysis

A different technique for simulating the metabolic network is flux balance analysis. This method uses linear programming, but in contrast to elementary mode analysis and extreme pathways, only a single solution results in the end. Linear programming is used to maximize (or minimize) a chosen objective function, so flux balance analysis yields a single solution to the optimization problem. In a flux balance analysis approach, exchange fluxes are assigned only to those metabolites that enter or leave the network; metabolites that are consumed within the network are not assigned any exchange flux value. The exchange fluxes, along with the enzymes, can have constraints ranging from a negative to a positive value (e.g., −10 to 10).

Furthermore, this approach can determine whether the reaction stoichiometry is in line with predictions by providing fluxes for the balanced reactions. Flux balance analysis can also highlight the most efficient pathway through the network for achieving a particular objective function. In addition, gene knockout studies can be performed using flux balance analysis: the flux through the enzyme that corresponds to the gene to be removed is constrained to 0, which removes the reaction that the enzyme catalyzes from the analysis.
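
The following sketch runs flux balance analysis on the same style of toy network using scipy's linear-programming routine; the network, bounds, and objective are illustrative assumptions. The knockout at the end demonstrates the constraint-to-zero technique described above.

    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical network:  R_in: -> A,   R1: A -> B,   R_out: B ->
    S = np.array([
        [1, -1,  0],   # compound A
        [0,  1, -1],   # compound B
    ])
    c = np.array([0.0, 0.0, -1.0])  # linprog minimizes, so negate to maximize R_out
    bounds = [(0, 10), (0, 10), (0, 10)]

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
    print("optimal fluxes:", res.x)  # flux is limited only by the upper bounds

    # Gene knockout: constrain the corresponding reaction's flux to zero.
    bounds_ko = [(0, 10), (0, 0), (0, 10)]  # knock out R1
    res_ko = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds_ko)
    print("knockout fluxes:", res_ko.x)  # all zero: no product can be made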

Dynamic simulation and parameter estimation

In order to perform a dynamic simulation with such a network, it is necessary to construct an ordinary differential equation system that describes the rates of change in each metabolite's concentration or amount. To this end, a rate law, i.e., a kinetic equation that determines the rate of reaction based on the concentrations of all reactants, is required for each reaction. Software packages that include numerical integrators, such as COPASI or SBMLsimulator, are then able to simulate the system dynamics given an initial condition. Often these rate laws contain kinetic parameters with uncertain values, and in many cases it is desired to estimate these parameter values from given time-series data of metabolite concentrations: the system is then supposed to reproduce the given data. For this purpose, the distance between the given data set and the result of the simulation (the numerically, or in a few cases analytically, obtained solution of the differential equation system) is computed, and the values of the parameters are estimated so as to minimize this distance. One step further, it may be desired to estimate the mathematical structure of the differential equation system itself, because the real rate laws are not known for the reactions within the system under study. To this end, the program SBMLsqueezer allows the automatic creation of appropriate rate laws for all reactions within the network.
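
A minimal dynamic-simulation sketch in Python, assuming mass-action rate laws and made-up rate constants for the hypothetical chain A -> B -> C; a numerical integrator of the kind bundled with COPASI or SBMLsimulator then advances the system from an initial condition.

    import numpy as np
    from scipy.integrate import solve_ivp

    k1, k2 = 0.8, 0.3  # assumed kinetic parameters (to be estimated from data)

    def rhs(t, y):
        A, B, C = y
        v1 = k1 * A  # mass-action rate law for A -> B
        v2 = k2 * B  # mass-action rate law for B -> C
        return [-v1, v1 - v2, v2]

    sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0, 0.0])
    print("concentrations at t = 10:", sol.y[:, -1])

Parameter estimation would then wrap this integration in an optimizer that adjusts k1 and k2 to minimize the distance between the simulated trajectories and the measured time series.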

Synthetic accessibility

Synthetic accessibility is a simple approach to network simulation whose goal is to predict which metabolic gene knockouts are lethal. The synthetic accessibility approach uses the topology of the metabolic network to calculate the sum of the minimum number of steps needed to traverse the metabolic network graph from the inputs, those metabolites available to the organism from the environment, to the outputs, metabolites needed by the organism to survive. To simulate a gene knockout, the reactions enabled by the gene are removed from the network and the synthetic accessibility metric is recalculated. An increase in the total number of steps is predicted to cause lethality. Wunderlich and Mirny showed that this simple, parameter-free approach predicted knockout lethality in E. coli and S. cerevisiae as accurately as elementary mode analysis and flux balance analysis across a variety of media.
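
A sketch of the synthetic accessibility idea under strong simplifying assumptions: the network below is a hypothetical metabolite graph with one edge per reaction, breadth-first search gives the minimum number of steps from an input metabolite to an output metabolite, and an output that becomes unreachable after removing a reaction's edges is read as a predicted lethal knockout.

    from collections import deque

    # Hypothetical toy network: glucose -> G6P -> pyruvate -> ATP
    edges = {"glc": ["g6p"], "g6p": ["pyr"], "pyr": ["atp"]}

    def shortest(graph, source, target):
        """Breadth-first search for the minimum number of reaction steps."""
        seen, frontier = {source}, deque([(source, 0)])
        while frontier:
            node, dist = frontier.popleft()
            if node == target:
                return dist
            for nxt in graph.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, dist + 1))
        return float("inf")  # unreachable output: predicted lethal

    baseline = shortest(edges, "glc", "atp")            # 3 steps
    knockout = dict(edges, g6p=[])                      # delete the g6p -> pyr reaction
    print(baseline, shortest(knockout, "glc", "atp"))   # 3 inf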

Applications of a reconstruction

  • Several inconsistencies exist between gene, enzyme, reaction databases, and published literature sources regarding the metabolic information of an organism. A reconstruction is a systematic verification and compilation of data from various sources that takes into account all of the discrepancies.
  • The combination of relevant metabolic and genomic information of an organism.
  • Metabolic comparisons can be performed between various organisms of the same species as well as between different organisms.
  • Analysis of synthetic lethality
  • Predict adaptive evolution outcomes
  • Use in metabolic engineering for high value outputs.
Reconstructions and their corresponding models allow the formulation of hypotheses about the presence of certain enzymatic activities and the production of metabolites that can be experimentally tested, complementing the primarily discovery-based approach of traditional microbial biochemistry with hypothesis-driven research. The results of these experiments can uncover novel pathways and metabolic activities and resolve discrepancies in previous experimental data. Information about the chemical reactions of metabolism and the genetic background of various metabolic properties (sequence to structure to function) can be utilized by genetic engineers to modify organisms to produce high-value outputs, whether those products are medically relevant, like pharmaceuticals; high-value chemical intermediates, such as terpenoids and isoprenoids; or biotechnological outputs, like biofuels.

Metabolic network reconstructions and models are used to understand how an organism or parasite functions inside of the host cell. For example, if the parasite serves to compromise the immune system by lysing macrophages, then the goal of metabolic reconstruction/simulation would be to determine the metabolites that are essential to the organism's proliferation inside of macrophages. If the proliferation cycle is inhibited, then the parasite would not continue to evade the host's immune system. A reconstruction model serves as a first step to deciphering the complicated mechanisms surrounding disease. These models can also look at the minimal genes necessary for a cell to maintain virulence. The next step would be to use the predictions and postulates generated from a reconstruction model and apply it to discover novel biological functions such as drug-engineering and drug delivery techniques.

Political psychology

From Wikipedia, the free encyclopedia ...