
Wednesday, January 9, 2019

Public health

From Wikipedia, the free encyclopedia

Newspaper headlines from around the world about polio vaccine tests (13 April 1955)
 
Public health is "the science and art of preventing disease, prolonging life and promoting human health through organized efforts and informed choices of society, organizations, public and private, communities and individuals". Analyzing the health of a population and the threats it faces is the basis for public health. The "public" in question can be as small as a handful of people or an entire village, or as large as several continents, as in the case of a pandemic. "Health" takes into account physical, mental and social well-being; according to the World Health Organization, it is not merely the absence of disease or infirmity. Public health is interdisciplinary: epidemiology, biostatistics and health services are all relevant, and environmental health, community health, behavioral health, health economics, public policy, mental health, occupational safety, gender issues in health, and sexual and reproductive health are other important subfields.

Public health aims to improve the quality of life through prevention and treatment of disease, including mental health. This is done through the surveillance of cases and health indicators, and through the promotion of healthy behaviors. Common public health initiatives include promoting handwashing and breastfeeding, delivery of vaccinations, suicide prevention and distribution of condoms to control the spread of sexually transmitted diseases.

Modern public health practice requires multidisciplinary teams of public health workers and professionals. Teams might include epidemiologists, biostatisticians, medical assistants, public health nurses, midwives, medical microbiologists, economists, sociologists, geneticists and data managers. Depending on the need, environmental health officers or public health inspectors, bioethicists, veterinarians, gender experts, and sexual and reproductive health specialists might also be called on.

Access to health care and public health initiatives are difficult challenges in developing countries, where public health infrastructures are still forming.

Background

The focus of a public health intervention is to prevent and manage diseases, injuries and other health conditions through surveillance of cases and the promotion of healthy behaviors, communities and environments. Many diseases are preventable through simple, non-medical methods. For example, research has shown that the simple act of handwashing with soap can prevent the spread of many contagious diseases. In other cases, treating a disease or controlling a pathogen can be vital to preventing its spread to others, either during an outbreak of infectious disease or through contamination of food or water supplies. Public health communications programs, vaccination programs and distribution of condoms are examples of common preventive public health measures. Measures such as these have contributed greatly to the health of populations and increases in life expectancy. 

Public health plays an important role in disease prevention efforts in both the developing world and in developed countries through local health systems and non-governmental organizations. The World Health Organization (WHO) is the international agency that coordinates and acts on global public health issues. Most countries have their own government public health agencies, sometimes known as ministries of health, to respond to domestic health issues. For example, in the United States, the front line of public health initiatives is state and local health departments. The United States Public Health Service (PHS), led by the Surgeon General of the United States, and the Centers for Disease Control and Prevention, headquartered in Atlanta, are involved with several international health activities, in addition to their national duties. In Canada, the Public Health Agency of Canada is the national agency responsible for public health, emergency preparedness and response, and infectious and chronic disease control and prevention. The public health system in India is managed by the Ministry of Health & Family Welfare of the government of India through state-owned health care facilities.

Current practice

Public health programs

"There's a push and pull, as you know, between cheap alternatives for industry and public health concerns... We're always looking at retrospectively what the data shows... Unfortunately, for example, take tobacco: It took 50, 60 years of research before policy catches up with what the science is showing." — Laura Anderko, professor at Georgetown University and director of the Mid-Atlantic Center for Children's Health and the Environment, commenting on public health practices in response to a proposal to ban the pesticide chlorpyrifos.
Most governments recognize the importance of public health programs in reducing the incidence of disease, disability, and the effects of aging and other physical and mental health conditions. However, public health generally receives significantly less government funding compared with medicine. Public health programs providing vaccinations have made strides in promoting health, including the eradication of smallpox, a disease that plagued humanity for thousands of years. 

Three former directors of the Global Smallpox Eradication Programme read the news that smallpox had been globally eradicated, 1980
 
The World Health Organization (WHO) identifies core functions of public health programs including:
  • providing leadership on matters critical to health and engaging in partnerships where joint action is needed;
  • shaping a research agenda and stimulating the generation, translation and dissemination of valuable knowledge;
  • setting norms and standards and promoting and monitoring their implementation;
  • articulating ethical and evidence-based policy options;
  • monitoring the health situation and assessing health trends.
In particular, public health surveillance programs can:
  • serve as an early warning system for impending public health emergencies;
  • document the impact of an intervention, or track progress towards specified goals;
  • monitor and clarify the epidemiology of health problems, allow priorities to be set, and inform health policy and strategies; and
  • diagnose, investigate, and monitor health problems and health hazards of the community.
Public health surveillance has led to the identification and prioritization of many public health issues facing the world today, including HIV/AIDS, diabetes, waterborne diseases, zoonotic diseases, and antibiotic resistance leading to the reemergence of infectious diseases such as tuberculosis. Antibiotic resistance, also known as drug resistance, was the theme of World Health Day 2011. Although the prioritization of pressing public health issues is important, Laurie Garrett argues that it can have unintended consequences: when foreign aid is funnelled into disease-specific programs, the importance of public health in general is disregarded. This problem of stovepiping is thought to create a lack of funds to combat other existing diseases in a given country.

For example, the WHO reports that at least 220 million people worldwide suffer from diabetes. Its incidence is increasing rapidly, and it is projected that the number of diabetes deaths will double by the year 2030. In a June 2010 editorial in the medical journal The Lancet, the authors opined that "The fact that type 2 diabetes, a largely preventable disorder, has reached epidemic proportion is a public health humiliation." The risk of type 2 diabetes is closely linked with the growing problem of obesity. The WHO's latest estimates, as of June 2016, highlighted that in 2014 approximately 1.9 billion adults worldwide were overweight, as were 41 million children under the age of five. The United States is the leading country, with 30.6% of its population obese; Mexico follows at 24.2% and the United Kingdom at 23%. Once considered a problem only in high-income countries, obesity is now on the rise in low-income countries, especially in urban settings. Many public health programs are increasingly dedicating attention and resources to the issue of obesity, with objectives of addressing the underlying causes and promoting healthy diet and physical exercise.

Some programs and policies associated with public health promotion and prevention can be controversial. One such example is programs focusing on the prevention of HIV transmission through safe sex campaigns and needle-exchange programs. Another is the control of tobacco smoking. Changing smoking behavior requires long-term strategies, unlike the fight against communicable diseases, where effects can usually be observed over a shorter period. Many nations have implemented major initiatives to cut smoking, such as increased taxation and bans on smoking in some or all public places. Proponents present evidence that smoking is one of the major killers and argue that governments therefore have a duty to reduce the death rate, both by limiting passive (second-hand) smoking and by providing fewer opportunities for people to smoke. Opponents say that this undermines individual freedom and personal responsibility, and worry that the state may be emboldened to remove more and more choice in the name of better population health overall.

Simultaneously, while communicable diseases have historically ranked uppermost as a global health priority, non-communicable diseases and the underlying behavior-related risk factors have been at the bottom. This is changing, however, as illustrated by the United Nations hosting its first General Assembly Special Summit on the issue of non-communicable diseases in September 2011.

Many health problems are due to maladaptive personal behaviors. From an evolutionary psychology perspective, overconsumption of novel substances that are harmful is due to the activation of an evolved reward system for substances such as drugs, tobacco, alcohol, refined salt, fat, and carbohydrates. New technologies such as modern transportation also reduce physical activity. Research has found that behavior is more effectively changed by taking evolutionary motivations into consideration rather than only presenting information about health effects. The marketing industry has long known the importance of associating products with high status and attractiveness to others. Films are increasingly being recognized as a public health tool; indeed, film festivals and competitions have been established specifically to promote films about health. Conversely, it has been argued that emphasizing the harmful and undesirable effects of tobacco smoking on other persons and imposing smoking bans in public places have been particularly effective in reducing tobacco smoking.

Applications in health care

As well as seeking to improve population health through the implementation of specific population-level interventions, public health contributes to medical care by identifying and assessing population needs for health care services, including:
  • Assessing current services and evaluating whether they are meeting the objectives of the health care system
  • Ascertaining requirements as expressed by health professionals, the public and other stakeholders
  • Identifying the most appropriate interventions
  • Considering the effect on resources for proposed interventions and assessing their cost-effectiveness
  • Supporting decision making in health care and planning health services including any necessary changes.
  • Informing, educating, and empowering people about health issues

Implementing effective improvement strategies

One important strategy to improve public health is to promote modern medicine and scientific neutrality in driving public health policy and campaigns, an approach recommended by Armanda Solorzana based on a case study of the Rockefeller Foundation's hookworm campaign in Mexico in the 1920s. Solorzana argues that public health policy cannot be concerned only with politics or economics. Political concerns, such as upcoming elections, can lead government officials to hide the real numbers of people affected by disease in their regions. Scientific neutrality in making public health policy is therefore critical; it can ensure that treatment needs are met regardless of political and economic conditions.

The history of public health care clearly shows the global effort to improve health care for all. However, in modern-day medicine, real, measurable change has not been clearly seen, and critics argue that this lack of improvement is due to the ineffective methods that are being implemented. As argued by Paul E. Farmer, structural interventions could possibly have a large impact, yet there are numerous reasons why this strategy has yet to be incorporated into the health system. One of the main reasons he suggests is that physicians are not properly trained to carry out structural interventions, meaning that ground-level health care professionals cannot implement these improvements. While structural interventions cannot be the only area for improvement, the lack of coordination between socioeconomic factors and health care for the poor could be counterproductive, and end up causing greater inequity between the health care services received by the rich and by the poor. Unless health care stops being treated as a commodity, and the way in which it is delivered to those with less access changes, the universal goal of public health care will not be achieved.

Another reason why measurable changes may not be noticed in public health is because agencies themselves may not be measuring their programs' efficacy. Perrault et al. analyzed over 4,000 published objectives from Community Health Improvement Plans (CHIPs) of 280 local accredited and non-accredited public health agencies in the U.S., and found that the majority of objectives - around two-thirds - were focused on achieving agency outputs (e.g., developing communication plans, installing sidewalks, disseminating data to the community). Only about one-third focused on seeking measurable changes in the populations they serve (i.e., changing people's knowledge, attitudes, behaviors). What this research showcases is that if agencies are only focused on accomplishing tasks (i.e., outputs) and do not have a focus on measuring actual changes in their populations with the activities they perform, it should not be surprising when measurable changes are not reported. Perrault et al. advocate for public health agencies to work with those in the discipline of Health Communication to craft objectives that are measurable outcomes, and to assist agencies in developing tools and methods to be able to track more proximal changes in their target populations (e.g., knowledge and attitude shifts) that may be influenced by the activities the agencies are performing.

Public Health 2.0

Public Health 2.0 is a movement within public health that aims to make the field more accessible to the general public and more user-driven. The term is used in three senses. In the first sense, "Public Health 2.0" is similar to "Health 2.0" and describes the ways in which traditional public health practitioners and institutions are reaching out (or could reach out) to the public through social media and health blogs.

In the second sense, "Public Health 2.0" describes public health research that uses data gathered from social networking sites, search engine queries, cell phones, or other technologies. A recent example is the proposal of a statistical framework that uses online user-generated content (from social media or search engine queries) to estimate the impact of an influenza vaccination campaign in the UK.
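
As a rough illustration of the idea only (not the published framework itself), the following Python sketch compares flu-related search-query rates in an area covered by a hypothetical vaccination campaign against a comparable control area, using a simple difference-in-differences calculation; all area names and figures are invented for the example:

    from statistics import mean

    # Hypothetical weekly rates of flu-related search queries per 100,000
    # users in an area covered by a vaccination campaign and in a comparable
    # control area; the campaign is assumed to start in week 5.
    weeks = list(range(1, 9))
    campaign_area = [42, 45, 50, 48, 31, 27, 24, 22]
    control_area = [40, 44, 49, 51, 47, 46, 44, 45]
    campaign_start = 5

    def pre_post_means(series):
        pre = [v for w, v in zip(weeks, series) if w < campaign_start]
        post = [v for w, v in zip(weeks, series) if w >= campaign_start]
        return mean(pre), mean(post)

    camp_pre, camp_post = pre_post_means(campaign_area)
    ctrl_pre, ctrl_post = pre_post_means(control_area)

    # Difference-in-differences: the change in the campaign area minus the
    # change in the control area, taken as a crude proxy for campaign impact.
    impact = (camp_post - camp_pre) - (ctrl_post - ctrl_pre)
    print(f"Estimated change attributable to the campaign: {impact:.1f} per 100,000")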

In the third sense, "Public Health 2.0" is used to describe public health activities that are completely user-driven. An example is the collection and sharing of information about environmental radiation levels after the March 2011 tsunami in Japan. In all cases, Public Health 2.0 draws on ideas from Web 2.0, such as crowdsourcing, information sharing, and user-centred design. While many individual healthcare providers have started making their own personal contributions to "Public Health 2.0" through personal blogs, social profiles, and websites, other larger organizations, such as the American Heart Association (AHA) and United Medical Education (UME), have larger teams of employees centered around online health education, research, and training. These private organizations recognize the need for free and easy-to-access health materials, often building libraries of educational articles.

Low- and middle-income countries

Emergency Response Team in Burma after Cyclone Nargis in 2008
 
There is a great disparity in access to health care and public health initiatives between developed nations and developing nations. In the developing world, public health infrastructures are still forming. There may not be enough trained health workers, monetary resources or, in some cases, sufficient knowledge to provide even a basic level of medical care and disease prevention. As a result, a large majority of disease and mortality in the developing world results from and contributes to extreme poverty. For example, many African governments spend less than US$10 per person per year on health care, while, in the United States, the federal government spent approximately US$4,500 per capita in 2000. However, expenditures on health care should not be confused with spending on public health. Public health measures may not generally be considered "health care" in the strictest sense. For example, mandating the use of seat belts in cars can save countless lives and contribute to the health of a population, but typically money spent enforcing this rule would not count as money spent on health care.

Large parts of the world remain plagued by largely preventable or treatable infectious diseases. In addition, many low- and middle-income countries are experiencing an epidemiological shift and polarization in which populations are now experiencing more of the effects of chronic diseases as life expectancy increases, with poorer communities heavily affected by both chronic and infectious diseases. Another major public health concern in the developing world is poor maternal and child health, exacerbated by malnutrition and poverty. The WHO reports that a lack of exclusive breastfeeding during the first six months of life contributes to over a million avoidable child deaths each year. Intermittent preventive therapy aimed at treating and preventing malaria episodes among pregnant women and young children is one public health measure in endemic countries.

Each day brings new front-page headlines about public health: emerging infectious diseases such as SARS, rapidly making its way from China (see Public health in China) to Canada, the United States and other geographically distant countries; reducing inequities in health care access through publicly funded health insurance programs; the HIV/AIDS pandemic and its spread from certain high-risk groups to the general population in many countries, such as South Africa; the increase of childhood obesity and the concomitant increase in type II diabetes among children; the social, economic and health effects of adolescent pregnancy; and the public health challenges related to natural disasters such as the 2004 Indian Ocean tsunami, Hurricane Katrina in the United States in 2005 and the 2010 Haiti earthquake.

Since the 1980s, the growing field of population health has broadened the focus of public health from individual behaviors and risk factors to population-level issues such as inequality, poverty, and education. Modern public health is often concerned with addressing determinants of health across a population. There is a recognition that health is affected by many factors, including where we live, genetics, income, educational status and social relationships; these are known as "social determinants of health". Upstream drivers such as environment, education, employment, income, food security, housing, social inclusion and many others affect the distribution of health between and within populations and are often shaped by policy. A social gradient in health runs through society: the poorest generally suffer the worst health, but even the middle classes will generally have worse health outcomes than those of a higher social stratum. The new public health advocates for population-based policies that improve health in an equitable manner.

Health aid in less developed countries

Health aid to developing countries is an important source of public health funding for many low- and middle-income countries. Health aid to developing countries increased significantly after World War II as concerns over the spread of disease as a result of globalization grew and as the HIV/AIDS epidemic in sub-Saharan Africa surfaced. From 1990 to 2010, total health aid from developed countries increased from US$5.5 billion to US$26.87 billion, with wealthy countries continuously donating billions of dollars every year with the goal of improving population health. Some efforts, however, receive a significantly larger proportion of funds: HIV, for example, received an increase in funding of over US$6 billion between 2000 and 2010, more than twice the increase seen in any other sector during those years. Health aid has expanded through multiple channels, including private philanthropy, non-governmental organizations, private foundations such as the Bill & Melinda Gates Foundation, bilateral donors, and multilateral donors such as the World Bank or UNICEF. In 2009, health aid from the OECD amounted to US$12.47 billion, or 11.4% of its total bilateral aid, and multilateral donors were found to spend 15.3% of their total aid on bettering public healthcare. Recent data, however, show that international health aid has plateaued and may begin to decrease.

International health aid debates

Debates exist questioning the efficacy of international health aid. Proponents of aid claim that health aid from wealthy countries is necessary for developing countries to escape the poverty trap. Opponents claim that international health aid actually disrupts developing countries' course of development, causes dependence on aid, and in many cases fails to reach its recipients. For example, health aid has recently been funneled towards initiatives such as financing new technologies like antiretroviral medication, insecticide-treated mosquito nets, and new vaccines. The positive impacts of such initiatives can be seen in the eradication of smallpox and the near-eradication of polio; however, critics claim that misuse or misplacement of funds may cause many of these efforts never to come to fruition.

Economic modeling based on data from the Institute for Health Metrics and Evaluation and the World Health Organization has shown a link between international health aid in developing countries and a reduction in adult mortality rates. However, a 2014-2016 study suggests that a potential confounding variable for this outcome is the possibility that aid was directed at countries that were already on track for improvement. That same study also suggests, however, that US$1 billion in health aid was associated with 364,000 fewer deaths between ages 0 and 5 in 2011.

Sustainable development goals 2030

To address current and future challenges in addressing health issues in the world, the United Nations has developed the Sustainable Development Goals, building on the Millennium Development Goals of 2000, to be completed by 2030. Taken together, these goals encompass the entire spectrum of development across nations; however, Goals 1-6 directly address health disparities, primarily in developing countries. These six goals address key issues in global public health: poverty; hunger and food security; health; education; gender equality and women's empowerment; and water and sanitation. Public health officials can use these goals to set their own agendas and plan smaller-scale initiatives for their organizations. These goals are intended to lessen the burden of disease and inequality faced by developing countries and lead to a healthier future.

The links between the various sustainable development goals and public health are numerous and well established:
  • Living below the poverty line is attributed to poorer health outcomes and can be even worse for persons living in developing countries where extreme poverty is more common. A child born into poverty is twice as likely to die before the age of five compared to a child from a wealthier family.
  • The detrimental effects of hunger and malnutrition that can arise from systemic challenges with food security are enormous. The World Health Organization estimates that 12.9 percent of the population in developing countries is undernourished.
  • Health challenges in the developing world are enormous, with "only half of the women in developing nations receiving the recommended amount of healthcare they need".
  • Educational equity has yet to be reached in the world. Public health efforts are impeded by this, as a lack of education can lead to poorer health outcomes. This is shown by children of mothers who have no education having a lower survival rate compared to children born to mothers with primary or greater levels of education. Cultural attitudes toward the role of women vary by country, and many gender inequalities are found in developing nations. Combating these inequalities has been shown to lead to better public health outcomes.
  • In studies done by the World Bank on populations in developing countries, it was found that when women had more control over household resources, the children benefit through better access to food, healthcare, and education.
  • Basic sanitation resources and access to clean sources of water are a basic human right. However, 1.8 billion people globally use a source of drinking water that is fecally contaminated, and 2.4 billion people lack access to basic sanitation facilities like toilets or pit latrines. A lack of these resources causes approximately 1,000 children a day to die from diarrheal diseases that could have been prevented by better water and sanitation infrastructure.

U.S. initiatives

The U.S. Global Health Initiative was created in 2009 by President Obama in an attempt to take a more holistic, comprehensive approach to improving global health, as opposed to previous, disease-specific interventions. The Global Health Initiative is a six-year plan "to develop a comprehensive U.S. government strategy for global health, building on the President's Emergency Plan for AIDS Relief (PEPFAR) to combat HIV as well as U.S. efforts to address tuberculosis (TB) and malaria, and augmenting the focus on other global health priorities, including neglected tropical diseases (NTDs), maternal, newborn and child health (MNCH), family planning and reproductive health (FP/RH), nutrition, and health systems strengthening (HSS)". GHI programs are being implemented in more than 80 countries around the world and work closely with the United States Agency for International Development, the Centers for Disease Control and Prevention, and the United States Deputy Secretary of State.

There are seven core principles:
  1. Women, girls, and gender equality
  2. Strategic coordination and integration
  3. Strengthen and leverage key multilaterals and other partners
  4. Country-ownership
  5. Sustainability through Health Systems
  6. Improve metrics, monitoring, and evaluation
  7. Promote research and innovation
The aid effectiveness agenda is a useful tool for measuring the impact of large-scale programs such as The Global Fund to Fight AIDS, Tuberculosis and Malaria and the Global Alliance for Vaccines and Immunization (GAVI), which have been successful in achieving rapid and visible results. The Global Fund claims that its efforts have provided antiretroviral treatment for over three million people worldwide. GAVI claims that its vaccination programs have prevented over 5 million deaths since it began in 2000.

Education and training

Education and training of public health professionals is available throughout the world in Schools of Public Health, Medical Schools, Veterinary Schools, Schools of Nursing, and Schools of Public Affairs. The training typically requires a university degree with a focus on core disciplines of biostatistics, epidemiology, health services administration, health policy, health education, behavioral science, gender issues, sexual and reproductive health, public health nutrition and environmental and occupational health. In the global context, the field of public health education has evolved enormously in recent decades, supported by institutions such as the World Health Organization and the World Bank, among others. Operational structures are formulated by strategic principles, with educational and career pathways guided by competency frameworks, all requiring modulation according to local, national and global realities. It is critically important for the health of populations that nations assess their public health human resource needs and develop their ability to deliver this capacity, and not depend on other countries to supply it.

Schools of public health: a US perspective

In the United States, the Welch-Rose Report of 1915 has been viewed as the basis for a critical moment in the history of the institutional schism between public health and medicine, because it led to the establishment of schools of public health supported by the Rockefeller Foundation. The report was authored by William Welch, founding dean of the Johns Hopkins Bloomberg School of Public Health, and Wickliffe Rose of the Rockefeller Foundation. The report focused more on research than practical education. Some have blamed the Rockefeller Foundation's 1916 decision to support the establishment of schools of public health for creating the schism between public health and medicine and for legitimizing the rift between medicine's laboratory investigation of the mechanisms of disease and public health's nonclinical concern with environmental and social influences on health and wellness.

Even though schools of public health had already been established in Canada, Europe and North Africa, the United States had still maintained the traditional system of housing faculties of public health within its medical institutions. A $25,000 donation from businessman Samuel Zemurray instituted the School of Public Health and Tropical Medicine at Tulane University in 1912, which conferred its first doctor of public health degree in 1914. The Yale School of Public Health was founded by Charles-Edward Avory Winslow in 1915. The Johns Hopkins School of Hygiene and Public Health, founded in 1916, became an independent, degree-granting institution for research and training in public health and the largest public health training facility in the United States. By 1922, schools of public health had been established at Columbia and Harvard on the Hopkins model. By 1999 there were twenty-nine schools of public health in the US, enrolling around fifteen thousand students.

Over the years, the types of students and training provided have also changed. In the beginning, students who enrolled in public health schools typically had already obtained a medical degree; public health school training was largely a second degree for medical professionals. However, in 1978, 69% of American students enrolled in public health schools had only a bachelor's degree.

Degrees in public health

Schools of public health offer a variety of degrees which generally fall into two categories: professional or academic. The two major postgraduate degrees are the Master of Public Health (MPH) and the Master of Science in Public Health (MSPH). Doctoral studies in this field include the Doctor of Public Health (DrPH) and the Doctor of Philosophy (PhD) in a subspecialty of the broader public health disciplines. The DrPH is regarded as a professional degree and the PhD as more of an academic degree.

Professional degrees are oriented towards practice in public health settings. The Master of Public Health, Doctor of Public Health, Doctor of Health Science (DHSc) and the Master of Health Care Administration are examples of degrees geared towards people who want careers as practitioners of public health in health departments, managed care and community-based organizations, hospitals and consulting firms, among others. Master of Public Health degrees broadly fall into two categories: those that put more emphasis on an understanding of epidemiology and statistics as the scientific basis of public health practice, and those that include a more eclectic range of methodologies. A Master of Science in Public Health is similar to an MPH but is considered an academic degree (as opposed to a professional degree) and places more emphasis on scientific methods and research. The same distinction can be made between the DrPH and the DHSc: the DrPH is considered a professional degree and the DHSc an academic degree.

Academic degrees are more oriented towards those with interests in the scientific basis of public health and preventive medicine who wish to pursue careers in research, university teaching in graduate programs, policy analysis and development, and other high-level public health positions. Examples of academic degrees are the Master of Science, Doctor of Philosophy, Doctor of Science (ScD), and Doctor of Health Science (DHSc). The doctoral programs are distinct from the MPH and other professional programs by the addition of advanced coursework and the nature and scope of a dissertation research project. 

In the United States, the Association of Schools of Public Health represents Council on Education for Public Health (CEPH) accredited schools of public health. Delta Omega is the honor society for graduate studies in public health. The society was founded in 1924 at the Johns Hopkins School of Hygiene and Public Health. Currently, there are approximately 68 chapters throughout the United States and Puerto Rico.

History

Early history

The primitive nature of medieval medicine rendered Europe helpless to the onslaught of the Black Death in the 14th century. Miniature from "The Chronicles of Gilles Li Muisis" (1272-1352). Bibliothèque royale de Belgique, MS 13076-77, f. 24v.
 
Public health has roots in antiquity. From the beginnings of human civilization, it was recognized that polluted water and lack of proper waste disposal spread communicable diseases (theory of miasma). Early religions attempted to regulate behaviors that specifically related to health, from the types of food eaten to indulgent behaviors such as drinking alcohol or sexual relations. Leaders were responsible for the health of their subjects in order to ensure social stability, prosperity, and order.

By Roman times, it was well understood that proper diversion of human waste was a necessary tenet of public health in urban areas. Ancient Chinese physicians developed the practice of variolation following a smallpox epidemic around 1000 BC. An individual without the disease could gain some measure of immunity against it by inhaling the dried crusts that formed around the lesions of infected individuals. Children were also protected by inoculating a scratch on their forearms with the pus from a lesion.

In 1485 the Republic of Venice established a permanent Venetian Magistrate for Health, comprising supervisors of health with special attention to preventing the spread of epidemics into the territory from abroad. The three supervisors were initially appointed by the Venetian Senate. In 1537 the appointment was assumed by the Grand Council, and in 1556 two judges were added, tasked with overseeing, on behalf of the Republic, the work of the supervisors.

According to Michel Foucault, the plague model of governmentality was later countered by the cholera model. A cholera pandemic devastated Europe between 1829 and 1851, and was first fought by the use of what Foucault called "social medicine", which focused on flux, the circulation of air, the location of cemeteries, and similar concerns. All of those concerns, born of the miasma theory of disease, were mixed with urbanistic concerns for the management of populations, which Foucault designated with the concept of "biopower". The Germans conceptualized this in the Polizeiwissenschaft ("police science").

Modern public health

The 18th century saw rapid growth in voluntary hospitals in England. The latter part of the century brought the establishment of the basic pattern of improvements in public health over the next two centuries: a social evil was identified, private philanthropists brought attention to it, and changing public opinion led to government action.

1802 caricature of Edward Jenner vaccinating patients who feared it would make them sprout cowlike appendages.
 
The practice of vaccination became prevalent in the 1800s, following the pioneering work of Edward Jenner in preventing smallpox. James Lind's discovery of the causes of scurvy amongst sailors and its mitigation via the introduction of fruit on lengthy voyages was published in 1754 and led to the adoption of this idea by the Royal Navy. Efforts were also made to promulgate health matters to the broader public; in 1752 the British physician Sir John Pringle published Observations on the Diseases of the Army in Camp and Garrison, in which he advocated for the importance of adequate ventilation in military barracks and the provision of latrines for soldiers.

With the onset of the Industrial Revolution, living standards amongst the working population began to worsen, with cramped and unsanitary urban conditions. In the first four decades of the 19th century alone, London's population doubled, and even greater growth rates were recorded in the new industrial towns, such as Leeds and Manchester. This rapid urbanization exacerbated the spread of disease in the large conurbations that built up around the workhouses and factories. These settlements were cramped and primitive, with no organized sanitation. Disease was inevitable, and its incubation in these areas was encouraged by the poor lifestyle of the inhabitants. A shortage of housing led to the rapid growth of slums, and the per capita death rate began to rise alarmingly, almost doubling in Birmingham and Liverpool. Thomas Malthus warned of the dangers of overpopulation in 1798. His ideas, as well as those of Jeremy Bentham, became very influential in government circles in the early years of the 19th century.

Public health legislation

Sir Edwin Chadwick was a pivotal influence on the early public health campaign.
 
The first attempts at sanitary reform and the establishment of public health institutions were made in the 1840s. Thomas Southwood Smith, physician at the London Fever Hospital, began to write papers on the importance of public health, and was one of the first physicians brought in to give evidence before the Poor Law Commission in the 1830s, along with Neil Arnott and James Phillips Kay. Smith advised the government on the importance of quarantine and sanitary improvement for limiting the spread of infectious diseases such as cholera and yellow fever.

The Poor Law Commission reported in 1838 that "the expenditures necessary to the adoption and maintenance of measures of prevention would ultimately amount to less than the cost of the disease now constantly engendered". It recommended the implementation of large-scale government engineering projects to alleviate the conditions that allowed for the propagation of disease. The Health of Towns Association was formed in Exeter on 11 December 1844, and vigorously campaigned for the development of public health in the United Kingdom. Its formation followed the 1843 establishment of the Health of Towns Commission, chaired by Sir Edwin Chadwick, which produced a series of reports on poor and insanitary conditions in British cities.

These national and local movements led to the Public Health Act, finally passed in 1848. It aimed to improve the sanitary condition of towns and populous places in England and Wales by placing the supply of water, sewerage, drainage, cleansing and paving under a single local body with the General Board of Health as a central authority. The Act was passed by the Liberal government of Lord John Russell, in response to the urging of Edwin Chadwick. Chadwick's seminal report on The Sanitary Condition of the Labouring Population was published in 1842 and was followed up with a supplementary report a year later.

Vaccination for various diseases was made compulsory in the United Kingdom in 1851, and by 1871 legislation required a comprehensive system of registration run by appointed vaccination officers.

Further interventions were made by a series of subsequent Public Health Acts, notably the 1875 Act. Reforms included latrinization, the building of sewers, the regular collection of garbage followed by incineration or disposal in a landfill, the provision of clean water and the draining of standing water to prevent the breeding of mosquitoes.

The Infectious Disease (Notification) Act 1889 mandated the reporting of infectious diseases to the local sanitary authority, which could then pursue measures such as the removal of the patient to hospital and the disinfection of homes and properties.

In the United States, the first public health organization based on a state health department and local boards of health was founded in New York City in 1866.

Epidemiology

John Snow's dot map, showing the clusters of cholera cases in the London epidemic of 1854.
 
The science of epidemiology was founded by John Snow's identification of a polluted public water well as the source of an 1854 cholera outbreak in London. Dr. Snow believed in the germ theory of disease as opposed to the prevailing miasma theory. He first publicized his theory in an essay, On the Mode of Communication of Cholera, in 1849, followed by a more detailed treatise in 1855 incorporating the results of his investigation of the role of the water supply in the Soho epidemic of 1854.

By talking to local residents (with the help of Reverend Henry Whitehead), he identified the source of the outbreak as the public water pump on Broad Street (now Broadwick Street). Although Snow's chemical and microscope examination of a water sample from the Broad Street pump did not conclusively prove its danger, his studies of the pattern of the disease were convincing enough to persuade the local council to disable the well pump by removing its handle.

Snow later used a dot map to illustrate the cluster of cholera cases around the pump. He also used statistics to illustrate the connection between the quality of the water source and cholera cases. He showed that the Southwark and Vauxhall Waterworks Company was taking water from sewage-polluted sections of the Thames and delivering the water to homes, leading to an increased incidence of cholera. Snow's study was a major event in the history of public health and geography. It is regarded as the founding event of the science of epidemiology.
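
The logic of that comparison can be shown in a short Python sketch; the supplier names follow Snow's study, but the house and death counts used here are rounded, illustrative figures rather than a transcription of his published table:

    # Illustrative comparison of cholera mortality by water supplier, in the
    # spirit of Snow's analysis; the counts are rounded examples, not Snow's
    # actual published numbers.
    suppliers = {
        "Southwark and Vauxhall (sewage-polluted intake)": (40_000, 1_263),
        "Lambeth (cleaner upstream intake)": (26_000, 98),
    }

    for name, (houses, deaths) in suppliers.items():
        rate = deaths / houses * 10_000  # deaths per 10,000 houses supplied
        print(f"{name}: {rate:.0f} deaths per 10,000 houses")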

Disease control

Paul-Louis Simond injecting a plague vaccine in Karachi, 1898.
 
With the pioneering work in bacteriology of French chemist Louis Pasteur and German scientist Robert Koch, methods for isolating the bacteria responsible for a given disease and vaccines for remedy were developed at the turn of the 20th century. British physician Ronald Ross identified the mosquito as the carrier of malaria and laid the foundations for combating the disease. Joseph Lister revolutionized surgery by introducing antiseptic techniques to eliminate infection. French epidemiologist Paul-Louis Simond proved that plague was carried by fleas on the backs of rats, and Cuban scientist Carlos J. Finlay and Americans Walter Reed and James Carroll demonstrated that mosquitoes carry the virus responsible for yellow fever. Brazilian scientist Carlos Chagas identified a tropical disease, now known as Chagas disease, and its vector.

With the onset of the epidemiological transition, and as the prevalence of infectious diseases decreased through the 20th century, public health began to put more focus on chronic diseases such as cancer and heart disease. Previous efforts in many developed countries had already led to dramatic reductions in the infant mortality rate using preventive methods. In Britain, the infant mortality rate fell from over 15% in 1870 to 7% by 1930.

Country examples

France

From 1871 to 1914, France lagged well behind Bismarckian Germany, as well as Great Britain, in developing the welfare state, including public health. Tuberculosis was the most dreaded disease of the day, especially striking young people in their 20s. Germany set up vigorous measures of public hygiene and public sanatoria, but France let private physicians handle the problem, which left the country with a much higher death rate. The French medical profession jealously guarded its prerogatives, and public health activists were not as well organized or as influential as in Germany, Britain or the United States. For example, there was a long battle over a public health law which began in the 1880s as a campaign to reorganize the nation's health services, require the registration of infectious diseases, mandate quarantines, and improve the deficient health and housing legislation of 1850. However, the reformers met opposition from bureaucrats, politicians, and physicians. Because it was so threatening to so many interests, the proposal was debated and postponed for 20 years before becoming law in 1902. Success finally came when the government realized that contagious diseases had a national security impact, weakening military recruits and keeping the population growth rate well below Germany's.

United States


Modern public health began developing in the 19th century, as a response to advances in science that led to an understanding of the source and spread of disease. As knowledge of contagious diseases increased, means to control them and prevent infection were soon developed. Once it became understood that these strategies would require community-wide participation, disease control began to be viewed as a public responsibility. Various organizations and agencies were then created to implement these disease-preventing strategies.

Before the mid-20th century, most public health activity in the United States took place at the municipal level, with some activity at the national and state levels as well.

During the administration of John Adams, the second president of the United States, Congress authorized the creation of hospitals for mariners. As the U.S. expanded, the scope of the governmental health agency grew. In the United States, public health worker Sara Josephine Baker, M.D., established many programs to help the poor in New York City keep their infants healthy, leading teams of nurses into the crowded neighborhoods of Hell's Kitchen and teaching mothers how to dress, feed, and bathe their babies.

Another key pioneer of public health in the U.S. was Lillian Wald, who founded the Henry Street Settlement house in New York. The Visiting Nurse Service of New York was a significant organization for bringing health care to the urban poor. 

The dramatic increase in average life span in the late 19th and 20th centuries is widely credited to public health achievements, such as vaccination programs and the control of many infectious diseases including polio, diphtheria, yellow fever and smallpox; effective health and safety policies such as road traffic safety and occupational safety; improved family planning; tobacco control measures; and programs designed to decrease non-communicable diseases by acting on known risk factors such as a person's background, lifestyle and environment.

Another major public health improvement was the decline in the "urban penalty" brought about by improvements in sanitation. These improvements included chlorination of drinking water, filtration, and sewage treatment, which led to the decline in deaths caused by infectious waterborne diseases such as cholera and other intestinal diseases. The federal Office of Indian Affairs (OIA) operated a large-scale field nursing program. Field nurses targeted Native women for health education, emphasizing personal hygiene, infant care and nutrition.

Mexico

Logo for the Mexican Social Security Institute, a governmental agency dealing with public health.
 
Public health issues were important for the Spanish empire during the colonial era. Epidemic disease was the main factor in the decline of indigenous populations in the period immediately following the sixteenth-century conquest and remained a problem throughout the colonial era. The Spanish crown took steps in eighteenth-century Mexico to bring in regulations to make populations healthier.

In the late nineteenth century, Mexico was in the process of modernization, and public health issues were again tackled from a scientific point of view. Even during the Mexican Revolution (1910–20), public health was an important concern, with a text on hygiene published in 1916. During the Revolution, feminist and trained nurse Elena Arizmendi Mejia founded the Neutral White Cross, treating wounded soldiers no matter which faction they fought for.

In the post-revolutionary period after 1920, improved public health was a revolutionary goal of the Mexican government. The Mexican state promoted the health of the Mexican population, with most resources going to cities. Concern about disease conditions and about social impediments to the improvement of Mexicans' health was important in the formation of the Mexican Society for Eugenics, a movement that flourished from the 1920s to the 1940s; Mexico was not alone in Latin America or the world in promoting eugenics. Government campaigns against disease and alcoholism were also seen as promoting public health.

The Mexican Social Security Institute was established in 1943, during the administration of President Manuel Ávila Camacho, to deal with public health, pensions, and social security.

Cuba

Since the 1959 Cuban Revolution, the Cuban government has devoted extensive resources to improving health conditions for its entire population via universal access to health care, and infant mortality has plummeted. Under the policy of Cuban medical internationalism, the government has sent doctors abroad as a form of aid and export to countries in need in Latin America, especially Venezuela, as well as in Oceania and Africa.

Colombia and Bolivia

Public health was important elsewhere in Latin America in consolidating state power and integrating marginalized populations into the nation-state. In Colombia, public health was a means for creating and implementing ideas of citizenship. In Bolivia, a similar push came after their 1952 revolution.

Ghana

Though curable and preventable, malaria remains a huge public health problem and is the third leading cause of death in Ghana. In the absence of a vaccine, mosquito control, or access to anti-malaria medication, public health methods become the main strategy for reducing the prevalence and severity of malaria. These methods include reducing breeding sites, screening doors and windows, insecticide sprays, prompt treatment following infection, and use of insecticide-treated mosquito nets. Distribution and sale of insecticide-treated mosquito nets is a common, cost-effective anti-malaria public health intervention; however, barriers to use exist, including cost, household and family organization, access to resources, and social and behavioral determinants, which have been shown to affect not only malaria prevalence rates but also mosquito net use.

Universal health care

From Wikipedia, the free encyclopedia

Universal health care, 2018
 
Universal health care (also called universal health coverage, universal coverage, universal care, or socialized health care) is a health care system that provides health care and financial protection to all citizens of a particular country. It is organized around providing a specified package of benefits to all members of a society with the end goal of providing financial risk protection, improved access to health services, and improved health outcomes.

Universal health care does not imply coverage for all people for everything. Universal health care can be determined by three critical dimensions: who is covered, what services are covered, and how much of the cost is covered. It is described by the World Health Organization as a situation where citizens can access health services without incurring financial hardship. The Director-General of the WHO has described universal health coverage as the "single most powerful concept that public health has to offer", since it unifies "services and delivers them in a comprehensive and integrated way". One of the goals of universal health care is to create a system of protection which provides equality of opportunity for people to enjoy the highest possible level of health.

As part of the Sustainable Development Goals, United Nations member states have agreed to work toward worldwide universal health coverage by 2030.

History

The first move towards a national health insurance system was launched in Germany in 1883, with the Sickness Insurance Law. Industrial employers were mandated to provide injury and illness insurance for their low-wage workers, and the system was funded and administered by employees and employers through "sick funds", which were drawn from deductions in workers' wages and from employers' contributions. Other countries soon began to follow suit. In the United Kingdom, the National Insurance Act 1911 provided coverage for primary care (but not specialist or hospital care) for wage earners, covering about one third of the population. The Russian Empire established a similar system in 1912, and other industrialized countries began following suit. By the 1930s, similar systems existed in virtually all of Western and Central Europe. Japan introduced an employee health insurance law in 1927, expanding further upon it in 1935 and 1940. Following the Russian Revolution of 1917, the Soviet Union established a fully public and centralized health care system in 1920. However, it was not a truly universal system at that point, as rural residents were not covered.

In New Zealand, a universal health care system was created in a series of steps, from 1939 to 1941. In Australia, the state of Queensland introduced a free public hospital system in the 1940s. 

Following World War II, universal health care systems began to be set up around the world. On July 5, 1948, the United Kingdom launched its universal National Health Service. Universal health care was next introduced in the Nordic countries of Sweden (1955), Iceland (1956), Norway (1956), Denmark (1961), and Finland (1964). Universal health insurance was then introduced in Japan (1961), and in Canada through stages, starting with the province of Saskatchewan in 1962, followed by the rest of Canada from 1968 to 1972. The Soviet Union extended universal health care to its rural residents in 1969. Italy introduced its Servizio Sanitario Nazionale (National Health Service) in 1978. Universal health insurance was implemented in Australia beginning with the Medibank system which led to universal coverage under the Medicare system.

From the 1970s to the 2000s, Southern and Western European countries began introducing universal coverage, most of them building upon previous health insurance programs to cover the whole population. For example, France built upon its 1928 national health insurance system, with subsequent legislation covering a larger and larger percentage of the population, until the remaining 1% of the population that was uninsured received coverage in 2000. In addition, universal health coverage was introduced in some Asian countries, including South Korea (1989), Taiwan (1995), Israel (1995), and Thailand (2001).

Following the collapse of the Soviet Union, Russia retained and reformed its universal health care system, as did other former Soviet nations and Eastern bloc countries. 

Beyond the 1990s, many countries in Latin America, the Caribbean, Africa, and the Asia-Pacific region, including developing countries, took steps to bring their populations under universal health coverage, including China, which has the largest universal health care system in the world, and Brazil, whose SUS program improved coverage to up to 80% of the population. A 2012 study examined progress being made by these countries, focusing on nine in particular: Ghana, Rwanda, Nigeria, Mali, Kenya, India, Indonesia, the Philippines, and Vietnam.

Funding models

Universal health care in most countries has been achieved by a mixed model of funding. General taxation revenue is the primary source of funding, but in many countries it is supplemented by specific levies (which may be charged to the individual and/or an employer) or by the option of private payments (through direct or optional insurance) for services beyond those covered by the public system. Almost all European systems are financed through a mix of public and private contributions. Most universal health care systems are funded primarily by tax revenue (as in Portugal, Spain, Denmark, and Sweden). Some nations, such as Germany, France, and Japan, employ a multi-payer system in which health care is funded by private and public contributions. However, much of the non-government funding comes from contributions by employers and employees to regulated non-profit sickness funds. Contributions are compulsory and defined according to law. A distinction is also made between municipal and national healthcare funding. For example, one model is that the bulk of the healthcare is funded by the municipality, specialty healthcare is provided and possibly funded by a larger entity, such as a municipal co-operation board or the state, and medications are paid for by a state agency. A paper by Sherry A. Glied from Columbia University found that universal health care systems are modestly redistributive and that the progressivity of health care financing has limited implications for overall income inequality.

Compulsory insurance

This is usually enforced via legislation requiring residents to purchase insurance, but sometimes the government provides the insurance. Sometimes, there may be a choice of multiple public and private funds providing a standard service (as in Germany) or sometimes just a single public fund (as in Canada). Healthcare in Switzerland and the US Patient Protection and Affordable Care Act are based on compulsory insurance.

In some European countries in which private insurance and universal health care coexist, such as Germany, Belgium, and the Netherlands, the problem of adverse selection is overcome by using a risk compensation pool to equalize, as far as possible, the risks between funds. Thus, a fund with a predominantly healthy, younger population pays into a compensation pool, and a fund with an older and predominantly less healthy population receives funds from the pool. In this way, sickness funds compete on price, and there is no advantage in excluding people with higher risks, because funds are compensated by means of risk-adjusted capitation payments. Funds are not allowed to pick and choose their policyholders or deny coverage, but they compete mainly on price and service. In some countries, the basic coverage level is set by the government and cannot be modified.
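
The arithmetic behind such a pool can be illustrated with a toy model. The following sketch (in Python, using entirely hypothetical fund sizes, average risk scores, and a flat capitation amount; none of these figures describe any real scheme) shows how a fund with a healthier-than-average membership pays into the pool while a fund with a sicker-than-average membership draws from it:

# Toy model of a risk compensation pool (all numbers are hypothetical).
# Each fund reports its membership size and an average risk score,
# where 1.0 means average expected cost per member.

BASE_CAPITATION = 2500  # hypothetical average annual cost per member

funds = {
    "FundA": {"members": 100_000, "avg_risk": 0.85},  # younger, healthier pool
    "FundB": {"members": 60_000, "avg_risk": 1.25},   # older, less healthy pool
}

def pool_transfer(fund):
    """Positive result: the fund receives money; negative: it pays in."""
    expected_cost = fund["members"] * fund["avg_risk"] * BASE_CAPITATION
    flat_income = fund["members"] * BASE_CAPITATION  # premiums at the average rate
    return expected_cost - flat_income

for name, fund in funds.items():
    transfer = pool_transfer(fund)
    direction = "receives" if transfer > 0 else "pays"
    print(f"{name} {direction} {abs(transfer):,.0f} via the compensation pool")

Because each fund's net position depends only on its members' risk profile, enrolling predominantly low-risk members brings no structural advantage, which is the intended effect of risk-adjusted capitation.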

The Republic of Ireland at one time had a "community rating" system through VHI, effectively a single-payer or common risk pool. The government later opened VHI to competition, but without a compensation pool. That resulted in foreign insurance companies entering the Irish market and offering cheap health insurance to relatively healthy segments of the market, which then made higher profits at VHI's expense. The government later reintroduced community rating through a pooling arrangement, and at least one major insurance company, BUPA, withdrew from the Irish market.

Among the potential solutions posited by economists are single-payer systems as well as other methods of ensuring that health insurance is universal, such as requiring all citizens to purchase insurance or limiting the ability of insurance companies to deny insurance to individuals or vary price between individuals.

Single payer

Single-payer health care is a system in which the government, rather than private insurers, pays for all health care costs. Single-payer systems may contract for healthcare services from private organizations (as is the case in Canada) or own and employ healthcare resources and personnel (as was the case in England before the introduction of the Health and Social Care Act). "Single-payer" thus describes only the funding mechanism and refers to health care financed by a single public body from a single fund and does not specify the type of delivery or for whom doctors work. Although the fund holder is usually the state, some forms of single-payer use a mixed public-private system.

Tax-based financing

In tax-based financing, individuals contribute to the provision of health services through various taxes. These are typically pooled across the whole population, unless local governments raise and retain tax revenues. Some countries (notably the United Kingdom, Canada, Ireland, New Zealand, Italy, Spain, Portugal, and the Nordic countries) choose to fund health care directly from taxation alone. Other countries with insurance-based systems effectively meet the cost of insuring those unable to insure themselves via social security arrangements funded from taxation, either by directly paying their medical bills or by paying for insurance premiums for those affected.

Social health insurance

In a social health insurance system, contributions from workers, the self-employed, enterprises, and governments are pooled into a single fund or multiple funds on a compulsory basis. It is based on risk pooling. The social health insurance model is also referred to as the 'Bismarck Model', after German Chancellor Otto von Bismarck, who introduced the first universal health care system in Germany in the 19th century. The funds typically contract with a mix of public and private providers for the provision of a specified benefit package. Preventive and public health care may be provided by these funds, or responsibility may be kept solely by the Ministry of Health. Within social health insurance, a number of functions may be executed by parastatal or non-governmental sickness funds or, in a few cases, by private health insurance companies. Social health insurance is used in a number of Western European countries and increasingly in Eastern Europe, as well as in Israel and Japan.

Private insurance

In private health insurance, premiums are paid directly by employers, associations, individuals, and families to insurance companies, which pool risks across their membership base. Private insurance includes policies sold by commercial for-profit firms, non-profit companies, and community health insurers. Generally, private insurance is voluntary, in contrast to social insurance programs, which tend to be compulsory.

In some countries with universal coverage, private insurance often excludes many health conditions that are expensive to treat and that the state health care system can provide. For example, in the United Kingdom, one of the largest private health care providers is BUPA, which has a long list of general exclusions even in its highest coverage policy, most of which are routinely provided by the National Health Service. In the United States, dialysis treatment for end-stage renal failure is generally paid for by the government, not by the insurance industry. Those with privatized Medicare (Medicare Advantage) are the exception and must have their dialysis paid through their insurance company; however, those with end-stage renal failure generally cannot buy Medicare Advantage plans. In the Netherlands, which has regulated competition for its main insurance system (but subject to a budget cap), insurers must cover a basic package for all enrollees, but may choose which additional services they cover in supplementary plans, which most people hold.

The Planning Commission of India has also suggested that the country should embrace insurance to achieve universal health coverage. General tax revenue is currently used to meet the essential health requirements of all people.

Community-based health insurance

Community-based health insurance is a particular form of private health insurance that has often emerged where financial risk protection mechanisms have only a limited impact. Individual members of a specific community pay into a collective health fund, which they can draw from when they need medical care. Contributions are not risk-related, and there is generally a high level of community involvement in the running of these plans.

Implementation and comparisons

Health spending per capita, in US$ purchasing power parity-adjusted, among various OECD countries
 
Universal health care systems vary according to the degree of government involvement in providing care and/or health insurance. In some countries, such as the UK, Spain, Italy, Australia, and the Nordic countries, the government has a high degree of involvement in the commissioning or delivery of health care services, and access is based on residence rights, not on the purchase of insurance. Others have a much more pluralistic delivery system, based on obligatory health insurance with contributory insurance rates related to salaries or income and usually funded by employers and beneficiaries jointly.

Sometimes, the health funds are derived from a mixture of insurance premiums, salary-related mandatory contributions by employees and/or employers to regulated sickness funds, and government taxes. These insurance-based systems tend to reimburse private or public medical providers, often at heavily regulated rates, through mutual or publicly owned medical insurers. A few countries, such as the Netherlands and Switzerland, operate via privately owned but heavily regulated private insurers, which are not allowed to make a profit from the mandatory element of insurance but can profit by selling supplemental insurance.

Universal health care is a broad concept that has been implemented in several ways. The common denominator for all such programs is some form of government action aimed at extending access to health care as widely as possible and setting minimum standards. Most implement universal health care through legislation, regulation, and taxation. Legislation and regulation direct what care must be provided, to whom, and on what basis. Usually, some costs are borne by the patient at the time of consumption, but the bulk of costs come from a combination of compulsory insurance and tax revenues. Some programs are paid for entirely out of tax revenues. In others, tax revenues are used either to fund insurance for the very poor or for those needing long-term chronic care.

The United Kingdom National Audit Office in 2003 published an international comparison of ten different health care systems in ten developed countries, nine universal systems against one non-universal system (the United States), and their relative costs and key health outcomes. A wider international comparison of 16 countries, each with universal health care, was published by the World Health Organization in 2004. In some cases, government involvement also includes directly managing the health care system, but many countries use mixed public-private systems to deliver universal health care.

Identity politics

From Wikipedia, the free encyclopedia

The term identity politics refers to political positions based on the interests and perspectives of social groups with which people identify. Identity politics includes the ways in which people's politics are shaped by aspects of their identity through loosely correlated social organizations. Examples include social organizations based on age, religion, social class or caste, culture, dialect, disability, education, ethnicity, language, nationality, sex, gender identity, generation, occupation, profession, race, political party affiliation, sexual orientation, settlement, urban and rural habitation, and veteran status.

The term identity politics has been in use in various forms since the 1960s or 1970s, but has been applied with, at times, radically different meanings by different populations.

History

The term identity politics has been used in political discourse since at least the 1970s. One aim of identity politics has been for those feeling oppressed by and actively suffering under systemic social inequities to articulate their suffering and felt oppression in terms of their own experience through processes of consciousness-raising and collective action. One of the older written examples of it can be found in the April 1977 statement of the black feminist group the Combahee River Collective, which was subsequently reprinted in a number of anthologies, and Barbara Smith and the Combahee River Collective have been credited with coining the term. In the statement, they said:
[A]s children we realized that we were different from boys and that we were treated different—for example, when we were told in the same breath to be quiet both for the sake of being 'ladylike' and to make us less objectionable in the eyes of white people. In the process of consciousness-raising, actually life-sharing, we began to recognize the commonality of our experiences and, from the sharing and growing consciousness, to build a politics that will change our lives and inevitably end our oppression....We realize that the only people who care enough about us to work consistently for our liberation are us. Our politics evolve from a healthy love for ourselves, our sisters and our community which allows us to continue our struggle and work. This focusing upon our own oppression is embodied in the concept of identity politics. We believe that the most profound and potentially most radical politics come directly out of our own identity, as opposed to working to end somebody else's oppression.
— Zillah R. Eisenstein (1978), The Combahee River Collective Statement
Identity politics, as a mode of categorizing, is closely connected to the ascription that some social groups are oppressed (such as women, ethnic minorities, and sexual minorities); that is, the claim that individuals belonging to those groups are, by virtue of their identity, more vulnerable to forms of oppression such as cultural imperialism, violence, exploitation of labor, marginalization, or powerlessness. Therefore, these lines of social difference can be seen as ways to gain empowerment or avenues through which to work towards a more equal society.

Some groups have combined identity politics with Marxist social class analysis and class consciousness (the most notable example being the Black Panther Party), but this is not necessarily characteristic of the form. Another example is MOVE, members of which mixed black nationalism with anarcho-primitivism (a radical form of green politics based on the idea that civilization is an instrument of oppression, advocating a return to a hunter-gatherer society). Identity politics can be left-wing or right-wing, with examples of the latter being Ulster loyalism, Islamism, and Christian Identity movements.

During the 1980s, the politics of identity became very prominent and it was linked to a new wave of social movement activism.

The mid-2010s have seen a marked rise of identity politics, including white identity politics in the United States. This phenomenon has been attributed to increased demographic diversity and the prospect of whites becoming a minority in America. Such shifts have driven many to affiliate with conservative causes, including those not related to diversity. This includes the presidential election of Donald Trump, who won the support of prominent white supremacists such as David Duke and Richard B. Spencer (support that Trump disavowed).

Debates and criticism

Nature of the movement

The term identity politics has been applied and misapplied retroactively to varying movements that long predate its coinage. Historian Arthur Schlesinger Jr. discussed identity politics extensively in his 1991 book The Disuniting of America. Schlesinger, a strong supporter of liberal conceptions of civil rights, argues that a liberal democracy requires a common basis for culture and society to function. Rather than seeing civil society as already fractured along lines of power and powerlessness (according to race, ethnicity, sexuality, etc.), Schlesinger suggests that basing politics on group marginalization is itself what fractures the civil polity, and that identity politics therefore works against creating real opportunities for ending marginalization. Schlesinger believes that "movements for civil rights should aim toward full acceptance and integration of marginalized groups into the mainstream culture, rather than...perpetuating that marginalization through affirmations of difference".

Brendan O'Neill has similarly suggested that identity politics causes (rather than simply recognizes and acts on) political schisms along lines of social identity. Thus, he contrasts the politics of gay liberation and identity politics by saying "... [Peter] Tatchell also had, back in the day, ... a commitment to the politics of liberation, which encouraged gays to come out and live and engage. Now, we have the politics of identity, which invites people to stay in, to look inward, to obsess over the body and the self, to surround themselves with a moral forcefield to protect their worldview—which has nothing to do with the world—from any questioning." In these and other ways, a political perspective oriented to one's own well being can be recast as causing the divisions that it insists upon making visible. 

In this same vein, author Owen Jones argues that identity politics often marginalize the working class, saying that:
In the 1950s and 1960s, left-wing intellectuals who were both inspired and informed by a powerful labor movement wrote hundreds of books and articles on working-class issues. Such work would help shape the views of politicians at the very top of the Labor Party. Today, progressive intellectuals are far more interested in issues of identity. ... Of course, the struggles for the emancipation of women, gays, and ethnic minorities are exceptionally important causes. New Labour has co-opted them, passing genuinely progressive legislation on gay equality and women's rights, for example. But it is an agenda that has happily co-existed with the sidelining of the working class in politics, allowing New Labour to protect its radical flank while pressing ahead with Thatcherite policies.

LGBT issues

The gay liberation movement of the late 1960s through the mid-1980s urged lesbians and gay men to engage in radical direct action, and to counter societal shame with gay pride. In the feminist spirit of the personal being political, the most basic form of activism was an emphasis on coming out to family, friends and colleagues, and living life as an openly lesbian or gay person. While the 1970s were the peak of "gay liberation" in New York City and other urban areas in the United States, "gay liberation" was the term still used instead of "gay pride" in more oppressive areas into the mid-1980s, with some organizations opting for the more inclusive "lesbian and gay liberation". While women and transgender activists had lobbied for more inclusive names from the beginning of the movement, the initialism LGBT, or "queer" as a counterculture shorthand for LGBT, did not gain much acceptance as an umbrella term until much later in the 1980s, and in some areas not until the '90s or even '00s. During this period in the United States, identity politics in these communities was largely understood in the terms espoused by writers such as Audre Lorde, the self-identified "black, dyke, feminist, poet, mother", whose view was that lived experience matters, defines us, and is the only thing that grants the authority to speak; that, "If I didn't define myself for myself, I would be crunched into other people's fantasies for me and eaten alive."

By the 2000s, in some areas of postmodern queer studies, notably those around gender, the idea of "identity politics" began to shift away from that of naming and claiming lived experience, and authority arising from lived experience, to one emphasizing choice and performance. Some who draw on the work of authors like Judith Butler stress the importance of not assuming an already existing identity, but of remaking and unmaking identities through "performance". Writers in the field of queer theory have at times taken this to the point of arguing that "queer", despite generations of specific use, no longer needs to refer to any specific sexual orientation at all, but is now only about disrupting the mainstream, with author David M. Halperin arguing that straight people may now also self-identify as "queer", which, some believe, is a form of cultural appropriation that robs gays and lesbians of their identity and makes invisible and irrelevant the actual, lived experience that causes them to be marginalized in the first place: "It desexualizes identity, when the issue is precisely about a sexual identity."

There are also supporters of identity politics who have developed their stances on the basis of Gayatri Chakravorty Spivak's work (namely, "Can the Subaltern Speak?") and have described some forms of identity politics as strategic essentialism, a form which has sought to work with hegemonic discourses to reform the understanding of "universal" goals.

Critiques of identity politics

Some critics argue that groups based on a particular shared identity (e.g. race or gender identity) can divert energy and attention from more fundamental issues, similar to the history of divide-and-rule strategies. Sociologist Charles Derber asserts that the American left is "largely an identity-politics party" and that it "offers no broad critique of the political economy of capitalism. It focuses on reforms for blacks and women and so forth. But it doesn't offer a contextual analysis within capitalism." Both he and David North of the Socialist Equality Party posit that these fragmented and isolated identity movements that permeate the left have allowed for a far-right resurgence. Critiques of identity politics have also been expressed on other grounds by writers such as Eric Hobsbawm, Todd Gitlin, Michael Tomasky, Richard Rorty, Sean Wilentz, Robert W. McChesney, and Jim Sleeper. Hobsbawm, in particular, has criticized nationalisms, and the principle of national self-determination adopted internationally after World War I, since national governments are often merely an expression of a ruling class or power, and their proliferation was a source of the wars of the 20th century. Hence, Hobsbawm argues that identity politics, such as queer nationalism, Islamism, Cornish nationalism or Ulster loyalism, are just other versions of bourgeois nationalism. The view that identity politics rooted in challenging racism, sexism, and the like actually obscures class inequality is widespread in the United States and many other Western nations; however, this framing ignores how class-based politics are identity politics themselves.

Intersectional critiques

In her journal article Mapping the Margins: Intersectionality, Identity Politics and Violence against Women of Color, Kimberlé Crenshaw treats identity politics as a process that brings people together based on a shared aspect of their identity. Crenshaw applauds identity politics for bringing African Americans (and other non-white people), gays and lesbians, and other oppressed groups together in community and progress. However, Crenshaw also points out that frequently groups come together based on a shared political identity but then fail to examine differences among themselves within their own group: "The problem with identity politics is not that it fails to transcend differences, as some critics charge, but rather the opposite—that it frequently conflates or ignores intragroup differences." Crenshaw argues that when society thinks "black", it thinks black male, and when society thinks feminism, it thinks white woman. When considering black women, at least two aspects of their identity are the subject of oppression: their race and their sex. Crenshaw proposes instead that identity politics are useful but that we must be aware of intersectionality and the role it plays in identity politics. Nira Yuval-Davis supports Crenshaw's critiques in Intersectionality and Feminist Politics and explains that "Identities are individual and collective narratives that answer the question 'who am/are I/we?'".

In the same article, Crenshaw uses the Clarence Thomas/Anita Hill controversy to expand on this point. Anita Hill came forward and accused Supreme Court nominee Clarence Thomas of sexual harassment; Thomas would become the second African American to serve on the Supreme Court. Crenshaw argues that when Anita Hill came forward, she was deemed anti-black in the movement against racism, and though she came forward on the feminist issue of sexual harassment, she was excluded because, when considering feminism, it is the narrative of white middle-class women that prevails. Crenshaw concludes that acknowledging intersecting categories when groups unite on the basis of identity politics is better than ignoring categories altogether.

Examples

A Le Monde/IFOP poll in January 2011 conducted in France and Germany found that a majority felt Muslims are "scattered improperly"; an analyst for IFOP said the results indicated something "beyond linking immigration with security or immigration with unemployment, to linking Islam with a threat to identity".

Group polarization

From Wikipedia, the free encyclopedia

In social psychology, group polarization refers to the tendency for a group to make decisions that are more extreme than the initial inclination of its members. These more extreme decisions are towards greater risk if individuals' initial tendencies are to be risky and towards greater caution if individuals' initial tendencies are to be cautious. The phenomenon also holds that a group's attitude toward a situation may change in the sense that the individuals' initial attitudes have strengthened and intensified after group discussion, a phenomenon known as attitude polarization.

Overview

Group polarization is an important phenomenon in social psychology and is observable in many social contexts. For example, a group of women who hold moderately feminist views tends to demonstrate heightened pro-feminist beliefs following group discussion. Similarly, studies have shown that after deliberating together, mock jury members often decided on punitive damage awards that were either larger or smaller than the amount any individual juror had favored prior to deliberation. The studies indicated that when the jurors favored a relatively low award, discussion would lead to an even more lenient result, while if the jury was inclined to impose a stiff penalty, discussion would make it even harsher. Moreover, in recent years, the Internet and online social media have also presented opportunities to observe group polarization and compile new research. Psychologists have found that social media outlets such as Facebook and Twitter demonstrate that group polarization can occur even when a group is not physically together. As long as the group of individuals begins with the same fundamental opinion on the topic and a consistent dialogue is kept going, group polarization can occur.

Research has suggested that well-established groups suffer less from polarization, as do groups discussing problems that are well known to them. However, in situations where groups are somewhat newly formed and tasks are new, group polarization can demonstrate a more profound influence on the decision-making.

Attitude polarization

Attitude polarization, also known as belief polarization and polarization effect, is a phenomenon in which a disagreement becomes more extreme as the different parties consider evidence on the issue. It is one of the effects of confirmation bias: the tendency of people to search for and interpret evidence selectively, to reinforce their current beliefs or attitudes. When people encounter ambiguous evidence, this bias can potentially result in each of them interpreting it as in support of their existing attitudes, widening rather than narrowing the disagreement between them.

The effect is observed with issues that activate emotions, such as political "hot button" issues. For most issues, new evidence does not produce a polarization effect. For those issues where polarization is found, mere thinking about the issue, without contemplating new evidence, produces the effect. Social comparison processes have also been invoked as an explanation for the effect, which is increased by settings in which people repeat and validate each other's statements. This apparent tendency is of interest not only to psychologists, but also to sociologists and philosophers.

Empirical findings

Since the late 1960s, psychologists have carried out a number of studies on various aspects of attitude polarization. 

In 1979, Charles Lord, Lee Ross and Mark Lepper performed a study in which they selected two groups of people, one group strongly in favor of capital punishment, the other strongly opposed. The researchers initially measured the strength with which people held their position. Later, both the pro- and anti-capital punishment people were put into small groups and shown one of two cards, each with a statement about the results of a research project written on it. For example:
Kroner and Phillips (1977) compared murder rates for the year before and the year after adoption of capital punishment in 14 states. In 11 of the 14 states, murder rates were lower after adoption of the death penalty. This research supports the deterrent effect of the death penalty.
or:
Palmer and Crandall (1977) compared murder rates in 10 pairs of neighboring states with different capital punishment laws. In 8 of the 10 pairs, murder rates were higher in the state with capital punishment. This research opposes the deterrent effect of the death penalty.
The researchers again asked people about the strength of their beliefs about the deterrence effect of the death penalty, and, this time, also asked them about the effect that the research had had on their attitudes. 

In the next stage of the research, the participants were given more information about the study described on the card they received, including details of the research, critiques of the research, and the researchers' responses to those critiques. The participants' degree of commitment to their original positions was re-measured, and the participants were asked about the quality of the research and the effect the research had on their beliefs. Finally, the trial was rerun on all participants using a card that supported the position opposite to the one they had initially seen.

The researchers found that people tended to believe that research that supported their original views had been better conducted and was more convincing than research that didn't. Whichever position they held initially, people tended to hold that position more strongly after reading research that supported it. Lord et al. point out that it is reasonable for people to be less critical of research that supports their current position, but it seems less rational for people to significantly increase the strength of their attitudes when they read supporting evidence. When people had read both the research that supported their views and the research that did not, they tended to hold their original attitudes more strongly than before they received that information. These results should be understood in the context of several problems in the implementation of the study, including the fact that the researchers changed the scaling of the outcome variable, making it impossible to measure attitude change directly, and that polarization was measured with a subjective assessment of attitude change rather than a direct measure of how much change had occurred.

Choice shifts

Group polarization and choice shifts are similar in many ways; however, they differ in one distinct way. Group polarization refers to attitude change on the individual level due to the influence of the group, and choice shift refers to the outcome of that attitude change: namely, the difference between the group members' average pre-discussion attitudes and the outcome of the group decision.
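
A small numerical sketch (in Python, with made-up attitude scores on an arbitrary riskiness scale; the numbers are purely illustrative) makes the distinction concrete: polarization shows up as individual attitudes intensifying after discussion, while the choice shift is the gap between the pre-discussion average and the final group decision.

# Hypothetical attitude scores on a scale from -5 (very cautious) to +5 (very risky).
pre_discussion = [1.0, 2.0, 1.5, 2.5]   # individual attitudes before discussion
post_discussion = [2.5, 3.0, 2.5, 3.5]  # individual attitudes after discussion
group_decision = 3.0                    # position the group finally agrees on

mean_pre = sum(pre_discussion) / len(pre_discussion)
mean_post = sum(post_discussion) / len(post_discussion)

polarization = mean_post - mean_pre       # individual-level attitude intensification
choice_shift = group_decision - mean_pre  # group outcome vs. pre-discussion average

print(f"polarization (individual shift): {polarization:+.2f}")
print(f"choice shift (group outcome):    {choice_shift:+.2f}")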

Risky and cautious shifts are both part of a more generalized idea known as group-induced attitude polarization. Though group polarization deals mainly with risk-involving decisions and/or opinions, discussion-induced shifts have been shown to occur on several non-risk-involving levels. This suggests that a general phenomenon of choice shifts exists apart from only risk-related decisions. Stoner found that a decision is affected by the values behind the circumstances of the decision. The study found that situations that normally favor the riskier alternative increased risky shifts, and that situations that normally favor the cautious alternative increased cautious shifts. These findings also show the importance of previous group shifts. Choice shifts are mainly explained by largely differing human values and how highly these values are held by an individual. According to Moscovici et al., interaction within a group and differences of opinion are necessary for group polarization to take place. While an extremist in the group may sway opinion, the shift can only occur with sufficient and proper interaction within the group. In other words, the extremist will have no impact without interaction. Also, Moscovici et al. found individual preferences to be irrelevant; it is differences of opinion that cause the shift. This finding demonstrates how one opinion in the group will not sway the group; it is the combination of all the individual opinions that will make an impact.

History and origins

The study of group polarization can be traced back to an unpublished 1961 Master's thesis by MIT student James Stoner, who observed the so-called "risky shift". The concept of risky shift maintains that a group's decisions are riskier than the average of the individual decisions of members before the group met. 

In early studies, the risky-shift phenomenon was measured using a scale known as the Choice-Dilemmas Questionnaire. This measure required participants to consider a hypothetical scenario in which an individual is faced with a dilemma and must make a choice to resolve the issue at hand. Participants were then asked to estimate the probability that a certain choice would be of benefit or risk to the individual being discussed. Consider the following example: 

"Mr. A, an electrical engineer, who is married and has one child, has been working for a large electronics corporation since graduating from college five years ago. He is assured of a lifetime job with a modest, though adequate, salary and liberal pension benefits upon retirement. On the other hand, it is very unlikely that his salary will increase much before he retires. While attending a convention, Mr. A is offered a job with a small, newly founded company which has a highly uncertain future. The new job would pay more to start and would offer the possibility of a share in the owner- ship if the company survived the competition of the larger firms." Participants were then asked to imagine that they were advising Mr. A. They would then be provided with a series of probabilities that indicate whether the new company that offered him a position is financially stable. 

It would read as follows: "Please check the lowest probability that you would consider acceptable to make it worthwhile for Mr. A to take the new job."

____The chances are 1 in 10 that the company will prove financially sound.
____The chances are 3 in 10 that the company will prove financially sound.
____The chances are 5 in 10 that the company will prove financially sound.
____The chances are 7 in 10 that the company will prove financially sound.
____The chances are 9 in 10 that the company will prove financially sound.
____Place a check here if you think Mr. A should not take the new job no matter what the probabilities.

Individuals completed the questionnaire and made their decisions independently of others. Later, they would be asked to join a group to reassess their choices. Indicated by shifts in the mean value, initial studies using this method revealed that group decisions tended to be relatively riskier than those that were made by individuals. This tendency also occurred when individual judgments were collected after the group discussion and even when the individual post-discussion measures were delayed two to six weeks.
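
As a concrete illustration of the "shifts in the mean value" just mentioned, the following sketch (in Python, with invented Choice-Dilemmas responses on the 1-in-10 to 9-in-10 scale shown above) computes the change in the mean acceptable probability; because a lower threshold means a riskier recommendation, a drop in the mean indicates a risky shift:

# Hypothetical responses to the Mr. A item: the lowest chance of success
# (out of 10) each participant would accept before recommending the new job.
# A lower threshold means the participant is more willing to recommend the risk.

individual_pre = [5, 7, 3, 5, 7]   # answered alone, before group discussion
individual_post = [3, 5, 3, 3, 5]  # answered again after group discussion

mean_pre = sum(individual_pre) / len(individual_pre)
mean_post = sum(individual_post) / len(individual_post)
shift = mean_post - mean_pre

print(f"mean acceptable probability before: {mean_pre:.1f} in 10")
print(f"mean acceptable probability after:  {mean_post:.1f} in 10")
print(("risky shift" if shift < 0 else "cautious shift") + f" of {abs(shift):.1f} points")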

The discovery of the risky shift was considered surprising and counter-intuitive, especially since earlier work in the 1920s and 1930s by Allport and other researchers had suggested that individuals made more extreme decisions than did groups, leading to the expectation that groups would make decisions conforming to the average risk level of their members. The seemingly counter-intuitive findings of Stoner led to a spurt of research around the risky shift, which was originally thought to be a special-case exception to standard decision-making practice. Many people had concluded that people in a group setting would make decisions based on what they assumed to be the overall risk level of a group; because Stoner's work did not necessarily address this specific theme, and because it does seem to contrast with Stoner's initial definition of risky shift, additional controversy arose, leading researchers to further examine the topic. By the late 1960s, however, it had become clear that the risky shift was just one of many types of attitudes that became more extreme in groups, leading Moscovici and Zavalloni to term the overall phenomenon "group polarization".

Subsequently, a decade-long period of examination of the applicability of group polarization to a number of fields, in both lab and field settings, began. There is a substantial amount of empirical evidence demonstrating the phenomenon of group polarization. Group polarization has come to be widely regarded as a fundamental group decision-making process and is well established, but it long remained non-obvious and puzzling because its mechanisms were not fully understood.

Major theoretical approaches

Almost as soon as the phenomenon of group polarization was discovered, a number of theories were offered to help explain and account for it. These explanations were gradually narrowed down and grouped together until two primary mechanisms remained, social comparison and informational influence.

Social comparison theory

The social comparison theory, or normative influence theory, has been widely used to explain group polarization. According to the social comparison interpretation, group polarization occurs as a result of individuals' desire to gain acceptance and be perceived in a favorable way by their group. The theory holds that people first compare their own ideas with those held by the rest of the group; they observe and evaluate what the group values and prefers. In order to gain acceptance, people then take a position that is similar to everyone else's but slightly more extreme. In doing so, individuals support the group's beliefs while still presenting themselves as admirable group "leaders". The presence of a member with an extreme viewpoint or attitude does not further polarize the group. Studies regarding the theory have demonstrated that normative influence is more likely with judgmental issues, a group goal of harmony, person-oriented group members, and public responses.

Informational influence

Informational influence, or persuasive arguments theory, has also been used to explain group polarization, and is most recognized by psychologists today. The persuasive arguments interpretation holds that individuals become more convinced of their views when they hear novel arguments in support of their position. The theory posits that each group member enters the discussion aware of a set of items of information or arguments favoring both sides of the issue, but leans toward the side that boasts the greater amount of information. In other words, individuals base their choices on a weighing of the remembered pro and con arguments. Some of these items or arguments are shared among the members, while others are unshared, meaning that only one member has considered them before the discussion. Assuming most or all group members lean in the same direction, during discussion, items of unshared information supporting that direction are expressed, which provides members previously unaware of them more reason to lean in that direction. Group discussion shifts the weight of evidence as each group member expresses their arguments, shedding light onto a number of different positions and ideas. Research has indicated that informational influence is more likely with intellective issues, a group goal of making a correct decision, task-oriented group members, and private responses. Furthermore, research suggests that it is not simply the sharing of information that predicts group polarization. Rather, the amount of information and the persuasiveness of the arguments mediate the level of polarization experienced.
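
A minimal toy simulation of this pooling of shared and unshared arguments (in Python; the argument counts are invented and persuasiveness weighting is deliberately ignored) shows why discussion pushes an already-leaning group further in the same direction: once each member's private pro arguments are voiced, everyone's remembered balance of evidence tilts further toward the favored side.

# Toy model of persuasive arguments theory (hypothetical argument counts).
shared_pro, shared_con = 4, 3        # arguments every member already knows
unshared_pro_per_member = [2, 1, 2]  # unique pro arguments each member holds

def lean(pro, con):
    """Net balance of remembered evidence; positive means leaning pro."""
    return pro - con

# Before discussion: each member weighs the shared arguments plus only
# their own unshared ones.
before = [lean(shared_pro + u, shared_con) for u in unshared_pro_per_member]

# After discussion: the unshared arguments have been voiced, so every member
# now weighs the full pooled set.
pooled_pro = shared_pro + sum(unshared_pro_per_member)
after = [lean(pooled_pro, shared_con)] * len(unshared_pro_per_member)

print("leanings before discussion:", before)  # [3, 2, 3]
print("leanings after discussion: ", after)   # [6, 6, 6]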

In the 1970s, significant arguments occurred over whether persuasive argumentation alone accounted for group polarization. Daniel Isenberg's 1986 meta-analysis of the data gathered by both the persuasive argument and social comparison camps succeeded, in large part, in answering the questions about predominant mechanisms. Isenberg concluded that there was substantial evidence that both effects were operating simultaneously, and that persuasive arguments theory operated when social comparison did not, and vice versa.

Self-categorization and social identity

While these two theories are the most widely accepted as explanations for group polarization, alternative theories have been proposed. The most popular of these theories is self-categorization theory. Self-categorization theory stems from social identity theory, which holds that conformity stems from psychological processes; that is, being a member of a group is defined as the subjective perception of the self as a member of a specific category. Accordingly, proponents of the self-categorization model hold that group polarization occurs because individuals identify with a particular group and conform to a prototypical group position that is more extreme than the group mean. In contrast to social comparison theory and persuasive argumentation theory, the self-categorization model maintains that inter-group categorization processes are the cause of group polarization.

Support for the self-categorization theory, which explains group polarization as conformity to a polarized norm, was found by Hogg, Turner, and Davidson in 1990. In their experiment, participants gave pre-test, post-test, and group consensus recommendations on three choice dilemma item-types (risky, neutral, or cautious). The researchers hypothesized that an ingroup confronted by a risky outgroup will polarize toward caution, an ingroup confronted by a cautious outgroup will polarize toward risk, and an ingroup in the middle of the social frame of reference, confronted by both risky and cautious outgroups, will not polarize but will converge on its pre-test mean. The results of the study supported their hypothesis in that participants converged on a norm polarized toward risk on risky items and toward caution on cautious items. Another similar study found that in-group prototypes become more polarized as the group becomes more extreme in the social context. This further lends support to the self-categorization explanation of group polarization.

Real-life applications

The Internet

The rising popularity and increased number of online social media platforms, such as Facebook and Twitter, have enabled people to seek out and share ideas with others who have similar interests and values, making group polarization effects increasingly evident, particularly among Generation Y individuals. Owing to this technology, it is possible for individuals to curate their sources of information and the opinions to which they are exposed, thereby reinforcing and strengthening their own views while effectively avoiding information and perspectives with which they disagree.

One study analyzed over 30,000 tweets on Twitter regarding the shooting of George Tiller, a late-term abortion doctor; the tweets analyzed were conversations among pro-life and pro-choice advocates after the shooting. The study found that like-minded individuals strengthened group identity, whereas replies between different-minded individuals reinforced a split in affiliation.

In a study conducted by Sia et al. in 2002, group polarization was found to occur with online (computer-mediated) discussions. In particular, this study found that group discussions, conducted when discussants are in a distributed (cannot see one another) or anonymous (cannot identify one another) environment, can lead to even higher levels of group polarization compared to traditional meetings. This is attributed to the greater numbers of novel arguments generated (due to persuasive arguments theory) and higher incidence of one-upmanship behaviors (due to social comparison).

However, some research suggests that important differences arise in measuring group polarization in laboratory versus field experiments. A study conducted by Taylor & MacDonald in 2002 featured a realistic setting of a computer-mediated discussion, but group polarization did not occur at the level expected. The study's results also showed that groupthink occurs less in computer-mediated discussions than when people are face to face. Moreover, computer-mediated discussions often fail to result in a group consensus, or lead to less satisfaction with the consensus that was reached, compared to groups operating in a natural environment. Furthermore, the experiment took place over a two-week period, leading the researchers to suggest that group polarization may occur only in the short term. Overall, the results suggest not only that group polarization may not be as prevalent as previous studies suggest, but also that group theories in general may be less predictable in computer-mediated discussions.

Politics and law

Group polarization has been widely discussed in terms of political behavior. Researchers have identified an increase in affective polarization among the United States electorate, and report that hostility and discrimination towards the opposing political party has increased dramatically over time.

Group polarization is similarly influential in legal contexts. A study that assessed whether Federal district court judges behaved differently when they sat alone or in small groups demonstrated that judges who sat alone took extreme action 35% of the time, whereas judges who sat in a group of three took extreme action 65% of the time. These results are noteworthy because they indicate that even trained, professional decision-makers are subject to the influences of group polarization.

War and violent behavior

Group polarization has been reported to occur during wartime and other times of conflict and helps to account partially for violent behavior and conflict. Researchers have suggested, for instance, that ethnic conflict exacerbates group polarization by enhancing identification with the ingroup and hostility towards the outgroup. While polarization can occur in any type of conflict, it has its most damaging effects in large-scale inter-group, public policy, and international conflicts.

College life

On a smaller scale, group polarization can also be seen in the everyday lives of students in higher education. A study by Myers in 2005 reported that initial differences among American college students become more accentuated over time. For example, students who do not belong to fraternities and sororities tend to be more liberal politically, and this difference increases over the course of their college careers. Researchers theorize that this is at least partially explained by group polarization, as group members tend to reinforce one another's proclivities and opinions.

Operator (computer programming)

From Wikipedia, the free encyclopedia