Sunday, March 12, 2023

American Indian boarding schools

From Wikipedia, the free encyclopedia
Pupils at Carlisle Indian Industrial School, Pennsylvania, c. 1900

American Indian boarding schools, also known more recently as American Indian residential schools, were established in the United States from the mid-17th to the early 20th centuries with a primary objective of "civilizing" or assimilating Native American children and youth into European American culture. In the process, these schools denigrated Native American culture and made children give up their languages and religion. At the same time the schools provided a basic Western education. These boarding schools were first established by Christian missionaries of various denominations. The missionaries were often approved by the federal government to start both missions and schools on reservations, especially in the lightly populated areas of the West. In the late 19th and early 20th centuries especially, the government paid religious orders to provide basic education to Native American children on reservations, and later established its own schools on reservations. The Bureau of Indian Affairs (BIA) also founded additional off-reservation boarding schools based on the assimilation model. These sometimes drew children from a variety of tribes. In addition, religious orders established off-reservation schools.

Children were typically immersed in European American culture. Schools forced removal of indigenous cultural signifiers: cutting the children's hair, having them wear American-style uniforms, forbidding them from speaking their mother tongues, and replacing their tribal names with English language names (saints' names under some religious orders) for use at the schools, as part of assimilation and to Christianize them. The schools were usually harsh, especially for younger children who had been forcibly separated from their families and forced to abandon their Native American identities and cultures. Children sometimes died in the school system due to infectious disease. Investigations of the later 20th century revealed cases of physical, emotional, and sexual abuse occurring mostly in church-run schools.

Summarizing recent scholarship from Native perspectives, Dr. Julie Davis said:

Boarding schools embodied both victimization and agency for Native people and they served as sites of both cultural loss and cultural persistence. These institutions, intended to assimilate Native people into mainstream society and eradicate Native cultures, became integral components of American Indian identities and eventually fueled the drive for political and cultural self-determination in the late 20th century.

Since those years, tribal nations have carried out political activism and gained legislation and federal policy that gives them the power to decide how to use federal education funds, how they educate their children, and the authority to establish their own community-based schools. Tribes have also founded numerous tribal colleges and universities on reservations. Tribal control over their schools has been supported by federal legislation and changing practices by the BIA. By 2007, most of the boarding schools had been closed down, and the number of Native American children in boarding schools had declined to 9,500.

Hundreds of Indigenous children who died at the schools have yet to be located, and investigations are increasing across the United States.

History of education of Native Americans by Europeans

... instead of exterminating a part of the human race ... we had persevered ... and at last had imparted our Knowledge of cultivating and the arts, to the Aboriginals of the Country ... But it has been conceived to be impracticable to civilize the Indians of North America – This opinion is probably more convenient than just.

— Henry Knox to George Washington, 1789.

In the late eighteenth century, reformers starting with President George Washington and Henry Knox, in efforts to "civilize" or otherwise assimilate Native Americans, adopted the practice of educating Native American children in contemporary American culture. At the time, American society was dominated by agriculture, with many yeoman subsistence farmers, and rural communities made up of small towns and few large cities. The Civilization Fund Act of 1819 promoted this policy by providing funding to societies (mostly religious missionaries) that worked on Native American education, often at schools established in or near Native American communities. The reformers believed this policy would help the Indians survive increasing contact with European-American settlers who were moving west into their territories.

Moses Tom sent his children to an Indian boarding school.

I rejoice, brothers, to hear you propose to become cultivators of the earth for the maintenance of your families. Be assured you will support them better and with less labor, by raising stock and bread, and by spinning and weaving clothes, than by hunting. A little land cultivated, and a little labor, will procure more provisions than the most successful hunt; and a woman will clothe more by spinning and weaving, than a man by hunting. Compared with you, we are but as of yesterday in this land. Yet see how much more we have multiplied by industry, and the exercise of that reason which you possess in common with us. Follow then our example, brethren, and we will aid you with great pleasure ...

— President Thomas Jefferson, Brothers of the Choctaw Nation, December 17, 1803

Early mission schools

In 1634, Fr. Andrew White of the English Province of the Society of Jesus established a mission in what is now Southern Maryland. He said the purpose of the mission, as an interpreter told the chief of a Native American tribe there, was "to extend civilization and instruction to his ignorant race, and show them the way to heaven." The mission's annual records report that by 1640, they had founded a community they named St. Mary's. Native Americans were sending their children there to be educated, including the daughter of Tayac, the Pascatoe chief. She was likely an exception because of her father's status, as girls were generally not educated with boys in English Catholic schools of the period. Other students discussed in the records were male.

The same records report that in 1677,

"a school for humanities was opened by our Society in the centre of Maryland, directed by two of the Fathers; and the native youth, applying themselves assiduously to study, made good progress. Maryland and the recently established school sent two boys to St. Omer who yielded in abilities to few Europeans, when competing for the honour of being first in their class. So that not gold, nor silver, nor the other products of the earth alone, but men also are gathered from thence to bring those regions, which foreigners have unjustly called ferocious, to a higher state of virtue and cultivation."

Young woman and young man standing at a church altar with a priest

In the mid-1600s, Harvard College had an "Indian College" on its campus in Cambridge, Massachusetts Bay Colony, supported by the Anglican Society for Propagation of the Gospel. Its few Native American students came from New England. In this period higher education was very limited for all classes, and most 'colleges' taught at a level more similar to today's high schools. In 1665, Caleb Cheeshahteaumuck, "from the Wampanoag...did graduate from Harvard, the first Indian to do so in the colonial period".

In the early colonial years, other Indian schools were created by local New England communities, as with the Indian school in Hanover, New Hampshire, in 1769. This gradually developed as Dartmouth College, which has retained some programs for Native Americans. Other schools were also created in the East, such as in Bethlehem, Pennsylvania by Moravian missionaries. Religious missionaries from various denominations developed the first schools as part of their missions near indigenous settlements, believing they could extend education and Christianity to Native Americans. East of the Appalachian Mountains, most Indians had been forced off their traditional lands before the American Revolutionary War. They had few reservations.

In the early nineteenth century, the new republic continued to deal with questions about how Native American peoples would live. The Foreign Mission School, a Protestant-backed institution that opened in Cornwall, Connecticut in 1816, was set up for male students from a variety of non-Christian peoples, mostly abroad. Native Hawaiians, Muslim and Hindu students from India and Southeast Asia were among the nearly 100 total who attended during its decade of operation. Also enrolled were Native American students from the Cherokee and Choctaw tribes (among the Five Civilized Tribes of the American Southeast), as well as Lenape (a mid-Atlantic tribe) and Osage students. It was intended to train young people as missionaries, interpreters, translators, etc. who could help guide their peoples.

Nationhood, Indian Wars, and western settlement

Through the 19th century, the encroachment of European Americans on Indian lands continued. From the 1830s, tribes from both the Southeast and the Great Lakes areas were pushed west of the Mississippi, forced off their lands to Indian Territory. As part of the treaties signed for land cessions, the United States was supposed to provide education to the tribes on their reservations. Some religious orders and organizations established missions in Kansas and what later became Oklahoma to work on these new reservations. Some of the Southeast tribes established their own schools, as the Choctaw did for both girls and boys.

After the Civil War and decades of Indian Wars in the West, more tribes were forced onto reservations after ceding vast amounts of land to the US. With the goal of assimilation, believed necessary so that tribal Indians could survive to become part of American society, the government increased its efforts to provide education opportunities. Some of this was related to the progressive movement, which believed the only way for the tribal peoples to make their way was to become assimilated, as American society was rapidly changing and urbanizing.

Following the Indian Wars, missionaries founded additional schools in the West with boarding facilities. Given the vast areas and isolated populations, they could support only a limited number of schools. Some children necessarily had to attend schools that were distant from their communities. Initially under President Ulysses S. Grant, only one religious organization or order was permitted on any single reservation. The various denominations lobbied the government to be permitted to set up missions, even in competition with each other.

Assimilation-era day schools

Day schools were also created to implement federal mandates. Compared to boarding schools, day schools were a less expensive option that usually received less parental pushback.

One example is the Fallon Indian Day School opened on the Stillwater Indian Reservation in 1908. Even after the process of closing boarding schools started, day schools remained open.

Carlisle Indian Industrial School

Chiricahua Apaches Four Months After Arriving at Carlisle. Undated photograph taken at Carlisle Indian Industrial School.
 

After the Indian Wars, Lieutenant Richard Henry Pratt was assigned to supervise Native prisoners of war at Fort Marion in St. Augustine, Florida. The United States Army sent seventy-two warriors from the Cheyenne, Kiowa, Comanche, and Caddo nations into exile there. They were used as hostages to encourage their peoples in the West to remain peaceful.

Teacher Mary R. Hyde and students at Carlisle Indian Training School

Pratt began to work with them on education in European-American culture, essentially a kind of immersion. While he required changes (the men had to cut their hair and wear common uniforms rather than their traditional clothes), he also granted them increased autonomy and the ability to govern themselves within the prison. Pleased by his success, he was said to have supported the motto, "Kill the Indian, Save the Man." Pratt said in a speech in 1892:

"A great general has said that the only good Indian is a dead one. In a sense, I agree with the sentiment, but only in this: that all the Indian there is in the race should be dead."

Pratt provided for some of the younger men to pursue more education at the Hampton Institute, a historically black college founded in 1868 for the education of freedmen by biracial representatives of the American Missionary Association soon after the Civil War. Following Pratt's sponsored students, Hampton in 1875 developed a program for Native American students.

Pratt continued the assimilation model in developing the Carlisle Indian Industrial School. He felt that within one generation Native children could be integrated into Euro-American culture. With this perspective he proposed an expensive experiment to the federal government: a school that would require Native children to move far away from their homes to attend. The Carlisle Indian school, which became the template for over 300 schools across the United States, opened in 1879 at Carlisle Barracks, an abandoned Pennsylvania military base. It became the first such school that was not on a reservation.

The Carlisle curriculum was heavily based on the culture and society of rural America. The classes included vocational training for boys and domestic science for girls. Students worked to carry out chores that helped sustain the farm and food production for the self-supporting school. They were also able to produce goods to sell at market. Carlisle students produced a newspaper, had a well-regarded chorus and orchestra, and developed sports programs. In the summer students often lived with local farm families and townspeople, reinforcing their assimilation and providing labor at low cost to the families.

Federally supported boarding schools

Children working in a school's garden
 
Thanksgiving Day play

Carlisle and its curriculum became the model for the Bureau of Indian Affairs. By 1902 it authorized 25 federally funded off-reservation schools in 15 states and territories, with a total enrollment of over 6,000 students. Federal legislation required Native American children to be educated according to Anglo-American standards. Parents had to authorize their children's attendance at boarding schools and, if they refused, officials could use coercion to gain a quota of students from any given reservation.

Boarding schools were also established on reservations, where they were often operated by religious missions or institutes, which were generally independent of the local diocese, in the case of Catholic orders. Because of the distances, often Native American children were separated from their families and tribes when they attended such schools on other reservations. At the peak of the federal program, the BIA supported 350 boarding schools.

In the late 19th and early 20th centuries, when students arrived at boarding schools, their lives altered dramatically. They were given short haircuts (a source of shame for boys of many tribes, who considered long hair part of their maturing identity), required to wear uniforms, and to take English names for use at the school. Sometimes the names were based on their own; other times they were assigned at random. The children were not allowed to speak their own languages, even between each other. They were required to attend church services and were often baptized as Christians. As was typical of the time, discipline was stiff in many schools. It often included assignment of extra chores for punishment, solitary confinement and corporal punishment, including beatings by teachers using sticks, rulers and belts.

Anna Moore said, regarding the Phoenix Indian School:

If we were not finished [scrubbing the dining room floors] when the 8 a.m. whistle sounded, the dining room matron would go around strapping us while we were still on our hands and knees.

Abuse in the boarding schools

Young girls posed in room

The children who were admitted into boarding schools experienced several forms of abuse. They were given white names, forced to speak English, and not allowed to practice their culture. They took classes on how to conduct manual labor such as farming and housekeeping. When they were not in class, they were expected to maintain the upkeep of the schools. Unclean and overcrowded living conditions led to the spread of disease, and many students did not receive enough food. Bounties were offered for students who tried to run away, and many students committed suicide. Students who died were sometimes placed in coffins and buried in the school cemetery by their own classmates.

Indigenous children were forcibly removed from their families and admitted to these boarding schools. Their cultural traditions were discarded when they were taught about American ideas of refinement and civilization. This forced assimilation increased substance abuse and suicides among these students as they suffered mental illnesses such as depression and PTSD. These illnesses also increased the risk of developing cardiovascular diseases.

The sexual abuse of Indigenous children in boarding schools was perpetrated by the administrators of these programs, as well as by the teachers, nuns, and priests charged with educating them. Some of these adults treated the students as objects, molesting them in rotation and using sexual abuse as a means of humiliation. In tracing the path of violence, several students experienced an assault that "can only be described as unconscionable, it was a violation not only of a child's body but an assault on their spirit". Most of the children who were victimized suffered in silence. Such abuse recurred in boarding schools across the nation in different forms, ranging from boys being sexually assaulted on their 13th birthdays to girls being taken from their beds at night by priests.

Dr. Jon Reyhner described methods of discipline, noting that "[t]he boys were laid on an empty barrel and whipped with a long leather strap". Such methods left physical injuries and made the institutions dangerous for the children, who lived in fear of violence. Many children did not recover from the wounds caused by this abuse, as their injuries were often left untreated.

Legality and policy

In 1776, the Continental Congress authorized the Indian commissioners to engage ministers as teachers to work with Indians. This movement increased after the War of 1812.

In 1819, Congress appropriated $10,000 to hire teachers and maintain schools. These resources were allocated to the missionary church schools because the government had no other mechanism to educate the Indian population.

In 1887, to provide funding for more boarding schools, Congress passed the Compulsory Indian Education Act.

In 1891, a compulsory attendance law enabled federal officers to forcibly take Native American children from their homes and reservations. The American government believed they were rescuing these children from a world of poverty and depression and teaching them "life skills".

Tabatha Toney Booth of the University of Central Oklahoma wrote in her paper, Cheaper Than Bullets,

"Many parents had no choice but to send their kids, when Congress authorized the Commissioner of Indian Affairs to withhold rations, clothing, and annuities of those families that refused to send students. Some agents even used reservation police to virtually kidnap youngsters, but experienced difficulties when the Native police officers would resign out of disgust, or when parents taught their kids a special 'hide and seek' game. Sometimes resistant fathers found themselves locked up for refusal. In 1895, nineteen men of the Hopi Nation were imprisoned at Alcatraz because they refused to send their children to boarding school."

Between 1778 and 1871, the federal government signed 389 treaties with American Indian tribes. Most of these treaties contained provisions that the federal government would provide education and other services in exchange for land. The last of these treaties, the Fort Laramie Treaty of 1868, established the Great Sioux Reservation. One particular article in the Fort Laramie Treaty illustrates the attention the federal government paid to the "civilizing" nature of education: "Article 7. In order to insure the civilization of the Indians entering into this treaty the necessity of education is admitted, especially of such of them as are or may be settled on said agricultural reservations, and they therefore pledge themselves to compel their children, male and female, between the ages of six and sixteen years to attend school."

Use of the English language in the education of American Indian children was first mentioned in the report of the Indian Peace Commission, a body appointed by an act of Congress in 1867. The report stated that the difference of languages was a major problem and advocated elimination of Indian languages and replacement of them with English. This report created a controversy in Indian education because the missionaries who had been responsible for educating Native youth used a bilingual instructional policy. In 1870, President Grant criticized this bilingual approach, beginning a new policy with eradication of Native languages as a major goal.

In 1871, the United States government prohibited further treaties with Indian nations and also passed the Appropriations Act for Indian Education requiring the establishment of day schools on reservations.

In 1873, the Board of Indian Commissioners argued in a report to Congress that day schools were ineffective at teaching Indian children English because the children spent 20 hours per day at home speaking their native languages. The Senate and House Indian Affairs committees joined in the criticism of day schools a year later, arguing that they operated too much to perpetuate "the Indian as special-status individual rather than preparing for him independent citizenship".

"The boarding school movement began after the Civil War, when reformers turned their attention to the plight of Indian people and advocated for proper education and treatment so that Indians could become like other citizens. One of the first efforts to accomplish this goal was the establishment of the Carlisle Indian School in Pennsylvania, founded in 1879." The leader of the school, General Pratt, also employed the "outing system", which placed Indians in non-Indian homes during the summers and for three years following high school to learn non-Indian culture. Government subsidies were paid to participating families. Pratt believed that this was both educating American Indians and making them Americans. In 1900, 1,880 Carlisle students participated in this system, each with his or her own bank account.

In the late 1800s, the federal government pursued a policy of total assimilation of the American Indian into mainstream American society.

In 1918, the Carlisle boarding school was closed because Pratt's method of assimilating American Indian students through off-reservation boarding schools was perceived as outdated. That same year Congress passed new Indian education legislation, the Act of May 25, 1918. It generally forbade expenditures for the separate education of children less than 1/4 Indian whose parents were citizens of the United States and who lived in an area where adequate free public schools were provided.

Meriam Report of 1928

In 1926, the Department of the Interior (DOI) commissioned the Brookings Institution to conduct a survey of the overall conditions of American Indians and to assess federal programs and policies. The Meriam Report, officially titled The Problem of Indian Administration, was submitted February 21, 1928, to Secretary of the Interior Hubert Work. Related to education of Native American children, it recommended that the government:

  • Abolish The Uniform Course of Study, which taught only European-American cultural values;
  • Educate younger children at community schools near home, and have older children attend non-reservation schools for higher grade work;
  • Have the Indian Service (now Bureau of Indian Affairs) provide American Indians the education and skills they need to adapt both in their own communities and United States society.

The Indian Reorganization Act of 1934

The Indian Reorganization Act of 1934 ended the allotment period, confirmed the rights of Indian self-government, and made Indians eligible to hold Bureau of Indian Affairs posts, which encouraged Indians to attend vocational schools and colleges. During this period there was an effort to encourage the development of community day schools; however, public school attendance for Indian children was also encouraged. In the same year, the Johnson–O'Malley Act (JOM) was passed, which provided for the reimbursement of states for the cost of educating Indian students in public schools. This federal-state contract provided that a specified sum be paid by the federal government and held the state responsible for the education and welfare of Indians within its boundaries. Funds made available under the act were designated to help reduce enrollment at Indian boarding schools by placing students in public schools instead.

The termination period

In 1953, Congress passed House Concurrent Resolution 108, which set a new direction in federal policy toward Indians. The major spokesperson for the resolution, Senator Arthur Watkins (Utah), stated: "As rapidly as possible, we should end the status of Indians as wards of the government and grant them all the rights and prerogatives pertaining to American citizenship." The federal government implemented another new policy aimed at relocating Indian people to cities and away from the reservations, terminating the tribes as separate entities. Sixty-one tribes were terminated during that period.

1968 onward

In 1968, President Lyndon B. Johnson ended this practice and the termination period. He also directed the Secretary of the Interior to establish school boards for federal Indian schools, composed of members of the communities they served.

Major legislation aimed at improving Indian education occurred in the 1970s. In 1972, Congress passed the Indian Education Act, which established a comprehensive approach to meeting the unique needs of American Indian and Alaska Native students. This Act recognized that American Indians have unique educational and culturally related academic needs and distinct language and cultural needs. The most far-reaching legislation to be signed during the 1970s, however, was the Indian Self-Determination and Education Assistance Act of 1975, which guaranteed tribes the opportunity to determine their own futures and the education of their children through funds allocated to and administered by individual tribes.

Disease and death

Given the lack of public sanitation and the often crowded conditions at boarding schools in the early 20th century, students were at risk for infectious diseases such as tuberculosis, measles, and trachoma. None of these diseases was yet treatable by antibiotics or controlled by vaccines, and epidemics swept through the schools as they did through cities.

The overcrowding of the schools contributed to the rapid spread of disease within the schools. "An often-underpaid staff provided irregular medical care. And not least, apathetic boarding school officials frequently failed to heed their own directions calling for the segregation of children in poor health from the rest of the student body". Tuberculosis was especially deadly among students. Many children died while in custody at Indian schools. Often students were prevented from communicating with their families, and parents were not notified when their children fell ill; the schools also failed sometimes to notify them when a child died. "Many of the Indian deaths during the great influenza pandemic of 1918–1919, which hit the Native American population hard, took place in boarding schools."

The 1928 Meriam Report noted that infectious disease was often widespread at the schools due to malnutrition, overcrowding, poor sanitary conditions, and students weakened by overwork. The report said that death rates for Native American students were six and a half times higher than for other ethnic groups. A report regarding the Phoenix Indian School said, "In December of 1899, measles broke out at the Phoenix Indian School, reaching epidemic proportions by January. In its wake, 325 cases of measles, 60 cases of pneumonia, and 9 deaths were recorded in a 10-day period."

Implications of assimilation

Teacher and young boys posed for photograph

From 1810 to 1917, the U.S. federal government subsidized mission and boarding schools. By 1885, 106 Indian schools had been established, many of them on abandoned military installations. Using military personnel and Indian prisoners, boarding schools were seen as a means for the government to achieve assimilation of Native Americans into mainstream American culture. Assimilation efforts included forcibly removing Native Americans from their families, converting them to Christianity, preventing them from learning or practicing indigenous culture and customs, and living in a strict military fashion.

When students arrived at boarding schools, the routine was typically the same. First, the students were forced to give up their tribal clothing and their hair was cut. Second, "[t]o instill the necessary discipline, the entire school routine was organized in martial fashion, and every facet of student life followed a strict timetable".

One student recalled the routine in the 1890s:

Young boys eating in the dining hall

A small bell was tapped, and each of the pupils drew a chair from under the table. Supposing this act meant that they were to be seated, I pulled out mine and at once slipped into it from one side. But when I turned my head, I saw that I was the only one seated, and all the rest at our table remained standing. Just as I began to rise, looking shyly around to see how chairs were to be used, a second bell was sounded. All were seated at last, and I had to crawl back into my chair again. I heard a man's voice at one end of the hall, and I looked around to see him. But all the others hung their heads over their plates. As I glanced at the long chain of tables, I caught the eyes of a paleface woman upon me. Immediately I dropped my eyes, wondering why I was so keenly watched by the strange woman. The man ceased his mutterings, and then a third bell was tapped. Everyone picked up his knife and fork and began eating. I began crying instead, for by this time I was afraid to venture anything more.

Besides mealtime routines, administrators "educated" Indigenous students in how to farm using European-based methods, which they considered superior to indigenous methods. Given the constraints of rural locations and limited budgets, boarding schools often operated supporting farms, raising livestock and producing their own vegetables and fruit.

Children doing calisthenics

From the moment students arrived at school, they could not "be Indian" in any way. Boarding school administrators "forbade, whether in school or on reservation, tribal singing and dancing, along with the wearing of ceremonial and 'savage' clothes, the practice of native religions, the speaking of tribal languages, the acting out of traditional gender roles". School administrators argued that young women needed to be specifically targeted due to their important place in continuing assimilation education in their future homes. Educational administrators and teachers were instructed that "Indian girls were to be assured that, because their grandmothers did things in a certain way, there was no reason for them to do the same".

Removal to reservations in the West in the early part of the century and the enactment of the Dawes Act in 1887 eventually took nearly 50 million acres of land from Indian control. On-reservation schools were either taken over by Anglo leadership or destroyed. Indian-controlled school systems became non-existent while "the Indians [were] made captives of federal or mission education".

Although schools did use verbal correction to enforce assimilation, more violent measures were also used, as corporal punishment was common in European American society. Archuleta et al. (2000) noted cases where students had "their mouths washed out with lye soap when they spoke their native languages; they could be locked up in the guardhouse with only bread and water for other rule violations; and they faced corporal punishment and other rigid discipline on a daily basis". Beyond physical and mental abuse, some school authorities sexually abused students as well.

One former student recounted,

Intimidation and fear were very much present in our daily lives. For instance, we would cower from the abusive disciplinary practices of some superiors, such as the one who yanked my cousin's ear hard enough to tear it. After a nine-year-old girl was raped in her dormitory bed during the night, we girls would be so scared that we would jump into each other's bed as soon as the lights went out. The sustained terror in our hearts further tested our endurance, as it was better to suffer with a full bladder and be safe than to walk through the dark, seemingly endless hallway to the bathroom. When we were older, we girls anguished each time we entered the classroom of a certain male teacher who stalked and molested girls.

Girls and young women taken from their families and placed into boarding schools, such as the Hampton Normal and Agricultural Institute, were urged to accomplish the U.S. federal government's vision of "educating Indian girls in the hope that women trained as good housewives would help their mates assimilate" into U.S. mainstream culture.

Historian Brenda Child asserts that boarding schools cultivated pan-Indianism and made possible cross-tribal coalitions that helped many different tribes collaborate in the later 20th century. She argues:

People formerly separated by language, culture, and geography lived and worked together in residential schools. Students formed close bonds and enjoyed a rich cross-cultural exchange. Graduates of government schools often married former classmates, found employment in the Indian Service, migrated to urban areas, returned to their reservations and entered tribal politics. Countless new alliances, both personal and political, were forged in government boarding schools.

Jacqueline Emery, introducing an anthology of boarding school writings, suggests that these writings prove that the children showed a cultural and personal resilience "more common among boarding school students than one might think". Although school authorities censored the material, it demonstrates multiple methods of resistance to school regimes. Several students educated in boarding schools, such as Gertrude Bonnin, Angel De Cora, Francis La Flesche, and Laura Cornelius Kellogg, became highly educated and were precursors to modern Indigenous activists.

After release or graduation from Indian boarding schools, students were expected to return to their tribes and induce European assimilation there. Many students who returned to their reservations experienced alienation, language and cultural barriers, and confusion, in addition to posttraumatic stress disorder and the legacy of trauma from abuse. They struggled to respect elders, but also met resistance from family and friends when trying to initiate Anglo-American changes.

Despite the hardships of boarding school life, students and their families built a foundation of resistance. Native students used the education they received to speak out and engage in activism and political work. Families also resisted the forcible removal of their children, hiding them or encouraging them to run away. These efforts were not always successful, but they were a persistent form of resistance throughout the period.

Historians Brian Klopotek and Brenda Child describe "a remote Indian population living in Northern Minnesota who, in 1900, took a radical position against the construction of a government school." This Ojibwe community met construction on its land with armed resistance: Ojibwe men stood as armed guards around the construction site, making clear that the workmen were not welcome to build in their territory. Armed resistance of this kind occurred in Native communities throughout the boarding school period, as many Indigenous communities rebelled against encroachment on their land.

A common resistance tactic among students was to keep speaking their mother tongue. The schools sought to extinguish students' first languages and force adaptation to English, so speaking a tribal language became a symbol of continuing attachment to one's culture. Doing so brought physical punishment, but students persisted, showing that their roots were deeply held and could not be replaced by force. Another form of resistance was deliberate misbehavior: consistently breaking rules, acting out, and starting fires or fights in the hope of being expelled and sent home. Such resistance, inspired by fellow students and by earlier resistance to colonization, took courage; it helped students keep their mother tongues, cultures, and Native identities, and slowed the schools' efforts to impose American culture.

The effects of the boarding schools have remained difficult for Indigenous communities to forgive. Mary Annette Pember, whose mother was forced to attend St. Mary's Catholic Boarding School in Wisconsin, recounts that her mother often recollected "the beatings, the shaming, and the withholding of food" by the nuns. The traumatic effects of the boarding schools have continued for generations of Native people who never attended the schools themselves, including family members of survivors and of missing loved ones.

When faculty visited former students, they rated their success based on the following criteria: "orderly households, 'citizen's dress', Christian weddings, 'well-kept' babies, land in severalty, children in school, industrious work habits, and leadership roles in promoting the same 'civilized' lifestyles among family and tribe". Many students returned to the boarding schools. General Richard Henry Pratt, an administrator who had founded the Carlisle Indian Industrial School, began to believe that "[t]o civilize the Indian, get him into civilization. To keep him civilized, let him stay."

Schools in mid-20th century and later changes

Attendance in Indian boarding schools generally increased throughout the first half of the 20th century, doubling by the 1960s. In 1969, the BIA operated 226 schools in 17 states, including on reservations and in remote geographical areas. Some 77 were boarding schools. A total of 34,605 children were enrolled in the boarding schools; 15,450 in BIA day schools; and 3,854 were housed in dormitories "while attending public schools with BIA financial support. In addition, 62,676 Indian youngsters attend public schools supported by the Johnson-O'Malley Act, which is administered by BIA."

Enrollment reached its highest point in the 1970s. In 1973, 60,000 American Indian children are estimated to have been enrolled in an Indian boarding school.

The rise of pan-Indian activism, tribal nations' continuing complaints about the schools, and studies in the late 1960s and mid-1970s (such as the Kennedy Report of 1969 and the National Study of American Indian Education) led to passage of the Indian Self-Determination and Education Assistance Act of 1975. This emphasized authorizing tribes to contract with federal agencies in order to take over management of programs such as education. It also enabled the tribes to establish community schools for their children on their reservations.

In 1978, Congress passed and the President signed the Indian Child Welfare Act, giving Native American parents the legal right to refuse their child's placement in a school. Damning evidence related to years of abuses of students in off-reservation boarding schools contributed to the enactment of the Indian Child Welfare Act. Congress approved this act after hearing testimony about life in Indian boarding schools.

As a result of these changes, many large Indian boarding schools closed in the 1980s and early 1990s. Some located on reservations were taken over by tribes. By 2007, the number of American Indian children living in Indian boarding school dormitories had declined to 9,500. This figure includes those in 45 on-reservation boarding schools, seven off-reservation boarding schools, and 14 peripheral dormitories. From 1879 to the present day, it is estimated that hundreds of thousands of Native Americans attended Indian boarding schools as children.

In the early 21st century, about two dozen off-reservation boarding schools still operate, but funding for them has declined.

Native American tribes developed one of the first women's colleges.

21st century

Circa 2020, the Bureau of Indian Education operates approximately 183 schools, primarily non-boarding and primarily located on reservations, serving about 46,000 students. Modern criticisms focus on the quality of education provided and compliance with federal education standards. In March 2020, the BIA finalized a rule creating a Standards, Assessments and Accountability System (SAAS) for all BIA schools, intended to prepare BIA students for college and careers.

Person–environment fit

Person–environment fit (P–E fit) is the degree to which individual and environmental characteristics match. Person characteristics may include an individual's biological or psychological needs, values, goals, abilities, or personality, while environmental characteristics could include intrinsic and extrinsic rewards, demands of a job or role, cultural values, or characteristics of other individuals and collectives in the person's social environment. Due to its important implications in the workplace, person–environment fit has maintained a prominent position in Industrial and organizational psychology and related fields.

Person–environment fit can be understood as a specific type of person–situation interaction that involves the match between corresponding person and environment dimensions. Even though person–situation interactions as they relate to fit have been discussed in the scientific literature for decades, the field has yet to reach consensus on how to conceptualize and operationalize person–environment fit. This is due partly to the fact that person–environment fit encompasses a number of subsets, such as person–supervisor fit and person–job fit, which are conceptually distinct from one another. There has been a long debate about the relative importance of the person versus the situation in determining human behavior: one group of researchers has argued that the situation is primarily responsible for individual behavior, while another believes that personal characteristics are primarily responsible. Nevertheless, it is generally assumed that person–environment fit leads to positive outcomes, such as satisfaction, performance, and overall well-being.

Domains

Person–organization fit

Person–organization fit (P–O fit) is the most widely studied area of person–environment fit, and is defined by Kristof (1996) as "the compatibility between people and organizations that occurs when (a) at least one entity provides what the other needs, (b) they share similar fundamental characteristics, or (c) both". High value congruence is a large facet of person–organization fit; it implies a strong culture and shared values among coworkers, which can translate to increased levels of trust and a shared sense of corporate community. High value congruence in turn benefits the organization itself through reduced turnover, increased organizational citizenship behaviors, and greater organizational commitment. The attraction–selection–attrition theory states that individuals are attracted to, and seek to work for, organizations where they perceive high levels of person–organization fit.

Person–job fit

Person–job fit, or P–J fit, refers to the compatibility between a person's characteristics and those of a specific job. The complementary perspective has been the foundation for person–job fit. This includes the traditional view of selection, which emphasizes matching an employee's knowledge, skills, and abilities (KSAs) and other qualities to job demands, as well as discrepancy models of job satisfaction and stress, which focus on whether employees' needs and desires are met by the supplies their job provides.

Person–group fit

Person–group fit, or P–G fit, is a relatively new topic with regard to person–environment fit. Since person–group fit is so new, limited research has been conducted to demonstrate how the psychological compatibility between coworkers influences individual outcomes in group situations. However, person–group fit is most strongly related to group-oriented outcomes like co-worker satisfaction and feelings of cohesion.

Person–person fit

Person–person fit is conceptualized as the fit between an individual's cultural preferences and those of others. It corresponds to the similarity-attraction hypothesis, which states that people are drawn to others who are similar in values, attitudes, and opinions. The most studied pairings are mentors and protégés, supervisors and subordinates, and applicants and recruiters. Research has shown that person–supervisor fit is most strongly related to supervisor-oriented outcomes like supervisor satisfaction.

Antecedents

Training and development

Training and development on the job can be used to update or enhance skills or knowledge so employees are more in tune with the requirements and demands of their jobs, or to prepare them to make the transition into new ones. Training can be used as a socialization method, or as a way of making the employee aware of the organization's desired values, which would aid in increasing person–organization fit. As people learn about the organization they are working for through either company-initiated or self-initiated socialization, they should be able to be more accurate in their appraisal of fit or misfit. Furthermore, there is evidence that employees come to identify with their organization over time by mirroring its values, and socialization is a critical part of this process.

Performance appraisal

In the workplace, performance appraisal and recognition or rewards can be used to stimulate skill-building and knowledge enhancement, which would thereby enhance person–job fit. Expanding upon this notion, Cable and Judge (1994) showed that compensation systems have a direct effect on job search decisions, and additionally, the effects of compensation systems on job search decisions are strengthened when the applicant's personality characteristics fit with the various components of the compensation system. When an employer's aim is to strengthen person–organization fit, they can use performance appraisal to focus on an employee's value and goal congruence, and ensure the individual's goals are in line with the company's goals.

On a group-level, organizations could evaluate the achievement of a group or team goal. Recognizing and supporting this achievement would build trust in the idea that everyone is contributing to the collective for the greater good, and aid in increasing person–group fit.

Attraction–selection–attrition processes

Schneider (1987) proposed the attraction–selection–attrition (ASA) model, which addresses how attraction, selection, and attrition can generate high levels of fit in an organization. The model is based on the proposition that it is the collective characteristics of its people that define an organization. As a result, through the ASA process, organizations become more homogeneous with respect to the people in them.

The attraction process of the model explains how prospective employees find organizations attractive when they see congruence between their own characteristics and the organization's values. The next step in the ASA process is selection: the formal or informal procedures the organization uses during recruitment and hiring to choose applicants who fit the organization.

From the employee life cycle, recruitment and selection are the first stages taken into account when considering person–environment fit. The complementary model would posit that selection processes work in part to select individuals whose values are compatible with those of the organization and to screen out those whose values are incompatible. Additionally, in accordance with supplementary fit models, applicants will seek out and apply to organizations that they feel represent their own values. This is exemplified by a study by Bretz and Judge (1994), which found that individuals who scored high on team-orientation measures were likely to pick an organization that had good work–family policies in place. Along the same vein, when job searching, applicants will look for job characteristics such as the amount of participation they will have, autonomy, and the overall design of the job. These characteristics are significantly and positively related to person–organization and person–job fit, which are in turn positively associated with job satisfaction measured one year after entry.

The last process in the ASA model is attrition: employees who do not fit are more likely to make errors once hired, and therefore to leave the organization. Thus, the people who do not fit choose or are forced to leave, and those remaining are a more homogeneous group than those originally hired, which should then result in higher levels of fit for individuals in the organization.

Lastly, the research suggests that for a better fit between an employee and a job, organization, or group to be more probable, it is important to spend an adequate amount of time with the applicant. This is because spending time with members before they enter the firm has been found to be positively associated with the alignment between individual values and firm values at entry. Furthermore, if there are more extensive HR practices in place in the selection phase of hiring, then people are more likely to report that they experience better fits with their job and the organization as a whole.

Consequences

Few studies have taken up the task of synthesizing the different types of fit in order to draw significant conclusions about the true impact of fit on individual-level outcomes. Some progress has been made, but most existing reviews have been non-quantitative, have not differentiated between types of fit, or have focused solely on a single type of person–environment fit.

Person–environment fit has been linked to a number of affective outcomes, including job satisfaction, organizational commitment, and intent to quit. Of these, job satisfaction is the attitude most strongly predicted by person–job fit. Stress has also been demonstrated as a consequence of poor person–environment fit, especially in the absence of the complementary fit dimension. Since the main effects of E are often greater than those of P, insufficient supplies (P > E) are more detrimental to attitudes than excess supplies (P < E).

Assessing fit

Direct measures

Compatibility between the person and the environment can be assessed directly or indirectly, depending on the measure. Direct measures of perceived fit are typically used when person–environment fit is conceptualized as general compatibility. These measures ask an individual to report the fit that he or she believes exists; examples include “How well do you think you fit in the organization?” or “How well do your skills match the requirements of your job?” The assumption is that individuals assess P and E characteristics and then determine how compatible they are. Although research has shown that these judgements are highly related to job attitudes, they have been criticized because they confound the independent effects of the person and the environment with their joint effect and do not adequately capture the psychological process by which people compare themselves to the environment.

Indirect measures

Indirect measures assess the person and environment separately. These measures are then used to compute an index intended to represent the fit between the person and environment, such as an algebraic, absolute, or squared difference score, or are analyzed jointly to assess the effects of fit without computing a difference score. Characteristics of the person are generally measured through self-report, while characteristics of the environment can be reported by the person or by others in the person's environment. French et al. (1974, 1982) differentiated subjective fit, the match between P and E as perceived by the employee, from objective fit, the match between P and E as they exist independent of the person's perceptions.

Difference Scores and Profile Correlation

Up until the 1990s, studies using indirect measures of the person and environment typically operationalized fit by combining the measures into a single index representing the difference between the person and environment. Despite their intuitive appeal, difference scores are plagued with numerous conceptual and methodological problems, such as reduced reliability, conceptual ambiguity, confounded effects, untested constraints, and reducing an inherently three-dimensional relationship between the person, the environment, and the outcome to two dimensions. These problems undermine the interpretation of the results of person-environment fit studies that rely on difference scores. Similar problems apply to studies that operationalize fit using profile similarity indices that compare the person and environment on multiple dimensions.
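As a minimal illustration (the function names and values here are invented for demonstration, not drawn from the studies discussed), the three common difference-score indices can be computed from separate P and E measures, which makes one of the conceptual problems visible: the absolute and squared forms discard the direction of misfit.

```python
# Three difference-score indices used to operationalize person-environment
# fit from separately measured P (person) and E (environment) scores.

def algebraic_diff(p, e):
    return p - e          # signed: preserves the direction of misfit

def absolute_diff(p, e):
    return abs(p - e)     # unsigned: collapses deficiency and excess

def squared_diff(p, e):
    return (p - e) ** 2   # unsigned and nonlinear

# Two employees with opposite misfit: excess supplies (P < E) versus
# insufficient supplies (P > E).
p1, e1 = 3, 5   # P < E
p2, e2 = 5, 3   # P > E

print(algebraic_diff(p1, e1), algebraic_diff(p2, e2))  # -2 2: direction kept
print(absolute_diff(p1, e1), absolute_diff(p2, e2))    # 2 2: direction lost
print(squared_diff(p1, e1), squared_diff(p2, e2))      # 4 4: direction lost
```

Because deficiency and excess collapse to the same absolute or squared score, such indices cannot distinguish the two forms of misfit, one facet of the confounding problems noted above.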

Polynomial regression

Many of the problems with difference scores and profile similarity indices can be avoided by using polynomial regression. Polynomial regression involves using measures of the person and environment along with relevant higher-order terms (e.g., the squares and product of the person and environment measures) as joint predictors. In addition to avoiding problems with difference scores, polynomial regression allows for the development and testing of hypotheses that go beyond the simple functions captured by difference scores. The polynomial regression equation commonly used in person-environment fit research is as follows:

Z = b0 + b1E + b2P + b3E² + b4(E × P) + b5P² + e

In this equation, E represents the environment, P represents the person, and Z is the outcome (e.g., satisfaction, well-being, performance). By retaining E, P, and Z as separate variables, results from polynomial regression equations can be translated into three-dimensional surfaces, whose properties can be formally tested using procedures set forth by Edwards and Parry. Studies using polynomial regression have found that the restrictive assumptions underlying difference scores are usually rejected, such that the relationship of the person and environment to outcomes is more complex than the simplified functions represented by difference scores. These findings have provided a foundation for developing fit hypotheses that are more refined than those considered in prior research, such as considering whether the effects of misfit are asymmetric and whether outcomes depend on the absolute levels of the person and environment (e.g., the effects of fit between actual and desired job complexity are likely to vary depending on whether job complexity is low or high).
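As a sketch of how such an analysis might be run (the data here are synthetic and the variable names illustrative, not taken from any cited study), the quadratic form with E, P, their squares, and their product can be estimated by ordinary least squares:

```python
# Minimal sketch: estimating the quadratic polynomial-regression model
# Z = b0 + b1*E + b2*P + b3*E^2 + b4*E*P + b5*P^2 + e on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
E = rng.uniform(1, 7, 200)   # environment measure (e.g., supplied complexity)
P = rng.uniform(1, 7, 200)   # person measure (e.g., desired complexity)
# Outcome generated so that it truly depends on the squared difference:
Z = 5 - 0.5 * (E - P) ** 2 + rng.normal(0, 0.1, 200)

# Design matrix with first- and second-order terms of E and P.
X = np.column_stack([np.ones_like(E), E, P, E**2, E * P, P**2])
b, *_ = np.linalg.lstsq(X, Z, rcond=None)

# A squared-difference score implicitly constrains b3 = b5 and b4 = -2*b3.
# Estimating the six coefficients freely lets those constraints be tested
# rather than assumed; here they hold because Z was built from (E - P)^2.
print(np.round(b, 2))
```

Estimating the coefficients freely, rather than forcing them into a difference score, is what permits the response-surface tests described above and reveals when the constraints implied by difference scores fail.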

Contributing theories

Supplementary fit

Supplementary fit refers to the similarity between characteristics of a person and characteristics of the environment, or of other persons within the environment. Based on compatibility that derives from similarity, a person fits into some environmental context because he or she supplements, embellishes, or possesses characteristics that are similar to those of other individuals in the environment. People perceive themselves as fitting in because they are like other people possessing the same characteristics. Therefore, it is essentially a model of person–person fit.

Complementary fit

Complementary fit occurs when a person's characteristics "make whole" the environment or add to it what is missing. Individuals and environments complement one another when they address each other's needs, such as when an environment provides opportunities for achievement that are concordant with an individual's need for achievement, or when an individual with exceptional problem-solving skills enters an environment that is in turmoil. Piasentin and Chapman (2007) found that only a small portion of the workforce perceives fit as complementary, while most view fit as supplementary (resulting from being similar to others).

A second distinction within complementary fit is between needs–supplies fit and demands–abilities fit. Needs–supplies fit occurs when an environment satisfies an individual's needs, desires, or preferences. Demands–abilities fit occurs when an individual has the abilities required to meet environmental demands.

A third distinction is between perceived and actual fit. Perceived fit is typically measured by explicitly asking people to what degree they believe a fit exists; good fit is said to exist as long as it is perceived to exist, regardless of whether the person actually is similar to, complements, or is complemented by the environment. Actual fit is measured by comparing characteristics at two levels, the individual and the environment.

Implications for practice

Person–environment fit has important implications for organizations because it is critical for them to establish and maintain a “good fit” between people and their jobs. Companies use a substantial amount of resources when recruiting new employees, and it is crucial for them to ensure that these new hires will align with the environment they are thrust into. Furthermore, it has been theorized that person–environment fit can mediate the relation of group-specific workplace experiences with job outcomes.

Evolutionary mismatch

Evolutionary mismatch, also known as mismatch theory or evolutionary trap, is a concept in evolutionary biology that refers to evolved traits that were once advantageous but became maladaptive due to changes in the environment. This can take place in humans and animals and is often attributed to rapid environmental change.

Timeline showing a period of mismatch following an environmental change.

Mismatch theory represents the idea that traits that evolved in an organism in one environment can be disadvantageous in a different environment. This environmental change leading to evolutionary mismatch can be broken down into two major categories: temporal (change of the existing environment over time, e.g. a climate change) or spatial (placing organisms into a new environment, e.g. a population migrating). Since environmental change occurs naturally and constantly, there will certainly be examples of evolutionary mismatch over time. However, because large-scale natural environmental change – like a natural disaster – is often rare, it is less often observed. Another more prevalent kind of environmental change is anthropogenic (human-caused). In recent times, humans have had a large, rapid, and trackable impact on the environment, thus creating scenarios where it is easier to observe evolutionary mismatch.

Because of the mechanism of evolution by natural selection, the environment ("nature") determines ("selects") which traits will persist in a population. Therefore, there will be a gradual weeding out of disadvantageous traits over several generations as the population becomes more adapted to its environment. Any significant change in a population's traits that cannot be attributed to other factors (such as genetic drift and mutation) will be in response to a change in that population's environment; in other words, natural selection is inherently reactive. Shortly following an environmental change, traits that evolved in the previous environment, whether they were advantageous or neutral, persist for several generations in the new environment. Because evolution is gradual and environmental changes often occur very quickly on a geological scale, there is always a period of "catching-up" as the population evolves to become adapted to the environment. It is this temporary period of "disequilibrium" that is referred to as mismatch. Mismatched traits are ultimately addressed in one of several possible ways: the organism may evolve such that the maladaptive trait is no longer expressed, the organism may decline and/or become extinct as a result of the disadvantageous trait, or the environment may change such that the trait is no longer selected against.

History

As evolutionary thought became more prevalent, scientists studied and attempted to explain the existence of disadvantageous traits, known as maladaptations, that are the basis of evolutionary mismatch.

The theory of evolutionary mismatch began under the term evolutionary trap as early as the 1940s. In his 1942 book, evolutionary biologist Ernst Mayr described evolutionary traps as the phenomenon that occurs when a genetically uniform population suited for a single set of environmental conditions is susceptible to extinction from sudden environment changes. Since then, key scientists such as Warren J. Gross and Edward O. Wilson have studied and identified numerous examples of evolutionary traps.

The first occurrence of the term "evolutionary mismatch" may have been in a paper by Jack E. Riggs published in the Journal of Clinical Epidemiology in 1993. In the years since, the term has become widely used to describe biological maladaptations in a wide range of disciplines. A coalition of scientists and community organizers founded the Evolution Institute in 2008, which in 2011 published a more recent culmination of information on evolutionary mismatch theory in an article by Elisabeth Lloyd, David Sloan Wilson, and Elliott Sober. In 2018, evolutionary psychologists published a popular science book on evolutionary mismatch and its implications for humans.

Mismatch in human evolution

The Neolithic Revolution: transitional context

The Neolithic Revolution brought about a significant change in the human way of life: the transition, approximately 10,000–12,000 years ago, from a hunter-gatherer lifestyle in which humans foraged for food to an agricultural one. Humans began to domesticate both plants and animals, securing constant food resources. This transition quickly and dramatically changed how humans interacted with the environment, as societies took up farming and animal husbandry. However, human bodies had evolved to suit the previous foraging lifestyle. The slow pace of evolution, compared with the very fast pace of human cultural advancement, allowed these adaptations to persist in an environment where they are no longer necessary. In societies that now function in a vastly different way from the hunter-gatherer lifestyle, these outdated adaptations lead to maladaptive, or mismatched, traits.

Obesity and diabetes

Human bodies are predisposed to maintain homeostasis, especially when storing energy as fat. This trait is the main basis for the "thrifty gene hypothesis", the idea that "feast-or-famine conditions during human evolutionary development naturally selected for people whose bodies were efficient in their use of food calories". Hunter-gatherers, who lived under environmental stress, benefited from this trait: it was uncertain when the next meal would come, and they spent most of their time performing high levels of physical activity. Those who consumed many calories therefore stored the extra energy as fat, which they could draw upon in times of hunger.

However, modern humans now live in a world of sedentary lifestyles and convenience foods. People sit for much of the day, whether in their cars during rush hour or in their cubicles at full-time jobs, and less physical activity means fewer calories burned. Human diets have also changed considerably over the 10,000 years since the advent of agriculture: they now include more processed foods that lack nutritional value and lead people to consume more sodium, sugar, and fat. These high-calorie, nutrient-deficient foods cause people to consume more calories than they burn. Fast food combined with decreased physical activity means that the "thrifty gene" that once benefited human predecessors now works against them, causing their bodies to store more fat and leading to higher levels of obesity in the population.
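
The energy-storage logic above can be made concrete with a back-of-the-envelope calculation. The roughly 7,700 kcal-per-kilogram figure for body fat is a common approximation, and the intake and expenditure numbers are illustrative assumptions, not dietary guidance; real weight change is not this linear because metabolism adapts.

```python
# Sketch of the "thrifty" energy balance: a sustained calorie surplus
# is stored as fat. Approximation only; bodies adapt over time.

KCAL_PER_KG_FAT = 7700  # approximate energy content of 1 kg of body fat

def fat_gain_kg(daily_intake, daily_expenditure, days):
    """Fat stored (kg) from a sustained daily surplus; negative = fat lost."""
    surplus = daily_intake - daily_expenditure
    return surplus * days / KCAL_PER_KG_FAT

# A modest 200 kcal/day surplus sustained for a year stores roughly 9.5 kg:
gain = fat_gain_kg(daily_intake=2400, daily_expenditure=2200, days=365)
print(f"fat stored over one year: {gain:.1f} kg")
```

A surplus that would have been a lifesaving buffer under feast-or-famine conditions accumulates steadily when food is always available.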

Obesity is one consequence of this mismatch. It is associated with a cluster of related health concerns known as "metabolic syndrome", including insulin resistance, in which the body no longer responds to insulin secretion and so cannot lower blood glucose levels, which can lead to type 2 diabetes.

Osteoporosis

Another human disorder that can be explained by mismatch theory is the rise of osteoporosis in modern humans. In advanced societies, many people, especially women, are remarkably susceptible to osteoporosis during aging. Fossil evidence suggests this was not always the case: bones from elderly hunter-gatherer women often show no evidence of osteoporosis. Evolutionary biologists have posited that the increase in osteoporosis in modern Western populations is likely due to considerably more sedentary lifestyles. Women in hunter-gatherer societies were physically active from a young age well into their late-adult lives, and this constant activity likely led to peak bone mass being considerably higher in hunter-gatherers than in modern-day humans. While the pattern of bone-mass loss during aging is reportedly the same for both groups, the higher peak bone mass associated with greater physical activity may have allowed hunter-gatherers to avoid osteoporosis during aging.

Hygiene hypothesis

The hygiene hypothesis, a concept initially proposed by immunologists and epidemiologists, has in recent studies been closely linked to evolutionary mismatch. The hygiene hypothesis states that the profound increase in allergies, autoimmune diseases, and other chronic inflammatory diseases is related to the immune system's reduced exposure to antigens. Such reduced exposure is more common in industrialized countries, and especially in urban areas, where chronic inflammatory diseases are also seen more frequently. Recent analyses have tied the hygiene hypothesis and evolutionary mismatch together: some researchers suggest that the overly sterilized urban environment changes or depletes the composition and diversity of the microbiota, and such conditions favor the development of chronic inflammatory diseases because human bodies were selected to suit a pathogen-rich environment over their evolutionary history. For example, studies have shown that changes in the symbiont community can disrupt immune homeostasis, which may explain why antibiotic use in early childhood is associated with higher asthma risk. Because the change or depletion of the microbiome is often associated with the hygiene hypothesis, the hypothesis is sometimes also called the "biome depletion theory".

Human behavior

Behavioral examples of evolutionary mismatch include the abuse of dopaminergic pathways and the reward system. An action or behavior that stimulates the release of dopamine, a neurotransmitter associated with pleasure, is likely to be repeated, since the brain is programmed to continually seek such pleasure. In hunter-gatherer societies, this reward system was beneficial for survival and reproductive success. But now, when there are far fewer challenges to survival and reproduction, certain activities in the present environment (gambling, drug use, eating) exploit the system, leading to addictive behaviors.

Anxiety

Mismatch theory holds that the brain evolved for environments unlike the current one and so retains old traits that can now work against human wellbeing. An immediate-return environment is one in which present decisions produce immediate results. Prehistoric human brains evolved for this kind of environment, generating reactions such as anxiety to solve short-term problems: the fear of a stalking predator, for example, prompts a human to run away, immediately ensuring safety as the distance from the predator increases. However, humans now live in a different, delayed-return environment, in which current decisions do not produce immediate results. The advancement of society has reduced external threats such as predators and lack of food or shelter, so human problems that once centered on immediate survival have shifted to how the present will affect the quality of future survival. In sum, traits like anxiety have become outdated: no longer under constant threat, humans instead worry about the future.

Work stress

Examples of evolutionary mismatch also occur in the modern workplace. Unlike our hunter-gatherer ancestors, who lived in small egalitarian societies, the modern worker operates in a large, complex, hierarchical workplace. Humans spend significant amounts of time interacting with strangers in conditions very different from those of our ancestral past. Hunter-gatherers do not separate work from their private lives; they have no bosses to be accountable to and no deadlines to adhere to. Our stress system reacts to immediate threats and opportunities, and the modern workplace exploits evolved psychological mechanisms aimed at immediate survival or longer-term reproduction. These basic instincts misfire in the modern workplace, causing conflicts at work, burnout, job alienation, and poor management practices.

Gambling

Two aspects of gambling make it an addictive activity: chance and risk. Chance gives gambling its novelty. When humans had to forage and hunt for food, novelty-seeking was advantageous, particularly for varying their diet; with the development of casinos, however, this pursuit of novelty has become disadvantageous. Risk assessment, the other behavioral trait involved in gambling, also benefited hunter-gatherers in the face of danger, but the risks hunter-gatherers had to assess were significantly different from, and more life-threatening than, the risks people now face. The attraction to gambling stems from the attraction to risk-and-reward activity.

Drug addiction

Herbivores have created selective pressure for plants to produce molecules that deter consumption, such as nicotine, morphine, and cocaine. Yet these plant-based drugs have reinforcing and rewarding effects on the human neurological system, suggesting a "paradox of drug reward" in humans. Behavioral evolutionary mismatch explains this contradiction between plant evolution and human drug use. Human ancestors lived in an environment that lacked drug use of the modern kind, so the dopaminergic system, or reward system, was used primarily to maximize survival and reproductive success. While drug use has been a common feature of past human populations, use of potent substances via diverse intake methods is a relatively recent feature of society. Present-day humans thus live in a world where the nature and availability of drugs render the reward system maladaptive: such drugs falsely signal a fitness benefit to the reward system, leaving people susceptible to drug addiction. The modern dopaminergic system is vulnerable to these changes in the accessibility and social perception of drugs.

Eating

In the era of foraging, hunter-gatherers rarely knew where their next meal would come from. This food scarcity rewarded consumption of high-energy meals so that excess energy could be stored as fat. Now that food is readily available, the neurological system that once conferred a survival advantage on essential eating has become disadvantageous, as it promotes overeating. This has become especially dangerous with the rise of processed foods, as foods with unnaturally high levels of sugar and fat have grown significantly in popularity.

Non-human examples

Evolutionary mismatch can occur any time an organism is exposed to an environment that does not resemble the typical environment the organism adapted in. Due to human influences, such as global warming and habitat destruction, the environment is changing very rapidly for many organisms, leading to numerous cases of evolutionary mismatch.

Examples with human influence

Sea turtles and light pollution

Female sea turtles create nests to lay their eggs by digging a pit on the beach, typically between the high tide line and dune, using their rear flippers. Consequently, within the first seven days of hatching, hatchling sea turtles must make the journey from the nest back into the ocean. This trip occurs predominantly at night in order to avoid predators and overheating.

Hatchling sea turtles must make their way back into the ocean.

To orient themselves towards the ocean, the hatchlings depend on their eyes to turn toward the brightest direction. On a natural, undeveloped beach, the open horizon of the ocean, illuminated by celestial light, tends to be much brighter than the dunes and vegetation. Studies propose two mechanisms in the eye for this phenomenon. The first, referred to as the "raster system", holds that sea turtles' eyes contain numerous light sensors that take in the overall brightness of a general area and "measure" where the light is most intense: if the sensors detect the most intense light on a hatchling's left side, the turtle turns left. A similar proposal, the complex phototropotaxis system, holds that the eyes contain light-intensity comparators that take in detailed information about the intensity of light from all directions; a sea turtle "knows" it is facing the brightest direction when the light intensity is balanced between both eyes.
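
A highly simplified sketch of the comparator idea can be written in a few lines. Everything here is an illustrative assumption: the light sources, their bearings and intensities, the side-weighting function, and the fixed turning rule are not taken from the cited studies.

```python
# Toy phototropotaxis model: a hatchling compares light intensity between
# its left and right sides and turns toward the brighter side until the
# two are balanced.

import math

def brightness_sides(heading, sources):
    """Summed intensity to the left and right of the heading (radians).
    Each source is (bearing, intensity); contribution is weighted by how
    far to the side it lies."""
    left = right = 0.0
    for bearing, intensity in sources:
        rel = math.atan2(math.sin(bearing - heading), math.cos(bearing - heading))
        if rel > 0:
            left += intensity * math.sin(rel)
        else:
            right += intensity * math.sin(-rel)
    return left, right

def orient(sources, heading=0.0, steps=100, turn=0.05):
    """Repeatedly turn a small fixed step toward the brighter side."""
    for _ in range(steps):
        left, right = brightness_sides(heading, sources)
        heading += turn if left > right else -turn
    return heading

# A single bright ocean horizon at bearing +pi/2 draws the hatchling there:
h = orient([(math.pi / 2, 1.0)])
# A brighter artificial light behind the dune (bearing -pi/2) misorients it:
h2 = orient([(math.pi / 2, 1.0), (-math.pi / 2, 3.0)])
```

Under this rule the hatchling settles facing whichever source dominates, which is exactly why an artificial light brighter than the horizon produces misorientation.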

This method of finding the ocean succeeds on natural beaches, but on developed beaches the intense artificial light from buildings, lighthouses, and even abandoned fires overwhelms the sea turtles and causes them to head toward the artificial light instead of the ocean; scientists call this misorientation. Sea turtles can also become disoriented and circle in the same place. Numerous cases show that misoriented hatchlings die from dehydration, are consumed by predators, or even burn to death in abandoned fires. The direct impact of light pollution on sea turtle numbers has been too difficult to measure, but the problem is exacerbated by the fact that all species of sea turtles are endangered. Other animals, including migratory birds and insects, are also victims of light pollution because they likewise depend on light intensity at night to orient themselves properly.

Dodo bird and hunting

The dodo was driven to extinction by hunting and introduced animals.

The dodo lived on the remote island of Mauritius in the absence of predators, and there it evolved to lose its instinct for fear and its ability to fly. This made it easy prey for the Dutch sailors who arrived on the island in the late 16th century. The sailors also brought foreign animals, such as monkeys and pigs, that ate the dodo's eggs, which was devastating to the population growth of the slow-breeding bird. The dodos' fearlessness made them easy targets, and their inability to fly left them no way to evade danger; thus they were driven to extinction within a century of their discovery.

The dodo's inability to fly was once beneficial because it conserved energy: with smaller pectoral muscles than flighted birds, the dodo had a lower maintenance metabolism. Its lack of a fear instinct conserved energy in another way, since the bird never had to expend energy on a stress response. Both energy-saving mechanisms were once advantageous because they let the dodo carry out its activities with minimal energy expenditure, but they proved disadvantageous when the island was invaded, leaving the bird defenseless against the new dangers that humans brought.

Peppered moths during the English Industrial Revolution

Before the English Industrial Revolution of the late 18th and early 19th centuries, the most common color form of the peppered moth was white with black speckles. That changed when the Industrial Revolution produced high levels of pollution: soot blackened the trees in urban regions, making the original form stand out far more to predators. Natural selection then favored the rare dark carbonaria form, which was better camouflaged against the darkened trees. The dark form's population expanded rapidly, and by the 1950s carbonaria frequencies had risen above 90% across much of England. The once-favorable speckled form had quickly become mismatched to the new environment.

However, in the late 1900s, English efforts to reduce air pollution allowed the trees to return to their natural shade. The dark carbonaria form then reverted from beneficial to disadvantageous, and once again the moth could not adapt as fast as its environment changed, so the carbonaria form became mismatched. The trees' return to their natural color made the original light form advantageous again, since it once more allowed the peppered moth to hide from predators.

Giant jewel beetle and beer bottles

The jewel beetle has a shiny, brown exterior similar to that of a beer bottle.

Evolutionary mismatch can also be seen among insects, as in the case of the giant jewel beetle (Julodimorpha bakewelli). The male jewel beetle has evolved to be attracted to features, including size, color, and texture, that allow it to identify a female jewel beetle as it flies across the desert. However, some beer bottles display these same physical traits, and males often find beer bottles more attractive than female jewel beetles because of the bottles' large size and coloring. Beer bottles discarded by humans in the Australian desert where the jewel beetle lives have thus created an environment in which male jewel beetles prefer to mate with bottles instead of females. This is extremely disadvantageous, as it reduces the reproductive output of the species. The situation is an evolutionary mismatch: a preference that evolved to aid reproduction has become disadvantageous because of an anthropogenic cause, littered beer bottles.

Examples without human influence

Information cascades between birds

A group of Nutmeg Mannikins at a bird feeder

Normally, gaining information from watching other organisms allows the observer to make good decisions without spending effort. More specifically, birds often observe the behavior of other organisms for valuable information, such as the presence of predators, good breeding sites, and optimal feeding spots. Although this lets the observer spend less effort gathering information, it can also lead to bad decisions if the observed information is unreliable. In the case of nutmeg mannikins, an observer can minimize the time spent looking for an optimal feeder, and maximize its feeding time, by watching where other mannikins feed. However, this relies on the assumption that the observed mannikins themselves had reliable information that the feeding spot was an ideal one. The behavior becomes maladaptive when prioritizing observed information leads to information cascades, in which birds follow the crowd even though prior experience may suggest that the crowd's decision is a poor one. For instance, if a nutmeg mannikin sees enough other mannikins feeding at a feeder, it has been shown to choose that feeder even when its personal experience indicates that the feeder is a poor one.
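
The cascade dynamic above can be sketched with a simple decision rule. This is a toy model loosely inspired by the behavior described, not the experimental protocol: the scoring rule, social weight, noise level, and starting counts are all illustrative assumptions.

```python
# Toy information cascade: each bird weighs its own noisy private estimate
# of feeder quality against how many birds already sit at each feeder.
# With enough early birds at the worse feeder, the crowd's pull can
# override private information and lock in the poor choice.

import random

def choose_feeder(private_quality, counts, social_weight=0.6):
    """Score each feeder as private estimate + social pull; pick the max."""
    scores = [q + social_weight * c for q, c in zip(private_quality, counts)]
    return max(range(len(scores)), key=lambda i: scores[i])

def simulate(n_birds=30, true_quality=(1.0, 0.2), noise=0.3, seed=1):
    random.seed(seed)
    counts = [0, 0]
    counts[1] = 3  # a few birds happen to start at the objectively worse feeder
    for _ in range(n_birds):
        # each bird's private, noisy estimate of the two feeders
        private = [q + random.gauss(0, noise) for q in true_quality]
        counts[choose_feeder(private, counts)] += 1
    return counts

counts = simulate()
print(f"final counts (good feeder, poor feeder): {counts}")
```

Even though every bird's private estimate favors feeder 0 on average, the early crowd at feeder 1 makes its social score dominate, and each arrival reinforces the cascade.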

House finches and the introduction of the MG disease

Evolutionary mismatch occurs in house finches when they are exposed to infectious individuals. Male house finches tend to feed close to finches that are sick or diseased, because sick individuals are less competitive than usual, making a healthy male more likely to win an aggressive interaction if one occurs. To reduce the chance of losing a social confrontation, healthy finches are thus inclined to forage near individuals made lethargic or listless by disease. This disposition became an evolutionary trap after the introduction of Mycoplasma gallisepticum (MG) in 1994: because the disease is infectious, healthy finches risk contracting it when they feed near diseased individuals. The disease was introduced too recently for the finches to have adapted to avoid sick individuals, which ultimately results in the mismatch between their behavior and the changed environment.

Exploitation of earthworm's reaction to vibrations

Worm charming is a practice in which people attract earthworms out of the ground, typically by driving in a wooden stake to vibrate the soil; it is commonly done to collect fishing bait and as a competitive sport. Worms that sense the vibrations rise to the surface. Research shows that humans are exploiting a trait the worms evolved to escape hungry burrowing moles that prey on them. This type of evolutionary trap, in which an originally beneficial trait is exploited to catch prey, was termed the "rare enemy effect" by the English evolutionary biologist Richard Dawkins. The trait has been exploited not only by humans but by other animals: herring gulls and wood turtles have also been observed stamping on the ground to drive worms to the surface and consume them.
