
Saturday, January 19, 2019

Molecular anthropology

From Wikipedia, the free encyclopedia

Molecular anthropology is a field of anthropology in which molecular analysis is used to determine evolutionary links between ancient and modern human populations, as well as between contemporary species. Generally, comparisons are made between sequences, either DNA or protein sequences; however, early studies used comparative serology.

By examining DNA sequences in different populations, scientists can determine the closeness of relationships between populations (or within populations). Certain similarities in genetic makeup let molecular anthropologists determine whether or not different groups of people belong to the same haplogroup, and thus if they share a common geographical origin. This is significant because it allows anthropologists to trace patterns of migration and settlement, which gives helpful insight as to how contemporary populations have formed and progressed over time.

Molecular anthropology has been extremely useful in establishing the evolutionary tree of humans and other primates, including closely related species like chimps and gorillas. While there are clearly many morphological similarities between humans and chimpanzees, for example, certain studies have also concluded that the DNA of the two species is roughly 98 percent identical. However, more recent studies have revised that figure to about 94 percent, showing that the genetic gap between humans and chimps is larger than originally thought. Such information is useful in searching for common ancestors and in coming to a better understanding of how humans evolved.

Haploid loci in molecular anthropology

Image of mitochondrion. There are many mitochondria within a cell, and DNA in them replicates independently of the chromosomes in the nucleus.
 
There are two continuous linkage groups in humans that are carried by a single sex. The first is the Y chromosome, which is passed from father to son; anatomical females carry a Y chromosome only rarely, as the result of a genetic defect. The other linkage group is the mitochondrial DNA (mtDNA), which is almost always passed to the next generation by females only, although under highly exceptional circumstances mtDNA can be passed through males. Under normal circumstances, neither the non-recombinant portion of the Y chromosome nor the mtDNA undergoes productive recombination. Part of the Y chromosome can recombine with the X chromosome, and within ape history the boundary of that region has changed; still, such recombinant changes in the non-recombinant region of Y are extremely rare.

Mitochondrial DNA

Illustration of the human mitochondrial DNA with the control region (CR, in grey) containing hypervariable sequences I and II.
 
Mitochondrial DNA became an area of research in phylogenetics in the late 1970s. Unlike genomic DNA, it offered the advantage of not undergoing recombination. The process of recombination, if frequent enough, corrupts the ability to create parsimonious trees by scrambling stretches of single-nucleotide substitutions (SNPs). When looking between distantly related species, recombination is less of a problem, since recombination between branches from common ancestors is prevented once true speciation occurs. When examining closely related species, or branching within species, recombination creates a large number of 'irrelevant SNPs' for cladistic analysis. Through the process of organelle division, mtDNA became effectively clonal: very little, and often none, of the paternal mtDNA is passed on, and while recombination may occur within mtDNA, there is little risk that it will be transmitted to the next generation. As a result, mtDNA lineages are clonal copies of each other except where a new mutation has arisen, so mtDNA does not have the pitfalls of autosomal loci when studied in interbreeding groups. Another advantage of mtDNA is that its hypervariable regions evolve very quickly, while certain other regions of mitochondrial DNA approach neutrality. This allowed the use of mitochondrial DNA to determine that the human population is relatively young, having gone through a recent constriction at about 150,000 years ago.

Mitochondrial DNA has also been used to verify the proximity of chimpanzees to humans relative to gorillas, and to verify the relationship of these three species relative to the orangutan. 

A population bottleneck, as illustrated, was detected by intra-human mtDNA phylogenetic studies; the length of the bottleneck itself cannot be determined from mtDNA alone.
 
More recently, the mtDNA genome has been used to estimate branching patterns among peoples around the world, such as when and how the New World was settled. The problem with these studies has been their heavy reliance on mutations in the coding region. Researchers have increasingly found that as humans spread from Africa's south-eastern regions, more mutations accumulated in the coding region than expected; in the passage to the New World, some groups are believed to have moved from the Asian tropics to Siberia, to an ancient land region called Beringia, and then to have migrated quickly to South America. Many of these mtDNAs carry far more mutations, and at rarely mutated coding sites, than would be expected under neutral mutation.

Mitochondrial DNA offers another advantage over autosomal DNA. While there are generally 2 to 4 copies of each chromosome in each cell (1 to 2 from each parent), there can be dozens to hundreds of mtDNA copies per cell. This increases the number of copies of each mtDNA locus by at least an order of magnitude. For ancient DNA, in which the DNA is highly degraded, this abundance of copies is helpful in extending and bridging short fragments together, and it decreases the amount of bone that must be extracted from highly valuable fossil or ancient remains. Unlike the Y chromosome, both male and female remains carry mtDNA in roughly equal quantities.

Schematic of typical animal cell, showing subcellular components. Organelles: (1) nucleolus (2) nucleus (9) mitochondria

Y chromosome

Illustration of human Y chromosome
 
The Y chromosome is found in the nucleus of normal cells (nuclear DNA). Unlike mtDNA, its mutations in the non-recombinant portion (NRY) of the chromosome are spaced widely apart, so far apart that finding mutations on new Y chromosomes is labor-intensive compared with mtDNA. Many studies rely instead on tandem repeats; however, tandem repeats can expand and retract rapidly, and in somewhat predictable patterns. The Y chromosome tracks only male lines and is not found in females, whereas mtDNA can be traced in males even though males fail to pass it on. In addition, it has been estimated that effective breeding populations in the prehistoric period typically comprised about two females per male, and recent studies show that cultural hegemony plays a large role in the transmission of Y. This has created discordance between males and females for the time to the most recent common ancestor (TMRCA): estimates of the Y TMRCA range from one-quarter to less than one-half of the mtDNA TMRCA. It is unclear whether this is due to high male-to-female ratios in the past coupled with repeated migrations from Africa, to a change in mutation rate, or even, as some have proposed, to females of the chimp-human LCA population continuing to pass DNA millions of years after males had ceased to do so. At present the best evidence suggests that during migration the male-to-female ratio in humans may have declined, trimming Y diversity on multiple occasions both within and outside of Africa.

Diagram of human X chromosome showing genetic map
 
For short-range molecular phylogenetics and molecular clocking, the Y chromosome is highly effective and provides a second perspective. One debate that arose concerned the Maori: by mtDNA they appear to have migrated from eastern China or Taiwan, but by Y chromosome from the Papua New Guinea region. When HLA haplotypes were used to evaluate the two hypotheses, it emerged that both were right and that the Maori are an admixed population. Such admixtures appear to be common in the human population, so the use of a single haploid locus can give a biased perspective.

X-linked studies

The X chromosome is also a form of nuclear DNA. Since it is found as one copy in males and as two non-identical chromosomes in females, it has a ploidy of 1.5. However, in humans the effective ploidy is somewhat higher, about 1.7, because females in the breeding population have tended to outnumber males by about 2:1 during a large portion of human prehistory. Like mtDNA, X-linked DNA therefore emphasizes female population history much more than male. There have been several studies of loci on the X chromosome; in total, 20 sites have been examined, including PDHA1, Xq21.3, Xq13.3, Zfx, Fix, Il2rg, Plp, Gk, Ids, Alas2, Rrm2p4, AmeIX, Tnfsf5, Licam, and Msn. The time to most recent common ancestor (TMRCA) at these loci ranges from recently fixed to about 1.8 million years, with a median around 700 ky. These studies roughly fit the expected fixation distribution of alleles, given the linkage disequilibrium between adjacent sites. For some alleles the point of origin is elusive; for others it points toward Sub-Saharan Africa. There are some distinctions within Sub-Saharan Africa that suggest a smaller region, but sample sizes and coverage are not adequate to define a place of most recent common ancestry. The TMRCA is consistent with, and extends, the bottleneck implied by mtDNA, with confidence to about 500,000 years.
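The effective-ploidy arithmetic above can be made explicit. Below is a minimal sketch, assuming (as the text does) a breeding pool of roughly two females per male; the function name is illustrative only:

```python
# Effective ploidy of the X chromosome: the average number of X copies
# per breeding individual, weighted by the breeding sex ratio.

def x_effective_ploidy(females_per_male: float) -> float:
    """Females carry 2 X copies, males carry 1."""
    x_copies = females_per_male * 2 + 1      # copies in a group of f females + 1 male
    individuals = females_per_male + 1
    return x_copies / individuals

print(x_effective_ploidy(1.0))   # 1:1 sex ratio      -> 1.5
print(x_effective_ploidy(2.0))   # 2 females per male -> ~1.67, the ~1.7 cited above
```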

Autosomal loci

Diagram of human karyotype

Ancient DNA sequencing

Neandertal mtDNA has been sequenced by Krings et al., and the sequence similarity indicates an equally recent origin from a small population on the Neanderthal branch of late hominids. The MCR1 gene has also been sequenced, but the results are controversial, with one study claiming that contamination issues cannot be distinguished from genuine human-Neandertal similarities. Critically, however, no DNA sequence has been obtained from Homo erectus, Homo floresiensis, or any of the other late hominids. Some of the ancient sequences obtained likely contain errors, and proper controls are needed to avoid contamination.

Comparison of differences between human and Neanderthal mtDNA

Causes of errors

Molecular phylogenetics is based on quantifying substitutions and then comparing sequences with those of other species, and there are several points in the process that create errors. The first and greatest challenge is finding "anchors" that allow the researcher to calibrate the system. In this example, there are 10 mutations between chimp and human, but the researcher has no known fossils that are agreeably ancestral to both yet not ancestral to the next species in the tree, the gorilla. However, there are fossils believed to be ancestral to both orangutans and humans, from about 14 million years ago. The researcher can therefore use the orangutan-human comparison, which yields a difference of 24, and estimate the rate as 24 / (14 × 2) ≈ 0.857 mutations per million years for the stretch of sequence; the "2" accounts for the two branches, to human (14 My) and to orangutan (14 My), from their last common ancestor (LCA). Mutation rates are given, however, as rates per nucleotide (nt) site, so if the sequence were, say, 100 nt in length, that rate would be 0.00857 per nt per million years. Dating the human-chimp split then gives 10 mutations / (100 nt × 0.00857 × 2) ≈ 5.8 million years.
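The worked example above can be written out as a short calculation. The following is a minimal sketch in Python using the numbers assumed in the text (a 14-million-year-old human-orangutan anchor, 24 anchor differences, 10 human-chimp differences, a 100 nt stretch); the function names are illustrative, not from any published tool:

```python
# Calibrate a molecular clock from an anchored species pair, then date a
# second pair. Rates are per nucleotide site per million years; the factor
# of 2 accounts for mutations accumulating along BOTH branches from the LCA.

def calibrated_rate(differences, seq_len_nt, anchor_age_my):
    return differences / (seq_len_nt * 2 * anchor_age_my)

def divergence_time_my(differences, seq_len_nt, rate_per_nt_my):
    return differences / (seq_len_nt * 2 * rate_per_nt_my)

rate = calibrated_rate(24, 100, 14.0)      # ~0.00857 per nt per million years
t = divergence_time_my(10, 100, rate)      # ~5.8 million years, as in the text
print(f"rate = {rate:.5f}/nt/My, estimated human-chimp split = {t:.1f} My")
```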

Problem of calibration

There are several problems not seen in the above. First, mutations occur as random events. Second, the chance that any one site in the genome varies differs from that of the next site; a very good example is the codons for amino acids, where the first two nt in a codon may mutate at 1 per billion years but the third nt may mutate at 1 per million years. Unless scientists study the sequences of a great many animals, particularly those close to the branch being examined, they generally do not know what the rate of mutation is for a given site. Mutations do occur at the 1st and 2nd positions of codons, but in most cases these mutations are under negative selection and so are removed from the population over short periods of time. Defining the rate of evolution in the anchor also runs into the problem that random mutation creates: for example, a rate of 0.005 or 0.010 can also explain 24 mutations, according to the binomial probability distribution. Some of the mutations that occurred between the two species may have reverted, hiding an initially higher rate. Selection may play into this as well: a rare mutation may be selected for at point X in time, but later the climate may change or the species may migrate so that it is no longer favored, and pressure is exerted on new mutations that revert the change; the greater the distance between two species, the more likely such reversions are to occur. In addition, both species may independently mutate a site from the ancestral state to the same nucleotide. Many of these cases can be resolved by obtaining DNA samples from species on intermediate branches, creating a parsimonious tree in which the order of mutation can be deduced and a branch-length diagram produced; that diagram then yields a more accurate estimate of the mutations between two species. Statistically, one can assign variance based on the problems of randomness, back-mutation, and parallel mutation (homoplasy) in constructing an error range.
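The binomial point above can be illustrated numerically: under the same assumed setup as the anchor example (100 sites, 28 million years of total branch length), rates well away from 0.00857 still assign the observed 24 substitutions a non-negligible probability. A minimal sketch:

```python
# Likelihood of observing 24 substitutions across 100 sites under different
# true per-site rates, treating each site as one Bernoulli trial over the
# 28 My of total branch length (2 branches x 14 My).
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

n_sites, observed, total_branch_my = 100, 24, 28.0
for rate in (0.005, 0.00857, 0.010):           # per nt per million years
    p_site = rate * total_branch_my            # per-site substitution probability
    print(f"rate {rate:.5f}: P(24 of 100 sites) = "
          f"{binom_pmf(observed, n_sites, p_site):.4f}")
```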

There is another problem in calibration, however, that has defied statistical analysis: the true/false assignment of a fossil to a last common ancestor. In reality, the odds that a chosen fossil anchor actually is the last common ancestor of two extant species are low; often the fossil already lies in one branch (underestimating the age), lies in a third branch (underestimating the age), or, if it does belong to the LCA species, may be millions of years older than the branch point. To date, the only way to assess this variance is to apply molecular phylogenetics to the species claimed to be branch points. This, however, only identifies the 'outlying' anchor points, and since more abundant fossils are more likely to be younger than the branch point, an outlying fossil may simply be a rare older representative. These unknowns create uncertainty that is difficult to quantify, and quantification is often not attempted.

Recent papers have been able to estimate this variance, roughly. The general trend, as new fossils are discovered, is that the older fossils had underestimated the age of the branch point. In addition, the dating of fossils has had a history of errors, and there have been many revised datings; the ages assigned by researchers to some major branch points have almost doubled over the last 30 years. An excellent example of this is the debate over LM3 (Lake Mungo 3) in Australia. It was originally dated to around 30 ky by carbon dating; carbon dating, however, has problems for samples over 20 ky in age, and severe problems for samples around 30 ky in age. Another study examined the fossil and estimated its age at 62 ky.

At the point where one has an estimate of the mutation rate, then, given the above, there are two sources of variance that need to be cross-multiplied to generate an overall variance. This is infrequently done in the literature.
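One way to make this cross-multiplication concrete is Monte Carlo propagation: sample the calibration age and the substitution count from their respective error distributions and examine the spread of the resulting date. The sketch below uses purely illustrative error distributions (a ±2 My Gaussian on the fossil age, binomial resampling of the counts); nothing here comes from a published analysis:

```python
# Propagate calibration error and mutation-count randomness into the
# divergence-time estimate by simulation.
import random

N = 100_000
estimates = []
for _ in range(N):
    anchor_age = random.gauss(14.0, 2.0)                  # assumed fossil-age error (My)
    rate = 24 / (100 * 2 * anchor_age)                    # recalibrated rate per nt/My
    k = sum(random.random() < 0.10 for _ in range(100))   # resampled count, true p = 0.10
    estimates.append(k / (100 * 2 * rate))

estimates.sort()
mid = estimates[N // 2]
lo, hi = estimates[int(0.025 * N)], estimates[int(0.975 * N)]
print(f"median ~ {mid:.1f} My, 95% interval ~ ({lo:.1f}, {hi:.1f}) My")
```

The combined interval is noticeably wider than either error source alone would produce, which is the point of the paragraph above.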

Problems in estimating TMRCA

Estimates of the time to most recent common ancestor (TMRCA) combine the errors in calibration with the errors in determining the age of a local branch.

History

Protein era

Structure of human hemoglobin. Hemoglobins from dozens of animals and even plants were sequenced in the 1960s and early 1970s
 
With DNA newly established as the genetic material, protein sequencing began to take off in the early 1960s, starting with cytochrome C and hemoglobin. Gerhard Braunitzer sequenced hemoglobin and myoglobin; in total, hundreds of sequences from wide-ranging species were determined. In 1967 A.C. Wilson began to promote the idea of a "molecular clock". By 1969 molecular clocking was applied to anthropoid evolution, and V. Sarich and A.C. Wilson found that albumin and hemoglobin have comparable rates of evolution, indicating that chimps and humans split about 4 to 5 million years ago. In 1970, Louis Leakey confronted this conclusion, arguing that the molecular clocks had been improperly calibrated. By 1975, protein sequencing and comparative serology combined were used to propose that humans' closest living relative (as a species) was the chimpanzee. In hindsight, the last common ancestor (LCA) of humans and chimps appears to be older than the Sarich and Wilson estimate, but not as old as Leakey claimed either. However, Leakey was correct about the divergence of Old and New World monkeys, for which the value Sarich and Wilson used was a significant underestimate. This error in predictive capability highlights a common theme.

DNA era

Restriction fragment length polymorphism studies cut mtDNA into fragments; later, the focus of PCR would be on the D-loop ('control region'), at the top of the circle

RFLP and DNA hybridization

In 1979, W.M. Brown and Wilson began looking at the evolution of mitochondrial DNA in animals and found that it was evolving rapidly. The technique they used was restriction fragment length polymorphism (RFLP), which was more affordable at the time than sequencing. In 1980, W.M. Brown, looking at the relative variation between humans and other species, recognized that there had been a recent constriction (180,000 years ago) in the human population. A year later, Brown and Wilson were looking at RFLP fragments and determined that the human population had expanded more recently than other ape populations. In 1984 the first DNA sequence from an extinct animal was obtained. Sibley and Ahlquist applied DNA-DNA hybridization technology to anthropoid phylogeny and saw the pan/human split as closer than the gorilla/pan or gorilla/human splits, a highly controversial claim; in 1987, however, they were able to support it. Also in 1987, Cann, Stoneking and Wilson suggested, by RFLP analysis of human mitochondrial DNA, that humans evolved from a constriction in Africa, traced through a single female in a small population of roughly 10,000 individuals, about 200,000 years ago.

Era of PCR

PCR could rapidly amplify DNA from one molecule to billions, allowing sequencing from human hairs or ancient DNA.
 
In 1987, PCR amplification of mtDNA was first used to determine sequences. In 1991 Vigilant et al. published the seminal work on mtDNA phylogeny implicating sub-Saharan Africa as the place of humans' most recent common ancestor for all mtDNAs. The war between out-of-Africa and multiregionalism, already simmering with the critiques of Alan Templeton, soon escalated, with paleoanthropologists such as Milford Wolpoff getting involved. In 1995, F. Ayala published his critical Science article "The Myth of Eve", which relied on HLA-DR sequence. At the time, however, Ayala was not aware of the rapid evolution of HLA loci by recombinatory processes. In 1996, Parham and Ohta published their findings on the rapid evolution of HLA by short-distance recombination ('gene conversion' or 'abortive recombination'), weakening Ayala's claim (Parham had actually written a review a year earlier, but it had gone unnoticed). A stream of papers followed from both sides, many with highly flawed methods and sampling. One of the more interesting was Harris and Hey (1998), which showed that the TMRCA (time to most recent common ancestor) for the PDHA1 gene was well in excess of 1 million years. Given a ploidy at this locus of 1.5 (3-fold higher than mtDNA), the TMRCA was more than double the expectation. While this falls within the 'fixation curve' for ploidy 1.5 (averaging 2 females and 1 male), the suggested age of 1.8 My is close to a significantly deviant p-value for the population size, possibly indicating that the human population shrank or split off from another population. Oddly, the next X-linked locus they examined, Factor IX, showed a TMRCA of less than 300,000 years.
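The "more than double the expectation" remark can be unpacked using the standard neutral expectation that a locus's TMRCA scales with its effective ploidy (mtDNA ~0.5, X-linked ~1.5). The sketch below takes the mtDNA TMRCA of about 170 kya (listed in the loci table later in this article) as the baseline; the linear-scaling assumption is the textbook coalescent one, not a claim from Harris and Hey:

```python
# Under neutrality, expected TMRCA scales with effective ploidy
# (gene copies per breeding pair): mtDNA ~0.5, X-linked ~1.5.

MTDNA_TMRCA_KY = 170.0            # baseline mtDNA TMRCA (~170 kya)
MTDNA_PLOIDY, X_PLOIDY = 0.5, 1.5

expected_x_ky = MTDNA_TMRCA_KY * (X_PLOIDY / MTDNA_PLOIDY)   # 3x the mtDNA value
observed_pdha1_ky = 1800.0                                   # ~1.8 My (Harris and Hey)
print(f"expected X-linked TMRCA ~ {expected_x_ky:.0f} ky")
print(f"observed PDHA1 TMRCA ~ {observed_pdha1_ky:.0f} ky, "
      f"{observed_pdha1_ky / expected_x_ky:.1f}x the expectation")
```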

Cross-linked DNA extracted from the 4,000-year-old liver of an Ancient Egyptian priest called Nekht-Ankh

Ancient DNA

Ancient DNA sequencing had been conducted only on a limited scale until the late 1990s, when staff at the Max Planck Institute shocked the anthropology world by sequencing DNA from an estimated 40,000-year-old Neanderthal. The result of that experiment was that, relative to the differences among humans living in Europe, many of whom derive from haplogroup H (CRS), Neandertals branched from the modern human line more than 300,000 years before haplogroup H reached Europe. While mtDNA and other studies continued to support a unique recent African origin, this new study essentially answered critiques from the Neanderthal side.

Genomic sequencing

Significant progress has been made in genomic sequencing since Ingman and colleagues published their findings on the mitochondrial genome. Several papers on genomic mtDNA have been published; there is considerable variability in the rate of evolution, and rate variation and selection are evident at many sites. In 2007, Gonder et al. proposed that a core population of humans, with the greatest level of diversity and the lowest selection, once lived in the region of Tanzania and proximal parts of southern Africa; as humans left this part of Africa, their mitochondria evolved under selection in the new regions.

Critical progress

Critical developments in the history of molecular anthropology:
  • That molecular phylogenetics could compete with comparative anthropology for determining the proximity of species to humans.
  • Wilson and King realized in 1975 that, while the amounts of molecular evolution along the branches from the putative LCA to chimp and to human were equal, there was an inequity in morphological evolution; comparative morphology based on fossils could therefore be biased by different rates of change.
  • The realization that DNA provides multiple independent comparisons. Two techniques, mtDNA and hybridization, converge on a single answer: chimps, as a species, are most closely related to humans.
  • The ability to resolve population sizes based on the 2N rule, proposed by Kimura in the 1950s, and to use that information to compare relative population sizes, reaching a conclusion about abundance that contrasted with observations based on the paleontological record: human fossils from the early and middle stone age are far more abundant than chimpanzee or gorilla fossils, of which there are few unambiguous examples from the same period.
Loci that have been used in molecular phylogenetics:
Cytochrome C
Serum albumin
Hemoglobin - Braunitzer, 1960s; Harding et al., 1997
Mitochondrial D-loop - Wilson group, 1980, 1981, 1984, 1987, 1989, 1991 (posthumously) - TMRCA about 170 kya.
Y-chromosome
HLA-DR - Ayala 1995 - TMRCA for locus is 60 million years.
CD4 (Intron) - Tishkoff, 1996 - most of the diversity is in Africa.
PDHA1 (X-linked) Harris and Hey - TMRCA for locus greater than 1.5 million years.
X-linked loci: PDHA1, Xq21.3, Xq13.3, Zfx, Fix, Il2rg, Plp, Gk, Ids, Alas2, Rrm2p4, AmeIX, Tnfsf5, Licam, and Msn
 
Autosomal: Numerous.

Science and technology studies

From Wikipedia, the free encyclopedia

Science and technology studies, or science, technology and society studies (both abbreviated STS) is the study of how society, politics, and culture affect scientific research and technological innovation, and how these, in turn, affect society, politics and culture.

History

Like most interdisciplinary programs, STS emerged from the confluence of a variety of disciplines and disciplinary subfields, all of which had developed an interest—typically, during the 1960s or 1970s—in viewing science and technology as socially embedded enterprises. The key disciplinary components of STS took shape independently, beginning in the 1960s, and developed in isolation from each other well into the 1980s, although Ludwik Fleck's (1935) monograph Genesis and Development of a Scientific Fact anticipated many of STS's key themes. In the 1970s Elting E. Morison founded the STS program at Massachusetts Institute of Technology (MIT), which served as a model. By 2011, 111 STS research centres and academic programs were counted worldwide.

Key themes

  • History of technology, which examines technology in its social and historical context. Starting in the 1960s, some historians questioned technological determinism, a doctrine that can induce public passivity toward technological and scientific "natural" development. At the same time, some historians began to develop similarly contextual approaches to the history of medicine.
  • History and philosophy of science (1960s). After the publication of Thomas Kuhn's well-known The Structure of Scientific Revolutions (1962), which attributed changes in scientific theories to changes in underlying intellectual paradigms, programs were founded at the University of California, Berkeley and elsewhere that brought historians of science and philosophers together in unified programs.
  • Science, technology, and society. In the mid- to late-1960s, student and faculty social movements in the U.S., UK, and European universities helped to launch a range of new interdisciplinary fields (such as women's studies) that were seen to address relevant topics that the traditional curriculum ignored. One such development was the rise of "science, technology, and society" programs, which are also—confusingly—known by the STS acronym. Drawn from a variety of disciplines, including anthropology, history, political science, and sociology, scholars in these programs created undergraduate curricula devoted to exploring the issues raised by science and technology. Unlike scholars in science studies, history of technology, or the history and philosophy of science, they were and are more likely to see themselves as activists working for change rather than dispassionate, "ivory tower" researchers. As an example of the activist impulse, feminist scholars in this and other emerging STS areas addressed themselves to the exclusion of women from science and engineering.
  • Science, engineering, and public policy studies emerged in the 1970s from the same concerns that motivated the founders of the science, technology, and society movement: A sense that science and technology were developing in ways that were increasingly at odds with the public's best interests. The science, technology, and society movement tried to humanize those who would make tomorrow's science and technology, but this discipline took a different approach: It would train students with the professional skills needed to become players in science and technology policy. Some programs came to emphasize quantitative methodologies, and most of these were eventually absorbed into systems engineering. Others emphasized sociological and qualitative approaches, and found that their closest kin could be found among scholars in science, technology, and society departments.
During the 1970s and 1980s, leading universities in the US, UK, and Europe began drawing these various components together in new, interdisciplinary programs. For example, in the 1970s, Cornell University developed a new program that united science studies and policy-oriented scholars with historians and philosophers of science and technology. Each of these programs developed unique identities due to variation in the components that were drawn together, as well as their location within the various universities. For example, the University of Virginia's STS program united scholars drawn from a variety of fields (with particular strength in the history of technology); however, the program's teaching responsibilities—it is located within an engineering school and teaches ethics to undergraduate engineering students—means that all of its faculty share a strong interest in engineering ethics.

The "turn to technology" (and beyond)

A decisive moment in the development of STS was the mid-1980s addition of technology studies to the range of interests reflected in science studies. During that decade, two works appeared in quick succession that signaled what Steve Woolgar was to call the "turn to technology": The Social Shaping of Technology (MacKenzie and Wajcman, 1985) and The Social Construction of Technological Systems (Bijker, Hughes and Pinch, 1987). MacKenzie and Wajcman primed the pump by publishing a collection of articles attesting to the influence of society on technological design. In a seminal article, Trevor Pinch and Wiebe Bijker attached all the legitimacy of the Sociology of Scientific Knowledge to this development by showing how the sociology of technology could proceed along precisely the theoretical and methodological lines established by the sociology of scientific knowledge. This was the intellectual foundation of the field they called the social construction of technology.

The "turn to technology" helped to cement an already growing awareness of underlying unity among the various emerging STS programs. More recently, there has been an associated turn to ecology, nature, and materiality in general, whereby the socio-technical and natural/material co-produce each other. This is especially evident in work in STS analyses of biomedicine (such as Carl May, Annemarie Mol, Nelly Oudshoorn, and Andrew Webster) and ecological interventions (such as Bruno Latour, Sheila Jasanoff, Matthias Gross, S. Lochlann Jain, and Jens Lachmund).

Professional associations

The subject has several professional associations. 

Founded in 1975, the Society for Social Studies of Science initially provided scholarly communication facilities, including a journal (Science, Technology, and Human Values) and annual meetings that were mainly attended by science studies scholars. The society has since grown into the most important professional association of science and technology studies scholars worldwide. Its members also include government and industry officials concerned with research and development as well as science and technology policy; scientists and engineers who wish to better understand the social embeddedness of their professional practice; and citizens concerned about the impact of science and technology in their lives. Proposals have been made to add the word "technology" to the association's name, thereby reflecting its stature as the leading STS professional society, but there seems to be widespread sentiment that the name is long enough as it is.

In Europe, the European Association for the Study of Science and Technology (EASST) was founded in 1981 to "stimulate communication, exchange and collaboration in the field of studies of science and technology". Similarly, the European Inter-University Association on Society, Science and Technology (ESST) researches and studies science and technology in society, in both historical and contemporary perspectives. 

In Asia several STS associations exist. In Japan, the Japanese Society for Science and Technology Studies (JSSTS) was founded in 2001. The Asia Pacific Science Technology & Society Network (APSTSN) primarily has members from Australasia, Southeast and East Asia and Oceania. 

In Latin America ESOCITE (Estudios Sociales de la Ciencia y la Tecnología) is the biggest association of science and technology studies. The study of STS (CyT in Spanish, CTS in Portuguese) here was shaped by authors like Amílcar Herrera, Jorge Sabato, and Oscar Varsavsky in Argentina, José Leite Lopes in Brazil, Miguel Wionczek in Mexico, Francisco Sagasti in Peru, Máximo Halty Carrere in Uruguay and Marcel Roche in Venezuela.

Founded in 1958, the Society for the History of Technology initially attracted members from the history profession who had interests in the contextual history of technology. After the "turn to technology" in the mid-1980s, the society's well-regarded journal (Technology and Culture) and its annual meetings began to attract considerable interest from non-historians with technology studies interests.

Less identified with STS, but also of importance to many STS scholars, are the History of Science Society, the Philosophy of Science Association, and the American Association for the History of Medicine.

Additionally, within the US there are significant STS-oriented special interest groups within major disciplinary associations, including the American Anthropological Association, the American Political Science Association, the National Women's Studies Association, and the American Sociological Association.

Journals

Notable peer-reviewed journals in STS include Social Studies of Science, Science, Technology, and Human Values, and Technology and Culture.
Student journals in STS include:
  • Intersect: the Stanford Journal of Science, Technology, and Society at Stanford
  • DEMESCI: International Journal of Deliberative Mechanisms in Science
  • The Science In Society Review: A Production of the Triple Helix at Cornell
  • Synthesis: An Undergraduate Journal of the History of Science at Harvard

Important concepts

STS social construction

Social constructions are human-created ideas, objects, or events, brought into being by a series of choices and interactions. These interactions have consequences that change how different groups of people perceive these constructs. Some examples of social construction include class, race, money, and citizenship.

The following also alludes to the notion that not everything is set: a circumstance or result could potentially be one way or the other. According to the article "What is Social Construction?" by Laura Flores, "Social construction work is critical of the status quo. Social constructionists about X tend to hold that:
  1. X need not have existed, or need not be at all as it is. X, or X as it is at present, is not determined by the nature of things; it is not inevitable
Very often they go further, and urge that:
  2. X is quite as bad as it is.
  3. We would be much better off if X were done away with, or at least radically transformed."
In the past, there have been viewpoints that were widely regarded as fact until they were called into question by the introduction of new knowledge. One such viewpoint is the past concept of a correlation between intelligence and a human's ethnicity or race (X may not be at all as it is).

An example of the evolution and interaction of various social constructions within science and technology can be found in the development of both the high-wheel bicycle, or velocipede, and then of the bicycle. The velocipede was widely used in the latter half of the 19th century, when a social need was first recognized for a more efficient and rapid means of transportation. The velocipede was developed to meet it: by replacing the front wheel with a wheel of larger radius, it could reach higher translational velocities than the smaller non-geared bicycles of the day. One notable trade-off was decreased stability, leading to a greater risk of falling; many riders got into accidents by losing their balance while riding or by being thrown over the handlebars.

The first "social construction" or progress of the velocipede caused the need for a newer "social construction" to be recognized and developed into a safer bicycle design. Consequently, the velocipede was then developed into what is now commonly known as the "bicycle" to fit within society's newer "social construction," the newer standards of higher vehicle safety. Thus the popularity of the modern geared bicycle design came as a response to the first social construction, the original need for greater speed, which had caused the high-wheel bicycle to be designed in the first place. The popularity of the modern geared bicycle design ultimately ended the widespread use of the velocipede itself, as eventually it was found to best accomplish the social-needs/ social-constructions of both greater speed and of greater safety.

Technoscience

Technoscience is a subset of science, technology, and society studies that focuses on the inseparable connection between science and technology. It holds that the fields are linked and grow together, and that scientific knowledge requires an infrastructure of technology in order to remain stationary or move forward. Both technological development and scientific discovery drive one another toward further advancement. Technoscience excels at shaping human thought and behavior by opening up new possibilities that gradually, or quickly, come to be perceived as necessities.

Technosocial

"Technological action is a social process." Social factors and technology are intertwined so that they are dependent upon each other. This includes the aspect that social, political, and economic factors are inherent in technology and that social structure influences what technologies are pursued. In other words, "technoscientific phenomena combined inextricably with social/political/ economic/psychological phenomena, so 'technology' includes a spectrum of artifacts, techniques, organizations, and systems." Winner expands on this idea by saying "in the late twentieth century technology and society, technology and culture, technology and politics are by no means separate."

Examples

  • Ford Pinto – Ford Motor Company produced and sold the Pinto during the 1970s. A design flaw in the rear gas tank could cause a fiery explosion upon rear impact, and exploding fuel tanks killed and injured hundreds of people. Internal documents of test results proved that Ford CEO Lee Iacocca and the engineers were aware of the flaw. The company decided against improving the technology because of profit-driven motives, strict internal control, and competition from foreign manufacturers such as Volkswagen. Ford conducted a cost-benefit analysis to determine whether altering the Pinto model was feasible; an analysis by Ford employees argued against a new design because of the increased cost. Employees were also under tight control by the CEO, who rushed the Pinto through production lines to increase profits. Ford finally changed the design after public scrutiny, and safety organizations later influenced this technology by requiring stricter safety standards for motor vehicles.
  • DDT/toxins – DDT was a common and highly effective insecticide used from the 1940s until its ban in the early 1970s. It was utilized during World War II to combat insect-borne human diseases that plagued military members and civilian populations, and people and companies soon realized other benefits of DDT for agricultural purposes. Rachel Carson became worried about the effects of widespread use on public health and the environment, and her book Silent Spring left an imprint on the industry by claiming linkage of DDT to many serious illnesses, such as cancer. Carson's book drew criticism from chemical companies, who felt their reputation and business were threatened by such claims. DDT was eventually banned by the United States Environmental Protection Agency (EPA) after a long and arduous process of research on the chemical substance. The main cause of DDT's removal was the public deciding that its benefits did not outweigh the potential health risks.
  • Autopilots/computer-aided tasks (CATs) – From a safety point of view, making a task more computer-driven favors technological advance, because a computer requires less reaction time and makes fewer computational errors than a human pilot. Because of reduced error and reaction times, flights using autopilot have been shown to be safer on average. The technology thus has a direct impact on people by increasing their safety, and society affects the technology because people, wanting to be safer, constantly try to improve the autopilot systems.
  • Cell phones – Cell phone technology emerged in the early 1920s, after advancements were made in radio technology. Engineers at Bell Laboratories, the research and development division of AT&T, discovered that cell towers can transmit and receive signals to and from many directions. This discovery revolutionized the capabilities and outcomes of cellular technology, which improved further once mobile phone users could communicate outside of a designated area. First-generation mobile phones were first created and sold by Motorola, and their phone was intended only for use in cars. Second-generation mobile phone capabilities continued to improve with the switch to digital: phones were faster, which enhanced customers' communication capabilities, and they were sleeker and weighed less than the bulky first-generation technology. These technological advances boosted customer satisfaction and broadened cell phone companies' customer bases. Third-generation technology changed the way people interact with each other, giving customers access to Wi-Fi, texting, and other applications. Mobile phones are now entering the fourth generation. Cellular and mobile phones revolutionized the way people socialize and communicate, helping to establish modern social structure. People have affected the development of this technology by demanding features such as larger screens, touch capabilities, and internet accessibility.
  • Internet – The internet arose from extensive research on ARPANET between various universities, corporations, and ARPA (Advanced Research Projects Agency), an agency of the Department of Defense. Scientists theorized a network of computers connected to each other, and computing capabilities contributed to these developments and to the creation of the modern-day computer or laptop. The internet has become a normal part of life and business, to such a degree that the United Nations views it as a basic human right. The internet is also becoming larger, as more things move into the digital world due to demand, for example online banking. It has drastically changed the way most people go about their daily habits.

Deliberative democracy

Deliberative democracy is a reform of representative or direct democracy which mandates discussion and debate of popular topics that affect society. Deliberative democracy is a tool for making decisions, and it can be traced back to Aristotle's writings. More recently, the term was coined by Joseph Bessette in his 1980 work Deliberative Democracy: The Majority Principle in Republican Government, where he uses the idea in opposition to elitist interpretations of the United States Constitution, with emphasis on public discussion.

Deliberative democracy can lead to more legitimate, credible, and trustworthy outcomes. It allows for "a wider range of public knowledge," and it has been argued that this can lead to "more socially intelligent and robust" science. One major shortcoming of deliberative democracy is that many models insufficiently ensure critical interaction.

According to Ryfe, there are five mechanisms that stand out as critical to the successful design of deliberative democracy:
  1. Rules of equality, civility, and inclusivity may prompt deliberation even when our first impulse is to avoid it.
  2. Stories anchor reality by organizing experience and instilling a normative commitment to civic identities and values, and function as a medium for framing discussions.
  3. Leadership provides important cues to individuals in deliberative settings, and can keep groups on a deliberative track when their members slip into routine and habit.
  4. Individuals are more likely to sustain deliberative reasoning when they have a stake in the outcomes.
  5. Apprenticeship teaches citizens to deliberate well. We might do well to imagine education as a form of apprenticeship learning, in which individuals learn to deliberate by doing it in concert with others more skilled in the activity.

Importance of deliberative democracy in STS

Recently, there has been a movement towards greater transparency in the fields of policy and technology. Jasanoff concludes that the question is no longer whether there should be increased public participation in making decisions about science and technology, but how to make the conversation between the public and those developing the technology more meaningful.

Deliberative democracy in practice

Ackerman and Fishkin offer an example of a reform in their paper "Deliberation Day": deliberations intended to enhance public understanding of popular, complex, and controversial issues through devices such as Fishkin's deliberative polling. Implementation of these reforms is unlikely in a government as large as the United States federal government, but similar things have been implemented in small local governments, such as New England towns and villages; New England town hall meetings are a good example of deliberative democracy in a realistic setting.

An ideal deliberative democracy balances the voice and influence of all participants. While the main aim is to reach consensus, it should encourage the voices of those with opposing viewpoints, concerns due to uncertainties, and questions about assumptions made by other participants. It should take its time and ensure that those participating understand the topics on which they debate. Independent managers of debates should also have a substantial grasp of the concepts discussed, but must "[remain] independent and impartial as to the outcomes of the process."

Tragedy of the commons

In 1968, Garrett Hardin popularized the phrase "tragedy of the commons." It is an economic theory where rational people act against the best interest of the group by consuming a common resource. Since then, the tragedy of the commons has been used to symbolize the degradation of the environment whenever many individuals use a common resource. Although Garrett Hardin was not an STS scholar, the concept of tragedy of the commons still applies to science, technology and society.

In a contemporary setting, the Internet acts as an example of the tragedy of the commons through the exploitation of digital resources and private information. Data and internet passwords can be stolen much more easily than physical documents. Virtual spying is almost free compared to the costs of physical spying. Additionally, net neutrality can be seen as an example of tragedy of the commons in an STS context. The movement for net neutrality argues that the Internet should not be a resource that is dominated by one particular group, specifically those with more money to spend on Internet access.

A counterexample to the tragedy of the commons is offered by Andrew Kahrl. Privatization can be a way to deal with the tragedy of the commons; however, Kahrl suggests that the privatization of beaches on Long Island, undertaken to combat overuse, made the residents of Long Island more susceptible to flood damage from Hurricane Sandy. The privatization of these beaches took away the protection offered by the natural landscape: tidal lands that offer natural protection were drained and developed. This attempt to combat the tragedy of the commons through privatization was counter-productive, destroying the public good of natural protection from the landscape.

Alternative modernity

Alternative modernity is a conceptual tool conventionally used to represent the state of present Western society. Modernity represents the political and social structures of a society, the sum of its interpersonal discourse, and ultimately a snapshot of the society's direction at a point in time. Unfortunately, conventional modernity is incapable of modeling alternative directions for further growth within our society, and the concept is ineffective at analyzing similar but distinct modern societies, such as those found in the diverse cultures of the developing world. The problems can be summarized as two elements: an inward failure to analyze the growth potentials of a given society, and an outward failure to model different cultures and social structures and predict their growth potentials.

Previously, modernity carried a connotation of the current state of being modern and of its evolution through European colonialism. The process of becoming "modern" was believed to occur in a linear, pre-determined way, and is seen by Philip Brey as a way to interpret and evaluate social and cultural formations. This thought ties in with modernization theory, the idea that societies progress from "pre-modern" to "modern" states.

Within the field of science and technology, there are two main lenses with which to view modernity. The first is as a way for society to quantify what it wants to move towards. In effect, we can discuss the notion of "alternative modernity" (as described by Andrew Feenberg) and which of these we would like to move towards. Alternatively, modernity can be used to analyze the differences in interactions between cultures and individuals. From this perspective, alternative modernities exist simultaneously, based on differing cultural and societal expectations of how a society (or an individual within society) should function. Because of different types of interactions across different cultures, each culture will have a different modernity.

Pace of innovation

Pace of innovation is the speed at which technological innovation or advancement occurs, with the most apparent problem cases being innovation that is too slow or too rapid. Both extremes of the rate of innovation affect the people who get to use the technology.

No innovation without representation

"No innovation without representation" is a democratic ideal of ensuring that everyone involved gets a chance to be represented fairly in technological developments.
  • Langdon Winner states that groups and social interests likely to be affected by a particular kind of technological change ought to be represented at an early stage in defining exactly what that technology will be. It is the idea that relevant parties have a say in technological developments and are not left in the dark.
  • Spoken about by Massimiano Bucchi
  • This ideal does not require the public to become experts on the topics of science and engineering; it asks only that their opinions and ideas be heard before drastic decisions are made, as discussed by Steven L. Goldman.

Privileged positions of business and science

The privileged positions of business and science refer to the unique authority that persons in these areas hold in economic, political, and technosocial affairs. Businesses have strong decision-making power over the functioning of society, essentially choosing which technological innovations to develop. Scientists and technologists have valuable knowledge and the ability to pursue the technological innovations they want, and they proceed largely without public scrutiny, as if they had the consent of those potentially affected by their discoveries and creations.

Legacy thinking

Legacy thinking is defined as an inherited method of thinking imposed from an external source without objection by the individual, because it is already widely accepted by society. 

Legacy thinking can impair the ability to drive technology for the betterment of society by blinding people to innovations that do not fit into their accepted model of how society works. By accepting ideas without questioning them, people often see all solutions that contradict these accepted ideas as impossible or impractical. Legacy thinking tends to advantage the wealthy, who have the means to project their ideas on the public, and it may be used by the wealthy as a vehicle to drive technology in their favor rather than for the greater good. Examining the role of citizen participation and representation in politics provides an excellent example of legacy thinking in society. The belief that one can spend money freely to gain influence has been popularized, leading to public acceptance of corporate lobbying. As a result, a self-established role in politics has been cemented in which the public does not exercise the power ensured to them by the Constitution to the fullest extent. This can become a barrier to political progress, as corporations with the capital to spend have the potential to wield great influence over policy. Legacy thinking, however, keeps the population from acting to change this, despite polls from Harris Interactive reporting that over 80% of Americans feel that big business holds too much power in government. Therefore, Americans are beginning to steer away from this line of thought, rejecting legacy thinking and demanding less corporate, and more public, participation in political decision making.

Additionally, an examination of net neutrality functions as a separate example of legacy thinking. Starting with dial-up, the internet has always been viewed as a private luxury good, yet the internet today is a vital part of modern life, used by members of society every day, in and out of work. Corporations are able to mislabel and greatly overcharge for their internet resources, and since the American public is so dependent upon the internet, there is little it can do about this. Legacy thinking has kept this pattern on track despite growing movements arguing that the internet should be considered a utility; it prevents progress because the idea that the internet is a luxury and not a utility was widely accepted before us, reinforced by advertising. Due to pressure from grassroots movements, the Federal Communications Commission (FCC) has redefined the requirements for broadband, and the internet in general, as a utility. Now AT&T and other major internet providers are lobbying against this action and are largely able to delay its onset due to legacy thinking's grip on American culture and politics.

For example, those who cannot overcome the barrier of legacy thinking may not consider the privatization of clean drinking water as an issue. This is partially because access to water has become such a given fact of the matter to them. For a person living in such circumstances, it may be widely accepted to not concern themselves with drinking water because they have not needed to be concerned with it in the past. Additionally, a person living within an area that does not need to worry about their water supply or the sanitation of their water supply is less likely to be concerned with the privatization of water.

This notion can be examined through John Rawls's thought experiment of the "veil of ignorance". Legacy thinking causes people to be particularly ignorant about the implications of a "you get what you pay for" mentality applied to a life necessity. By utilizing the veil of ignorance, one can overcome the barrier of legacy thinking, as it requires a person to imagine that they are unaware of their own circumstances, freeing them from externally imposed thoughts or widely accepted ideas.

Related concepts

  • Technoscience – The perception that science and technology are intertwined and depend on each other.
  • Technosociety – An industrially developed society with a reliance on technology.
  • Technological utopianism – A positive outlook on the effect technology has on social welfare. Includes the perception that technology will one day enable society to reach a utopian state.
  • Technosocial systems – people and technologies that combine to work as heterogeneous but functional wholes.

Classifications

  • Technological optimism – The opinion that technology has positive effects on society and should be used in order to improve the welfare of people.
  • Technological pessimism – The opinion that technology has negative effects on society and should be discouraged from use.
  • Technological neutrality – "maintains that a given technology has no systematic effects on society: individuals are perceived as ultimately responsible, for better or worse, because technologies are merely tools people use for their own ends."
  • Technological determinism – "maintains that technologies are understood as simply and directly causing particular societal outcomes."
  • Scientism – The belief in the total separation of facts and values.
  • Technological progressivism – technology is a means to an end itself and an inherently positive pursuit.

STS programs around the world

STS is taught in several countries. According to the STS wiki, STS programs can be found in twenty countries, including 45 programs in the United States, three programs in India, and eleven programs in the UK. STS programs can be found in Canada, Germany, Israel, Malaysia, and Taiwan. Some examples of institutions offering STS programs are Stanford University, Harvard University, the University of Oxford, Mines ParisTech, Bar-Ilan University, and York University.

Technology and society

From Wikipedia, the free encyclopedia

Technology and society, or technology and culture, refers to the cyclical co-dependence, co-influence, and co-production of technology and society upon each other (technology upon culture, and vice versa). This synergistic relationship has existed since the dawn of humankind, with the invention of simple tools, and continues with modern technologies such as the printing press and computers. The academic discipline studying the impacts of science, technology, and society, and vice versa, is called science and technology studies.

Pre-historical

The importance of stone tools, circa 2.5 million years ago, is considered fundamental to human development in the hunting hypothesis

Primatologist Richard Wrangham theorizes that the control of fire by early humans, and the associated development of cooking, was the spark that radically changed human evolution. Texts such as Guns, Germs, and Steel suggest that early advances in plant agriculture and husbandry fundamentally shifted the way that collective groups of individuals, and eventually societies, developed.

Modern examples and effects

Technology has become a major part of society and day-to-day life. As societies learn more about the development of a technology, they become better able to take advantage of it. Once an innovation has been presented and promoted to a certain point, it becomes part of the society. The use of technology in education provides students with technological literacy, information literacy, the capacity for life-long learning, and other skills necessary for the 21st-century workplace. Digital technology has entered nearly every process and activity of the social system and has, in effect, constructed a new worldwide communication system.

A 1982 article in The New York Times described a technology assessment study by the Institute for the Future, "peering into the future of an electronic world." The study focused on the emerging videotex industry, formed by the marriage of two older technologies, communications and computing. It estimated that 40 percent of American households would have two-way videotex service by the end of the century. By comparison, it took television 16 years to penetrate 90 percent of households from the time commercial service began.

The creation of computers brought an entirely better approach to transmitting and storing data. Digital technology became commonly used for downloading music and watching movies at home, whether on DVD or purchased online. Digital music recordings are not quite the same as traditional recording media: they are reproducible, portable, and essentially free to copy.

Several states have started to implement educational technology in schools, universities, and colleges. According to the statistics, Internet use in schools averaged 2–3 percent in the early 1990s, rose rapidly to about 60 percent by the end of the 1990s, and reached nearly 100 percent of schools by 2008. According to ISTE researchers, technological improvements can lead to numerous achievements in classrooms: e-learning systems, student collaboration on project-based learning, and technological skills for the future all result in greater student motivation.
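
The adoption figures above trace the S-shaped diffusion curve typical of new technologies. As a rough illustration (not part of the source data), the following Python sketch compares the quoted school-Internet percentages against a logistic adoption curve; the midpoint and rate parameters are hypothetical values chosen by hand:

    import math

    def logistic(year, midpoint=1998.5, rate=0.55):
        # Share of schools online in a given year under a logistic
        # (S-curve) adoption model; both parameter values are hypothetical.
        return 1.0 / (1.0 + math.exp(-rate * (year - midpoint)))

    # Compare the model against the figures quoted above
    for year, reported in [(1992, 0.025), (1999, 0.60), (2008, 0.99)]:
        print(f"{year}: model {logistic(year):.1%} vs. reported {reported:.1%}")

With these hand-picked parameters the curve reproduces the reported figures to within a few percentage points, which is why diffusion researchers commonly describe such adoption data with logistic curves.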

Although these previous examples show only a few of the positive aspects of technology in society, there are negative side effects as well. Within this virtual realm, social media platforms such as Instagram, Facebook, and Snapchat have altered the way Generation Y culture understands the world, and thus how its members view themselves. In recent years, there has been more research on the development of social media depression in users of sites like these. "Facebook depression" occurs when users are so affected by their friends' posts and lives that their own jealousy depletes their sense of self-worth. They compare themselves to the posts made by their peers and feel unworthy or monotonous because they feel their lives are not nearly as exciting as the lives of others.

Another instance of the negative effects of technology in society is how quickly it pushes younger generations into maturity. With the world at their fingertips, children can learn anything they wish to. But with uncensored sources on the internet and without proper supervision, children can be exposed to explicit material at inappropriate ages. This comes in the form of premature interest in experimenting with makeup, or in opening an email account or social media page, any of which can become a window for predators and other dangerous entities that threaten a child's innocence. Technology also has a serious effect on youth health: overuse of technology has been associated with sleep deprivation, which is linked to obesity and poor academic performance in adolescents.

Economics and technological development

Nuclear reactor and windmill, Doel, Belgium
 
In ancient history, economics began when spontaneous exchange of goods and services was replaced over time by deliberate trade structures. Makers of arrowheads, for example, might have realized they could do better by concentrating on making arrowheads and bartering for other needs. Regardless of the goods and services bartered, some amount of technology was involved, if no more than in the making of shell and bead jewelry. Even the shaman's potions and sacred objects can be said to have involved some technology. So, from the very beginning, technology can be said to have spurred the development of more elaborate economies. Technology is seen as a primary source of economic development.

Technological advancement and economic growth are related to each other: the level of technology is an important determinant of economic growth, and it is the technological process that keeps the economy moving.

In the modern world, superior technologies, resources, geography, and history give rise to robust economies; and in a well-functioning, robust economy, economic excess naturally flows into greater use of technology. Moreover, because technology is such an inseparable part of human society, especially in its economic aspects, funding sources for (new) technological endeavors are virtually illimitable. However, while in the beginning, technological investment involved little more than the time, efforts, and skills of one or a few men, today, such investment may involve the collective labor and skills of many millions.

Funding

Consequently, the sources of funding for large technological efforts have dramatically narrowed, since few have ready access to the collective labor of a whole society, or even a large part. It is conventional to divide up funding sources into governmental (involving whole, or nearly whole, social enterprises) and private (involving more limited, but generally more sharply focused) business or individual enterprises.

Government funding for new technology

The government is a major contributor to the development of new technology in many ways. In the United States alone, many government agencies specifically invest billions of dollars in new technology. 

In 1980, the UK government invested just over six million pounds in a four-year program, later extended to six years, called the Microelectronics Education Programme (MEP), which was intended to give every school in Britain at least one computer, software, training materials, and extensive teacher training. Similar programs have been instituted by governments around the world.

Technology has frequently been driven by the military, with many modern applications developed for the military before they were adapted for civilian use. However, this has always been a two-way flow, with industry often developing and adopting a technology only later adopted by the military. 

Entire government agencies are specifically dedicated to research, such as America's National Science Foundation, the United Kingdom's scientific research institutes, and America's Small Business Innovation Research program. Many other government agencies dedicate a major portion of their budget to research and development.

Private funding

Research and development is one of the smallest areas of investment made by corporations toward new and innovative technology.

Many foundations and other nonprofit organizations contribute to the development of technology. In the OECD, about two-thirds of research and development in scientific and technical fields is carried out by industry, and 20 percent and 10 percent, respectively, by universities and government. But in poorer countries such as Portugal and Mexico the industry contribution is significantly less. The U.S. government spends more than other countries on military research and development, although the proportion has fallen from about 30 percent in the 1980s to less than 10 percent.

Since its founding in 2009, Kickstarter has allowed individuals to receive funding via crowdsourcing for many technology-related products, including new physical creations as well as documentaries, films, and web series that focus on technology management. This circumvents the corporate or government oversight most inventors and artists struggle against, but leaves accountability for the project entirely with the individual receiving the funds.

Sociological factors and effects

Values

The implementation of technology influences the values of a society by changing expectations and realities. The implementation of technology is also influenced by values. There are (at least) three major, interrelated values that inform, and are informed by, technological innovations:
  • Mechanistic world view: Viewing the universe as a collection of parts (like a machine) that can be individually analyzed and understood. This is a form of reductionism that is rare nowadays. However, the "neo-mechanistic world view" holds that nothing in the universe is beyond the reach of the human intellect. Also, while all things are greater than the sum of their parts (e.g., even if we consider nothing more than the information involved in their combination), in principle even this excess must eventually be understood by human intelligence; that is, no divine or vital principle or essence is involved.
  • Efficiency: A value, originally applied only to machines, but now applied to all aspects of society, so that each element is expected to attain a higher and higher percentage of its maximal possible performance, output, or ability.
  • Social progress: The belief that there is such a thing as social progress, and that, in the main, it is beneficent. Before the Industrial Revolution, and the subsequent explosion of technology, almost all societies believed in a cyclical theory of social movement and, indeed, of all history and the universe. This was, obviously, based on the cyclicity of the seasons, and on an agricultural economy's and society's strong ties to that cyclicity. Since much of the world remains close to its agricultural roots, it is still much more amenable to cyclicity than to progress in history. This may be seen, for example, in Prabhat Rainjan Sarkar's modern social cycles theory. For a more westernized version of social cyclicity, see Generations: The History of America's Future, 1584 to 2069 by William Strauss and Neil Howe (Harper Perennial, reprint edition, September 30, 1992; ISBN 0-688-11912-3), and subsequent books by these authors.

Institutions and groups

Technology often enables organizational and bureaucratic group structures that otherwise and heretofore were simply not possible. Examples of this might include:
  • The rise of very large organizations: e.g., governments, the military, health and social welfare institutions, supranational corporations.
  • The commercialization of leisure: sports events, products, etc. (McGinn)
  • The almost instantaneous dispersal of information (especially news) and entertainment around the world.

International

Technology enables greater knowledge of international issues, values, and cultures. Due mostly to mass transportation and mass media, the world seems a much smaller place, as reflected in the following:
  • Globalization of ideas
  • Embeddedness of values
  • Population growth and control

Environment

Technology provides an understanding of, and an appreciation for, the world around us.

Most modern technological processes produce unwanted by-products, known as industrial waste and pollution, in addition to the desired products. While most material waste is re-used in the industrial process, many forms are released into the environment, with negative environmental side effects such as pollution and lack of sustainability. Different social and political systems strike different balances between the value they place on additional goods and the disvalue of waste products and pollution. Some technologies are designed specifically with the environment in mind, but most are designed first for economic or ergonomic effects. Historically, the value placed on a clean environment and more efficient productive processes has been the result of an increase in the wealth of society: once people are able to provide for their basic needs, they can focus on less tangible goods such as clean air and water.

The effects of technology on the environment are both obvious and subtle. The more obvious effects include the depletion of nonrenewable natural resources (such as petroleum, coal, and ores) and the added pollution of air, water, and land. The more subtle effects include debates over long-term effects (e.g., global warming, deforestation, natural habitat destruction, coastal wetland loss).

Each wave of technology creates a set of wastes previously unknown to humans: toxic waste, radioactive waste, electronic waste.

One of the main problems is the lack of an effective way to remove these pollutants on a large scale expediently. In nature, organisms "recycle" the wastes of other organisms: plants produce oxygen as a by-product of photosynthesis; oxygen-breathing organisms use that oxygen to metabolize food, producing carbon dioxide as a by-product; and plants use the carbon dioxide to make sugar, releasing oxygen once again and closing the loop. No such mechanism exists for the removal of technological wastes.
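
To make the contrast concrete, here is a toy simulation (an illustration of the argument above, not a scientific model): the oxygen-carbon loop is closed, so its pools stay balanced, while the technological-waste pool has no consumer and simply accumulates.

    # Toy model: nature's closed oxygen-carbon loop vs. technological waste,
    # which has no consumer. All quantities are arbitrary illustrative units.
    def step(pools):
        pools["o2"] += 1           # photosynthesis: CO2 in, O2 out
        pools["co2"] -= 1
        pools["o2"] -= 1           # respiration: O2 in, CO2 out
        pools["co2"] += 1
        pools["tech_waste"] += 1   # nothing in the loop removes this pool

    pools = {"o2": 100, "co2": 100, "tech_waste": 0}
    for _ in range(50):
        step(pools)
    print(pools)  # o2 and co2 end unchanged; tech_waste has grown to 50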

Construction and shaping

Choice

Society also controls technology through the choices it makes. These choices not only include consumer demands; they also include:
  • the channels of distribution: how products go from raw materials through consumption to disposal;
  • the cultural beliefs regarding style, freedom of choice, consumerism, materialism, etc.;
  • the economic values we place on the environment, individual wealth, government control, capitalism, etc.
According to Williams and Edge, the construction and shaping of technology includes the concept of choice (and not necessarily conscious choice). Choice is inherent in both the design of individual artifacts and systems, and in the making of those artifacts and systems. 

The idea here is that a single technology may not emerge from the unfolding of a predetermined logic or a single determinant; technology could instead be a garden of forking paths, with different paths potentially leading to different technological outcomes. This is a position that has been developed in detail by Judy Wajcman. Choices could therefore have differing implications for society and for particular social groups.

Autonomous technology

In one line of thought, technology develops autonomously; in other words, technology seems to feed on itself, moving forward with a force that humans cannot resist. To proponents of this view, technology is "inherently dynamic and self-augmenting."

Jacques Ellul is one proponent of the irresistibleness of technology to humans. He espouses the idea that humanity cannot resist the temptation of expanding our knowledge and our technological abilities. However, he does not believe that this seeming autonomy of technology is inherent; rather, the autonomy is perceived because humans do not adequately consider the responsibility that is inherent in technological processes.

Langdon Winner critiques the idea that technological evolution is essentially beyond the control of individuals or society in his book Autonomous Technology. He argues instead that the apparent autonomy of technology is a result of "technological somnambulism," the tendency of people to uncritically and unreflectively embrace and utilize new technologies without regard for their broader social and political effects.

In 1980, Mike Cooley published a critique of the automation and computerization of engineering work under the title "Architect or Bee? The human/technology relationship". The title alludes to a comparison made by Karl Marx on the issue of the creative achievements of human imaginative power. According to Cooley, "Scientific and technological developments have invariably proved to be double-edged. They produced the beauty of Venice and the hideousness of Chernobyl; the caring therapies of Röntgen's X-rays and the destruction of Hiroshima."

Government

Individuals rely on governmental assistance to control the side effects and negative consequences of technology.
  • Supposed independence of government. An assumption commonly made about the government is that its governance role is neutral or independent. However, some argue that governing is a political process, so government will be influenced by political winds. In addition, because government provides much of the funding for technological research and development, it has a vested interest in certain outcomes. Others point out that the world's biggest ecological disasters, such as the Aral Sea, Chernobyl, and Lake Karachay, have been caused by government projects, which are not accountable to consumers.
  • Liability. One means for controlling technology is to place responsibility for the harm with the agent causing the harm. Government can allow more or less legal liability to fall to the organizations or individuals responsible for damages.
  • Legislation. A source of controversy is the role of industry versus that of government in maintaining a clean environment. While it is generally agreed that industry needs to be held responsible when pollution harms other people, there is disagreement over whether this should be prevented by legislation or civil courts, and whether ecological systems as such should be protected from harm by governments.
Recently, the social shaping of technology has gained new influence in the fields of e-science and e-social science in the United Kingdom, where funding bodies have made centers focusing on the social shaping of science and technology a central part of their funding programs.

Negative Criticism

Governments have been criticized for collaborating with Google to spy on citizens. "Google's Earth: how the tech giant is helping the state spy on us" is the title of a Guardian article by Yasha Levine, who goes on to state: "We knew that being connected had a price – our data. But we didn't care. Then it turned out that Google's main clients included the military and intelligence agencies."
