
Thursday, January 24, 2019

Social psychology

From Wikipedia, the free encyclopedia
 
Social psychology is the scientific study of how people's thoughts, feelings and behaviors are influenced by the actual, imagined or implied presence of others. In this definition, scientific refers to empirical investigation using the scientific method. The terms thoughts, feelings and behavior refer to psychological variables that can be measured in humans. The statement that others' presence may be imagined or implied suggests that humans are susceptible to social influence even when alone, such as when watching television or following internalized cultural norms. Social psychologists typically explain human behavior as a result of the interaction of mental states and social situations.
 
Social psychologists examine factors that cause behaviors to unfold in a given way in the presence of others. They study conditions under which certain behavior, actions, and feelings occur. Social psychology is concerned with the way these feelings, thoughts, beliefs, intentions, and goals are cognitively constructed and how these mental representations, in turn, influence our interactions with others.

Social psychology traditionally bridged the gap between psychology and sociology. During the years immediately following World War II there was frequent collaboration between psychologists and sociologists. The two disciplines, however, have become increasingly specialized and isolated from each other in recent years, with sociologists focusing on "macro variables" (e.g., social structure) to a much greater extent than psychologists. Nevertheless, sociological approaches to psychology remain an important counterpart to psychological research in this area.

In addition to the split between psychology and sociology, there has been a somewhat less pronounced difference in emphasis between American social psychologists and European social psychologists. As a generalization, American researchers traditionally have focused more on the individual, whereas Europeans have paid more attention to group level phenomena (see group dynamics).

History

Although there were some older writings about social psychology, such as those by the Islamic philosopher Al-Farabi (Alpharabius), the discipline of social psychology, in its modern-day definition, began in the United States at the beginning of the 20th century. By that time, though, the discipline had already developed a significant foundation. Following the 18th century, those in the emerging field of social psychology were concerned with developing concrete explanations for different aspects of human nature. They attempted to discover concrete cause-and-effect relationships that explained the social interactions in the world around them. To do so, they believed that the scientific method, an empirically based approach to measurement, could be applied to human behavior.

The first published study in this area was an experiment in 1898 by Norman Triplett, on the phenomenon of social facilitation. During the 1930s, many Gestalt psychologists, most notably Kurt Lewin, fled to the United States from Nazi Germany. They were instrumental in developing the field as something separate from the behavioral and psychoanalytic schools that were dominant during that time, and social psychology has always maintained the legacy of their interests in perception and cognition. Attitudes and small group phenomena were the most commonly studied topics in this era.

During World War II, social psychologists studied persuasion and propaganda for the U.S. military. After the war, researchers became interested in a variety of social problems, including gender issues and racial prejudice. Most notable, revealing, and contentious of these were the Stanley Milgram shock experiments on obedience to authority. In the sixties, there was growing interest in new topics, such as cognitive dissonance, bystander intervention, and aggression. By the 1970s, however, social psychology in America had reached a crisis. There was heated debate over the ethics of laboratory experimentation, whether or not attitudes really predicted behavior, and how much science could be done in a cultural context. This was also the time when a radical situationist approach challenged the relevance of self and personality in psychology. Throughout the 1980s and 1990s social psychology reached a more mature level, particularly in its theories and methods. Careful ethical standards now regulate research, and pluralistic and multicultural perspectives have emerged. Modern researchers are interested in many phenomena, but attribution, social cognition, and the self-concept have perhaps seen the greatest growth in recent years. Social psychologists have also maintained their applied interests, with contributions in the social psychology of health, education, law, and the workplace.

Intrapersonal phenomena

Attitudes

In social psychology, attitudes are defined as learned, global evaluations of a person, object, place, or issue that influence thought and action. Put more simply, attitudes are basic expressions of approval or disapproval, favorability or unfavorability, or as Bem put it, likes and dislikes. Examples would include liking chocolate ice cream, or endorsing the values of a particular political party.

Social psychologists have studied attitude formation, the structure of attitudes, attitude change, the function of attitudes, and the relationship between attitudes and behavior. Because people are influenced by the situation, general attitudes are not always good predictors of specific behavior. For example, for a variety of reasons, a person may value the environment but not recycle a can on a particular day.

In recent times, research on attitudes has examined the distinction between traditional, self-reported attitude measures and "implicit" or unconscious attitudes. For example, experiments using the Implicit Association Test have found that people often demonstrate implicit bias against other races, even when their explicit responses profess no such bias. One study found that explicit attitudes correlate with verbal behavior in interracial interactions, whereas implicit attitudes correlate with nonverbal behavior.

One hypothesis on how attitudes are formed, first advanced by Abraham Tesser in 1983, is that strong likes and dislikes are ingrained in our genetic make-up. Tesser speculates that individuals are disposed to hold certain strong attitudes as a result of inborn physical, sensory, and cognitive skills, temperament, and personality traits. Whatever disposition nature elects to give us, our most treasured attitudes are often formed as a result of exposure to attitude objects; our history of rewards and punishments; the attitudes that our parents, friends, and enemies express; the social and cultural context in which we live; and other types of experiences we have. Attitudes are also formed through the basic process of learning. Numerous studies have shown that people can form strong positive and negative attitudes toward neutral objects that are in some way linked to emotionally charged stimuli.

Attitudes are also involved in several other areas of the discipline, such as conformity, interpersonal attraction, social perception, and prejudice.

Persuasion

The topic of persuasion has received a great deal of attention in recent years. Persuasion is an active method of influence that attempts to guide people toward the adoption of an attitude, idea, or behavior by rational or emotive means. Persuasion relies on "appeals" rather than strong pressure or coercion. Numerous variables have been found to influence the persuasion process; these are normally presented in five major categories: who said what to whom, how, and in what context.
  1. The Communicator, including credibility, expertise, trustworthiness, and attractiveness.
  2. The Message, including varying degrees of reason, emotion (such as fear), one-sided or two-sided arguments, and other types of informational content.
  3. The Audience, including a variety of demographics, personality traits, and preferences.
  4. The Channel or Medium, including the printed word, radio, television, the internet, or face-to-face interactions.
  5. The Context, including the environment, group dynamics, and preamble to the message.
Dual-process theories of persuasion maintain that the persuasive process is mediated by two separate routes: central and peripheral. The central route of persuasion is more fact-based and results in longer-lasting change, but requires motivation to process. The peripheral route is more superficial and results in shorter-lasting change, but does not require as much motivation to process. An example of the peripheral route of persuasion might be a politician using a flag lapel pin, smiling, and wearing a crisp, clean shirt. Notice that this does not require motivation on the audience's part to be persuasive, but the resulting change should not last as long as persuasion based on the central route. If that politician were to outline exactly what they believed and detail their previous voting record, this would be using the central route and would result in longer-lasting change, but it would require a good deal of motivation to process.

Social cognition

Social cognition is a growing area of social psychology that studies how people perceive, think about, and remember information about others. Much research rests on the assertion that people think about (other) people differently from non-social targets. This assertion is supported by the social cognitive deficits exhibited by people with Williams syndrome and autism. Person perception is the study of how people form impressions of others. The study of how people form beliefs about each other while interacting is known as interpersonal perception.

A major research topic in social cognition is attribution. Attributions are the explanations we make for people's behavior, either our own behavior or the behavior of others. One element of attribution ascribes the locus of a behavior to either internal or external factors. An internal, or dispositional, attribution assigns behavior to causes related to inner traits such as personality, disposition, character or ability. An external, or situational, attribution involves situational elements, such as the weather. A second element of attribution ascribes the cause of behavior to either stable or unstable factors (whether the behavior will be repeated or changed under similar circumstances). Finally, we also attribute causes of behavior to either controllable or uncontrollable factors: how much control one has over the situation at hand.

Numerous biases in the attribution process have been discovered. For instance, the fundamental attribution error is the tendency to make dispositional attributions for behavior, overestimating the influence of personality and underestimating the influence of situations. The actor-observer difference is a refinement of this bias, the tendency to make dispositional attributions for other people's behavior and situational attributions for our own. The self-serving bias is the tendency to attribute dispositional causes for successes, and situational causes for failure, particularly when self-esteem is threatened. This leads to assuming one's successes are from innate traits, and one's failures are due to situations, including other people. Other ways people protect their self-esteem are by believing in a just world, blaming victims for their suffering, and making defensive attributions, which explain our behavior in ways which defend us from feelings of vulnerability and mortality. Researchers have found that mildly depressed individuals often lack this bias and actually have more realistic perceptions of reality (as measured by the opinions of others).

Heuristics are cognitive shortcuts. Instead of weighing all the evidence when making a decision, people rely on heuristics to save time and energy. The availability heuristic occurs when people estimate the probability of an outcome based on how easy that outcome is to imagine. As such, vivid or highly memorable possibilities will be perceived as more likely than those that are harder to picture or are difficult to understand, resulting in a corresponding cognitive bias. The representativeness heuristic is a shortcut people use to categorize something based on how similar it is to a prototype they know of. Numerous other biases have been found by social cognition researchers. The hindsight bias is a false memory of having predicted events, or an exaggeration of actual predictions, after becoming aware of the outcome. The confirmation bias is the tendency to search for, or interpret, information in a way that confirms one's preconceptions.

Another key concept in social cognition is the assumption that reality is too complex to easily discern. As a result, we tend to see the world according to simplified schemas or images of reality. Schemas are generalized mental representations that organize knowledge and guide information processing. Schemas often operate automatically and unintentionally, and can lead to biases in perception and memory. Expectations from schemas may lead us to see something that is not there. One experiment found that people are more likely to misperceive a weapon in the hands of a black man than a white man. This type of schema is actually a stereotype, a generalized set of beliefs about a particular group of people (when incorrect, an ultimate attribution error). Stereotypes are often related to negative or preferential attitudes (prejudice) and behavior (discrimination). Schemas for behaviors (e.g., going to a restaurant, doing laundry) are known as scripts.

Self-concept

Self-concept is a term referring to the whole sum of beliefs that people have about themselves. However, what specifically does the self-concept consist of? According to Hazel Markus (1977), the self-concept is made up of cognitive molecules called self-schemas – beliefs that people have about themselves that guide the processing of self-relevant information. For example, an athlete at a university would have multiple selves that would process different information pertinent to each self: the student would be one "self," who would process information pertinent to a student (taking notes in class, completing a homework assignment, etc.); the athlete would be the "self" who processes information about things related to being an athlete (recognizing an incoming pass, aiming a shot, etc.). These "selves" are part of one's identity, and self-relevant information is the information that relies on the proper "self" to process and react to it. If a "self" is not part of one's identity, then it is much more difficult for one to react. For example, a civilian may not know how to handle a hostile threat as a trained Marine would. The Marine has a "self" that enables him or her to process the information about the hostile threat and react accordingly, whereas a civilian may not have that self, leaving them less able to process the information about the hostile threat and to act accordingly.

Self-schemas are to an individual's total self-concept as a hypothesis is to a theory, or a book is to a library. A good example is the body weight self-schema; people who regard themselves as overweight or underweight, or for whom body image is a significant aspect of the self-concept, are considered schematics with respect to weight. For these people a range of otherwise mundane events – grocery shopping, new clothes, eating out, or going to the beach – can trigger thoughts about the self. In contrast, people who do not regard their weight as an important part of their lives are a-schematic on that attribute.

It is rather clear that the self is a special object of our attention. Whether one is mentally focused on a memory, a conversation, a foul smell, the song that is stuck in one's head, or this sentence, consciousness is like a spotlight. This spotlight can shine on only one object at a time, but it can switch rapidly from one object to another and process the information out of awareness. In this spotlight the self is front and center: things relating to the self have the spotlight more often.

The self's ABCs are affect, behavior, and cognition. An affective (or emotional) question: How do people evaluate themselves, enhance their self-image, and maintain a secure sense of identity? A behavioral question: How do people regulate their own actions and present themselves to others according to interpersonal demands? A cognitive question: How do individuals become themselves, build a self-concept, and uphold a stable sense of identity?

Affective forecasting is the process of predicting how one would feel in response to future emotional events. Studies by Timothy Wilson and Daniel Gilbert in 2003 have shown that people overestimate how strongly they will react to anticipated positive and negative life events compared with how they actually feel when the event occurs.

There are many theories on the perception of our own behavior. Daryl Bem's (1972) self-perception theory claims that when internal cues are difficult to interpret, people gain self-insight by observing their own behavior. Leon Festinger's 1954 social comparison theory holds that people evaluate their own abilities and opinions by comparing themselves to others when they are uncertain of their own ability or opinions. There is also the facial feedback hypothesis: that changes in facial expression can lead to corresponding changes in emotion.

The fields of social psychology and personality have merged over the years, and social psychologists have developed an interest in self-related phenomena. In contrast with traditional personality theory, however, social psychologists place a greater emphasis on cognitions than on traits. Much research focuses on the self-concept, which is a person's understanding of their self. The self-concept is often divided into a cognitive component, known as the self-schema, and an evaluative component, the self-esteem. The need to maintain a healthy self-esteem is recognized as a central human motivation in the field of social psychology.

Self-efficacy beliefs are associated with the self-schema. These are expectations that performance on some task will be effective and successful. Social psychologists also study such self-related processes as self-control and self-presentation.

People develop their self-concepts by varied means, including introspection, feedback from others, self-perception, and social comparison. By comparing themselves to relevant others, people gain information about themselves, and they make inferences that are relevant to self-esteem. Social comparisons can be either "upward" or "downward," that is, comparisons to people who are either higher in status or ability, or lower in status or ability. Downward comparisons are often made in order to elevate self-esteem.

Self-perception is a specialized form of attribution that involves making inferences about oneself after observing one's own behavior. Psychologists have found that too many extrinsic rewards (e.g. money) tend to reduce intrinsic motivation through the self-perception process, a phenomenon known as overjustification. People's attention is directed to the reward and they lose interest in the task when the reward is no longer offered. This is an important exception to reinforcement theory.

Interpersonal phenomena

Social influence

Social influence is an overarching term given to describe the persuasive effects people have on each other. It is seen as a fundamental value in social psychology and overlaps considerably with research on attitudes and persuasion. The three main areas of social influence include: conformity, compliance, and obedience. Social influence is also closely related to the study of group dynamics, as most principles of influence are strongest when they take place in social groups.

The first major area of social influence is conformity. Conformity is defined as the tendency to act or think like other members of a group. The identity of members within a group, i.e. status, similarity, expertise, as well as cohesion, prior commitment, and accountability to the group help to determine the level of conformity of an individual. Individual variation among group members plays a key role in the dynamic of how willing people will be to conform. Conformity is usually viewed as a negative tendency in American culture, but a certain amount of conformity is adaptive in some situations, as is nonconformity in other situations.

In the Asch conformity experiments, participants were shown a reference line and asked which of three comparison lines (A, B, or C) matched it. People frequently followed the majority judgment, even when the majority was objectively wrong.
 
The second major area of social influence research is compliance. Compliance refers to any change in behavior that is due to a request or suggestion from another person. The foot-in-the-door technique is a compliance method in which the persuader requests a small favor and then follows up with requesting a larger favor, e.g., asking for the time and then asking for ten dollars. A related trick is the bait and switch.

The third major form of social influence is obedience; this is a change in behavior that is the result of a direct order or command from another person. Obedience as a form of compliance was dramatically highlighted by the Milgram study, wherein people were ready to administer shocks to a person in distress on a researcher's command.

An unusual kind of social influence is the self-fulfilling prophecy. This is a prediction that, in being made, actually causes itself to become true. For example, in the stock market, if it is widely believed that a crash is imminent, investors may lose confidence, sell most of their stock, and thus actually cause the crash. Similarly, people may expect hostility in others and actually induce this hostility by their own behavior.

Group dynamics

A group can be defined as two or more individuals that are connected to one another by social relationships. Groups tend to interact, influence each other, and share a common identity. They have a number of emergent qualities that distinguish them from aggregates:
  • Norms: Implicit rules and expectations for group members to follow, e.g. saying thank you, shaking hands.
  • Roles: Implicit rules and expectations for specific members within the group, e.g. the oldest sibling, who may have additional responsibilities in the family.
  • Relations: Patterns of liking within the group, and also differences in prestige or status, e.g., leaders, popular people.
Temporary groups and aggregates share few or none of these features, and do not qualify as true social groups. People waiting in line to get on a bus, for example, do not constitute a group.

Groups are important not only because they offer social support, resources, and a feeling of belonging, but because they supplement an individual's self-concept. To a large extent, humans define themselves by the group memberships which form their social identity. The shared social identity of individuals within a group influences inter-group behavior, the way in which groups behave towards and perceive each other. These perceptions and behaviors in turn define the social identity of individuals within the interacting groups. The tendency to define oneself by membership in a group may lead to intergroup discrimination, which involves favorable perceptions and behaviors directed towards the in-group, but negative perceptions and behaviors directed towards the out-group. On the other hand, such discrimination and segregation may sometimes exist partly to facilitate a diversity which strengthens society. Inter-group discrimination leads to prejudice and stereotyping, while the processes of social facilitation and group polarization encourage extreme behaviors towards the out-group. 

Groups often moderate and improve decision making, and are frequently relied upon for these benefits, such as in committees and juries. A number of group biases, however, can interfere with effective decision making. For example, group polarization, formerly known as the "risky shift," occurs when people polarize their views in a more extreme direction after group discussion. More problematic is the phenomenon of groupthink. This is a collective thinking defect that is characterized by a premature consensus or an incorrect assumption of consensus, caused by members of a group failing to promote views which are not consistent with the views of other members. Groupthink occurs in a variety of situations, including isolation of a group and the presence of a highly directive leader. Janis offered the 1961 Bay of Pigs Invasion as a historical case of groupthink.

Groups also affect performance and productivity. Social facilitation, for example, is a tendency to work harder and faster in the presence of others. Social facilitation increases the dominant response's likelihood, which tends to improve performance on simple tasks and reduce it on complex tasks. In contrast, social loafing is the tendency of individuals to slack off when working in a group. Social loafing is common when the task is considered unimportant and individual contributions are not easy to see.

Social psychologists study group-related (collective) phenomena such as the behavior of crowds. An important concept in this area is deindividuation, a reduced state of self-awareness that can be caused by feelings of anonymity. Deindividuation is associated with uninhibited and sometimes dangerous behavior. It is common in crowds and mobs, but it can also be caused by a disguise, a uniform, alcohol, dark environments, or online anonymity.

Social psychologists study interactions within groups, and between both groups and individuals.

Interpersonal attraction

A major area in the study of people's relations to each other is interpersonal attraction. This refers to all forces that lead people to like each other, establish relationships, and (in some cases) fall in love. Social psychologists have identified several general principles of attraction, and research in this area continues. One of the most important factors in interpersonal attraction is how similar two particular people are. The more similar two people are in general attitudes, backgrounds, environments, worldviews, and other traits, the more likely it is that they will be attracted to each other.

Physical attractiveness is an important element of romantic relationships, particularly in the early stages characterized by high levels of passion. Later on, similarity and other compatibility factors become more important, and the type of love people experience shifts from passionate to companionate. Robert Sternberg has suggested that there are actually three components of love: intimacy, passion, and commitment. When two (or more) people experience all three, they are said to be in a state of consummate love.

According to social exchange theory, relationships are based on rational choice and cost-benefit analysis. If one partner's costs begin to outweigh their benefits, that person may leave the relationship, especially if there are good alternatives available. This theory is similar to the minimax principle proposed by mathematicians and economists (despite the fact that human relationships are not zero-sum games). With time, long term relationships tend to become communal rather than simply based on exchange.

Research

Methods

Social psychology is an empirical science that attempts to answer questions about human behavior by testing hypotheses, both in the laboratory and in the field. Careful attention to sampling, research design, and statistical analysis is important; results are published in peer-reviewed journals such as the Journal of Experimental Social Psychology, Personality and Social Psychology Bulletin and the Journal of Personality and Social Psychology. Social psychology studies also appear in general science journals such as Psychological Science and Science.

Experimental methods involve the researcher altering a variable in the environment and measuring the effect on another variable. An example would be allowing two groups of children to play violent or nonviolent video games, and then observing their subsequent level of aggression during a free-play period. A valid experiment is controlled and uses random assignment.
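To make the logic of random assignment concrete, here is a minimal Python sketch, not drawn from any particular published study; the group sizes, the observe_aggression measurement, and the assumed effect size are invented for illustration. It randomly assigns simulated participants to the two video game conditions and compares mean aggression scores.

import random
import statistics

random.seed(42)

participants = list(range(40))
random.shuffle(participants)                        # random assignment
violent_group, nonviolent_group = participants[:20], participants[20:]

def observe_aggression(condition):
    # Hypothetical measurement: aggressive acts counted during free play.
    base = 3 if condition == "violent" else 2       # assumed effect, purely illustrative
    return base + random.gauss(0, 1)

violent_scores = [observe_aggression("violent") for _ in violent_group]
nonviolent_scores = [observe_aggression("nonviolent") for _ in nonviolent_group]

print("violent mean:", round(statistics.mean(violent_scores), 2))
print("nonviolent mean:", round(statistics.mean(nonviolent_scores), 2))

Because assignment to conditions is random, any systematic difference between the group means can, in principle, be attributed to the manipulated variable rather than to pre-existing differences between the children.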

Correlational methods examine the statistical association between two naturally occurring variables. For example, one could correlate the amount of violent television children watch at home with the number of violent incidents the children participate in at school. Note that such a study would not prove that violent TV causes aggression in children: it is quite possible that aggressive children choose to watch more violent TV.
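A correlational analysis of this kind boils down to computing an association statistic such as Pearson's r. The sketch below uses invented numbers purely for illustration and assumes Python 3.10+ for statistics.correlation.

import statistics

# Hypothetical paired observations: hours of violent TV per week and
# recorded violent incidents at school (numbers invented for illustration).
tv_hours = [2, 5, 1, 8, 4, 7, 3, 6]
incidents = [0, 2, 0, 4, 1, 3, 1, 2]

r = statistics.correlation(tv_hours, incidents)     # Pearson's r (Python 3.10+)
print(f"r = {r:.2f}")
# A large positive r describes an association only; it cannot tell us whether
# TV causes aggression or aggressive children simply prefer violent TV.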

Observational methods are purely descriptive and include naturalistic observation, "contrived" observation, participant observation, and archival analysis. These are less common in social psychology but are sometimes used when first investigating a phenomenon. An example would be to unobtrusively observe children on a playground (with a video camera, perhaps) and record the number and types of aggressive actions displayed.

Whenever possible, social psychologists rely on controlled experimentation. Controlled experiments require the manipulation of one or more independent variables in order to examine the effect on a dependent variable. Experiments are useful in social psychology because they are high in internal validity, meaning that they are free from the influence of confounding or extraneous variables, and so are more likely to accurately indicate a causal relationship. However, the small samples used in controlled experiments are typically low in external validity, or the degree to which the results can be generalized to the larger population. There is usually a trade-off between experimental control (internal validity) and being able to generalize to the population (external validity). 

Because it is usually impossible to test everyone, research tends to be conducted on a sample of persons from the wider population. Social psychologists frequently use survey research when they are interested in results that are high in external validity. Surveys use various forms of random sampling to obtain a sample of respondents that are representative of a population. This type of research is usually descriptive or correlational because there is no experimental control over variables. However, new statistical methods like structural equation modeling are being used to test for potential causal relationships in this type of data. Some psychologists, including Dr. David O. Sears, have criticized social psychological research for relying too heavily on studies conducted on university undergraduates in academic settings. Over 70% of experiments in Sears' study used North American undergraduates as subjects, a subset of the population that may not be representative of the population as a whole.

Regardless of which method is chosen, the results matter: they are used to evaluate the research hypothesis, either confirming or failing to support the original prediction. Social psychologists rely on two kinds of checks on their results. Statistical and probability testing defines a significant finding as one that has only a small likelihood, conventionally 5% or less, of being due to chance. Replications are important to ensure that the result is valid and not due to chance or to some feature of a particular sample. False positive conclusions, often resulting from the pressure to publish or the author's own confirmation bias, are a hazard in the field.
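As a rough illustration of such a significance test, and not any specific study's analysis, the following sketch compares two invented sets of scores with a two-sample t-test and checks the result against the conventional 5% threshold; it assumes SciPy is available.

from scipy import stats   # assumes SciPy is installed

# Hypothetical scores from two experimental conditions (invented data).
control   = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7, 4.2, 4.5]
treatment = [5.2, 4.9, 5.6, 4.8, 5.4, 5.1, 4.7, 5.5]

t_stat, p_value = stats.ttest_ind(treatment, control)
alpha = 0.05                       # conventional significance threshold
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
if p_value < alpha:
    print("Significant at the 5% level, but a single study is not enough:")
    print("the finding should be replicated with independent samples.")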

Ethics

The goal of social psychology is to understand cognition and behavior as they naturally occur in a social context, but the very act of observing people can influence and alter their behavior. For this reason, many social psychology experiments utilize deception to conceal or distort certain aspects of the study. Deception may include false cover stories, false participants (known as confederates or stooges), false feedback given to the participants, and so on.

The practice of deception has been challenged by some psychologists who maintain that deception under any circumstances is unethical, and that other research strategies (e.g., role-playing) should be used instead. Unfortunately, research has shown that role-playing studies do not produce the same results as deception studies and this has cast doubt on their validity. In addition to deception, experimenters have at times put people into potentially uncomfortable or embarrassing situations (e.g., the Milgram experiment), and this has also been criticized for ethical reasons. 

To protect the rights and well-being of research participants, and at the same time discover meaningful results and insights into human behavior, virtually all social psychology research must pass an ethical review process. At most colleges and universities, this is conducted by an ethics committee or Institutional Review Board. This group examines the proposed research to make sure that no harm is likely to be done to the participants, and that the study's benefits outweigh any possible risks or discomforts to people taking part in the study.

Furthermore, a process of informed consent is often used to make sure that volunteers know what will happen in the experiment and understand that they are allowed to quit the experiment at any time. A debriefing is typically done at the experiment's conclusion in order to reveal any deceptions used and generally make sure that the participants are unharmed by the procedures. Today, most research in social psychology involves no more risk of harm than can be expected from routine psychological testing or normal daily activities.

Replication crisis

Social psychology has recently found itself at the center of a "replication crisis" due to some research findings proving difficult to replicate. Replication failures are not unique to social psychology and are found in all fields of science. However, several factors have combined to put social psychology at the center of the current controversy. 

Firstly, questionable research practices (QRP) have been identified as common in the field. Such practices, while not necessarily intentionally fraudulent, involve converting undesired statistical outcomes into desired outcomes via the manipulation of statistical analyses, sample size or data management, typically to convert non-significant findings into significant ones. Some studies have suggested that at least mild versions of QRP are highly prevalent. One of Daryl Bem's critics in the "feeling the future" controversy has suggested that the evidence for precognition in that study could, at least in part, be attributed to QRP.

Secondly, social psychology has found itself at the center of several recent scandals involving outright fraudulent research, most notably the admitted data fabrication by Diederik Stapel, as well as allegations against others. However, most scholars acknowledge that fraud is perhaps a lesser contributor to the replication crisis.

Third, several effects in social psychology were found to be difficult to replicate even before the current replication crisis. For example, the scientific journal Judgment and Decision Making has published several studies over the years that fail to provide support for the unconscious thought theory. Replications appear particularly difficult when research trials are pre-registered and conducted by research groups not highly invested in the theory in question.

These three elements together have resulted in renewed attention to replication, supported by Daniel Kahneman. Scrutiny of many effects has shown that several core beliefs are hard to replicate. A recent special edition of the journal Social Psychology focused on replication studies, and a number of previously held beliefs were found to be difficult to replicate. A 2012 special edition of the journal Perspectives on Psychological Science also focused on issues, ranging from publication bias to null-aversion, that contribute to the replication crisis in psychology.

It is important to note that this replication crisis does not mean that social psychology is unscientific. Rather, this process is a healthy, if sometimes acrimonious, part of the scientific process, in which old ideas or those that cannot withstand careful scrutiny are pruned. The consequence is that some areas of social psychology once considered solid, such as social priming, have come under increased scrutiny due to failed replications.

Famous experiments

The Milgram experiment: The experimenter (E) persuades the participant (T) to give what the participant believes are painful electric shocks to another participant (L), who is actually an actor. Many participants continued to give shocks despite pleas for mercy from the actor.

The Asch conformity experiments demonstrated the power of conformity in small groups with a line length estimation task that was designed to be extremely easy. In well over a third of the trials, participants conformed to the majority, who had been instructed to provide incorrect answers, even though the majority judgment was clearly wrong. Seventy-five percent of the participants conformed at least once during the experiment. Additional manipulations to the experiment showed participant conformity decreased when at least one other individual failed to conform, but increased when the individual began conforming or withdrew from the experiment. Also, participant conformity increased substantially as the number of incorrect individuals increased from one to three, and remained high as the incorrect majority grew. Participants with three incorrect opponents made mistakes 31.8% of the time, while those with one or two incorrect opponents made mistakes only 3.6% and 13.6% of the time, respectively.

Muzafer Sherif's Robbers' Cave Experiment divided boys into two competing groups to explore how much hostility and aggression would emerge. Sherif's explanation of the results became known as realistic group conflict theory, because the intergroup conflict was induced through competition over resources. Inducing cooperation and superordinate goals later reversed this effect. 

In Leon Festinger's cognitive dissonance experiment, participants were asked to perform a boring task. They were divided into two groups and given two different pay scales. At the study's end, some participants were paid $1 to say that they enjoyed the task, and another group of participants was paid $20 to tell the same lie. The first group ($1) later reported liking the task better than the second group ($20). Festinger's explanation was that being paid only $1 was not a sufficient incentive for lying, so those participants experienced dissonance. They could overcome that dissonance only by justifying their lies, that is, by changing their previously unfavorable attitudes about the task. Being paid $20 provides a sufficient reason for the lie, and therefore produces no dissonance.

One of the most notable experiments in social psychology was the Milgram experiment, which studied how far people would go to obey an authority figure. Following the events of The Holocaust in World War II, the experiment showed that (most) normal American citizens were capable of following orders from an authority even when they believed they were causing an innocent person to suffer.

Albert Bandura's Bobo doll experiment demonstrated how aggression is learned by imitation. This set of studies fueled debates regarding media violence which continue to be waged among scholars.

Wednesday, January 23, 2019

Agent-based model in biology

From Wikipedia, the free encyclopedia
 
Agent-based models have many applications in biology, primarily due to the characteristics of the modeling method. Agent-based modeling is a rule-based, computational modeling methodology that focuses on rules and interactions among the individual components or the agents of the system. The goal of this modeling method is to generate populations of the system components of interest and simulate their interactions in a virtual world. Agent-based models start with rules for behavior and seek to reconstruct, through computational instantiation of those behavioral rules, the observed patterns of behavior. Several of the characteristics of agent-based models important to biological studies include:
  1. Modular structure: The behavior of an agent-based model is defined by the rules of its agents. Existing agent rules can be modified or new agents can be added without having to modify the entire model.
  2. Emergent properties: Through the use of individual agents that interact locally according to rules of behavior, agent-based models produce a synergy that leads to a higher-level whole with much more intricate behavior than that of any individual agent.
  3. Abstraction: Either by excluding non-essential details or when details are not available, agent-based models can be constructed in the absence of complete knowledge of the system under study. This allows the model to be as simple and verifiable as possible.
  4. Stochasticity: Biological systems exhibit behavior that appears to be random. The probability of a particular behavior can be determined for a system as a whole and then be translated into rules for the individual agents.
Before the agent-based model can be developed, one must choose the appropriate software or modeling toolkit to be used. Madey and Nikolai provide an extensive list of toolkits in their paper "Tools of the Trade: A Survey of Various Agent Based Modeling Platforms". The paper seeks to provide users with a method of choosing a suitable toolkit by examining five characteristics across the spectrum of toolkits: the programming language required to create the model, the required operating system, availability of user support, the software license type, and the intended toolkit domain. Some of the more commonly used toolkits include Swarm, NetLogo, Repast, and Mason. Listed below are summaries of several articles describing agent-based models that have been employed in biological studies. The summaries will provide a description of the problem space, an overview of the agent-based model and the agents involved, and a brief discussion of the model results.
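Whatever the toolkit, agent-based models share the same basic loop: create a population of agents, apply each agent's local rules (with some stochasticity) at every time step, and observe the population-level outcome. The following plain-Python skeleton is a generic, illustrative sketch of that loop, not code from any of the toolkits or studies discussed here; the agents, rules and parameters are all invented.

import random

class Agent:
    """A minimal agent with one state variable and simple stochastic rules."""
    def __init__(self):
        self.energy = 10.0

    def step(self, neighbours):
        # Rule 1: interacting with neighbours costs a little energy.
        self.energy -= 0.1 * len(neighbours)
        # Rule 2 (stochasticity): with some probability, find food.
        if random.random() < 0.3:
            self.energy += 1.0

def run(n_agents=100, n_steps=50):
    agents = [Agent() for _ in range(n_agents)]
    for _ in range(n_steps):
        for agent in agents:
            neighbours = random.sample(agents, 3)   # toy interaction topology
            agent.step(neighbours)
    # Emergent, population-level property: the mean energy after the run.
    return sum(a.energy for a in agents) / n_agents

if __name__ == "__main__":
    random.seed(1)
    print("mean energy after simulation:", round(run(), 2))

Each article summarized below fills in this skeleton with domain-specific agents (beetles, importers, aphids, plants, microbes) and empirically motivated rules.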

Forest insect infestations

In the paper titled "Exploring Forest Management Practices Using an Agent-Based Model of Forest Insect Infestations", an agent-based model was developed to simulate attack behavior of the mountain pine beetle, Dendroctonus ponderosae, (MPB) in order to evaluate how different harvesting policies influence spatial characteristics of the forest and spatial propagation of the MPB infestation over time. About two-thirds of the land in British Columbia, Canada is covered by forests that are constantly being modified by natural disturbances such as fire, disease, and insect infestation. Forest resources make up approximately 15% of the province’s economy, so infestations caused by insects such as the MPB can have significant impacts on the economy. The MPB outbreaks are considered a major natural disturbance that can result in widespread mortality of the lodgepole pine tree, one of the most abundant commercial tree species in British Columbia. Insect outbreaks have resulted in the death of trees over areas of several thousand square kilometers.

The agent-based model developed for this study was designed to simulate the MPB attack behavior in order to evaluate how management practices influence the spatial distribution and patterns of insect population and their preferences for attacked and killed trees. Three management strategies were considered by the model: 1) no management, 2) sanitation harvest and 3) salvage harvest. In the model, the Beetle Agent represented the MPB behavior; the Pine Agent represented the forest environment and tree health evolution; the Forest Management Agent represented the different management strategies. The Beetle Agent follows a series of rules to decide where to fly within the forest and to select a healthy tree to attack, feed, and breed. The MPB typically kills host trees in its natural environment in order to successfully reproduce. The beetle larvae feed on the inner bark of mature host trees, eventually killing them. In order for the beetles to reproduce, the host tree must be sufficiently large and have thick inner bark. The MPB outbreaks end when the food supply decreases to the point that there is not enough to sustain the population or when climatic conditions become unfavorable for the beetle. The Pine Agent simulates the resistance of the host tree, specifically the lodgepole pine, and monitors the state and attributes of each stand of trees. At some point in the MPB attack, the number of beetles per tree reaches the host tree capacity. When this point is reached, the beetles release a chemical to direct beetles to attack other trees. The Pine Agent models this behavior by calculating the beetle population density per stand and passing the information to the Beetle Agents. The Forest Management Agent was used, at the stand level, to simulate two common silviculture practices (sanitation and salvage) as well as the strategy where no management practice was employed. With the sanitation harvest strategy, if a stand has an infestation rate greater than a set threshold, the stand is removed, as is any healthy neighbor stand whose average tree size exceeds a set threshold. For the salvage harvest strategy, a stand is removed even if it is not under MPB attack when a predetermined number of neighboring stands are under MPB attack.
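The paper's actual rules and parameter values are not reproduced here, but the division of labor among the three agents can be sketched roughly as follows; the thresholds, probabilities and neighborhood rule in this toy version are purely illustrative assumptions.

import random

class PineStand:
    def __init__(self):
        self.infestation = 0.0       # fraction of trees attacked
        self.removed = False

class BeetleAgent:
    def attack(self, stand):
        # Toy rule: beetles raise the infestation level of a chosen stand.
        if not stand.removed:
            stand.infestation = min(1.0, stand.infestation + random.uniform(0.0, 0.2))

class ManagementAgent:
    def __init__(self, strategy, threshold=0.5):
        self.strategy = strategy     # "none", "sanitation" or "salvage"
        self.threshold = threshold   # illustrative, not the paper's value

    def harvest(self, stands):
        for i, stand in enumerate(stands):
            if self.strategy == "sanitation" and stand.infestation > self.threshold:
                stand.removed = True
            elif self.strategy == "salvage":
                # Toy rule: remove a stand if enough of its neighbours are heavily attacked.
                neighbours = stands[max(0, i - 1):i + 2]
                if sum(n.infestation > self.threshold for n in neighbours) >= 2:
                    stand.removed = True

random.seed(7)
stands = [PineStand() for _ in range(20)]
beetles = [BeetleAgent() for _ in range(5)]
manager = ManagementAgent("sanitation")
for year in range(5):                # one step per year, as in the study
    for b in beetles:
        b.attack(random.choice(stands))
    manager.harvest(stands)
print("stands removed:", sum(s.removed for s in stands))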

The study considered a forested area in the North-Central Interior of British Columbia of approximately 560 hectares. The area consisted primarily of lodgepole pine with smaller proportions of Douglas fir and white spruce. The model was executed for five time steps, each step representing a single year. Thirty simulation runs were conducted for each forest management strategy considered. The results of the simulation showed that the highest overall MPB infestation occurred when no management strategy was employed. The results also showed that the two harvesting strategies differed in effectiveness, with one producing a 25% reduction in the number of forest stands killed by the MPB and the other a 19% reduction. In summary, the results show that the model can be used as a tool to build forest management policies.

Invasive species

Invasive species refers to "non-native" plants and animals that adversely affect the environments they invade. The introduction of invasive species may have environmental, economic, and ecological implications. In the paper titled "An Agent-Based Model of Border Enforcement for Invasive Species Management", an agent-based model is presented that was developed to evaluate the impacts of port-specific and importer-specific enforcement regimes for a given agricultural commodity that presents invasive species risk. Ultimately, the goal of the study was to improve the allocation of enforcement resources and to provide a tool to policy makers to answer further questions concerning border enforcement and invasive species risk.

The agent-based model developed for the study considered three types of agents: invasive species, importers, and border enforcement agents. In the model, the invasive species can only react to their surroundings, while the importers and border enforcement agents are able to make their own decisions based on their own goals and objectives. The invasive species has the ability to determine if it has been released in an area containing the target crop, and to spread to adjacent plots of the target crop. The model incorporates spatial probability maps that are used to determine if an invasive species becomes established. The study focused on shipments of broccoli from Mexico into California through the ports of entry Calexico, California and Otay Mesa, California. The selected invasive species of concern was the crucifer flea beetle (Phyllotreta cruciferae). California is by far the largest producer of broccoli in the United States and so the concern and potential impact of an invasive species introduction through the chosen ports of entry is significant. The model also incorporated a spatially explicit damage function that was used to model invasive species damage in a realistic manner. Agent-based modeling provides the ability to analyze the behavior of heterogeneous actors, so three different types of importers were considered that differed in terms of commodity infection rates (high, medium, and low), pretreatment choice, and cost of transportation to the ports. The model gave predictions on inspection rates for each port of entry and importer and determined the success rate of border agent inspection, not only for each port and importer but also for each potential level of pretreatment (no pretreatment, level one, level two, and level three).
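A rough, purely illustrative sketch of the importer/inspection interaction described above might look like the following; the decision rule, penalty figures and infection rates are invented, not taken from the paper.

import random

class Importer:
    def __init__(self, infection_rate, pretreat_cost):
        self.infection_rate = infection_rate   # high/medium/low, as a probability
        self.pretreat_cost = pretreat_cost
        self.pretreats = False

    def respond_to(self, inspection_rate):
        # Toy decision rule: pre-treat once expected penalties outweigh the cost.
        expected_penalty = inspection_rate * self.infection_rate * 100
        self.pretreats = expected_penalty > self.pretreat_cost

def simulate(inspection_rate, importers, shipments=1000):
    """Return the number of infested shipments that slip past inspection."""
    escaped = 0
    for _ in range(shipments):
        imp = random.choice(importers)
        imp.respond_to(inspection_rate)
        rate = imp.infection_rate * (0.1 if imp.pretreats else 1.0)
        infested = random.random() < rate
        inspected = random.random() < inspection_rate
        if infested and not inspected:
            escaped += 1
    return escaped

random.seed(0)
importers = [Importer(0.05, 2.0), Importer(0.02, 1.0), Importer(0.01, 0.5)]
for insp in (0.1, 0.3, 0.6):
    print("inspection rate", insp, "-> escaped infested shipments:",
          simulate(insp, importers))

Even in this toy version, raising the inspection rate changes importer behavior (pre-treatment becomes worthwhile) and reduces the number of infested shipments that escape, which is the qualitative pattern the study reports.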

The model was implemented and run in NetLogo, version 3.1.5. Spatial information on the location of the ports of entry, major highways, and transportation routes was included in the analysis, as well as a map of California broccoli crops layered with invasive species establishment probability maps. BehaviorSpace, a software tool integrated with NetLogo, was used to test the effects of different parameters (e.g. shipment value, pretreatment cost) in the model. On average, 100 iterations were calculated at each level of the parameter being varied, where an iteration represented a one-year run.

The results of the model showed that as inspection efforts increase, importers exercise greater due care, such as pre-treatment of shipments, and the total monetary loss of California crops decreases. The model showed that importers respond to an increase in inspection effort in different ways. Some importers responded to an increased inspection rate by increasing pre-treatment effort, while others chose to avoid shipping to a specific port, or shopped for another port. An important outcome is that the model can provide recommendations to policy makers about the point at which importers may start to shop for ports, for example the inspection rate at which port shopping begins and which importers, given their level of pest risk or transportation cost, are likely to make these changes. Another interesting outcome of the model is that when inspectors were not able to learn to respond to an importer with previously infested shipments, damage to California broccoli crops was estimated to be $150 million. However, when inspectors were able to increase inspection rates for importers with previous violations, damage to the California broccoli crops was reduced by approximately 12%. The model provides a mechanism to predict the introduction of invasive species from agricultural imports and their likely damage. Equally important, the model provides policy makers and border control agencies with a tool that can be used to determine the best allocation of inspection resources.

Aphid population dynamics

In the article titled "Aphid Population Dynamics in Agricultural Landscapes: An Agent-based Simulation Model", an agent-based model is presented to study the population dynamics of the bird cherry-oat aphid, Rhopalosiphum padi (L.). The study was conducted in a five square kilometer region of North Yorkshire, a county located in the Yorkshire and the Humber region of England. The agent-based modeling method was chosen because of its focus on the behavior of the individual agents rather than the population as a whole. The authors propose that traditional models that focus on populations as a whole do not take into account the complexity of the concurrent interactions in ecosystems, such as reproduction and competition for resources which may have significant impacts on population trends. The agent-based modeling approach also allows modelers to create more generic and modular models that are more flexible and easier to maintain than modeling approaches that focus on the population as a whole. Other proposed advantages of agent-based models include realistic representation of a phenomenon of interest due to the interactions of a group of autonomous agents, and the capability to integrate quantitative variables, differential equations, and rule based behavior into the same model.

The model was implemented in the modeling toolkit Repast using the Java programming language. The model was run in daily time steps and focused on the autumn and winter seasons. Input data for the model included habitat data, daily minimum, maximum, and mean temperatures, and wind speed and direction. For the Aphid agents, age, position, and morphology (alate or apterous) were considered. Age ranged from 0.00 to 2.00, with 1.00 being the point at which the agent becomes an adult. Reproduction by the Aphid agents is dependent on age, morphology, and daily minimum, maximum, and mean temperatures. Once nymphs hatch, they remain in the same location as their parents. The morphology of the nymphs is related to population density and the nutrient quality of the aphid's food source. The model also considered mortality among the Aphid agents, which is dependent on age, temperatures, and quality of habitat. The speed at which an Aphid agent ages is determined by the daily minimum, maximum, and mean temperatures. The model considered movement of the Aphid agents to occur in two separate phases, a migratory phase and a foraging phase, both of which affect the overall population distribution.
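The daily update of an Aphid agent can be sketched roughly as follows; the linear ageing rule, mortality probabilities and reproduction rates here are invented placeholders rather than the functions used in the published model.

import random

class Aphid:
    def __init__(self, age=0.0, alate=False):
        self.age = age          # 0.0 (newborn) to 2.0; adult at 1.0
        self.alate = alate      # winged (alate) or wingless (apterous)
        self.alive = True

    def daily_step(self, mean_temp, density):
        # Ageing speeds up with temperature (illustrative linear rule).
        self.age += max(0.0, 0.02 + 0.004 * mean_temp)
        # Mortality rises with age and with cold (illustrative probabilities).
        if random.random() < 0.01 * self.age + max(0.0, (2 - mean_temp) * 0.02):
            self.alive = False
            return []
        # Adults reproduce; crowding pushes offspring toward the winged form.
        if self.age >= 1.0 and random.random() < 0.3:
            return [Aphid(alate=(density > 0.5))]
        return []

random.seed(3)
population = [Aphid(alate=True) for _ in range(100)]
for day, temp in enumerate([12, 10, 8, 6, 4, 3, 2] * 10):   # a cooling autumn
    density = min(1.0, len(population) / 500)
    offspring = []
    for aphid in population:
        offspring += aphid.daily_step(temp, density)
    population = [a for a in population if a.alive] + offspring
print("aphids after", day + 1, "days:", len(population))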

The study started the simulation run with an initial population of 10,000 alate aphids distributed across a grid of 25 meter cells. The simulation results showed that there were two major population peaks, the first in early autumn due to an influx of alate immigrants and the second due to lower temperatures later in the year and a lack of immigrants. Ultimately, it is the goal of the researchers to adapt this model to simulate broader ecosystems and animal types.

Aquatic population dynamics

In the article titled "Exploring Multi-Agent Systems In Aquatic Population Dynamics Modeling", a model is proposed to study the population dynamics of two species of macrophytes. Aquatic plants play a vital role in the ecosystems in which they live as they may provide shelter and food for other aquatic organisms. However, they may also have harmful impacts such as the excessive growth of non-native plants or eutrophication of the lakes in which they live leading to anoxic conditions. Given these possibilities, it is important to understand how the environment and other organisms affect the growth of these aquatic plants to allow mitigation or prevention of these harmful impacts.

Potamogeton pectinatus is one of the aquatic plant agents in the model. It is an annual plant that absorbs nutrients from the soil and reproduces through root tubers and rhizomes. Reproduction of the plant is not impacted by water flow, but can be influenced by animals, other plants, and humans. The plant can grow up to two meters tall, which is a limiting condition because it can only grow in certain water depths, and most of its biomass is found at the top of the plant in order to capture the most sunlight possible. The second plant agent in the model is Chara aspera, also a rooted aquatic plant. One major difference between the two plants is that the latter reproduces through the use of very small seeds called oospores and bulbils, which are spread via the flow of water. Chara aspera grows only up to 20 cm and requires very good light conditions as well as good water quality, all of which are limiting factors on the growth of the plant. Chara aspera has a higher growth rate than Potamogeton pectinatus but a much shorter life span. The model also considered environmental and animal agents. Environmental agents considered included water flow, light penetration, and water depth. Flow conditions, although not of high importance to Potamogeton pectinatus, directly impact the seed dispersal of Chara aspera. Flow conditions affect the direction as well as the distance the seeds will be distributed. Light penetration strongly influences Chara aspera, as it requires high water quality. The extinction coefficient (EC) is a measure of light attenuation in water; as EC increases, the growth rate of Chara aspera decreases. Finally, depth is important to both species of plants. As water depth increases, light penetration decreases, making it difficult for either species to survive beyond certain depths.
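The role of the extinction coefficient and depth can be illustrated with the standard Beer-Lambert attenuation relation, I(z) = I0 * exp(-EC * z). The light-requirement thresholds in the sketch below are invented for illustration and are not the study's parameter values.

import math

def light_at_depth(surface_light, extinction_coeff, depth_m):
    """Beer-Lambert attenuation: I(z) = I0 * exp(-EC * z)."""
    return surface_light * math.exp(-extinction_coeff * depth_m)

def can_grow(species, extinction_coeff, depth_m, surface_light=1.0):
    # Illustrative thresholds only; the study's parameter values are not given here.
    needs = {"Chara aspera": 0.50,             # needs clear, shallow water
             "Potamogeton pectinatus": 0.15}   # tolerates deeper or murkier water
    return light_at_depth(surface_light, extinction_coeff, depth_m) >= needs[species]

for ec in (0.5, 1.5):
    for depth in (0.5, 1.55):                  # 1.55 m is Lake Veluwe's average depth
        print(f"EC={ec}, depth={depth} m:",
              "Chara" if can_grow("Chara aspera", ec, depth) else "-",
              "Potamogeton" if can_grow("Potamogeton pectinatus", ec, depth) else "-")

Under these toy thresholds, Chara aspera drops out first as EC or depth increases, while Potamogeton pectinatus persists in deeper or murkier water, mirroring the qualitative difference between the two species described above.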

The area of interest in the model was Lake Veluwe, a relatively shallow lake in the Netherlands with an average depth of 1.55 meters and an area of about 30 square kilometers. The lake is under eutrophication stress, which means that nutrients are not a limiting factor for either of the plant agents in the model. The initial positions of the plant agents were randomly determined. The model was implemented in the Repast software package and executed to simulate the growth and decay of the two plant agents, taking into account the environmental agents discussed above as well as interactions with other plant agents. The results show that the simulated population distribution of Chara aspera has a spatial pattern very similar to GIS maps of observed distributions. The authors conclude that the agent rules developed in the study are reasonable for simulating the spatial pattern of macrophyte growth in this particular lake.

Bacteria aggregation leading to biofilm formation

In the article titled "iDynoMiCS: next-generation individual-based modelling of biofilms", an agent-based model is presented that models the colonisation of bacteria onto a surface, leading to the formation of biofilms. The purpose of iDynoMiCS (standing for individual-based Dynamics of Microbial Communities Simulator) is to simulate the growth of populations and communities of individual microbes (small unicellular organisms such as bacteria, archaea and protists) that compete for space and resources in biofilms immersed in aquatic environments. iDynoMiCS can be used to seek to understand how individual microbial dynamics lead to emergent population- or biofilm-level properties and behaviours. Examining such formations is important in soil and river studies, dental hygiene studies, infectious disease and medical implant related infection research, and for understanding biocorrosion. An agent-based modelling paradigm was employed to make it possible to explore how each individual bacterium, of a particular species, contributes to the development of the biofilm. The initial illustration of iDynoMiCS considered how environmentally fluctuating oxygen availability affects the diversity and composition of a community of denitrifying bacteria that induce the denitrification pathway under anoxic or low oxygen conditions. The study explores the hypothesis that the existence of diverse strategies of denitrification in an environment can be explained by solely assuming that faster response incurs a higher cost. The agent-based model suggests that if metabolic pathways can be switched without cost the faster the switching the better. However, where faster switching incurs a higher cost, there is a strategy with optimal response time for any frequency of environmental fluctuations. This suggests that different types of denitrifying strategies win in different biological environments. Since this introduction the applications of iDynoMiCS continues to increase: a recent exploration of the plasmid invasion in biofilms being one example. This study explored the hypothesis that poor plasmid spread in biofilms is caused by a dependence of conjugation on the growth rate of the plasmid donor agent. Through simulation, the paper suggests that plasmid invasion into a resident biofilm is only limited when plasmid transfer depends on growth. Sensitivity analysis techniques were employed that suggests parameters relating to timing (lag before plasmid transfer between agents) and spatial reach are more important for plasmid invasion into a biofilm than the receiving agents growth rate or probability of segregational loss. Further examples that use iDynoMiCS continue to be published, including use of iDynoMiCS in modelling of a Pseudomonas aeruginosa biofilm with glucose substrate.

iDynoMiCS has been developed by an international team of researchers to provide a common platform for further development of individual-based models of microbial biofilms and the like. The model was originally the result of years of work by Laurent Lardon, Brian Merkey, and Jan-Ulrich Kreft, with code contributions from Joao Xavier. With additional funding from the National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs) in 2013, the development of iDynoMiCS as a tool for biological exploration continues apace, with new features added when appropriate. From its inception, the team has committed to releasing iDynoMiCS as an open-source platform, encouraging collaborators to develop additional functionality that can then be merged into the next stable release. iDynoMiCS is implemented in the Java programming language, with MATLAB and R scripts provided to analyse results. Biofilm structures formed in a simulation can be viewed as a movie using the POV-Ray files generated as the simulation runs.

Mammary stem cell enrichment following irradiation during puberty

Experiments have shown that exposing pubertal mammary glands to ionizing radiation results in an increase in the ratio of mammary stem cells in the gland. This is important because stem cells are thought to be key targets for cancer initiation by ionizing radiation: they have the greatest long-term proliferative potential, and mutagenic events persist in multiple daughter cells. Additionally, epidemiological data show that children exposed to ionizing radiation have a substantially greater breast cancer risk than adults. These experiments thus prompted questions about the underlying mechanism for the increase in mammary stem cells following radiation. In the research article titled "Irradiation of Juvenile, but not Adult, Mammary Gland Increases Stem Cell Self-Renewal and Estrogen Receptor Negative Tumors", two agent-based models were developed and used in parallel with in vivo and in vitro experiments to evaluate cell inactivation, dedifferentiation via epithelial-mesenchymal transition (EMT), and self-renewal (symmetric division) as mechanisms by which radiation could increase stem cell numbers.

The first agent-based model is a multiscale model of mammary gland development starting with a rudimentary mammary ductal tree at the onset of puberty (during active proliferation) all the way to a full mammary gland at adulthood (when there is little proliferation). The model consists of millions of agents, with each agent representing a mammary stem cell, a progenitor cell, or a differentiated cell in the breast. Simulations were first run on the Lawrence Berkeley National Laboratory Lawrencium supercomputer to parameterize and benchmark the model against a variety of in vivo mammary gland measurements. The model was then used to test the three different mechanisms to determine which one led to simulation results that best matched the in vivo experiments. Surprisingly, radiation-induced cell inactivation (death) did not contribute to increased stem cell frequency in the model, regardless of the dose delivered. Instead, the model revealed that the combination of increased self-renewal and cell proliferation during puberty led to stem cell enrichment. In contrast, epithelial-mesenchymal transition in the model increased stem cell frequency not only in pubertal mammary glands but also in adult glands. This latter prediction, however, contradicted the in vivo data: irradiation of adult mammary glands did not lead to increased stem cell frequency. These simulations therefore pointed to self-renewal as the primary mechanism behind the pubertal stem cell increase.

To further evaluate self-renewal as the mechanism, a second agent-based model was created to simulate the growth dynamics of human mammary epithelial cells (containing stem/progenitor and differentiated cell subpopulations) in vitro after irradiation. By comparing the simulation results with data from the in vitro experiments, the second agent-based model further confirmed that cells must extensively proliferate to observe a self-renewal dependent increase in stem/progenitor cell numbers after irradiation.
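The logic of that conclusion can be caricatured in a few lines of Java: only when cells divide many times does an elevated probability of symmetric self-renewal translate into a markedly higher stem-cell fraction. The starting composition and probabilities below are hypothetical and serve only to illustrate the interaction between proliferation and self-renewal.

// Toy illustration: symmetric division adds a stem-cell daughter, asymmetric
// division adds a differentiated daughter. All numbers are assumptions.
import java.util.Random;

public class SelfRenewalSketch {
    static final Random RNG = new Random(7);

    // Simulate 'rounds' of division and return the final stem-cell fraction.
    static double stemFraction(int rounds, double pSymmetric) {
        long stem = 100, other = 900;               // assumed starting composition
        for (int r = 0; r < rounds; r++) {
            long newStem = 0, newOther = 0;
            for (long i = 0; i < stem; i++) {
                if (RNG.nextDouble() < pSymmetric) newStem++; else newOther++;
            }
            stem += newStem;
            other += newOther;
        }
        return (double) stem / (stem + other);
    }

    public static void main(String[] args) {
        // Many proliferative rounds (puberty) versus few (adult gland).
        System.out.printf("puberty, elevated self-renewal:  %.3f%n", stemFraction(20, 0.25));
        System.out.printf("puberty, baseline self-renewal:  %.3f%n", stemFraction(20, 0.10));
        System.out.printf("adult, elevated self-renewal:    %.3f%n", stemFraction(2, 0.25));
    }
}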

The combination of the two agent-based models and the in vitro/in vivo experiments provides insight into why children exposed to ionizing radiation have a substantially greater breast cancer risk than adults. Together, they support the hypothesis that the breast is susceptible to a transient increase in stem cell self-renewal when exposed to radiation during puberty, which primes the adult tissue to develop cancer decades later.

Agent-based model

From Wikipedia, the free encyclopedia
 
An agent-based model (ABM) is a class of computational models for simulating the actions and interactions of autonomous agents (both individual and collective entities, such as organizations or groups) with a view to assessing their effects on the system as a whole. It combines elements of game theory, complex systems, emergence, computational sociology, multi-agent systems, and evolutionary programming. Monte Carlo methods are used to introduce randomness. Particularly within ecology, ABMs are also called individual-based models (IBMs), and individuals within IBMs may be simpler than fully autonomous agents within ABMs. A review of recent literature on individual-based models, agent-based models, and multi-agent systems shows that ABMs are used in non-computing-related scientific domains including biology, ecology and social science. Agent-based modeling is related to, but distinct from, the concept of multi-agent systems or multi-agent simulation in that the goal of ABM is to search for explanatory insight into the collective behavior of agents obeying simple rules, typically in natural systems, rather than to design agents or solve specific practical or engineering problems.
 
Agent-based models are a kind of microscale model that simulate the simultaneous operations and interactions of multiple agents in an attempt to re-create and predict the appearance of complex phenomena. The process is one of emergence from the lower (micro) level of systems to a higher (macro) level. As such, a key notion is that simple behavioral rules generate complex behavior. This principle, known as K.I.S.S. ("Keep it simple, stupid"), is extensively adopted in the modeling community. Another central tenet is that the whole is greater than the sum of the parts. Individual agents are typically characterized as boundedly rational, presumed to be acting in what they perceive as their own interests, such as reproduction, economic benefit, or social status, using heuristics or simple decision-making rules. ABM agents may experience "learning", adaptation, and reproduction.

Most agent-based models are composed of: (1) numerous agents specified at various scales (typically referred to as agent-granularity); (2) decision-making heuristics; (3) learning rules or adaptive processes; (4) an interaction topology; and (5) an environment. ABMs are typically implemented as computer simulations, either as custom software, or via ABM toolkits, and this software can be then used to test how changes in individual behaviors will affect the system's emerging overall behavior.
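As an informal illustration of those ingredients, the following minimal Java sketch wires together agents, a decision heuristic, an interaction topology (a ring of neighbours), and an environment stepped in discrete time. It is a generic toy, not the API of any particular ABM toolkit.

// Minimal skeleton of an agent-based simulation: agents, a decision heuristic,
// a ring interaction topology, and a stepped environment loop.
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class MinimalAbm {
    static class Agent {
        double state;
        Agent(double s) { state = s; }

        // Decision heuristic: move part-way towards the mean of the two ring neighbours.
        void decide(Agent left, Agent right) {
            state += 0.5 * ((left.state + right.state) / 2.0 - state);
        }
    }

    public static void main(String[] args) {
        Random rng = new Random(1);
        int n = 50;
        List<Agent> agents = new ArrayList<>();
        for (int i = 0; i < n; i++) agents.add(new Agent(rng.nextDouble()));

        // Environment loop: each step, every agent interacts with its topological
        // neighbours (updates are applied in place for brevity).
        for (int step = 0; step < 100; step++) {
            for (int i = 0; i < n; i++) {
                Agent left = agents.get((i - 1 + n) % n);
                Agent right = agents.get((i + 1) % n);
                agents.get(i).decide(left, right);
            }
        }
        System.out.println("final state of agent 0: " + agents.get(0).state);
    }
}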

History

The idea of agent-based modeling was developed as a relatively simple concept in the late 1940s. Since it requires computation-intensive procedures, it did not become widespread until the 1990s.

Early developments

The history of the agent-based model can be traced back to the Von Neumann machine, a theoretical machine capable of reproduction. The device von Neumann proposed would follow precisely detailed instructions to fashion a copy of itself. The concept was then built upon by von Neumann's friend Stanislaw Ulam, also a mathematician; Ulam suggested that the machine be built on paper, as a collection of cells on a grid. The idea intrigued von Neumann, who drew it up—creating the first of the devices later termed cellular automata. Another advance was introduced by the mathematician John Conway. He constructed the well-known Game of Life. Unlike von Neumann's machine, Conway's Game of Life operated by tremendously simple rules in a virtual world in the form of a 2-dimensional checkerboard.
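Conway's rules are simple enough to state in a few lines of code: a live cell survives with two or three live neighbours, and a dead cell becomes alive with exactly three. The sketch below implements one synchronous update of a small finite grid.

// Conway's Game of Life on a finite grid: one synchronous update step.
public class GameOfLife {
    static boolean[][] step(boolean[][] grid) {
        int rows = grid.length, cols = grid[0].length;
        boolean[][] next = new boolean[rows][cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                int live = 0;
                for (int dr = -1; dr <= 1; dr++) {
                    for (int dc = -1; dc <= 1; dc++) {
                        if (dr == 0 && dc == 0) continue;
                        int nr = r + dr, nc = c + dc;
                        if (nr >= 0 && nr < rows && nc >= 0 && nc < cols && grid[nr][nc]) live++;
                    }
                }
                // Survival with 2 or 3 neighbours; birth with exactly 3.
                next[r][c] = grid[r][c] ? (live == 2 || live == 3) : (live == 3);
            }
        }
        return next;
    }

    public static void main(String[] args) {
        boolean[][] grid = new boolean[5][5];
        grid[2][1] = grid[2][2] = grid[2][3] = true;   // a "blinker" oscillator
        grid = step(grid);
        for (boolean[] row : grid) {
            StringBuilder sb = new StringBuilder();
            for (boolean cell : row) sb.append(cell ? '#' : '.');
            System.out.println(sb);
        }
    }
}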

1970s and 1980s: the first models

One of the earliest agent-based models in concept was Thomas Schelling's segregation model, which was discussed in his paper "Dynamic Models of Segregation" in 1971. Though Schelling originally used coins and graph paper rather than computers, his models embodied the basic concept of agent-based models as autonomous agents interacting in a shared environment with an observed aggregate, emergent outcome. 
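A compact version of Schelling's rule can be written directly from that description: each agent counts the share of like-coloured neighbours and, if that share falls below a tolerance threshold, relocates to a random empty cell. The grid size, threshold, and initialisation below are illustrative choices, not Schelling's original parameters.

// Sketch of Schelling's segregation rule on a wrapping grid.
import java.util.Random;

public class SchellingSketch {
    static final int N = 30;
    static final double THRESHOLD = 0.3;     // minimum fraction of similar neighbours
    static final Random RNG = new Random(3);
    // 0 = empty, 1 = group A, 2 = group B
    static int[][] grid = new int[N][N];

    static boolean unhappy(int r, int c) {
        int similar = 0, occupied = 0;
        for (int dr = -1; dr <= 1; dr++)
            for (int dc = -1; dc <= 1; dc++) {
                if (dr == 0 && dc == 0) continue;
                int nr = (r + dr + N) % N, nc = (c + dc + N) % N;
                if (grid[nr][nc] != 0) {
                    occupied++;
                    if (grid[nr][nc] == grid[r][c]) similar++;
                }
            }
        return occupied > 0 && (double) similar / occupied < THRESHOLD;
    }

    static void moveToRandomEmptyCell(int r, int c) {
        while (true) {
            int nr = RNG.nextInt(N), nc = RNG.nextInt(N);
            if (grid[nr][nc] == 0) {
                grid[nr][nc] = grid[r][c];
                grid[r][c] = 0;
                return;
            }
        }
    }

    public static void main(String[] args) {
        for (int r = 0; r < N; r++)
            for (int c = 0; c < N; c++)
                grid[r][c] = RNG.nextInt(3);          // roughly one third of cells empty
        for (int sweep = 0; sweep < 50; sweep++)
            for (int r = 0; r < N; r++)
                for (int c = 0; c < N; c++)
                    if (grid[r][c] != 0 && unhappy(r, c)) moveToRandomEmptyCell(r, c);

        int unhappyCount = 0;
        for (int r = 0; r < N; r++)
            for (int c = 0; c < N; c++)
                if (grid[r][c] != 0 && unhappy(r, c)) unhappyCount++;
        System.out.println("unhappy agents after 50 sweeps: " + unhappyCount);
    }
}

Even with this mild tolerance threshold, repeated local moves produce clustered, largely segregated neighbourhoods, which is the emergent outcome Schelling's hand-run model demonstrated.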

In the early 1980s, Robert Axelrod hosted a tournament of Prisoner's Dilemma strategies and had them interact in an agent-based manner to determine a winner. Axelrod would go on to develop many other agent-based models in the field of political science that examine phenomena from ethnocentrism to the dissemination of culture. By the late 1980s, Craig Reynolds' work on flocking models contributed to the development of some of the first biological agent-based models that contained social characteristics. He attempted to model living biological agents, an approach known as artificial life, a term coined by Christopher Langton.
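The flavour of Axelrod's tournament can be captured with a toy pairing of two strategies in the iterated Prisoner's Dilemma using the standard payoffs (T=5, R=3, P=1, S=0). The code below is a didactic sketch, not a reconstruction of the tournament itself.

// Toy iterated Prisoner's Dilemma pairing: tit-for-tat versus always-defect.
public class IpdSketch {
    interface Strategy { boolean cooperate(Boolean opponentLastMove); }

    static int[] play(Strategy a, Strategy b, int rounds) {
        int scoreA = 0, scoreB = 0;
        Boolean lastA = null, lastB = null;
        for (int i = 0; i < rounds; i++) {
            boolean moveA = a.cooperate(lastB);
            boolean moveB = b.cooperate(lastA);
            if (moveA && moveB)       { scoreA += 3; scoreB += 3; }   // mutual cooperation
            else if (moveA && !moveB) { scoreB += 5; }                // A is exploited
            else if (!moveA && moveB) { scoreA += 5; }                // B is exploited
            else                      { scoreA += 1; scoreB += 1; }   // mutual defection
            lastA = moveA; lastB = moveB;
        }
        return new int[] { scoreA, scoreB };
    }

    public static void main(String[] args) {
        Strategy titForTat = last -> last == null || last;   // cooperate first, then copy
        Strategy alwaysDefect = last -> false;
        int[] scores = play(titForTat, alwaysDefect, 200);
        System.out.println("tit-for-tat: " + scores[0] + ", always-defect: " + scores[1]);
    }
}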

The first use of the word "agent" and a definition as it is currently used today is hard to track down. One candidate appears to be John Holland and John H. Miller's 1991 paper "Artificial Adaptive Agents in Economic Theory", based on an earlier conference presentation of theirs.

At the same time, during the 1980s, social scientists, mathematicians, operations researchers, and a scattering of people from other disciplines developed Computational and Mathematical Organization Theory (CMOT). This field grew as a special interest group of The Institute of Management Sciences (TIMS) and its sister society, the Operations Research Society of America (ORSA).

1990s: expansion

With the appearance of StarLogo in 1990, Swarm and NetLogo in the mid-1990s, RePast and AnyLogic in 2000, and GAMA in 2007, as well as some custom-designed code, modelling software became widely available and the range of domains to which ABM was applied grew. Bonabeau (2002) is a good survey of the potential of agent-based modeling as of that time.

The 1990s were especially notable for the expansion of ABM within the social sciences. One notable effort was Sugarscape, the large-scale ABM developed by Joshua M. Epstein and Robert Axtell to simulate and explore the role of social phenomena such as seasonal migrations, pollution, sexual reproduction, combat, transmission of disease, and even culture. Another notable 1990s development was the ABM built by Carnegie Mellon University's Kathleen Carley to explore the co-evolution of social networks and culture. During this period, Nigel Gilbert published the first textbook on social simulation, Simulation for the Social Scientist (1999), and established a journal from the perspective of the social sciences, the Journal of Artificial Societies and Social Simulation (JASSS). Other than JASSS, agent-based models of any discipline are within the scope of the SpringerOpen journal Complex Adaptive Systems Modeling (CASM).

Through the mid-1990s, the social sciences thread of ABM began to focus on such issues as designing effective teams, understanding the communication required for organizational effectiveness, and the behavior of social networks. CMOT—later renamed Computational Analysis of Social and Organizational Systems (CASOS)—incorporated more and more agent-based modeling. Samuelson (2000) is a good brief overview of the early history, and Samuelson (2005) and Samuelson and Macal (2006) trace the more recent developments.

In the late 1990s, the merger of TIMS and ORSA to form INFORMS, and the move by INFORMS from two meetings each year to one, helped to spur the CMOT group to form a separate society, the North American Association for Computational Social and Organizational Sciences (NAACSOS). Kathleen Carley was a major contributor, especially to models of social networks, obtaining National Science Foundation funding for the annual conference and serving as the first President of NAACSOS. She was succeeded by David Sallach of the University of Chicago and Argonne National Laboratory, and then by Michael Prietula of Emory University. At about the same time NAACSOS began, the European Social Simulation Association (ESSA) and the Pacific Asian Association for Agent-Based Approach in Social Systems Science (PAAA), counterparts of NAACSOS, were organized. As of 2013, these three organizations collaborate internationally. The First World Congress on Social Simulation was held under their joint sponsorship in Kyoto, Japan, in August 2006. The Second World Congress was held in the northern Virginia suburbs of Washington, D.C., in July 2008, with George Mason University taking the lead role in local arrangements.

2000s and later

More recently, Ron Sun developed methods for basing agent-based simulation on models of human cognition, known as cognitive social simulation. Bill McKelvey, Suzanne Lohmann, Dario Nardi, Dwight Read and others at UCLA have also made significant contributions in organizational behavior and decision-making. Since 2001, UCLA has arranged a conference at Lake Arrowhead, California, that has become another major gathering point for practitioners in this field. In 2014, Sadegh Asgari of Columbia University and his colleagues developed an agent-based model of competitive bidding in construction. While the model was used to analyze low-bid lump-sum construction bids, it could be applied to other bidding methods with little modification.

Theory

Most computational modeling research describes systems in equilibrium or as moving between equilibria. Agent-based modeling, however, using simple rules, can result in different sorts of complex and interesting behavior. The three ideas central to agent-based models are agents as objects, emergence, and complexity.

Agent-based models consist of dynamically interacting rule-based agents. The systems within which they interact can create real-world-like complexity. Typically agents are situated in space and time and reside in networks or in lattice-like neighborhoods. The location of the agents and their responsive behavior are encoded in algorithmic form in computer programs. In some cases, though not always, the agents may be considered as intelligent and purposeful. In ecological ABM (often referred to as "individual-based models" in ecology), agents may, for example, be trees in a forest, and would not be considered intelligent, although they may be "purposeful" in the sense of optimizing access to a resource (such as water). The modeling process is best described as inductive. The modeler makes those assumptions thought most relevant to the situation at hand and then watches phenomena emerge from the agents' interactions. Sometimes that result is an equilibrium. Sometimes it is an emergent pattern. Sometimes, however, it is an unintelligible mangle.

In some ways, agent-based models complement traditional analytic methods. Where analytic methods enable humans to characterize the equilibria of a system, agent-based models allow the possibility of generating those equilibria. This generative contribution may be the most mainstream of the potential benefits of agent-based modeling. Agent-based models can explain the emergence of higher-order patterns—network structures of terrorist organizations and the Internet, power-law distributions in the sizes of traffic jams, wars, and stock-market crashes, and social segregation that persists despite populations of tolerant people. Agent-based models also can be used to identify lever points, defined as moments in time in which interventions have extreme consequences, and to distinguish among types of path dependency.

Rather than focusing on stable states, many models consider a system's robustness—the ways that complex systems adapt to internal and external pressures so as to maintain their functionalities. The task of harnessing that complexity requires consideration of the agents themselves—their diversity, connectedness, and level of interactions.

Framework

Recent work on the modeling and simulation of complex adaptive systems has demonstrated the need to combine agent-based and complex-network-based models. One proposed framework consists of four levels for developing models of complex adaptive systems, described using several multidisciplinary example case studies:
  1. Complex Network Modeling Level for developing models using interaction data of various system components.
  2. Exploratory Agent-based Modeling Level for developing agent-based models that assess the feasibility of further research. This can be useful, for example, for developing proof-of-concept models for funding applications without requiring an extensive learning curve for the researchers.
  3. Descriptive Agent-based Modeling (DREAM) for developing descriptions of agent-based models by means of templates and complex network-based models. Building DREAM models allows model comparison across scientific disciplines.
  4. Validated agent-based modeling using the Virtual Overlay Multi-Agent System (VOMAS) for the development of verified and validated models in a formal manner.
Other methods of describing agent-based models include code templates and text-based methods such as the ODD (Overview, Design concepts, Details) protocol.

The role of the environment in which agents live, both macro and micro, is also becoming an important factor in agent-based modelling and simulation work. Simple environments afford simple agents, but complex environments generate a diversity of behavior.

Applications

In biology

Agent-based modeling has been used extensively in biology, including analysis of the spread of epidemics and the threat of biowarfare; biological applications such as population dynamics, vegetation ecology, landscape diversity, the growth and decline of ancient civilizations, the evolution of ethnocentric behavior, forced displacement/migration, language choice dynamics, and cognitive modeling; and biomedical applications including modeling 3D breast tissue formation/morphogenesis, the effects of ionizing radiation on mammary stem cell subpopulation dynamics, inflammation, and the human immune system. Agent-based models have also been used for developing decision support systems, such as for breast cancer. Agent-based models are increasingly being used to model pharmacological systems in early-stage and pre-clinical research to aid in drug development and gain insights into biological systems that would not be possible a priori. Military applications have also been evaluated. Moreover, agent-based models have recently been employed to study molecular-level biological systems.

In business, technology and network theory

Agent-based models have been used since the mid-1990s to solve a variety of business and technology problems. Examples of applications include the modeling of organizational behavior and cognition, team working, supply chain optimization and logistics, modeling of consumer behavior, including word of mouth, social network effects, distributed computing, workforce management, and portfolio management. They have also been used to analyze traffic congestion.

Recently, agent-based modelling and simulation has been applied to various domains, such as studying the impact of publication venues by researchers in the computer science domain (journals versus conferences). In addition, ABMs have been used to simulate information delivery in ambient assisted environments. A November 2016 article in arXiv analyzed an agent-based simulation of posts spreading in the Facebook online social network. In the domain of peer-to-peer, ad hoc and other self-organizing and complex networks, the usefulness of agent-based modeling and simulation has been shown. The use of a computer-science-based formal specification framework coupled with wireless sensor networks and an agent-based simulation has recently been demonstrated.

Agent-based evolutionary search is a new research topic for solving complex optimization problems.

In economics and social sciences

[Image: graphical user interface of an agent-based modeling tool.]
 
Prior to, and in the wake of, the financial crisis, interest has grown in ABMs as possible tools for economic analysis. ABMs do not assume the economy can achieve equilibrium, and "representative agents" are replaced by agents with diverse, dynamic, and interdependent behavior, including herding. ABMs take a "bottom-up" approach and can generate extremely complex and volatile simulated economies. ABMs can represent unstable systems with crashes and booms that develop out of non-linear (disproportionate) responses to proportionally small changes. A July 2010 article in The Economist looked at ABMs as alternatives to DSGE models. The journal Nature also encouraged agent-based modeling with an editorial that suggested ABMs can do a better job of representing financial markets and other economic complexities than standard models, along with an essay by J. Doyne Farmer and Duncan Foley arguing that ABMs could fulfill both the desire of Keynes to represent a complex economy and that of Robert Lucas to construct models based on microfoundations. Farmer and Foley pointed to progress that has been made using ABMs to model parts of an economy, but argued for the creation of a very large model that incorporates low-level models. By modeling a complex system of analysts based on three distinct behavioral profiles (imitating, anti-imitating, and indifferent), financial markets were simulated to high accuracy. Results showed a correlation between network morphology and the stock market index.
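A rough sketch of those three behavioral profiles is shown below: imitators follow the majority of their neighbours, anti-imitators go against it, and indifferent agents choose at random, with a price index nudged by the resulting net demand. The topology, update rule, and parameters are assumptions for illustration, not the model used in the cited study.

// Toy market of analysts with imitating, anti-imitating and indifferent profiles.
import java.util.Random;

public class AnalystMarketSketch {
    enum Profile { IMITATING, ANTI_IMITATING, INDIFFERENT }
    static final Random RNG = new Random(11);

    // +1 = buy, -1 = sell, decided from the net action of an analyst's neighbours.
    static int decide(Profile p, int neighbourNetDemand) {
        switch (p) {
            case IMITATING:      return neighbourNetDemand >= 0 ? +1 : -1;
            case ANTI_IMITATING: return neighbourNetDemand >= 0 ? -1 : +1;
            default:             return RNG.nextBoolean() ? +1 : -1;
        }
    }

    public static void main(String[] args) {
        int n = 200;
        Profile[] profiles = new Profile[n];
        int[] action = new int[n];
        for (int i = 0; i < n; i++) {
            profiles[i] = Profile.values()[RNG.nextInt(3)];
            action[i] = RNG.nextBoolean() ? +1 : -1;
        }

        double index = 100.0;                        // assumed starting index level
        for (int step = 0; step < 50; step++) {
            int[] next = new int[n];
            for (int i = 0; i < n; i++) {
                // Ring topology: each analyst observes its two nearest neighbours.
                int net = action[(i - 1 + n) % n] + action[(i + 1) % n];
                next[i] = decide(profiles[i], net);
            }
            action = next;
            int demand = 0;
            for (int a : action) demand += a;
            index *= 1.0 + 0.0005 * demand;           // index moves with net demand
        }
        System.out.printf("index after 50 steps: %.2f%n", index);
    }
}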

Since the beginning of the 21st century ABMs have been deployed in architecture and urban planning to evaluate design and to simulate pedestrian flow in the urban environment. There is also a growing field of socio-economic analysis of infrastructure investment impact using the ability of ABMs to discern systemic impacts on a socio-economic network.

Organizational ABM: agent-directed simulation

The agent-directed simulation (ADS) metaphor distinguishes between two categories, namely "Systems for Agents" and "Agents for Systems." Systems for Agents (sometimes referred to as agent systems) are systems that implement agents for use in engineering, human and social dynamics, military applications, and other domains. Agents for Systems are divided into two subcategories. Agent-supported systems deal with the use of agents as a support facility to enable computer assistance in problem solving or to enhance cognitive capabilities. Agent-based systems focus on the use of agents for the generation of model behavior in a system evaluation (system studies and analyses).

Implementation

Much agent-based modeling software is designed for serial von Neumann computer architectures, which limits the speed and scalability of these systems. A recent development is the use of data-parallel algorithms on graphics processing units (GPUs) for ABM simulation. The extreme memory bandwidth combined with the sheer number-crunching power of multi-processor GPUs has enabled simulation of millions of agents at tens of frames per second.
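The data-parallel principle can be illustrated on the CPU with Java parallel streams, which apply the same update rule to every agent independently in each step. Real GPU implementations use CUDA or OpenCL kernels, but the structure of the computation is analogous; the update rule below is a placeholder.

// CPU-side illustration of a data-parallel agent update with double buffering.
import java.util.stream.IntStream;

public class ParallelAgentUpdate {
    public static void main(String[] args) {
        int n = 1_000_000;
        double[][] buf = { new double[n], new double[n] };
        for (int i = 0; i < n; i++) buf[0][i] = Math.random();   // initial agent states

        for (int step = 0; step < 10; step++) {
            final double[] cur = buf[step % 2];
            final double[] nxt = buf[(step + 1) % 2];
            IntStream.range(0, n).parallel().forEach(i -> {
                // Same simple rule for every agent: relax towards a neighbour's state.
                int neighbour = (i + 1) % n;
                nxt[i] = 0.9 * cur[i] + 0.1 * cur[neighbour];
            });
        }
        // After an even number of steps the latest states are back in buf[0].
        System.out.println("agent 0 after 10 steps: " + buf[0][0]);
    }
}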

Verification and validation

Verification and validation (V&V) of simulation models is extremely important. Verification involves the model being debugged to ensure it works correctly, whereas validation ensures that the right model has been built. Face validation, sensitivity analysis, calibration and statistical validation have also been demonstrated. A discrete-event simulation framework approach for the validation of agent-based systems has been proposed. A comprehensive resource on empirical validation of agent-based models can be found here.

As an example of V&V technique, consider VOMAS (virtual overlay multi-agent system), a software engineering based approach, where a virtual overlay multi-agent system is developed alongside the agent-based model. The agents in the multi-agent system are able to gather data by generation of logs as well as provide run-time validation and verification support by watch agents and also agents to check any violation of invariants at run-time. These are set by the Simulation Specialist with help from the SME (subject-matter expert). Muazi et al. also provide an example of using VOMAS for verification and validation of a forest fire simulation model.

VOMAS provides a formal way of performing validation and verification. To develop a VOMAS, one must design VOMAS agents along with the agents in the actual simulation, preferably from the start. By the time the simulation model is complete, it can be considered one model containing two models:
  1. An agent-based model of the intended system
  2. An agent-based model of the VOMAS
Unlike previous work on verification and validation, VOMAS agents ensure that the simulations are validated in-simulation, i.e., even during execution. The VOMAS agents can report any exceptional situations, which are programmed at the direction of the simulation specialist (SS). In addition, the VOMAS agents can log key events for debugging and subsequent analysis of simulations. In other words, VOMAS allows flexible use of any given technique for the verification and validation of an agent-based model in any domain.
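As a minimal sketch of what such a watch agent might look like, the class below logs a key quantity each step and raises an error when a hypothetical invariant (population within agreed bounds) is violated. The invariant, class name, and method names are illustrative, not part of any published VOMAS implementation.

// Illustrative VOMAS-style watch agent: logs events and checks an invariant at run time.
public class WatchAgent {
    private final long maxPopulation;

    public WatchAgent(long maxPopulation) {
        this.maxPopulation = maxPopulation;
    }

    // Called once per simulation step by the main model.
    public void observe(int step, long population) {
        System.out.println("step " + step + ": population = " + population);  // event log
        if (population < 0 || population > maxPopulation) {
            throw new IllegalStateException(
                "Invariant violated at step " + step + ": population = " + population);
        }
    }

    public static void main(String[] args) {
        WatchAgent watcher = new WatchAgent(1_000_000);
        long population = 10_000;
        for (int step = 1; step <= 3; step++) {
            population *= 2;                 // stand-in for the real model update
            watcher.observe(step, population);
        }
    }
}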

Details of validated agent-based modeling using VOMAS, along with several case studies, are given in a thesis that also details "exploratory agent-based modeling", "descriptive agent-based modeling" and "validated agent-based modeling", using several worked case-study examples.

Complex systems modelling

Mathematical models of complex systems are of three types: black-box (phenomenological), white-box (mechanistic, based on first principles) and grey-box (mixtures of phenomenological and mechanistic models).

In black-box models, the individual-based (mechanistic) mechanisms of a complex dynamic system remain hidden.

[Figure: mathematical models for complex systems.]
 
Black-box models are completely non-mechanistic: they are phenomenological and ignore the composition and internal structure of a complex system, so we cannot investigate the interactions of subsystems in such a non-transparent model. A white-box model of a complex dynamic system has "transparent walls" and directly shows the underlying mechanisms; all events at the micro-, meso- and macro-levels of the dynamic system are directly visible at all stages of the white-box model's evolution. In most cases, mathematical modelers use heavy black-box mathematical methods, which cannot produce mechanistic models of complex dynamic systems. Grey-box models are intermediate and combine black-box and white-box approaches.

[Figure: logical deterministic individual-based cellular automata model of single-species population growth.]
 
Creating a white-box model of a complex system requires a priori basic knowledge of the modeled subject. Deterministic logical cellular automata are a necessary but not sufficient condition for a white-box model; the second necessary prerequisite is the presence of a physical ontology of the object under study. White-box modeling represents an automatic hyper-logical inference from first principles because it is based entirely on deterministic logic and the axiomatic theory of the subject. The purpose of white-box modeling is to derive from the basic axioms more detailed, more concrete mechanistic knowledge about the dynamics of the object under study. The necessity of formulating an intrinsic axiomatic system of the subject before creating its white-box model distinguishes cellular automata models of the white-box type from cellular automata models based on arbitrary logical rules. If cellular automata rules have not been formulated from the first principles of the subject, then the model may have only weak relevance to the real problem.
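A deterministic logical cellular automaton in this spirit can be written in a few lines: starting from a single occupied site, an empty site becomes occupied when at least one of its four neighbours is occupied, so the population spreads deterministically across the lattice. The rule below is a simplified illustration, not the published single-species model.

// Deterministic logical cellular automaton: population spread from one founder.
public class PopulationGrowthCa {
    public static void main(String[] args) {
        int n = 21;
        boolean[][] grid = new boolean[n][n];
        grid[n / 2][n / 2] = true;                 // single founding individual

        for (int step = 0; step < 5; step++) {
            boolean[][] next = new boolean[n][n];
            int population = 0;
            for (int r = 0; r < n; r++) {
                for (int c = 0; c < n; c++) {
                    // Logical rule: occupied stays occupied; empty becomes occupied
                    // if any of the four von Neumann neighbours is occupied.
                    boolean occupiedNeighbour =
                            (r > 0 && grid[r - 1][c]) || (r < n - 1 && grid[r + 1][c]) ||
                            (c > 0 && grid[r][c - 1]) || (c < n - 1 && grid[r][c + 1]);
                    next[r][c] = grid[r][c] || occupiedNeighbour;
                    if (next[r][c]) population++;
                }
            }
            grid = next;
            System.out.println("step " + (step + 1) + ": population = " + population);
        }
    }
}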

Introduction to entropy

From Wikipedia, the free encyclopedia