Moral psychology is a field of study in both philosophy and psychology. Some use the term "moral psychology" relatively narrowly to refer to the study of moral development. However, others tend to use the term more broadly to include any topics at the intersection of ethics, psychology, and philosophy of mind. Some of the main topics of the field are moral judgment, moral reasoning, moral sensitivity, moral responsibility, moral motivation, moral identity, moral action, moral development, moral diversity, moral character (especially as related to virtue ethics), altruism, psychological egoism, moral luck, moral forecasting, moral emotion, affective forecasting, and moral disagreement.
A moral act is a type of behavior that has either a moral or immoral consequence. In many cultures, a moral act refers to an act that entails free will, purity, liberty, honesty, and meaning, whereas an immoral act refers to an act that entails corruption and fraudulence.
Some psychologists that have worked in the field are: Jean Piaget, Lawrence Kohlberg, Carol Gilligan, Elliot Turiel, Jonathan Haidt, Linda Skitka, Leland Saunders, Marc Hauser, C. Daniel Batson, Jean Decety, Joshua Greene, A. Peter McGraw, Philip Tetlock, Darcia Narvaez, Tobias Krettenauer, Liane Young, Daniel Hart, Suzanne Fegley, and Fiery Cushman. Philosophers that have worked in the field include Stephen Stich, John Doris, Joshua Knobe, John Mikhail, Shaun Nichols, Thomas Nagel, Robert C. Roberts, Jesse Prinz, Michael Smith, and R. Jay Wallace.
Background
History
Historically, early philosophers such as Aristotle and Plato engaged in both empirical research and a priori conceptual analysis about the ways in which people make decisions about issues that raise moral concerns. With the development of psychology as a discipline separate from philosophy, it was natural for psychologists to continue pursuing work in moral psychology, and much of the 20th-century empirical research in this area was completed by academics working in psychology departments.

Today, moral psychology is a thriving area of research in both philosophy and psychology, even at an interdisciplinary level.[8] For example, the psychologist Lawrence Kohlberg questioned boys and young men about their thought processes when they were faced with a moral dilemma, producing one of many influential empirical studies in the area of moral psychology. As another example, the philosopher Joshua Knobe recently[when?] completed an empirical study on how the way in which an ethical problem is phrased dramatically affects an individual's intuitions about the proper moral response to the problem.[citation needed] More conceptually focused research has been completed by researchers such as John Doris. Doris discusses the way in which social psychological experiments, such as the Stanford prison experiment and the idea of situationism it illustrates, call into question a key component of virtue ethics: the idea that individuals have a single, environment-independent moral character.[11][page needed] As a further example, Shaun Nichols (2004) examines how empirical data on psychopathology suggests that moral rationalism is false.[12][page needed]
Measures
Philosophers and psychologists have created structured interviews and surveys as a means to study moral psychology and its development.

Interview techniques
Since at least 1894, philosophers and psychologists have attempted to evaluate the morality of an individual, especially attempting to distinguish adults from children in terms of their judgment, but these efforts failed because they "attempted to quantify how much morality an individual had—a notably contentious idea—rather than understand the individual's psychological representation of morality".[13] Lawrence Kohlberg addressed that difficulty in 1963 by modeling evaluative diversity as reflecting a series of developmental stages (à la Jean Piaget). Kohlberg's stages of moral development are:[14]
- Obedience and punishment orientation
- Self-interest orientation
- Interpersonal accord and conformity
- Authority and social-order maintaining orientation
- Social contract orientation
- Universal ethical principles
Rather than confirm the existence of a single highest stage, Larry Walker's cluster analysis of a wide variety of interview and survey variables for moral exemplars found three types: the "caring" or "communal" cluster was strongly relational and generative, the "deliberative" cluster had sophisticated epistemic and moral reasoning, and the "brave" or "ordinary" cluster was less distinguished by personality.[15]
Survey instruments
Between 1910 and 1930, in the United States and Europe, several morality tests were developed to classify subjects as fit or unfit to make moral judgments.[13][16] Test-takers would classify or rank standardized lists of personality traits, hypothetical actions, or pictures of hypothetical scenes. As early as 1926, catalogs of personality tests included sections specifically for morality tests, though critics persuasively argued that they merely measured awareness of social expectations.[17][page needed]

Meanwhile, Kohlberg inspired a new wave of morality tests. The Defining Issues Test (dubbed "Neo-Kohlbergian" by its constituents) scores relative preference for post-conventional justifications,[18] and the Moral Judgment Test scores consistency of one's preferred justifications.[19] Both treat evaluative ability as similar to IQ (hence the single score), allowing categorization by high score vs. low score.
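As a rough illustration of what "relative preference for post-conventional justifications" means in a Defining-Issues-Test-style instrument, the following sketch (in Python, using hypothetical item labels, stage assignments, and rank weights rather than the published scoring key) computes the percentage of a respondent's rank weight that goes to post-conventional items.

```python
# Hypothetical sketch of a DIT-style score: the share of rank weight a
# respondent gives to post-conventional justification items. Item labels,
# stage assignments, and weights are illustrative, not the published key.

# Rank weights: the item ranked most important gets 4 points, then 3, 2, 1.
RANK_WEIGHTS = {1: 4, 2: 3, 3: 2, 4: 1}

# Kohlbergian stage keyed to each justification item for one hypothetical dilemma.
ITEM_STAGE = {
    "item_a": 4,   # conventional
    "item_b": 5,   # post-conventional
    "item_c": 3,   # conventional
    "item_d": 6,   # post-conventional
    "item_e": 2,   # pre-conventional
}

def post_conventional_preference(rankings):
    """rankings: one {rank: item} dict per dilemma, covering the top-4 items.
    Returns the percentage of total rank weight given to stage 5/6 items."""
    post_conventional = 0
    total = 0
    for dilemma in rankings:
        for rank, item in dilemma.items():
            weight = RANK_WEIGHTS[rank]
            total += weight
            if ITEM_STAGE[item] >= 5:
                post_conventional += weight
    return 100.0 * post_conventional / total if total else 0.0

# Example: one dilemma in which stage-5/6 items are ranked 1st and 3rd.
print(post_conventional_preference(
    [{1: "item_b", 2: "item_a", 3: "item_d", 4: "item_c"}]))  # 60.0
```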
The Moral Foundations Questionnaire is based on moral intuitions consistent across cultures: care/harm, fairness/cheating, loyalty/betrayal, authority/subversion, and sanctity/degradation (liberty/oppression may be added). The questions ask respondents to rate, upon reflection, what they consider morally relevant (i.e. it is a self-report rather than a behavioral measure). The purpose of the questionnaire is to measure the degree to which people rely upon different sets of moral intuitions (which may coexist), rather than to categorize decision-makers; the first two foundations cluster together with liberal political orientation and the latter three cluster with conservative political orientation.[20][21]
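To make concrete how such a questionnaire yields a profile of foundation scores rather than a single category, here is a minimal sketch assuming a hypothetical item-to-foundation key and 0–5 ratings (not the actual MFQ items or scoring): each foundation score is simply the mean of the respondent's ratings on that foundation's items.

```python
# Minimal, hypothetical sketch of foundation scoring for an MFQ-style survey:
# each item is keyed to one foundation, and a respondent's foundation score is
# the mean of their ratings (here assumed 0-5) on that foundation's items.
# The item names and key below are illustrative, not the published MFQ.
from statistics import mean

ITEM_FOUNDATION = {
    "suffered_emotionally": "care",
    "treated_unfairly":     "fairness",
    "betrayed_group":       "loyalty",
    "showed_disrespect":    "authority",
    "acted_disgustingly":   "sanctity",
    "denied_rights":        "fairness",
}

def foundation_scores(ratings):
    """ratings: dict mapping item name -> 0-5 relevance rating."""
    by_foundation = {}
    for item, rating in ratings.items():
        by_foundation.setdefault(ITEM_FOUNDATION[item], []).append(rating)
    return {foundation: mean(values) for foundation, values in by_foundation.items()}

# Example respondent who weights care/fairness items more heavily.
print(foundation_scores({
    "suffered_emotionally": 5, "treated_unfairly": 5, "denied_rights": 4,
    "betrayed_group": 2, "showed_disrespect": 1, "acted_disgustingly": 2,
}))
# {'care': 5, 'fairness': 4.5, 'loyalty': 2, 'authority': 1, 'sanctity': 2}
```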
The Moral DNA survey by Roger Steare asks respondents to rank their virtues, then divides respondents into three virtue clusters: obedience, care, and reason. The survey was developed for use in business settings, especially to raise awareness of the ways in which perceived workplace discrimination diminishes effective evaluative diversity.[22]
In 1999, some of Kohlberg's measures were tested when Anne Colby and William Damon published a study in which extraordinary moral development was examined in the lives of moral exemplars. To show specifically how people may develop exceptional moral commitments, the researchers focused on the stories of two women seen as moral exemplars; they came from different backgrounds and yet showed similar moral development over the course of their lives. After finding participants who exhibited high levels of moral commitment in their everyday behavior, the researchers used the moral judgement interview (MJI) to compare the 23 exemplars (including the two women) with a more ordinary group of people. The researchers also put the 23 exemplars through two standard dilemmas to assess where they fell in Kohlberg's stages. The intention was to learn more about moral exemplars and to examine the strengths and weaknesses of the Kohlberg measure. It was found that the MJI scores were not clustered at the high end of Kohlberg's scale; they ranged from stage 3 to stage 5. Half landed at the conventional level (stages 3, 3/4, and 4) and the other half landed at the postconventional level (stages 4/5 and 5). Compared to the general population, the scores of the moral exemplars may be somewhat higher than those of groups not selected for outstanding moral behaviour. The researchers noted that the "moral judgement scores are clearly related to subjects' educational attainment in this study". Among the participants who had attained a college education or above, there was no difference in moral judgement scores between genders. The study noted that although the exemplars' scores may have been higher than those of nonexemplars, it is also clear that one is not required to score at Kohlberg's highest stages in order to exhibit high degrees of moral commitment and exemplary behaviour.[10]

Apart from their scores, the 23 participating moral exemplars described three similar themes in their moral development: certainty, positivity, and the unity of self and moral goals. The unity between self and moral goals was highlighted as the most important theme, as it is what truly sets the exemplars apart from 'ordinary' people: the moral exemplars see their morality as part of their sense of identity and sense of self, not as a conscious choice or chore. The moral exemplars also showed a much broader range of moral concern than did the ordinary people and went beyond the normal acts of daily moral engagement. For example, a moral exemplar would feed their own children but then go further and fight to end world hunger on a global scale as well. To encourage this strong sense of moral development in children and adolescents, it is recommended to foster a sense of empowerment and to model a positive and optimistic approach to life.
Theories
Moral identity
Empirical studies show that reasoning and emotion only moderately predict moral action. Scholars such as Blasi began proposing identity as a source of moral motivation.[23] Blasi proposed the self model of moral functioning, which described the effects on moral action of the judgment of responsibility to perform a moral action, one's sense of moral identity, and the desire for self-consistency. Blasi also elaborates on the structure of identity and its connection to morality. According to Blasi, two aspects form identity. One focuses on the specific contents that make up the self (objective identity content), which include moral ideals. The second refers to the ways in which identity is subjectively experienced (subjective identity experience). As the subjective side of identity matures, the objective side tends to lean towards internal contents such as values, beliefs, and goals, rather than external contents such as physical aspects, behaviors, and relationships. A mature subjective identity yearns for a greater sense of self-consistency; identity would therefore serve as a motivation for moral action. Studies of moral exemplars have shown that exemplary moral action often results from the intertwining of personal goals and desires with moral goals, and studies on moral behavior also show a correlation between moral identity and action. S. Hardy and G. Carlo raise critical questions about Blasi's model as well, and propose that researchers should seek to better operationalize and measure moral identity and apply findings to moral education and intervention programs.[24]

Anne Colby and William Damon suggest that one's moral identity is formed through the individual's synchronization of their personal and moral goals. This unity of self and morality is what distinguishes moral exemplars from non-exemplars and in turn makes them exceptional.[25] Colby and Damon studied moral identity through the narratives of the civil rights activist Virginia Foster Durr and of Suzie Valadez, who provided services for the poor; both women's behavior, actions, and life's work were considered morally exemplary by their communities and by those with whom they came in contact. Some common characteristics that these moral exemplars possess are certainty, positivity (e.g. enjoyment of work and optimism), and unity of self and moral goals.[26] The research suggests that a "transformation of goals" takes place during the evolution of one's moral identity and development; it is therefore not an exercise in self-sacrifice but rather one undertaken with great joy, as moral exemplars see their personal goals and moral goals as synonymous. This transformation is not always a deliberate process and is most often gradual, but it can also be set off rapidly by a triggering event.[27] Triggering events can be anything from a powerful moment in a movie to a traumatic life event, or, as in the case of Suzie Valadez, the perception of a vision from God. In many of the moral exemplars interviewed, the triggering events and goal transformation did not take place until their 40s. Moral exemplars are said to have the same concerns and commitments as other moral people but to a greater degree, "extensions in scope, intensity and breadth".[28] Furthermore, exemplars possess the ability to be open to new ideas and experiences, also known as an "active receptiveness"[29] to things exterior to themselves.
A 1995 study examined how teenagers who conducted themselves in a caring manner throughout their communities saw themselves. The findings suggested that adolescent caring exemplars formulated their self-concept differently from comparable peers. The exemplars made more references to positive, moral, caring personality traits as well as to moral and caring goals. They were also more likely to emphasize academic goals and typical amoral activities. There were no significant differences between the exemplars and the control group concerning moral knowledge. In a semantic space analysis, the moral exemplars tended to view their actual self as more integrated with their ideal and expected self.[30]
David Wong proposes that we think of a culture as analogous to a conversation: people with different beliefs, values, and norms can voice their opinions loudly or quietly, and over time these factors can change. A moral culture provides its members with a kind of "language" in which there is plenty of room for different "dialects", allowing moral identities to be established and voiced.[citation needed]
According to Blasi's theory of moral character, a person's moral character is defined by their set of virtues and vices. He theorized that willpower, moral desires, and integrity enable a person to act morally according to a hierarchical order of virtues. He believed that the "highest" and most complex virtues are expressed through willpower, while the "lowest" and simplest virtues are expressed through integrity. He essentially stated that to have the lower virtues, one must have one or more of the higher virtues. The end goals of moral identity development are to establish and act upon one's core goals, and to use one's strengths to make a difference.[31]
Moral self
A "moral self" is fostered by mutually-responsive parenting in childhood. Children with responsive parents develop more empathy, prosociality, a moral self and conscience.[32] Darcia Narvazes describes the neurobiological and social elements of early experience and their effects on moral capacities.[33]The moral self results when people integrate moral values into their self-concept.[34] Research on the moral self has mostly focused on adolescence as a critical time period for the integration of self and morality[35] (i.e. self and morality are traditionally seen as separate constructs that become integrated in adolescence.[36] However, the moral self may be established around age 2–3 years.[37][38] In fact, children as young as 5 years-old are able to consistently identify themselves as having certain moral behavioral preferences.[39] Children's moral self is also increasingly predictive of moral emotions with age.[39]
Moral values
Kristiansen and Hotte[citation needed] review many research articles regarding people's values and attitudes and whether they guide behavior. Drawing on the research they reviewed and their own extension of Ajzen and Fishbein's theory of reasoned action, they conclude that the value-attitude-behavior relation depends on the individual and their moral reasoning. They also point out that there are such things as good values and bad values. Good values are those that guide our attitudes and behaviors and allow us to express and define ourselves; holding them also involves knowing when values are appropriate to the situation or person one is dealing with. Bad values, on the other hand, are those relied on so heavily that they make one unresponsive to the needs and perspectives of others.

Another issue that Kristiansen and Hotte discovered through their research was that individuals tended to "create" values to justify their reactions to certain situations, which they called the "value justification hypothesis".[citation needed] The authors use an example from the feminist Susan Faludi's journal entry describing how, during the period when women were fighting for their right to vote, a New Right group appealed to society's ideals of "traditional family values" as an argument against the new law in order to mask their own "anger at women's rising independence." Their theory is comparable to Jonathan Haidt's social intuitionist model, in which individuals justify their intuitive emotions and actions through reasoning in a post-hoc fashion.
Kristiansen and Hotte also found that independent selves acted in ways influenced by their own thoughts and feelings, whereas interdependent selves had actions, behaviors, and self-concepts that were based on the thoughts and feelings of others. Westerners' emotions vary along two dimensions, activation and pleasantness; the Japanese have a third, the range of their interdependent relationships. Markus and Kitayama found that these two different types of values carry different motives: Westerners, in their explanations, show self-bettering biases, whereas Easterners tend to show "other-oriented" biases.[40]
Psychologist S. H. Schwartz defines individual values as "conceptions of the desirable that guide the way social actors (e.g. organisational leaders, policymakers, individual persons) select actions, evaluate people and events, and explain their actions and evaluations."[41] Cultural values form the basis for social norms, laws, customs, and practices. While individual values vary from case to case (a result of unique life experience), the average of these values points to widely held cultural beliefs (a result of shared cultural values).
Moral virtues
Piaget and Kohlberg both developed stages of development to understand the timing and meaning of moral decisions. In 2004, D. Lapsley and D. Narvaez outlined how social cognition explains aspects of moral functioning.[42] The social cognitive approach to personality identifies six critical resources of moral personality: cognition, self-processes, affective elements of personality, changing social context, lawful situational variability, and the integration of other literature. Lapsley and Narvaez suggest that moral values and actions stem from more than our virtues and are largely governed by a set of self-created schemas (cognitive structures that organize related concepts and integrate past events). They claim that schemas are "fundamental to our very ability to notice dilemmas as we appraise the moral landscape" and that over time, people develop greater "moral expertise".[43]

Moral reasoning
Jean Piaget, in watching children play games, noted how their rationales for cooperation changed with experience and maturation. He identified two stages, heteronomous (morality centered outside the self) and autonomous (internalized morality). Lawrence Kohlberg sought to expand Piaget's work. His cognitive developmental theory of moral reasoning dominated the field for decades. He focused on moral development as one's progression in the capacity to reason about justice. Kohlberg's interview method included hypothetical moral dilemmas or conflicts of interest. The most widely known moral scenario used in his research is usually referred to as the Heinz dilemma. He interviewed children and described what he saw in six stages (claiming that "anyone who interviewed children about dilemmas and who followed them longitudinally in time would come to our six stages and no others").[44]

In the Heinz dilemma, Heinz's wife is dying of cancer and the town's druggist has something that can help her but is charging more than Heinz can afford; Heinz steals the drug to save his wife's life. Children aged 10, 13, and 16 were asked whether what Heinz did was morally justified. Kohlberg's stages of moral development consisted of six stages across three levels. At the preconventional level, the first two stages are the punishment-and-obedience orientation and the instrumental-relativist orientation. The next level, the conventional level, includes the interpersonal concordance or "good boy – nice girl" orientation, along with the "law and order" orientation. Lastly, the postconventional level consists of the social-contract, legalistic orientation and the universal-ethical-principle orientation. Children progress from stage one, where they begin to recognize higher authorities and that there are set rules and punishments for breaking those rules, to stage six, where good principles make a good society and people begin to define which principles are most agreeable and fair.[45] According to Kohlberg, an individual is considered more cognitively mature depending on their stage of moral reasoning, which grows as they advance in education and world experience. One of the examples that Kohlberg gives is "cognitive-moral conflict", wherein an individual who is currently in one stage of moral reasoning has their beliefs challenged by a surrounding peer group. Through this challenge of beliefs, the individual engages in "reflective reorganization", which allows movement to a new stage to occur.
Previous moral development scales, particularly Kohlberg's, assert that moral reasoning is dominated by one main perspective: justice. However, Carol Gilligan and Jane Attanucci argue that there is an additional perspective in moral reasoning, known as the care perspective. The justice perspective draws attention to inequality and oppression, while striving for reciprocal rights and equal respect for all. The care perspective draws attention to the ideas of detachment and abandonment, while striving for attention and response to people who need it. Gilligan and Attanucci analyzed male and female responses to moral situations posed by the participants themselves; they found that a majority of participants represent both care and justice in their moral orientations. In addition, they found that men tend to use the justice perspective more often than women, and women use the care perspective more frequently than men.[46] However, reviews by others have found that Gilligan's theory was not supported by empirical studies.[47][48] In fact, in neo-Kohlbergian studies with the Defining Issues Test, females tend to get slightly higher scores than males.[49][page needed]
Moral willpower
Metcalfe and Mischel offered a new theory of willpower focused on delayed gratification.[50] They proposed a "hot/cool" framework for analyzing how one controls the way a stimulus is interpreted and willpower is exerted. The hot system is referred to as the "go" system, whereas the cool system is referred to as the "know" system. The hot system is characterized as highly emotional, reflexive, and impulsive; it leads to the "go" response (instant gratification) and therefore undermines efforts at self-control. The cool system is characterized as cognitive, emotionally neutral/flexible, slow, integrated, contemplative, and strategic. The hot system develops early in life, whereas the cool system develops later, as it relies on particular brain structures, notably the prefrontal cortex and hippocampus, and on cognitive capacities that develop later. With age, there is a shift of dominance from the hot system to the cool system. The balance between them is determined by stress, developmental level, and a person's self-regulating dynamics.[50]

Baumeister, Miller, and Delaney explored the notion of willpower by first defining the self as being made up of three parts: reflexive consciousness, or the person's awareness of their environment and of themselves as an individual; interpersonal being, which seeks to mold the self into one that will be accepted by others; and executive function.[51] They stated, "[T]he self can free its actions from being determined by particular influences, especially those of which it is aware".[52] The three prevalent theories of willpower describe it as a limited supply of energy, as a cognitive process, and as a skill that is developed over time. Research has largely supported the view that willpower works like a "moral muscle" with a limited supply of strength that may be depleted, conserved, or replenished, and that a single act requiring much self-control can significantly deplete the "supply" of willpower.[51] While exertion reduces the ability to engage in further acts of willpower in the short term, such exertions actually improve a person's ability to exert willpower for extended periods in the long run. Muraven, Baumeister, and Tice conducted a study on self-regulation and its relationship to power and stamina. This study demonstrated that the moral muscle, when exercised, is strengthened in stamina but not necessarily in power, meaning the subjects became less susceptible to the depletion of self-regulatory faculties.[53] Their study also showed that more complex tasks, such as regulating one's mood, present substantial difficulty and may not be as effective in increasing willpower as more straightforward activities such as correcting one's posture or maintaining a food journal.[53] Over time, then, the "moral muscle" may be exercised by performing small tasks of self-control, such as correcting slouched posture, resisting desserts, or completing challenging self-regulatory tasks. Lastly, Baumeister argues that self-management, or the ability to alter one's responses, is a kind of skill that develops as one grows up.[51] Many things can help a person replenish this source of willpower, such as meditation, rest, and positive emotion between tasks.[53] They also showed that there is a conservation effect: people who realize that they are using up their stored-up willpower and self-control tend to conserve it and use it as needed.[53]
Moral behaviour
James Rest reviewed the literature on moral functioning and identified at least four components necessary for a moral behavior to take place:[54][55]
- Sensitivity – noticing and interpreting the situation
- Reasoning and making a judgment regarding the best (most moral) option
- Motivation (in the moment but also habitually, such as moral identity)
- Implementation – having the skills and perseverance to carry out the action
In looking at the relations between moral values, attitudes, and behaviors, previous research asserts that there is no dependable correlation between these three aspects, contrary to what we might assume. In fact, it seems to be more common for people to label their behaviors with a justifying value than to hold a value beforehand and then act on it. Some people are more likely to act on their personal values: those low in self-monitoring and high in self-consciousness, because they are more aware of themselves and less aware of how others may perceive them. (Self-consciousness here means literally being more conscious of oneself, not fearing judgement or feeling anxiety around others.) Social situations and the different categories of norms can indicate when people may act in accordance with their values, but this relationship is not firmly established either. People will typically act in accordance with social, contextual, and personal norms, and there is a likelihood that these norms also track one's moral values. Though certain assumptions and situations suggest a major value-attitude-behavior relation, there is not enough research to confirm this phenomenon.
Moral intuitions
In 2001, Jonathan Haidt introduced his social intuitionist model, which claims that, with few exceptions, moral judgments are based on socially derived intuitions. Moral intuitions happen immediately, automatically, and unconsciously.[57]

The model suggests that moral reasoning is largely post-hoc rationalization that functions to justify one's instinctual reactions. Haidt provides four arguments for doubting the causal importance of reasoning. First, he argues that since there is a dual-process system in the brain for making automatic evaluations or assessments, the same process must apply to moral judgement as well. The second argument draws on evidence from Chaiken that social motives cause humans to be biased and to cohere with and relate to others' attitudes in order to achieve higher societal goals, which in turn influences one's moral judgment. Third, Haidt found that people engage in post-hoc reasoning when faced with a moral situation; this a posteriori (after-the-fact) explanation gives the illusion of objective moral judgement but in reality follows one's gut feeling. Lastly, research has shown that moral emotion has a stronger link to moral action than moral reasoning does, citing Damasio's research on psychopaths and Batson's empathy-altruism hypothesis.[57]
In 2008, Joshua Greene published a compilation which, in contrast to Haidt's model, suggested that moral reasoning does in fact take place. A "deontologist" is someone whose rule-based morality is mainly focused on duties and rights; in contrast, a "consequentialist" is someone who believes that only the best overall consequences ultimately matter.[58] Generally speaking, individuals who respond to moral dilemmas in a consequentialist manner take longer to respond and show frontal-lobe activity (associated with cognitive processing), whereas individuals who respond in a deontological manner generally answer more quickly and show activity in the amygdala (associated with emotional processing).
In regard to moral intuitions, researchers Jonathan Haidt and Jesse Graham performed a study to examine the difference between the moral foundations of political liberals and political conservatives.[59] They challenged individuals to question the legitimacy of their moral world and introduced five psychological foundations of morality:
- Harm/care, which starts with the sensitivity to signs of suffering in offspring and develops into a general dislike of seeing suffering in others and the potential to feel compassion in response.
- Fairness/reciprocity, which is developed when someone observes or engages in reciprocal interactions. This foundation is concerned with virtues related to fairness and justice.
- Ingroup/loyalty, which constitutes recognizing, trusting, and cooperating with members of one's ingroup as well as being wary of members of other groups.
- Authority/respect, which is how someone navigates within hierarchical ingroups and communities.
- Purity/sanctity, which stems from the emotion of disgust that guards the body by responding to elicitors that are biologically or culturally linked to disease transmission.
Haidt and Graham suggest a compromise can be found to allow liberals and conservatives to see eye to eye. They suggest that the five foundations can be used as a "doorway" allowing liberals to step to the conservative side of the "wall" put up between these two political affiliations on major political issues (e.g. legalizing gay marriage). If liberals tried to consider the latter three foundations in addition to the former two (thereby briefly adopting all five foundations, as conservatives do), they could understand where conservative viewpoints stem from, and long-standing political issues could finally be settled.
Augusto Blasi emphasizes the importance of moral responsibility and reflection as one analyzes an intuition.[61] His main argument is that some, if not most, intuitions tend to be self-centered and self-seeking.[62] Blasi critiques Haidt's description of the average person and questions whether this model (having an intuition, acting on it, and then justifying it) always holds, concluding that not everyone follows it. In more detail, Blasi identifies Haidt's five default positions on intuition.[clarification needed]
- Normally, moral judgments are caused by intuitions, whether the intuitions are themselves caused by heuristics or the heuristics are intuitions, and whether they are intrinsically based on emotions or depend on grammar-like rules and are only externally related to emotions.
- Intuitions occur rapidly and appear as unquestionably evident; either the intuitions themselves or their sources are unconscious.
- Intuitions are responses to minimal information, are not a result of analyses or reasoning; neither do they require reasoning to appear solid and true.
- Reasoning may occur but infrequently; its use is in justifying the judgment after the fact, either to other people or to oneself. Reasons in sum do not have a moral function.
Moral emotions
Moral reasoning has been the focus of most study of morality dating back to Plato and Aristotle. The emotive side of morality has been looked upon with disdain, as subservient to higher, rational moral reasoning, with scholars like Piaget and Kohlberg touting moral reasoning as the core of morality.[17] However, in the last 30–40 years there has been a rise in a new front of research: moral emotions as the basis for moral behavior. This development began with a focus on empathy and guilt, but has since moved on to encompass new scholarship on emotions such as anger, shame, disgust, awe, and elevation. With this new research, theorists have begun to question whether moral emotions might hold a larger role in determining morality, one that might even surpass that of moral reasoning.[64]

Philosophers have generally taken two approaches to defining moral emotion. The first "is to specify the formal conditions that make a statement a moral statement (e.g., that it is prescriptive, that it is universalizable)".[65][page needed] This first approach is more tied to language and the definitions we give to moral emotions. The second approach "is to specify the material conditions of a moral issue, for example, that moral rules and judgments 'must bear on the interest or welfare either of society as a whole or at least of persons other than the judge or agent'".[66] This definition is more action-based; it focuses on the outcome of a moral emotion. The second definition is preferred because it is not tied to language and therefore can be applied to prelinguistic children and animals. Moral emotions are "emotions that are linked to the interests or welfare either of society as a whole or at least of persons other than the judge or agent."[67]
There is a debate over whether there is a set of basic emotions or whether there are "scripts or set of components that can be mixed and matched, allowing for a very large number of possible emotions".[64] Even those arguing for a basic set acknowledge that there are variants of each emotion. The psychologist Paul Ekman calls these variants "families":[68]
The principal moral emotions can be divided into two large and two small joint families. The large families are the "other-condemning" family, in which the three brothers are contempt, anger, and disgust (and their many children, such as indignation and loathing), and the "self-conscious" family (shame, embarrassment, and guilt)…[T]he two smaller families are the "other-suffering" family (compassion) and the "other-praising" family (gratitude and elevation).[64]

Different cultures, Haidt suggests, can also formulate different moral emotions that reflect the values of that culture. For example, Eastern cultures may be more inclined than Western cultures to consider serenity/calmness a moral emotion.
As Haidt suggests, the higher the emotionality of a moral agent, the more likely they are to act morally. He uses the term "disinterested elicitor" to describe someone who is less concerned with the self and more concerned about the well-being of things exterior to themselves. Haidt suggests that each person's pro-social action tendency is determined by their degree of emotionality. He uses Ekman's idea of "emotion families" and builds a scale of emotionality from low to high. If a person operates at a low level of emotionality and has self-interested emotions, such as sadness or happiness, they are unlikely to act. If the moral agent possesses high emotionality and operates as a disinterested elicitor with emotions such as elevation, they are much more likely to be morally altruistic.
These moral emotions all have elicitors and action tendencies. The "other-condemning" family, which includes anger, contempt, and disgust, is united by a concern with what other people do and negative feelings about the actions and characters of those who violate moral codes and order. The moral emotion of anger is elicited by unjustified actions and insults; its action tendency is to attack, humiliate, or get back at the person responsible for acting unfairly.[64]
Empathy also plays a large role in altruism. The empathy-altruism hypothesis states that feelings of empathy for another lead to an altruistic motivation to help that person.[69] In contrast, there may also be an egoistic motivation to help someone in need; this is the Hullian tension-reduction model, in which the personal distress caused by another in need leads the person to help in order to alleviate their own discomfort.[70]
Batson, Klein, Highberger, and Shaw conducted experiments in which they used empathy-induced altruism to lead people to make decisions that required them to show partiality to one individual over another. In the first experiment, a participant in each condition chose which of two individuals would experience a positive or a negative task. The conditions were non-communication, communication/low-empathy, and communication/high-empathy. The communication/high-empathy group showed more partiality than the other groups because its members had been successfully manipulated emotionally. Those who were successfully manipulated reported that, despite feeling compelled in the moment to show partiality, they still felt they had made the more "immoral" decision, since they had followed an empathy-based emotion rather than adhering to a justice perspective of morality.[69]
Batson, Klein, Highberger, & Shaw conducted two experiments on empathy-induced altruism, proposing that this can lead to actions that violate the justice principle. The second experiment operated similarly to the first using low-empathy and high-empathy groups. Participants were faced with the decision to move an ostensibly ill child to an "immediate help" group versus leaving her on a waiting list after listening to her emotionally-driven interview describing her condition and the life it has left her to lead. Those who were in the high-empathy group were more likely than those in the low-empathy group to move the child higher up the list to receive treatment earlier. When these participants were asked what the more moral choice was, they agreed that the more moral choice would have been to not move this child ahead of the list at the expense of the other children. In this case, it is evident that when empathy induced altruism is at odds with what is seen as moral, oftentimes empathy induced altruism has the ability to win out over morality.[69]
Recently, the neuroscientist Jean Decety, drawing on empirical research in evolutionary theory, developmental psychology, social neuroscience, and psychopathy, has argued that empathy and morality are neither systematically opposed to one another nor inevitably complementary.[71][72]
Emmons (2009) defines gratitude as a natural emotional reaction and a universal tendency to respond positively to another's benevolence. Gratitude is motivating and leads to what Emmons describes as "upstream reciprocity": the passing on of benefits to third parties instead of returning benefits to one's benefactors (Emmons, 2009).
Moral conviction
Linda Skitka and colleagues have introduced the concept of moral conviction, which refers to a "strong and absolute belief that something is right or wrong, moral or immoral."[73] According to Skitka's integrated theory of moral conviction (ITMC), attitudes held with moral conviction, known as moral mandates, differ from strong but non-moral attitudes in a number of important ways. Namely, moral mandates derive their motivational force from their perceived universality, perceived objectivity, and strong ties to emotion.[74] Perceived universality refers to the notion that individuals experience moral mandates as transcending persons and cultures; additionally, they are regarded as matters of fact. Regarding the association with emotion, ITMC is consistent with Jonathan Haidt's social intuitionist model in stating that moral judgments are accompanied by discrete moral emotions (i.e., disgust, shame, guilt). Importantly, Skitka maintains that moral mandates are not the same thing as moral values; whether an issue will be associated with moral conviction varies across persons.

One of the main lines of ITMC research addresses the behavioral implications of moral mandates. Individuals prefer greater social and physical distance from attitudinally dissimilar others when moral conviction is high. This effect of moral conviction could not be explained by traditional measures of attitude strength, extremity, or centrality. Skitka, Bauman, and Sargis placed participants in either attitudinally heterogeneous or homogeneous groups to discuss procedures regarding two morally mandated issues, abortion and capital punishment. Those in attitudinally heterogeneous groups demonstrated the least goodwill towards other group members, the least cooperation, and the most tension and defensiveness. Furthermore, individuals discussing a morally mandated issue were less likely to reach consensus compared to those discussing non-moral issues.[75]
Evolution
In Unto Others: The Evolution and Psychology of Unselfish Behavior (1998), Elliott Sober and David Sloan Wilson demonstrated that diverse moralities could evolve through group selection. In particular, they dismantled the idea that natural selection will favor a homogeneous population in which all creatures care only about their own personal welfare and/or behave only in ways that advance their own personal reproduction.[76] Tim Dean has advanced the more general claim that moral diversity would evolve through frequency-dependent selection, because each moral approach is vulnerable to a different set of situations which threatened our ancestors.[77]

Integrated theories
More recent attempts to develop an integrated model of moral motivation[78] have identified at least six different levels of moral functioning, each of which has been shown to predict some type of moral or prosocial behavior: moral intuitions, moral emotions, moral virtues/vices (behavioral capacities), moral values, moral reasoning, and moral willpower. This social intuitionist model of moral motivation[79] suggests that moral behaviors are typically the product of multiple levels of moral functioning, and are usually energized by the "hotter" levels of intuition, emotion, and behavioral virtue/vice. The "cooler" levels of values, reasoning, and willpower, while still important, are proposed to be secondary to the more affect-intensive processes.

Psychologist Jonathan Haidt's moral foundations theory examines the way morality varies between cultures and identifies five fundamental moral values shared by different societies and individuals:[59] care for others, fairness, loyalty, authority, and purity.[80][non sequitur]