
Sunday, April 14, 2024

Good and evil

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Good_and_evil
In Ary Scheffer's 1854 painting The Temptation of Christ, the devil (right), the personification of evil, tempts Christ (left), the personification of the character and will of God.

In religion, ethics, philosophy, and psychology, "good and evil" is a common dichotomy. In cultures with Manichaean and Abrahamic religious influence, evil is perceived as the dualistic antagonistic opposite of good, in which good should prevail and evil should be defeated. In cultures with Buddhist spiritual influence, both good and evil are perceived as part of an antagonistic duality that must itself be overcome through achieving Śūnyatā, or emptiness: the recognition that good and evil are two opposing principles but not an ultimate reality, the emptying of that duality, and the achievement of oneness.

Evil is often used to denote profound immorality. Evil has also been described as a supernatural force. Definitions of evil vary, as does the analysis of its motives. However, elements commonly associated with evil include unbalanced behavior marked by expediency, selfishness, ignorance, or neglect. In Hamlet, Shakespeare famously wrote, "There is nothing either good or bad, but thinking makes it so."

The modern philosophical questions regarding good and evil are subsumed into three major areas of study: metaethics concerning the nature of good and evil, normative ethics concerning how we ought to behave, and applied ethics concerning particular moral issues.

One of the five paintings of Extermination of Evil portrays Sendan Kendatsuba, one of the eight guardians of Buddhist law, banishing evil.

History and etymology

Every language has a word expressing good in the sense of "having the right or desirable quality" (ἀρετή) and bad in the sense "undesirable". A sense of moral judgment and a distinction "right and wrong, good and bad" are cultural universals.

Ancient world

The philosopher Zoroaster simplified the pantheon of early Iranian gods into two opposing forces: Ahura Mazda (Illuminating Wisdom) and Angra Mainyu (Destructive Spirit), which were in conflict.

This idea developed into a religion which spawned many sects, some of which embraced an extreme dualistic belief that the material world should be shunned and the spiritual world should be embraced. Gnostic ideas influenced many ancient religions which teach that gnosis (variously interpreted as enlightenment, salvation, emancipation or 'oneness with God') may be reached by practising philanthropy to the point of personal poverty, sexual abstinence (as far as possible for hearers, total for initiates) and diligently searching for wisdom by helping others.

Similarly, in ancient Egypt there were the concepts of Ma'at, the principle of justice, order, and cohesion, and Isfet, the principle of chaos, disorder, and decay: the former comprised the powers and principles that society sought to embody, while the latter was that which undermined society. This correspondence is also reflected in ancient Mesopotamian religion, in the conflict between Marduk and Tiamat.

Classical world

In Western civilisation, the basic meanings of κακός and ἀγαθός are "bad, cowardly" and "good, brave, capable", and their absolute sense emerges only around 400 BC, with pre-Socratic philosophy, in particular Democritus. Morality in this absolute sense solidifies in the dialogues of Plato, together with the emergence of monotheistic thought (notably in Euthyphro, which ponders the concept of piety (τὸ ὅσιον) as a moral absolute). The idea was further developed in Late Antiquity by Neoplatonists, Gnostics, and Church Fathers.

This development from the relative or habitual to the absolute is also evident in the terms ethics and morality both being derived from terms for "regional custom", Greek ήθος and Latin mores, respectively (see also siðr).

Medieval period

According to the classical definition of Augustine of Hippo, sin is "a word, deed, or desire in opposition to the eternal law of God."

Many medieval Christian theologians both broadened and narrowed the basic concept of good and evil until it came to have several, sometimes complex, definitions.

Modern ideas

Today the basic dichotomy often breaks down along these lines:

  • Good is a broad concept often associated with life, charity, continuity, happiness, love, or justice.
  • Evil is often associated with conscious and deliberate wrongdoing, discrimination designed to harm others, humiliation of people designed to diminish their psychological needs and dignity, destructiveness, and acts of unnecessary or indiscriminate violence.

The modern English word evil (Old English yfel) and its cognates, such as the German Übel and Dutch euvel, are widely considered to come from the reconstructed Proto-Germanic form *ubilaz, comparable to the Hittite huwapp-, ultimately from the Proto-Indo-European form *wap- and the suffixed zero-grade form *up-elo-. Other later Germanic forms include Middle English evel, ifel, ufel, Old Frisian evel (adjective and noun), Old Saxon ubil, Old High German ubil, and Gothic ubils.

The nature of being good has been given many treatments; one is that the good is based on the natural love, bonding, and affection that begins at the earliest stages of personal development; another is that goodness is a product of knowing truth. Differing views also exist as to why evil might arise. Many religious and philosophical traditions claim that evil behavior is an aberration that results from the imperfect human condition (e.g. "The Fall of Man"). Sometimes, evil is attributed to the existence of free will and human agency. Some argue that evil itself is ultimately based in an ignorance of truth (i.e., human value, sanctity, divinity). A variety of thinkers have alleged the opposite, by suggesting that evil is learned as a consequence of tyrannical social structures.

Theories of moral goodness

Chinese moral philosophy

In Confucianism and Taoism, there is no direct analogue to the way good and evil are opposed, although references to demonic influence are common in Chinese folk religion. Confucianism's primary concern is with correct social relationships and the behavior appropriate to the learned or superior man. Evil would thus correspond to wrong behavior. The dichotomy maps even less directly onto Taoism, in spite of the centrality of dualism in that system, but the opposites of the basic virtues of Taoism (compassion, moderation, and humility) can be inferred to be the analogue of evil in it.

Western philosophy

Pyrrhonism

Pyrrhonism holds that good and evil do not exist by nature, meaning that good and evil do not exist within the things themselves. All judgments of good and evil are relative to the one doing the judging.

Spinoza

Benedict de Spinoza states:

1. By good, I understand that which we certainly know is useful to us.
2. By evil, on the contrary I understand that which we certainly know hinders us from possessing anything that is good.

Spinoza assumes a quasi-mathematical style and states these further propositions, which he purports to prove or demonstrate from the above definitions in Part IV of his Ethics:

  • Proposition 8 "Knowledge of good or evil is nothing but affect of joy or sorrow in so far as we are conscious of it."
  • Proposition 30 "Nothing can be evil through that which it possesses in common with our nature, but in so far as a thing is evil to us it is contrary to us."
  • Proposition 64 "The knowledge of evil is inadequate knowledge."
    • Corollary "Hence it follows that if the human mind had none but adequate ideas, it would form no notion of evil."
  • Proposition 65 "According to the guidance of reason, of two things which are good, we shall follow the greater good, and of two evils, follow the less."
  • Proposition 68 "If men were born free, they would form no conception of good and evil so long as they were free."

Nietzsche

Friedrich Nietzsche, in a rejection of Judeo-Christian morality, addresses this in two books, Beyond Good and Evil and On the Genealogy of Morals. In these works, he states that the natural, functional, "non-good" has been socially transformed into the religious concept of evil by the "slave mentality" of the masses, who resent their "masters", the strong. He also critiques morality by saying that many who consider themselves to be moral are simply acting out of cowardice – wanting to do evil but afraid of the repercussions.

Psychology

Carl Jung

Carl Jung, in his book Answer to Job and elsewhere, depicted evil as the dark side of God. People tend to believe evil is something external to them, because they project their shadow onto others. Jung interpreted the story of Jesus as an account of God facing his own shadow.

Philip Zimbardo

In 2007, Philip Zimbardo suggested that people may act in evil ways as a result of a collective identity. This hypothesis, based on his previous experience from the Stanford prison experiment, was published in the book The Lucifer Effect: Understanding How Good People Turn Evil.

Religion

Abrahamic religions

Baháʼí Faith

The Baháʼí Faith asserts that evil is non-existent and that it is a concept for the lacking of good, just as cold is the state of no heat, darkness is the state of no light, forgetfulness the lacking of memory, ignorance the lacking of knowledge. All of these are states of lacking and have no real existence.

Thus, evil does not exist, and is relative to man. `Abdu'l-Bahá, son of the founder of the religion, in Some Answered Questions states:

"Nevertheless, a doubt occurs to the mind—that is, scorpions and serpents are poisonous. Are they good or evil, for they are existing beings? Yes, a scorpion is evil in relation to man; a serpent is evil in relation to man; but in relation to themselves they are not evil, for their poison is their weapon, and by their sting they defend themselves."

Thus, evil is more of an intellectual concept than a true reality. Since God is good, and upon creating creation he confirmed it by saying it is Good (Genesis 1:31), evil cannot have a true reality.

Christianity
In many religions, angels are considered good beings. In the Christian tradition, God—being the creator of all life—manifests himself through the son of God, Jesus Christ, who is the personification of goodness.
 
Satan, as seen in Codex Gigas. Demons are generally seen as evil beings, and Satan as the greatest of these (in the Christian tradition).

Christian theology draws its concept of evil from the Old and New Testaments. The Christian Bible exercises "the dominant influence upon ideas about God and evil in the Western world." In the Old Testament, evil is understood to be an opposition to God as well as something unsuitable or inferior, such as the leader of the fallen angels, Satan. In the New Testament, the Greek word poneros is used to indicate unsuitability, while kakos is used to refer to opposition to God in the human realm. Officially, the Catholic Church extracts its understanding of evil from its canonical antiquity and the Dominican theologian Thomas Aquinas, who in Summa Theologica defines evil as the absence or privation of good. French-American theologian Henri Blocher describes evil, when viewed as a theological concept, as an "unjustifiable reality. In common parlance, evil is 'something' that occurs in experience that ought not to be." According to 1 Timothy 6:10, "For the love of money is the root of all evil."

In Mormonism, mortal life is viewed as a test of faith, where one's choices are central to the Plan of Salvation. See Agency (LDS Church). Evil is that which keeps one from discovering the nature of God. It is believed that one must choose not to be evil to return to God.

Christian Science believes that evil arises from a misunderstanding of the goodness of nature, which is understood as being inherently perfect if viewed from the correct (spiritual) perspective. Misunderstanding God's reality leads to incorrect choices, which are termed evil. This has led to the rejection of any separate power being the source of evil, or of God as being the source of evil; instead, the appearance of evil is the result of a mistaken concept of good. Christian Scientists argue that even the most evil person does not pursue evil for its own sake, but from the mistaken viewpoint that he or she will achieve some kind of good thereby.

Islam

There is no concept of absolute evil in Islam, as a fundamental universal principle that is independent from and equal with good in a dualistic sense. Within Islam, it is considered essential to believe that all comes from God, whether it is perceived as good or bad by individuals; and things that are perceived as evil or bad are either natural events (natural disasters or illnesses) or caused by humanity's free will to disobey God's orders.

According to the Ahmadiyya understanding of Islam, evil does not have a positive existence in itself and is merely the lack of good, just as darkness is the result of lack of light.

Judaism

In Judaism, yetzer hara is the congenital inclination to do evil, by violating the will of God. The term is drawn from the phrase "the imagination of the heart of man [is] evil" (יֵצֶר לֵב הָאָדָם רַע‎, yetzer lev-ha-adam ra), which occurs twice at the beginning of the Torah (Genesis 6:5 and 8:21). The Hebrew word "yetzer", having appeared twice in Genesis, occurs again at the end of the Torah: "I knew their devisings that they do." Thus, from beginning to end, the heart's "yetzer" is continually bent on evil, a profoundly pessimistic view of the human being. However, the Torah, which began with blessing, anticipates future blessing which will come as a result of God circumcising the heart in the latter days.

In traditional Judaism, the yetzer hara is not a demonic force, but rather man's misuse of things the physical body needs to survive. Thus, the need for food becomes gluttony due to the yetzer hara. The need for procreation becomes promiscuity, and so on. The yetzer hara could thus be best described as one's baser instincts.

According to the Talmudic tractate Avot de-Rabbi Natan, a boy's evil inclination is greater than his good inclination until he turns 13 (bar mitzvah), at which point the good inclination is "born" and able to control his behavior. Moreover, the rabbis have stated: "The greater the man, the greater his [evil] inclination."

Indian religions

Buddhism
Extermination of Evil. Late Heian period (12th century Japan)

Buddhist ethics are traditionally based on what Buddhists view as the enlightened perspective of the Buddha, or other enlightened beings such as Bodhisattvas. The Indian term for ethics or morality used in Buddhism is Śīla or sīla (Pāli). Śīla in Buddhism is one of three sections of the Noble Eightfold Path, and is a code of conduct that embraces a commitment to harmony and self-restraint with the principal motivation being nonviolence, or freedom from causing harm. It has been variously described as virtue, moral discipline and precept.

Sīla is an internal, aware, and intentional ethical behavior, according to one's commitment to the path of liberation. It is an ethical compass within self and relationships, rather than what is associated with the English word "morality" (i.e., obedience, a sense of obligation, and external constraint).

Sīla is one of the three practices foundational to Buddhism and the non-sectarian Vipassana movement; sīla, samādhi, and paññā as well as the Theravadin foundations of sīla, dāna, and bhavana. It is also the second pāramitā. Sīla is also wholehearted commitment to what is wholesome. Two aspects of sīla are essential to the training: right "performance" (caritta), and right "avoidance" (varitta). Honoring the precepts of sīla is considered a "great gift" (mahadana) to others, because it creates an atmosphere of trust, respect, and security. It means the practitioner poses no threat to another person's life, property, family, rights, or well-being.

Moral instructions are included in Buddhist scriptures or handed down through tradition. Most scholars of Buddhist ethics thus rely on the examination of Buddhist scriptures, and the use of anthropological evidence from traditional Buddhist societies, to justify claims about the nature of Buddhist ethics.

Hinduism

In Hinduism the concept of dharma or righteousness clearly divides the world into good and evil, and clearly explains that wars have to be waged sometimes to establish and protect dharma; this war is called Dharmayuddha. This division of good and evil is of major importance in both of the Hindu epics, the Ramayana and the Mahabharata. However, the main emphasis in Hinduism is on bad action, rather than bad people. The Hindu holy text, the Bhagavad Gita, speaks of the balance of good and evil. When this balance is upset, divine incarnations come to help to restore this balance, as a balance must be maintained for peace and harmony in the world.

Sikhism

In adherence to the core principle of spiritual evolution, the Sikh idea of evil changes depending on one's position on the path to liberation. At the beginning stages of spiritual growth, good and evil may seem neatly separated. However, once one's spirit evolves to the point where it sees most clearly, the idea of evil vanishes and the truth is revealed. In his writings Guru Arjan explains that, because God is the source of all things, what we believe to be evil must also come from God. And because God is ultimately a source of absolute good, nothing truly evil can originate from God.

Nevertheless, Sikhism, like many other religions, does incorporate a list of "vices" from which suffering, corruption, and abject negativity arise. These are known as the Five Thieves, called such due to their propensity to cloud the mind and lead one astray from the prosecution of righteous action. These are:

  • Kam, or lust.
  • Krodh, or wrath.
  • Lobh, or greed.
  • Moh, or attachment.
  • Ahankar, or ego.

One who gives in to the temptations of the Five Thieves is known as "Manmukh", or someone who lives selfishly and without virtue. Inversely, the "Gurmukh", who thrive in their reverence toward divine knowledge, rise above vice via the practice of the high virtues of Sikhism. These are:

  • Sewa, or selfless service to others.
  • Nam Simran, or meditation upon the divine name.

Zoroastrianism

In the originally Persian religion of Zoroastrianism, the world is a battle ground between the God Ahura Mazda (also called Ormazd) and the malignant spirit Angra Mainyu (also called Ahriman). The final resolution of the struggle between good and evil was supposed to occur on a Day of Judgement, in which all beings that have lived will be led across a bridge of fire, and those who are evil will be cast down forever. In Afghan belief, angels (yazata) and saints are beings sent to help us achieve the path towards goodness.

Descriptive, meta-ethical, and normative fields

It is possible to treat the essential theories of value by the use of a philosophical and academic approach. In properly analyzing theories of value, everyday beliefs are not only carefully catalogued and described, but also rigorously analyzed and judged.

There are at least two basic ways of presenting a theory of value, based on two different kinds of questions:

  • What do people find good, and what do they despise?
  • What really is good, and what really is bad?

The two questions are subtly different. One may answer the first question by researching the world by use of social science, and examining the preferences that people assert. However, one may answer the second question by use of reasoning, introspection, prescription, and generalization. The former kind of method of analysis is called "descriptive", because it attempts to describe what people actually view as good or evil; while the latter is called "normative", because it tries to actively prohibit evils and cherish goods. These descriptive and normative approaches can be complementary. For example, tracking the decline of the popularity of slavery across cultures is the work of descriptive ethics, while advising that slavery be avoided is normative.

Meta-ethics is the study of the fundamental questions concerning the nature and origins of the good and the evil, including inquiry into the nature of good and evil, as well as the meaning of evaluative language. In this respect, meta-ethics is not necessarily tied to investigations into how others see the good, or of asserting what is good.

Theories of the intrinsically good

A satisfying formulation of goodness is valuable because it might allow one to construct a good life or society by reliable processes of deduction, elaboration, or prioritization. One could answer the ancient question, "How should we then live?" among many other important related questions. It has long been thought that this question can best be answered by examining what it is that necessarily makes a thing valuable, or in what the source of value consists.

Platonic idealism

One attempt to define goodness, associated with Platonic idealism, describes it as a property of the world. According to this claim, to talk about the good is to talk about something real that exists in the object itself, independent of the perception of it. Plato advocated this view, in his expression that there is such a thing as an eternal realm of forms or ideas, and that the greatest of the ideas and the essence of being was goodness, or The good. The good was defined by many ancient Greeks and other ancient philosophers as a perfect and eternal idea, or blueprint. The good is the right relation between all that exists, and this exists in the mind of the Divine, or some heavenly realm. The good is the harmony of a just political community, love, friendship, the ordered human soul of virtues, and the right relation to the Divine and to Nature. The characters in Plato's dialogues mention the many virtues of a philosopher, or a lover of wisdom.

A theist is a person who believes that the Supreme Being exists or gods exist (monotheism or polytheism). A theist may, therefore, claim that the universe has a purpose and value according to the will of such creator(s) that lies partially beyond human understanding. For instance, Thomas Aquinas—a proponent of this view—believed he had proven the existence of God, and the right relations that humans ought to have to the divine first cause.

Monotheists might also hope for infinite universal love. Such hope is often translated as "faith", and wisdom itself is largely defined within some religious doctrines as a knowledge and understanding of innate goodness. The concepts of innocence, spiritual purity, and salvation are likewise related to a concept of being in, or returning to, a state of goodness—one that, according to various teachings of "enlightenment", approaches a state of holiness (or Godliness).

Perfectionism

Aristotle believed that virtues consisted of realization of potentials unique to humanity, such as the use of reason. This type of view, called perfectionism, has been recently defended in modern form by Thomas Hurka.

An entirely different form of perfectionism has arisen in response to rapid technological change. Some techno-optimists, especially transhumanists, avow a form of perfectionism in which the capacity to determine good and to trade off fundamental values is expressed not by humans but by software, the genetic engineering of humans, or artificial intelligence. Skeptics assert that rather than perfect goodness, it would be only the appearance of perfect goodness, reinforced by persuasion technology and probably by the brute force of violent technological escalation, which would cause people to accept such rulers or rules authored by them.

Welfarist theories

Welfarist theories of value hold that things are good because of their positive effects on human well-being.

Subjective theories of well-being

It is difficult to figure out where an immaterial trait such as "goodness" could reside in the world. A counterproposal is to locate values inside people. Some philosophers go so far as to say that if some state of affairs does not tend to arouse a desirable subjective state in self-aware beings, then it cannot be good.

Most philosophers who think goods have to create desirable mental states also say that goods are experiences of self-aware beings. These philosophers often distinguish the experience, which they call an intrinsic good, from the things that seem to cause the experience, which they call "inherent" goods.

Some theories describe no higher collective value than that of maximizing pleasure for individual(s). Some even define goodness and intrinsic value as the experience of pleasure, and bad as the experience of pain. This view is called hedonism, a monistic theory of value. It has two main varieties: simple, and Epicurean.

Simple hedonism is the view that physical pleasure is the ultimate good. However, the ancient philosopher Epicurus used the word 'pleasure' in a more general sense that encompassed a range of states from bliss to contentment to relief. Contrary to popular caricature, he valued pleasures of the mind over bodily pleasures, and advocated moderation as the surest path to happiness.

Jeremy Bentham's book An Introduction to the Principles of Morals and Legislation prioritized goods by considering pleasure, pain and consequences. This theory had a wide effect on public affairs, up to and including the present day. A similar system was later named Utilitarianism by John Stuart Mill. More broadly, utilitarian theories are examples of Consequentialism. All utilitarian theories are based upon the maxim of utility, which states that good is whatever provides the greatest happiness for the greatest number: what brings happiness to the greatest number of people is good.
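
To make the maxim concrete, the following is a minimal, hedged sketch in Python, not Bentham's own felicific calculus (which also weighs factors such as intensity and duration); the options, people, and happiness scores are invented purely for illustration. It compares actions by summing their effects on everyone affected and picks the one with the greatest total.

    # Illustrative sketch only: score each option by summing the happiness
    # (positive) and suffering (negative) it produces for everyone affected,
    # then choose the option with the greatest total. All values are invented.

    def total_utility(effects):
        """Sum the signed happiness values an option produces across people."""
        return sum(effects.values())

    def best_option(options):
        """Return the name of the option with the greatest aggregate happiness."""
        return max(options, key=lambda name: total_utility(options[name]))

    options = {
        "build a park":   {"Ann": 3, "Bob": 2, "Cara": 1},    # broad, modest benefit
        "build a casino": {"Ann": 5, "Bob": -2, "Cara": -2},  # concentrated benefit, diffuse harm
    }

    print(best_option(options))  # prints: build a park

On this toy accounting the park wins (total 6 versus 1), which mirrors the intuition behind "the greatest happiness for the greatest number"; real utilitarian and consequentialist theories differ over exactly what to sum and how to weigh it.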

A benefit of tracing good to pleasure and pain is that both are easily understandable, both in oneself and to an extent in others. For the hedonist, the explanation for helping behaviour may come in the form of empathy—the ability of a being to "feel" another's pain. People tend to value the lives of gorillas more than those of mosquitoes because the gorilla lives and feels, making it easier to empathize with them. This idea is carried forward in the ethical relationship view and has given rise to the animal rights movement and parts of the peace movement. The impact of sympathy on human behaviour is compatible with Enlightenment views, including David Hume's stances that the idea of a self with unique identity is illusory, and that morality ultimately comes down to sympathy and fellow feeling for others, or the exercise of approval underlying moral judgments.

A view adopted by James Griffin attempts to find a subjective alternative to hedonism as an intrinsic value. He argues that the satisfaction of one's informed desires constitutes well-being, whether or not these desires actually bring the agent happiness. Moreover, these preferences must be life-relevant, that is, contribute to the success of a person's life overall.

Desire satisfaction may occur without the agent's awareness of the satisfaction of the desire. For example, if a man wishes for his legal will to be enacted after his death, and it is, then his desire has been satisfied even though he will never experience or know of it.

Meher Baba proposed that it is not the satisfaction of desires that motivates the agent but rather "a desire to be free from the limitation of all desires. Those experiences and actions which increase the fetters of desire are bad, and those experiences and actions which tend to emancipate the mind from limiting desires are good." It is through good actions, then, that the agent becomes free from selfish desires and achieves a state of well-being: "The good is the main link between selfishness thriving and dying. Selfishness, which in the beginning is the father of evil tendencies, becomes through good deeds the hero of its own defeat. When the evil tendencies are completely replaced by good tendencies, selfishness is transformed into selflessness, i.e., individual selfishness loses itself in universal interest."

Objective theories of well-being

The idea that the ultimate good exists and is not orderable but is globally measurable is reflected in various ways in economic (classical economics, green economics, welfare economics, gross national happiness) and scientific (positive psychology, the science of morality) well-being measuring theories, all of which focus on various ways of assessing progress towards that goal, a so-called genuine progress indicator. Modern economics thus reflects very ancient philosophy, but a calculation or quantitative or other process based on cardinality and statistics replaces the simple ordering of values.

For example, in both economics and in folk wisdom, the value of something seems to rise so long as it is relatively scarce. However, if it becomes too scarce, it often leads to conflict and can reduce collective value.

In the classical political economy of Adam Smith and David Ricardo, and in its critique by Karl Marx, human labour is seen as the ultimate source of all new economic value. This is an objective theory of value, which attributes value to real production-costs, and ultimately expenditures of human labour-time (see law of value). It contrasts with marginal utility theory, which argues that the value of labour depends on subjective preferences by consumers, which may however also be objectively studied.

The economic value of labour may be assessed technically in terms of its use-value or utility or commercially in terms of its exchange-value, price or production cost (see labour power). But its value may also be socially assessed in terms of its contribution to the wealth and well-being of a society.

In non-market societies, labour may be valued primarily in terms of skill, time, and output, as well as moral or social criteria and legal obligations. In market societies, labour is valued economically primarily through the labour market. The price of labour may then be set by supply and demand, by strike action or legislation, or by legal or professional entry-requirements into occupations.

Mid-range theories

Conceptual metaphor theories argue against both subjective and objective conceptions of value and meaning, and focus on the relationships between body and other essential elements of human life. In effect, conceptual metaphor theories treat ethics as an ontology problem and the issue of how to work-out values as a negotiation of these metaphors, not the application of some abstraction or a strict standoff between parties who have no way to understand each other's views.

Philosophical questions

Universality

Adolf Hitler is sometimes used as a modern definition of evil. Hitler's policies and orders resulted in the deaths of about 50 million people.

A fundamental question is whether there is a universal, transcendent definition of evil, or whether evil is determined by one's social or cultural background. C. S. Lewis, in The Abolition of Man, maintained that there are certain acts that are universally considered evil, such as rape and murder. However, the numerous instances in which rape or murder is morally affected by social context call this into question. Up until the mid-19th century, many countries practiced forms of slavery. As is often the case, those transgressing moral boundaries stood to profit from that exercise. Arguably, slavery has always been the same and objectively evil, but individuals with a motivation to transgress will justify that action.

The Nazis, during World War II, considered genocide to be acceptable, as did the Hutu Interahamwe in the Rwandan genocide. One might point out, though, that the actual perpetrators of those atrocities probably avoided calling their actions genocide, since the objective meaning of any act accurately described by that word is to wrongfully kill a selected group of people, which is an action that at least their victims will understand to be evil. Universalists consider evil independent of culture, and wholly related to acts or intents.

Views on the nature of evil tend to fall into one of four opposed camps:

  • Moral absolutism holds that good and evil are fixed concepts established by a deity or deities, nature, morality, common sense, or some other source.
  • Amoralism claims that good and evil are meaningless, that there is no moral ingredient in nature.
  • Moral relativism holds that standards of good and evil are only products of local culture, custom, or prejudice.
  • Moral universalism is the attempt to find a compromise between the absolutist sense of morality, and the relativist view; universalism claims that morality is only flexible to a degree, and that what is truly good or evil can be determined by examining what is commonly considered to be evil amongst all humans.

Plato wrote that there are relatively few ways to do good, but there are countless ways to do evil, which can therefore have a much greater impact on our lives, and the lives of other beings capable of suffering.

Usefulness as a term

Psychologist Albert Ellis, in his school of psychology called Rational Emotive Behavior Therapy, says the root of anger and the desire to harm someone is almost always related to variations of implicit or explicit philosophical beliefs about other human beings. He further claims that without holding variants of those covert or overt beliefs and assumptions, the tendency to resort to violence in most cases is less likely.

American psychiatrist M. Scott Peck on the other hand, describes evil as militant ignorance. The original Judeo-Christian concept of sin is as a process that leads one to miss the mark and not achieve perfection. Peck argues that while most people are conscious of this at least on some level, those that are evil actively and militantly refuse this consciousness. Peck describes evil as a malignant type of self-righteousness which results in a projection of evil onto selected specific innocent victims (often children or other people in relatively powerless positions). Peck considers those he calls evil to be attempting to escape and hide from their own conscience (through self-deception) and views this as being quite distinct from the apparent absence of conscience evident in sociopaths.

According to Peck, an evil person:

  • Is consistently self-deceiving, with the intent of avoiding guilt and maintaining a self-image of perfection
  • Deceives others as a consequence of their own self-deception
  • Psychologically projects his or her evils and sins onto very specific targets, scapegoating those targets while treating everyone else normally ("their insensitivity toward him was selective")
  • Commonly hates with the pretense of love, for the purposes of self-deception as much as the deception of others
  • Abuses political or emotional power ("the imposition of one's will upon others by overt or covert coercion")
  • Maintains a high level of respectability and lies incessantly in order to do so
  • Is consistent in his or her sins. Evil people are defined not so much by the magnitude of their sins, but by their consistency (of destructiveness)
  • Is unable to think from the viewpoint of their victim
  • Has a covert intolerance to criticism and other forms of narcissistic injury

He also considers that certain institutions may be evil, as his discussion of the My Lai Massacre and its attempted cover-up illustrates. By this definition, acts of criminal and state terrorism would also be considered evil.

Necessary evil

Martin Luther believed that occasional minor evil could have a positive effect

Martin Luther argued that there are cases where a little evil is a positive good. He wrote, "Seek out the society of your boon companions, drink, play, talk bawdy, and amuse yourself. One must sometimes commit a sin out of hate and contempt for the Devil, so as not to give him the chance to make one scrupulous over mere nothings... ."

The necessary evil approach to politics was put forth by Niccolò Machiavelli, a 16th-century Florentine writer who advised tyrants that "it is far safer to be feared than loved." Treachery, deceit, eliminating political rivals, and the usage of fear are offered as methods of stabilizing the prince's security and power.

The international relations theories of realism and neorealism, sometimes called realpolitik, advise politicians to explicitly ban absolute moral and ethical considerations from international politics, and to focus on self-interest, political survival, and power politics, which they hold to be more accurate in explaining a world they view as explicitly amoral and dangerous. Political realists usually justify their perspectives by laying claim to a higher moral duty specific to political leaders, under which the greatest evil is seen to be the failure of the state to protect itself and its citizens. Machiavelli wrote: "...there will be traits considered good that, if followed, will lead to ruin, while other traits, considered vices which if practiced achieve security and well being for the Prince."

Anton LaVey, founder of the Church of Satan, was a materialist and claimed that evil is actually good. He was responding to the common practice of describing sexuality or disbelief as evil, and his claim was that when the word evil is used to describe the natural pleasures and instincts of men and women, or the skepticism of an inquiring mind, the things called evil are really good.

Goodness and agency

Goodwill

John Rawls' book A Theory of Justice prioritized social arrangements and goods based on their contribution to justice. Rawls defined justice as fairness, especially in distributing social goods, defined fairness in terms of procedures, and attempted to prove that just institutions and lives are good, if rational individuals' goods are considered fairly. Rawls's crucial invention was the original position, a procedure in which one tries to make objective moral decisions by refusing to let personal facts about oneself enter one's moral calculations. Immanuel Kant, a great influence on Rawls, similarly applies much procedural practice within the practical application of the categorical imperative; however, Kant's approach is not based solely on 'fairness'.

Society, life and ecology

Many views value unity as a good: to go beyond eudaimonia by saying that an individual person's flourishing is valuable only as a means to the flourishing of society as a whole. In other words, a single person's life is, ultimately, not important or worthwhile in itself, but is good only as a means to the success of society as a whole. Some elements of Confucianism are an example of this, encouraging the view that people ought to conform as individuals to demands of a peaceful and ordered society.

According to the naturalistic view, the flourishing of society is not, or is not the only, intrinsically good thing. Defenses of this notion are often formulated by reference to biology, and to observations that living things compete more with their own kind than with other kinds. Rather, what is intrinsically good is the flourishing of all sentient life, extending to those animals that have some level of similar sentience, such as Great Ape personhood. Others go further, declaring that life itself is of intrinsic value.

By another approach, one achieves peace and agreement by focusing, not on one's peers (who may be rivals or competitors), but on the common environment. The reasoning: As living beings it is clearly and objectively good that we are surrounded by an ecosystem that supports life. Indeed, if we weren't, we could neither discuss that good nor even recognize it. The anthropic principle in cosmology recognizes this view.

Under materialism or even embodiment values, or in any system that recognizes the validity of ecology as a scientific study of limits and potentials, an ecosystem is a fundamental good. To all who investigate, it seems that goodness, or value, exists within an ecosystem, Earth. Creatures within that ecosystem, and wholly dependent on it, evaluate good relative to what else could be achieved there. In other words, good is situated in a particular place, and one does not dismiss everything that is not available there (such as very low gravity or absolutely abundant sugar candy) as "not good enough"; one works within its constraints. Transcending them, and learning to be satisfied with them, is thus another sort of value, perhaps called satisfaction.

Values and the people that hold them seem necessarily subordinate to the ecosystem. If this is so, then what kind of being could validly apply the word "good" to an ecosystem as a whole? Who would have the power to assess and judge an ecosystem as good or bad? By what criteria? And by what criteria would ecosystems be modified, especially larger ones such as the atmosphere (climate change) or oceans (extinction) or forests (deforestation)?

"Remaining on Earth" as the most basic value. While green ethicists have been most forthright about it, and have developed theories of Gaia philosophy, biophilia, bioregionalism that reflect it, the questions are now universally recognized as central in determining value, e.g. the economic "value of Earth" to humans as a whole, or the "value of life" that is neither whole-Earth nor human. Many have come to the conclusion that without assuming ecosystem continuation as a universal good, with attendant virtues like biodiversity and ecological wisdom it is impossible to justify such operational requirements as sustainability of human activity on Earth.

One response is that humans are not necessarily confined to Earth, and could use it and move on. A counter-argument is that only a tiny fraction of humans could do this—and they would be self-selected by ability to do technological escalation on others (for instance, the ability to create large spacecraft to flee the planet in, and simultaneously fend off others who seek to prevent them). Another counter-argument is that extraterrestrial life would encounter the fleeing humans and destroy them as a locust species. A third is that if there are no other worlds fit to support life (and no extraterrestrials who compete with humans to occupy them) it is both futile to flee, and foolish to imagine that it would take less energy and skill to protect the Earth as a habitat than it would take to construct some new habitat.

Accordingly, remaining on Earth, as a living being surrounded by a working ecosystem, is a fair statement of the most basic values and goodness to any being we are able to communicate with. A moral system without this axiom seems simply not actionable.

However, most religious systems acknowledge an afterlife and improving this is seen as an even more basic good. In many other moral systems, also, remaining on Earth in a state that lacks honor or power over self is less desirable—consider seppuku in bushido, kamikazes or the role of suicide attacks in Jihadi rhetoric. In all these systems, remaining on Earth is perhaps no higher than a third-place value.

Radical values environmentalism can be seen as either a very old or a very new view: that the only intrinsically good thing is a flourishing ecosystem; individuals and societies are merely instrumentally valuable, good only as means to having a flourishing ecosystem. The Gaia philosophy is the most detailed expression of this overall thought but it strongly influenced deep ecology and the modern Green Parties.

It is often claimed that aboriginal peoples never lost this sort of view. Anthropological linguistics studies links between their languages and the ecosystems they lived in, which gave rise to their knowledge distinctions. Very often, environmental cognition and moral cognition were not distinguished in these languages. Offenses to nature were like those to other people, and Animism reinforced this by giving nature "personality" via myth. Anthropological theories of value explore these questions.

Most people in the world reject older situated ethics and localized religious views. However, small-community-based and ecology-centric views have gained some popularity in recent years. In part, this has been attributed to the desire for ethical certainties. Such a deeply rooted definition of goodness would be valuable because it might allow one to construct a good life or society by reliable processes of deduction, elaboration or prioritisation, relying only on local referents one could verify for oneself, creating more certainty and therefore less investment in protection, hedging and insuring against the consequences of losing the value.

History and novelty

An event is often seen as being of value simply because of its novelty in fashion and art. By contrast, cultural history and other antiques are sometimes seen as of value in and of themselves due to their age. Philosopher-historians Will and Ariel Durant made this point with the observation, "As the sanity of the individual lies in the continuity of his memories, so the sanity of the group lies in the continuity of its traditions; in either case a break in the chain invites a neurotic reaction" (The Lessons of History, 72).

Assessment of the value of old or historical artifacts takes into consideration, especially but not exclusively: the value placed on having a detailed knowledge of the past, the desire to have tangible ties to ancestral history, or the increased market value scarce items traditionally hold.

Creativity, innovation, and invention are sometimes upheld as fundamentally good, especially in Western industrial society—all imply newness, and even the opportunity to profit from novelty. Bertrand Russell was notably pessimistic about creativity and thought that knowledge expanding faster than wisdom was necessarily fatal.

Goodness and morality in biology

The issue of good and evil, often associated with morality, is regarded by some biologists (notably Edward O. Wilson, Jeremy Griffith, David Sloan Wilson and Frans de Waal) as an important question to be addressed by the field of biology.

Judeo-Christian ethics

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Judeo-Christian_ethics

The idea that a common Judaeo-Christian ethics or Judeo-Christian values underpins American politics, law and morals has been part of the "American civil religion" since the 1940s. In recent years, the phrase has been associated with American conservatism, but the concept—though not always the exact phrase—has frequently featured in the rhetoric of leaders across the political spectrum, including that of Franklin D. Roosevelt and Lyndon B. Johnson.

Ethical value system

The current American use of "Judeo-Christian" — to refer to a value system common to Jews and Christians — first appeared in print on 11 July 1939 in a book review by the English writer George Orwell, with the phrase "… incapable of acting meanly, a thing that carries no weight in the Judaeo-Christian scheme of morals." Orwell repeated the term in his 1941 essay: "It was the idea of human equality—the "Jewish" or "Judeo-Christian" idea of equality—that Hitler came into the world to destroy."

Orwell's usage of the term followed at least a decade of efforts by Jewish and Christian leaders, through such groups as the U.S. National Conference of Christians and Jews (founded in 1927), to emphasize common ground. The term continued to gain currency in the 1940s. In part, it was a way of countering antisemitism with the idea that the foundation of morals and law in the United States was a shared one between Jews and Christians.

Orwell was not the first to publicly speak about the moral commonality of Jewish and Christian traditions. On May 19, 1939, Albert Einstein, in a speech at Princeton Theological Seminary, explaining the importance of moral principles for modern science, emphasized: "The highest principles for our aspirations and judgments are given to us in the Jewish-Christian religious tradition."

And back in 1884, three years after a large-scale wave of anti-Jewish pogroms in Russia, Vladimir Solovyov (Soloviev), a prominent Russian philosopher and Christian writer, wrote in his essay "The Jews and the Christian Question":

"Our religion begins with a personal relationship between God and man in the ancient covenant of Abraham and Moses, and is confirmed in the closest personal unity of God and man in the New Testament of Jesus Christ, in which both natures exist inseparably, but unmerged as well. These two covenants are not two different religions, but only two stages of one and the same Divine-human religion, or speaking in the language of the German school, two moments of one and the same God-human process. This single and true Divine-human Judeo-Christian religion proceeds by a direct and magisterial path amid the two extreme errors of paganism, in which first man is absorbed by Divinity (in India), and then Divinity itself is transformed into a shadow of man (in Greece and Rome)."

Franklin D. Roosevelt

The first inaugural address of Franklin D. Roosevelt (FDR), in 1933, the famous speech in which FDR declared that "the only thing we have to fear is fear itself", had numerous religious references, which were widely commented upon at the time. Although it did not use the term "Judeo-Christian", it has come to be seen by scholars as in tune with the emerging view of a Judeo-Christian tradition. Historian Mary Stuckey emphasizes "Roosevelt's use of the shared values grounded in the Judeo-Christian tradition" as a way to unify the American nation and justify his own role as its chief policymaker.

In the speech, FDR attacked the bankers and promised a reform in an echo of the gospels: "The money changers have fled from their high seats in the temple of our civilization. We may now restore that temple to the ancient truths. The measure of the restoration lies in the extent to which we apply social values more noble than mere monetary profit." Houck and Nocasian, examining the flood of responses to the First Inaugural, and commenting on this passage, argue:

The nation's overwhelmingly Judeo-Christian response to the address thus had both textual and extratextual warrants. For those inclined to see the Divine Hand of Providence at work, Roosevelt's miraculous escape [from assassination] in Miami was a sign—perhaps The Sign—that God had sent another Washington or Lincoln at the appointed hour. ... Many others could not resist the subject position that Roosevelt ... had cultivated throughout the address—that of savior. After all, it was Christ who had expelled the moneychangers from the Temple. ... [Many listeners saw] a composite sign that their new president had a godly mandate to lead.

Gary Scott Smith stresses that Roosevelt believed his welfare programs were "wholly in accord with the social teachings of Christianity." He saw the achievement of social justice through government action as morally superior to the old laissez-faire approach. He proclaimed, "The thing we are seeking is justice," as guided by the precept of "Do unto your neighbor as you would be done by." Roosevelt saw the moral issue as religiosity versus anti-religion. According to Smith, "He pleaded with Protestants, Catholics, and Jews to transcend their sectarian creeds and 'unite in good works' whenever they could 'find common cause.'"

Atalia Omer and Jason A. Springs point to Roosevelt's 1939 State of the Union Address, which called upon Americans to "defend, not their homes alone, but the tenets of faith and humanity on which their churches, their governments and their very civilization are founded." They state that, "This familiar rhetoric invoked a conception of the sanctity of the United States' Judeo-Christian values as a basis for war."

Timothy Wyatt notes that, as World War II approached, Roosevelt's isolationist opponents said he was calling for a "holy war." Wyatt says:

Often in his Fireside Chats or speeches to the houses of Congress, FDR argued for the entrance of America into the war by using both blatant and subtle religious rhetoric. Roosevelt portrayed the conflict in the light of good versus evil, the religious against the irreligious. In doing so, he pitted the Christian ideals of democracy against the atheism of National Socialism.

Lyndon Johnson

Biographer Randall B. Woods has argued that President Lyndon B. Johnson effectively used appeals to the Judeo-Christian ethical tradition to garner support for the civil rights law of 1965. Woods writes that Johnson undermined the Southern filibuster against the bill:

LBJ wrapped white America in a moral straight jacket. How could individuals who fervently, continuously, and overwhelmingly identified themselves with a merciful and just God continue to condone racial discrimination, police brutality, and segregation? Where in the Judeo-Christian ethic was there justification for killing young girls in a church in Alabama, denying an equal education to black children, barring fathers and mothers from competing for jobs that would feed and clothe their families? Was Jim Crow to be America's response to "Godless Communism"?

Woods went on to assess the role of Judeo-Christian ethics among the nation's political elite:

Johnson's decision to define civil rights as a moral issue, and to wield the nation's self-professed Judeo-Christian ethic as a sword in its behalf, constituted something of a watershed in twentieth-century political history. All presidents were fond of invoking the deity, and some conservatives like Dwight Eisenhower had flirted with employing Judeo-Christian teachings to justify their actions, but modern-day liberals, both politicians and the intellectuals who challenged and nourished them, had shunned spiritual witness. Most liberal intellectuals were secular humanists. Academics in particular had historically been deeply distrustful of organized religion, which they identified with small-mindedness, bigotry, and anti-intellectualism. Like his role model, FDR, Johnson equated liberal values with religious values, insisting freedom and social justice served the ends of both god and man. And he was not loath to say so.

Woods notes that Johnson's religiosity ran deep: "At 15 he joined the Disciples of Christ, or Christian, church and would forever believe that it was the duty of the rich to care for the poor, the strong to assist the weak, and the educated to speak for the inarticulate."

History

1930s and 1940s

Promoting the concept of the United States as a Judeo-Christian nation first became a political program in the 1940s, in response to the growth of anti-Semitism in America. The rise of Nazi anti-semitism in the 1930s led concerned Protestants, Catholics, and Jews to take steps to increase understanding and tolerance.

In this effort, precursors of the National Conference of Christians and Jews created teams consisting of a priest, a rabbi, and a minister, to run programs across the country, and fashion a more pluralistic America, no longer defined as a Christian land, but "one nurtured by three ennobling traditions: Protestantism, Catholicism and Judaism. ... The phrase 'Judeo-Christian' entered the contemporary lexicon as the standard liberal term for the idea that Western values rest on a religious consensus that included Jews."

In the 1930s, "In the face of worldwide antisemitic efforts to stigmatize and destroy Judaism, influential Christians and Jews in America labored to uphold it, pushing Judaism from the margins of American religious life towards its very center." During World War II, Jewish chaplains worked with Catholic priests and Protestant ministers to promote goodwill, addressing servicemen who, "in many cases had never seen, much less heard a Rabbi speak before." At funerals for the unknown soldier, rabbis stood alongside the other chaplains and recited prayers in Hebrew. In a much publicized wartime tragedy, the sinking of the Dorchester, the ship's multi-faith chaplains gave up their lifebelts to evacuating seamen and stood together "arm in arm in prayer" as the ship went down. A 1948 postage stamp commemorated their heroism with the words: "interfaith in action."

1950s, 1960s, and 1970s

In December 1952, then-President-elect Dwight Eisenhower, speaking extemporaneously a month before his inauguration, said, in what may be the first direct public reference by a U.S. president to the Judeo-Christian concept:

[The Founding Fathers said] 'we hold that all men are endowed by their Creator ... ' In other words, our form of government has no sense unless it is founded in a deeply felt religious faith, and I don't care what it is. With us of course it is the Judeo-Christian concept, but it must be a religion with all men created equal.

By the 1950s, many early modern conservatives emphasized the Judeo-Christian roots of their values. In 1958, economist Elgin Groseclose claimed that it was ideas "drawn from Judeo-Christian Scriptures that have made possible the economic strength and industrial power of this country."

Senator Barry Goldwater noted that conservatives "believed the communist projection of man as a producing, consuming animal to be used and discarded was antithetical to all the Judeo-Christian understandings which are the foundations upon which the Republic stands."

Belief in the superiority of Western Judeo-Christian traditions led conservatives to downplay the aspirations of Third World peoples to free themselves from colonial rule.

The emergence of the "Christian right" as a political force and part of the conservative coalition dates from the 1970s. According to Cambridge University historian Andrew Preston, the emergence of "conservative ecumenism," bringing together Catholics, Mormons, and conservative Protestants into the religious right coalition, was facilitated "by the rise of a Judeo-Christian ethic." These groups "began to mobilize together on cultural-political issues such as abortion and the proposed Equal Rights Amendment for women." As Wilcox and Robinson conclude:

The Christian Right is an attempt to restore Judeo-Christian values to a country that is in deep moral decline. ... [They] believe that society suffers from the lack of a firm basis of Judeo-Christian values and they seek to write laws that embody those values.

1980s and 1990s

By the 1980s and 1990s, favorable references to "Judeo-Christian values" were common, and the term was used by conservative Christians.

President Ronald Reagan frequently emphasized Judeo-Christian values as necessary ingredients in the fight against Communism. He argued that the Bible contains "all the answers to the problems that face us." Reagan disapproved of the growth of secularism and emphasized the need to take the idea of sin seriously. Tom Freiling, a Christian publisher and head of a conservative PAC, stated in his 2003 book, Reagan's God and Country, that "Reagan's core religious beliefs were always steeped in traditional Judeo-Christian heritage." Religion—and the Judeo-Christian concept—was a major theme in Reagan's rhetoric by 1980.

During his 1992 presidential campaign, President Bill Clinton likewise emphasized the role of religion in society and in his personal life, making references to the Judeo-Christian tradition.

The term became especially significant in American politics, and its usage surged in the 1990s as "Judeo-Christian values" were promoted in the culture wars.

Since 9/11

According to Hartmann et al., usage shifted between 2001 and 2005, with the mainstream media using the term less, in order to characterize America as multicultural. The study finds the term is now most likely to be used by liberals in connection with discussions of Muslim and Islamic inclusion in America, and renewed debate about the separation of church and state.

The 2012 book Kosher Jesus by Orthodox rabbi Shmuley Boteach concludes with the statement that "the hyphen between Jewish and Christian values is Jesus himself."

In U.S. law

In the case of Marsh v. Chambers, 463 U.S. 783 (1983), the Supreme Court of the United States held that a state legislature could constitutionally have a paid chaplain to conduct legislative prayers "in the Judeo-Christian tradition." In Simpson v. Chesterfield County Board of Supervisors, the Fourth Circuit Court of Appeals held that the Supreme Court's holding in the Marsh case meant that "Chesterfield County could constitutionally exclude Cynthia Simpson, a Wiccan priestess, from leading its legislative prayers, because her faith was not 'in the Judeo-Christian tradition.'" Chesterfield County's board included Jewish, Christian, and Muslim clergy in its invited list.

Responses

Some theologians warn against uncritical use of the term "Judeo-Christian" altogether, arguing that it can license mischief, such as opposition to secular humanism with scant regard for modern Jewish, Catholic, or Christian traditions, including the liberal strains of different faiths, such as Reform Judaism and liberal Protestant Christianity.

Two notable books addressed the relations between contemporary Judaism and Christianity. Abba Hillel Silver's Where Judaism Differs and Leo Baeck's Judaism and Christianity were both motivated by an impulse to clarify Judaism's distinctiveness "in a world where the term Judeo-Christian had obscured critical differences between the two faiths."

Reacting against the blurring of theological distinctions, Rabbi Eliezer Berkovits wrote that "Judaism is Judaism because it rejects Christianity, and Christianity is Christianity because it rejects Judaism."

Theologian and author Arthur A. Cohen, in The Myth of the Judeo-Christian Tradition, questioned the theological validity of the Judeo-Christian concept and suggested that it was essentially an invention of American politics, while Jacob Neusner, in Jews and Christians: The Myth of a Common Tradition, writes, "The two faiths stand for different people talking about different things to different people."

Law professor Stephen M. Feldman, looking at the period before 1950, chiefly in Europe, sees the concept of a Judeo-Christian tradition as supersessionism, which he characterizes as "dangerous Christian dogma (at least from a Jewish perspective)", and as a "myth" which "insidiously obscures the real and significant differences between Judaism and Christianity."

Abrahamic religion

Advocates of the term "Abrahamic religion" since the second half of the 20th century have proposed an inclusivism that widens the "Judeo-Christian" concept to include Islam as well. The rationale for the term "Abrahamic" is that Islam, like Judaism and Christianity, traces its origins to the figure of Abraham, whom Islam regards as a prophet. Advocates of this umbrella term consider it the "exploration of something positive" in the sense of a "spiritual bond" between Jews, Christians, and Muslims.

Australia

Australian historian Tony Taylor points out that Australia has borrowed the "Judeo-Christian" theme from American conservative discourse.

Jim Berryman, another Australian historian, argues that from the 1890s to the present, rhetoric upholding Australia's traditional attachment to Western civilisation has emphasized three themes: the core British heritage; Australia's Judeo-Christian belief system; and the rational principles of the Enlightenment. These themes have been expressed mostly on the center-right of the Australian political spectrum, and most prominently among conservative-leaning commentators.

Development theory

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Development_theory
 
Development theory is a collection of theories about how desirable change in society is best achieved. Such theories draw on a variety of social science disciplines and approaches. In this article, multiple theories are discussed, as are recent developments with regard to these theories. Depending on which theory is being considered, there are different explanations of the process of development and of its inequalities.

Modernization theory

Modernization theory is used to analyze the processes by which modernization in societies takes place. The theory looks at which aspects of countries are beneficial and which constitute obstacles to economic development. The idea is that development assistance targeted at those particular aspects can lead to the modernization of 'traditional' or 'backward' societies. Scientists from various research disciplines have contributed to modernization theory.

Sociological and anthropological modernization theory

The earliest principles of modernization theory can be derived from the idea of progress, which stated that people can develop and change their society themselves. Marquis de Condorcet was involved in the origins of this theory. This theory also states that technological advancements and economic changes can lead to changes in moral and cultural values. The French sociologist Émile Durkheim stressed the interdependence of institutions in a society and the way in which they interact with cultural and social unity. His work The Division of Labor in Society was very influential. It described how social order is maintained in society and ways in which primitive societies can make the transition to more advanced societies.

Other scientists who have contributed to the development of modernization theory are: David Apter, who did research on the political system and history of democracy; Seymour Martin Lipset, who argued that economic development leads to social changes which tend to lead to democracy; David McClelland, who approached modernization from the psychological side with his motivations theory; and Talcott Parsons who used his pattern variables to compare backwardness to modernity.

Linear stages of growth model

The linear stages of growth model is an economic model heavily inspired by the Marshall Plan, which was used to revitalize Europe's economy after World War II. It assumes that economic growth can only be achieved by industrialization. Growth can be restricted by local institutions and social attitudes, especially if these aspects influence the savings rate and investments. The constraints impeding economic growth are thus considered by this model to be internal to society.

According to the linear stages of growth model, a correctly designed massive injection of capital coupled with intervention by the public sector would ultimately lead to industrialization and economic development of a developing nation.

Rostow's stages of growth model is the best-known example of the linear stages of growth model. Walt W. Rostow identified five stages through which developing countries had to pass to reach advanced-economy status: (1) Traditional society, (2) Preconditions for take-off, (3) Take-off, (4) Drive to maturity, (5) Age of high mass consumption. He argued that economic development could be led by certain strong sectors; this is in contrast to, for instance, Marxism, which states that sectors should develop equally. According to Rostow's model, a country needed to follow some rules of development to reach the take-off: (1) The investment rate of a country needs to be increased to at least 10% of its GDP, (2) One or two manufacturing sectors with a high rate of growth need to be established, (3) An institutional, political and social framework has to exist or be created in order to promote the expansion of those sectors.

The Rostow model has serious flaws, the most important of which are: (1) The model assumes that development can be achieved through a basic sequence of stages which are the same for all countries, a doubtful assumption; (2) The model measures development solely by means of the increase of GDP per capita; (3) The model focuses on characteristics of development, but does not identify the causal factors which lead development to occur. As such, it neglects the social structures that have to be present to foster development. Economic modernization theories such as Rostow's stages model have been heavily inspired by the Harrod-Domar model, which explains in a mathematical way the growth rate of a country in terms of the savings rate and the productivity of capital. Heavy state involvement has often been considered necessary for successful development in economic modernization theory; Paul Rosenstein-Rodan, Ragnar Nurkse and Kurt Mandelbaum argued that a big push model of infrastructure investment and planning was necessary for the stimulation of industrialization, and that the private sector would not be able to provide the resources for this on its own. Another influential theory of modernization is the dual-sector model by Arthur Lewis. In this model Lewis explained how the traditional, stagnant rural sector is gradually replaced by a growing modern and dynamic manufacturing and service economy.
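As a minimal sketch of the standard textbook statement of the Harrod-Domar relationship (a conventional formulation, not a quotation from the passage above), the growth rate g of an economy is written as

g = s / v

where s is the savings rate and v is the capital-output ratio, i.e. the inverse of the productivity of capital. Under these assumptions, a country that saves 10% of GDP and needs 4 units of capital per unit of output would grow at roughly 10% / 4 = 2.5% per year, which is why the linear stages models place so much weight on raising the investment rate.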

Because of their focus on the need for investments in capital, the linear stages of growth models are sometimes said to suffer from 'capital fundamentalism'.

Critics of modernization theory

Modernization theory views the traditions and pre-existing institutions of so-called "primitive" societies as obstacles to modern economic growth. Modernization which is forced from outside upon a society might induce violent and radical change, but according to modernization theorists it is generally worth this side effect. Critics point to traditional societies as being destroyed and slipping into a modern form of poverty without ever gaining the promised advantages of modernization.

Structuralism

Structuralism is a development theory that focuses on structural aspects which impede the economic growth of developing countries. The unit of analysis is the transformation of a country's economy from mainly subsistence agriculture to a modern, urbanized manufacturing and service economy. Policy prescriptions resulting from structuralist thinking include major government intervention in the economy to fuel the industrial sector, known as import substitution industrialization (ISI). This structural transformation of the developing country is pursued in order to create an economy which in the end enjoys self-sustaining growth. This can only be reached by ending the reliance of the underdeveloped country on exports of primary goods (agricultural and mining products), and pursuing inward-oriented development by shielding the domestic economy from that of the developed economies. Trade with advanced economies is minimized through the erection of all kinds of trade barriers and an overvaluation of the domestic exchange rate; in this way the production of domestic substitutes of formerly imported industrial products is encouraged. The logic of the strategy rests on the infant industry argument, which states that young industries initially do not have the economies of scale and experience to be able to compete with foreign competitors and thus need to be protected until they are able to compete in the free market. The Prebisch–Singer hypothesis states that over time the terms of trade for commodities deteriorate compared to those for manufactured goods, because the income elasticity of demand of manufactured goods is greater than that of primary products. If true, this would also support the ISI strategy.
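To make the Prebisch–Singer claim concrete, the terms of trade of a commodity exporter are conventionally defined as the ratio

T = P_x / P_m

where P_x is the price index of its (primary) exports and P_m the price index of its (manufactured) imports; the hypothesis is that T tends to fall over time. This is an illustrative statement of the standard definition, not a formula taken from the source text: if export prices stagnate while import prices rise, a country must export ever more coffee or copper to buy the same machinery, which is the deterioration the structuralists worried about.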

Structuralists argue that the only way Third World countries can develop is through action by the state. Third World countries have to push industrialization, reduce their dependency on trade with the First World, and increase trade among themselves.

The roots of structuralism lie in South America, and particularly Chile. In 1950, Raul Prebisch went to Chile to become the first director of the Economic Commission for Latin America. In Chile, he cooperated with Celso Furtado, Aníbal Pinto, Osvaldo Sunkel, and Dudley Seers, who all became influential structuralists.

Dependency theory

Dependency theory is essentially a follow-up to structuralist thinking, and shares many of its core ideas. Whereas structuralists did not consider that development would be possible at all unless a strategy of delinking and rigorous ISI was pursued, dependency thinking could allow development with external links with the developed parts of the globe. However, this kind of development is considered to be "dependent development", i.e., it does not have an internal domestic dynamic in the developing country and thus remains highly vulnerable to the economic vagaries of the world market. Dependency thinking starts from the notion that resources flow from the 'periphery' of poor and underdeveloped states to a 'core' of wealthy countries, which leads to accumulation of wealth in the rich states at the expense of the poor states. Contrary to modernization theory, dependency theory states that not all societies progress through similar stages of development. Periphery states have unique features, structures and institutions of their own and are considered weaker with regard to the world market economy, while the developed nations were never in this colonized position. Dependency theorists argue that underdeveloped countries remain economically vulnerable unless they reduce their connections to the world market.

Dependency theory states that poor nations provide natural resources and cheap labor for developed nations, without which the developed nations could not have the standard of living which they enjoy. When underdeveloped countries try to remove the Core's influence, the developed countries hinder their attempts in order to keep control. This means that the poverty of developing nations is not the result of their being left out of the world system, but of the way in which they are integrated into this system.

In addition to its structuralist roots, dependency theory has much overlap with Neo-Marxism and World Systems Theory, which is also reflected in the work of Immanuel Wallerstein, a famous dependency theorist. Wallerstein rejects the notion of a Third World, claiming that there is only one world which is connected by economic relations (World Systems Theory). He argues that this system inherently leads to a division of the world in core, semi-periphery and periphery. One of the results of expansion of the world-system is the commodification of things, like natural resources, labor and human relationships.

Basic needs

The basic needs model was introduced by the International Labour Organization in 1976, mainly in reaction to prevalent modernization- and structuralism-inspired development approaches, which were not achieving satisfactory results in terms of poverty alleviation and combating inequality in developing countries. It tried to define an absolute minimum of resources necessary for long-term physical well-being. The poverty line that follows from this is the amount of income needed to satisfy those basic needs. The approach has been applied in the sphere of development assistance, to determine what a society needs for subsistence, and for poor population groups to rise above the poverty line. Basic needs theory does not focus on investing in economically productive activities. Basic needs can be used as an indicator of the absolute minimum an individual needs to survive.

Proponents of basic needs have argued that elimination of absolute poverty is a good way to make people active in society so that they can provide labor more easily and act as consumers and savers. There have also been many critics of the basic needs approach, who argue that it lacks theoretical rigour and practical precision, conflicts with growth-promotion policies, and runs the risk of leaving developing countries in permanent turmoil.

Neoclassical theory

Neoclassical development theory has its origins in its predecessor: classical economics. Classical economics was developed in the 18th and 19th centuries and dealt with the value of products and the production factors on which it depends. Early contributors to this theory were Adam Smith and David Ricardo. Classical economists argued – as do the neoclassical ones – in favor of the free market, and against government intervention in those markets. The 'invisible hand' of Adam Smith makes sure that free trade will ultimately benefit all of society. John Maynard Keynes, by contrast, was a very influential critic of classical economics, having written his General Theory of Employment, Interest and Money in 1936.

Neoclassical development theory became influential towards the end of the 1970s, spurred by the election of Margaret Thatcher in the UK and Ronald Reagan in the USA. Also, the World Bank shifted from its Basic Needs approach to a neoclassical approach in 1980. From the beginning of the 1980s, neoclassical development theory became increasingly dominant.

Structural adjustment

One of the implications of neoclassical development theory for developing countries was the Structural Adjustment Programmes (SAPs) which the World Bank and the International Monetary Fund wanted them to adopt. Important aspects of those SAPs typically included fiscal austerity, the privatization of state-owned enterprises, trade liberalization, deregulation, and the devaluation of overvalued currencies.

These measures are more or less reflected by the themes identified by the Institute for International Economics which were believed to be necessary for the recovery of Latin America from the economic and financial crises of the 1980s. These themes are known as the Washington Consensus, a term coined in 1989 by the economist John Williamson.

Recent trends

Post-development theory

Postdevelopment theory is a school of thought which questions the idea of national economic development altogether. According to postdevelopment scholars, the goal of improving living standards leans on arbitrary claims as to the desirability and possibility of that goal. Postdevelopment theory arose in the 1980s and 1990s.

According to postdevelopment theorists, the idea of development is just a 'mental structure' (Wolfgang Sachs) which has resulted in a hierarchy of developed and underdeveloped nations, in which the underdeveloped nations desire to be like the developed nations. Development thinking has been dominated by the West and is very ethnocentric, according to Sachs. The Western lifestyle may be neither a realistic nor a desirable goal for the world's population, postdevelopment theorists argue. Development is seen as entailing the loss of a country's own culture, of people's perception of themselves, and of their modes of life. According to Majid Rahnema, another leading postdevelopment scholar, notions such as poverty are deeply culturally embedded and can differ greatly among cultures. The institutions that voice concern over underdevelopment are very Western-oriented, and postdevelopment calls for a broader cultural involvement in development thinking.

Postdevelopment proposes a vision of society which removes itself from the ideas which currently dominate it. According to Arturo Escobar, postdevelopment is interested instead in local culture and knowledge, a critical stance toward established sciences, and the promotion of local grassroots movements. Postdevelopment also argues for structural change in order to reach solidarity, reciprocity, and a larger involvement of traditional knowledge.

Sustainable development

Sustainable development is development that meets the needs of the present without compromising the ability of future generations to meet their own needs (Brundtland Commission). There are other definitions of sustainable development, but they all concern the carrying capacity of the earth and its natural systems and the challenges faced by humanity. Sustainable development can be broken up into environmental sustainability, economic sustainability and sociopolitical sustainability. The book The Limits to Growth, commissioned by the Club of Rome, gave huge momentum to thinking about sustainability. Global warming issues are also problems which are emphasized by the sustainable development movement. This led to the 1997 Kyoto Protocol, with its plan to cap greenhouse-gas emissions.

Opponents of the implications of sustainable development often point to the environmental Kuznets curve. The idea behind this curve is that, as an economy grows, it shifts towards more capital- and knowledge-intensive production. This means that as an economy grows, its pollution output increases, but only until it reaches a particular threshold where production becomes less resource-intensive and more sustainable. This implies that a pro-growth, not an anti-growth, policy is needed to solve the environmental problem. But the evidence for the environmental Kuznets curve is quite weak. Also, empirically speaking, people tend to consume more products when their income increases. Those products may have been produced in a more environmentally friendly way, but on the whole the higher consumption negates this effect. There are people, such as Julian Simon, however, who argue that future technological developments will resolve future problems.
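In empirical work, the inverted-U shape of the environmental Kuznets curve is often approximated by a quadratic relationship between pollution E and income per capita Y, for example

E = b_0 + b_1 Y + b_2 Y^2, with b_1 > 0 and b_2 < 0,

so that pollution peaks at the turning-point income Y* = -b_1 / (2 b_2) and declines thereafter. This functional form is an illustrative convention from the empirical literature, not something asserted in the text above.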

Human development theory

Human development theory is a theory which uses ideas from different origins, such as ecology, sustainable development, feminism and welfare economics. It seeks to avoid normative politics and is focused on how social capital and instructional capital can be deployed to optimize the overall value of human capital in an economy.

Amartya Sen and Mahbub ul Haq are the best-known human development theorists. The work of Sen is focused on capabilities: what people can do and be. It is these capabilities, rather than the income or goods that they receive (as in the Basic Needs approach), that determine their well-being. This core idea also underlies the construction of the Human Development Index, a human-focused measure of development pioneered by the UNDP in its Human Development Reports; this approach has become popular the world over, with indexes and reports published by individual countries, including the American Human Development Index and Report in the United States. The economic side of Sen's work can best be categorized under welfare economics, which evaluates the effects of economic policies on the well-being of people. Sen wrote the influential book Development as Freedom, which added an important ethical side to development economics.
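As an illustration of how a capability-oriented composite index is assembled, the sketch below computes a Human Development Index-style score as the geometric mean of health, education and income indices. The goalposts used (life expectancy between 20 and 85 years, schooling caps of 15 and 18 years, income between $100 and $75,000 GNI per capita) follow the UNDP's post-2010 methodology as commonly reported; treat them as assumptions and check the current Human Development Report technical notes before relying on them.

import math

def clamp01(x):
    # Keep each dimension index within [0, 1].
    return max(0.0, min(1.0, x))

def hdi(life_expectancy, mean_years_schooling, expected_years_schooling, gni_per_capita):
    # Geometric mean of three dimension indices; the goalposts are assumed
    # from the UNDP post-2010 methodology, not taken from the text above.
    health = clamp01((life_expectancy - 20) / (85 - 20))
    education = clamp01((mean_years_schooling / 15 + expected_years_schooling / 18) / 2)
    income = clamp01((math.log(gni_per_capita) - math.log(100)) /
                     (math.log(75000) - math.log(100)))
    return (health * education * income) ** (1 / 3)

# Hypothetical country: 75-year life expectancy, 10 mean and 15 expected
# years of schooling, and GNI per capita of about $15,000.
print(round(hdi(75, 10, 15, 15000), 3))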

Introduction to entropy

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Introduct...