
Tuesday, September 15, 2020

Self

From Wikipedia, the free encyclopedia
 

The self is an individual person as the object of its own reflective consciousness. Since the self is a reference by a subject to the same subject, this reference is necessarily subjective. The sense of having a self—or self-hood—should, however, not be confused with subjectivity itself. Ostensibly, this sense is directed outward from the subject to refer inward, back to its "self" (or itself). Examples of psychiatric conditions where such "sameness" may become broken include depersonalization, which sometimes occurs in schizophrenia: the self appears different from the subject.

The first-person perspective distinguishes self-hood from personal identity. Whereas "identity" is (literally) sameness and may involve categorization and labeling, self-hood implies a first-person perspective and suggests potential uniqueness. Conversely, we use "person" as a third-person reference. Personal identity can be impaired in late-stage Alzheimer's disease and in other neurodegenerative diseases. Finally, the self is distinguishable from "others". The self versus the other, including the distinction between sameness and otherness, is a research topic in contemporary philosophy and phenomenology, as well as in psychology, psychiatry, neurology, and neuroscience.

Although subjective experience is central to self-hood, the privacy of this experience is only one of many problems in the philosophy of self and the scientific study of consciousness.

Neuroscience

Two areas of the brain that are important in retrieving self-knowledge are the medial prefrontal cortex and the medial posterior parietal cortex. The posterior cingulate cortex, the anterior cingulate cortex, and medial prefrontal cortex are thought to combine to provide humans with the ability to self-reflect. The insular cortex is also thought to be involved in the process of self-reference.

Psychology

The psychology of self is the study of either the cognitive and affective representation of one's identity or the subject of experience. The earliest formulation of the self in modern psychology drew the distinction between the self as I, the subjective knower, and the self as Me, the object that is known. Current views of the self in psychology position the self as playing an integral part in human motivation, cognition, affect, and social identity. Following John Locke, the self has been seen as a product of episodic memory, but research on people with amnesia finds that they have a coherent sense of self based on preserved conceptual autobiographical knowledge. It is increasingly possible to correlate the cognitive and affective experience of self with neural processes. A goal of this ongoing research is to provide grounding and insight into the elements of which the complex, multiply situated selves of human identity are composed. Disorders of the self have also been extensively studied by psychiatrists.

For example, facial and pattern recognition take large amounts of brain processing capacity, but pareidolia cannot explain many constructs of self in cases of disorder such as schizophrenia or schizoaffective disorder. One's sense of self can also change upon becoming part of a stigmatized group. According to Cox, Abramson, Devine, and Hollon (2012), if an individual has prejudice against a certain group, such as the elderly, and then later becomes part of this group, this prejudice can be turned inward, causing depression (i.e., deprejudice).

The philosophy of a disordered self, such as in schizophrenia, is described in terms of events that the psychiatrist understands to be actual neuronal excitation, yet delusions nonetheless, and that the schizoaffective or schizophrenic person believes to be actual events of essential being. PET scans have shown that auditory stimulation is processed in certain areas of the brain, and that imagined similar events are processed in adjacent areas, but hallucinations are processed in the same areas as actual stimulation. In such cases, external influences may be the source of consciousness, and the person may or may not be responsible for "sharing" in the mind's process; or the events which occur, such as visions and auditory stimuli, may persist and be repeated over hours, days, months or years, and the afflicted person may believe themselves to be in a state of rapture or possession.

What the Freudian tradition has subjectively called the "sense of self" corresponds in Jungian analytic psychology to the persona or ego, in which one's identity is lodged and which is subject to change in maturation. Carl Jung distinguished the self from the ego: "The self is not only the center, but also the whole circumference which embraces both conscious and unconscious; it is the center of this totality...". The Self in Jungian psychology is "the archetype of wholeness and the regulating center of the psyche ... a transpersonal power that transcends the ego." As a Jungian archetype, it cannot be seen directly, but through ongoing individuating maturation and analytic observation it can be experienced objectively by its cohesive, wholeness-making factor.

Sociology

The self can be redefined as a dynamic, responsive process that structures neural pathways according to past and present environments, including their material, social, and spiritual aspects. The self-concept is the concept or belief that an individual has of himself or herself as an emotional, spiritual, and social being. The self-concept is thus the idea of who I am, a kind of reflection on one's own being; for example, it includes anything one says about oneself.

A society is a group of people who share a common belief or aspect of self and interact for the maintenance or betterment of the collective. Culture consists of explicit and implicit patterns of historically derived and selected ideas and their embodiment in institutions, cognitive and social practices, and artifacts. Cultural systems may, on the one hand, be considered as products of action and, on the other, as conditioning elements of further action. The following sections therefore explore how the self and the self-concept can change across different cultures.

Markus and Kitayama's early 1990s theory hypothesized that representations of the self in human cultures would fall on a continuum from independent to interdependent. The independent self is supposed to be egoistic, unique, separated from its various contexts, critical in judgement and prone to self-expression. The interdependent self is supposed to be altruistic, similar to others, flexible according to context, conformist and unlikely to express opinions that would disturb the harmony of his or her group of belonging. This theory enjoyed huge popularity despite its many problems, such as being based on popular stereotypes and myths about different cultures rather than on rigorous scientific research, and postulating a series of causal links between culture and self-construals without presenting evidence to support them. A large study from 2016, involving a total of 10,203 participants from 55 cultural groups, found that there is no independent versus interdependent dimension of self-construal, because the traits supposed by Markus & Kitayama to form a coherent construct do not actually correlate, or, if they do correlate, they have correlations opposite to those postulated by Markus & Kitayama. Instead, there are seven separate dimensions of self-construal, which can be found at both the cultural level of analysis and the individual level of analysis: difference versus similarity (whether the individual considers himself or herself to be a unique person or to be the same as everybody else), self-containment versus connection to others (feeling oneself as separated from others versus feeling oneself as together with others), self-direction versus receptiveness to influence (independent thinking versus conformity), self-reliance versus dependence on others, consistency versus variability across contexts, self-expression versus harmony, and self-interest versus commitment to others.

Westerners, Latin Americans and the Japanese are relatively likely to represent their individual self as unique and different from that of others, while Arabs, South-East Asians and Africans are relatively likely to represent their self as similar to that of others. Individuals from Uganda, Japan, Colombia, Namibia, Ghana and Belgium were most likely to represent their selves as being emotionally separated from the community, while individuals from Oman, Malaysia, Thailand and central Brazil were most likely to consider themselves as emotionally connected to their communities. Japanese, Belgians, British and Americans from Colorado were most likely to value independent thinking and to consider themselves as making their own decisions in life independently from others. On the other hand, respondents from rural Peru, Malaysia, Ghana, Oman and Hungary were most likely to place more value on following others rather than thinking for themselves, as well as to describe themselves as often influenced by others in their decisions. Middle Easterners from Lebanon, Turkey, Egypt and Oman were most likely to value self-reliance and to consider themselves as working on their own and being economically independent from others. On the other hand, respondents from Uganda, Japan and Namibia were most likely to consider cooperation between different individuals in economic activities as important. Chileans, Ethiopians from the highlands, Turks and people from Lebanon placed a relatively high degree of importance on maintaining a stable pattern of behavior regardless of situation or context. Individuals from Japan, Cameroon, the United Kingdom and Sweden were most likely to describe themselves as adaptable to various contexts and to place value on this ability. Colombians, Chileans, US Hispanics, Belgians and Germans were most likely to consider self-expression as more important than maintaining harmony within a group. Respondents from Oman, Cameroon and Malaysia were most likely to say that they prefer keeping harmony within a group to engaging in self-expression. Sub-Saharan Africans from Namibia, Ghana and Uganda considered that they would follow their own interests even if this meant harming the interests of those close to them. Europeans from Belgium, Italy and Sweden had the opposite preference, considering self-sacrifice for other members of the community as more important than accomplishing selfish goals.

Contrary to the theory of Markus & Kitayama, egoism correlates negatively with individual uniqueness, independent thinking and self-expression. Self-reliance correlates strongly and negatively with emotional self-containment, which is also unexpected given Markus & Kitayama's theory. The binary classification of cultural self-construals into independent versus interdependent is deeply flawed because, in reality, the traits do not correlate as Markus & Kitayama's self-construal theory predicts, and the theory fails to take into consideration the extremely diverse and complex variety of self-construals present in various cultures across the world.

The way individuals construe themselves may differ across cultures. The self is dynamic and complex, and it will change or conform to whatever social influence it is exposed to. The main reason the self is constantly dynamic is that it continually seeks to avoid harm: the self in any culture looks out for its well-being and will avoid as much threat as possible. This has been explained through the evolutionary concept of survival of the fittest.

Philosophy

The philosophy of self seeks to describe essential qualities that constitute a person's uniqueness or essential being. There have been various approaches to defining these qualities. The self can be considered that being which is the source of consciousness, the agent responsible for an individual's thoughts and actions, or the substantial nature of a person which endures and unifies consciousness over time.

In addition to Emmanuel Levinas's writings on "otherness", the distinction between "you" and "me" has been further elaborated in Martin Buber's philosophical work Ich und Du (I and Thou).

Religion

Religious views on the self vary widely. The self is a complex and core subject in many forms of spirituality. Two types of self are commonly considered—the self that is the ego, also called the learned, superficial self of mind and body, an egoic creation, and the self which is sometimes called the "True Self", the "Observing Self", or the "Witness". In Hinduism, the Ātman (self) is not an individual, but a representation of the transcendent reality Brahman.

One description of spirituality is the self's search for "ultimate meaning" through an independent comprehension of the sacred. Another definition of spiritual identity is "a persistent sense of self that addresses ultimate questions about the nature, purpose, and meaning of life, resulting in behaviors that are consonant with the individual's core values". Spiritual identity appears when the symbolic religious and spiritual value of a culture is found by individuals in the setting of their own life. There can be different types of spiritual self because it is determined by one's life and experiences.

Human beings have a self—that is, they are able to look back on themselves as both subjects and objects in the universe. Ultimately, this brings questions about who we are and the nature of our own importance. Traditions such as Buddhism see the attachment to self as an illusion that serves as the main cause of suffering and unhappiness. Christianity makes a distinction between the true self and the false self, and sees the false self negatively, distorted through sin: 'The heart is deceitful above all things, and desperately wicked; who can know it?' (Jeremiah 17:9)

According to Marcia Cavell, identity comes from both political and religious views. Cavell also identified exploration and commitment as interactive parts of identity formation, which includes religious identity. Erik Erikson compared faith with doubt and found that healthy adults pay attention to their spiritual side.

Abstract and concrete

From Wikipedia, the free encyclopedia

In metaphysics, abstract and concrete are classifications that denote whether the object that a term describes has physical referents. Abstract objects have no physical referents, whereas concrete objects do. They are most commonly used in philosophy and semantics. Abstract objects are sometimes called abstracta (sing. abstractum) and concrete objects are sometimes called concreta (sing. concretum). An abstract object is an object that does not exist at any particular time or place, but rather exists as a type of thing—i.e., an idea, or abstraction. The term abstract object is said to have been coined by Willard Van Orman Quine.

The study of abstract objects is called abstract object theory (AOT). According to AOT, some objects (the ordinary concrete ones around us, like tables and chairs) exemplify properties, while others (abstract objects like numbers, and what others would call "non-existent objects", like the round square, and the mountain made entirely of gold) merely encode them (this is also known as the dual copula strategy).
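As an informal sketch of the dual copula idea (the notation follows Zalta's usage as it is commonly presented, and the example objects are just those mentioned above):

    % Two modes of predication in abstract object theory:
    % ordinary objects exemplify properties; abstract objects may merely encode them.
    Fx \quad \text{(``$x$ exemplifies $F$'', e.g. this table exemplifies being a table)}
    \qquad
    xF \quad \text{(``$x$ encodes $F$'', e.g. the round square encodes roundness and squareness)}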

In philosophy

The type–token distinction identifies physical objects that are tokens of a particular type of thing. The "type" of which they are tokens is itself an abstract object. The abstract–concrete distinction is often introduced and initially understood in terms of paradigmatic examples of objects of each kind:

Examples of abstract and concrete objects
Abstract | Concrete
Tennis | A tennis match
Redness | Red light reflected off of an apple and hitting one's eyes
Five | Five cars
Justice | A just action
Humanity (the property of being human) | Human population (the set of all humans)

Abstract objects have often garnered the interest of philosophers because they raise problems for popular theories. In ontology, abstract objects are considered problematic for physicalism and some forms of naturalism. Historically, the most important ontological dispute about abstract objects has been the problem of universals. In epistemology, abstract objects are considered problematic for empiricism. If abstracta lack causal powers or spatial location, how do we know about them? It is hard to say how they can affect our sensory experiences, and yet we seem to agree on a wide range of claims about them.

Some, such as Ernst Mally, Edward Zalta, and, arguably, Plato in his Theory of Forms, have held that abstract objects constitute the defining subject matter of metaphysics or philosophical inquiry more broadly. To the extent that philosophy is independent of empirical research, and to the extent that empirical questions do not inform questions about abstracta, philosophy would seem especially suited to answering these latter questions.

In modern philosophy, the distinction between abstract and concrete was explored by Immanuel Kant and G. W. F. Hegel.

Gottlob Frege said that abstract objects, such as numbers, were members of a third realm, different from the external world or from internal consciousness.

Abstract objects and causality

Another popular proposal for drawing the abstract–concrete distinction contends that an object is abstract if it lacks any causal powers. A causal power is the ability to affect something causally. Thus, the empty set is abstract because it cannot act on other objects. One problem for this view is that it is not clear exactly what it is to have a causal power. For a more detailed exploration of the abstract–concrete distinction, see the relevant Stanford Encyclopedia of Philosophy article.

Quasi-abstract entities

Recently, there has been some philosophical interest in the development of a third category of objects known as the quasi-abstract. Quasi-abstract objects have drawn particular attention in the area of social ontology and documentality. Some argue that over-adherence to the platonist duality of the concrete and the abstract has led to a large category of social objects being overlooked or rejected as nonexistent because they exhibit characteristics that the traditional duality regards as incompatible: specifically, the ability to have temporal location but not spatial location, and to have causal agency (if only by acting through representatives). These characteristics are exhibited by a number of social objects, including states of the international legal system.

Concrete and abstract thought in psychology

Jean Piaget uses the terms "concrete" and "formal" to describe two different types of learning. Concrete thinking involves facts and descriptions about everyday, tangible objects, while abstract (formal operational) thinking involves mental processes that go beyond tangible particulars, such as hypothetical reasoning.

Concrete idea | Abstract idea
Dense things sink. | It will sink if its density is greater than the density of the fluid.
You breathe in oxygen and breathe out carbon dioxide. | Gas exchange takes place between the air in the alveoli and the blood.
Plants get water through their roots. | Water diffuses through the cell membrane of the root hair cells.

Abstraction

From Wikipedia, the free encyclopedia

Abstraction in its main sense is a conceptual process where general rules and concepts are derived from the usage and classification of specific examples, literal ("real" or "concrete") signifiers, first principles, or other methods.

"An abstraction" is the outcome of this process—a concept that acts as a common noun for all subordinate concepts, and connects any related concepts as a group, field, or category.

Conceptual abstractions may be formed by filtering the information content of a concept or an observable phenomenon, selecting only the aspects which are relevant for a particular subjectively valued purpose. For example, abstracting a leather soccer ball to the more general idea of a ball selects only the information on general ball attributes and behavior, excluding, but not eliminating, the other phenomenal and cognitive characteristics of that particular ball. In a type–token distinction, a type (e.g., a 'ball') is more abstract than its tokens (e.g., 'that leather soccer ball').
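A minimal sketch in Python of this kind of filtering, with made-up attribute names, abstracting one particular leather soccer ball to the more general idea of a ball:

    # Hypothetical attributes of one particular ball; only the general ball
    # attributes and behavior are selected, the rest are excluded (not eliminated).
    leather_soccer_ball = {
        "shape": "sphere",      # general ball attribute (kept)
        "bounces": True,        # general ball behavior (kept)
        "material": "leather",  # particular detail (excluded)
        "panel_count": 32,      # particular detail (excluded)
    }

    GENERAL_BALL_ATTRIBUTES = {"shape", "bounces"}

    ball = {k: v for k, v in leather_soccer_ball.items() if k in GENERAL_BALL_ATTRIBUTES}
    print(ball)  # {'shape': 'sphere', 'bounces': True}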

Abstraction in its secondary use is a material process, discussed in the themes below.

Origins

Thinking in abstractions is considered by anthropologists, archaeologists, and sociologists to be one of the key traits in modern human behaviour, which is believed to have developed between 50,000 and 100,000 years ago. Its development is likely to have been closely connected with the development of human language, which (whether spoken or written) appears to both involve and facilitate abstract thinking.

History

Abstraction involves induction of ideas or the synthesis of particular facts into one general theory about something. It is the opposite of specification, which is the analysis or breaking-down of a general idea or abstraction into concrete facts. Abstraction can be illustrated with Francis Bacon's Novum Organum (1620), a book of modern scientific philosophy written in the late Jacobean era of England to encourage modern thinkers to collect specific facts before making any generalizations.

Bacon used and promoted induction as an abstraction tool, and it countered the ancient deductive-thinking approach that had dominated the intellectual world since the times of Greek philosophers like Thales, Anaximander, and Aristotle. Thales (c. 624–546 BCE) believed that everything in the universe comes from one main substance, water. He deduced or specified from a general idea, "everything is water", to the specific forms of water such as ice, snow, fog, and rivers.

Modern scientists can also use the opposite approach, abstraction, going from particular collected facts to one general idea, as Newton (1642–1727) did with the motion of the planets. After determining that the Sun is the center of our solar system (Copernicus, 1473–1543), scientists had to use thousands of measurements to conclude that Mars moves in an elliptical orbit about the Sun (Kepler, 1571–1630), or to assemble multiple specific facts into the law of falling bodies (Galileo, 1564–1642).

Themes

Compression

An abstraction can be seen as a compression process, mapping multiple different pieces of constituent data to a single piece of abstract data; based on similarities in the constituent data, for example, many different physical cats map to the abstraction "CAT". This conceptual scheme emphasizes the inherent equality of both constituent and abstract data, thus avoiding problems arising from the distinction between "abstract" and "concrete". In this sense the process of abstraction entails the identification of similarities between objects, and the process of associating these objects with an abstraction (which is itself an object).
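A minimal Python sketch of this many-to-one mapping (the cat names other than Elsie are invented for illustration):

    # Abstraction as compression: many constituent data points map to one abstract datum.
    particular_cats = ["Elsie", "Tom", "Felix"]             # many different physical cats
    abstraction = {name: "CAT" for name in particular_cats}
    print(abstraction)                # {'Elsie': 'CAT', 'Tom': 'CAT', 'Felix': 'CAT'}
    print(set(abstraction.values()))  # {'CAT'}: one abstract datum for many particulars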

For example, picture 1 below illustrates the concrete relationship "Cat sits on Mat".

Chains of abstractions can be construed, moving from neural impulses arising from sensory perception to basic abstractions such as color or shape, to experiential abstractions such as a specific cat, to semantic abstractions such as the "idea" of a CAT, to classes of objects such as "mammals" and even categories such as "object" as opposed to "action".

For example, graph 1 below expresses the abstraction "agent sits on location". This conceptual scheme entails no specific hierarchical taxonomy (such as the one mentioned involving cats and mammals), only a progressive exclusion of detail.

Instantiation

Non-existent things in any particular place and time are often seen as abstract. By contrast, instances, or members, of such an abstract thing might exist in many different places and times.

Those abstract things are then said to be multiply instantiated, in the sense of picture 1, picture 2, etc., shown below. It is not sufficient, however, to define abstract ideas as those that can be instantiated and to define abstraction as the movement in the opposite direction to instantiation. Doing so would make the concepts "cat" and "telephone" abstract ideas since despite their varying appearances, a particular cat or a particular telephone is an instance of the concept "cat" or the concept "telephone". Although the concepts "cat" and "telephone" are abstractions, they are not abstract in the sense of the objects in graph 1 below. We might look at other graphs, in a progression from cat to mammal to animal, and see that animal is more abstract than mammal; but on the other hand mammal is a harder idea to express, certainly in relation to marsupial or monotreme.

Perhaps confusingly, some philosophies refer to tropes (instances of properties) as abstract particulars—e.g., the particular redness of a particular apple is an abstract particular. This is similar to qualia and sumbebekos.

Material process

Still retaining the primary meaning of the Latin abstrahere, 'to draw away from', the abstraction of money, for example, works by drawing away from the particular value of things, allowing completely incommensurate objects to be compared (see the section on 'Physicality' below). Karl Marx's writing on the commodity abstraction recognizes a parallel process.

The state (polity) as both concept and material practice exemplifies the two sides of this process of abstraction. Conceptually, 'the current concept of the state is an abstraction from the much more concrete early-modern use as the standing or status of the prince, his visible estates'. At the same time, materially, the 'practice of statehood is now constitutively and materially more abstract than at the time when princes ruled as the embodiment of extended power'.

Ontological status

The way that physical objects, like rocks and trees, have being differs from the way that properties of abstract concepts or relations have being; for example, the way the concrete, particular individuals pictured in picture 1 exist differs from the way the concepts illustrated in graph 1 exist. That difference accounts for the ontological usefulness of the word "abstract". The word applies to properties and relations to mark the fact that, if they exist, they do not exist in space or time, but that instances of them can exist, potentially in many different places and times.

Physicality

A physical object (a possible referent of a concept or word) is considered concrete (not abstract) if it is a particular individual that occupies a particular place and time. However, in the secondary sense of the term 'abstraction', this physical object can carry materially abstracting processes. For example, record-keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, probably livestock or grains, sealed in containers. According to Schmandt-Besserat (1981), these clay containers held tokens, the total of which was the count of objects being transferred. The containers thus served as something of a bill of lading or an accounts book. In order to avoid breaking open the containers for the count, marks were placed on the outside of the containers. These physical marks, in other words, acted as material abstractions of a materially abstract process of accounting, using conceptual abstractions (numbers) to communicate its meaning.

Abstract things are sometimes defined as those things that do not exist in reality or exist only as sensory experiences, like the color red. That definition, however, suffers from the difficulty of deciding which things are real (i.e. which things exist in reality). For example, it is difficult to agree on whether concepts like God, the number three, and goodness are real, abstract, or both.

An approach to resolving such difficulty is to use predicates as a general term for whether things are variously real, abstract, concrete, or of a particular property (e.g., good). Questions about the properties of things are then propositions about predicates, which propositions remain to be evaluated by the investigator. In graph 1 below, the graphical relationships like the arrows joining boxes and ellipses might denote predicates.
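For illustration (an informal rendering, not notation taken from the article), such claims can be written as predicate propositions whose truth remains to be evaluated:

    % Statuses and properties rendered as predicates applied to things.
    \mathrm{Real}(\mathrm{God}), \qquad
    \mathrm{Abstract}(\text{the number three}), \qquad
    \mathrm{Concrete}(\text{this apple}), \qquad
    \mathrm{Good}(x)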

Referencing and referring

Abstractions sometimes have ambiguous referents; for example, "happiness" (when used as an abstraction) can refer to as many things as there are people and events or states of being which make them happy. Likewise, "architecture" refers not only to the design of safe, functional buildings, but also to elements of creation and innovation which aim at elegant solutions to construction problems, to the use of space, and to the attempt to evoke an emotional response in the builders, owners, viewers and users of the building.

Simplification and ordering

Abstraction uses a strategy of simplification, wherein formerly concrete details are left ambiguous, vague, or undefined; thus effective communication about things in the abstract requires an intuitive or common experience between the communicator and the communication recipient. This is true for all verbal/abstract communication.

Conceptual graph for A Cat sitting on the Mat (graph 1)
 
Cat on Mat (picture 1)

For example, many different things can be red. Likewise, many things sit on surfaces (as in picture 1, to the right). The property of redness and the relation sitting-on are therefore abstractions of those objects. Specifically, the conceptual diagram graph 1 identifies only three boxes, two ellipses, and four arrows (and their five labels), whereas picture 1 shows much more pictorial detail, with scores of relationships left implicit in the picture rather than explicitly named, as the nine details are in the graph.

Graph 1 details some explicit relationships between the objects of the diagram. For example, the arrow between the agent and CAT:Elsie depicts an example of an is-a relationship, as does the arrow between the location and the MAT. The arrows between the gerund/present participle SITTING and the nouns agent and location express the diagram's basic relationship; "agent is SITTING on location"; Elsie is an instance of CAT.
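A rough Python sketch of graph 1's explicit details written down as labeled edges; this is one possible reading, and the edge labels "has-agent" and "has-location" are invented for illustration:

    # Graph 1 as (source, label, target) edges: two is-a arrows plus the arrows
    # linking SITTING to the agent and the location.
    edges = [
        ("agent", "is-a", "CAT: Elsie"),          # the agent is an instance of CAT (Elsie)
        ("location", "is-a", "MAT"),              # the location is the MAT
        ("SITTING", "has-agent", "agent"),        # hypothetical label for the agent arrow
        ("SITTING", "has-location", "location"),  # hypothetical label for the location arrow
    ]
    # Read together, the edges give the diagram's basic relationship:
    # "agent is SITTING on location"; Elsie is an instance of CAT.
    for source, label, target in edges:
        print(f"{source} --{label}--> {target}")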

Although the description sitting-on (graph 1) is more abstract than the graphic image of a cat sitting on a mat (picture 1), the delineation of abstract things from concrete things is somewhat ambiguous; this ambiguity or vagueness is characteristic of abstraction. Thus something as simple as a newspaper might be specified to six levels, as in Douglas Hofstadter's illustration of that ambiguity, with a progression from abstract to concrete in Gödel, Escher, Bach (1979):

(1) a publication
(2) a newspaper
(3) The San Francisco Chronicle
(4) the May 18 edition of The San Francisco Chronicle
(5) my copy of the May 18 edition of The San Francisco Chronicle
(6) my copy of the May 18 edition of The San Francisco Chronicle as it was when I first picked it up (as contrasted with my copy as it was a few days later: in my fireplace, burning)

An abstraction can thus encapsulate each of these levels of detail with no loss of generality. But perhaps a detective or philosopher/scientist/engineer might seek to learn about something, at progressively deeper levels of detail, to solve a crime or a puzzle.

Thought processes

In philosophical terminology, abstraction is the thought process wherein ideas are distanced from objects.

As used in different disciplines

In art

Typically, abstraction is used in the arts as a synonym for abstract art in general. Strictly speaking, it refers to art unconcerned with the literal depiction of things from the visible world—it can, however, refer to an object or image which has been distilled from the real world, or indeed, another work of art. Artwork that reshapes the natural world for expressive purposes is called abstract; that which derives from, but does not imitate, a recognizable subject is called nonobjective abstraction. In the 20th century the trend toward abstraction coincided with advances in science, technology, and changes in urban life, eventually reflecting an interest in psychoanalytic theory. Later still, abstraction was manifest in more purely formal terms, such as color, freedom from objective context, and a reduction of form to basic geometric designs.

In computer science

Computer scientists use abstraction to make models that can be used and re-used without having to re-write all the program code for each new application on every different type of computer. They communicate their solutions with the computer by writing source code in some particular computer language which can be translated into machine code for different types of computers to execute. Abstraction allows program designers to separate a framework (categorical concepts related to computing problems) from specific instances which implement details. This means that the program code can be written so that it doesn't have to depend on the specific details of supporting applications, operating system software, or hardware, but on a categorical concept of the solution. A solution to the problem can then be integrated into the system framework with minimal additional work. This allows programmers to take advantage of another programmer's work while requiring only an abstract understanding of how that work is implemented, beyond knowing the problem that it solves.
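A minimal Python sketch of this separation; the names (Storage, InMemoryStorage, archive_note) are hypothetical, and the point is only that the framework code depends on the categorical concept rather than on any specific backend:

    from abc import ABC, abstractmethod

    class Storage(ABC):
        """Categorical concept: anything that can save and load text by key."""

        @abstractmethod
        def save(self, key: str, value: str) -> None: ...

        @abstractmethod
        def load(self, key: str) -> str: ...

    class InMemoryStorage(Storage):
        """One specific instance that implements the details."""

        def __init__(self) -> None:
            self._data: dict[str, str] = {}

        def save(self, key: str, value: str) -> None:
            self._data[key] = value

        def load(self, key: str) -> str:
            return self._data[key]

    def archive_note(store: Storage, text: str) -> str:
        """Framework code: written against the abstraction, reusable with any backend."""
        store.save("note", text)
        return store.load("note")

    print(archive_note(InMemoryStorage(), "hello"))  # prints: hello

Swapping in a file-backed or database-backed Storage would require no change to archive_note, which is the sense in which the framework is separated from the implementation details.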

In general semantics

Abstractions and levels of abstraction play an important role in the theory of general semantics originated by Alfred Korzybski. Anatol Rapoport wrote: "Abstracting is a mechanism by which an infinite variety of experiences can be mapped on short noises (words)."

In history

Francis Fukuyama defines history as "a deliberate attempt of abstraction in which we separate out important from unimportant events".

In linguistics

Researchers in linguistics frequently apply abstraction so as to allow analysis of the phenomena of language at the desired level of detail. A commonly used abstraction, the phoneme, abstracts speech sounds in such a way as to neglect details that cannot serve to differentiate meaning. Other analogous kinds of abstractions (sometimes called "emic units") considered by linguists include morphemes, graphemes, and lexemes.

Abstraction also arises in the relation between syntax, semantics, and pragmatics. Pragmatics involves considerations that make reference to the user of the language; semantics considers expressions and what they denote (the designata) abstracted from the language user; and syntax considers only the expressions themselves, abstracted from the designata.

In mathematics

Abstraction in mathematics is the process of extracting the underlying structures, patterns or properties of a mathematical concept or object, removing any dependence on the real-world objects with which it might originally have been connected, and generalizing it so that it has wider applications or matches other abstract descriptions of equivalent phenomena.

The advantages of abstraction in mathematics are:

  • It reveals deep connections between different areas of mathematics.
  • Known results in one area can suggest conjectures in another related area.
  • Techniques and methods from one area can be applied to prove results in other related areas.
  • Patterns from one mathematical object can be generalized to other similar objects in the same class.

The main disadvantage of abstraction is that highly abstract concepts are more difficult to learn, and might require a degree of mathematical maturity and experience before they can be assimilated.
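As a small illustration (a standard textbook example, not drawn from this article), the structural properties of integer addition can be abstracted into the axioms of a group, which then apply to many unrelated structures:

    % From the particular structure (Z, +) to the abstract notion of a group (G, *):
    \begin{align*}
    &\text{closure:}       && a * b \in G\\
    &\text{associativity:} && (a * b) * c = a * (b * c)\\
    &\text{identity:}      && \exists\, e \in G \ \text{with}\ a * e = e * a = a\\
    &\text{inverses:}      && \forall\, a \in G \ \exists\, a^{-1} \in G \ \text{with}\ a * a^{-1} = a^{-1} * a = e
    \end{align*}
    % The integers under addition, the nonzero rationals under multiplication, and the
    % rotations of a square all satisfy these axioms, so any result proved from the
    % axioms alone applies to each of them.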

In music

In music, the term abstraction can be used to describe improvisatory approaches to interpretation, and may sometimes indicate abandonment of tonality. Atonal music has no key signature, and is characterized by the exploration of internal numeric relationships.

In neurology

A recent meta-analysis suggests that the verbal system is more engaged for abstract concepts, whereas the perceptual system is more engaged for processing concrete concepts. This is because abstract concepts elicit greater brain activity in the inferior frontal gyrus and middle temporal gyrus compared to concrete concepts, which elicit greater activity in the posterior cingulate, precuneus, fusiform gyrus, and parahippocampal gyrus. Other research into the human brain suggests that the left and right hemispheres differ in their handling of abstraction. For example, one meta-analysis reviewing human brain lesions has shown a left hemisphere bias during tool usage.

In philosophy

Abstraction in philosophy is the process (or, to some, the alleged process) in concept formation of recognizing some set of common features in individuals, and on that basis forming a concept of that feature. The notion of abstraction is important to understanding some philosophical controversies surrounding empiricism and the problem of universals. It has also recently become popular in formal logic under predicate abstraction. Another philosophical tool for discussion of abstraction is thought space.

John Locke defined abstraction in An Essay Concerning Human Understanding:

'So words are used to stand as outward marks of our internal ideas, which are taken from particular things; but if every particular idea that we take in had its own special name, there would be no end to names. To prevent this, the mind makes particular ideas received from particular things become general; which it does by considering them as they are in the mind—mental appearances—separate from all other existences, and from the circumstances of real existence, such as time, place, and so on. This procedure is called abstraction. In it, an idea taken from a particular thing becomes a general representative of all of the same kind, and its name becomes a general name that is applicable to any existing thing that fits that abstract idea.' 2.11.9

In psychology

Carl Jung's definition of abstraction broadened its scope beyond the thinking process to include exactly four mutually exclusive but complementary psychological functions: sensation, intuition, feeling, and thinking. Together they form a structural totality of the differentiating abstraction process. Abstraction operates in one of these functions when it excludes the simultaneous influence of the other functions and of other irrelevancies, such as emotion. Abstraction requires selective use of this structural split of abilities in the psyche. The opposite of abstraction is concretism. Abstraction is one of Jung's 57 definitions in Chapter XI of Psychological Types.

There is an abstract thinking, just as there is abstract feeling, sensation and intuition. Abstract thinking singles out the rational, logical qualities ... Abstract feeling does the same with ... its feeling-values. ... I put abstract feelings on the same level as abstract thoughts. ... Abstract sensation would be aesthetic as opposed to sensuous sensation and abstract intuition would be symbolic as opposed to fantastic intuition. (Jung, [1921] (1971): par. 678).

In social theory

In social theory, abstraction is used as both an ideational and a material process. Alfred Sohn-Rethel asked, "Can there be abstraction other than by thought?" He used the example of commodity abstraction to show that abstraction occurs in practice as people create systems of abstract exchange that extend beyond the immediate physicality of the object and yet have real and immediate consequences. This work was extended through the 'Constitutive Abstraction' approach of writers associated with the journal Arena. Two books that have taken this theme of the abstraction of social relations as an organizing process in human history are Nation Formation: Towards a Theory of Abstract Community (1996) and the second volume, published in 2006, Globalism, Nationalism, Tribalism: Bringing Theory Back In. These books argue that the nation is an abstract community bringing together strangers who will never meet as such, thus constituting materially real and substantial, but abstracted and mediated, relations. The books suggest that contemporary processes of globalization and mediatization have contributed to materially abstracting relations between people, with major consequences for how we live our lives.

It can easily be argued that abstraction is an elementary methodological tool in several disciplines of social science. These disciplines have definite and different concepts of man that highlight, by idealization, those aspects of man and his behaviour that are relevant for the given human science. For example, homo sociologicus is man as sociology abstracts and idealizes him, depicting man as a social being. Moreover, we could talk about homo cyber sapiens (the man who can extend his biologically determined intelligence thanks to new technologies) or homo creativus (who is simply creative).

Abstraction (combined with Weberian idealization) plays a crucial role in economics. Breaking away from directly experienced reality was a common trend in 19th-century science (especially physics), and it was this effort which fundamentally determined the way economics tried, and still tries, to approach the economic aspects of social life. It is abstraction we meet in the case of both Newton's physics and neoclassical theory, since the goal was to grasp the unchangeable and timeless essence of phenomena. For example, Newton created the concept of the material point by following the abstraction method: he abstracted from the dimension and shape of any perceptible object, preserving only inertial and translational motion. The material point is the ultimate and common feature of all bodies. Neoclassical economists created the indefinitely abstract notion of homo economicus by following the same procedure. Economists abstract from all individual and personal qualities in order to get to those characteristics that embody the essence of economic activity. Eventually, it is the substance of the economic man that they try to grasp. Any characteristic beyond it only disturbs the functioning of this essential core.

Big History

From Wikipedia, the free encyclopedia
 
A diagram of the Big Bang expansion according to NASA
 
Artist's depiction of the WMAP satellite gathering data to help scientists understand the Big Bang

Big History is an academic discipline which examines history from the Big Bang to the present. Big History resists specialization and searches for universal patterns or trends. It examines long time frames using a multidisciplinary approach that combines numerous disciplines from science and the humanities, and explores human existence in the context of this bigger picture. It integrates studies of the cosmos, Earth, life, and humanity using empirical evidence to explore cause-and-effect relations, and it is taught at universities and at primary and secondary schools, often using web-based interactive presentations.

Historian David Christian has been credited with coining the term "Big History" while teaching one of the first such courses at Macquarie University. An all-encompassing study of humanity's relationship to cosmology and natural history has been pursued by scholars since the Renaissance, and the new field, Big History, continues such work.

Comparison with conventional history

Conventional history | Big History
5000 BCE to present | Big Bang to present
7,000–10,000 years | 13.8 billion years
Compartmentalized fields of study | Interdisciplinary approach
Focus on human civilization | Focus on how humankind fits within the universe
Taught mostly with books | Taught on interactive platforms at Coursera, YouTube's Crash Course, Big History Project, Macquarie University, ChronoZoom
Microhistory | Macrohistory
Focus on trends, processes | Focus on analogy, metaphor
Based on a variety of documents, including written records and material artifacts | Based on current knowledge about phenomena such as fossils, ecological changes, genetic analysis, and telescope data, in addition to conventional historical data

Big History examines the past using numerous time scales, from the Big Bang to modernity, unlike conventional history courses, which typically begin with the introduction of farming and civilization or with the beginning of written records. It explores common themes and patterns. Courses generally do not focus on humans until one-third to halfway through, and, unlike conventional history courses, there is not much focus on kingdoms, civilizations, wars, or national borders. Whereas conventional history focuses on human civilization with humankind at the center, Big History focuses on the universe and shows how humankind fits within this framework, placing human history in the wider context of the universe's history.

Conventional history often begins with the development of agriculture in civilizations such as Ancient Egypt.
 
Control of fire by early humans predating both agriculture and civilization

Unlike conventional history, Big History tends to go rapidly through detailed historical eras such as the Renaissance or Ancient Egypt. It draws on the latest findings from biology, astronomy, geology, climatology, prehistory, archaeology, anthropology, evolutionary biology, chemistry, psychology, hydrology, geography, paleontology, ancient history, physics, economics, cosmology, natural history, and population and environmental studies as well as standard history. One teacher explained:

We're taking the best evidence from physics and the best evidence from chemistry and biology, and we're weaving it together into a story ... They're not going to learn how to balance [chemical] equations, but they're going to learn how the chemical elements came out of the death of stars, and that's really interesting.

Big History arose from a desire to go beyond the specialized and self-contained fields that emerged in the 20th century. It tries to grasp history as a whole, looking for common themes across multiple time scales in history. Conventional history typically begins with the invention of writing, and is limited to past events relating directly to the human race. Big Historians point out that this limits study to the past 5,000 years and neglects the much longer time when humans existed on Earth. Henry Kannberg sees Big History as being a product of the Information Age, a stage in history itself following speech, writing, and printing. Big History covers the formation of the universe, stars, and galaxies, and includes the beginning of life as well as the period of several hundred thousand years when humans were hunter-gatherers. It sees the transition to civilization as a gradual one, with many causes and effects, rather than an abrupt transformation from uncivilized static cavemen to dynamic civilized farmers. An account in The Boston Globe describes what it polemically asserts to be the conventional "history" view:

Early humans were slump-shouldered, slope-browed, hairy brutes. They hunkered over campfires and ate scorched meat. Sometimes they carried spears. Once in a while they scratched pictures of antelopes on the walls of their caves. That's what I learned during elementary school, anyway. History didn't start with the first humans—they were cavemen! The Stone Age wasn't history; the Stone Age was a preamble to history, a dystopian era of stasis before the happy onset of civilization, and the arrival of nifty developments like chariot wheels, gunpowder, and Google. History started with agriculture, nation-states, and written documents. History began in Mesopotamia's Fertile Crescent, somewhere around 4000 BC. It began when we finally overcame our savage legacy, and culture surpassed biology.

Big History, in contrast to conventional history, has more of an interdisciplinary basis. Advocates sometimes view conventional history as "microhistory" or "shallow history", and note that three-quarters of historians specialize in understanding the last 250 years while ignoring the "long march of human existence." However, one historian disputed that the discipline of history has overlooked the big view, and described the "grand narrative" of Big History as a "cliché that gets thrown around a lot." One account suggested that conventional history had the "sense of grinding the nuts into an ever finer powder." It emphasizes long-term trends and processes rather than history-making individuals or events. Historian Dipesh Chakrabarty of the University of Chicago suggested that Big History was less politicized than contemporary history because it enables people to "take a step back." It uses more kinds of evidence than the standard historical written records, such as fossils, tools, household items, pictures, structures, ecological changes and genetic variations.

Criticism of Big History

Critics of Big History, including sociologist Frank Furedi, have deemed the discipline an "anti-humanist turn of history." The Big History narrative has also been challenged for failing to engage with the methodology of the conventional history discipline. According to historian and educator Sam Wineburg of Stanford University, Big History eschews the interpretation of texts in favor of a purely scientific approach, thus becoming "less history and more of a kind of evolutionary biology or quantum physics." Others have pointed out that such criticisms, that Big History removes the human element or does not follow a historical methodology, seem to come from observers who have not looked closely at what Big History actually does: most courses devote one-third to one-half of their time to humanity, the concept of increasing complexity gives humanity an important place, and the methods of the natural sciences are themselves innately historical, since they also gather evidence in order to craft a narrative.

Themes

Radiometric dating helps scientists understand the age of rocks as well as the Earth and the Solar System.

Big History seeks to retell the "human story" in light of scientific advances by such methods as radiocarbon dating, genetic analysis, and thermodynamic measurements of "free energy rate density", along with a host of methods employed in archaeology, anthropology, and world history. David Christian of Macquarie University has argued that the recent past is only understandable in terms of the "whole 14-billion-year span of time itself." David Baker of Macquarie University has pointed out that not only do the physical principles of energy flows and complexity connect human history to the very start of the Universe, but the broadest view of human history may also supply the discipline of history with a "unifying theme" in the form of the concept of collective learning. Big History also explores the mix of individual action and social and environmental forces, according to one view. Big History seeks to discover repeating patterns during the 13.8 billion years since the Big Bang and explores the core transdisciplinary theme of increasing complexity as described by Eric Chaisson of Harvard University.

Time scales and questions

Big History makes comparisons based on different time scales and notes similarities and differences between the human, geological, and cosmological scales. David Christian believes such "radical shifts in perspective" will yield "new insights into familiar historical problems, from the nature/nurture debate to environmental history to the fundamental nature of change itself." It shows how human existence has been shaped by both human-made and natural factors: for example, through natural processes that happened more than four billion years ago, iron emerged from the remains of exploding stars, and, as a result, humans could later use this hard metal to forge weapons for hunting and war. The discipline addresses such questions as "How did we get here?", "How do we decide what to believe?", "How did Earth form?", and "What is life?" According to Fred Spier, it offers a "grand tour of all the major scientific paradigms" and helps students to become scientifically literate quickly. One interesting perspective that arises from Big History is that, despite the vast temporal and spatial scales of the history of the Universe, it is actually in very small pockets of the cosmos that most of the "history" is happening, due to the nature of complexity.

Cosmic evolution

Cosmic evolution, the scientific study of universal change, is closely related to Big History (as are the allied subjects of the epic of evolution and astrobiology); some researchers regard cosmic evolution as broader than Big History since the latter mainly (and rightfully) examines the specific historical trek from Big Bang → Milky Way → Sun → Earth → humanity. Cosmic evolution, while fully addressing all complex systems (and not merely those that led to humans) has been taught and researched for decades, mostly by astronomers and astrophysicists. This Big-Bang-to-humankind scenario well preceded the subject that some historians began calling Big History in the 1990s. Cosmic evolution is an intellectual framework that offers a grand synthesis of the many varied changes in the assembly and composition of radiation, matter, and life throughout the history of the universe. While engaging the time-honored queries of who we are and whence we came, this interdisciplinary subject attempts to unify the sciences within the entirety of natural history—a single, inclusive scientific narrative of the origin and evolution of all material things over ~14 billion years, from the origin of the universe to the present day on Earth.

The roots of the idea of cosmic evolution extend back millennia. Ancient Greek philosophers of the fifth century BCE, most notably Heraclitus, are celebrated for their reasoned claims that all things change. Early modern speculation about cosmic evolution began more than a century ago, including the broad insights of Robert Chambers, Herbert Spencer, and Lawrence Henderson. Only in the mid-20th century was the cosmic-evolutionary scenario articulated as a research paradigm to include empirical studies of galaxies, stars, planets, and life—in short, an expansive agenda that combines physical, biological, and cultural evolution. Harlow Shapley widely articulated the idea of cosmic evolution (often calling it "cosmography") in public venues at mid-century, and NASA embraced it in the late 20th century as part of its more limited astrobiology program. Carl Sagan, Eric Chaisson, Hubert Reeves, Erich Jantsch, and Preston Cloud, among others, extensively championed cosmic evolution at roughly the same time around 1980. This extremely broad subject now continues to be richly formulated as both a technical research program and a scientific worldview for the 21st century.

One popular collection of scholarly materials on cosmic evolution is based on teaching and research that has been underway at Harvard University since the mid-1970s.

Complexity, energy, thresholds

Cosmic evolution is a quantitative subject, whereas big history typically is not; this is because cosmic evolution is practiced mostly by natural scientists, while big history is practiced mostly by social scholars. These two subjects, closely allied and overlapping, benefit from each other; cosmic evolutionists tend to treat universal history linearly, so that humankind enters their story only at the most recent times, whereas big historians tend to stress humanity and its many cultural achievements, granting human beings a larger part of their story. One can compare and contrast these different emphases by watching two short movies portraying the Big-Bang-to-humankind narrative, one animating time linearly, and the other capturing time (actually look-back time) logarithmically; in the former, humans enter this 14-minute movie only in the last second, while in the latter we appear much earlier—yet both are correct.

These different treatments of time over ~14 billion years, each with different emphases on historical content, are further clarified by noting that some cosmic evolutionists divide the whole narrative into three phases and seven epochs:

Phases: physical evolution → biological evolution → cultural evolution
Epochs: particulate → galactic → stellar → planetary → chemical → biological → cultural

This contrasts with the approach used by some big historians who divide the narrative into many more thresholds, as noted in the discussion at the end of this section below. Yet another telling of the Big-Bang-to-humankind story is one that emphasizes the earlier universe, particularly the growth of particles, galaxies, and large-scale cosmic structure, such as in physical cosmology.

Notable among quantitative efforts to describe cosmic evolution is Eric Chaisson's research on the concept of energy flow through open, thermodynamic systems, including galaxies, stars, planets, life, and society. The observed increase of energy rate density (energy per unit time per unit mass) among a whole host of complex systems is one useful way to explain the rise of complexity in an expanding universe that still obeys the cherished second law of thermodynamics and thus continues to accumulate net entropy. As such, ordered material systems—from buzzing bees and redwood trees to shining stars and thinking beings—are viewed as temporary, local islands of order in a vast, global sea of disorder. A recent review article, which is especially directed toward big historians, summarizes much of this empirical effort over the past decade.
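In symbols, energy rate density is simply the energy flowing through a system per unit time and per unit mass (the symbol Φ_m follows Chaisson's usual notation; treat the lettering as incidental):

    \Phi_m \;=\; \frac{E}{t\, m}
    \qquad \text{with units of } \mathrm{erg\ s^{-1}\ g^{-1}}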

One striking finding of such complexity studies is the apparently ranked order among all known material systems in the universe. Although the absolute energy of astronomical systems greatly exceeds that of humans, and although the mass densities of stars, planets, bodies, and brains are all comparable, the energy rate density for humans and modern human society is approximately a million times greater than for stars and galaxies. For example, the Sun emits a vast luminosity, about 4×10^33 erg/s (roughly 4×10^26 W), but it also has a huge mass, about 2×10^33 g; thus each second only about 2 ergs of energy pass through each gram of this star. In contrast to any star, more energy flows through each gram of a plant's leaf during photosynthesis, and far more, roughly a hundred thousand times as much, rushes through each gram of a human brain while thinking (~20 W through ~1350 g).
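
As a check on that arithmetic, the following is a minimal Python sketch of the energy-rate-density calculation; the unit conversion (1 W = 10^7 erg/s) is standard, the solar and brain figures are the ones quoted above, and the function name is merely illustrative.

# Energy rate density (power per unit mass) in erg per second per gram,
# using the figures quoted in the text. 1 W = 1e7 erg/s.
ERG_PER_JOULE = 1.0e7

def energy_rate_density(power_watts: float, mass_grams: float) -> float:
    """Return power per unit mass in erg s^-1 g^-1."""
    return power_watts * ERG_PER_JOULE / mass_grams

sun   = energy_rate_density(4.0e26, 2.0e33)   # ~4e33 erg/s luminosity, ~2e33 g mass
brain = energy_rate_density(20.0, 1350.0)     # ~20 W flowing through ~1350 g

print(f"Sun:   {sun:.1f} erg/s/g")            # ~2 erg/s/g
print(f"Brain: {brain:.2e} erg/s/g")          # ~1.5e5 erg/s/g
print(f"Brain-to-Sun ratio: {brain / sun:,.0f}")  # roughly 10^5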

Cosmic evolution is more than a subjective, qualitative assertion of "one damn thing after another". This inclusive scientific worldview constitutes an objective, quantitative approach toward deciphering much of what comprises organized, material Nature. Its uniform, consistent philosophy of approach toward all complex systems demonstrates that the basic differences, both within and among many varied systems, are of degree, not of kind. And, in particular, it suggests that optimal ranges of energy rate density grant opportunities for the evolution of complexity; those systems able to adjust, adapt, or otherwise take advantage of such energy flows survive and prosper, while other systems adversely affected by too much or too little energy are non-randomly eliminated.

Fred Spier is foremost among those big historians who have found the concept of energy flows useful, suggesting that Big History is the rise and demise of complexity on all scales, from sub-microscopic particles to vast galaxy clusters, and not least many biological and cultural systems in between.

In an 18-minute TED talk, David Christian described some of the basics of the Big History course. He described each stage in the progression towards greater complexity as a "threshold moment", when things become more complex but also more fragile and mobile. Some of Christian's threshold stages are:

In a supernova, a star which has exhausted most of its energy bursts in an incredible explosion, creating conditions for heavier elements such as iron and gold to form.
  1. The universe appears, incredibly hot, exponentially expanding within a second.
  2. Stars are born.
  3. Stars die, creating temperatures hot enough to make complex chemicals, as well as rocks, asteroids, planets, moons, and our solar system.
  4. Earth is created.
  5. Life appears on Earth, with complex molecules forming under Goldilocks conditions of neither too much nor too little energy.
  6. Humans appear, bringing language and collective learning.

Christian elaborated that more complex systems are more fragile, and that while collective learning is a powerful force to advance humanity in general, it is not clear that humans are in charge of it, and it is possible in his view for humans to destroy the biosphere with the powerful weapons that have been invented.

In the 2008 lecture series through The Teaching Company's Great Courses entitled Big History: The Big Bang, Life on Earth, and the Rise of Humanity, Christian explains Big History in terms of eight thresholds of increasing complexity, followed by a look at what might come next:

  1. The Big Bang and the creation of the Universe about 14 billion years ago
  2. The creation of the first complex objects, stars, about 12 billion years ago
  3. The creation of chemical elements inside dying stars required for chemically-complex objects, including plants and animals
  4. The formation of planets, such as our Earth, which are more chemically complex than the Sun
  5. The origin and evolution of life, from roughly 4.2 billion years ago, including the evolution of our hominine ancestors
  6. The development of our species, Homo sapiens, about 250,000 years ago, covering the Paleolithic era of human history
  7. The appearance of agriculture about 11,000 years ago in the Neolithic era, allowing for larger, more complex societies
  8. The "modern revolution", or the vast social, economic, and cultural transformations that brought the world into the modern era
  9. A look ahead: what may happen in the future, and what the next threshold in our history might be

Goldilocks conditions

The Earth is ideally located in a Goldilocks condition—being neither too close nor too distant from the Sun.

A theme in Big History is what has been termed Goldilocks conditions or the Goldilocks principle, which describes how "circumstances must be right for any type of complexity to form or continue to exist," as emphasized by Spier in his recent book. For humans, body temperature can be neither too hot nor too cold; for life to form on a planet, the planet can have neither too much nor too little energy from sunlight. Stars require sufficient quantities of hydrogen, sufficiently packed together under tremendous gravity, to cause nuclear fusion.

Christian suggests that the universe creates complexity when these Goldilocks conditions are met, that is, when things are not too hot or cold, not too fast or slow. For example, life began not in solids (molecules are stuck together, preventing the right kinds of associations) or gases (molecules move too fast to enable favorable associations) but in liquids such as water that permitted the right kinds of interactions at the right speeds.

Somewhat in contrast, Chaisson has maintained for well over a decade that "organizational complexity is mostly governed by the optimum use of energy—not too little as to starve a system, yet not too much as to destroy it". Neither maximum-energy principles nor minimum-entropy states are likely to be relevant for appreciating the emergence of complexity in Nature writ large.

Other themes

Big Historians use information based on scientific techniques such as gene mapping to learn more about the origins of humanity.

Advances in particular sciences such as archaeology, gene mapping, and evolutionary ecology have enabled historians to gain new insights into the early origins of humans, despite the lack of written sources. One account suggested that proponents of Big History were trying to "upend" the conventional practice in historiography of relying on written records.

Big History proponents suggest that humans have been affecting the climate throughout history, by such methods as slash-and-burn agriculture, although past modifications were on a far smaller scale than those that have occurred since the Industrial Revolution.

A book by Daniel Lord Smail in 2008 suggested that history is a continuing process of humans learning to modify their own mental states by using stimulants such as coffee and tobacco, as well as other means such as religious rites or romance novels. His view is that culture and biology are highly intertwined, such that cultural practices may cause human brains in different societies to be wired differently.

Another theme that has been actively discussed recently by the Big History community is the issue of the Big History Singularity.

Presentation by web-based interactive video

ChronoZoom is a free open source project that helps readers visualize time at all scales from the Big Bang 13.8 billion years ago to the present.

Big History is more likely than conventional history to be taught with interactive "video-heavy" websites without textbooks, according to one account. The discipline has benefited from new ways of presenting its themes and concepts, often supplemented by Internet and computer technology. For example, the ChronoZoom project is an interactive website for exploring the 14-billion-year history of the universe. One description reads:

ChronoZoom splays out the entirety of cosmic history in a web browser, where users can click into different epochs to learn about the events that have culminated to bring us to where we are today—in my case, sitting in an office chair writing about space. Eager to learn about the Stelliferous epoch? Click away, my fellow explorer. Curious about the formation of the earth? Jump into the "Earth and Solar System" section to see historian David Christian talk about the birth of our homeworld.

— TechCrunch, 2012

In 2012, the History channel showed the film History of the World in Two Hours. It showed how dinosaurs effectively dominated mammals for 160 million years until an asteroid impact wiped them out. One report suggested the History channel had won a sponsorship from StanChart to develop a Big History program entitled Mankind. In 2013 the History channel's new H2 network debuted the 10-part series Big History, narrated by Bryan Cranston and featuring David Christian and an assortment of historians, scientists and related experts. Each episode centered on a major Big History topic such as salt, mountains, cold, flight, water, meteors and megastructures.

History of the field

Early efforts

Astronomer Carl Sagan

While Big History in its present form is generally seen as having emerged in the past two decades, beginning around 1990, it has numerous precedents going back almost 150 years. In the mid-19th century, Alexander von Humboldt's book Cosmos and Robert Chambers' 1844 book Vestiges of the Natural History of Creation were seen as early precursors to the field. In a sense, Darwin's theory of evolution was itself an attempt to explain a biological phenomenon by examining long-term cause-and-effect processes. In the first half of the 20th century, the biologist Julian Huxley originated the term "evolutionary humanism", while around the same time the French Jesuit paleontologist Pierre Teilhard de Chardin examined links between cosmic evolution and a tendency towards complexification (including human consciousness), envisaging compatibility between cosmology, evolution, and theology. In the mid and later 20th century, The Ascent of Man by Jacob Bronowski examined history from a multidisciplinary perspective. Later, Eric Chaisson explored cosmic evolution quantitatively in terms of energy rate density, and the astronomer Carl Sagan wrote Cosmos. The cultural historian Thomas Berry and the academic Brian Swimme explored the meaning behind myths and encouraged academics to explore themes beyond organized religion.

The famous 1968 Earthrise photo, taken by astronaut William Anders, may have stimulated, among other things, an interest in interdisciplinary studies.

The field continued to evolve from interdisciplinary studies during the mid-20th century, stimulated in part by the Cold War and the Space Race. Some early efforts were courses in Cosmic Evolution at Harvard University in the United States and in Universal History in the Soviet Union. One account suggested that the notable Earthrise photo, taken by William Anders during a lunar orbit on Apollo 8 and showing Earth as a small blue and white ball behind a stark and desolate lunar landscape, not only stimulated the environmental movement but also caused an upsurge of interdisciplinary interest. The French historian Fernand Braudel examined daily life through investigations of "large-scale historical forces like geology and climate". The physiologist Jared Diamond, in his book Guns, Germs, and Steel, examined the interplay between geography and human evolution; for example, he argued that the east-west orientation of the Eurasian continent allowed civilizations there to advance more quickly than those on the north-south oriented American continents, because it enabled greater competition and information-sharing among peoples living in broadly similar climates.

In the 1970s, scholars in the United States including geologist Preston Cloud of the University of Minnesota, astronomer G. Siegfried Kutter at Evergreen State College in Washington state, and Harvard University astrophysicists George B. Field and Eric Chaisson started synthesizing knowledge to form a "science-based history of everything", although each of these scholars emphasized their own particular specializations in their courses and books. In 1980, the Austrian philosopher Erich Jantsch wrote The Self-Organizing Universe, which viewed history in terms of what he called "process structures". An experimental course was taught by John Mears at Southern Methodist University in Dallas, Texas, and more formal university-level courses began to appear.

In 1991 Clive Ponting wrote A Green History of the World: The Environment and the Collapse of Great Civilizations. His analysis did not begin with the Big Bang, but his chapter "Foundations of History" explored the influences of large-scale geological and astronomical forces over a broad time period.

The terms "Deep History" and "Big History" are sometimes used interchangeably, but "Deep History" can also refer simply to history going back several hundred thousand years or more, without the additional sense of being a movement within the discipline of history itself.

David Christian

One leading exponent is David Christian of Macquarie University in Sydney, Australia. He read widely in diverse scientific fields and believed that much was missing from the general study of history. His first university-level course was offered in 1989. Collaborating with numerous colleagues from the sciences, humanities, and social sciences, he developed a college course running from the Big Bang to the present. This course eventually became the Teaching Company course Big History: The Big Bang, Life on Earth, and the Rise of Humanity, comprising 24 hours of lectures, which appeared in 2008.

Since the 1990s, other universities have begun to offer similar courses. In 1994, courses were offered at the University of Amsterdam and the Eindhoven University of Technology. In 1996, Fred Spier wrote The Structure of Big History, in which he looked at structured processes that he termed "regimes":

I defined a regime in its most general sense as 'a more or less regular but ultimately unstable pattern that has a certain temporal permanence', a definition which can be applied to human cultures, human and non-human physiology, non-human nature, as well as to organic and inorganic phenomena at all levels of complexity. By defining 'regime' in this way, human cultural regimes thus became a subcategory of regimes in general, and the approach allowed me to look systematically at interactions among different regimes which together produce big history.

— Fred Spier, 2008

Christian's course caught the attention of philanthropist Bill Gates, who discussed with him how to turn Big History into a high school-level course. Gates said about David Christian:

He really blew me away. Here's a guy who's read across the sciences, humanities, and social sciences and brought it together in a single framework. It made me wish that I could have taken big history when I was young, because it would have given me a way to think about all of the school work and reading that followed. In particular, it really put the sciences in an interesting historical context and explained how they apply to a lot of contemporary concerns.

— Bill Gates, in 2012

Educational courses

By 2002, a dozen college courses on Big History had sprung up around the world. Cynthia Stokes Brown initiated Big History at the Dominican University of California, and she wrote Big History: From the Big Bang to the Present. In 2010, Dominican University of California launched the world's first Big History program to be required of all first-year students, as part of the school's general education track. This program, directed by Mojgan Behmand, includes a one-semester survey of Big History, and an interdisciplinary second-semester course exploring the Big History metanarrative through the lens of a particular discipline or subject. A course description reads:

Welcome to First Year Experience Big History at Dominican University of California. Our program invites you on an immense journey through time, to witness the first moments of our universe, the birth of stars and planets, the formation of life on Earth, the dawn of human consciousness, and the ever-unfolding story of humans as Earth's dominant species. Explore the inevitable question of what it means to be human and our momentous role in shaping possible futures for our planet.

— course description 2012

The Dominican faculty's approach is to synthesize the disparate threads of Big History thought, in order to teach the content, develop critical thinking and writing skills, and prepare students to wrestle with the philosophical implications of the Big History metanarrative. In 2015, University of California Press published Teaching Big History, a comprehensive pedagogical guide for teaching Big History, edited by Richard B. Simon, Mojgan Behmand, and Thomas Burke, and written by the Dominican faculty.

Big History is taught at the University of Southern Maine.

Barry Rodrigue, at the University of Southern Maine, established the first general education course and the first online version, which has drawn students from around the world. The University of Queensland in Australia offers an undergraduate course entitled Global History, required for all history majors, which "surveys how powerful forces and factors at work on large time-scales have shaped human history". By 2011, 50 professors around the world had offered courses. In 2012, one report suggested that Big History was being practiced as a "coherent form of research and teaching" by hundreds of academics from different disciplines.

Philanthropist Bill Gates is a major advocate of encouraging instruction in Big History.

There are efforts to bring Big History to younger students. In 2008, Christian and his colleagues began developing a course for secondary school students. In 2011, a pilot high school course was taught to 3,000 students in 50 high schools worldwide. In 2012, there were 87 schools, with 50 in the United States, teaching Big History, and the pilot program was set to double in 2013 for students in the ninth and tenth grades, and even in one middle school. The subject is taught as a STEM course at one high school.

There are initiatives to make Big History a required standard course for university students throughout the world. An education project, funded personally by philanthropist Bill Gates, was launched in Australia and the United States to offer a free online version of the course to high school students.

International Big History Association

Founding members of the International Big History Association gathered at Coldigioco, Italy in 2010

The International Big History Association (IBHA) was founded at the Coldigioco Geological Observatory in Coldigioco, Marche, Italy, on 20 August 2010. Its headquarters is located at Grand Valley State University in Allendale, Michigan, United States. Its inaugural gathering in 2012 was described as "big news" in a report in The Huffington Post.

People involved

Some notable academics involved with the concept include:
