Sunday, May 13, 2018

Testing the God Hypothesis

This article was originally published in Fair Observer.

In my 2007 book God: The Failed Hypothesis; How Science Shows That God Does Not Exist, I applied the scientific process of hypothesis testing to the question of God. The common objection I heard was that the existence of God is not a scientific hypothesis. Let me explain why I say it is.

The scientific method is not limited to what professional scientists do but can be applied to any question that relates to observations. The brain does not have the capacity to save the time, direction, and energy of each photon that hits the eyes. Instead it operates on a simplified picture of objects, be they rocks, trees, or people, assigning them general properties that do not encompass every detail.

That is, we make models. Science merely rationalizes the procedure, communicating by precise speech and writing among individuals who then attempt to reach an agreement on what they all have seen and how best to represent their collective observations. What are called scientific theories are just models.

The God Model

Religion carries out a similar process, although one in which agreement is generally asserted by authority rather than by a consensus of objective, unbiased observations. From humanity’s earliest days, gods have been imagined who possessed attributes that people could understand and to which they could relate.

Gods and spirits took the form of the objects of experience: the sun, Earth, moon, animals, and humans. The gods of the ancient Egyptians had the form of animals. The gods of the ancient Greeks had the form of imperfect but immortal humans. The God of Judaism, Christianity, and Islam took the form of a powerful, autocratic, male king enthroned high above his subjects.

Each of these god models developed from the culture of the day. If the process continued to today, everyone would worship the shopping mall. In fact, many of the megachurches in America today are located in shopping malls.

By dealing in terms of models of gods that are based on human conceptions, we avoid the objection that the "true" God may lie beyond our limited cognitive capabilities. When we demonstrate that a particular god is rejected by the evidence, we are not proving that all gods, conceivable or inconceivable, do not exist. We are simply showing beyond a reasonable doubt that a god with the explicit hypothesized attributes described by the model does not exist. Belief aside, at the very minimum the fact that a specific god model may be inconsistent with the evidence is cause enough to disregard that model in the practices of everyday life.

The exact relationship between the elements of scientific models and whatever true reality lies out there is not of major concern to most scientists, or should not be anyway. When scientists have a model that describes their measurements, is consistent with other established models, makes successful predictions, and can be put to practical use, what else do they need?

The model works fine in not only describing observations but in enabling practical applications. It makes absolutely no difference whether or not an electron is “real” when we apply the model of electrons flowing in an electronic circuit to design some high-tech device. Whatever the intrinsic reality, the model describes what we observe, and those observations are real enough.

Similarly, it does not matter from a practical standpoint whether the "real" God resembles any of the gods whose empirical consequences we have examined and modeled. People do not worship abstractions. They worship a God with qualities they can comprehend. The most common example of a god model is a personal God who answers prayers. This god model has not been confirmed in numerous controlled experiments on the efficacy of prayer. It follows that a religious person is wasting her time praying for some favor of such a God.
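To make concrete what "confirmed in controlled experiments" would mean, here is a minimal sketch of the statistics behind such a trial: a two-proportion z-test comparing recovery rates in the prayed-for and control groups. The counts below are illustrative placeholders, not data from any actual study.

    # Minimal sketch: two-proportion z-test for a hypothetical intercessory-prayer
    # trial. All counts are illustrative placeholders, not real data.
    from math import sqrt, erf

    def two_proportion_z(success_a, n_a, success_b, n_b):
        """Return the z statistic and two-sided p-value for H0: p_a == p_b."""
        p_a, p_b = success_a / n_a, success_b / n_b
        pooled = (success_a + success_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        # two-sided p-value from the standard normal distribution
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return z, p_value

    # Hypothetical counts: 300 of 600 prayed-for patients recover versus 295 of 600 controls.
    z, p = two_proportion_z(300, 600, 295, 600)
    print(f"z = {z:.2f}, p = {p:.2f}")  # a large p-value gives no reason to reject "no effect"

A god model that answers prayers predicts a detectable difference between the groups; the repeated failure to find one is the sense in which that model is not confirmed.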

If praying worked, the effects would be objectively observed. They are not. Let me then summarize the god models that are inconsistent with scientific observations.

Inconsistent Gods
  • A personal God who has given humans immortal souls fails to agree with the empirical facts that human thoughts, memories, and personalities are governed by physical processes in the brain, which dissolves upon death. No nonphysical or extra-physical powers of “mind” can be found and no evidence exists for an afterlife.
  • A personal God whose interactions with humans include miraculous interventions such as those reported in scriptures is contradicted by the lack of independent evidence for the alleged miraculous events.
  • A cosmic God who fine-tuned the laws and constants of physics for life, in particular human life, fails to agree with the fact that the universe is not congenial to human life, being tremendously wasteful of time, space, and matter from the human perspective. It also fails to agree with the fact that the universe is mostly composed of particles in random motion, with complex structures such as galaxies forming less than four percent of the total mass of the universe.
  • A personal God who communicates directly with humans by means of revelation fails to agree with the fact that no scientifically verifiable new information has ever been transmitted while many wrong and harmful doctrines have been asserted by this means. No claimed revelation contains information that could not have been already in the head of the person making the claim. Furthermore, physical evidence now conclusively demonstrates that some of the most important biblical narratives, such as the Exodus, never took place.
  • A personal God who is the source of morality and human values does not exist since the evidence shows that humans define morals and values for themselves. This is not “relative morality.” Believers and nonbelievers alike agree on a common set of morals and values. Even the most devout decide for themselves what is good and what is bad and even judge much of what is approved in scriptures as immoral, such as genocide, slavery, and the oppression of women. Nonbelievers behave no less morally than believers.
  • A personal God who is omniscient, omnibenevolent, and omnipotent does not exist because it is logically inconsistent with the existence of evil, in particular, gratuitous suffering (standard problem of evil).
What If?

The existence of the God worshiped by most Jews, Christians, and Muslims not only lacks supporting empirical evidence but is even contradicted by such evidence. However, it need not have turned out that way. Things might have been different, and this is important to understand as it justifies the use of science to address the God question and refutes the frequently heard statement that science can say nothing about God. If scientific observations had confirmed at least one model god, those believers who make that statement would quickly change their tune. Even the most skeptical atheists would have to come around and admit that there might be some chance that God exists. This has not happened.

Consider the following hypothetical events that, had they occurred, would have favored the God hypothesis. Readers are invited to think of their own similar "might have been" scenarios. While not necessarily proving the existence of God, they would at least have lent traditional beliefs a credence that they currently lack.

Hypothetical Observations
  • Evidence was found that falsified evolution. Fossils might have been discovered that were inexplicably out of sequence. Life forms might not have all been based on the same genetic scheme. Transitional species might not have been observed. As was actually believed in Darwin's time, the age of the sun could have proved too short for evolution. The discovery of nuclear energy changed that, showing that, fueled by nuclear fusion, the sun will last ten billion years, ample time for life to evolve.
  • Human memories and thoughts might have provided evidence that could not be plausibly accounted for by known physical processes. Science might have confirmed exceptional powers of the mind that could not be plausibly explained physically.
  • Science might have uncovered convincing evidence for an afterlife. For example, a person who had been declared dead by every means known to science might return to life with detailed stories of an afterlife that were later verified. She might, say, meet Jimmy Hoffa, who tells her where to find his body.
  • Similarly, any claim of a revelation obtained during a mystical trance could contain scientifically verifiable information that the subject could not possibly have known.
  • Physical and historical evidence might have been found for the miraculous events and the important narratives of the scriptures. For example, Roman records might have been found for an earthquake in Judea at the time of a certain crucifixion ordered by Pontius Pilate. Noah’s Ark might have been discovered. The Shroud of Turin might have contained genetic material with no Y-chromosomes. Since the image is that of a man with a beard, this would confirm he was born of a virgin. Or, the genetic material might contain a novel form of coding molecule not found in any other living organism. This would have proven an alien (if not divine) origin of the enshrouded being.
  • The universe might have been found to be so congenial to human life that it must have been created with human life in mind. Humans might have been able to move from planet to planet, just as easily as they now move from continent to continent, and be able to survive on every planet – even in space – without life support.
  • Natural events might follow some moral law, rather than morally neutral mathematical laws. For example, lightning might strike only the wicked; people who behave badly might fall sick more often; nuns would always survive plane crashes.
  • Believers might have had a higher moral sense than nonbelievers and other measurably superior qualities. For example, the jails might be filled with atheists while all believers live happy, prosperous, contented lives surrounded by loving families and pets.
  • Miracles might have been observed. For example, prayers might have been answered; an arm or a leg might have been regenerated through faith healing.
But none of this has happened. Indeed, the opposite is true in some cases, such as an abnormally low number of atheists in jail. Every claim of a supernatural event has proved false. The hypothesis of God is not confirmed by the evidence. Indeed, that hypothesis is strongly contradicted by the observations of our senses and the instruments of science.

Epistemology

From Wikipedia, the free encyclopedia
Epistemology (/ɪˌpɪstɪˈmɒlədʒi/; from Greek ἐπιστήμη, epistēmē, meaning 'knowledge', and λόγος, logos, meaning 'logical discourse') is the branch of philosophy concerned with the theory of knowledge.[1]

Epistemology studies the nature of knowledge, justification, and the rationality of belief. Much of the debate in epistemology centers on four areas: (1) the philosophical analysis of the nature of knowledge and how it relates to such concepts as truth, belief, and justification,[2][3] (2) various problems of skepticism, (3) the sources and scope of knowledge and justified belief, and (4) the criteria for knowledge and justification. Epistemology addresses such questions as "What makes justified beliefs justified?",[4] "What does it mean to say that we know something?"[5] and fundamentally "How do we know that we know?"[6]

The term "epistemology" was first used by Scottish philosopher James Frederick Ferrier in 1854.[a] However, according to Brett Warren, King James VI of Scotland had previously personified this philosophical concept as the character Epistemon in 1591.[8]

Epistemon

In a philosophical dialogue, King James VI of Scotland penned the character Epistemon as the personification of a philosophical concept to debate on arguments of whether the ancient religious perceptions of witchcraft should be punished in a politically fueled Christian society. The arguments King James poses, through the character Epistemon, are based on ideas of theological reasoning regarding society's belief, as his opponent Philomathes takes a philosophical stance on society's legal aspects but seeks to obtain greater knowledge from Epistemon, whose name is Greek for scientist. This philosophical approach signified a Philomath seeking to obtain greater knowledge through epistemology with the use of theology. The dialogue was used by King James to educate society on various concepts including the history and etymology of the subjects debated.[8]

Etymology

The word epistemology is derived from the ancient Greek epistēmē meaning "knowledge" and the suffix -logy, meaning "logical discourse" (derived from the Greek word logos meaning "discourse"). J.F. Ferrier coined epistemology on the model of 'ontology', to designate that branch of philosophy which aims to discover the meaning of knowledge, and called it the 'true beginning' of philosophy. The word is equivalent to the concept Wissenschaftslehre, which was used by German philosophers Johann Fichte and Bernard Bolzano for different projects before it was taken up again by Husserl. French philosophers then gave the term épistémologie a narrower meaning as 'theory of knowledge [théorie de la connaissance].' E.g., Émile Meyerson opened his Identity and Reality, written in 1908, with the remark that the word 'is becoming current' as equivalent to 'the philosophy of the sciences.'[9]

Knowledge

In mathematics, it is known that 2 + 2 = 4, but there is also knowing how to add two numbers, and knowing a person (e.g., oneself), place (e.g., one's hometown), thing (e.g., cars), or activity (e.g., addition). Some philosophers think there is an important distinction between "knowing that" (know a concept), "knowing how" (understand an operation), and "acquaintance-knowledge" (know by relation), with epistemology being primarily concerned with the first of these.[10]

While these distinctions are not explicit in English, they are defined explicitly in other languages (N.B. some languages related to English have been said to retain these verbs, e.g. Scots: "wit" and "ken"). In French, Portuguese, Spanish, German and Dutch, to know (a person) is translated using connaître, conhecer, conocer, kennen and kennen respectively, whereas to know (how to do something) is translated using savoir, saber, saber, wissen and weten. Modern Greek has the verbs γνωρίζω (gnorízo) and ξέρω (kséro). Italian has the verbs conoscere and sapere and the nouns for knowledge are conoscenza and sapienza. German has the verbs wissen and kennen: wissen implies knowing a fact, kennen implies knowing in the sense of being acquainted with and having a working knowledge of. There is also a verb derived from kennen, namely erkennen, which has been said to imply knowledge in the form of recognition or acknowledgment. The verb implies a process: you have to go from one state to another, from a state of "not-erkennen" to a state of true erkennen. This verb seems to be the most appropriate in terms of describing the "episteme" in one of the modern European languages, hence the German name "Erkenntnistheorie". The theoretical interpretation and significance of these linguistic issues remain controversial.

In his paper On Denoting and his later book Problems of Philosophy Bertrand Russell stressed the distinction between "knowledge by description" and "knowledge by acquaintance". Gilbert Ryle is also credited with stressing the distinction between knowing how and knowing that in The Concept of Mind. In Personal Knowledge, Michael Polanyi argues for the epistemological relevance of knowledge how and knowledge that; using the example of the act of balance involved in riding a bicycle, he suggests that the theoretical knowledge of the physics involved in maintaining a state of balance cannot substitute for the practical knowledge of how to ride, and that it is important to understand how both are established and grounded. This position is essentially Ryle's, who argued that a failure to acknowledge the distinction between knowledge that and knowledge how leads to infinite regress.

In recent times, epistemologists including Sosa, Greco, Kvanvig, Zagzebski and Duncan Pritchard have argued that epistemology should evaluate people's "properties" (i.e., intellectual virtues) and not just the properties of propositions or of propositional mental attitudes.[citation needed]

Belief

In common speech, a "statement of belief" is typically an expression of faith or trust in a person, power or other entity. While epistemology includes such traditional views, it is also concerned more broadly with what we believe: 'the' truth, and everything else we accept as 'true' for ourselves from a cognitive point of view.

Truth

A belief need not be true in order for someone to hold it. On the other hand, if something is actually known, then it categorically cannot be false. For example, if a person believes that a bridge is safe enough to support them, and attempts to cross it, but the bridge then collapses under their weight, it could be said that they believed that the bridge was safe but that their belief was mistaken. It would not be accurate to say that they knew that the bridge was safe, because plainly it was not. By contrast, if the bridge actually supported their weight, then the person might say that they had believed the bridge was safe, whereas now, after proving it to themselves (by crossing it), they know it was safe.

Epistemologists argue over whether belief is the proper truth-bearer. Some would rather describe knowledge as a system of justified true propositions, and others as a system of justified true sentences. Plato, in his Gorgias, argues that belief is the most commonly invoked truth-bearer.[11]

Justification

In the Theaetetus, Socrates considers a number of theories as to what knowledge is, the last being that knowledge is true belief "with an account" (meaning explained or defined in some way). According to the theory that knowledge is justified true belief, in order to know that a given proposition is true, one must not only believe the relevant true proposition, but one must also have a good reason for doing so. One implication of this would be that no one would gain knowledge just by believing something that happened to be true. For example, an ill person with no medical training, but with a generally optimistic attitude, might believe that he will recover from his illness quickly. Nevertheless, even if this belief turned out to be true, the patient would not have known that he would get well since his belief lacked justification.
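Stated compactly, in the notation commonly used for this analysis (a sketch, writing $B_S(p)$ for "S believes that p" and $J_S(p)$ for "S is justified in believing that p"), the justified-true-belief account says:

\[ S \text{ knows that } p \;\iff\; p \,\wedge\, B_S(p) \,\wedge\, J_S(p) \]

Gettier's cases, discussed below, are situations in which the right-hand side holds and yet, intuitively, S does not know that p.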

The definition of knowledge as justified true belief was widely accepted until the 1960s. At this time, a paper written by the American philosopher Edmund Gettier provoked major widespread discussion.

Gettier problem


Euler diagram representing a definition of knowledge.

Edmund Gettier is best known for a short paper entitled 'Is Justified True Belief Knowledge?' published in 1963, which called into question the theory of knowledge that had been dominant among philosophers for thousands of years.[12] This in turn called into question the actual value of philosophy if such an obvious and easy counterexample to a major theory could exist without anyone noticing it for thousands of years. In a few pages, Gettier argued that there are situations in which one's belief may be justified and true, yet fail to count as knowledge. That is, Gettier contended that while justified belief in a true proposition is necessary for that proposition to be known, it is not sufficient. As in the diagram, a true proposition can be believed by an individual (purple region) but still not fall within the "knowledge" category (yellow region).

According to Gettier, there are certain circumstances in which one does not have knowledge, even when all of the above conditions are met. Gettier proposed two thought experiments, which have come to be known as "Gettier cases", as counterexamples to the classical account of knowledge. One of the cases involves two men, Smith and Jones, who are awaiting the results of their applications for the same job. Each man has ten coins in his pocket. Smith has excellent reasons to believe that Jones will get the job and, furthermore, knows that Jones has ten coins in his pocket (he recently counted them). From this Smith infers, "the man who will get the job has ten coins in his pocket." However, Smith is unaware that he also has ten coins in his own pocket. Furthermore, Smith, not Jones, is going to get the job. While Smith has strong evidence to believe that Jones will get the job, he is wrong. Smith has a justified true belief that the man who will get the job has ten coins in his pocket; however, according to Gettier, Smith does not know that the man who will get the job has ten coins in his pocket, because Smith's belief is "...true by virtue of the number of coins in Jones's pocket, while Smith does not know how many coins are in Smith's pocket, and bases his belief...on a count of the coins in Jones's pocket, whom he falsely believes to be the man who will get the job." (see[12] p. 122.) These cases fail to be knowledge because the subject's belief is justified, but only happens to be true by virtue of luck. In other words, he made the correct choice (believing that the man who will get the job has ten coins in his pocket) for the wrong reasons. This example is similar to those often given when discussing belief and truth, wherein a person's belief of what will happen can coincidentally be correct without his or her having the actual knowledge to base it on.

Responses to Gettier

The responses to Gettier have been varied. Usually, they have involved substantial attempts to provide a definition of knowledge different from the classical one, either by recasting knowledge as justified true belief with some additional fourth condition, or proposing a completely new set of conditions, disregarding the classical ones entirely.
Infallibilism, indefeasibility
In one response to Gettier, the American philosopher Richard Kirkham has argued that the only definition of knowledge that could ever be immune to all counterexamples is the infallibilist one.[13] To qualify as an item of knowledge, goes the theory, a belief must not only be true and justified, the justification of the belief must necessitate its truth. In other words, the justification for the belief must be infallible.

Yet another possible candidate for the fourth condition of knowledge is indefeasibility. Defeasibility theory maintains that there should be no overriding or defeating truths for the reasons that justify one's belief. For example, suppose that person S believes he saw Tom Grabit steal a book from the library and uses this to justify the claim that Tom Grabit stole a book from the library. A possible defeater or overriding proposition for such a claim could be a true proposition like, "Tom Grabit's identical twin Sam is currently in the same town as Tom." When no defeaters of one's justification exist, a subject would be epistemologically justified.

The Indian philosopher B. K. Matilal has drawn on the Navya-Nyāya fallibilism tradition to respond to the Gettier problem. Nyaya theory distinguishes between know p and know that one knows p—these are different events, with different causal conditions. The second level is a sort of implicit inference that usually follows immediately the episode of knowing p (knowledge simpliciter). The Gettier case is examined by referring to a view of Gangesha Upadhyaya (late 12th century), who takes any true belief to be knowledge; thus a true belief acquired through a wrong route may just be regarded as knowledge simpliciter on this view. The question of justification arises only at the second level, when one considers the knowledgehood of the acquired belief. Initially, there is lack of uncertainty, so it becomes a true belief. But at the very next moment, when the hearer is about to embark upon the venture of knowing whether he knows p, doubts may arise. "If, in some Gettier-like cases, I am wrong in my inference about the knowledgehood of the given occurrent belief (for the evidence may be pseudo-evidence), then I am mistaken about the truth of my belief – and this is in accordance with Nyaya fallibilism: not all knowledge-claims can be sustained."[14]
Reliabilism
Reliabilism has been a significant line of response to the Gettier problem among philosophers, originating with work by Alvin Goldman in the 1960s. According to reliabilism, a belief is justified (or otherwise supported in such a way as to count towards knowledge) only if it is produced by processes that typically yield a sufficiently high ratio of true to false beliefs. In other words, this theory states that a true belief counts as knowledge only if it is produced by a reliable belief-forming process. Examples of reliable processes include: standard perceptual processes, remembering, good reasoning, and introspection.[15]
Reliabilism has been challenged by Gettier cases. Another argument that challenges reliabilism, like the Gettier cases (although it was not presented in the same short article as the Gettier cases), is the case of Henry and the barn façades. In the thought experiment, a man, Henry, is driving along and sees a number of buildings that resemble barns. Based on his perception of one of these, he concludes that he has just seen barns. While he has in fact seen one real barn, and his belief about it was based on his perception of that barn, all the other barn-like buildings he saw were façades. Theoretically, Henry does not know that he has seen a barn, despite both his belief that he has seen one being true and his belief being formed on the basis of a reliable process (i.e. his vision), since he only acquired his true belief by accident.[16]
Other responses
Robert Nozick has offered the following definition of knowledge: S knows that P if and only if:
  • P;
  • S believes that P;
  • if P were false, S would not believe that P;
  • if P were true, S would believe that P.[17]
Nozick argues that the third of these conditions serves to address cases of the sort described by Gettier. Nozick further claims this condition addresses a case of the sort described by D. M. Armstrong:[18] A father believes his daughter innocent of committing a particular crime, both because of faith in his baby girl and (now) because he has seen presented in the courtroom a conclusive demonstration of his daughter's innocence. His belief via the method of the courtroom satisfies the four subjunctive conditions, but his faith-based belief does not. If his daughter were guilty, he would still believe her innocent, on the basis of faith in his daughter; this would violate the third condition.
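One common way of writing Nozick's four conditions compactly (a sketch, using the subjunctive-conditional arrow $\Box\!\!\to$ for "if ... were the case, ... would be the case"):

\[ S \text{ knows that } P \;\iff\; P \,\wedge\, B_S(P) \,\wedge\, (\neg P \;\Box\!\!\to\; \neg B_S(P)) \,\wedge\, (P \;\Box\!\!\to\; B_S(P)) \]

In Armstrong's courtroom case, the father's faith-based belief fails the third conjunct: if his daughter were guilty, he would still believe her innocent.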

The British philosopher Simon Blackburn has criticized this formulation by suggesting that we do not want to accept as knowledge beliefs which, while they "track the truth" (as Nozick's account requires), are not held for appropriate reasons. He says that "we do not want to award the title of knowing something to someone who is only meeting the conditions through a defect, flaw, or failure, compared with someone else who is not meeting the conditions."[19] In addition to this, externalist accounts of knowledge, such as Nozick's, are often forced to reject closure in cases where it is intuitively valid.

Timothy Williamson has advanced a theory of knowledge according to which knowledge is not justified true belief plus some extra condition(s), but primary. In his book Knowledge and its Limits, Williamson argues that the concept of knowledge cannot be broken down into a set of other concepts through analysis—instead, it is sui generis. Thus, though knowledge requires justification, truth, and belief, the word "knowledge" can't be, according to Williamson's theory, accurately regarded as simply shorthand for "justified true belief".

Alvin Goldman writes in his Causal Theory of Knowing that, in order for knowledge to truly exist, there must be a causal chain between the fact that makes a proposition true and the person's belief in that proposition.

Externalism and internalism

A central debate about the nature of justification is a debate between epistemological externalists on the one hand, and epistemological internalists on the other.

Externalists hold that factors deemed "external", meaning outside of the psychological states of those who gain knowledge, can be conditions of justification. For example, an externalist response to the Gettier problem is to say that, in order for a justified true belief to count as knowledge, there must be a link or dependency between the belief and the state of the external world. Usually this is understood to be a causal link. Such causation, to the extent that it is "outside" the mind, would count as an external, knowledge-yielding condition. Internalists, on the other hand, assert that all knowledge-yielding conditions are within the psychological states of those who gain knowledge.

Though Descartes himself was not familiar with the internalist/externalist debate, many point to him as an early example of the internalist path to justification. He wrote that, because the only method by which we perceive the external world is through our senses, and because the senses are not infallible, we should not consider our concept of knowledge to be infallible. The only way to find anything that could be described as "indubitably true", he advocates, would be to see things "clearly and distinctly".[20] He argued that if there is an omnipotent, good being who made the world, then it's reasonable to believe that people are made with the ability to know. However, this does not mean that man's ability to know is perfect. God gave man the ability to know, but not omniscience. Descartes said that man must use his capacities for knowledge correctly and carefully through methodological doubt.[21] The dictum "Cogito ergo sum" (I think, therefore I am) is also commonly associated with Descartes' theory, because in his own methodological doubt, doubting everything he previously knew in order to start from a blank slate, the first thing that he could not logically bring himself to doubt was his own existence: "I do not exist" would be a contradiction in terms; the act of saying that one does not exist assumes that someone must be making the statement in the first place. Though Descartes could doubt his senses, his body and the world around him, he could not deny his own existence, because he was able to doubt and must exist in order to do so. Even if some "evil genius" were to be deceiving him, he would have to exist in order to be deceived. This one sure point provided him with what he would call his Archimedean point, in order to further develop his foundation for knowledge. Simply put, Descartes' epistemological justification depended upon his indubitable belief in his own existence and his clear and distinct knowledge of God.[22]

Value problem

We generally assume that knowledge is more valuable than mere true belief. If so, what is the explanation? A formulation of the value problem in epistemology first occurs in Plato's Meno. Socrates points out to Meno that a man who knew the way to Larissa could lead others there correctly. But so, too, could a man who had true beliefs about how to get there, even if he had not gone there or had any knowledge of Larissa. Socrates says that it seems that both knowledge and true opinion can guide action. Meno then wonders why knowledge is valued more than true belief, and why knowledge and true belief are different. Socrates responds that knowledge is more valuable than mere true belief because it is tethered, or justified. Justification, or working out the reason for a true belief, locks down true belief.[23]

The problem is to identify what (if anything) makes knowledge more valuable than mere true belief, or that makes knowledge more valuable than a more minimal conjunction of its components, such as justification, safety, sensitivity, statistical likelihood, and anti-Gettier conditions, on a particular analysis of knowledge that conceives of knowledge as divided into components (to which knowledge-first epistemological theories, which posit knowledge as fundamental, are notable exceptions).[24] The value problem reemerged in the philosophical literature on epistemology in the twenty-first century following the rise of virtue epistemology in the 1980s, partly because of the obvious link to the concept of value in ethics.[25]

The value problem has been presented as an argument against epistemic reliabilism by philosophers including Linda Zagzebski, Wayne Riggs and Richard Swinburne. Zagzebski analogizes the value of knowledge to the value of espresso produced by an espresso maker: "The liquid in this cup is not improved by the fact that it comes from a reliable espresso maker. If the espresso tastes good, it makes no difference if it comes from an unreliable machine."[26] For Zagzebski, the value of knowledge deflates to the value of mere true belief. She assumes that reliability in itself has no value or disvalue, but Goldman and Olsson disagree. They point out that Zagzebski's conclusion rests on the assumption of veritism: all that matters is the acquisition of true belief.[27] To the contrary, they argue that a reliable process for acquiring a true belief adds value to the mere true belief by making it more likely that future beliefs of a similar kind will be true. By analogy, having a reliable espresso maker that produced a good cup of espresso would be more valuable than having an unreliable one that luckily produced a good cup because the reliable one would more likely produce good future cups compared to the unreliable one.

The value problem is important to assessing the adequacy of theories of knowledge that conceive of knowledge as consisting of true belief and other components. According to Kvanvig, an adequate account of knowledge should resist counterexamples and allow an explanation of the value of knowledge over mere true belief. Should a theory of knowledge fail to do so, it would prove inadequate.[28]

One of the more influential responses to the problem is that knowledge is not particularly valuable and is not what ought to be the main focus of epistemology. Instead, epistemologists ought to focus on other mental states, such as understanding.[29] Advocates of virtue epistemology have argued that the value of knowledge comes from an internal relationship between the knower and the mental state of believing.[24]

Acquiring knowledge

A priori and a posteriori knowledge

The nature of this distinction has been disputed by various philosophers; however, the terms may be roughly defined as follows:
  • A priori knowledge is knowledge that is known independently of experience (that is, it is non-empirical, or arrived at beforehand, usually by reason). It is acquired through means that do not depend on experience.
  • A posteriori knowledge is knowledge that is known by experience (that is, it is empirical, or arrived at afterward).
A priori knowledge is a way of gaining knowledge without the need of experience. In Bruce Russell's article "A Priori Justification and Knowledge"[30] he says that it is "knowledge based on a priori justification," (1) which relies on intuition and the nature of these intuitions. A priori knowledge is often contrasted with a posteriori knowledge, which is knowledge gained by experience. A way to look at the difference between the two is through an example. Bruce Russell gives two propositions in which the reader decides which one he believes more. Option A: All crows are birds. Option B: All crows are black. If you believe option A, then you are a priori justified in believing it because you don't have to see a crow to know it's a bird. If you believe option B, then you are a posteriori justified in believing it because you have seen many crows and therefore know they are black. He goes on to say that it does not matter whether the statement is true or not, only which of the two you believe, and on what basis you believe it.

The idea of a priori knowledge is that it is based on intuition or rational insights. Laurence BonJour says in his article "The Structure of Empirical Knowledge",[31] that a "rational insight is an immediate, non-inferential grasp, apprehension or 'seeing' that some proposition is necessarily true." (3) Going back to the crow example, by Laurence BonJour's definition the reason you would believe option A is that you have an immediate grasp that a crow is a bird, without ever having to experience one.

Evolutionary psychology takes a novel approach to the problem. It says that there is an innate predisposition for certain types of learning. "Only small parts of the brain resemble a tabula rasa; this is true even for human beings. The remainder is more like an exposed negative waiting to be dipped into a developer fluid"[32]

Analytic–synthetic distinction

Immanuel Kant, in his Critique of Pure Reason, drew a distinction between "analytic" and "synthetic" propositions. He contended that some propositions are such that we can know them to be true just by understanding their meaning. For example, consider, "My father's brother is my uncle." We can know it to be true solely by virtue of our understanding what its terms mean. Philosophers call such propositions "analytic". Synthetic propositions, on the other hand, have distinct subjects and predicates. An example would be, "My father's brother has black hair." Kant held that mathematical statements and the fundamental principles of natural science are synthetic a priori propositions: they are necessarily true, yet knowledge of the attributes of their mathematical or physical subjects cannot be obtained by merely analyzing the concepts involved.

The American philosopher Willard Van Orman Quine, in his Two Dogmas of Empiricism, famously challenged the distinction, arguing that the two have a blurry boundary. Some contemporary philosophers have offered more sustainable accounts of the distinction.[33]

Branches or schools of thought

Historical

The historical study of philosophical epistemology is the historical study of efforts to gain philosophical understanding or knowledge of the nature and scope of human knowledge.[34] Since efforts to get that kind of understanding have a history, the questions philosophical epistemology asks today about human knowledge are not necessarily the same as they once were.[34] But that does not mean that philosophical epistemology is itself a historical subject, or that it pursues only or even primarily historical understanding.[34]

Empiricism

In philosophy, empiricism is generally a theory of knowledge focusing on the role of experience, especially experience based on perceptual observations by the senses. Certain forms treat all knowledge as empirical,[citation needed] while some regard disciplines such as mathematics and logic as exceptions.[citation needed]

There are many variants of empiricism, positivism, realism and common sense being among the most commonly expounded. But central to all empiricist epistemologies is the notion of the epistemologically privileged status of sense data.

Idealism

Many idealists believe that knowledge is primarily (at least in some areas) acquired by a priori processes or is innate—for example, in the form of concepts not derived from experience. The relevant theoretical processes often go by the name "intuition".[35] The relevant theoretical concepts may purportedly be part of the structure of the human mind (as in Kant's theory of transcendental idealism), or they may be said to exist independently of the mind (as in Plato's theory of Forms).

Rationalism

By contrast with empiricism and idealism, which centre around the epistemologically privileged status of sense data (empirical) and the primacy of Reason (theoretical) respectively, modern rationalism adds a third 'system of thinking' (as Gaston Bachelard has termed these areas) and holds that all three are of equal importance: the empirical, the theoretical and the abstract. For Bachelard, rationalism makes equal reference to all three systems of thinking.

Constructivism

Constructivism is a view in philosophy according to which all "knowledge is a compilation of human-made constructions",[36] "not the neutral discovery of an objective truth".[37] Whereas objectivism is concerned with the "object of our knowledge", constructivism emphasises "how we construct knowledge".[38] Constructivism proposes new definitions for knowledge and truth that form a new paradigm, based on inter-subjectivity instead of the classical objectivity, and on viability instead of truth. Piagetian constructivism, however, believes in objectivity—constructs can be validated through experimentation. The constructivist point of view is pragmatic;[39] as Vico said: "The norm of the truth is to have made it."

Pragmatism

Pragmatism is an empiricist epistemology formulated by Charles Sanders Peirce, William James, and John Dewey, which understands truth as that which is practically applicable in the world. Peirce formulates the maxim: 'Consider what effects, that might conceivably have practical bearings, we conceive the object of our conception to have. Then, our conception of these effects is the whole of our conception of the object.'[40] This suggests that we are to analyse ideas and objects in the world for their practical value. This is in contrast to any correspondence theory of truth which holds that what is true is what corresponds to an external reality. William James suggests that through a pragmatist epistemology 'Theories thus become instruments, not answers to enigmas in which we can rest.' [41] A more contemporary understanding of pragmatism was developed by the philosopher Richard Rorty who proposed that values were historically contingent and dependent upon their utility within a given historical period. [42]

Regress problem

The regress problem is the problem of providing a complete logical foundation for human knowledge. The traditional way of supporting a rational argument is to appeal to other rational arguments, typically using chains of reason and rules of logic. A classic example that goes back to Aristotle is deducing that Socrates is mortal. We have a logical rule that says all humans are mortal and an assertion that Socrates is human, and we deduce that Socrates is mortal. In this example, how do we know that Socrates is human? Presumably we apply other rules, such as: all born from human females are human. But that leaves open the question of how we know that all born from human females are human. This is the regress problem: how can we eventually terminate a logical argument with some statement(s) that do not require further justification but can still be considered rational and justified?
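The chain can be displayed schematically (a sketch, writing H for "is human", M for "is mortal", B for "was born of a human female", and s for Socrates):

\[ M(s) \;\Leftarrow\; \{\forall x\,(H(x) \to M(x)),\; H(s)\} \;\Leftarrow\; \{\forall x\,(B(x) \to H(x)),\; B(s)\} \;\Leftarrow\; \cdots \]

Each set of premises licenses the step to its left but is itself a further claim in need of support, which is exactly where the question of termination arises.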
As John Pollock stated:
... to justify a belief one must appeal to a further justified belief. This means that one of two things can be the case. Either there are some beliefs that we can be justified for holding, without being able to justify them on the basis of any other belief, or else for each justified belief there is an infinite regress of (potential) justification [the nebula theory]. On this theory there is no rock bottom of justification. Justification just meanders in and out through our network of beliefs, stopping nowhere.[43]
The apparent impossibility of completing an infinite chain of reasoning is thought by some to support skepticism. It is also the impetus for Descartes' famous dictum: I think, therefore I am. Descartes was looking for some logical statement that could be true without appeal to other statements.

Response to the regress problem

Many epistemologists studying justification have attempted to argue for various types of chains of reasoning that can escape the regress problem.
  • Foundationalism
Foundationalists respond to the regress problem by asserting that certain "foundations" or "basic beliefs" support other beliefs but do not themselves require justification from other beliefs. These beliefs might be justified because they are self-evident, infallible, or derive from reliable cognitive mechanisms. Perception, memory, and a priori intuition are often considered to be possible examples of basic beliefs.

The chief criticism of foundationalism is that if a belief is not supported by other beliefs, accepting it may be arbitrary or unjustified.[44]
  • Coherentism
Another response to the regress problem is coherentism, which is the rejection of the assumption that the regress proceeds according to a pattern of linear justification. To avoid the charge of circularity, coherentists hold that an individual belief is justified circularly by the way it fits together (coheres) with the rest of the belief system of which it is a part. This theory has the advantage of avoiding the infinite regress without claiming special, possibly arbitrary status for some particular class of beliefs. Yet, since a system can be coherent while also being wrong, coherentists face the difficulty of ensuring that the whole system corresponds to reality. Additionally, most logicians agree that any argument that is circular is trivially valid. That is, to be illuminating, arguments must be linear with conclusions that follow from stated premises.

However, Warburton writes in 'Thinking from A to Z', "Circular arguments are not invalid; in other words, from a logical point of view there is nothing intrinsically wrong with them. However, they are, when viciously circular, spectacularly uninformative. (Warburton 1996)."
  • Foundherentism
A position known as "foundherentism", advanced by Susan Haack, is meant to be a unification of foundationalism and coherentism. One component of this theory is what is called the "analogy of the crossword puzzle." Whereas, for example, infinitists regard the regress of reasons as "shaped" like a single line, Susan Haack has argued that it is more like a crossword puzzle, with multiple lines mutually supporting each other.[45]
  • Infinitism
An alternative resolution to the regress problem is known as "infinitism". Infinitists take the infinite series to be merely potential, in the sense that an individual may have indefinitely many reasons available to them, without having consciously thought through all of these reasons when the need arises. This position is motivated in part by the desire to avoid what is seen as the arbitrariness and circularity of its chief competitors, foundationalism and coherentism.

Indian pramana

Indian philosophical schools such as the Hindu Nyaya, and Carvaka, and later, the Jain and Buddhist philosophical schools, developed an epistemological tradition which is termed "pramana" independently of the Western philosophical tradition. Pramana can be translated as "instrument of knowledge" and refers to various means or sources of knowledge which were held to be reliable by Indian philosophers. Each school of Indian philosophy had its own theories about which pramanas were valid means to knowledge and which were unreliable (and why).[46] A Vedic text, Taittirīya Āraṇyaka (c. 9th–6th centuries BCE), lists "four means of attaining correct knowledge": smṛti ("tradition" or "scripture"), pratyakṣa ("perception"), aitihya ("communication by one who is expert", or "tradition"), and anumāna ("reasoning" or "inference").[47][48]
In the Indian traditions, the most widely discussed pramanas are: Pratyakṣa (perception), Anumāṇa (inference), Upamāṇa (comparison and analogy), Arthāpatti (postulation, derivation from circumstances), Anupalabdi (non-perception, negative/cognitive proof) and Śabda (word, testimony of past or present reliable experts). While the Nyaya school (beginning with the Nyāya Sūtras of Gotama, between the 6th century BCE and the 2nd century CE[49][50]) was a proponent of realism and supported four pramanas (perception, inference, comparison/analogy and testimony), the Buddhist epistemologists (Dignaga and Dharmakirti) generally accepted only perception and inference.

The theory of knowledge of the Buddha in the early Buddhist texts has been interpreted as a form of pragmatism as well as a form of correspondence theory.[51] Likewise, the Buddhist philosopher Dharmakirti has been interpreted as holding either a form of pragmatism or a correspondence theory, on account of his view that what is true is what has effective power (arthakriya).[52][53] The Buddhist Madhyamika school's theory of emptiness (shunyata) meanwhile has been interpreted as a form of philosophical skepticism.[54]

The main Jain contribution to epistemology has been their theory of "many-sidedness" or "multi-perspectivism" (Anekantavada), which says that since the world is multifaceted, any single viewpoint is limited (naya — a partial standpoint).[55] This has been interpreted as a kind of pluralism or perspectivism.[56][57] According to Jain epistemology, none of the pramanas gives absolute or perfect knowledge, since each is a limited point of view.

The Carvaka school of materialists accepted only the pramana of perception, and hence was one of the first empiricist schools.[58] There was also another school of philosophical skepticism, the Ajñana.

Skepticism

Skepticism is a position that questions the validity of some or all of human knowledge. Skepticism does not refer to any one specific school of philosophy; rather, it is a thread that runs through many philosophical discussions of epistemology. The first well-known Greek skeptic was Socrates, who claimed that his only knowledge was that he knew nothing with certainty. In Indian philosophy, Sanjaya Belatthiputta was a famous skeptic and the Buddhist Madhyamika school has been seen as taking up a form of skepticism. Descartes' most famous inquiry into mind and body also began as an exercise in skepticism. Descartes began by questioning the validity of all knowledge and looking for some fact that was irrefutable. In so doing, he came to his famous dictum: I think, therefore I am.
Foundationalism and the other responses to the regress problem are essentially defenses against skepticism. Similarly, the pragmatism of William James can be viewed as a coherentist defense against skepticism. James discarded conventional philosophical views of truth and defined truth to be based on how well a concept works in a specific context rather than objective rational criteria. The philosophy of Logical Positivism and the work of philosophers such as Kuhn and Popper can be viewed as skepticism applied to what can truly be considered scientific knowledge.[59]

Big History

From Wikipedia, the free encyclopedia

A diagram of the Big Bang expansion according to NASA

Artist's depiction of the WMAP satellite gathering data to help scientists understand the Big Bang

Big History is an emerging academic discipline which examines history from the Big Bang to the present. It examines long time frames using a multidisciplinary approach based on combining numerous disciplines from science and the humanities,[1][2][3][4][5] and explores human existence in the context of this bigger picture.[6] It integrates studies of the cosmos, Earth, life, and humanity using empirical evidence to explore cause-and-effect relations,[7][8] and is taught at universities[9] and secondary schools[10] often using web-based interactive presentations.[10] According to historian David Christian, who has been credited with coining the term "Big History",[7][9][11] the intellectual movement is made up of an "unusual coalition of scholars".[2] Some historians have expressed skepticism towards "scientific history" and argue that the claims of Big History are unoriginal.[12] Others support the scientific merit but point out that cosmology[13] and natural history[14] have been studied since the Renaissance, and that the new term, Big History, continues such work.

Conventional history versus Big History:
  • Time span: 5000 BCE to the present, versus the Big Bang to the present.
  • Duration covered: 7,000-10,000 years, versus 13.8 billion years.
  • Organization: compartmentalized fields of study, versus an interdisciplinary approach.
  • Focus: human civilization, versus how humankind fits within the universe.
  • Teaching: taught mostly with books, versus interactive websites such as ChronoZoom.
  • Scale: microhistory, versus macrohistory.
  • Emphasis: trends and processes, versus analogy and metaphor.
  • Evidence: a variety of documents, including written records and material artifacts, versus current knowledge about phenomena such as fossils, ecological changes, genetic analysis, and telescope data.

Comparison with conventional history

Big History examines the past using numerous time scales, from the Big Bang to modernity,[4] unlike conventional history courses which typically begin with the introduction of farming and civilization,[15] or with the beginning of written records. It explores common themes and patterns.[10] Courses generally do not focus on humans until more than halfway through,[7] and, unlike conventional history courses, there is not much focus on kingdoms or civilizations or wars or national borders.[7] If conventional history focuses on human civilization with humankind at the center, Big History focuses on the universe and shows how humankind fits within this framework[16] and places human history in the wider context of the universe's history.[17][18]


Conventional history often begins with the development of agriculture in civilizations such as Ancient Egypt.

Control of fire by early humans predating both agriculture and civilization.

Unlike conventional history, Big History tends to go rapidly through detailed historical eras such as the Renaissance or Ancient Egypt.[19] It draws on the latest findings from biology,[4] astronomy,[4] geology,[4] climatology, prehistory, archaeology, anthropology, evolutionary biology, chemistry, psychology, hydrology, geography, paleontology, ancient history, physics, economics,[4] cosmology,[4] natural history, and population and environmental studies as well as standard history.[20] One teacher explained:
We're taking the best evidence from physics and the best evidence from chemistry and biology, and we're weaving it together into a story ... They're not going to learn how to balance [chemical] equations, but they're going to learn how the chemical elements came out of the death of stars, and that's really interesting.
— [10]
Big History arose from a desire to go beyond the specialized and self-contained fields that emerged in the 20th century. It tries to grasp history as a whole, looking for common themes across multiple time scales in history.[21][22] Conventional history typically begins with the invention of writing, and is limited to past events relating directly to the human race. Big Historians point out that this limits study to the past 5,000 years and neglects the much longer time when humans existed on Earth. Henry Kannberg sees Big History as being a product of the Information Age, a stage in history itself following speech, writing, and printing.[23] Big History covers the formation of the universe, stars, and galaxies, and includes the beginning of life as well as the period of several hundred thousand years when humans were hunter-gatherers. It sees the transition to civilization as a gradual one, with many causes and effects, rather than an abrupt transformation from uncivilized static cavemen to dynamic civilized farmers.[24] An account in The Boston Globe describes what it polemically asserts to be the conventional "history" view:
Early humans were slump-shouldered, slope-browed, hairy brutes. They hunkered over campfires and ate scorched meat. Sometimes they carried spears. Once in a while they scratched pictures of antelopes on the walls of their caves. That's what I learned during elementary school, anyway. History didn't start with the first humans - they were cavemen! The Stone Age wasn't history; the Stone Age was a preamble to history, a dystopian era of stasis before the happy onset of civilization, and the arrival of nifty developments like chariot wheels, gunpowder, and Google. History started with agriculture, nation-states, and written documents. History began in Mesopotamia's Fertile Crescent, somewhere around 4000 BC. It began when we finally overcame our savage legacy, and culture surpassed biology.
— Anthony Doerr reviewing On Deep History and the Brain by Daniel Lord Smail, 2007[24]
Big History, in contrast to conventional history, has more of an interdisciplinary basis.[10] Advocates sometimes view conventional history as "microhistory" or "shallow history", and note that three-quarters of historians specialize in understanding the last 250 years while ignoring the "long march of human existence."[2] However, one historian disputed that the discipline of history has overlooked the big view, and described the "grand narrative" of Big History as a "cliché that gets thrown around a lot."[2] One account suggested that conventional history had the "sense of grinding the nuts into an ever finer powder."[20] It emphasizes long-term trends and processes rather than history-making individuals or events.[2] Historian Dipesh Chakrabarty of the University of Chicago suggested that Big History was less politicized than contemporary history because it enables people to "take a step back."[2] It uses more kinds of evidence than the standard historical written records, such as fossils, tools, household items, pictures, structures, ecological changes and genetic variations.[2]

Critics of Big History, including sociologist Frank Furedi, have deemed the discipline an "anti-humanist turn of history."[25] The Big History narrative has also been challenged for failing to engage with the methodology of the conventional history discipline. According to historian and educator Sam Wineburg of Stanford University, Big History eschews the interpretation of texts in favor of a purely scientific approach, thus becoming "less history and more of a kind of evolutionary biology or quantum physics."[26]

Themes


Radiometric dating techniques help scientists determine the age of rocks, as well as of the Earth and the Solar System.

Professor David Christian argued that the recent past is only understandable in terms of the "whole 14-billion-year span of time itself."[20] Big History seeks to retell the "human story" in light of scientific advances by such methods as radiocarbon dating and genetic analysis. In some instances, it uses mathematical modeling to explore interactions between long-term trends in sociological systems; this has led Peter Turchin of the University of Connecticut to coin the term cliodynamics to describe how mathematical models might explain events such as the growth of empires, social discontent, and the collapse of nations.[12] According to one view, the field explores the mix of individual action and social and environmental forces.[2] While conventional history might see an invention such as sharper spear points as deliberately created by some smart humans and then copied by others, a Big History perspective might see sharper spear points as arising accidentally, with natural evolutionary processes then enabling their users to become better hunters even if they did not understand why.[15] Big History seeks to discover repeating patterns during the 13.8 billion years since the Big Bang,[1] for example that "chaos catalyzes creativity", as when an asteroid impact wiped out the dinosaurs.[1]
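To make the modeling idea concrete, the following is a minimal, illustrative Python sketch of the kind of coupled long-term dynamics that cliodynamics studies. It is not Turchin's published model; the variables, parameters, and values are hypothetical and chosen only to show how a pair of simple equations can produce slow rises and declines over centuries.

    # Illustrative toy model in the spirit of cliodynamics: coupled population
    # and state-resource dynamics, integrated with a simple Euler scheme.
    # All names and parameter values are hypothetical, for demonstration only.

    def simulate(years=500, dt=1.0):
        N, S = 0.1, 0.0            # population and state resources (arbitrary units)
        r, k0, c = 0.02, 1.0, 3.0  # growth rate, base carrying capacity, state boost
        history = []
        for step in range(int(years / dt)):
            K = k0 + c * max(S, 0.0)           # state capacity raises the effective carrying capacity
            dN = r * N * (1.0 - N / K)         # logistic population growth
            dS = N * (1.0 - N / K) - 0.1 * S   # surplus feeds the state; upkeep drains it
            N += dN * dt
            S += dS * dt
            history.append((step * dt, N, S))
        return history

    if __name__ == "__main__":
        for t, N, S in simulate()[::50]:
            print(f"year {t:5.0f}  population {N:6.3f}  state resources {S:6.3f}")

Run over a few simulated centuries, the toy system first expands and then contracts as the surplus that sustains the state dries up, the sort of boom-and-bust trajectory that cliodynamic models examine with far more realistic detail.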

Time scales and questions

Big History makes comparisons based on different time scales, or what David Christian calls "the play of scales", and notes similarities and differences between the human, geological, and cosmological scales. Christian believes such "radical shifts in perspective" will yield "new insights into familiar historical problems, from the nature/nurture debate to environmental history to the fundamental nature of change itself."[20] It shows how human existence has been shaped by both human-made and natural factors: for example, iron that emerged from the remains of an exploding star more than four billion years ago later allowed humans to forge hard metal weapons for hunting and war.[7] The discipline addresses such questions as "How did we get here?," "How do we decide what to believe?," "How did Earth form?," and "What is life?"[4] It offers a "grand tour of all the major scientific paradigms,"[17] and, according to one view, it helps students become scientifically literate quickly.[17]

Cosmic evolution

Cosmic evolution, the scientific study of universal change, is closely related to Big History (as are the allied subjects of the epic of evolution and astrobiology). Some researchers regard cosmic evolution as the broader of the two, since Big History mainly (and rightfully) examines the specific historical trek from Big Bang → Milky Way → Sun → Earth → humanity, whereas cosmic evolution addresses all complex systems, not merely those that led to humans. Cosmic evolution, which is also sometimes called cosmological history or universal history, has been taught and researched for decades, mostly by astronomers and astrophysicists, and this Big-Bang-to-humankind scenario well preceded the subject that some historians began calling Big History in the 1990s. Cosmic evolution is an intellectual framework that offers a grand synthesis of the many varied changes in the assembly and composition of radiation, matter, and life throughout the history of the universe. While engaging the time-honored queries of who we are and whence we came, this interdisciplinary subject attempts to unify the sciences within the entirety of natural history: a single, inclusive scientific narrative of the origin and evolution of all material things over roughly 14 billion years, from the origin of the universe to the present day on Earth.

The roots of the idea of cosmic evolution extend back millennia. Ancient Greek philosophers of the fifth century BCE, most notably Heraclitus, are celebrated for their reasoned claims that all things change. Early modern speculation about cosmic evolution began more than a century ago, including the broad insights of Robert Chambers, Herbert Spencer, and Lawrence Henderson. Only in the mid-20th century was the cosmic-evolutionary scenario articulated as a research paradigm encompassing empirical studies of galaxies, stars, planets, and life; in short, an expansive agenda that combines physical, biological, and cultural evolution. Harlow Shapley widely articulated the idea of cosmic evolution (often calling it "cosmography") in public venues at mid-century,[27] and NASA embraced it in the late 20th century as part of its more limited astrobiology program. Carl Sagan,[28] Eric Chaisson,[29] Hubert Reeves,[30] Erich Jantsch,[31] and Preston Cloud,[32] among others, extensively championed cosmic evolution around 1980. This extremely broad subject continues to be richly formulated as both a technical research program and a scientific worldview for the 21st century.[33][34][35]

Cosmic evolution can elicit controversy for several reasons: evolution of any kind inherently attracts detractors, especially among religious fundamentalists; cosmic evolution addresses universal and human origins, which often elevate emotions; it challenges age-old ideas about life's sense of place in the cosmos; it embraces change, which many people dislike or distrust; it welcomes a broad interpretation of the concept of evolution, replacing the idea of evolution exclusive to life, which some biologists prefer; it proposes a sweeping, interdisciplinary worldview based on rationality and empiricism, which, despite its experimental tests, some find intellectually arrogant.[citation needed]

One popular collection of scholarly materials on cosmic evolution is based on teaching and research that has been underway at Harvard University since the mid-1970s.[36]

Complexity, energy, thresholds

Cosmic evolution is a quantitative subject, whereas big history typically is not, because cosmic evolution is practiced mostly by natural scientists and big history mostly by social scholars. The two subjects, closely allied and overlapping, benefit from each other. Cosmic evolutionists tend to treat universal history linearly, so humankind enters their story only in the most recent times, whereas big historians tend to stress humanity and its many cultural achievements, granting human beings a larger part of their story. One can compare and contrast these different emphases by watching two short films portraying the Big-Bang-to-humankind narrative, one animating time linearly and the other capturing time (actually look-back time) logarithmically; in the former, humans enter the 14-minute film only in its last second, while in the latter we appear much earlier. Both are correct.[37][38]
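A rough numerical illustration of these two treatments of time is sketched below in Python. The 14-minute running time and the roughly 13.8-billion-year span come from the text; the one-year floor for the logarithmic scale and the individual event dates are approximate, assumed figures rather than values taken from the films themselves.

    import math

    AGE_YR = 13.8e9        # age of the universe in years (figure used in the text)
    FILM_S = 14 * 60       # running time of a 14-minute film, in seconds
    FLOOR_YR = 1.0         # assumed floor for the logarithmic look-back scale

    def linear_position(lookback_yr):
        # Seconds into the film at which an event appears when time is mapped linearly.
        return FILM_S * (1.0 - lookback_yr / AGE_YR)

    def log_position(lookback_yr):
        # Seconds into the film when look-back time is mapped logarithmically.
        span = math.log10(AGE_YR / FLOOR_YR)
        return FILM_S * math.log10(AGE_YR / max(lookback_yr, FLOOR_YR)) / span

    events = [("first stars", 13.6e9), ("Earth forms", 4.5e9),
              ("Homo sapiens", 2.5e5), ("agriculture", 1.1e4)]
    for name, lookback in events:
        print(f"{name:14s} linear: {linear_position(lookback):7.2f} s   "
              f"log: {log_position(lookback):7.2f} s  (of {FILM_S} s)")

With these assumptions, Homo sapiens appears only in the final fraction of a second of the linear film but near the middle of the logarithmic one, which is exactly the contrast the two films are meant to convey.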

These different treatments of time over ~14 billion years, each with different emphases on historical content, are further clarified by noting that some cosmic evolutionists divide the whole narrative into three phases and seven epochs:
Phases: physical evolution → biological evolution → cultural evolution
Epochs: particulate → galactic → stellar → planetary → chemical → biological → cultural
This contrasts with the approach of some big historians, who divide the narrative into many more thresholds, as discussed at the end of this section. Yet another telling of the Big-Bang-to-humankind story emphasizes the earlier universe, particularly the growth of particles, galaxies, and large-scale cosmic structure, as in physical cosmology.

Notable among quantitative efforts to describe cosmic evolution is Eric Chaisson's research on energy flow through open, thermodynamic systems, including galaxies, stars, planets, life, and society.[39][40][41] The observed increase of energy rate density (energy per unit time per unit mass) among a whole host of complex systems is one useful way to explain the rise of complexity in an expanding universe that still obeys the second law of thermodynamics and thus continues to accumulate net entropy. As such, ordered material systems, from buzzing bees and redwood trees to shining stars and thinking beings, are viewed as temporary, local islands of order in a vast, global sea of disorder. A recent review article, directed especially toward big historians, summarizes much of this empirical effort over the past decade.[42]

One striking finding of such complexity studies is an apparently ranked order among all known material systems in the universe. Although the absolute energy in astronomical systems greatly exceeds that of humans, and although the mass densities of stars, planets, bodies, and brains are all comparable, the energy rate density for humans and modern human society is approximately a million times greater than for stars and galaxies. For example, the Sun emits a vast luminosity, 4×10^33 erg/s (equivalent to nearly a billion billion billion watt light bulb), but it also has a huge mass, 2×10^33 g; thus each second only about 2 ergs of energy pass through each gram of this star. In contrast, more energy flows through each gram of a plant's leaf during photosynthesis, and much more again (tens of thousands of times more) rushes through each gram of a human brain while thinking (~20 W / 1350 g).[43]
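The comparison above is easy to check. The short Python sketch below uses the Sun's luminosity and mass and the brain figure (~20 W per 1,350 g) quoted in the paragraph; the only extra ingredient is the standard conversion 1 joule = 10^7 erg.

    # Energy rate density (erg per second per gram) for the Sun and the human
    # brain, using the figures quoted in the paragraph above.

    ERG_PER_JOULE = 1e7               # standard unit conversion

    sun_luminosity_erg_s = 4e33       # erg/s, the Sun's radiated power
    sun_mass_g = 2e33                 # g, the Sun's mass
    brain_power_w = 20.0              # W, approximate power of a thinking human brain
    brain_mass_g = 1350.0             # g, approximate brain mass

    phi_sun = sun_luminosity_erg_s / sun_mass_g                # energy rate density of the Sun
    phi_brain = brain_power_w * ERG_PER_JOULE / brain_mass_g   # energy rate density of the brain

    print(f"Sun:   {phi_sun:12.1f} erg/s/g")
    print(f"Brain: {phi_brain:12.1f} erg/s/g")
    print(f"Ratio: {phi_brain / phi_sun:12.0f} times")

With these numbers the Sun comes out at about 2 erg/s/g and the brain at roughly 1.5 × 10^5 erg/s/g, several tens of thousands of times higher, which is the ordering (astronomical systems low, biological and cultural systems high) that the ranked hierarchy describes.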

Cosmic evolution is more than a subjective, qualitative assertion of "one damn thing after another". This inclusive scientific worldview constitutes an objective, quantitative approach toward deciphering much of what comprises organized, material Nature. Its uniform, consistent philosophy of approach toward all complex systems demonstrates that the basic differences, both within and among many varied systems, are of degree, not of kind. And, in particular, it suggests that optimal ranges of energy rate density grant opportunities for the evolution of complexity; those systems able to adjust, adapt, or otherwise take advantage of such energy flows survive and prosper, while other systems adversely affected by too much or too little energy are non-randomly eliminated.[44]

Fred Spier is foremost among those big historians who have found the concept of energy flows useful, suggesting that Big History is the rise and demise of complexity on all scales, from sub-microscopic particles to vast galaxy clusters, and not least many biological and cultural systems in between.[45]

David Christian, in an 18-minute TED talk, described some of the basics of the Big History course.[46] He characterizes each stage in the progression towards greater complexity as a "threshold moment" at which things become more complex, but also more fragile and mobile.[46] Some of Christian's threshold stages are:


In a supernova, a star that has exhausted its nuclear fuel explodes, creating conditions in which heavier elements such as iron and gold can form.
  1. The universe appears, incredibly hot, bursting forth and expanding, within a second.[46]
  2. Stars are born.[46]
  3. Stars die, creating temperatures hot enough to make complex chemicals, as well as rocks, asteroids, planets, moons, and our solar system.[46]
  4. Earth is created.[46]
  5. Life appears on Earth, with complex molecules forming under Goldilocks conditions of neither too much nor too little energy.[46]
  6. Humans appear, bringing language and collective learning.[46]
Christian elaborated that more complex systems are more fragile, and that while collective learning is a powerful force to advance humanity in general, it is not clear that humans are in charge of it, and it is possible in his view for humans to destroy the biosphere with the powerful weapons that have been invented.[46]

In the 2008 lecture series for The Teaching Company's Great Courses entitled Big History: The Big Bang, Life on Earth, and the Rise of Humanity, Christian explains Big History in terms of eight thresholds of increasing complexity (a rough timeline sketch follows the list):[47]
  1. The Big Bang and the creation of the Universe about 13 billion years ago[47]
  2. The creation of the first complex objects, stars, about 12 billion years ago[47]
  3. The creation of chemical elements inside dying stars required for chemically-complex objects, including plants and animals[47]
  4. The formation of planets, such as our Earth, which are more chemically complex than the Sun[47]
  5. The creation and evolution of life from about 3.8 billion years ago, including the evolution of our hominine ancestors[47]
  6. The development of our species, Homo sapiens, about 250,000 years ago, covering the Paleolithic era of human history[47]
  7. The appearance of agriculture about 11,000 years ago in the Neolithic era, allowing for larger, more complex societies[47]
  8. The "modern revolution", or the vast social, economic, and cultural transformations that brought the world into the modern era[47]
  9. A speculative further threshold: what will happen in the future, and what the next threshold in our history might be[48][49]
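One way to see how compressed the later thresholds are is to rescale the approximate dates above onto a single 365-day "cosmic year", a common illustrative device rather than part of Christian's course material. The Python sketch below does this, filling in the undated thresholds with standard round figures (for example, about 4.5 billion years ago for the formation of the Earth).

    AGE_YR = 13.8e9       # age of the universe in years
    YEAR_DAYS = 365.0     # length of the illustrative "cosmic year"

    # Approximate look-back times: dated entries follow the list above,
    # the others use standard round figures for illustration.
    thresholds = [
        ("Big Bang",            13.8e9),
        ("First stars",         12.0e9),
        ("Sun and Earth form",   4.5e9),
        ("First life",           3.8e9),
        ("Homo sapiens",         2.5e5),
        ("Agriculture",          1.1e4),
        ("Modern revolution",    2.5e2),
    ]

    for name, years_ago in thresholds:
        day = YEAR_DAYS * (1.0 - years_ago / AGE_YR)
        print(f"{name:20s} ~day {day:5.1f} of the cosmic year")

On this scale everything from Homo sapiens onward lands within the final hour of 31 December, which is why big historians describe conventional history as covering only a sliver of the whole story.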

Goldilocks conditions


The Earth is ideally located in a Goldilocks condition—being neither too close nor too distant from the Sun.

A theme in Big History is what has been termed Goldilocks conditions or the Goldilocks principle, which describes how "circumstances must be right for any type of complexity to form or continue to exist," as emphasized by Spier in his book.[17] For humans, body temperature can be neither too hot nor too cold; for life to form on a planet, the planet can receive neither too much nor too little energy from sunlight. Stars require sufficient quantities of hydrogen, packed together under tremendous gravity, to cause nuclear fusion.[17]

Christian suggests that the universe creates complexity when these Goldilocks conditions are met, that is, when things are not too hot or cold, not too fast or slow. For example, life began not in solids (where molecules are stuck together, preventing the right kinds of associations) or gases (where molecules move too fast to allow favorable associations) but in liquids such as water, which permit the right kinds of interactions at the right speeds.[46]
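A minimal quantitative version of the "neither too close nor too distant" idea is the standard equilibrium-temperature estimate for a planet. The Python sketch below applies it to a Sun-like star; the albedo value and the 240–320 K "roughly right" band are assumed round numbers for illustration, and greenhouse warming is ignored.

    import math

    T_SUN = 5778.0        # K, effective surface temperature of the Sun
    R_SUN = 6.957e8       # m, solar radius
    AU = 1.496e11         # m, one astronomical unit

    def equilibrium_temp(distance_m, albedo=0.3):
        # Standard blackbody equilibrium temperature of a planet (greenhouse effects ignored).
        return T_SUN * math.sqrt(R_SUN / (2.0 * distance_m)) * (1.0 - albedo) ** 0.25

    for au in (0.5, 0.7, 1.0, 1.5, 2.0):
        t = equilibrium_temp(au * AU)
        zone = "too hot" if t > 320 else "too cold" if t < 240 else "roughly right"
        print(f"{au:3.1f} AU: {t:6.1f} K  ({zone})")

Only a narrow band of orbital distances yields temperatures compatible with liquid water, which is the sense in which the Earth's position is a Goldilocks condition.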

Somewhat in contrast, Chaisson has maintained for well over a decade that "organizational complexity is mostly governed by the optimum use of energy—not too little as to starve a system, yet not too much as to destroy it" (italics in the original published paper[50]). In his view, neither maximum energy principles nor minimum entropy states are likely relevant, and appeals to "Goldilocks principles" (or other such fairy tales) are unnecessary to appreciate the emergence of complexity in nature writ large.

Other themes


Big Historians use information based on scientific techniques such as gene mapping to learn more about the origins of humanity.

Advances in particular sciences such as archaeology, gene mapping, and evolutionary ecology have enabled historians to gain new insights into the early origins of humans, despite the lack of written sources.[2] One account suggested that proponents of Big History were trying to "upend" the conventional practice in historiography of relying on written records.[2]

Big History proponents suggest that humans have been affecting climate change throughout history, by such methods as slash-and-burn agriculture, although past modifications have been on a lesser scale than in recent years during the Industrial Revolution.[2]

A 2008 book by Daniel Lord Smail suggested that history is a continuing process of humans learning to self-modify their mental states by using stimulants such as coffee and tobacco, as well as other means such as religious rites or romance novels.[15] His view is that culture and biology are deeply intertwined, such that cultural practices may cause human brains in different societies to be wired differently.[15]

Presentation by web-based interactive video


ChronoZoom is a free open source project that helps readers visualize time at all scales from the Big Bang 13.8 billion years ago to the present.

Big History is more likely than conventional history to be taught with interactive "video-heavy" websites rather than textbooks, according to one account.[10] The discipline has benefited from new ways of presenting its themes and concepts, often supplemented by Internet and computer technology.[1] For example, the ChronoZoom project is a way to explore the 14-billion-year history of the universe in an interactive website format.[9][51] It was described in one account:
ChronoZoom splays out the entirety of cosmic history in a web browser, where users can click into different epochs to learn about the events that have culminated to bring us to where we are today — in my case, sitting in an office chair writing about space. Eager to learn about the Stelliferous epoch? Click away, my fellow explorer. Curious about the formation of the earth? Jump into the "Earth and Solar System" section to see historian David Christian talk about the birth of our homeworld.
— TechCrunch, 2012[51]
In 2012, the History channel showed the film History of the World in Two Hours.[1][9] It showed how dinosaurs effectively dominated mammals for 160 million years until an asteroid impact wiped them out.[1] One report suggested the History channel had won a sponsorship from StanChart to develop a Big History program entitled Mankind.[52] In 2013 the History channel's new H2 network debuted the 10-part series Big History, narrated by Bryan Cranston and featuring David Christian and an assortment of historians, scientists and related experts.[53] Each episode centered on a major Big History topic such as salt, mountains, cold, flight, water, meteors and megastructures.

History of the field

Early efforts



Astronomer Carl Sagan

While Big History in its present form is generally seen as having emerged around 1990, it has numerous precedents going back more than 150 years. In the mid-19th century, Alexander von Humboldt's Cosmos and Robert Chambers' 1844 Vestiges of the Natural History of Creation were seen as early precursors to the field.[17] In a sense, Darwin's theory of evolution was itself an attempt to explain a biological phenomenon by examining long-term cause-and-effect processes. In the first half of the 20th century, the secular biologist Julian Huxley originated the term "evolutionary humanism",[1] while around the same time the French Jesuit paleontologist Pierre Teilhard de Chardin examined links between cosmic evolution and a tendency towards complexification (including human consciousness), envisaging compatibility between cosmology, evolution, and theology. In the mid and later 20th century, Jacob Bronowski's The Ascent of Man examined history from a multidisciplinary perspective. Later, Eric Chaisson explored the subject of cosmic evolution quantitatively in terms of energy rate density, and the astronomer Carl Sagan wrote Cosmos.[1] Thomas Berry, a cultural historian, and the academic Brian Swimme explored meaning behind myths and encouraged academics to explore themes beyond organized religion.[1]


The famous Earthrise photo may have stimulated, among other things, an interest in interdisciplinary studies.

The field continued to develop out of interdisciplinary studies during the mid-20th century, stimulated in part by the Cold War and the Space Race. Early efforts included courses in Cosmic Evolution at Harvard University in the United States and Universal History in the Soviet Union. One account suggested that the notable Earthrise photo, taken during a lunar orbit by the spacecraft Apollo 8, which showed Earth as a small blue and white ball behind a stark and desolate lunar landscape, not only stimulated the environmental movement but also caused an upsurge of interdisciplinary interest.[17] The French historian Fernand Braudel examined daily life with investigations of "large-scale historical forces like geology and climate".[20] The physiologist Jared Diamond, in his book Guns, Germs, and Steel, examined the interplay between geography and human evolution;[20] for example, he argued that the east-west orientation of the Eurasian continent enabled its civilizations to advance more quickly than those of the north-south oriented American continents, because the east-west axis allowed greater competition and information-sharing among peoples living in similar climates.

In the 1970s, scholars in the United States, including the geologist Preston Cloud of the University of Minnesota, the astronomer G. Siegfried Kutter at Evergreen State College in Washington state, and the Harvard University astrophysicists George B. Field and Eric Chaisson, started synthesizing knowledge to form a "science-based history of everything", although each emphasized his own specialization in his courses and books.[17] In 1980, the Austrian philosopher Erich Jantsch wrote The Self-Organizing Universe, which viewed history in terms of what he called "process structures".[17] John Mears taught an experimental course at Southern Methodist University in Dallas, Texas, and more formal courses at the university level began to appear.

In 1991 Clive Ponting wrote A Green History of the World: The Environment and the Collapse of Great Civilizations. His analysis did not begin with the Big Bang, but his chapter "Foundations of History" explored the influences of large-scale geological and astronomical forces over a broad time period.

The terms "Deep History" and "Big History" are sometimes used interchangeably, but "Deep History" can also refer simply to history going back several hundred thousand years or more, without the further sense of a movement within the discipline of history.[54][55]

David Christian

One exponent is David Christian of Macquarie University in Sydney, Australia.[56][57] He read widely in diverse fields of science and believed that much was missing from the general study of history. He offered his first university-level course in 1989,[17] developing a college course spanning from the Big Bang to the present[10] in collaboration with numerous colleagues from the sciences, humanities, and social sciences. This course eventually became a Teaching Company course entitled Big History: The Big Bang, Life on Earth, and the Rise of Humanity, with 24 hours of lectures,[1] which appeared in 2008.[9]

From the 1990s, other universities began to offer similar courses. In 1994, college courses were offered at the University of Amsterdam and the Eindhoven University of Technology.[17] In 1996, Fred Spier wrote The Structure of Big History.[17] Spier looked at structured processes, which he termed "regimes":
I defined a regime in its most general sense as 'a more or less regular but ultimately unstable pattern that has a certain temporal permanence', a definition which can be applied to human cultures, human and non-human physiology, non-human nature, as well as to organic and inorganic phenomena at all levels of complexity. By defining 'regime' in this way, human cultural regimes thus became a subcategory of regimes in general, and the approach allowed me to look systematically at interactions among different regimes which together produce big history.
— Fred Spier, 2008[17]
Christian's course caught the attention of philanthropist Bill Gates, who discussed with him how to turn Big History into a high school-level course. Gates said about David Christian:
He really blew me away. Here's a guy who's read across the sciences, humanities, and social sciences and brought it together in a single framework. It made me wish that I could have taken big history when I was young, because it would have given me a way to think about all of the school work and reading that followed. In particular, it really put the sciences in an interesting historical context and explained how they apply to a lot of contemporary concerns.
— Bill Gates, in 2012[4]

Educational courses

By 2002, a dozen college courses on Big History had sprung up around the world.[20] Cynthia Stokes Brown initiated Big History at the Dominican University of California, and she wrote Big History: From the Big Bang to the Present.[58] In 2010, Dominican University of California launched the world's first Big History program to be required of all first-year students, as part of the school's general education track. This program, directed by Mojgan Behmand, includes a one-semester survey of Big History, and an interdisciplinary second-semester course exploring the Big History metanarrative through the lens of a particular discipline or subject.[9][59] A course description reads:
Welcome to First Year Experience Big History at Dominican University of California. Our program invites you on an immense journey through time, to witness the first moments of our universe, the birth of stars and planets, the formation of life on Earth, the dawn of human consciousness, and the ever-unfolding story of humans as Earth's dominant species. Explore the inevitable question of what it means to be human and our momentous role in shaping possible futures for our planet.
— course description 2012[60]
The Dominican faculty's approach is to synthesize the disparate threads of Big History thought, in order to teach the content, develop critical thinking and writing skills, and prepare students to wrestle with the philosophical implications of the Big History metanarrative. In 2015, University of California Press published Teaching Big History, a comprehensive pedagogical guide for teaching Big History, edited by Richard B. Simon, Mojgan Behmand, and Thomas Burke, and written by the Dominican faculty.[61]


Big History is taught at the University of Southern Maine.

Barry Rodrigue, at the University of Southern Maine, established the first general education course and the first online version, which has drawn students from around the world.[16] The University of Queensland in Australia offers an undergraduate course entitled Global History, required for all history majors, which "surveys how powerful forces and factors at work on large time-scales have shaped human history". By 2011, 50 professors around the world had offered courses. In 2012, one report suggested that Big History was being practiced as a "coherent form of research and teaching" by hundreds of academics from different disciplines.[9]


Philanthropist Bill Gates is a major advocate of encouraging instruction in Big History.

There are efforts to bring Big History to younger students.[3] In 2008, Christian and his colleagues began developing a course for secondary school students.[17] In 2011, a pilot high school course was taught to 3,000 students in 50 high schools worldwide.[10] By 2012, 87 schools, 50 of them in the United States, were teaching Big History to ninth- and tenth-grade students, with the pilot program set to double in 2013,[4] and the course had even reached one middle school.[62] At one high school, the subject is taught as a STEM course.[63][64]

There are initiatives to make Big History a required standard course for university students throughout the world. An education project funded personally by philanthropist Bill Gates was launched in Australia and the United States to offer a free online version of the course to high school students.[7]

International Big History Association


Founding members of the International Big History Association gathered at Coldigioco, Italy in 2010

The International Big History Association (IBHA) was founded at the Coldigioco Geological Observatory in Coldigioco, Marche, Italy, on 20 August 2010.[65] Its headquarters is located at Grand Valley State University in Allendale, Michigan, United States. Its inaugural gathering in 2012 was described as "big news" in a report in The Huffington Post.[1]

Notable people involved

Academics involved with the concept include David Christian, Fred Spier, Cynthia Stokes Brown, Eric Chaisson, and Barry Rodrigue.[9]
