Tuesday, September 15, 2020

Big History

From Wikipedia, the free encyclopedia
 
A diagram of the Big Bang expansion according to NASA
 
Artist's depiction of the WMAP satellite gathering data to help scientists understand the Big Bang

Big History is an academic discipline which examines history from the Big Bang to the present. Big History resists specialization and searches for universal patterns or trends. It examines long time frames through a multidisciplinary approach that combines numerous disciplines from science and the humanities, and explores human existence in the context of this bigger picture. It integrates studies of the cosmos, Earth, life, and humanity, using empirical evidence to explore cause-and-effect relations, and is taught at universities as well as primary and secondary schools, often using web-based interactive presentations.

Historian David Christian has been credited with coining the term "Big History" while teaching one of the first such courses at Macquarie University. An all-encompassing study of humanity's relationship to cosmology and natural history has been pursued by scholars since the Renaissance, and the new field, Big History, continues such work.

Comparison with conventional history

Conventional history | Big History
5000 BCE to present | Big Bang to present
7,000–10,000 years | 13.8 billion years
Compartmentalized fields of study | Interdisciplinary approach
Focus on human civilization | Focus on how humankind fits within the universe
Taught mostly with books | Taught on interactive platforms such as Coursera, YouTube's Crash Course, the Big History Project, Macquarie University, and ChronoZoom
Microhistory | Macrohistory
Focus on trends, processes | Focus on analogy, metaphor
Based on a variety of documents, including written records and material artifacts | Based on current knowledge about phenomena such as fossils, ecological changes, genetic analysis, and telescope data, in addition to conventional historical data

Big History examines the past on multiple time scales, from the Big Bang to modernity, unlike conventional history courses, which typically begin with the introduction of farming and civilization or with the beginning of written records. It explores common themes and patterns. Courses generally do not focus on humans until one-third to halfway through, and, unlike conventional history courses, they give little attention to kingdoms, civilizations, wars, or national borders. Whereas conventional history places human civilization at the center, Big History focuses on the universe, placing human history in the wider context of the universe's history and showing how humankind fits within that framework.

Conventional history often begins with the development of agriculture in civilizations such as Ancient Egypt.
 
Control of fire by early humans predating both agriculture and civilization

Unlike conventional history, Big History tends to go rapidly through detailed historical eras such as the Renaissance or Ancient Egypt. It draws on the latest findings from biology, astronomy, geology, climatology, prehistory, archaeology, anthropology, evolutionary biology, chemistry, psychology, hydrology, geography, paleontology, ancient history, physics, economics, cosmology, natural history, and population and environmental studies as well as standard history. One teacher explained:

We're taking the best evidence from physics and the best evidence from chemistry and biology, and we're weaving it together into a story ... They're not going to learn how to balance [chemical] equations, but they're going to learn how the chemical elements came out of the death of stars, and that's really interesting.

Big History arose from a desire to go beyond the specialized and self-contained fields that emerged in the 20th century. It tries to grasp history as a whole, looking for common themes across multiple time scales. Conventional history typically begins with the invention of writing and is limited to past events relating directly to the human race. Big Historians point out that this limits study to the past 5,000 years and neglects the much longer time during which humans have existed on Earth. Henry Kannberg sees Big History as a product of the Information Age, a stage in history itself following speech, writing, and printing. Big History covers the formation of the universe, stars, and galaxies, and includes the beginning of life as well as the several hundred thousand years when humans were hunter-gatherers. It sees the transition to civilization as a gradual one, with many causes and effects, rather than an abrupt transformation from uncivilized static cavemen to dynamic civilized farmers. An account in The Boston Globe describes what it polemically asserts to be the conventional "history" view:

Early humans were slump-shouldered, slope-browed, hairy brutes. They hunkered over campfires and ate scorched meat. Sometimes they carried spears. Once in a while they scratched pictures of antelopes on the walls of their caves. That's what I learned during elementary school, anyway. History didn't start with the first humans—they were cavemen! The Stone Age wasn't history; the Stone Age was a preamble to history, a dystopian era of stasis before the happy onset of civilization, and the arrival of nifty developments like chariot wheels, gunpowder, and Google. History started with agriculture, nation-states, and written documents. History began in Mesopotamia's Fertile Crescent, somewhere around 4000 BC. It began when we finally overcame our savage legacy, and culture surpassed biology.

Big History, in contrast to conventional history, has a more interdisciplinary basis. Advocates sometimes view conventional history as "microhistory" or "shallow history", and note that three-quarters of historians specialize in the last 250 years, ignoring the "long march of human existence." However, one historian disputed that the discipline of history has overlooked the big view, and described the "grand narrative" of Big History as a "cliché that gets thrown around a lot." One account suggested that conventional history had the "sense of grinding the nuts into an ever finer powder." Big History instead emphasizes long-term trends and processes rather than history-making individuals or events. Historian Dipesh Chakrabarty of the University of Chicago suggested that Big History is less politicized than contemporary history because it enables people to "take a step back." It also uses more kinds of evidence than the standard historical written records, such as fossils, tools, household items, pictures, structures, ecological changes, and genetic variations.

Criticism of Big History

Critics of Big History, including sociologist Frank Furedi, have deemed the discipline an "anti-humanist turn of history." The Big History narrative has also been challenged for failing to engage with the methodology of the conventional history discipline. According to historian and educator Sam Wineburg of Stanford University, Big History eschews the interpretation of texts in favor of a purely scientific approach, thus becoming "less history and more of a kind of evolutionary biology or quantum physics." Others have responded that such criticisms, that Big History removes the human element or does not follow a historical methodology, come from observers who have not looked closely at what Big History actually does: most courses devote one-third to one-half of their time to humanity; the concept of increasing complexity gives humanity an important place; and the methods of the natural sciences are themselves historical, since they too gather evidence in order to craft a narrative.

Themes

Radiometric dating helps scientists determine the age of rocks as well as the Earth and the Solar System.

Big History seeks to retell the "human story" in light of scientific advances by such methods as radiocarbon dating, genetic analysis, and thermodynamic measurements of "free energy rate density", along with a host of methods employed in archaeology, anthropology, and world history. David Christian of Macquarie University has argued that the recent past is only understandable in terms of the "whole 14-billion-year span of time itself." David Baker of Macquarie University has pointed out that not only do the physical principles of energy flows and complexity connect human history to the very start of the Universe, but the broadest view of human history may also supply the discipline of history with a "unifying theme" in the form of the concept of collective learning. Big History also explores the mix of individual action and social and environmental forces, according to one view. Big History seeks to discover repeating patterns during the 13.8 billion years since the Big Bang and explore the core transdisciplinary theme of increasing complexity as described by Eric Chaisson of Harvard University.
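Radiocarbon dating, one of the methods mentioned above, rests on simple exponential-decay arithmetic: the age of a sample follows from the fraction of its original carbon-14 that remains. The sketch below is illustrative (the function name is hypothetical; the conventional 5,730-year half-life of carbon-14 is assumed):

```python
import math

C14_HALF_LIFE_YEARS = 5730  # conventional half-life of carbon-14

def radiocarbon_age(remaining_fraction):
    """Estimate a sample's age in years from the fraction of C-14 remaining.

    Useful only out to roughly 50,000 years, after which too little
    C-14 survives to measure; older objects need other radiometric methods.
    """
    if not 0 < remaining_fraction <= 1:
        raise ValueError("remaining_fraction must be in (0, 1]")
    decay_constant = math.log(2) / C14_HALF_LIFE_YEARS
    return math.log(1 / remaining_fraction) / decay_constant

# A sample retaining half of its original C-14 is exactly one half-life old:
print(round(radiocarbon_age(0.5)))   # 5730
print(round(radiocarbon_age(0.25)))  # 11460
```

Because each halving adds one fixed half-life, a quarter-strength sample is two half-lives old, which is why the method's reach is capped at a few tens of thousands of years.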

Time scales and questions

Big History makes comparisons based on different time scales and notes similarities and differences between the human, geological, and cosmological scales. David Christian believes such "radical shifts in perspective" will yield "new insights into familiar historical problems, from the nature/nurture debate to environmental history to the fundamental nature of change itself." It shows how human existence has been shaped by both human-made and natural factors: for example, through natural processes that occurred more than four billion years ago, iron emerged from the remains of an exploding star, and as a result humans could use this hard metal to forge weapons for hunting and war. The discipline addresses such questions as "How did we get here?," "How do we decide what to believe?," "How did Earth form?," and "What is life?" According to Fred Spier, it offers a "grand tour of all the major scientific paradigms" and helps students to become scientifically literate quickly. One interesting perspective that arises from Big History is that, despite the vast temporal and spatial scales of the history of the Universe, most of the "history" happens in very small pockets of the cosmos, owing to the nature of complexity.

Cosmic evolution

Cosmic evolution, the scientific study of universal change, is closely related to Big History (as are the allied subjects of the epic of evolution and astrobiology); some researchers regard cosmic evolution as broader than Big History since the latter mainly (and rightfully) examines the specific historical trek from Big Bang → Milky Way → Sun → Earth → humanity. Cosmic evolution, while fully addressing all complex systems (and not merely those that led to humans) has been taught and researched for decades, mostly by astronomers and astrophysicists. This Big-Bang-to-humankind scenario well preceded the subject that some historians began calling Big History in the 1990s. Cosmic evolution is an intellectual framework that offers a grand synthesis of the many varied changes in the assembly and composition of radiation, matter, and life throughout the history of the universe. While engaging the time-honored queries of who we are and whence we came, this interdisciplinary subject attempts to unify the sciences within the entirety of natural history—a single, inclusive scientific narrative of the origin and evolution of all material things over ~14 billion years, from the origin of the universe to the present day on Earth.

The roots of the idea of cosmic evolution extend back millennia. Ancient Greek philosophers of the fifth century BCE, most notably Heraclitus, are celebrated for their reasoned claims that all things change. Early modern speculation about cosmic evolution began more than a century ago, including the broad insights of Robert Chambers, Herbert Spencer, and Lawrence Henderson. Only in the mid-20th century was the cosmic-evolutionary scenario articulated as a research paradigm to include empirical studies of galaxies, stars, planets, and life—in short, an expansive agenda that combines physical, biological, and cultural evolution. Harlow Shapley widely articulated the idea of cosmic evolution (often calling it "cosmography") in public venues at mid-century, and NASA embraced it in the late 20th century as part of its more limited astrobiology program. Carl Sagan, Eric Chaisson, Hubert Reeves, Erich Jantsch, and Preston Cloud, among others, extensively championed cosmic evolution at roughly the same time around 1980. This extremely broad subject now continues to be richly formulated as both a technical research program and a scientific worldview for the 21st century.

One popular collection of scholarly materials on cosmic evolution is based on teaching and research that has been underway at Harvard University since the mid-1970s.

Complexity, energy, thresholds

Cosmic evolution is a quantitative subject, whereas big history typically is not; cosmic evolution is practiced mostly by natural scientists, while big history is practiced mostly by scholars in the humanities and social sciences. The two subjects, closely allied and overlapping, benefit from each other. Cosmic evolutionists tend to treat universal history linearly, so humankind enters their story only at the most recent times, whereas big historians tend to stress humanity and its many cultural achievements, granting human beings a larger part of their story. One can compare and contrast these different emphases by watching two short films portraying the Big-Bang-to-humankind narrative, one animating time linearly and the other capturing time (actually look-back time) logarithmically; in the former, humans enter the 14-minute film in the last second, while in the latter we appear much earlier. Both are correct.

These different treatments of time over ~14 billion years, each with different emphases on historical content, are further clarified by noting that some cosmic evolutionists divide the whole narrative into three phases and seven epochs:

Phases: physical evolution → biological evolution → cultural evolution
Epochs: particulate → galactic → stellar → planetary → chemical → biological → cultural

This contrasts with the approach used by some big historians, who divide the narrative into many more thresholds, as noted later in this section. Yet another telling of the Big-Bang-to-humankind story is one that emphasizes the earlier universe, particularly the growth of particles, galaxies, and large-scale cosmic structure, as in physical cosmology.

Notable among quantitative efforts to describe cosmic evolution are Eric Chaisson's research efforts to describe the concept of energy flow through open, thermodynamic systems, including galaxies, stars, planets, life, and society. The observed increase of energy rate density (energy/time/mass) among a whole host of complex systems is one useful way to explain the rise of complexity in an expanding universe that still obeys the cherished second law of thermodynamics and thus continues to accumulate net entropy. As such, ordered material systems—from buzzing bees and redwood trees to shining stars and thinking beings—are viewed as temporary, local islands of order in a vast, global sea of disorder. A recent review article, which is especially directed toward big historians, summarizes much of this empirical effort over the past decade.

One striking finding of such complexity studies is the apparently ranked order among all known material systems in the universe. Although the absolute energy in astronomical systems greatly exceeds that of humans, and although the mass densities of stars, planets, bodies, and brains are all comparable, the energy rate density for humans and modern human society is approximately a million times greater than for stars and galaxies. For example, the Sun emits a vast luminosity, 4×10³³ erg/s (equivalent to nearly a billion billion billion watt light bulb), but it also has a huge mass, 2×10³³ g; thus each second an amount of energy equaling only 2 ergs passes through each gram of this star. In contrast to any star, more energy flows through each gram of a plant's leaf during photosynthesis, and much more (nearly a million times) rushes through each gram of a human brain while thinking (~20 W/1350 g).
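Energy rate density is a simple ratio, so the figures quoted above can be checked in a few lines. The helper function below is illustrative only (the unit conversion 1 W = 10⁷ erg/s is standard):

```python
WATT_TO_ERG_PER_S = 1e7  # 1 watt = 10^7 erg/s

def energy_rate_density(power_erg_per_s, mass_g):
    """Chaisson's metric: energy flow per unit time per unit mass (erg/s/g)."""
    return power_erg_per_s / mass_g

# The Sun: luminosity ~4x10^33 erg/s spread over ~2x10^33 g of mass.
sun = energy_rate_density(4e33, 2e33)                       # ~2 erg/s/g

# A thinking human brain: ~20 W flowing through ~1350 g of tissue.
brain = energy_rate_density(20 * WATT_TO_ERG_PER_S, 1350)   # ~1.5x10^5 erg/s/g

print(sun, round(brain))
```

Despite the Sun's overwhelming absolute output, the brain's energy rate density comes out vastly higher, which is the point of the ranked ordering described in the text.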

Cosmic evolution is more than a subjective, qualitative assertion of "one damn thing after another". This inclusive scientific worldview constitutes an objective, quantitative approach toward deciphering much of what comprises organized, material Nature. Its uniform, consistent philosophy of approach toward all complex systems demonstrates that the basic differences, both within and among many varied systems, are of degree, not of kind. And, in particular, it suggests that optimal ranges of energy rate density grant opportunities for the evolution of complexity; those systems able to adjust, adapt, or otherwise take advantage of such energy flows survive and prosper, while other systems adversely affected by too much or too little energy are non-randomly eliminated.

Fred Spier is foremost among those big historians who have found the concept of energy flows useful, suggesting that Big History is the rise and demise of complexity on all scales, from sub-microscopic particles to vast galaxy clusters, and not least many biological and cultural systems in between.

David Christian, in an 18-minute TED talk, described some of the basics of the Big History course. Christian describes each stage in the progression towards greater complexity as a "threshold moment" in which things become more complex, but also more fragile and mobile. Some of Christian's threshold stages are:

In a supernova, a star which has exhausted most of its fuel bursts in an incredible explosion, creating conditions for heavier elements such as iron and gold to form.
  1. The universe appears, incredibly hot, exponentially expanding within a second.
  2. Stars are born.
  3. Stars die, creating temperatures hot enough to make complex chemicals, as well as rocks, asteroids, planets, moons, and our solar system.
  4. Earth is created.
  5. Life appears on Earth, with molecules growing from the Goldilocks conditions, with neither too much nor too little energy.
  6. Humans appear, bringing language and collective learning.

Christian elaborated that more complex systems are more fragile, and that while collective learning is a powerful force to advance humanity in general, it is not clear that humans are in charge of it, and it is possible in his view for humans to destroy the biosphere with the powerful weapons that have been invented.

In the 2008 lecture series through The Teaching Company's Great Courses entitled Big History: The Big Bang, Life on Earth, and the Rise of Humanity, Christian explains Big History in terms of eight thresholds of increasing complexity:

  1. The Big Bang and the creation of the Universe about 14 billion years ago
  2. The creation of the first complex objects, stars, about 12 billion years ago
  3. The creation of chemical elements inside dying stars, required for chemically complex objects, including plants and animals
  4. The formation of planets, such as our Earth, which are more chemically complex than the Sun
  5. The origin and evolution of life, from roughly 4.2 billion years ago, including the evolution of our hominine ancestors
  6. The development of our species, Homo sapiens, about 250,000 years ago, covering the Paleolithic era of human history
  7. The appearance of agriculture about 11,000 years ago in the Neolithic era, allowing for larger, more complex societies
  8. The "modern revolution", or the vast social, economic, and cultural transformations that brought the world into the modern era
  9. Looking ahead: what will happen in the future, and what the next threshold in our history might be

Goldilocks conditions

The Earth is ideally located in a Goldilocks condition—being neither too close nor too distant from the Sun.

A theme in Big History is what has been termed Goldilocks conditions or the Goldilocks principle, which describes how "circumstances must be right for any type of complexity to form or continue to exist," as emphasized by Spier in his recent book. For humans, bodily temperatures can neither be too hot nor too cold; for life to form on a planet, it can neither have too much nor too little energy from sunlight. Stars require sufficient quantities of hydrogen, sufficiently packed together under tremendous gravity, to cause nuclear fusion.

Christian suggests that the universe creates complexity when these Goldilocks conditions are met, that is, when things are not too hot or cold, not too fast or slow. For example, life began not in solids (molecules are stuck together, preventing the right kinds of associations) or gases (molecules move too fast to enable favorable associations) but in liquids such as water that permitted the right kinds of interactions at the right speeds.

Somewhat in contrast, Chaisson has maintained for well over a decade that "organizational complexity is mostly governed by the optimum use of energy—not too little as to starve a system, yet not too much as to destroy it". Neither maximum-energy principles nor minimum-entropy states are likely to be relevant for appreciating the emergence of complexity in Nature writ large.

Other themes

Big Historians use information based on scientific techniques such as gene mapping to learn more about the origins of humanity.

Advances in particular sciences such as archaeology, gene mapping, and evolutionary ecology have enabled historians to gain new insights into the early origins of humans, despite the lack of written sources. One account suggested that proponents of Big History were trying to "upend" the conventional practice in historiography of relying on written records.

Big History proponents suggest that humans have been affecting climate change throughout history, by such methods as slash-and-burn agriculture, although past modifications have been on a lesser scale than in recent years during the Industrial Revolution.

A 2008 book by Daniel Lord Smail suggested that history is a continuing process of humans learning to self-modify their mental states by using stimulants such as coffee and tobacco, as well as other means such as religious rites or romance novels. His view is that culture and biology are highly intertwined, such that cultural practices may cause human brains in different societies to be wired differently.

Another theme that has been actively discussed recently by the Big History community is the issue of the Big History Singularity.

Presentation by web-based interactive video

ChronoZoom is a free open source project that helps readers visualize time at all scales from the Big Bang 13.8 billion years ago to the present.

Big History is more likely than conventional history to be taught with interactive "video-heavy" websites without textbooks, according to one account. The discipline has benefited from having new ways of presenting themes and concepts in new formats, often supplemented by Internet and computer technology. For example, the ChronoZoom project is a way to explore the 14 billion year history of the universe in an interactive website format. It was described in one account:

ChronoZoom splays out the entirety of cosmic history in a web browser, where users can click into different epochs to learn about the events that have culminated to bring us to where we are today—in my case, sitting in an office chair writing about space. Eager to learn about the Stelliferous epoch? Click away, my fellow explorer. Curious about the formation of the earth? Jump into the "Earth and Solar System" section to see historian David Christian talk about the birth of our homeworld.

— TechCrunch, 2012

In 2012, the History channel showed the film History of the World in Two Hours. It showed how dinosaurs effectively dominated mammals for 160 million years until an asteroid impact wiped them out. One report suggested the History channel had won a sponsorship from StanChart to develop a Big History program entitled Mankind. In 2013 the History channel's new H2 network debuted the 10-part series Big History, narrated by Bryan Cranston and featuring David Christian and an assortment of historians, scientists and related experts. Each episode centered on a major Big History topic such as salt, mountains, cold, flight, water, meteors and megastructures.

History of the field

Early efforts

Astronomer Carl Sagan

While the emerging field of Big History in its present state is generally seen as having emerged in the past two decades beginning around 1990, there have been numerous precedents going back almost 150 years. In the mid-19th century, Alexander von Humboldt's book Cosmos and Robert Chambers' 1844 book Vestiges of the Natural History of Creation were seen as early precursors to the field. In a sense, Darwin's theory of evolution was, in itself, an attempt to explain a biological phenomenon by examining longer-term cause-and-effect processes. In the first half of the 20th century, the secular biologist Julian Huxley originated the term "evolutionary humanism", while around the same time the French Jesuit paleontologist Pierre Teilhard de Chardin examined links between cosmic evolution and a tendency towards complexification (including human consciousness), envisaging compatibility between cosmology, evolution, and theology. In the mid-to-late 20th century, The Ascent of Man by Jacob Bronowski examined history from a multidisciplinary perspective. Later, Eric Chaisson explored the subject of cosmic evolution quantitatively in terms of energy rate density, and the astronomer Carl Sagan wrote Cosmos. Thomas Berry, a cultural historian, and the academic Brian Swimme explored the meaning behind myths and encouraged academics to explore themes beyond organized religion.

The famous 1968 Earthrise photo, taken by astronaut William Anders, may have stimulated, among other things, an interest in interdisciplinary studies.

The field continued to evolve from interdisciplinary studies during the mid-20th century, stimulated in part by the Cold War and the Space Race. Some early efforts were courses in Cosmic Evolution at Harvard University in the United States and Universal History in the Soviet Union. One account suggested that the notable Earthrise photo, taken by William Anders during a lunar orbit by Apollo 8, which showed Earth as a small blue and white ball behind a stark and desolate lunar landscape, not only stimulated the environmental movement but also caused an upsurge of interdisciplinary interest. The French historian Fernand Braudel examined daily life with investigations of "large-scale historical forces like geology and climate". The physiologist Jared Diamond, in his book Guns, Germs, and Steel, examined the interplay between geography and human evolution; for example, he argued that the east-west orientation of the Eurasian continent enabled human civilizations there to advance more quickly than those of the north-south-oriented American continents, because it allowed greater competition and information-sharing among peoples living in similar climates.

In the 1970s, scholars in the United States, including the geologist Preston Cloud of the University of Minnesota, the astronomer G. Siegfried Kutter at Evergreen State College in Washington state, and the Harvard University astrophysicists George B. Field and Eric Chaisson, started synthesizing knowledge to form a "science-based history of everything", although each of these scholars emphasized his own particular specialization in his courses and books. In 1980, the Austrian philosopher Erich Jantsch wrote The Self-Organizing Universe, which viewed history in terms of what he called "process structures". An experimental course was taught by John Mears at Southern Methodist University in Dallas, Texas, and more formal courses at the university level began to appear.

In 1991 Clive Ponting wrote A Green History of the World: The Environment and the Collapse of Great Civilizations. His analysis did not begin with the Big Bang, but his chapter "Foundations of History" explored the influences of large-scale geological and astronomical forces over a broad time period.

The terms "Deep History" and "Big History" are sometimes used interchangeably, but "Deep History" sometimes refers simply to history going back several hundred thousand years or more, without the further sense of being a movement within the discipline of history itself.

David Christian

One exponent is David Christian of Macquarie University in Sydney, Australia. He read widely in diverse scientific fields and believed that much was missing from the general study of history. His first university-level course was offered in 1989; he developed a college course spanning the Big Bang to the present, collaborating with numerous colleagues from the sciences, the humanities, and the social sciences. This course eventually became a Teaching Company course entitled Big History: The Big Bang, Life on Earth, and the Rise of Humanity, with 24 hours of lectures, which appeared in 2008.

From the 1990s onward, other universities began to offer similar courses. In 1994, college courses were offered at the University of Amsterdam and the Eindhoven University of Technology. In 1996, Fred Spier wrote The Structure of Big History. Spier looked at structured processes, which he termed "regimes":

I defined a regime in its most general sense as 'a more or less regular but ultimately unstable pattern that has a certain temporal permanence', a definition which can be applied to human cultures, human and non-human physiology, non-human nature, as well as to organic and inorganic phenomena at all levels of complexity. By defining 'regime' in this way, human cultural regimes thus became a subcategory of regimes in general, and the approach allowed me to look systematically at interactions among different regimes which together produce big history.

— Fred Spier, 2008

Christian's course caught the attention of philanthropist Bill Gates, who discussed with him how to turn Big History into a high school-level course. Gates said about David Christian:

He really blew me away. Here's a guy who's read across the sciences, humanities, and social sciences and brought it together in a single framework. It made me wish that I could have taken big history when I was young, because it would have given me a way to think about all of the school work and reading that followed. In particular, it really put the sciences in an interesting historical context and explained how they apply to a lot of contemporary concerns.

— Bill Gates, in 2012

Educational courses

By 2002, a dozen college courses on Big History had sprung up around the world. Cynthia Stokes Brown initiated Big History at the Dominican University of California, and she wrote Big History: From the Big Bang to the Present. In 2010, Dominican University of California launched the world's first Big History program to be required of all first-year students, as part of the school's general education track. This program, directed by Mojgan Behmand, includes a one-semester survey of Big History, and an interdisciplinary second-semester course exploring the Big History metanarrative through the lens of a particular discipline or subject. A course description reads:

Welcome to First Year Experience Big History at Dominican University of California. Our program invites you on an immense journey through time, to witness the first moments of our universe, the birth of stars and planets, the formation of life on Earth, the dawn of human consciousness, and the ever-unfolding story of humans as Earth's dominant species. Explore the inevitable question of what it means to be human and our momentous role in shaping possible futures for our planet.

— course description 2012

The Dominican faculty's approach is to synthesize the disparate threads of Big History thought, in order to teach the content, develop critical thinking and writing skills, and prepare students to wrestle with the philosophical implications of the Big History metanarrative. In 2015, University of California Press published Teaching Big History, a comprehensive pedagogical guide for teaching Big History, edited by Richard B. Simon, Mojgan Behmand, and Thomas Burke, and written by the Dominican faculty.

Big History is taught at the University of Southern Maine.

Barry Rodrigue, at the University of Southern Maine, established the first general education course and the first online version, which has drawn students from around the world. The University of Queensland in Australia offers an undergraduate course entitled Global History, required for all history majors, which "surveys how powerful forces and factors at work on large time-scales have shaped human history". By 2011, 50 professors around the world had offered Big History courses. In 2012, one report suggested that Big History was being practiced as a "coherent form of research and teaching" by hundreds of academics from different disciplines.

Philanthropist Bill Gates is a major advocate of encouraging instruction in Big History.

There are efforts to bring Big History to younger students. In 2008, Christian and his colleagues began developing a course for secondary school students. In 2011, a pilot high school course was taught to 3,000 students in 50 high schools worldwide. In 2012, 87 schools, including 50 in the United States, were teaching Big History, and the pilot program was set to double in 2013, reaching students in the ninth and tenth grades and even one middle school. At one high school, the subject is taught as a STEM course.

There are initiatives to make Big History a required standard course for university students throughout the world. An education project funded from the personal funds of philanthropist Bill Gates was launched in Australia and the United States to offer a free online version of the course to high school students.

International Big History Association

Founding members of the International Big History Association gathered at Coldigioco, Italy in 2010

The International Big History Association (IBHA) was founded at the Coldigioco Geological Observatory in Coldigioco, Marche, Italy, on 20 August 2010. Its headquarters is located at Grand Valley State University in Allendale, Michigan, United States. Its inaugural gathering in 2012 was described as "big news" in a report in The Huffington Post.

People involved

Some notable academics involved with the concept include:

Web-based simulation


Web-based simulation (WBS) is the invocation of computer simulation services over the World Wide Web, specifically through a web browser. Increasingly, the web is being looked upon as an environment for providing modeling and simulation applications, and as such, is an emerging area of investigation within the simulation community.

Application

Web-based simulation is used in several contexts:

  • In e-learning, various principles can quickly be illustrated to students by means of interactive computer animations, for example during lecture demonstrations and computer exercises.
  • In distance learning, web-based simulation may provide an alternative to installing expensive simulation software on the student computer, or an alternative to expensive laboratory equipment.
  • In software engineering, web-based emulation allows application development and testing on one platform for other target platforms, for example for various mobile operating systems or mobile web browsers, without the need of target hardware or locally installed emulation software.
  • In online computer games, 3D environments can be simulated, and old home computers and video game consoles can be emulated, allowing the user to play old computer games in the web browser.
  • In medical education, nurse education and allied health education (like sonographer training), web-based simulations can be used for learning and practicing clinical healthcare procedures. Web-based procedural simulations emphasize the cognitive elements such as the steps of the procedure, the decisions, the tools/devices to be used, and the correct anatomical location.

Client-side vs server-side approaches

Web-based simulation can take place either on the server side or on the client side. In server-side simulation, the numerical calculations and visualization (generation of plots and other computer graphics) are carried out on the web server, while the interactive graphical user interface (GUI) is often partly provided on the client side, for example using server-side scripting such as PHP or CGI scripts, interactive services based on Ajax, or conventional application software accessed remotely through a VNC Java applet.

In client-side simulation, the simulation program is downloaded from the server but executed entirely on the client side, for example using Java applets, Flash animations, JavaScript, or a mathematical software viewer plug-in. Server-side simulation does not scale well to many simultaneous users, but places fewer demands on the user's computer and web-browser plug-ins than client-side simulation.

The term on-line simulation sometimes refers to server-side web-based simulation, sometimes to symbiotic simulation, i.e. a simulation that interacts in real-time with a physical system.

Emerging cloud-computing technologies can be used for new server-side simulation approaches. For instance, multi-agent simulation applications can be deployed on independent cloud-computing instances, allowing simulations to be highly scalable.

Existing tools

Environmental isotopes


The environmental isotopes are a subset of the isotopes, both stable and radioactive, which are the object of isotope geochemistry. They are primarily used as tracers to see how things move around within the ocean-atmosphere system, within terrestrial biomes, within the Earth's surface, and between these broad domains.

Isotope Geochemistry

Chemical elements are defined by their number of protons, but the mass of an atom is determined by the numbers of both protons and neutrons in the nucleus. Isotopes are atoms of the same element that have different numbers of neutrons and thus different masses. In any given sample, the ratio between two isotopes of an element can be measured. Because this ratio varies only slightly around the world, changes in isotope ratios are reported as deviations from a standard, multiplied by 1000; the resulting unit is the "per mil" (‰). By convention, the ratio is that of the heavier isotope to the lighter isotope.
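As a rough sketch of the per-mil convention described above, the delta value of a sample can be computed from two ratios (the VSMOW 18O/16O reference ratio used here is a commonly quoted value; the sample ratio is purely illustrative):

```python
# Convert an isotope ratio to delta notation (per mil) relative to a standard.
# The VSMOW 18O/16O reference ratio (~0.0020052) is a published value;
# the sample ratio below is purely illustrative.

def delta_per_mil(r_sample: float, r_standard: float) -> float:
    """delta = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

R_VSMOW = 0.0020052  # 18O/16O of Vienna Standard Mean Ocean Water

# A sample slightly depleted in the heavy isotope gives a negative delta:
sample_ratio = 0.0019852
print(round(delta_per_mil(sample_ratio, R_VSMOW), 2))  # about -9.97
```

A negative delta value means the sample is depleted in the heavy isotope relative to the standard.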

These variations in isotope ratios arise through many types of fractionation, generally classified as mass-independent or mass-dependent. An example of a mass-independent process is the fractionation of oxygen isotopes during ozone formation, which cannot be explained by mass differences alone. An example of a mass-dependent process is the fractionation of water as it transitions from the liquid to the gas phase: through the kinetic isotope effect (KIE), in which molecules of different isotopic masses react and move at different speeds, water molecules containing heavier isotopes (18O and 2H) tend to remain in the liquid phase while molecules with lighter isotopes (16O and 1H) preferentially enter the gas phase.

Of the different isotopes that exist, one common classification distinguishes radioactive isotopes from stable isotopes. Radioactive isotopes decay into a different isotope; for example, 3H (tritium) is a radioactive isotope of hydrogen that decays into 3He with a half-life of ~12.3 years. Stable isotopes, by comparison, decay so slowly, if at all, that no decay has been observed; examples are 86Sr and 87Sr, strontium isotopes whose half-lives are either on the order of billions of years or unmeasured because of their stability. On the timescales that geologists and environmental scientists investigate, these isotopes are effectively stable. Both types of isotopes are useful to scientists: radioactive isotopes are generally more useful on shorter timescales, such as investigating the modern circulation of the ocean using 14C, while stable isotopes are generally more useful on longer timescales, such as investigating differences in river flow with stable strontium isotopes.

These isotopes are used as tracers to study various phenomena of interest. Tracers have particular spatial distributions, and scientists must deconvolve the different processes that shape them. One way tracer distributions are set is by conservative mixing, in which the total amount of the tracer is conserved. An example is the mixing of two water masses with different salinities: salt moves from the saltier water mass to the less salty one, but the total amount of salt is unchanged. Conservative mixing is important because it provides a baseline for the value a tracer should take: the value at a point is expected to be the weighted average of the sources that flow into that region. Deviations from this baseline indicate nonconservative processes, which do not conserve the amount of tracer. An example is δ14C, which mixes between water masses but also decays over time, reducing the amount of 14C in a region.
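The conservative-mixing baseline described above can be sketched as a simple weighted average of the end-member values (the salinities and mixing fraction below are illustrative, not measured):

```python
# Two-end-member conservative mixing: the tracer value of a mixture is the
# fraction-weighted average of the source water masses. Values illustrative.

def conservative_mix(c1: float, c2: float, f1: float) -> float:
    """Tracer value when a fraction f1 comes from end-member 1
    and (1 - f1) from end-member 2."""
    return f1 * c1 + (1.0 - f1) * c2

# Mixing equal parts of salinity-35 and salinity-34 water:
print(conservative_mix(35.0, 34.0, 0.5))  # 34.5
```

Any measured value that departs from this weighted average points to a nonconservative process, such as radioactive decay or biological uptake.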

Useful Elements

The most used environmental isotopes are:

Ocean Circulation

One topic that environmental isotopes are used to study is the circulation of the ocean. Treating the ocean as a single box is useful only in some studies; representing the oceans in general circulation models (GCMs) requires knowing how the ocean actually circulates. This leads to an understanding of how the oceans (along with the atmosphere) transfer heat from the tropics to the poles, and it helps deconvolve circulation effects from other phenomena, such as radioactive decay and biological processes, that affect certain tracers.

A summary of the path of the thermohaline circulation. Blue paths represent deep-water currents, while red paths represent surface currents.

Using rudimentary observation techniques, the circulation of the surface ocean can be determined. In the Atlantic basin, surface waters flow from the south towards the north in general, while also creating gyres in the northern and southern Atlantic. In the Pacific Ocean, the gyres still form, but there is comparatively very little large scale meridional (North-South) movement. For deep waters, there are two areas where density causes waters to sink into the deep ocean. These are in the North Atlantic and the Antarctic. The deep water masses formed are North Atlantic Deep Water (NADW) and Antarctic Bottom Water (AABW). Deep waters are mixtures of these two waters, and understanding how waters are composed of these two water masses can tell us about how water masses move around in the deep ocean.

This can be investigated with environmental isotopes, including 14C. 14C is produced predominantly in the upper atmosphere and by nuclear testing, with no major sources or sinks in the ocean. This atmospheric 14C is oxidized to 14CO2, which enters the surface ocean through gas exchange and is carried into the deep ocean by NADW and AABW. In NADW, δ14C is approximately -60‰, and in AABW it is approximately -160‰. Thus, assuming conservative mixing of radiocarbon, the expected amount of radiocarbon at a given location can be computed from the fractional contributions of NADW and AABW there, which can be determined from other tracers such as phosphate star or salinity. Deviations from this expected value indicate other processes affecting the radiocarbon delta value, namely radioactive decay. The deviation can be converted to a time, giving the age of the water at that location. Doing this across the world's oceans yields the circulation pattern of the ocean and the rate at which water flows through the deep ocean. Combining this with the surface circulation allows scientists to understand the energy balance of the planet: warmer surface waters flow northward while colder deep waters flow southward, producing a net heat transfer towards the poles.
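The reasoning above can be sketched numerically. A water parcel's expected δ14C follows from conservative mixing of the NADW (-60‰) and AABW (-160‰) end-members given in the text; any extra deficit is then converted to an age with the 14C half-life (~5,730 years). The mixing fraction and measured value below are illustrative:

```python
import math

# Sketch: expected d14C from conservative NADW/AABW mixing, and the apparent
# age implied by a measured deficit relative to that expectation.
# End-member values (-60 and -160 per mil) are from the text; the mixing
# fraction and measured value below are illustrative.

HALF_LIFE_14C = 5730.0                    # years
LAMBDA_14C = math.log(2) / HALF_LIFE_14C  # decay constant, 1/yr

def expected_d14c(f_nadw: float, d_nadw: float = -60.0,
                  d_aabw: float = -160.0) -> float:
    """Conservatively mixed d14C for a given NADW fraction."""
    return f_nadw * d_nadw + (1.0 - f_nadw) * d_aabw

def mixing_age(measured: float, expected: float) -> float:
    """Years of decay needed to lower (1000 + expected) to (1000 + measured)."""
    return math.log((1000.0 + expected) / (1000.0 + measured)) / LAMBDA_14C

exp = expected_d14c(0.7)               # 70% NADW, 30% AABW -> -90 per mil
print(round(mixing_age(-120.0, exp)))  # roughly 280 years of decay
```

The deeper the measured δ14C falls below the mixing expectation, the longer the water has been isolated from the surface.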

Paleoclimate

Isotopes are also used to study paleoclimate, the study of past climates from hundreds to hundreds of thousands of years ago. The only records of these times are preserved in rocks, sediments, biological shells, stalagmites and stalactites, and similar archives. The isotope ratios in these samples were affected by the temperature, salinity, ocean circulation, precipitation, and other aspects of the climate at the time, producing measurable deviations from the isotope measurement standards. This is how climate information is encoded in these geological formations. Some of the many isotopes useful for environmental science are discussed below.

Delta O18

One useful isotope for reconstructing past climates is oxygen-18. It is a stable isotope of oxygen, along with oxygen-16, and the degree to which it is incorporated into water and into carbon dioxide/carbonate molecules is strongly temperature dependent. The 18O/16O ratio can therefore reveal something about temperature. For water, the isotope ratio standard is Vienna Standard Mean Ocean Water, and for carbonates the standard is Pee Dee Belemnite. Using ice cores and sediment cores, which preserve water and shells from past times, this ratio can tell scientists about the temperature of those times.

 
Climate record as reconstructed by Lisiecki and Raymo (2005) showing oscillations in the Earth's temperature over time. These oscillations have a 41 kyr cycle until about 1.2 million years ago, switching to a 100 kyr cycle that we see now.

This ratio is used with ice cores to determine the temperature recorded at each depth in the core. Depth in an ice core is proportional to time, and the record is "wiggle-matched" with other records to determine the true age of the ice at each depth. This can be done by comparing δ18O in calcium carbonate shells from sediment cores with the ice-core record, matching large-scale changes in the temperature of the Earth. Once ice cores are matched to sediment cores, highly accurate dating methods such as U-series dating can pin down the timing of these events. Some processes, such as firn formation and ice flow on sloped terrain, mix water from different times into the same depth of an ice core.

Lisiecki and Raymo (2005) used measurements of δ18O in benthic foraminifera from 57 globally distributed deep-sea sediment cores, taken as a proxy for the total global mass of glacial ice sheets, to reconstruct the climate of the past five million years. This record shows temperature oscillations of 2-10 degrees Celsius over that time. Between 5 million and 1.2 million years ago, these oscillations had a period of 41,000 years (41 kyr), but about 1.2 million years ago the period switched to 100 kyr. These changes in global temperature match changes in the parameters of the Earth's orbit around the Sun, known as Milankovitch cycles, which involve the eccentricity of the orbit, the obliquity (axial tilt), and the precession of the Earth's axis, with periods of roughly 100 kyr, 40 kyr, and 20 kyr respectively.

δ18O can also be used to investigate smaller-scale climate phenomena. Koutavas et al. (2006) used δ18O of G. ruber foraminifera to study the El Niño–Southern Oscillation (ENSO) and its variability through the mid-Holocene.[6] By isolating individual foram shells, Koutavas et al. obtained a spread of δ18O values at each depth. Because each foram lives for roughly a month, and the individual forams clumped together in a narrow depth range of the core lived during many different months, the spread of δ18O values measures climate variability. In the eastern Pacific, where these cores were taken, the primary driver of this variability is ENSO, making this a record of ENSO variability over the core's time span. Koutavas et al. found that ENSO was much less variable in the mid-Holocene (~6,000 years ago) than it is today.

Strontium isotopes

Another set of environmental isotopes used in paleoclimate is the strontium isotopes. Strontium-86 and strontium-87 are both stable, but strontium-87 is radiogenic, produced by the decay of rubidium-87. The ratio of these two isotopes therefore depends on the initial concentration of rubidium-87 and the age of the sample, assuming the background concentration of strontium-87 is known. This is useful because 87Rb is found predominantly in continental rocks, whose weathering products are carried to the ocean by rivers, so the strontium isotope ratio reflects the weathering ion flux from rivers into the ocean. The background 87Sr/86Sr ratio in the ocean is 0.709 ± 0.0012. Because the strontium ratio is recorded in sedimentary records, its oscillations over time can be studied; these oscillations track the riverine input into the global ocean or into a local basin. Richter and Turekian found that over glacial-interglacial timescales (10⁵ years), the 87Sr/86Sr ratio varies by about 3×10⁻⁵.
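The radiogenic ingrowth described above can be sketched with the standard Rb–Sr growth equation (the 87Rb decay constant of ~1.42×10⁻¹¹ per year is a commonly used value; the initial ratio and Rb/Sr ratio below are illustrative):

```python
import math

# Rb-Sr radiogenic ingrowth:
#   (87Sr/86Sr)_now = (87Sr/86Sr)_initial + (87Rb/86Sr) * (exp(lambda * t) - 1)
# The 87Rb decay constant (~1.42e-11 per year) is a commonly used value;
# the initial ratio and Rb/Sr ratio below are illustrative.

LAMBDA_RB87 = 1.42e-11  # 1/yr

def sr_ratio(initial: float, rb_sr: float, t_years: float) -> float:
    """Present-day 87Sr/86Sr after t_years of 87Rb decay."""
    return initial + rb_sr * math.expm1(LAMBDA_RB87 * t_years)

# A hypothetical 1-billion-year-old rock with 87Rb/86Sr = 0.5:
print(round(sr_ratio(0.703, 0.5, 1.0e9), 5))  # about 0.71015
```

Rocks with more rubidium, or more time since formation, develop a higher 87Sr/86Sr ratio, which is why continental weathering input shifts the ocean's ratio.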

Decay series of Actinides, including Uranium, Protactinium, Thorium, and Lead

Uranium and related isotopes

Uranium has many radioactive isotopes that continue emitting particles down a decay chain.

Uranium-235 heads one such chain, decaying into protactinium-231 and then into further products. Uranium-238 heads a separate chain, decaying through a series of elements including thorium-230. Both series end in lead: lead-207 from uranium-235 and lead-206 from uranium-238. All of these decays are alpha or beta decays, so they all follow first-order rate equations of the form N(t) = N0 e^(−λt), where λ is the decay constant of the isotope in question, related to its half-life t½ by λ = ln 2 / t½. This makes it straightforward to determine the age of a sample from the ratios of the radioactive isotopes present.
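The first-order decay law above can be sketched directly (the 14C half-life of ~5,730 years used in the example is the standard value; the measured fraction is illustrative):

```python
import math

# First-order radioactive decay: N(t) = N0 * exp(-lambda * t),
# where lambda = ln(2) / half_life. The 14C half-life (~5,730 yr) is the
# standard value; the measured fraction below is illustrative.

def remaining_fraction(t: float, half_life: float) -> float:
    """Fraction of the parent isotope remaining after time t."""
    lam = math.log(2) / half_life
    return math.exp(-lam * t)

def age_from_fraction(fraction: float, half_life: float) -> float:
    """Invert the decay law: t = -ln(N/N0) / lambda."""
    lam = math.log(2) / half_life
    return -math.log(fraction) / lam

print(remaining_fraction(5730.0, 5730.0))  # about 0.5 after one half-life
print(age_from_fraction(0.25, 5730.0))     # about 11460 yr (two half-lives)
```

Inverting the decay law in this way is the basis of every radiometric dating technique mentioned in this section; only the parent-daughter pair and half-life change.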

One application of uranium isotopes is dating rocks that are millions to billions of years old, through uranium–lead dating. This technique uses zircon samples and measures their lead content. Zircon incorporates uranium and thorium atoms into its crystal structure but strongly rejects lead, so the only source of lead in a zircon crystal is the decay of uranium and thorium. Both the uranium-235 and uranium-238 series decay to an isotope of lead: the half-life for converting 235U to 207Pb is 710 million years, and for converting 238U to 206Pb it is 4.47 billion years. With high-resolution mass spectrometry, both chains can be used to date rocks, giving complementary information about the samples. The large difference in half-lives makes the technique robust over long time scales, from millions to billions of years.

Another use of uranium-series isotopes in environmental science is the ratio 231Pa/230Th. These radiogenic isotopes have different uranium parents but very different reactivities in the ocean. The uranium profile in the ocean is essentially uniform because uranium's residence time is very long compared with the mixing time of the ocean, so the production of the daughter isotopes is also spatially uniform; the daughters, however, behave differently. Thorium is readily scavenged by particles and rapidly removed from the water column into the sediments. By contrast, 231Pa is less particle-reactive and is carried some distance by the ocean circulation before settling into the sediment. Thus, knowing the decay rates of both isotopes and the abundances of their uranium parents, the expected 231Pa/230Th ratio can be determined, and any deviation from this value is attributable to circulation. Circulation produces a higher 231Pa/230Th ratio downstream and a lower ratio upstream, with the magnitude of the deviation related to the flow rate. This technique has been used to quantify the Atlantic Meridional Overturning Circulation (AMOC) during the Last Glacial Maximum (LGM) and during abrupt climate change events in Earth's past, such as Heinrich events and Dansgaard–Oeschger events.

Neodymium

Neodymium isotopes are also used to trace ocean circulation. All of the isotopes of neodymium are stable on the timescales of glacial-interglacial cycles, but 143Nd is the daughter of 147Sm, a radioactive isotope. Samarium-147 is more abundant in mantle-derived rocks than in crustal rocks, so regions receiving river input from mantle-derived rocks have higher concentrations of 147Sm and 143Nd. These differences are so small that standard delta notation is too coarse to express them; instead, a more sensitive epsilon value is used to describe variations in the neodymium isotope ratio. It is defined as

εNd = [(143Nd/144Nd)sample / (143Nd/144Nd)CHUR − 1] × 10⁴

where CHUR is the Chondritic Uniform Reservoir, a reference value for the bulk Earth.

The only major sources of this signal in the ocean are in the North Atlantic and in the deep Pacific Ocean. Because one of the end-members is set in the interior of the ocean, this technique can provide information about paleocirculation that is complementary to other ocean tracers, which are set only in the surface ocean.
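The epsilon notation described above can be sketched as follows (the CHUR 143Nd/144Nd reference ratio of ~0.512638 is the commonly used value; the sample ratio is illustrative):

```python
# Epsilon-Nd expresses tiny deviations of 143Nd/144Nd from CHUR in parts
# per ten thousand. The CHUR reference ratio (~0.512638) is the commonly
# used value; the sample ratio below is illustrative.

CHUR_143_144 = 0.512638

def epsilon_nd(sample_ratio: float) -> float:
    """epsilon_Nd = (R_sample / R_CHUR - 1) * 10^4."""
    return (sample_ratio / CHUR_143_144 - 1.0) * 1.0e4

print(round(epsilon_nd(0.512100), 2))  # about -10.49 (unradiogenic, crust-like)
```

The 10⁴ multiplier (rather than the 10³ of delta notation) is what makes the epsilon scale sensitive enough for neodymium's very small natural variations.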

Isotope geochemistry


Isotope geochemistry is an aspect of geology based upon the study of natural variations in the relative abundances of isotopes of various elements. Variations in isotopic abundance are measured by isotope ratio mass spectrometry, and can reveal information about the ages and origins of rock, air or water bodies, or processes of mixing between them.

Stable isotope geochemistry is largely concerned with isotopic variations arising from mass-dependent isotope fractionation, whereas radiogenic isotope geochemistry is concerned with the products of natural radioactivity.

Stable isotope geochemistry

For most stable isotopes, the magnitude of fractionation from kinetic and equilibrium fractionation is very small; for this reason, enrichments are typically reported in "per mil" (‰, parts per thousand).

These enrichments (δ) represent the ratio of heavy isotope to light isotope in the sample relative to the same ratio in a standard. That is, δ = (Rsample/Rstandard − 1) × 1000 ‰.

Carbon

Carbon has two stable isotopes, 12C and 13C, and one radioactive isotope, 14C.

The stable carbon isotope ratio, δ13C, is measured against Vienna Pee Dee Belemnite (VPDB). The stable carbon isotopes are fractionated primarily by photosynthesis (Faure, 2004). The 13C/12C ratio is also an indicator of paleoclimate: a change in the ratio in the remains of plants indicates a change in the amount of photosynthetic activity, and thus in how favorable the environment was for the plants. During photosynthesis, organisms using the C3 pathway show different enrichments from those using the C4 pathway, allowing scientists not only to distinguish organic matter from abiotic carbon, but also to determine which photosynthetic pathway produced the organic matter. Occasional spikes in the global 13C/12C ratio have also been useful as stratigraphic markers for chemostratigraphy, especially during the Paleozoic.

The 14C ratio has been used to track ocean circulation, among other things.

Nitrogen

Nitrogen has two stable isotopes, 14N and 15N. The ratio between these is measured relative to nitrogen in ambient air. Nitrogen ratios are frequently linked to agricultural activities. Nitrogen isotope data has also been used to measure the amount of exchange of air between the stratosphere and troposphere using data from the greenhouse gas N2O.

Oxygen

Oxygen has three stable isotopes, 16O, 17O, and 18O. Oxygen ratios are measured relative to Vienna Standard Mean Ocean Water (VSMOW) or Vienna Pee Dee Belemnite (VPDB). Variations in oxygen isotope ratios are used to track water movement, paleoclimate, and atmospheric gases such as ozone and carbon dioxide. Typically, the VPDB oxygen reference is used for paleoclimate, while VSMOW is used for most other applications. Oxygen isotopes appear in anomalous ratios in atmospheric ozone, resulting from mass-independent fractionation. Isotope ratios in fossilized foraminifera have been used to deduce the temperature of ancient seas.

Sulfur

Sulfur has four stable isotopes, with the following abundances: 32S (0.9502), 33S (0.0075), 34S (0.0421) and 36S (0.0002). These abundances are compared to those found in Cañon Diablo troilite. Variations in sulfur isotope ratios are used to study the origin of sulfur in an orebody and the temperature of formation of sulfur-bearing minerals.

Radiogenic isotope geochemistry

Radiogenic isotopes provide powerful tracers for studying the ages and origins of Earth systems. They are particularly useful to understand mixing processes between different components, because (heavy) radiogenic isotope ratios are not usually fractionated by chemical processes.

Radiogenic isotope tracers are most powerful when used together with other tracers: the more tracers used, the tighter the constraints on mixing processes. An example of this application is the study of the evolution of the Earth's crust and mantle through geological time.

Lead–lead isotope geochemistry

Lead has four stable isotopes: 204Pb, 206Pb, 207Pb, and 208Pb.

Lead is created in the Earth via decay of actinide elements, primarily uranium and thorium.

Lead isotope geochemistry is useful for providing isotopic dates on a variety of materials. Because the lead isotopes are produced by the decay of different parent elements, the ratios of the four lead isotopes to one another can be very useful in tracking the source of melts in igneous rocks, the source of sediments, and even the origin of people via isotopic fingerprinting of their teeth, skin and bones.

It has been used to date ice cores from the Arctic shelf, and provides information on the source of atmospheric lead pollution.

Lead–lead isotope ratios have been used successfully in forensic science to fingerprint bullets, because each batch of ammunition has its own distinctive 204Pb/206Pb vs 207Pb/208Pb ratio.

Samarium–neodymium

Samarium–neodymium is an isotope system which can be utilised to provide a date as well as isotopic fingerprints of geological materials, and various other materials including archaeological finds (pots, ceramics).

147Sm decays to produce 143Nd with a half-life of 1.06×10¹¹ years.

Dating is usually achieved by constructing an isochron from several minerals within a rock specimen, from which the initial 143Nd/144Nd ratio is determined.

This initial ratio is modelled relative to CHUR - the Chondritic Uniform Reservoir - which is an approximation of the chondritic material which formed the solar system. CHUR was determined by analysing chondrite and achondrite meteorites.

The difference in the ratio of the sample relative to CHUR can give information on a model age of extraction from the mantle (for which an assumed evolution has been calculated relative to CHUR) and to whether this was extracted from a granitic source (depleted in radiogenic Nd), the mantle, or an enriched source.

Rhenium–osmium

Rhenium and osmium are siderophile elements which are present at very low abundances in the crust. Rhenium undergoes radioactive decay to produce osmium. The ratio of non-radiogenic osmium to radiogenic osmium throughout time varies.

Rhenium enters sulfides more readily than osmium. Hence, during melting of the mantle, rhenium is stripped out, which prevents the osmium isotope ratio of the residue from changing appreciably. This locks in the initial osmium ratio of the sample at the time of the melting event. Osmium initial ratios are used to determine the source characteristics and age of mantle melting events.

Noble gas isotopes

Natural isotopic variations amongst the noble gases result from both radiogenic and nucleogenic production processes. Because of their unique properties, it is useful to distinguish them from the conventional radiogenic isotope systems described above.

Helium-3

Helium-3 was trapped in the planet when it formed. Some 3He is added by meteoric dust, primarily collecting on the bottom of oceans (although due to subduction, all oceanic tectonic plates are younger than continental plates). However, 3He is degassed from oceanic sediment during subduction, so cosmogenic 3He does not affect the concentration or noble gas ratios of the mantle.

Helium-3 is created by cosmic ray bombardment, and by lithium spallation reactions which generally occur in the crust. Lithium spallation is the process by which a high-energy neutron bombards a lithium atom, creating a 3He and a 4He ion. This requires significant lithium to adversely affect the 3He/4He ratio.

All degassed helium is lost to space eventually, due to the average speed of helium exceeding the escape velocity for the Earth. Thus, it is assumed the helium content and ratios of Earth's atmosphere have remained essentially stable.

It has been observed that 3He is present in volcano emissions and oceanic ridge samples. How 3He is stored in the planet is under investigation, but it is associated with the mantle and is used as a marker of material of deep origin.

Due to similarities in helium and carbon in magma chemistry, outgassing of helium requires the loss of volatile components (water, carbon dioxide) from the mantle, which happens at depths of less than 60 km. However, 3He is transported to the surface primarily trapped in the crystal lattice of minerals within fluid inclusions.

Helium-4 is created by radiogenic production (the decay of uranium- and thorium-series elements). The continental crust is enriched in those elements relative to the mantle, and thus more 4He is produced in the crust than in the mantle.

The ratio (R) of 3He to 4He is often used to represent 3He content. R usually is given as a multiple of the present atmospheric ratio (Ra).

Common values for R/Ra:

  • Old continental crust: less than 1
  • mid-ocean ridge basalt (MORB): 7 to 9
  • Spreading ridge rocks: 9.1 plus or minus 3.6
  • Hotspot rocks: 5 to 42
  • Ocean and terrestrial water: 1
  • Sedimentary formation water: less than 1
  • Thermal spring water: 3 to 11

3He/4He isotope chemistry is being used to date groundwaters, estimate groundwater flow rates, track water pollution, and provide insights into hydrothermal processes, igneous geology and ore genesis.

Isotopes in actinide decay chains

Isotopes in the decay chains of actinides are unique amongst radiogenic isotopes because they are both radiogenic and radioactive. Because their abundances are normally quoted as activity ratios rather than atomic ratios, they are best considered separately from the other radiogenic isotope systems.

Protactinium/Thorium – 231Pa / 230Th

Uranium is well mixed in the ocean, and its decay produces 231Pa and 230Th at a constant activity ratio (0.093). The decay products are rapidly removed by adsorption onto settling particles, but not at equal rates. 231Pa has a residence time equivalent to the residence time of deep water in the Atlantic basin (around 1000 years), but 230Th is removed more rapidly (within centuries). Thermohaline circulation effectively exports 231Pa from the Atlantic into the Southern Ocean, while most of the 230Th remains in Atlantic sediments. As a result, there is a relationship between 231Pa/230Th in Atlantic sediments and the rate of overturning: faster overturning produces lower sediment 231Pa/230Th ratios, while slower overturning increases them. The combination of δ13C and 231Pa/230Th can therefore provide a more complete insight into past circulation changes.
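As a minimal sketch of how the sediment ratio responds to overturning, suppose both nuclides are produced at the 0.093 activity ratio given above, essentially all 230Th is buried locally, and overturning exports some fraction of the 231Pa before burial (the export fractions below are illustrative, not measured values):

```python
# Minimal steady-state sketch of the sediment 231Pa/230Th proxy:
# both nuclides are produced at a fixed activity ratio (0.093, from the text);
# essentially all 230Th is buried locally, while overturning exports a
# fraction of the 231Pa before burial. Export fractions are illustrative.

PRODUCTION_RATIO = 0.093  # 231Pa/230Th activity ratio from uranium decay

def sediment_pa_th(pa_export_fraction: float) -> float:
    """231Pa/230Th of local sediments when a fraction of 231Pa is exported."""
    return PRODUCTION_RATIO * (1.0 - pa_export_fraction)

print(sediment_pa_th(0.45))  # vigorous overturning -> low sediment ratio
print(sediment_pa_th(0.10))  # sluggish overturning -> ratio near 0.093
```

A sediment ratio near the production value of 0.093 thus implies weak overturning, while values well below it imply vigorous export of 231Pa by the circulation.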

Anthropogenic isotopes

Tritium/helium-3

Tritium was released to the atmosphere during atmospheric testing of nuclear bombs. Radioactive decay of tritium produces the noble gas helium-3. Comparing the ratio of tritium to helium-3 (3H/3He) allows estimation of the age of recent ground waters.
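The tritium/helium-3 age described above is conventionally computed as t = (1/λ)·ln(1 + [3He_trit]/[3H]), where 3He_trit is the tritiogenic helium-3. A minimal sketch, using the ~12.3-year tritium half-life mentioned earlier in the article (the concentration ratio below is illustrative):

```python
import math

# Tritium/helium-3 apparent age: t = (1/lambda) * ln(1 + 3He_trit / 3H),
# where 3He_trit is tritiogenic helium-3 and lambda = ln(2) / half_life.
# Half-life ~12.3 yr (from the text); the ratio below is illustrative.

HALF_LIFE_3H = 12.3                     # years
LAMBDA_3H = math.log(2) / HALF_LIFE_3H  # decay constant, 1/yr

def tritium_helium_age(he3_over_h3: float) -> float:
    """Apparent groundwater age from the tritiogenic 3He / 3H ratio."""
    return math.log1p(he3_over_h3) / LAMBDA_3H

# Equal parts tritiogenic 3He and remaining 3H -> one half-life old:
print(round(tritium_helium_age(1.0), 2))  # 12.3
```

Because the method dates the time since the water was last at the surface, it works best for ground waters recharged within the past few decades, while bomb-derived tritium is still measurable.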


Platinum group

From Wikipedia, the free encyclopedia ...