
Sunday, March 15, 2026

Emergence

From Wikipedia, the free encyclopedia
The formation of complex symmetrical and fractal patterns in snowflakes exemplifies emergence in a physical system.
A termite "cathedral" mound produced by a termite colony offers a classic example of emergence in nature.

In philosophy, systems theory, science, and art, emergence occurs when a complex entity has properties or behaviors that its parts do not have on their own, and emerge only when they interact in a wider whole.

Emergence plays a central role in theories of integrative levels and of complex systems. For instance, the phenomenon of life as studied in biology is an emergent property of chemistry and physics.

In philosophy, theories that emphasize emergent properties have been called emergentism.

In philosophy

Philosophers often understand emergence as a claim about the etiology of a system's properties. An emergent property of a system, in this context, is one that is not a property of any component of that system, but is still a feature of the system as a whole. Nicolai Hartmann (1882–1950), one of the first modern philosophers to write on emergence, termed this a categorial novum (new category).

Definitions

This concept of emergence dates from at least the time of Aristotle. In Heideggerian thought, the notion of emergence is derived from the Greek word poiein, meaning "to make", and refers to a bringing-forth that encompasses not just a process of crafting (techne) but also the broader sense of something coming into being or revealing itself. Heidegger used emerging blossoms and butterflies as examples to illustrate poiêsis as a threshold event where something moves from one state to another. Many scientists and philosophers have written on the concept, including John Stuart Mill (Composition of Causes, 1843) and Julian Huxley (1887–1975).

The philosopher G. H. Lewes coined the term "emergent" in 1875, distinguishing it from the merely "resultant":

Every resultant is either a sum or a difference of the co-operant forces; their sum, when their directions are the same – their difference, when their directions are contrary. Further, every resultant is clearly traceable in its components, because these are homogeneous and commensurable. It is otherwise with emergents, when, instead of adding measurable motion to measurable motion, or things of one kind to other individuals of their kind, there is a co-operation of things of unlike kinds. The emergent is unlike its components insofar as these are incommensurable, and it cannot be reduced to their sum or their difference.

Strong and weak emergence

Usage of the notion "emergence" may generally be subdivided into two perspectives, that of "weak emergence" and "strong emergence". One paper discussing this division is Weak Emergence, by philosopher Mark Bedau. In terms of physical systems, weak emergence is a type of emergence in which the emergent property is amenable to computer simulation or similar forms of after-the-fact analysis (for example, the formation of a traffic jam, the structure of a flock of starlings in flight or a school of fish, or the formation of galaxies). Crucial in these simulations is that the interacting members retain their independence. If not, a new entity is formed with new, emergent properties: this is called strong emergence, which it is argued cannot be simulated, analysed or reduced.
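The traffic-jam example of weak emergence is easy to make concrete. The sketch below uses Rule 184, a standard minimal cellular-automaton model of road traffic (the model choice is our illustration; the text does not name one): each car follows a single local rule, yet above the critical density of 1/2 the road develops persistent stop-and-go jams, a pattern that exists only at the level of the whole system.

```python
def step(road):
    """Rule 184: a car (1) advances right when the next cell is empty (0)."""
    n = len(road)
    new = [0] * n
    for i in range(n):
        if road[i] == 1:
            if road[(i + 1) % n] == 0:
                new[(i + 1) % n] = 1   # car moves forward
            else:
                new[i] = 1             # car is blocked and waits
    return new

road = [1, 1, 1, 0, 0] * 12            # 36 cars on a ring of 60 cells: density 0.6
for _ in range(30):
    road = step(road)

# Above the critical density, jams (runs of adjacent cars) persist, although
# no single-car rule mentions "jam" at all.
jams = sum(1 for i in range(len(road)) if road[i] == 1 and road[i - 1] == 1)
```

Note that the simulation is fully after-the-fact in Bedau's sense: the jams are found by running the micro-rules, not deduced from them.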

David Chalmers writes that emergence often causes confusion in philosophy and science due to a failure to demarcate strong and weak emergence, which are "quite different concepts".

Some common points between the two notions are that emergence concerns new properties produced as the system grows, which is to say ones which are not shared with its components or prior states. Also, it is assumed that the properties are supervenient rather than metaphysically primitive.

Weak emergence describes new properties arising in systems as a result of the interactions at a fundamental level. However, Bedau stipulates that the properties can be determined only by observing or simulating the system, and not by any process of a reductionist analysis. As a consequence the emerging properties are scale dependent: they are only observable if the system is large enough to exhibit the phenomenon. Chaotic, unpredictable behaviour can be seen as an emergent phenomenon, while at a microscopic scale the behaviour of the constituent parts can be fully deterministic.
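The closing claim, that fully deterministic micro-behaviour can produce chaotic, unpredictable macro-behaviour, can be demonstrated with the logistic map (our choice of example, not one named in the text): a one-line deterministic rule whose trajectories from nearly identical starting points diverge completely.

```python
def logistic(x, r=4.0):
    """One deterministic update step of the logistic map."""
    return r * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-9   # two almost indistinguishable initial conditions
max_gap = 0.0
for t in range(60):
    x, y = logistic(x), logistic(y)
    max_gap = max(max_gap, abs(x - y))

# max_gap grows from 1e-9 to order 1: the long-run behaviour is unpredictable
# in practice even though every individual step is deterministic.
```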

Bedau notes that weak emergence is not a universal metaphysical solvent, as the hypothesis that consciousness is weakly emergent would not resolve the traditional philosophical questions about the physicality of consciousness. However, Bedau concludes that adopting this view has two benefits: it provides a precise sense in which emergence is involved in consciousness, and the notion of weak emergence itself is metaphysically benign.

Strong emergence describes the direct causal action of a high-level system on its components; qualities produced this way are irreducible to the system's constituent parts. The whole is other than the sum of its parts. It is argued then that no simulation of the system can exist, for such a simulation would itself constitute a reduction of the system to its constituent parts. Physics lacks well-established examples of strong emergence, unless it is interpreted as the impossibility in practice to explain the whole in terms of the parts. Practical impossibility may be a more useful distinction than one in principle, since it is easier to determine and quantify, and does not imply the use of mysterious forces, but simply reflects the limits of our capability.

Viability of strong emergence

One reason the distinction between these two concepts matters concerns the relationship of purported emergent properties to science. Some thinkers question the plausibility of strong emergence as contravening our usual understanding of physics. Mark A. Bedau observes:

Although strong emergence is logically possible, it is uncomfortably like magic. How does an irreducible but supervenient downward causal power arise, since by definition it cannot be due to the aggregation of the micro-level potentialities? Such causal powers would be quite unlike anything within our scientific ken. This not only indicates how they will discomfort reasonable forms of materialism. Their mysteriousness will only heighten the traditional worry that emergence entails illegitimately getting something from nothing.

The concern is that if strong emergence does entail getting something from nothing, it is incompatible with metaphysical principles such as the principle of sufficient reason or the Latin dictum ex nihilo nihil fit, often translated as "nothing comes from nothing".

Strong emergence can be criticized for leading to causal overdetermination. The canonical example concerns emergent mental states (M and M∗) that supervene on physical states (P and P∗) respectively. Let M and M∗ be emergent properties. Let M∗ supervene on base property P∗. What happens when M causes M∗? Jaegwon Kim says:

In our schematic example above, we concluded that M causes M∗ by causing P∗. So M causes P∗. Now, M, as an emergent, must itself have an emergence base property, say P. Now we face a critical question: if an emergent, M, emerges from basal condition P, why cannot P displace M as a cause of any putative effect of M? Why cannot P do all the work in explaining why any alleged effect of M occurred? If causation is understood as nomological (law-based) sufficiency, P, as M's emergence base, is nomologically sufficient for it, and M, as P∗'s cause, is nomologically sufficient for P∗. It follows that P is nomologically sufficient for P∗ and hence qualifies as its cause...If M is somehow retained as a cause, we are faced with the highly implausible consequence that every case of downward causation involves overdetermination (since P remains a cause of P∗ as well). Moreover, this goes against the spirit of emergentism in any case: emergents are supposed to make distinctive and novel causal contributions.

If M is the cause of M∗, then M∗ is overdetermined because M∗ can also be thought of as being determined by P. One escape-route that a strong emergentist could take would be to deny downward causation. However, this would remove the proposed reason that emergent mental states must supervene on physical states, which in turn would call physicalism into question, and thus be unpalatable for some philosophers and physicists.

Carroll and Parola propose a taxonomy that classifies emergent phenomena by how the macro-description relates to the underlying micro-dynamics.

Type‑0 (Featureless) Emergence

A coarse-graining map Φ from a micro state space A to a macro state space B that commutes with time evolution, without requiring any further decomposition into subsystems.

Type‑1 (Local) Emergence

Emergence where the macro theory is defined in terms of localized collections of micro-subsystems. This category is subdivided into:
Type‑1a (Direct) Emergence: When the emergence map Φ is algorithmically simple (i.e. compressible), so that the macro behavior is easily deduced from the micro-states.
Type‑1b (Incompressible) Emergence: When Φ is algorithmically complex (i.e. incompressible), making the macro behavior appear more novel despite being determined by the micro-dynamics.

Type‑2 (Nonlocal) Emergence

Cases in which both the micro and macro theories admit subsystem decompositions, yet the macro entities are defined nonlocally with respect to the micro-structure, meaning that macro behavior depends on widely distributed micro information.

Type‑3 (Augmented) Emergence

A form of strong emergence in which the macro theory introduces additional ontological variables that do not supervene on the micro-states, thereby positing genuinely novel macro-level entities.
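The defining condition for Type-0 emergence can be sketched directly: coarse-graining and then evolving must give the same result as evolving and then coarse-graining, i.e. Φ commutes with time evolution. The toy system below (free particles coarse-grained to their centre of mass and mean velocity) is our illustrative assumption, not an example drawn from Carroll and Parola.

```python
def micro_step(state, dt=0.5):
    """Micro dynamics: free particles, each position advances by its velocity."""
    return [(x + v * dt, v) for (x, v) in state]

def phi(state):
    """Coarse-graining map: keep only centre of mass and mean velocity."""
    n = len(state)
    return (sum(x for x, _ in state) / n, sum(v for _, v in state) / n)

def macro_step(macro, dt=0.5):
    """Macro dynamics: the centre of mass drifts at the mean velocity."""
    X, V = macro
    return (X + V * dt, V)

state = [(0.0, 1.0), (2.0, -0.5), (5.0, 0.25)]
lhs = phi(micro_step(state))   # evolve at the micro level, then coarse-grain
rhs = macro_step(phi(state))   # coarse-grain first, then evolve at the macro level
# lhs == rhs: Φ commutes with time evolution, so the macro theory is autonomous.
```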

Objective or subjective quality

Crutchfield regards the properties of complexity and organization of any system as subjective qualities determined by the observer.

Defining structure and detecting the emergence of complexity in nature are inherently subjective, though essential, scientific activities. Despite the difficulties, these problems can be analysed in terms of how model-building observers infer from measurements the computational capabilities embedded in non-linear processes. An observer's notion of what is ordered, what is random, and what is complex in its environment depends directly on its computational resources: the amount of raw measurement data, of memory, and of time available for estimation and inference. The discovery of structure in an environment depends more critically and subtly, though, on how those resources are organized. The descriptive power of the observer's chosen (or implicit) computational model class, for example, can be an overwhelming determinant in finding regularity in data.

The low entropy of an ordered system can be viewed as an example of subjective emergence: the observer sees an ordered system by ignoring the underlying microstructure (i.e. movement of molecules or elementary particles) and concludes that the system has a low entropy. On the other hand, chaotic, unpredictable behaviour can also be seen as subjectively emergent, while at a microscopic scale the movement of the constituent parts can be fully deterministic.

In science

In physics, weak emergence is used to describe a property, law, or phenomenon which occurs at macroscopic scales (in space or time) but not at microscopic scales, despite the fact that a macroscopic system can be viewed as a very large ensemble of microscopic systems.

An emergent behavior of a physical system is a qualitative property that can only occur in the limit that the number of microscopic constituents tends to infinity.
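A minimal illustration of a property that only sharpens as the number of constituents grows: the average of N random ±1 "spins" (a toy stand-in for a macroscopic observable, chosen by us rather than taken from any specific physical system) fluctuates less and less as N increases, becoming effectively deterministic in the large-N limit.

```python
import random
import statistics

def magnetization(n, rng):
    """Mean of n random +/-1 'spins': a toy macroscopic average."""
    return sum(rng.choice((-1, 1)) for _ in range(n)) / n

rng = random.Random(42)
# Sample the macroscopic average many times at two system sizes.
spread_small = statistics.pstdev(magnetization(100, rng) for _ in range(200))
spread_large = statistics.pstdev(magnetization(10000, rng) for _ in range(200))
# Fluctuations shrink roughly like 1/sqrt(N): the macro variable is sharp
# only for large systems, and exactly fixed only in the infinite limit.
```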

According to Robert Laughlin, for many-particle systems, nothing can be calculated exactly from the microscopic equations, and macroscopic systems are characterised by broken symmetry: the symmetry present in the microscopic equations is not present in the macroscopic system, due to phase transitions. As a result, these macroscopic systems are described in their own terminology, and have properties that do not depend on many microscopic details.

Novelist Arthur Koestler used the metaphor of Janus (a symbol of the unity underlying complements like open/shut, peace/war) to illustrate how the two perspectives (strong vs. weak or holistic vs. reductionistic) should be treated as non-exclusive, and should work together to address the issues of emergence. Theoretical physicist Philip W. Anderson states it this way:

The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe. The constructionist hypothesis breaks down when confronted with the twin difficulties of scale and complexity. At each level of complexity entirely new properties appear. Psychology is not applied biology, nor is biology applied chemistry. We can now see that the whole becomes not merely more, but very different from the sum of its parts.

Meanwhile, others have worked towards developing analytical evidence of strong emergence. Renormalization methods in theoretical physics enable physicists to study critical phenomena that are not tractable as the combination of their parts. In 2009, Gu et al. presented a class of infinite physical systems that exhibits non-computable macroscopic properties. More precisely, if one could compute certain macroscopic properties of these systems from the microscopic description of these systems, then one would be able to solve computational problems known to be undecidable in computer science. These results concern infinite systems, finite systems being considered computable. However, macroscopic concepts which only apply in the limit of infinite systems, such as phase transitions and the renormalization group, are important for understanding and modeling real, finite physical systems. Gu et al. concluded that

Although macroscopic concepts are essential for understanding our world, much of fundamental physics has been devoted to the search for a 'theory of everything', a set of equations that perfectly describe the behavior of all fundamental particles. The view that this is the goal of science rests in part on the rationale that such a theory would allow us to derive the behavior of all macroscopic concepts, at least in principle. The evidence we have presented suggests that this view may be overly optimistic. A 'theory of everything' is one of many components necessary for complete understanding of the universe, but is not necessarily the only one. The development of macroscopic laws from first principles may involve more than just systematic logic, and could require conjectures suggested by experiments, simulations or insight.

In humanity

Human beings are the basic elements of social systems, which perpetually interact and create, maintain, or untangle mutual social bonds. Social bonds in social systems are perpetually changing in the sense of the ongoing reconfiguration of their structure. An early argument (1904–05) for the emergence of social formations can be found in Max Weber's most famous work, The Protestant Ethic and the Spirit of Capitalism. More recently, the emergence of a new social system has been linked with the emergence of order from nonlinear relationships among multiple interacting units, where the units are individual thoughts, consciousness, and actions.

In the case of the global economic system, under capitalism, growth, accumulation, and innovation can be considered emergent processes: not only do technological processes sustain growth, but growth becomes the source of further innovations in a recursive, self-expanding spiral. In this sense, the exponential trend of the growth curve reveals the presence of a long-term positive feedback among growth, accumulation, and innovation, and the emergence of new structures and institutions connected to the multi-scale process of growth. This is reflected in the work of Karl Polanyi, who traces the process by which labor and nature are converted into commodities in the passage from an economic system based on agriculture to one based on industry. This shift, along with the idea of the self-regulating market, set the stage not only for another economy but also for another society.

The principle of emergence is also brought forth when thinking about alternatives to the current growth-based economic system as it faces social and ecological limits. Both degrowth and social ecological economics have argued in favor of a co-evolutionary perspective for theorizing about transformations that overcome the dependence of human wellbeing on economic growth.

Economic trends and patterns which emerge are studied intensively by economists. Within the field of group facilitation and organization development, there have been a number of new group processes that are designed to maximize emergence and self-organization, by offering a minimal set of effective initial conditions. Examples of these processes include SEED-SCALE, appreciative inquiry, Future Search, the world cafe or knowledge cafe, Open Space Technology, and others (Holman, 2010). In international development, concepts of emergence have been used within a theory of social change termed SEED-SCALE to show how standard principles interact to bring forward socio-economic development fitted to cultural values, community economics, and natural environment (local solutions emerging from the larger socio-econo-biosphere). These principles can be implemented utilizing a sequence of standardized tasks that self-assemble in individually specific ways utilizing recursive evaluative criteria.

Looking at emergence in the context of social and systems change invites us to reframe our thinking on parts and wholes and their interrelation. Unlike machines, living systems at all levels of recursion (a sentient body, a tree, a family, an organisation, the education system, the economy, the health system, the political system, and so on) are continuously creating themselves. They are continually growing and changing along with their surrounding elements, and therefore are more than the sum of their parts. As Peter Senge and co-authors put it in the book Presence: Exploring Profound Change in People, Organizations, and Society, "as long as our thinking is governed by habit - notably industrial, "machine age" concepts such as control, predictability, standardization, and "faster is better" - we will continue to recreate institutions as they have been, despite their disharmony with the larger world, and the need for all living systems to evolve." While change is predictably constant, it is unpredictable in direction and often occurs at second and nth orders of systemic relationality. Understanding emergence, and what creates the conditions for different forms of emergence to occur, whether insidious or nourishing of vitality, is essential in the search for deep transformations.

The works of Nora Bateson and her colleagues at the International Bateson Institute delve into this. Since 2012, they have been researching questions such as: what makes a living system ready to change? Can unforeseen readiness for change be nourished? Here being ready is not thought of as being prepared, but rather as nourishing the flexibility we do not yet know will be needed. These inquiries challenge the common view that a theory of change is produced from an identified preferred goal or outcome. As explained in their paper An essay on ready-ing: Tending the prelude to change: "While linear managing or controlling of the direction of change may appear desirable, tending to how the system becomes ready allows for pathways of possibility previously unimagined." This brings a new lens to the field of emergence in social and systems change, as it looks to tending the pre-emergent process. Warm Data Labs are the fruit of their praxis: spaces for transcontextual mutual learning in which aphanipoetic phenomena unfold. Having hosted hundreds of Warm Data processes with thousands of participants, they have found that these spaces of shared poly-learning across contexts open a realm of potential change, a necessarily obscured zone of wild interaction of unseen, unsaid, unknown flexibility. It is such flexibility that nourishes the ready-ing living systems require to respond to complex situations in new ways and to change. In other words, this readying process preludes what will emerge. When exploring questions of social change, it is important to ask what is submerging in the current social imaginary and perhaps, rather than focusing all our resources and energy on driving direct responses, to nourish flexibility within ourselves and the systems we are a part of.

Another approach that engages with the concept of emergence for social change is Theory U, where "deep emergence" is the result of self-transcending knowledge after a successful journey along the U through layers of awareness. This practice nourishes transformation at the inner-being level, which enables new ways of being, seeing and relating to emerge. The concept of emergence has also been employed in the field of facilitation. In Emergent Strategy, adrienne maree brown defines emergent strategies as "ways for humans to practice complexity and grow the future through relatively simple interactions".

In linguistics, the concept of emergence has been applied in the domain of stylometry to explain the interrelation between the syntactical structures of a text and the author's style (Slautina, Marusenko, 2014). It has also been argued that the structure and regularity of language grammar, or at least of language change, is an emergent phenomenon. Each speaker merely tries to reach their own communicative goals, but in doing so uses language in a particular way; if enough speakers behave in that way, the language is changed. In a wider sense, the norms of a language, i.e. the linguistic conventions of its speech community, can be seen as a system emerging from long-term participation in communicative problem-solving in various social circumstances.
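The idea that shared conventions emerge from repeated communicative interactions can be illustrated with a minimal "naming game" in the style of Luc Steels (an illustrative model of our choosing; the text does not prescribe one): agents start with private words and, through random pairwise exchanges, their vocabularies contract toward a shared convention that no individual chose.

```python
import random

def naming_game(n_agents=20, rounds=4000, seed=1):
    """Each agent starts with its own private word; interactions align them."""
    rng = random.Random(seed)
    vocab = [{f"w{i}"} for i in range(n_agents)]      # private inventories
    for _ in range(rounds):
        s, h = rng.sample(range(n_agents), 2)         # speaker, hearer
        word = rng.choice(sorted(vocab[s]))
        if word in vocab[h]:
            vocab[s] = {word}                         # success: both agents
            vocab[h] = {word}                         # collapse to this word
        else:
            vocab[h].add(word)                        # failure: hearer learns it
    return vocab

vocab = naming_game()
distinct = set().union(*vocab)   # words still in circulation
```

No agent ever dictates the outcome, yet the population-level vocabulary shrinks over time, which is the emergent "norm" the paragraph describes.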

In technology

The bulk conductive response of binary (RC) electrical networks with random arrangements, known as the universal dielectric response (UDR), can be seen as an emergent property of such physical systems. Such arrangements can be used as simple physical prototypes for deriving mathematical formulae for the emergent responses of complex systems.

Internet traffic can also exhibit seemingly emergent properties. Under TCP's congestion control mechanism, flows can become globally synchronized at bottlenecks, simultaneously increasing and then decreasing throughput in coordination. Congestion, widely regarded as a nuisance, is possibly an emergent property of the spreading of bottlenecks across a network under high traffic flows, which can be considered a phase transition.

Some artificially intelligent (AI) computer applications simulate emergent behavior. One example is Boids, which mimics the swarming behavior of birds.
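A minimal Boids-style sketch: each boid applies the three classic local rules (separation, alignment, cohesion), with the neighbourhood radius, speed cap, and rule weights below being illustrative values of our own choosing, not Reynolds' original parameters.

```python
import math
import random

def limit(vx, vy, cap):
    """Clamp a velocity vector to a maximum speed."""
    s = math.hypot(vx, vy)
    return (vx, vy) if s <= cap else (vx / s * cap, vy / s * cap)

def step(boids, r=10.0, cap=2.0):
    """One synchronous update; each boid is (x, y, vx, vy)."""
    out = []
    for i, (x, y, vx, vy) in enumerate(boids):
        near = [b for j, b in enumerate(boids)
                if j != i and math.hypot(b[0] - x, b[1] - y) < r]
        ax = ay = 0.0
        if near:
            n = len(near)
            cx = sum(b[0] for b in near) / n   # cohesion: steer to local centre
            cy = sum(b[1] for b in near) / n
            mvx = sum(b[2] for b in near) / n  # alignment: match mean velocity
            mvy = sum(b[3] for b in near) / n
            sx = sum(x - b[0] for b in near)   # separation: move away from neighbours
            sy = sum(y - b[1] for b in near)
            ax = 0.01 * (cx - x) + 0.05 * (mvx - vx) + 0.02 * sx
            ay = 0.01 * (cy - y) + 0.05 * (mvy - vy) + 0.02 * sy
        vx, vy = limit(vx + ax, vy + ay, cap)
        out.append((x + vx, y + vy, vx, vy))
    return out

random.seed(3)
boids = [(random.uniform(0, 40), random.uniform(0, 40),
          random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(30)]
for _ in range(100):
    boids = step(boids)
```

The flock's cohesion and common heading are nowhere written into the per-boid rules; they appear only when the rules interact across the group.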

In religion and art

In religion, emergence grounds expressions of religious naturalism and syntheism in which a sense of the sacred is perceived in the workings of entirely naturalistic processes by which more complex forms arise or evolve from simpler forms. Examples are detailed in The Sacred Emergence of Nature by Ursula Goodenough & Terrence Deacon and Beyond Reductionism: Reinventing the Sacred by Stuart Kauffman, both from 2006, as well as Syntheism – Creating God in The Internet Age by Alexander Bard & Jan Söderqvist from 2014 and Emergentism: A Religion of Complexity for the Metamodern World by Brendan Graham Dempsey (2022).

Michael J. Pearce has used emergence to describe the experience of works of art in relation to contemporary neuroscience. Practicing artist Leonel Moura, in turn, attributes to his "artbots" a real, if nonetheless rudimentary, creativity based on emergent principles.

In daily life and nature

Objects consist of components with properties differing from those of the object itself. We call these properties emergent because they do not exist at the component level. The same applies to artifacts (structures, devices, tools, and even works of art). They are created for a specific purpose and are therefore subjectively emergent: someone who does not understand the purpose cannot use them.

An artifact is the result of an invention: through a clever combination of components, something new is created with emergent properties and functionalities. Such inventions are often difficult to predict and therefore usually rest on a chance discovery. An invention based on discovery is often improved through a feedback loop, making it more applicable. This is an example of downward causation.

Example 1: A hammer is a combination of a head and a handle, each with different properties. By cleverly connecting them, the hammer becomes an artifact with new, emergent functionalities. Through downward causation, the head and handle components can be improved in such a way that the hammer's functionality increases.

Example 2: A mixture of tin and copper produces the alloy bronze, with new, emergent properties (hardness, lower melting temperature). Finding the correct ratio of tin to copper is an example of downward causation.

Example 3: Finding the right combination of chemicals to create a superconductor at high temperatures (i.e. room temperature) is a great challenge for many scientists, and chance plays a significant role.

Conversely, however, the properties of all these invented artifacts can be readily explained reductionistically.

Something similar occurs in nature: random mutations in genes occasionally, though rarely, produce a creature with new, emergent properties that increase its chances of survival in a changing ecosystem. This is how evolution works. Here too, through downward causation, new creatures can sometimes manipulate their ecosystem in such a way that their chances of survival are further increased.

In both artifacts and living beings, certain components can be crucial to the emergent end result: the end result supervenes on these components. Examples include: a construction error, a bug in a software program, an error in the genetic code, or the absence of a particular gene.

Both aspects, supervenience and the unpredictability of the emergent result, are characteristic of strong emergence (see above). (This definition, however, differs significantly from the definition in the philosophical literature.)

Planetary health

From Wikipedia, the free encyclopedia

Planetary Health is a multi- and transdisciplinary research paradigm, a science for exceptional action, and a global movement. Planetary health refers to "the health of human civilization and the state of the natural systems on which it depends." In 2015, the Rockefeller Foundation–Lancet Commission on Planetary Health launched the concept, which is currently being developed towards a new health science drawing on over 25 areas of expertise.

Background and milestones

A number of ideas and concepts can be understood as precursors to the concept of planetary health. According to Susan Prescott, the term "planetary health" emerged from the environmental and holistic health movements of the 1970s and 1980s. In 1980, Friends of the Earth expanded the World Health Organization's definition of health, stating: "health is a state of complete physical, mental, social and ecological well-being and not merely the absence of disease - personal health involves planetary health." James Lovelock created the term "Planetary Medicine" in 1986. In 1993 the Norwegian physician Per Fugelli wrote: "The patient Earth is sick. Global environmental disruptions can have serious consequences for human health. It's time for doctors to give a world diagnosis and advise on treatment." In the 1990s, a model curriculum, Terra Medicine (Planetary Medicine), was developed at the Catholic University of Eichstätt-Ingolstadt as part of the Altmühltal Agenda 21 project. In 2000, James Lovelock published his book Gaia: The Practical Science of Planetary Medicine.

Milestones

Fourteen years later, a commentary in the March 2014 issue of the medical journal The Lancet called for a movement for planetary health to transform the field of public health, which has traditionally focused on the health of human populations without necessarily considering the surrounding natural ecosystems. The proposal recognized the emerging threats to the natural and human-made systems that support humanity.

In 2015, the Rockefeller Foundation and The Lancet launched the concept with the Rockefeller Foundation–Lancet Commission on Planetary Health. The Planetary Health Alliance was founded in December 2015, by Harvard University, together with the Wildlife Conservation Society and other partner organizations. The Rockefeller Foundation Economic Council on Planetary Health at the Oxford Martin School was established on 1 June 2017 to further define the new discipline of planetary health. The open-access journal "Lancet Planetary Health" published its inaugural issue in April 2017.

The Planetary Health Education Framework, developed in 2021 by the Planetary Health Alliance, aims to guide the education of global citizens, practitioners, and professionals able and willing to address complex Planetary Health challenges. The framework also seeks to inspire all peoples across the globe to create, restore, steward, and conserve healthy ecosystems for a thriving human civilization. The framework considers five foundational domains that form the essence of Planetary Health knowledge, values, and practice: (1) interconnection with nature, (2) the Anthropocene and health, (3) equity and social justice, (4) movement building and systems change, and (5) systems thinking and complexity.

The São Paulo Declaration on Planetary Health is a multi-stakeholder call to action co-created by the global Planetary Health community at the 2021 Planetary Health Annual Meeting in São Paulo, Brazil. The declaration calls on governments, the private sector, civil society, and the general public to commit to the Great Transition to safeguard a healthy and equitable future for humanity and protect all life on Earth.

In 2022, on the occasion of the 50th anniversary of the first UN environmental conference "United Nations Conference on the Human Environment" in Stockholm 1972, the UN published the report: 'UN Conference Stockholm+50: A Healthy Planet for the Prosperity of All - Our Responsibility, Our Opportunity'.

In 2023 the Association of Faculties of Medicine of Canada published the "Academic Health Institutions' Declaration on Planetary Health," which calls on all academic health institutions throughout the world to take immediate action to halt both the negative impact of their activities on the planet's natural systems, and to institute adaptive and regenerative measures, including through advocacy. More than 40 academic health institutions have signed the declaration. These include medical schools, faculties of medicine, schools of nursing, schools of public health, and other health-related academic institutions from various countries including Canada, India, Finland, Dominican Republic, South Africa, Germany, Portugal, Indonesia, and others.

The Royal Netherlands Academy of Arts and Sciences published a comprehensive report in June 2023 on the state of planetary health research and the future research agenda, which has relevance not only for the Netherlands but also internationally (Planetary Health Advisory Report).

In April 2024, the Global Planetary Health Roadmap and Action Plan was created by over 100 members of a worldwide community to guide a path forward for Planetary Health, building on the principles and call to action of the 2021 São Paulo Declaration on Planetary Health. The roadmap encompasses key domains such as governance, education, business, and communications, providing a strategic framework to nurture this growing movement and safeguard the health and well-being of all life on Earth.

Research paradigms and agenda

Drawing from the definition of health – "a state of complete physical, mental and social wellbeing and not merely the absence of disease or infirmity" – as well as principles articulated in the preamble of the constitution of the World Health Organization, The Lancet Commission report elaborated that planetary health refers to the "achievement of the highest attainable standard of health, wellbeing, and equity worldwide through judicious attention to the human systems – political, economic, and social – that shape the future of humanity and the Earth's natural systems that define the safe environmental limits within which humanity can flourish."

The report laid down the overarching principles guiding the idea of planetary health. One is that human health depends on "flourishing natural systems and the wise stewardship of those natural systems". Human activities, such as energy generation and food production, have led to substantial global effects on the Earth's systems, prompting scientists to refer to the modern times as the Anthropocene.

A group of Earth system and environmental scientists led by Johan Rockström from the Stockholm Resilience Centre proposed the concept of nine planetary boundaries within which humanity can continue to develop and thrive for generations to come. According to a 2024 update, six of the planetary boundaries – climate change, biosphere integrity, biogeochemical flows, land-system change, freshwater use, and novel entities – had already been exceeded. A seventh boundary, ocean acidification, is approaching its threshold.

The Rockefeller Foundation–Lancet Commission on Planetary Health report concluded that urgent and transformative actions are needed to protect present and future generations. One important area which required immediate attention was the system of governance and organization of human knowledge, which was deemed inadequate to address the threats to planetary health.

The report made several overarching recommendations. One was to improve governance to aid the integration of social, economic, and environmental policies and for the creation, synthesis, and application of interdisciplinary knowledge. The authors called for solutions based on the redefinition of prosperity to focus on the enhancement of quality of life and delivery of improved health for all, together with respect for the integrity of natural systems.

International research agenda for planetary health

In June 2023, the Royal Netherlands Academy of Arts and Sciences presented its planetary health report, Planetary Health: An emerging field to be developed, based on a two-year consultative process. The report identified many knowledge gaps in the field of planetary health: a review of the literature and subsequent consultation with experts resulted in a longlist of more than one hundred specific gaps. Knowledge of the impacts of global environmental change on human health is incomplete, the pathways involved are poorly understood, the effectiveness of mitigation and adaptation measures is still unclear, and it is uncertain how timely policy and behaviour change can be realised. The Academy concluded that "filling all Planetary Health knowledge gaps requires an international collaborative effort in research funding" and will cooperate with international partners and 'umbrella academies' (such as EASAC, FEAM and ALLEA) on how to take this agenda forward.

In 2025 the United Nations Environment Programme (UNEP) report GEO-7 found that investing in planetary health can deliver trillions in additional global GDP, avoid millions of deaths and reduce poverty and hunger.

Issues

Nutrition and diet are important contributors to and indicators of planetary health. Diets, agriculture, and technology must adjust to sustain a projected population of more than 9 billion while reducing environmental harms such as food waste and carbon-intensive diets. A focus of planetary health research is nutritional solutions that are sustainable for both the human species and the environment, and the generation of scientific research and political will to create and implement such solutions. In January 2019, an international commission created the planetary health diet.

Planetary health aims to seek out further solutions to global human and environmental sustainability through collaboration and research across all sectors, including the economy, energy, agriculture, water, and health. Biodiversity loss, exposure to pollutants, climate change, and fuel consumption all threaten human and environmental health, and are therefore foci of the field. A number of researchers think that it is humanity's destruction of biodiversity and invasion of wild landscapes that create the conditions for malaria and for new diseases such as COVID-19. Some propose incorporating concern for the impact of digital technology into planetary health and health promotion, including the impact of generative AI on climate, biodiversity, and pollution.

Planetary Health Alliance

The Planetary Health Alliance is an informal global consortium of over 470 universities, non-governmental organizations, government entities, and research institutes with over 20,000 newsletter subscribers.

Several PHA regional hubs function as locally rooted communities that bring PHA members together in geographic clusters to collaboratively advance planetary health research, education, policy, and outreach relevant to specific local contexts.

The alliance's mission is "to promote, mobilize, and lead an inclusive, transdisciplinary field of Planetary Health and its diverse science, stories, solutions, and communities to achieve a comprehensive shift in how human beings interact with each other and Nature, in order to secure a livable future for humanity and the rest of life on Earth." Since November 2023, the secretariat of PHA is based at Johns Hopkins University alongside the Johns Hopkins Institute for Planetary Health.

Regional Hubs


While additional hubs are under development, the eight established Planetary Health Regional Hubs are:

  • Caribbean
  • East Africa
  • Europe
  • Japan
  • Latin America
  • Oceania
  • South & Southeast Asia
  • West Africa

In 2022, the inaugural Planetary Health Europe Regional Hub meeting was held in Amsterdam, with 72 institutions represented. The meeting was organized by the Planetary Health Alliance, the European Environment and Sustainable Development Advisory Councils Network (EEAC Network), and Natura Artis Magistra (ARTIS). The PHA Europe Secretariat is located in the Netherlands and is jointly coordinated by Maastricht University and the University Medical Center Utrecht (UMC Utrecht).

Next Generation Network

The Planetary Health Next Generation Network is composed of students and next-generation leaders worldwide who are dedicated to advancing the emerging field of Planetary Health through local community efforts, educational events, and research projects. This open-access network brings together the Planetary Health Campus Ambassadors (PHCAs), Planetary Health student club leaders and members, former and current Travel Scholars to Planetary Health Annual Meetings, and any youths who would like to engage with the Planetary Health community. The Planetary Health Alliance staff team and Impact Fellows work to support these diverse efforts by providing introductory resources, workshop materials, mentorship opportunities, and community-building platforms.

Campus Ambassador program

The Planetary Health Campus Ambassador program formally recognizes next-generation leaders in planetary health on academic campuses and within the international planetary health community at large. During the program, ambassadors build their planetary health network and gain leadership and organizational skills with the support of their program cohort, staff, fellows, and alliance members. Ambassadors are empowered to take leadership on their campus and beyond, to educate their community, and to facilitate collaborations between existing disciplines and initiatives within the scope of human health and environmental change. They also become part of the program's broader Next Generation Network, composed of individuals from a variety of academic and cultural backgrounds, career stages, and interests, and gain access to leadership opportunities within other initiatives, such as the global Planetary Health Annual Meeting, Planetary Health Regional Hubs, Clinicians for Planetary Health, and various education projects.

Annual meeting

The Planetary Health Annual Meeting, convened by the Planetary Health Alliance, is an international conference series established in 2017, serving as a global forum for advancing the field of Planetary Health. First launched at Harvard University, these meetings have evolved into comprehensive gatherings connecting diverse stakeholders including scientists, policymakers, healthcare professionals, educators, students, and community leaders from over 130 countries. The meetings rotate globally, having been hosted in the United States (Harvard University 2017, 2022; Stanford University 2019), Scotland (University of Edinburgh 2018), Brazil (University of São Paulo 2021, virtual), and Malaysia (Sunway University 2024), reflecting a commitment to geographic and cultural diversity in addressing planetary health challenges. A meeting is planned for October 2025 in the Netherlands (Erasmus University).

The meetings consistently focus on planetary health themes, including climate change, biodiversity loss, food systems transformation, health equity, and education. Each meeting has produced significant outcomes that have shaped the field: from establishing foundational frameworks in the early meetings to the São Paulo Declaration on Planetary Health (2021) and the Kuala Lumpur Call to Action (2024), accompanied by the launch of the global Planetary Health Roadmap and Action Plan. Through plenary sessions, research presentations, workshops, and community engagement activities, these meetings have been instrumental in building capacity, fostering collaboration, and driving actionable solutions for planetary health challenges.

Comparison with other fields

Planetary health is considered a response to existing fields and paradigms such as public health, environmental health, ecohealth, One Health, and international health.

While there may be competing definitions of global health, it is loosely defined as the health of populations in a global context, a response to the cross-border movement of health drivers as well as risks, and an improvement over the older concept of international health with its new emphasis on achieving equity in health among all people. Some scholars hold that advocacy of planetary health amounts to an over-expansion and totalization of health.

The editor in chief of The Lancet, Richard Horton, wrote in a 2014 special issue of The Economist on planetary health that global health was no longer able to truly meet the demands which societies face, as it was still too narrow to explain and illuminate some pressing challenges: "Global health does not fully take into account the natural foundation on which humans live – the planet itself. Nor does it factor in the force and fragility of human civilizations."

In 2015, Judith Rodin, president of the Rockefeller Foundation, declared planetary health as a new discipline in global health.

In 2023, the US Bureau of Labor Statistics updated the definition of environmental engineering as using "engineering disciplines in developing solutions to problems of planetary health."

In September 2024, the Consortium of Universities for Global Health (CUGH) put forth a set of planetary health learning objectives, noting "the knowledge of planetary health science, interventions, and communication that is essential for future global health professionals." CUGH included planetary health in the updated edition of their Global Health Competencies Toolkit.

In 2026, Daniel Oerther proposed that the profession of engineering modify the paramountcy clause to "hold paramount the health, safety, and welfare of the public and the planet," in recognition of the interconnectedness of all life.

Multiverse

From Wikipedia, the free encyclopedia

The multiverse is the hypothetical set of all universes. Together, these universes are presumed to comprise everything that exists: the entirety of space, time, matter, energy, information, and the physical laws and constants that describe them. The different universes within the multiverse are called "parallel universes", "flat universes", "other universes", "alternate universes", "multiple universes", "plane universes", "parent and child universes", "many universes", or "many worlds". One common assumption is that the multiverse is a "patchwork quilt of separate universes all bound by the same laws of physics."

The concept of multiple universes, or a multiverse, has been discussed throughout history. It has evolved and has been debated in various fields, including cosmology, physics, and philosophy. Some physicists have argued that the multiverse is a philosophical notion rather than a scientific hypothesis, as it cannot be empirically falsified. In recent years, there have been proponents and skeptics of multiverse theories within the physics community. Although some scientists have analyzed data in search of evidence for other universes, no statistically significant evidence has been found. Critics argue that the multiverse concept lacks testability and falsifiability, which are essential for scientific inquiry, and that it raises unresolved metaphysical issues.

Max Tegmark and Brian Greene have proposed different classification schemes for multiverses and universes. Tegmark's four-level classification consists of Level I: an extension of our universe, Level II: universes with different physical constants, Level III: many-worlds interpretation of quantum mechanics, and Level IV: ultimate ensemble. Brian Greene's nine types of multiverses include quilted, inflationary, brane, cyclic, landscape, quantum, holographic, simulated, and ultimate. The ideas explore various dimensions of space, physical laws, and mathematical structures to explain the existence and interactions of multiple universes. Some other multiverse concepts include twin-world models, cyclic theories, M-theory, and black-hole cosmology.

The anthropic principle suggests that the existence of a multitude of universes, each with different physical laws, could explain the asserted appearance of fine-tuning of our own universe for conscious life. The weak anthropic principle posits that we exist in one of the few universes that support life. Debates around Occam's razor and the simplicity of the multiverse versus a single universe arise, with proponents like Max Tegmark arguing that the multiverse is simpler and more elegant. The many-worlds interpretation of quantum mechanics and modal realism, the belief that all possible worlds exist and are as real as our world, are also subjects of debate in the context of the anthropic principle.

History of the concept

According to some, the idea of infinite worlds was first suggested by the pre-Socratic Greek philosopher Anaximander in the sixth century BCE. However, there is debate as to whether he believed in multiple worlds, and if he did, whether those worlds were co-existent or successive.

The first figures to whom historians can definitively attribute the concept of innumerable worlds are the Ancient Greek Atomists, beginning with Leucippus and Democritus in the 5th century BCE, followed by Epicurus (341–270 BCE) and the Roman Epicurean Lucretius (1st century BCE). In the third century BCE, the philosopher Chrysippus suggested that the world eternally expired and regenerated, effectively suggesting the existence of multiple universes across time. The concept of multiple universes became more defined in the Middle Ages. In the Renaissance, Giordano Bruno (1548–1600) expressed the concept of infinite worlds.

The American philosopher and psychologist William James used the term "multiverse" in 1895, but in a different context.

The concept first appeared in the modern scientific context in the course of the debate between Boltzmann and Zermelo in 1895.

In Dublin in 1952, Erwin Schrödinger gave a lecture in which he jocularly warned his audience that what he was about to say might "seem lunatic". He said that when his equations seemed to describe several different histories, these were "not alternatives, but all really happen simultaneously". This sort of duality is called "superposition".

Search for evidence

In the 1990s, after works of fiction about the concept gained popularity, scientific discussions about the multiverse and journal articles about it gained prominence.

Around 2010, scientists such as Stephen M. Feeney analyzed Wilkinson Microwave Anisotropy Probe (WMAP) data and claimed to find evidence suggesting that this universe collided with other (parallel) universes in the distant past. However, a more thorough analysis of data from the WMAP and from the Planck satellite, which has a resolution three times higher than WMAP, did not reveal any statistically significant evidence of such a bubble universe collision. In addition, there was no evidence of any gravitational pull of other universes on ours.

In 2015, astrophysicist Ranga-Ram Chary reported possible evidence of alternate or parallel universes from observations of the period immediately after the Big Bang, although the claim remains a matter of debate among physicists. After analyzing the cosmic radiation spectrum, Chary found a signal 4,500 times brighter than it should have been, based on the number of protons and electrons scientists believe existed in the very early universe. This signal – an emission line that arose from the formation of atoms during the era of recombination – is more consistent with a universe whose ratio of matter particles to photons is about 65 times greater than our own. There is a 30% chance that this signal is noise, and not really a signal at all; however, it is also possible that it exists because a parallel universe dumped some of its matter particles into our universe. If additional protons and electrons had been added to our universe during recombination, more atoms would have formed, more photons would have been emitted during their formation, and the signature line that arose from all of these emissions would be greatly enhanced. Chary said:

Many other regions beyond our observable universe would exist with each such region governed by a different set of physical parameters than the ones we have measured for our universe.

— Ranga-Ram Chary, USA Today

Chary also noted:

Unusual claims like evidence for alternate universes require a very high burden of proof.

— Ranga-Ram Chary, "Universe Today"

The signature that Chary has isolated may be a consequence of incoming light from distant galaxies, or even from clouds of dust surrounding our own galaxy.

Proponents and skeptics

Modern proponents of one or more of the multiverse hypotheses include Lee Smolin, Don Page, Brian Greene, Max Tegmark, Alan Guth, Andrei Linde, Michio Kaku, David Deutsch, Leonard Susskind, Alexander Vilenkin, Yasunori Nomura, Raj Pathria, Laura Mersini-Houghton, Neil deGrasse Tyson, Sean Carroll, and Stephen Hawking.

Scientists who are generally skeptical of the concept of a multiverse or popular multiverse hypotheses include Sabine Hossenfelder, David Gross, Paul Steinhardt, Anna Ijjas, Abraham Loeb, David Spergel, Neil Turok, Viatcheslav Mukhanov, Michael S. Turner, Roger Penrose, George Ellis, Joe Silk, Carlo Rovelli, Adam Frank, Marcelo Gleiser, Jim Baggott, and Paul Davies.

Arguments against multiverse hypotheses

In his 2003 New York Times opinion piece, "A Brief History of the Multiverse", author and cosmologist Paul Davies offered a variety of arguments that multiverse hypotheses are non-scientific:

For a start, how is the existence of the other universes to be tested? To be sure, all cosmologists accept that there are some regions of the universe that lie beyond the reach of our telescopes, but somewhere on the slippery slope between that and the idea that there is an infinite number of universes, credibility reaches a limit. As one slips down that slope, more and more must be accepted on faith, and less and less is open to scientific verification. Extreme multiverse explanations are therefore reminiscent of theological discussions. Indeed, invoking an infinity of unseen universes to explain the unusual features of the one we do see is just as ad hoc as invoking an unseen Creator. The multiverse theory may be dressed up in scientific language, but in essence, it requires the same leap of faith.

— Paul Davies, "A Brief History of the Multiverse", The New York Times

George Ellis, writing in August 2011, provided a criticism of the multiverse, and pointed out that it is not a traditional scientific theory. He accepts that the multiverse is thought to exist far beyond the cosmological horizon. He emphasized that it is theorized to be so far away that it is unlikely any evidence will ever be found. Ellis also explained that some theorists do not believe the lack of empirical testability and falsifiability is a major concern, but he is opposed to that line of thinking:

Many physicists who talk about the multiverse, especially advocates of the string landscape, do not care much about parallel universes per se. For them, objections to the multiverse as a concept are unimportant. Their theories live or die based on internal consistency and, one hopes, eventual laboratory testing.

Ellis says that scientists have proposed the idea of the multiverse as a way of explaining the nature of existence. He points out that it ultimately leaves those questions unresolved because it is a metaphysical issue that cannot be resolved by empirical science. He argues that observational testing is at the core of science and should not be abandoned:

As skeptical as I am, I think the contemplation of the multiverse is an excellent opportunity to reflect on the nature of science and on the ultimate nature of existence: why we are here. ... In looking at this concept, we need an open mind, though not too open. It is a delicate path to tread. Parallel universes may or may not exist; the case is unproved. We are going to have to live with that uncertainty. Nothing is wrong with scientifically based philosophical speculation, which is what multiverse proposals are. But we should name it for what it is.

— George Ellis, "Does the Multiverse Really Exist?", Scientific American

Philosopher Philip Goff argues that the inference of a multiverse to explain the apparent fine-tuning of the universe is an example of Inverse Gambler's Fallacy.

Stoeger, Ellis, and Kircher note that in a true multiverse theory, "the universes are then completely disjoint and nothing that happens in any one of them is causally linked to what happens in any other one. This lack of any causal connection in such multiverses really places them beyond any scientific support".

In May 2020, astrophysicist Ethan Siegel expressed criticism in a Forbes blog post that parallel universes would have to remain a science fiction dream for the time being, based on the scientific evidence available to us.

Scientific American contributor John Horgan also argues against the idea of a multiverse, claiming that multiverse theories are "bad for science".

Types

Max Tegmark and Brian Greene have devised classification schemes for the various theoretical types of multiverses and universes that they might comprise.

Max Tegmark's four levels

Cosmologist Max Tegmark has provided a taxonomy of universes beyond the familiar observable universe. The four levels of Tegmark's classification are arranged such that subsequent levels can be understood to encompass and expand upon previous levels. They are briefly described below.

Level I: An extension of our universe

A prediction of cosmic inflation is the existence of an infinite ergodic universe, which, being infinite, must contain Hubble volumes realizing all initial conditions.

Accordingly, an infinite universe will contain an infinite number of Hubble volumes, all having the same physical laws and physical constants. In regard to configurations such as the distribution of matter, almost all will differ from our Hubble volume. However, because there are infinitely many, far beyond the cosmological horizon, there will eventually be Hubble volumes with similar, and even identical, configurations. Tegmark estimates that a volume identical to ours should be about 10^(10^115) meters away from us.

Given infinite space, there would be an infinite number of Hubble volumes identical to ours in the universe. This follows directly from the cosmological principle, wherein it is assumed that our Hubble volume is not special or unique.
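The double-exponential scale of this estimate can be motivated by a rough counting argument (a heuristic sketch only, not Tegmark's full derivation; the figure of roughly 10^115 protons per Hubble volume is the bound commonly quoted in his work):

```latex
% Heuristic counting behind the Level I estimate (sketch only).
% A Hubble volume at T \lesssim 10^{8}\,\mathrm{K} holds at most about
% 10^{115} protons; treating each as either present or absent bounds the
% number of distinct configurations:
N \;\lesssim\; 2^{10^{115}} \;\approx\; 10^{\,0.3\times 10^{115}} .
% By the pigeonhole principle, among more than N Hubble volumes some
% configuration must repeat, so an identical volume is expected within
% about N^{1/3} Hubble radii. Neither the cube root nor the conversion
% to meters (one Hubble radius is of order 10^{26}\,\mathrm{m})
% perceptibly changes the double exponent, leaving the quoted scale
d \;\sim\; 10^{10^{115}}\,\mathrm{m}.
```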

Level II: Universes with different physical constants

In the eternal inflation theory, which is a variant of the cosmic inflation theory, the multiverse or space as a whole is stretching and will continue doing so forever, but some regions of space stop stretching and form distinct bubbles (like gas pockets in a loaf of rising bread). Such bubbles are embryonic level I multiverses.

Different bubbles may experience different spontaneous symmetry breaking, which results in different properties, such as different physical constants.

Level II also includes John Archibald Wheeler's oscillatory universe theory and Lee Smolin's fecund universes theory.

Level III: Many-worlds interpretation of quantum mechanics

Schrödinger's cat in the many-worlds interpretation, where a branching of the universe occurs through a superposition of two quantum mechanical states

Hugh Everett III's many-worlds interpretation (MWI) is one of several mainstream interpretations of quantum mechanics.

In brief, one aspect of quantum mechanics is that certain observations cannot be predicted absolutely. Instead, there is a range of possible observations, each with a different probability. According to the MWI, each of these possible observations corresponds to a different "world" within the Universal wavefunction, with each world as real as ours. Suppose a six-sided die is thrown and that the result of the throw corresponds to a quantum-mechanical observable. All six possible ways the die can fall correspond to six different worlds. In the case of the Schrödinger's cat thought experiment, both outcomes would be "real" in at least one "world".
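The branch-counting in the die example can be sketched as a toy calculation (an illustrative sketch only, not part of the MWI formalism; the helper `branch_weights` is invented here purely for illustration):

```python
# Toy sketch of many-worlds branch counting (illustrative only):
# a measurement with n equally likely outcomes corresponds to n
# branches ("worlds"), each carrying Born-rule weight 1/n.
from fractions import Fraction

def branch_weights(outcomes):
    """Map each possible outcome to the weight of its branch."""
    n = len(outcomes)
    return {o: Fraction(1, n) for o in outcomes}

# Six-sided die: six worlds, one per face, each with weight 1/6.
die_worlds = branch_weights([1, 2, 3, 4, 5, 6])
assert len(die_worlds) == 6
assert sum(die_worlds.values()) == 1  # weights over all worlds sum to 1

# Schroedinger's cat: both outcomes are realized, in different worlds.
cat_worlds = branch_weights(["alive", "dead"])
assert cat_worlds["alive"] == cat_worlds["dead"] == Fraction(1, 2)
```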

Tegmark argues that a Level III multiverse does not contain more possibilities in the Hubble volume than a Level I or Level II multiverse. In effect, all the different worlds created by "splits" in a Level III multiverse with the same physical constants can be found in some Hubble volume in a Level I multiverse. Tegmark writes that, "The only difference between Level I and Level III is where your doppelgängers reside. In Level I they live elsewhere in good old three-dimensional space. In Level III they live on another quantum branch in infinite-dimensional Hilbert space."

Similarly, all Level II bubble universes with different physical constants can, in effect, be found as "worlds" created by "splits" at the moment of spontaneous symmetry breaking in a Level III multiverse. According to Yasunori Nomura, Raphael Bousso, and Leonard Susskind, this is because global spacetime appearing in the (eternally) inflating multiverse is a redundant concept. This implies that the multiverses of Levels I, II, and III are, in fact, the same thing. This hypothesis is referred to as "Multiverse = Quantum Many Worlds". According to Yasunori Nomura, this quantum multiverse is static, and time is a simple illusion.

Another version of the many-worlds idea is H. Dieter Zeh's many-minds interpretation.

Level IV: Ultimate ensemble

The ultimate mathematical universe hypothesis is Tegmark's own hypothesis.

This level considers equally real all universes that can be described by different mathematical structures.

Tegmark writes:

Abstract mathematics is so general that any Theory Of Everything (TOE) which is definable in purely formal terms (independent of vague human terminology) is also a mathematical structure. For instance, a TOE involving a set of different types of entities (denoted by words, say) and relations between them (denoted by additional words) is nothing but what mathematicians call a set-theoretical model, and one can generally find a formal system that it is a model of.

He argues that this "implies that any conceivable parallel universe theory can be described at Level IV" and "subsumes all other ensembles, therefore brings closure to the hierarchy of multiverses, and there cannot be, say, a Level V."

Jürgen Schmidhuber, however, says that the set of mathematical structures is not even well-defined and that it admits only universe representations describable by constructive mathematics—that is, computer programs.

Schmidhuber explicitly includes universe representations describable by non-halting programs whose output bits converge after a finite time, although the convergence time itself may not be predictable by a halting program, due to the undecidability of the halting problem. He also explicitly discusses the more restricted ensemble of quickly computable universes.

Brian Greene's nine types

The American theoretical physicist and string theorist Brian Greene discussed nine types of multiverses:

Quilted
The quilted multiverse works only in an infinite universe. With an infinite amount of space, every possible event will occur an infinite number of times. However, the speed of light prevents us from being aware of these other identical areas.
Inflationary
The inflationary multiverse is composed of various pockets in which inflation fields collapse and form new universes.
Brane
The brane multiverse version postulates that our entire universe exists on a membrane (brane) which floats in a higher dimension or "bulk". In this bulk, there are other membranes with their own universes. These universes can interact with one another, and when they collide, the violence and energy produced is more than enough to give rise to a Big Bang. The branes float or drift near each other in the bulk, and every few trillion years, attracted by gravity or some other force we do not understand, collide and bang into each other. This repeated contact gives rise to multiple or "cyclic" Big Bangs. This particular hypothesis falls under the string theory umbrella as it requires extra spatial dimensions.
Cyclic
The cyclic multiverse has multiple branes that have collided, causing Big Bangs. The universes bounce back and pass through time until they are pulled back together and again collide, destroying the old contents and creating them anew.
Landscape
The landscape multiverse relies on string theory's Calabi–Yau spaces. Quantum fluctuations drop the shapes to a lower energy level, creating a pocket with a set of laws different from that of the surrounding space.
Quantum
The quantum multiverse creates a new universe when a diversion in events occurs, as in the many-worlds interpretation of quantum mechanics.
Holographic
The holographic multiverse is derived from the theory that the surface area of a space can encode the contents of the volume of the region.
Simulated
The simulated multiverse exists on complex computer systems that simulate entire universes. A related hypothesis, as put forward as a possibility by astronomer Avi Loeb, is that universes may be creatable in laboratories of advanced technological civilizations who have a theory of everything. Other related hypotheses include brain in a vat-type scenarios where the perceived universe is either simulated in a low-resource way or not perceived directly by the virtual/simulated inhabitant species.
Ultimate
The ultimate multiverse contains every mathematically possible universe under different laws of physics.

Twin-world models

Concept of a twin universe, with the beginning of time in the middle

There are models of two related universes that attempt, for example, to explain the baryon asymmetry – why there was more matter than antimatter at the beginning – with a mirror anti-universe. One two-universe cosmological model could explain the Hubble constant (H0) tension via interactions between the two worlds; the "mirror world" would contain copies of all existing fundamental particles. Another twin/pair-world or "bi-world" cosmology has been shown, theoretically, to be able to solve the cosmological constant (Λ) problem, closely related to dark energy: two interacting worlds, each with a large Λ, could result in a small shared effective Λ.

Cyclic theories

In several theories, there is a series – in some cases infinite – of self-sustaining cycles, typically punctuated by Big Crunches (or Big Bounces). The respective universes do not exist at once, however, but form one after another in sequence, with key natural constituents potentially varying between universes (see § Anthropic principle).

M-theory

A multiverse of a somewhat different kind has been envisaged within string theory and its higher-dimensional extension, M-theory.

These theories require the presence of 10 or 11 spacetime dimensions, respectively. The extra six or seven dimensions may either be compactified on a very small scale, or our universe may simply be localized on a dynamical (3+1)-dimensional object, a D3-brane. This opens up the possibility that there are other branes which could support other universes.

Black-hole cosmology

Black-hole cosmology is a cosmological model in which the observable universe is the interior of a black hole existing as one of possibly many universes inside a larger universe. This includes the theory of white holes, which are on the opposite side of space-time.

Anthropic principle

The concept of other universes has been proposed to explain how our own universe appears to be fine-tuned for conscious life as we experience it.

If there were a large (possibly infinite) number of universes, each with possibly different physical laws (or different fundamental physical constants), then some of these universes (even if very few) would have the combination of laws and fundamental parameters that are suitable for the development of matter, astronomical structures, elemental diversity, stars, and planets that can exist long enough for life to emerge and evolve.

The weak anthropic principle could then be applied to conclude that we (as conscious beings) would exist only in one of those few universes that happened to be finely tuned, permitting the existence of life with developed consciousness. Thus, while the probability that any particular universe has the requisite conditions for life (as we understand it) might be extremely small, those conditions do not require intelligent design as an explanation for a universe that promotes our existence.

An early form of this reasoning is evident in Arthur Schopenhauer's 1844 work "Von der Nichtigkeit und dem Leiden des Lebens" ("On the Vanity and Suffering of Life"), where he argues that our world must be the worst of all possible worlds, because if it were significantly worse in any respect it could not continue to exist.

Occam's razor

Proponents and critics disagree about how to apply Occam's razor. Critics argue that to postulate an almost infinite number of unobservable universes, just to explain our own universe, is contrary to Occam's razor. However, proponents argue that in terms of Kolmogorov complexity the proposed multiverse is simpler than a single idiosyncratic universe.

For example, multiverse proponent Max Tegmark argues:

[A]n entire ensemble is often much simpler than one of its members. This principle can be stated more formally using the notion of algorithmic information content. The algorithmic information content in a number is, roughly speaking, the length of the shortest computer program that will produce that number as output. For example, consider the set of all integers. Which is simpler, the whole set or just one number? Naively, you might think that a single number is simpler, but the entire set can be generated by quite a trivial computer program, whereas a single number can be hugely long. Therefore, the whole set is actually simpler... (Similarly), the higher-level multiverses are simpler. Going from our universe to the Level I multiverse eliminates the need to specify initial conditions, upgrading to Level II eliminates the need to specify physical constants, and the Level IV multiverse eliminates the need to specify anything at all... A common feature of all four multiverse levels is that the simplest and arguably most elegant theory involves parallel universes by default. To deny the existence of those universes, one needs to complicate the theory by adding experimentally unsupported processes and ad hoc postulates: finite space, wave function collapse and ontological asymmetry. Our judgment therefore comes down to which we find more wasteful and inelegant: many worlds or many words. Perhaps we will gradually get used to the weird ways of our cosmos and find its strangeness to be part of its charm.

— Max Tegmark
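
Tegmark's algorithmic-information argument can be illustrated with a toy sketch (an editorial illustration, not part of the quoted text): a program that generates every natural number is only a few lines long, whereas a typical single large number admits no description much shorter than its own digit string.

```python
import itertools
import random

def all_naturals():
    """A few-line program whose output is the entire set of natural numbers."""
    n = 0
    while True:
        yield n
        n += 1

# The whole (infinite) set is specified by the short program above:
print(list(itertools.islice(all_naturals(), 5)))  # [0, 1, 2, 3, 4]

# By contrast, a "typical" single large number has no program much shorter
# than its own digit string, so its algorithmic information content is high:
typical = random.getrandbits(512)  # an arbitrary 512-bit integer
print(f"writing this one number takes {len(str(typical))} decimal digits")
```

In Kolmogorov-complexity terms, the generator's description length is a small constant, while the description length of a random n-bit number grows with n – the sense in which "the whole set is actually simpler" than one of its members.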

Possible worlds and real worlds

In any given set of possible universes – e.g. in terms of histories or variables of nature – not all may ever be realized, and some may be realized many times. For example, over infinite time there could, in some potential theories, be infinitely many universes, but only a relatively small number in which humanity could exist, and only one in which it ever does exist (with a unique history). It has been suggested that a universe that "contains life, in the form it has on Earth, is in a certain sense radically non-ergodic, in that the vast majority of possible organisms will never be realized". On the other hand, some scientists, theories and popular works conceive of a multiverse in which the universes are so similar that humanity exists in many equally real separate universes, but with varying histories.

There is a debate about whether the other worlds are real in the many-worlds interpretation (MWI) of quantum mechanics. In Quantum Darwinism one does not need to adopt an MWI in which all of the branches are equally real.

Possible worlds are a way of explaining probability and hypothetical statements. Some philosophers, such as David Lewis, posit that all possible worlds exist and that they are just as real as the world we live in. This position is known as modal realism.
