Science policy thus deals with the entire domain of issues that
involve science. A large and complex web of factors influences the
development of science and engineering that includes government science
policymakers, private firms (including both national and multi-national
firms), social movements, media, non-governmental organizations,
universities, and other research institutions. In addition, science
policy is increasingly international as defined by the global operations
of firms and research institutions as well as by the collaborative
networks of non-governmental organizations and by the nature of
scientific inquiry itself.
State policy has influenced the funding of public works and science for thousands of years, dating at least from the time of the Mohists, who inspired the study of logic during the period of the Hundred Schools of Thought, and the study of defensive fortifications during the Warring States period
in China. General levies of labor and grain were collected to fund
great public works in China, including the accumulation of grain for
distribution in times of famine, for the building of levees to control flooding by the great rivers of
China, for the building of canals and locks to connect rivers of China,
some of which flowed in opposite directions to each other, and for the building of bridges across these rivers. These projects required a civil service of scholars, some of whom demonstrated great mastery of hydraulics.
In Italy, Galileo observed that individual taxation of minute amounts could yield large sums for the State, which could then fund his
research on the trajectory of cannonballs, noting that "each individual
soldier was being paid from coin collected by a general tax of pennies
and farthings, while even a million of gold would not suffice to pay the
entire army."
In Great Britain, Lord Chancellor Sir Francis Bacon
had a formative effect on science policy with his identification of
"experiments of ... light, more penetrating into nature [than what
others know]", which today we call the crucial experiment. Governmental approval of the Royal Society recognized a scientific community which exists to this day. British prizes for research spurred the development of an accurate, portable chronometer, which directly enabled reliable navigation and sailing on the high seas, and also funded Babbage's computer.
Public policy can directly affect the funding of capital equipment and of the intellectual infrastructure for industrial research by providing tax incentives to organizations that fund research. Vannevar Bush, director of the Office of Scientific Research and Development for the U.S. government, wrote in July 1945 that "Science is a proper concern of government". Bush directed the forerunner of the National Science Foundation, and his writings directly inspired researchers to invent the hyperlink and the computer mouse. The DARPA initiative to support computing was the impetus for the Internet Protocol stack. Scientific consortia such as CERN for high-energy physics have a commitment to public knowledge, and access to this public knowledge in physics led directly to CERN's sponsorship of the development of the World Wide Web and of standard Internet access for all.
Philosophies of science policy
Basic versus applied research
The programs that are funded are often divided into four basic categories: basic research, applied research, development, and facilities and equipment. Translational research is a newer concept that seeks to bridge the gap between basic science and practical applications.
Basic science attempts to stimulate breakthroughs. Breakthroughs
often lead to an explosion of new technologies and approaches. Once the
basic result is developed, it is widely published; however, conversion
into a practical product is left for the free market. However, many
governments have developed risk-taking research and development
organizations to take basic theoretical research over the edge into
practical engineering. In the U.S., this function is performed by DARPA.
In contrast, technology development is a policy in which
engineering, the application of science, is supported rather than basic
science. The emphasis is usually given to projects that increase important strategic or commercial engineering knowledge.The most extremesuccess story is undoubtedly the Manhattan Project that developed nuclear weapons. Another remarkable success story was the "X-vehicle" studies that gave the US a lasting lead in aerospace technologies.
These exemplify two disparate approaches: The Manhattan Project
was huge, and spent freely on the most risky alternative approaches. The
project members believed that failure would result in their enslavement
or destruction by Nazi Germany.
Each X-project built an aircraft whose only purpose was to develop a
particular technology. The plan was to build a few cheap aircraft of
each type, fly a test series, often to the destruction of an aircraft,
and never design an aircraft for a practical mission. The only mission
was technology development.[11]
A number of high-profile technology developments have failed. The US Space Shuttle
failed to meet its cost or flight schedule goals. Most observers explain the project as over-constrained: its cost goals were too aggressive, and its technology and mission were underpowered and ill-defined.
The Japanese fifth generation computer systems project met every technological goal, but failed to produce commercially important artificial intelligence. Many observers believe that the Japanese tried to force engineering beyond the available science by brute investment. Half the amount, spent on basic research instead, might have produced ten times the result.
Utilitarian versus monumental science policy
Utilitarian policies prioritize scientific projects that significantly reduce suffering
for larger numbers of people. This approach would mainly consider the
numbers of people that can be helped by a research policy. Research is
more likely to be supported when it costs less and has greater benefits.
Utilitarian research often pursues incremental improvements, which are more commercially viable, rather than dramatic advances in knowledge or breakthrough solutions.
In contrast, monumental science is a policy in which science is
supported for the sake of a greater understanding of the universe,
rather than for specific short-term practical goals. This designation
covers both large projects, often with large facilities, and smaller
research that does not have obvious practical applications and is often
overlooked. While these projects may not always have obvious practical
outcomes, they provide education of future scientists, and advancement
of scientific knowledge of lasting worth about the basic building blocks
of science.
Practical outcomes do result from many of these "monumental"
science programs. Sometimes these practical outcomes are foreseeable
and sometimes they are not. A classic example of a monumental science
program focused towards a practical outcome is the Manhattan Project. An example of a monumental science program that produced an unexpected practical outcome is the laser.
Coherent light, the principle behind lasing, was first predicted by
Einstein in 1916, but not created until 1954 by Charles H. Townes with
the maser.
The breakthrough with the maser led to the creation of the laser in
1960 by Theodore Maiman. The delay between the theory of coherent light
and the production of the laser was partially due to the assumption
that it would be of no practical use.
Scholastic conservation
This
policy approach prioritizes efficiently teaching all available science
to those who can use it, rather than investing in new science. In
particular, the goal is not to lose any existing knowledge, and
to find new practical ways to apply the available knowledge. The
classic success stories of this method occurred in the 19th century U.S.
land-grant universities, which established a strong tradition of
research in practical agricultural and engineering methods. More
recently, the Green Revolution
prevented mass famine over the last thirty years. The focus,
unsurprisingly, is usually on developing a robust curriculum and
inexpensive practical methods to meet local needs.
By country
Most developed countries have a specific national body overseeing national science (including technology and innovation) policy. Many developing countries follow suit. Many governments of developed countries provide considerable funds (primarily to universities) for scientific research (in fields such as physics and geology) as well as for social science research (in fields such as economics and history).
Much of this is not intended to provide concrete results that may be
commercialisable, although research in scientific fields may lead to
results that have such potential. Most university research is aimed at
gaining publication in peer-reviewed academic journals.
A funding body is an organisation that provides research funding in the form of research grants or scholarships.
Research councils are funding bodies that are government-funded
agencies engaged in the support of research in different disciplines and
postgraduate funding. Funding from research councils is typically
competitive. As a general rule, more funding is available in science and
engineering disciplines than in the arts and social sciences.
Brazil
In Brazil, two important research agencies are the National Council for Scientific and Technological Development
(CNPq, Portuguese: Conselho Nacional de Desenvolvimento Científico e
Tecnológico), an organization of the Brazilian federal government under
the Ministry of Science and Technology, and São Paulo Research Foundation
(FAPESP, Portuguese: Fundação de Amparo à Pesquisa do Estado de São
Paulo), a public foundation located in the state of São Paulo, Brazil.
European Union
The science policy of the European Union is carried out through the European Research Area,
a system which integrates the scientific resources of member nations
and acts as a "common market" for research and innovation. The European
Union's executive body, the European Commission, has a Directorate-General for Research, which is responsible for the Union's science policy. In addition, the Joint Research Centre provides independent scientific and technical advice to the European Commission and Member States of the European Union (EU) in support of EU policies. There is also the recently established European Research Council, the first European Union funding body set up to support investigator-driven research.
The European environmental research and innovation policy
addresses global challenges of pivotal importance for the well-being of
European citizens within the context of sustainable development and
environmental protection. Research and innovation in Europe are financially supported by the programme Horizon 2020, which is also open to participation worldwide.
India
Research
funding by the Government of India comes from a number of sources. For
basic science and technology research, these include the Council for
Scientific and Industrial Research (CSIR), Department of Science and
Technology (DST), and University Grants Commission (UGC). For medical
research, these include the Indian Council for Medical Research (ICMR),
CSIR, DST and Department of Biotechnology (DBT). For applied research,
these include the CSIR, DBT and Science and Engineering Research Council
(SERC).
Other funding authorities are the Defence Research Development
Organisation (DRDO), the Indian Council of Agricultural Research (ICAR),
the Indian Space Research Organisation (ISRO), the Department of Ocean
Development (DOD), the Indian Council for Social Science Research
(ICSSR), and the Ministry of Environment and Forests (MEF), among others.
Pakistan
The
Government of Pakistan has mandated that a certain percentage of gross
revenue generated by all telecom service providers be allocated to
development and research of information and communication technologies.
The National ICT R&D Fund was established in January 2007.
Russia
Under the Soviet Union, much research was routinely suppressed. Today, science in Russia is supported by both state and private funds. State funding bodies include the Russian Humanitarian Scientific Foundation (http://www.rfh.ru), the Russian Foundation for Basic Research (www.rfbr.ru), and the Russian Science Foundation (http://rscf.ru).
Sri Lanka
The Science and Technology Policy Research Division (STPRD) of the National Science Foundation (NSF), which was established as a statutory body through an Act of the Parliament of Sri Lanka, provides evidence-based recommendations for policy formulation on science, technology, and related fields, supporting the country's research and innovation ecosystem. Accordingly, the Division undertakes science, technology, and innovation policy research in areas of national importance in order to make recommendations for policy formulation. Besides the NSF, national experts, researchers, public universities, and non-governmental bodies such as the National Academy of Sciences of Sri Lanka (NASSL) also provide expert advice on policy matters to the Government.
Switzerland
Swiss research funding agencies include the Swiss National Science Foundation (SNSF), the innovation promotion agency CTI (CTI/KTI), the Ressortforschung des Bundes, and the Eidgenössische Stiftungsaufsicht.
United Kingdom
In the United Kingdom, the Haldane principle,
that decisions about what to spend research funds on should be made by
researchers rather than politicians, is still influential in research
policy. There are several university departments with a focus on science
policy, such as the Science Policy Research Unit. There are seven grant-awarding Research Councils.
United States
The United States
has a long history of government support for science and technology.
Science policy in the United States is the responsibility of many
organizations throughout the federal government. Much of the large-scale policy is made through the legislative budget process of enacting the yearly federal budget.
Further decisions are made by the various federal agencies which spend
the funds allocated by Congress, either on in-house research or by
granting funds to outside organizations and researchers.
Research funding agencies in the United States are spread among many different departments.
Determinism is the metaphysical view that all events within the universe (or multiverse) can occur only in one possible way. Deterministic theories throughout the history of philosophy have
developed from diverse and sometimes overlapping motives and
considerations. Like eternalism, determinism focuses on particular events rather than the future as a concept. Determinism is often contrasted with free will, although some philosophers argue that the two are compatible.The antonym of determinism is indeterminism, the view that events are not deterministically caused.
Historically, debates about determinism have involved many
philosophical positions and given rise to multiple varieties or
interpretations of determinism. One topic of debate concerns the scope
of determined systems. Some philosophers have maintained that the entire
universe is a single determinate system, while others identify more
limited determinate systems. Another common debate topic is whether
determinism and free will can coexist; compatibilism and incompatibilism represent the opposing sides of this debate.
Determinism should not be confused with the self-determination
of human actions by reasons, motives, and desires. Determinism concerns the interactions that affect cognitive processes in people's lives: the causes and results of what people have done, which are always bound together in cognitive processes. It assumes
that if an observer has sufficient information about an object or human
being, then such an observer might be able to predict every consequent
move of that object or human being. Determinism rarely requires that
perfect prediction be practically possible.
Varieties
Determinism may commonly refer to any of the following viewpoints:
Causal
Causal determinism, sometimes synonymous with historical determinism (a sort of path dependence), is "the idea that every event is necessitated by antecedent events and conditions together with the laws of nature." However, it is a broad enough term to consider that:
...One's
deliberations, choices, and actions will often be necessary links in
the causal chain that brings something about. In other words, even
though our deliberations, choices, and actions are themselves determined
like everything else, it is still the case, according to causal
determinism, that the occurrence or existence of yet other things
depends upon our deliberating, choosing and acting in a certain way.
Causal
determinism proposes that there is an unbroken chain of prior
occurrences stretching back to the origin of the universe. The relation
between events and the origin of the universe may not be specified.
Causal determinists believe that there is nothing in the universe that
has no cause or is self-caused.
Causal determinism has also been considered more generally as the idea
that everything that happens or exists is caused by antecedent
conditions. In the case of nomological determinism, these conditions are considered
events also, implying that the future is determined completely by
preceding events—a combination of prior states of the universe and the
laws of nature. These conditions can also be considered metaphysical in origin (such as in the case of theological determinism).
Many philosophical theories of determinism frame themselves with the idea that reality follows a sort of predetermined path.
Nomological
Nomological determinism is the most common form of causal determinism and is generally synonymous with physical determinism. This is the notion that the past and the present dictate the future
entirely and necessarily by rigid natural laws and that every occurrence
inevitably results from prior events. Nomological determinism is
sometimes illustrated by the thought experiment of Laplace's demon. Laplace
posited that an omniscient observer, knowing with infinite precision
all the positions and velocities of every particle in the universe,
could predict the future entirely. Ernest Nagel viewed determinism in terms of a physical state, declaring a theory to be deterministic if it predicts a state at other times uniquely from values at one given time.
Necessitarianism
Necessitarianism is a metaphysical principle that denies all mere possibility and maintains that there is only one possible way for the world to exist. Leucippus claimed there are no uncaused events and that everything occurs for a reason and by necessity.
Predeterminism
Predeterminism is the idea that all events are determined in advance. The concept is often argued by invoking causal determinism, implying that there is an unbroken chain of prior occurrences
stretching back to the origin of the universe. In the case of
predeterminism, this chain of events has been pre-established, and human
actions cannot interfere with the outcomes of this pre-established
chain.
Predeterminism can be categorized as a specific type of determinism when it is used to mean pre-established causal determinism. It can also be used interchangeably with causal determinism—in the context of its capacity to determine future events. However, predeterminism is often considered as independent of causal determinism.
Biological
The term predeterminism is also frequently used in the context of biology and heredity, in which case it represents a form of biological determinism, sometimes called genetic determinism. Biological determinism is the idea that all human behaviors, beliefs, and desires are fixed by human genetic nature.
Friedrich Nietzsche explained that human beings are "determined" by their bodies and are subject to their passions, impulses, and instincts.
Fatalism
Fatalism is normally distinguished from determinism, as a form of teleological determinism. Fatalism is the idea that everything is fated to happen, resulting in humans having no control over their future. Fate has arbitrary power, and does not necessarily follow any causal or deterministic laws. Types of fatalism include hard theological determinism and the idea of predestination, where there is a God
who determines all that humans will do. This may be accomplished
through either foreknowledge of their actions, achieved through omniscience, or by predetermining their actions.
Theological
Theological determinism is a form of determinism that holds that all events that happen are either preordained (i.e., predestined) to happen by a monotheistic deity, or are destined to occur given its omniscience. Two forms of theological determinism exist, referred to as strong and weak theological determinism.
Strong theological determinism is based on the concept of a creator deity
dictating all events in history: "everything that happens has been
predestined to happen by an omniscient, omnipotent divinity."
Weak theological determinism is based on the concept of divine
foreknowledge—"because God's omniscience is perfect, what God knows
about the future will inevitably happen, which means, consequently, that
the future is already fixed." There exist slight variations on this categorization, however. Some
claim either that theological determinism requires predestination of all
events and outcomes by the divinity—i.e., they do not classify the
weaker version as theological determinism unless libertarian free will is assumed to be denied as a consequence—or that the weaker version does not constitute theological determinism at all.
With respect to free will, "theological determinism is the thesis
that God exists and has infallible knowledge of all true propositions
including propositions about our future actions", a more minimal criterion designed to encapsulate all forms of theological determinism.
Theological determinism can also be seen as a form of causal
determinism, in which the antecedent conditions are the nature and will
of God. Some have asserted that Augustine of Hippo
introduced theological determinism into Christianity in 412 CE, whereas
all prior Christian authors supported free will against Stoic and
Gnostic determinism.[26] However, there are many Biblical passages that seem to support the idea of some kind of theological determinism.
Stephen Hawking
explained that the microscopic world of quantum mechanics is one of
determined probabilities. That is, nature is not governed by laws that
determine the future with certainty but by laws that determine the
probability of various futures.
Many-worlds interpretation
The many-worlds interpretation
of quantum mechanics accepts the linear causal sets of sequential
events with adequate consistency yet also suggests constant forking of
causal chains that can in principle be globally deterministic. This means that the causal sets of events leading to the present are all valid
yet appear as a singular linear time stream within a much broader unseen
conic probability field of other outcomes that "split off" from the
locally observed timeline. Under this model causal sets are still
"consistent" yet not exclusive to singular iterated outcomes.
The interpretation sidesteps the exclusive retrospective causal
chain problem of "could not have done otherwise" by suggesting "the
other outcome does exist" in a set of parallel states of the universe
that (in one version) split off in any interacting event. This
interpretation is sometimes described with the example of agent-based
choices.
Philosophical varieties
Nature/nurture controversy
Although some of the above forms of determinism concern human behaviors and cognition, others frame themselves as an answer to the debate on nature and nurture.
These theories suggest that a single factor entirely determines behavior. As
scientific understanding has grown, however, the strongest versions of
these theories have been widely rejected as a single-cause fallacy. In other words, the modern deterministic theories attempt to explain how the interaction of both nature and nurture is entirely predictable. The concept of heritability has been helpful in making this distinction.
Biological determinism, sometimes called genetic determinism, is the idea that each of human behaviors, beliefs, and desires are fixed by human genetic nature.
Behaviorism involves the idea that all behavior can be traced to specific causes—either environmental or reflexive. John B. Watson and B. F. Skinner developed this nurture-focused determinism.
Cultural materialism contends that the physical world impacts and sets constraints on human behavior.
Other "deterministic" theories actually seek only to highlight the importance of a particular
factor in predicting the future. These theories often use the factor as
a sort of guide or constraint on the future. They need not suppose that
complete knowledge of that one factor would allow the making of perfect
predictions.
Psychological determinism can mean that humans must act according to reason, but it can also be synonymous with some sort of psychological egoism. The latter is the view that humans will always act according to their perceived best interest.
Linguistic determinism proposes that language determines (or at least limits) the things that humans can think and say and thus know. The Sapir–Whorf hypothesis argues that individuals experience the world based on the grammatical structures they habitually use.
Technological determinism is the theory that a society's technology drives the development of its social structure and cultural values.
Structural
Structural determinism is the philosophical view that actions, events, and processes are predicated on and determined by structural factors. Given any particular structure or set of estimable components, the concept emphasizes rational and predictable outcomes. Chilean
biologists Humberto Maturana and Francisco Varela
popularized the notion, writing that a living system's general order is
maintained via a circular process of ongoing self-referral, and thus
its organization and structure defines the changes it undergoes. According to the authors, a system can undergo changes of state
(alteration of structure without loss of identity) or disintegrations
(alteration of structure with loss of identity). Such changes or
disintegrations are not determined by the elements of the disturbing
agent, as each disturbance will only trigger responses in the respective
system, which in turn, are determined by each system's own structure.
On an individualistic
level, what this means is that human beings as free and independent
entities are triggered to react by external stimuli or change in
circumstance. However, their own internal state and existing physical
and mental capacities determine their responses to those triggers. On a
much broader societal level, structural determinists believe that larger
issues in the society—especially those pertaining to minorities and
subjugated communities—are predominantly assessed through existing
structural conditions, making change of prevailing conditions difficult,
and sometimes outright impossible. For example, the concept has been
applied to the politics of race in the United States of America and other Western countries such as the United Kingdom and Australia, with structural determinists blaming structural factors for the prevalence of racism in these countries. Additionally, Marxists have conceptualized the writings of Karl Marx within the context of structural determinism as well. For example, Louis Althusser, a structural Marxist,
argued that the state, in its political, economic, and legal
structures, reproduces the discourse of capitalism, in turn, allowing
for the burgeoning of capitalistic structures.
Proponents of the notion highlight the usefulness of structural
determinism to study complicated issues related to race and gender, as
it highlights often-concealed structural conditions that block meaningful change. Critics call it too rigid, reductionist, and inflexible. They also criticize the notion for overemphasizing deterministic forces
such as structure over the role of human agency and the ability of the
people to act. These critics argue that politicians, academics, and
social activists have the capability to bring about significant change
despite stringent structural conditions.
Philosophers have debated both the truth of determinism and the truth of free will, which yields four possible positions. Compatibilism refers to the view that free will is, in some sense, compatible with determinism. The three incompatibilist positions deny this possibility: hard incompatibilists hold that free will is incompatible with both determinism and indeterminism, libertarians that determinism does not hold and free will might exist, and hard determinists that determinism does hold and free will does not exist. The Dutch philosopher Baruch Spinoza
was a determinist thinker, and argued that human freedom can be
achieved through knowledge of the causes that determine desire and
affections. He defined human servitude as the state of bondage of anyone
who is aware of their own desires, but ignorant of the causes that
determined them. However, the free or virtuous person becomes capable,
through reason and knowledge, to be genuinely free, even as they are
being "determined". For the Dutch philosopher, acting out of one's own
internal necessity is genuine freedom
while being driven by exterior determinations is akin to bondage.
Spinoza's thoughts on human servitude and liberty are respectively
detailed in the fourth and fifth parts of his work Ethics.
The standard argument against free will, according to philosopher J. J. C. Smart, focuses on the implications of determinism for free will. He suggests free will is denied whether determinism is true or not. He
says that if determinism is true, all actions are predicted and no one
is assumed to be free; however, if determinism is false, all actions are
presumed to be random and as such no one seems free because they have
no part in controlling what happens.
With the soul
Some determinists argue that materialism
does not present a complete understanding of the universe, because
while it can describe determinate interactions among material things, it
ignores the minds or souls of conscious beings.
Mecca Chiesa notes that the probabilistic or selectionistic determinism of B. F. Skinner comprised a wholly separate conception of determinism that was not mechanistic
at all. Mechanistic determinism assumes that every event has an
unbroken chain of prior occurrences, but a selectionistic or
probabilistic model does not.
Western tradition
In the West, some elements of determinism have been expressed in Greece from the 6th century BCE by the Presocratics Heraclitus and Leucippus. The first full notion of determinism appears to originate with the Stoics, as part of their theory of universal causal determinism. The resulting philosophical debates, which involved the confluence of
elements of Aristotelian Ethics with Stoic psychology, led in the
1st–3rd centuries CE in the works of Alexander of Aphrodisias to the first recorded Western debate over determinism and freedom, an issue that is known in theology as the paradox of free will. The writings of Epictetus as well as middle Platonist and early Christian thought were instrumental in this development. Jewish philosopher Moses Maimonides said of the deterministic implications of an omniscient god: "Does God know or does He not know that a certain individual will be
good or bad? If thou sayest 'He knows', then it necessarily follows that
[that] man is compelled to act as God knew beforehand he would act,
otherwise God's knowledge would be imperfect."
Newtonian mechanics
Determinism in the West is often associated with Newtonian mechanics/physics,
which depicts the physical matter of the universe as operating
according to a set of fixed laws. The "billiard ball" hypothesis, a
product of Newtonian physics, argues that once the initial conditions of
the universe have been established, the rest of the history of the
universe follows inevitably. If it were actually possible to have
complete knowledge of physical matter and all of the laws governing that
matter at any one time, then it would be theoretically possible to
compute the time and place of every event that will ever occur (Laplace's demon).
In this sense, the basic particles of the universe operate in the same
fashion as the rolling balls on a billiard table, moving and striking
each other in predictable ways to produce predictable results.
Whether or not it is all-encompassing in so doing, Newtonian
mechanics deals only with caused events; for example, if an object
begins in a known position and is hit dead on by an object with some
known velocity, then it will be pushed straight toward another
predictable point. If it goes somewhere else, the Newtonians argue, one
must question one's measurements of the original position of the object,
the exact direction of the striking object, gravitational or other
fields that were inadvertently ignored, etc. Then, they maintain,
repeated experiments and improvements in accuracy will always bring
one's observations closer to the theoretically predicted results. When
dealing with situations on an ordinary human scale, Newtonian physics
has been successful. But it fails as velocities become some substantial
fraction of the speed of light and when interactions at the atomic scale are studied. Before the discovery of quantum
effects and other challenges to Newtonian physics, "uncertainty" was
always a term that applied to the accuracy of human knowledge about
causes and effects, and not to the causes and effects themselves.
Newtonian mechanics, like the physical theories that followed it, is the result of observations and experiments, and so it describes "how it all works" within a tolerance. However, earlier Western scientists believed that if any logical connections are found between an observed cause and effect, there must also be some absolute natural laws behind them.
Belief in perfect natural laws driving everything, instead of just
describing what we should expect, led to searching for a set of
universal simple laws that rule the world. This movement significantly
encouraged deterministic views in Western philosophy, as well as the related theological views of classical pantheism.
Eastern tradition
The views on the interaction of karma and free will are numerous, and diverge from each other. For example, in Sikhism,
god's grace, gained through worship, can erase one's karmic debts, a
belief which reconciles the principle of karma with a monotheistic god
one must freely choose to worship. Jains believe in compatibilism, in which the cycle of saṃsāra is a completely mechanistic process, occurring without any divine intervention. The Jains hold an atomic view
of reality, in which particles of karma form the fundamental
microscopic building material of the universe.
Ājīvika
In ancient India, the Ājīvika school of philosophy founded by Makkhali Gosāla (around 500 BCE), otherwise referred to as "Ājīvikism" in Western scholarship, upheld the Niyati ("Fate") doctrine of absolute fatalism or determinism, which negates the existence of free will and karma, and is therefore considered one of the nāstika or "heterodox" schools of Indian philosophy. The oldest descriptions of the Ājīvika fatalists and their founder Gosāla can be found in both the Buddhist and Jaina scriptures of ancient India. The predetermined fate of all sentient beings and the impossibility of achieving liberation (mokṣa) from the eternal cycle of birth, death, and rebirth (saṃsāra) was the major distinctive philosophical and metaphysical doctrine of this heterodox school of Indian philosophy, counted among the other Śramaṇa movements that emerged in India during the Second urbanization (600–200 BCE).
Buddhism
Buddhist philosophy
contains several concepts which some scholars describe as deterministic
to various levels. However, the direct analysis of Buddhist metaphysics
through the lens of determinism is difficult, due to the differences
between European and Buddhist traditions of thought.
One concept which is argued to support a hard determinism is the doctrine of dependent origination (pratītyasamutpāda) in the early Buddhist texts, which states that all phenomena (dharma) are necessarily caused by some other phenomenon on which they can be said to depend, like links in a massive, never-ending chain; the basic principle is
that all things (dharmas, phenomena, principles) arise in dependence
upon other things, which means that they are fundamentally "empty" or devoid of any intrinsic, eternal essence and therefore are impermanent. In traditional Buddhist philosophy, this concept is used to explain the functioning of the eternal cycle of birth, death, and rebirth (saṃsāra); all thoughts and actions exert a karmic force that attaches to the individual's consciousness, which will manifest through reincarnation and results in future lives. In other words, righteous or unrighteous actions in one life will
necessarily cause good or bad responses in another future life or more
lives. The early Buddhist texts and later Tibetan Buddhist scriptures associate dependent arising with the fundamental Buddhist doctrines of emptiness (śūnyatā) and non-self (anattā).
Another Buddhist concept which many scholars perceive to be deterministic is the doctrine of non-self (anattā). In Buddhism, attaining enlightenment involves realizing that neither humans nor any other sentient beings possess a fundamental core of permanent being, identity, or personality which can be called the "soul", and that all sentient beings
(including humans) are instead made of several, constantly changing factors which bind them to the eternal cycle of birth, death, and rebirth (saṃsāra). Sentient beings are composed of the five aggregates of existence (skandha): matter, sensation, perception, mental formations, and consciousness. In the Saṃyutta Nikāya of the Pāli Canon, the historical Buddha
is recorded as saying that "just as the word 'chariot' exists on the
basis of the aggregation of parts, even so the concept of 'being' exists
when the five aggregates are available." The early Buddhist texts outline different ways in which dependent
origination is a middle way between different sets of "extreme" views
(such as "monist" and "pluralist" ontologies or materialist and dualist views of mind-body relation). In the Kaccānagotta Sutta of the Pāli Canon (SN 12.15, parallel at SA 301), the historical Buddha
stated that "this world mostly relies on the dual notions of existence
and non-existence" and then explains the right view as follows:
But when you truly see the origin
of the world with right understanding, you won't have the notion of
non-existence regarding the world. And when you truly see the cessation
of the world with right understanding, you won't have the notion of
existence regarding the world.
Some Western scholars argue that the concept of non-self necessarily disproves the ideas of free will and moral responsibility. If there is no autonomous self, in this view, and all events are
necessarily and unchangeably caused by others, then no type of autonomy
can be said to exist, moral or otherwise. However, other scholars disagree, claiming that the Buddhist conception of the universe allows for a form of compatibilism. Buddhism perceives reality occurring on two different levels: the ultimate reality, which can only be truly understood by the enlightened ones, and the illusory or false reality of the material world, which is considered to be "real" or "true" by those who are ignorant about the nature of metaphysical reality; i.e., those who still haven't achieved enlightenment. Therefore, Buddhism perceives free will as a notion belonging to the illusory belief in the unchanging self or personhood
that pertains to the false reality of the material world, while
concepts like non-self and dependent origination belong to the ultimate
reality; the transition between the two can be truly understood,
Buddhists claim, by one who has attained enlightenment.
Although it was once thought by scientists that any indeterminism in
quantum mechanics occurred at too small a scale to influence biological
or neurological systems, there is indication that nervous systems are influenced by quantum indeterminism due to chaos theory. It is unclear what implications this has for the problem of free will given various possible reactions to the problem in the first place. Many biologists do not grant determinism: Christof Koch, for instance, argues against it, and in favour of libertarian free will, by making arguments based on generative processes (emergence). Other proponents of emergentist or generative philosophy, cognitive sciences, and evolutionary psychology, argue that a certain form of determinism (not necessarily causal) is true. They suggest instead that an illusion of free will is experienced due
to the generation of infinite behaviour from the interaction of
a finite, deterministic set of rules and parameters.
Thus the unpredictability of the emerging behaviour from deterministic
processes leads to a perception of free will, even though free will as
an ontological entity does not exist.
(Figure: an animation of Conway's Game of Life, where the interaction of just four simple rules creates patterns that seem somehow "alive".)
As an illustration, the strategy board-games chess and Go have rigorous rules in which no information (such as cards' face-values) is hidden from either player and no random
events (such as dice-rolling) happen within the game. Yet chess, and especially Go with its extremely simple deterministic rules, can still produce an extremely large number of unpredictable moves. When chess is
simplified to 7 or fewer pieces, however, endgame tables are available
that dictate which moves to play to achieve a perfect game. This implies
that, given a less complex environment (with the original 32 pieces
reduced to 7 or fewer pieces), a perfectly predictable game of chess is
possible. In this scenario, the winning player can announce that a
checkmate will happen within a given number of moves, assuming a perfect
defense by the losing player, or fewer moves if the defending player
chooses sub-optimal moves as the game progresses into its inevitable,
predicted conclusion. By this analogy, it is suggested, the experience
of free will emerges from the interaction of finite rules and
deterministic parameters that generate nearly infinite and practically
unpredictable behavioural responses. In theory, if all these events
could be accounted for, and there were a known way to evaluate these
events, the seemingly unpredictable behaviour would become predictable. Another hands-on example of generative processes is John Horton Conway's playable Game of Life. Nassim Taleb is wary of such models, and coined the term "ludic fallacy."
Compatibility with the existence of science
Certain philosophers of science
argue that, while causal determinism (in which everything including the
brain/mind is subject to the laws of causality) is compatible with
minds capable of science, fatalism and predestination are not. These
philosophers make the distinction that causal determinism means that
each step is determined by the step before and therefore allows sensory
input from observational data to determine what conclusions the brain
reaches, while fatalism in which the steps between do not connect an
initial cause to the results would make it impossible for observational
data to correct false hypotheses. This is often combined with the
argument that if the brain had fixed views and the arguments were mere
after-constructs with no causal effect on the conclusions, science would
have been impossible and the use of arguments would have been a
meaningless waste of energy with no persuasive effect on brains with
fixed views.
Mathematical models
Many mathematical models of physical systems are deterministic. This is true of most models involving differential equations
(notably, those measuring rate of change over time). Mathematical
models that are not deterministic because they involve randomness are
called stochastic. Because of sensitive dependence on initial conditions,
some deterministic models may appear to behave non-deterministically;
in such cases, a deterministic interpretation of the model may not be
useful due to numerical instability and a finite amount of precision
in measurement. Such considerations can motivate the consideration of a
stochastic model even though the underlying system is governed by
deterministic equations.
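A minimal sketch of this sensitive dependence, using the logistic map (a standard textbook example chosen here for illustration, not one named in the text): the update rule is fully deterministic, yet a difference of 1e-10 in the starting value swamps the prediction within a few dozen steps.

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n), a deterministic model
# with sensitive dependence on initial conditions (r = 4 is chaotic).
r = 4.0
x, y = 0.2, 0.2 + 1e-10  # two starting points differing by 1e-10

for n in range(60):
    x = r * x * (1 - x)
    y = r * y * (1 - y)

# After 60 steps the trajectories have fully decorrelated:
# the difference is of order 1, not 1e-10.
print(abs(x - y))
```

This is why a stochastic description can be more useful in practice even when the underlying equations are deterministic: finite measurement precision makes the deterministic prediction unusable beyond a short horizon.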
Since the beginning of the 20th century, quantum mechanics—the
physics of the extremely small—has revealed previously concealed aspects
of events. Before that, Newtonian physics—the physics of everyday life—dominated. Taken in isolation (rather than as an approximation
to quantum mechanics), Newtonian physics depicts a universe in which
objects move in perfectly determined ways. At the scale where humans
exist and interact with the universe, Newtonian mechanics remain useful,
and make relatively accurate predictions (e.g. calculating the
trajectory of a bullet). But whereas in theory, absolute knowledge
of the forces accelerating a bullet would produce an absolutely
accurate prediction of its path, modern quantum mechanics casts
reasonable doubt on this main thesis of determinism.
This doubt takes radically different forms. The observed results of quantum mechanics are random but various interpretations of quantum mechanics
make different assumptions about determinism which cannot be
distinguished experimentally. The standard interpretation widely used by
physicists is not deterministic, but other interpretations have been devised that are deterministic.
Standard quantum mechanics
(Figure: five of the infinitely many paths available for a particle to move from point A at time t to point B at a later time t′.)
Quantum mechanics is the product of a careful application of the scientific method, logic and empiricism.
Through a large number of careful experiments physicists developed a
rather unintuitive mental model: a particle's path cannot be specified from its quantum description. "Path" is a classical, practical
attribute in everyday life, but one that quantum particles do not
possess. Quantum mechanics attributes probability to all possible paths
and asserts that only one outcome will be observed.
The randomness in quantum mechanics derives from the quantum
aspect of the model. Different experimental results are obtained for each individual quantum. Only the probability can be predicted. As Stephen Hawking explains, the result is not traditional determinism, but rather determined probabilities. As far as the thesis of determinism is concerned, these probabilities, at least, are quite determined.
Although
it is not possible to predict the arrival position or time for any
particle, probabilities of arrival predict the final pattern of events.
On the topic of predictable probabilities, the double-slit experiments are a popular example. Photons
are fired one-by-one through a double-slit apparatus at a distant
screen. They do not arrive at any single point, nor even the two points
lined up with the slits (the way it might be expected of bullets fired
by a fixed gun at a distant target). Instead, the photons arrive in
varying concentrations and times across the screen, and only the final
distribution of photons can be predicted. In that sense the behavior of
light in this apparatus is predictable, but there is no way to predict
where or when in the resulting interference pattern any single photon will make its contribution.
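The "predictable probabilities" can be illustrated numerically. In this hedged sketch, photon arrival positions are drawn by rejection sampling from a simple two-slit fringe intensity proportional to cos²(πdx/λL); the slit separation, wavelength, and screen distance are illustrative assumptions, and the single-slit diffraction envelope is ignored for brevity:

```python
import math
import random

# Illustrative double-slit sampling: each photon's landing spot is
# random, but the aggregate distribution follows the fringe pattern.
# All parameters below are illustrative assumptions, not from the text.
wavelength = 500e-9   # 500 nm light
d = 1e-4              # slit separation: 0.1 mm
L = 1.0               # distance to screen: 1 m

def intensity(x):
    """Relative probability of a photon arriving at screen position x."""
    return math.cos(math.pi * d * x / (wavelength * L)) ** 2

def sample_photon(x_max=2e-2):
    """Rejection sampling: draw one arrival position from the pattern."""
    while True:
        x = random.uniform(-x_max, x_max)
        if random.random() < intensity(x):
            return x

hits = [sample_photon() for _ in range(10_000)]
# No single entry in `hits` is predictable, but binning them
# reproduces the interference fringes every time.
```

The contrast mirrors the experiment: individual arrivals are irreducibly random, while the histogram over many arrivals is determined.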
Some (including Albert Einstein) have argued that the inability to predict any more than probabilities is simply due to ignorance. The idea is that, beyond the conditions and laws that can be observed or deduced, there are also hidden factors, or "hidden variables", that determine absolutely
in which order photons reach the detector screen. They argue that the
course of the universe is absolutely determined, but that humans are
screened from knowledge of the determinative factors. So, they say, it
only appears that things proceed in a probabilistic way.
John S. Bell analyzed Einstein's work in his famous Bell's theorem,
which demonstrates that quantum mechanics makes statistical
predictions that would be violated if local hidden variables really
existed. Many experiments have verified the quantum predictions.
Other interpretations
Bell's theorem only applies to local
hidden variables. Quantum mechanics can be formulated with non-local
hidden variables to achieve a deterministic theory that is in agreement
with experiment. An example is the Bohm interpretation
of quantum mechanics. Bohm's interpretation, though, violates special relativity, and it is highly controversial whether or not it can be reconciled without giving up on determinism.
The many-worlds interpretation focuses on the deterministic nature of the Schrödinger equation.
For any closed system, including the entire universe, the wavefunction
solutions to this equation evolve deterministically. The apparent
randomness of observations corresponds to branching of the wavefunction,
with one world for each possible outcome.
Another foundational assumption of quantum mechanics is that of free will, which has been argued to be foundational to the scientific method as a whole. Bell acknowledged that abandoning this assumption would allow for the maintenance of both determinism and locality. This perspective is known as superdeterminism, and is defended by some physicists such as Sabine Hossenfelder and Tim Palmer.
More advanced variations on these arguments include quantum contextuality, by Bell, Simon B. Kochen and Ernst Specker,
which argues that hidden variable theories cannot be "sensible",
meaning that the values of the hidden variables inherently depend on the
devices used to measure them.
This debate is relevant because there are possibly specific
situations in which the arrival of an electron at a screen at a certain
point and time would trigger one event, whereas its arrival at another
point would trigger an entirely different event (e.g. see Schrödinger's cat—a thought experiment used as part of a deeper debate).
In his 1939 address "The Relation between Mathematics and Physics", Paul Dirac
pointed out that purely deterministic classical mechanics cannot
explain the cosmological origins of the universe; today the early
universe is modeled quantum mechanically.
Nevertheless, the question of determinism in modern physics remains debated. On one hand, Albert Einstein's theory of relativity,
which represents an advancement over Newtonian mechanics, is based on a
deterministic framework. On the other hand, Einstein himself resisted
the indeterministic view of quantum mechanics, as evidenced by his
famous debates with Niels Bohr, which continued until his death.
Moreover, chaos theory
highlights that even within a deterministic framework, the ability to
precisely predict the evolution of a system is often limited. A
deterministic system may appear random: two apparently identical
starting points can lead to vastly different outcomes. Such dynamical systems are sensitive to initial conditions.
Even if the universe followed a strict deterministic order, the human
capacity to predict every event and comprehend all underlying causes
would still be constrained by this kind of sensitivity.
Adequate determinism (see Varieties, above) is the reason that Stephen Hawking called libertarian free will "just an illusion".