
Wednesday, July 6, 2022

Inductivism

From Wikipedia, the free encyclopedia

Inductivism is the traditional and still commonplace philosophy of scientific method to develop scientific theories. Inductivism aims to neutrally observe a domain, infer laws from examined cases—hence, inductive reasoning—and thus objectively discover the sole naturally true theory of the observed.

Inductivism's basis is, in sum, "the idea that theories can be derived from, or established on the basis of, facts". Evolving in phases, inductivism's conceptual reign spanned four centuries since Francis Bacon's 1620 proposal of such against Western Europe's prevailing model, scholasticism, which reasoned deductively from preconceived beliefs.

In the 19th and 20th centuries, inductivism succumbed to hypotheticodeductivism—sometimes termed deductivism—as scientific method's realistic idealization. Yet scientific theories as such are now widely attributed to occasions of inference to the best explanation, IBE, which, like scientists' actual methods, are diverse and not formally prescribable.

Philosophers' debates

Inductivist endorsement

Francis Bacon, articulating inductivism in England, is often falsely stereotyped as a naive inductivist. Crudely explained, the "Baconian model" advises to observe nature, propose a modest law that generalizes an observed pattern, confirm it by many observations, venture a modestly broader law, and confirm that, too, by many more observations, while discarding disconfirmed laws. Growing ever broader, the laws never quite exceed observations. Scientists, freed from preconceptions, thus gradually uncover nature's causal and material structure. Newton's theory of universal gravitation—modeling motion as an effect of a force—resembled inductivism's paramount triumph.

Around 1740, David Hume, in Scotland, identified multiple obstacles to inferring causality from experience. Hume noted the formal illogicality of enumerative induction—unrestricted generalization from particular instances to all instances, stating a universal law—since humans observe sequences of sensory events, not cause and effect. Perceiving neither logical nor natural necessity or impossibility among events, humans tacitly postulate uniformity of nature, which is unproved. Later philosophers would select, highlight, and nickname Humean principles—Hume's fork, the problem of induction, and Hume's law—although Hume respected and accepted the empirical sciences as inevitably inductive, after all.

Immanuel Kant, in Germany, alarmed by Hume's seemingly radical empiricism, identified its apparent opposite, rationalism, in Descartes, and sought a middle ground. Kant intuited that necessity does indeed exist, bridging the world in itself to human experience, and located it in the mind, whose innate constants determine space, time, and substance and thus ensure the universal truth of the empirically correct physical theory. Thus shielding Newtonian physics by discarding scientific realism, Kant's view limited science to tracing appearances, mere phenomena, never unveiling external reality, the noumena. Kant's transcendental idealism launched German idealism, a family of speculative metaphysics.

While philosophers widely continued an awkward confidence in the empirical sciences as inductive, John Stuart Mill, in England, proposed five methods to discern causality, showing how genuine inductivism purportedly exceeds enumerative induction. In the 1830s, opposing metaphysics, Auguste Comte, in France, explicated positivism, which, unlike Bacon's model, emphasizes predictions, confirming them, and laying down scientific laws, irrefutable by theology or metaphysics. Mill, viewing experience as affirming uniformity of nature and thus justifying enumerative induction, endorsed positivism—the first modern philosophy of science—which, also a political philosophy, upheld scientific knowledge as the only genuine knowledge.

Inductivist repudiation

Around 1840, William Whewell, in England, deemed the inductive sciences not so simple, and argued for recognition of "superinduction", an explanatory scope or principle invented by the mind to unite facts, but not present in the facts. John Stuart Mill rejected Whewell's hypotheticodeductivism as science's method. Whewell believed that, upon sufficient evidence, potentially including unlikely signs such as consilience, it could sometimes render scientific theories that are probably true, even metaphysically. By 1880, C S Peirce, in America, clarified the basis of deductive inference and, although acknowledging induction, proposed a third type of inference. Peirce called it "abduction", now termed inference to the best explanation, IBE.

The logical positivists arose in the 1920s, rebuked metaphysical philosophies, accepted hypotheticodeductivist theory origin, and sought to objectively vet scientific theories—or any statement beyond emotive—as provably false or true as to merely empirical facts and logical relations, a campaign termed verificationism. In its milder variant, Rudolf Carnap tried, but always failed, to find an inductive logic whereby a universal law's truth via observational evidence could be quantified by "degree of confirmation". Karl Popper, asserting a strong hypotheticodeductivism since the 1930s, attacked inductivism and its positivist variants, then in 1963 called enumerative induction "a myth", a deductive inference from a tacit explanatory theory. In 1965, Gilbert Harman explained enumerative induction as a masked IBE.

Thomas Kuhn's 1962 book, a cultural landmark, explains that periods of normal science, each but a paradigm of science, are overturned by revolutionary science, whose radical paradigm becomes the new normal science. Kuhn's thesis dissolved logical positivism's grip on Western academia, and inductivism fell. Besides Popper and Kuhn, other postpositivist philosophers of science—including Paul Feyerabend, Imre Lakatos, and Larry Laudan—have all but unanimously rejected inductivism. Those who assert scientific realism—which interprets scientific theory as reliably and literally, if approximately, true regarding nature's unobservable aspects—generally attribute new theories to IBE. And yet IBE, which, so far, cannot be trained, lacks particular rules of inference. By the 21st century's turn, inductivism's heir was Bayesianism.

Scientific methods

From the 17th to the 20th centuries, inductivism was widely conceived as scientific method's ideal. Even at the 21st century's turn, popular presentations of scientific discovery and progress naively, erroneously suggested it. The 20th was the first century producing more scientists than philosopher-scientists. Earlier scientists, "natural philosophers," pondered and debated their philosophies of method. Einstein remarked, "Science without epistemology is—in so far as it is thinkable at all—primitive and muddled".

Particularly after the 1960s, scientists became unfamiliar with the historical and philosophical underpinnings of their own research programs, and often unfamiliar with logic. Scientists thus often struggle to evaluate and communicate their own work against question or attack or to optimize methods and progress. In any case, during the 20th century, philosophers of science accepted that scientific method's truer idealization is hypotheticodeductivism, which, especially in its strongest form, Karl Popper's falsificationism, is also termed deductivism.

Inductivism

Inductivism infers from observations of similar effects to similar causes, and generalizes unrestrictedly—that is, by enumerative induction—to a universal law.
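Schematically, the enumerative step may be sketched as follows (an illustrative rendering, not a quotation of any inductivist's own notation):

```latex
% Enumerative induction (schematic): finitely many observed instances
% are generalized, without deductive warrant, to a universal law.
\[ F(a_1),\ F(a_2),\ \ldots,\ F(a_n) \;\;\Longrightarrow\;\; \forall x\, F(x) \]
```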

Extending inductivism, Comtean positivism explicitly aims to oppose metaphysics, shuns imaginative theorizing, emphasizes observation, then making predictions, confirming them, and stating laws.

Logical positivism would accept hypotheticodeductivism in theory development, but sought an inductive logic to objectively quantify a theory's confirmation by empirical evidence and, additionally, objectively compare rival theories.

Confirmation

Whereas a theory's proof—were such possible—may be termed verification, a theory's support is termed confirmation. But to reason from confirmation to verification—If A, then B; in fact B, and so A—is the deductive fallacy called "affirming the consequent." Inferring that the relation A to B implies the relation B to A supposes, for instance, "If the lamp is broken, then the room will be dark, and so the room's being dark means the lamp is broken." Even if B holds, it could be due to X or Y or Z, or to X, Y, and Z combined. Or the sequence A and then B could be a consequence of U—utterly undetected—whereby B always trails A by constant conjunction instead of by causation. Maybe, in fact, U can cease, disconnecting A from B.
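In schematic form (an illustrative sketch), the contrast between the valid inference and the fallacy is:

```latex
% Valid (modus ponens): from "If A then B" and A, infer B.
\[ (A \to B),\ A \;\vdash\; B \]
% Invalid (affirming the consequent): from "If A then B" and B, A does not follow.
\[ (A \to B),\ B \;\nvdash\; A \]
```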

Disconfirmation

A natural deductive reasoning form is logically valid without postulates and true simply by the principle of noncontradiction. "Denying the consequent" is a natural deduction—If A, then B; not B, so not A—whereby one can logically disconfirm the hypothesis A. Thus, there is also eliminative induction, which uses this deductive form to rule out candidate hypotheses.
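The valid form, denying the consequent (modus tollens), may be sketched as:

```latex
% Modus tollens: a failed prediction B deductively refutes hypothesis A,
% the engine of eliminative induction.
\[ (A \to B),\ \neg B \;\vdash\; \neg A \]
```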

Determination

At least logically, any phenomenon can host multiple, conflicting explanations—the problem of underdetermination—which is why inference from data to theory lacks any formal logic, any deductive rules of inference. A counterargument is the difficulty of finding even one empirically adequate theory. Still, however difficult it is to attain even one, one theory after another has been replaced by a radically different theory—the problem of unconceived alternatives. In the meantime, many confirming instances of a theory's predictions can occur even if many of the theory's other predictions are false.

Scientific method cannot ensure that scientists will imagine, much less will or even can perform, inquiries or experiments inviting disconfirmations. Further, any data collection projects a horizon of expectation—how even objective facts, direct observations, are laden with theory—whereby incompatible facts may go unnoticed. And the experimenter's regress permits disconfirmation to be rejected by inferring that unnoticed entities or aspects unexpectedly altered the test conditions. A hypothesis can be tested only conjoined to countless auxiliary hypotheses, mostly neglected until disconfirmation.

Deductivism

In hypotheticodeductivism, the HD model, one introduces some explanation or principle from any source, such as imagination or even a dream, infers logical consequences of it—that is, deductive inferences—and compares those with observations, perhaps experimental. In simple or Whewellian hypotheticodeductivism, one might accept a theory as metaphysically true or probably true if its predictions display certain traits that seem unlikely to arise from a false theory.

In Popperian hypotheticodeductivism, sometimes called falsificationism, although one aims for a true theory, one's main tests of the theory are efforts to empirically refute it. Falsificationism places its main value on confirmations when testing risky predictions, those that seem likeliest to fail. If the theory's bizarre prediction is empirically confirmed, then the theory is strongly corroborated, but, never upheld as metaphysically true, it is granted simply verisimilitude, the appearance of truth and thus a likeness to truth.

Inductivist reign

Francis Bacon introduced inductivism—and Isaac Newton soon emulated it—in England of the 17th century. In the 18th century, David Hume, in Scotland, raised scandal by philosophical skepticism at inductivism's rationality, whereas Immanuel Kant, in a German state, deflected Hume's fork, as it were, to shield Newtonian physics as well as philosophical metaphysics, but in the feat implied that science could at best reflect and predict observations, structured by the mind. Kant's metaphysics led to Hegel's metaphysics, which Karl Marx transposed from the spiritual to the material, and which others gave a nationalist reading.

Auguste Comte, in France of the early 19th century, opposing metaphysics, introduced positivism as, in essence, refined inductivism and a political philosophy. The contemporary urgency of the positivists and of the neopositivists—the logical positivists, emerging in Germany and Vienna in World War I's aftermath, and attenuating into the logical empiricists in America and England after World War II—reflected the sociopolitical climate of their own eras. The philosophers perceived dire threats to society from metaphysical theories, which they associated with religious and sociopolitical, and thereby social and military, conflicts.

Bacon

In 1620 in England, Francis Bacon's treatise Novum Organum alleged that scholasticism's Aristotelian method of deductive inference via syllogistic logic upon traditional categories was impeding society's progress. Admonishing allegedly classic induction for inferring straight from "sense and particulars up to the most general propositions" and then applying the axioms onto new particulars without empirically verifying them, Bacon stated the "true and perfect Induction". In Bacon's inductivist method, a scientist, until the late 19th century a natural philosopher, ventures an axiom of modest scope, makes many observations, accepts the axiom if it is confirmed and never disconfirmed, then ventures another axiom only modestly broader, collects many more observations, and accepts that axiom, too, only if it is confirmed, never disconfirmed.

In Novum Organum, Bacon uses the term hypothesis rarely, and usually uses it in pejorative senses, as prevalent in Bacon's time. Yet ultimately, as applied, Bacon's term axiom is more similar now to the term hypothesis than to the term law. By now, a law is nearer to an axiom, a rule of inference. By the 20th century's close, historians and philosophers of science generally agreed that Bacon's actual counsel was far more balanced than it had long been stereotyped, while some assessments even ventured that Bacon had described falsificationism, presumably as far from inductivism as one can get. In any case, Bacon was not a strict inductivist and included aspects of hypotheticodeductivism, but those aspects of Bacon's model were neglected by others, and the "Baconian model" was regarded as true inductivism—which it mostly was.

In Bacon's estimation, during this repeating process of modest axiomatization confirmed by extensive and minute observations, axioms expand in scope and deepen in penetrance tightly in accord with all the observations. This, Bacon proposed, would open a clear and true view of nature as it exists independently of human preconceptions. Ultimately, the general axioms concerning observables would render matter's unobservable structure and nature's causal mechanisms discernible by humans. But, as Bacon provides no clear way to frame axioms, let alone develop principles or theoretical constructs universally true, researchers might observe and collect data endlessly. For this vast venture, Bacon advised precise record keeping and collaboration among researchers—a vision resembling today's research institutes—while the true understanding of nature would permit technological innovation, heralding a New Atlantis.

Newton

Modern science arose against Aristotelian physics. Both Aristotelian physics and Ptolemaic astronomy were geocentric, and the latter was a basis of astrology, in turn a basis of medicine. Nicolaus Copernicus proposed heliocentrism, perhaps to better fit astronomy to Aristotelian physics' fifth element—the universal essence, or quintessence, the aether—whose intrinsic motion, explaining celestial observations, was perpetual, perfect circles. Yet Johannes Kepler modified Copernican orbits to ellipses soon after Galileo Galilei's telescopic observations disputed the Moon's composition by aether, and Galilei's experiments with earthly bodies attacked Aristotelian physics. Galilean principles were subsumed by René Descartes, whose Cartesian physics structured his Cartesian cosmology, modeling heliocentrism and employing mechanical philosophy. Mechanical philosophy's first principle, stated by Descartes, was No action at a distance. Yet it was British chemist Robert Boyle who imparted, here, the term mechanical philosophy. Boyle sought for chemistry, by way of corpuscularism—a Cartesian hypothesis that matter is particulate but not necessarily atomic—a mechanical basis and thereby a divorce from alchemy.

In 1666, Isaac Newton fled London from the plague.[30] Isolated, he applied rigorous experimentation and mathematics, including development of calculus, and reduced both terrestrial motion and celestial motion—that is, both physics and astronomy—to one theory stating Newton's laws of motion, several corollary principles, and the law of universal gravitation, set in a framework of postulated absolute space and absolute time. Newton's unification of celestial and terrestrial phenomena overthrew vestiges of Aristotelian physics, and disconnected physics from chemistry, which each then followed its own course. Newton became the exemplar of the modern scientist, and the Newtonian research program became the modern model of knowledge. Although absolute space, revealed by no experience, and a force acting at a distance discomforted Newton, he and physicists for some 200 years more would seldom suspect the fictional character of the Newtonian foundation, as they believed that physical concepts and laws are not "free inventions of the human mind", as Einstein in 1933 called them, but could be inferred logically from experience. Supposedly, Newton maintained that toward his gravitational theory, he had "framed" no hypotheses.

Hume

Around 1740, Hume sorted truths into two divergent categories—"relations of ideas" versus "matters of fact and real existence"—as later termed Hume's fork. "Relations of ideas", such as the abstract truths of logic and mathematics, known true without experience of particular instances, offer a priori knowledge. Yet the quests of empirical science concern "matters of fact and real existence", known true only through experience, thus a posteriori knowledge. As no number of examined instances logically entails the conformity of unexamined instances, a universal law's unrestricted generalization bears no formally logical basis, but one justifies it by adding the principle uniformity of nature—itself unverified—thus a major induction to justify a minor induction. This apparent obstacle to empirical science was later termed the problem of induction.

For Hume, humans experience sequences of events, not cause and effect, by pieces of sensory data whereby similar experiences might exhibit merely constant conjunction—first an event like A, and then always an event like B—but there is no revelation of causality to reveal either necessity or impossibility. Although Hume apparently enjoyed the scandal that trailed his explanations, Hume did not view them as fatal, and interpreted enumerative induction to be among the mind's unavoidable customs, required in order for one to live. Rather, Hume sought to counter Copernican displacement of humankind from the Universe's center, and to redirect intellectual attention to human nature as the central point of knowledge.

Hume proceeded with inductivism not only toward enumerative induction but toward unobservable aspects of nature, too. Not demolishing Newton's theory, Hume placed his own philosophy on par with it, then. Though skeptical at common metaphysics or theology, Hume accepted "genuine Theism and Religion" and found a rational person must believe in God to explain the structure of nature and order of the universe. Still, Hume had urged, "When we run over libraries, persuaded of these principles, what havoc must we make? If we take into our hand any volume—of divinity or school metaphysics, for instance—let us ask, Does it contain any abstract reasoning concerning quantity or number? No. Does it contain any experimental reasoning concerning matter of fact and existence? No. Commit it then to the flames, for it can contain nothing but sophistry and illusion".

Kant

Awakened from "dogmatic slumber" by Hume's work, Immanuel Kant sought to explain how metaphysics is possible. Kant's 1781 book introduced the distinction of rationalism, whereby some knowledge results not from empiricism, but instead from "pure reason". Concluding it impossible to know reality in itself, however, Kant discarded the philosopher's task of unveiling appearance to view the noumena, and limited science to organizing the phenomena. Reasoning that the mind contains categories organizing sense data into the experiences of substance, space, and time, Kant thereby inferred uniformity of nature, after all, in the form of a priori knowledge.

Kant sorted statements, rather, into two types, analytic versus synthetic. The analytic, true by their terms' arrangement and meanings, are tautologies, mere logical truths—thus true by necessity—whereas the synthetic apply meanings toward factual states, which are contingent. Yet some synthetic statements, presumably contingent, are necessarily true, because of the mind, Kant argued. Kant's synthetic a priori, then, buttressed both physics—at the time, Newtonian—and metaphysics, too, but discarded scientific realism. This realism regards scientific theories as literally true descriptions of the external world. Kant's transcendental idealism triggered German idealism, including G W F Hegel's absolute idealism.

Positivism

Comte

In the French Revolution's aftermath, fearing Western society's ruin again, Auguste Comte was fed up with metaphysics. As suggested in 1620 by Francis Bacon, developed by Saint-Simon, and promulgated in the 1830s by his former student Comte, positivism was the first modern philosophy of science. Human knowledge had evolved from religion to metaphysics to science, explained Comte, which had flowed from mathematics to astronomy to physics to chemistry to biology to sociology—in that order—describing increasingly intricate domains, all of society's knowledge having become scientific, whereas questions of theology and of metaphysics remained unanswerable, Comte argued. Comte considered enumerative induction to be reliable, upon the basis of experience available, and asserted that science's proper use is improving human society, not attaining metaphysical truth.

According to Comte, scientific method constrains itself to observations, but frames predictions, confirms these, and states laws—positive statements—irrefutable by theology and by metaphysics, and then lays the laws as foundation for subsequent knowledge. Later, concluding science insufficient for society, however, Comte launched Religion of Humanity, whose churches, honoring eminent scientists, led worship of humankind. Comte coined the term altruism, and emphasized science's application for humankind's social welfare, which would be revealed by the science Comte spearheaded, sociology. Comte's influence is prominent in Herbert Spencer of England and in Émile Durkheim of France, both establishing modern empirical, functionalist sociology. Influential in the latter 19th century, positivism was often linked to evolutionary theory, yet was eclipsed in the 20th century by neopositivism: logical positivism or logical empiricism.

Mill

J S Mill thought, unlike Comte, that scientific laws were susceptible to recall or revision. And Mill abstained from Comte's Religion of Humanity. Still, regarding experience to justify enumerative induction by having shown, indeed, the uniformity of nature, Mill commended Comte's positivism. Mill noted that within the empirical sciences, the natural sciences had well surpassed the alleged Baconian model, too simplistic, whereas the human sciences, such as ethics and political philosophy, lagged even Baconian scrutiny of immediate experience and enumerative induction. Similarly, economists of the 19th century tended to pose explanations a priori, and reject disconfirmation by posing circuitous routes of reasoning to maintain their a priori laws. In 1843, Mill's A System of Logic introduced Mill's methods: the five principles whereby causal laws can be discerned to enhance the empirical sciences as, indeed, the inductive sciences. For Mill, all explanations have the same logical structure, while society can be explained by natural laws.

Social

In the 17th century, England, with Isaac Newton and industrialization, led in science. In the 18th century, France led, particularly in chemistry, as by Antoine Lavoisier. During the 19th century, French chemists were influential, like Antoine Béchamp and Louis Pasteur, who inaugurated biomedicine, yet Germany gained the lead in science, by combining physics, physiology, pathology, medical bacteriology, and applied chemistry. In the 20th, America led. These shifts influenced each country's contemporary, envisioned roles for science.

Before Germany's lead in science, France's was upended by the first French Revolution, whose Reign of Terror beheaded Lavoisier, reputedly for selling diluted beer, and led to Napoleon's wars. Amid such crisis and tumult, Auguste Comte inferred that society's natural condition is order, not change. As in Saint-Simon's industrial utopianism, Comte's vision, as later upheld by modernity, positioned science as the only objective true knowledge and thus also as industrial society's secular spiritualism, whereby science would offer political and ethical guidance.

Positivism reached Britain well after Britain's own lead in science had ended. British positivism, as witnessed in Victorian ethics of utilitarianism—for instance, J S Mill's utilitarianism and later Herbert Spencer's social evolutionism—associated science with moral improvement, but rejected science for political leadership. For Mill, all explanations held the same logical structure—thus, society could be explained by natural laws—yet Mill criticized "scientific politics". From its outset, then, sociology was pulled between moral reform and administrative policy.

Herbert Spencer helped popularize the word sociology in England, and compiled vast data aiming to infer general theory through empirical analysis. Spencer's 1850 book Social Statics shows Comtean as well as Victorian concern for social order. Yet whereas Comte's social science was a social physics, as it were, Spencer took biology—later by way of Darwinism, so called, which arrived in 1859—as the model of science, a model for social science to emulate. Spencer's functionalist, evolutionary account identified social structures as functions that adapt, such that analysis of them would explain social change.

In France, Comte's sociological influence shows in Émile Durkheim, whose The Rules of Sociological Method, 1895, likewise posed natural science as sociology's model. For Durkheim, social phenomena are functions without psychologism—that is, operating without consciousness of individuals—while sociology is antinaturalist, in that social facts differ from natural facts. Still, per Durkheim, social representations are real entities observable, without prior theory, by assessing raw data. Durkheim's sociology was thus realist and inductive, whereby theory would trail observations while scientific method proceeds from social facts to hypotheses to causal laws discovered inductively.

Logical

World War erupted in 1914 and closed in 1919 with a treaty upon reparations that British economist John Maynard Keynes immediately, vehemently predicted would crumble German society by hyperinflation, a prediction fulfilled by 1923. Via the solar eclipse of May 29, 1919, Einstein's gravitational theory, confirmed in its astonishing prediction, apparently overthrew Newton's gravitational theory. This revolution in science was bitterly resisted by many scientists, yet was complete by about 1930. Not yet dismissed as pseudoscience, race science flourished, overtaking medicine and public health, even in America, with excesses of negative eugenics. In the 1920s, some philosophers and scientists were appalled by the flaring nationalism, racism, and bigotry, yet perhaps no less by the countermovements toward metaphysics, intuitionism, and mysticism.

Also optimistic, some of the appalled German and Austrian intellectuals were inspired by breakthroughs in philosophy, mathematics, logic, and physics, and sought to lend humankind a transparent, universal language competent to vet statements for either logical truth or empirical truth, with no more confusion and irrationality. In their envisioned, radical reform of Western philosophy to transform it into scientific philosophy, they studied exemplary cases of empirical science in their quest to turn philosophy into a special science, like biology and economics. The Vienna Circle, including Otto Neurath, was led by Moritz Schlick, and had been converted to the ambitious program by its member Rudolf Carnap, whom the Berlin Circle's leader Hans Reichenbach had introduced to Schlick. Carl Hempel, who had studied under Reichenbach, and would be a Vienna Circle alumnus, would later lead the movement from America, which, along with England, received many logical positivists who emigrated during Hitler's regime.

The Berlin Circle and the Vienna Circle became called—or, soon, were often stereotyped as—the logical positivists or, in a milder connotation, the logical empiricists or, in any case, the neopositivists. Rejecting Kant's synthetic a priori, they asserted Hume's fork. Staking it at the analytic/synthetic gap, they sought to dissolve confusions by freeing language from "pseudostatements". And appropriating Ludwig Wittgenstein's verifiability criterion, many asserted that only statements logically or empirically verifiable are cognitively meaningful, whereas the rest are merely emotively meaningful. Further, they presumed a semantic gulf between observational terms versus theoretical terms. Altogether, then, many withheld credence from science's claims about nature's unobservable aspects. Thus rejecting scientific realism, many embraced instrumentalism, whereby scientific theory is simply useful to predict human observations, while sometimes regarding talk of unobservables as either metaphorical or meaningless.

Pursuing both Bertrand Russell's program of logical atomism, which aimed to deconstruct language into supposedly elementary parts, and Russell's endeavor of logicism, which would reduce swaths of mathematics to symbolic logic, the neopositivists envisioned both everyday language and mathematics—thus physics, too—sharing a logical syntax in symbolic logic. To gain cognitive meaningfulness, theoretical terms would be translated, via correspondence rules, into observational terms—thus revealing any theory's actually empirical claims—and then empirical operations would verify them within the observational structure, related to the theoretical structure through the logical syntax. Thus, a logical calculus could be operated to objectively verify the theory's falsity or truth. With this program termed verificationism, logical positivists battled the Marburg school's neoKantianism, Husserlian phenomenology, and, as their very epitome of philosophical transgression, Heidegger's "existential hermeneutics", which Carnap accused of the most flagrant "pseudostatements".

Opposition

In friendly spirit, the Vienna Circle's Otto Neurath nicknamed Karl Popper, a fellow philosopher in Vienna, "the Official Opposition". Popper asserted that any effort to verify a scientific theory, or even to inductively confirm a scientific law, is fundamentally misguided. Popper asserted that although exemplary science is not dogmatic, science inevitably relies on "prejudices". Popper accepted Hume's criticism—the problem of induction—as revealing verification to be impossible.

Popper accepted hypotheticodeductivism, sometimes terming it deductivism, but restricted it to denying the consequent, and thereby, refuting verificationism, reframed it as falsificationism. As to law or theory, Popper held confirmation of probable truth to be untenable, as any number of confirmations is finite: empirical evidence yields a probability of truth approaching 0% amid a universal law's predictive run to infinity. Popper even held that a scientific theory is better if its truth appears most improbable. Logical positivism, Popper asserted, "is defeated by its typically inductivist prejudice".

Problems

Having highlighted Hume's problem of induction, John Maynard Keynes posed logical probability to answer it—but then concluded it did not quite succeed. Bertrand Russell held Keynes's book A Treatise on Probability as induction's best examination, and if read with Jean Nicod's Le Probleme logique de l'induction as well as R B Braithwaite's review of that in the October 1925 issue of Mind, to provide "most of what is known about induction", although the "subject is technical and difficult, involving a good deal of mathematics".

Rather than validate enumerative induction—the futile task of showing it a deductive inference—some sought simply to vindicate it. Herbert Feigl as well as Hans Reichenbach, apparently independently, thus sought to show enumerative induction simply useful, either a "good" or the "best" method for the goal at hand, making predictions. Feigl posed it as a rule, thus neither a priori nor a posteriori but a fortiori. Reichenbach's treatment, similar to Pascal's wager, posed it as entailing greater predictive success versus the alternative of not using it.

In 1936, Rudolf Carnap switched the goal of scientific statements' verification, clearly impossible, to the goal of simply their confirmation. Meanwhile, similarly, ardent logical positivist A J Ayer identified two types of verification—strong versus weak—the strong being impossible, but the weak being attained when the statement's truth is probable. In this mission, Carnap sought to apply probability theory to formalize inductive logic by discovering an algorithm that would reveal "degree of confirmation". Employing abundant logical and mathematical tools, Carnap never attained the goal: his formulations of inductive logic always held a universal law's degree of confirmation at zero.
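A toy calculation, a simplification and not Carnap's actual confirmation functions, suggests why the universal case is so stubborn: finitely many favorable observations are weighed against a law's infinitely many predicted instances.

```latex
% Toy illustration (not Carnap's c-functions): k favorable observations
% against a universal law's infinitely many predicted instances give,
% on a naive ratio measure,
\[ c(h, e) \;\approx\; \lim_{n \to \infty} \frac{k}{n} \;=\; 0 \]
```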

Kurt Gödel's incompleteness theorem of 1931 made the logical positivists' logicism, or reduction of mathematics to logic, doubtful. But then Alfred Tarski's undefinability theorem of 1934 made it hopeless. Some, including logical empiricist Carl Hempel, argued for its possibility, anyway. After all, nonEuclidean geometry had shown that even geometry's truth via axioms occurs among postulates, by definition unproved. Meanwhile, as to mere formalism, rather, which converts everyday talk into logical forms, but does not reduce it to logic, neopositivists, though accepting hypotheticodeductivist theory development, upheld symbolic logic as the language to justify, by verification or confirmation, its results. But then Hempel's paradox of confirmation highlighted that formalizing confirmatory evidence of the hypothesized, universal law All ravens are black—implying All nonblack things are not ravens—formalizes the result that a white shoe, being a nonblack nonraven, counts as a case confirming All ravens are black.
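In symbols, the equivalence driving the paradox is the contrapositive:

```latex
% The raven hypothesis and its logically equivalent contrapositive:
\[ \forall x\,(\mathrm{Raven}(x) \to \mathrm{Black}(x))
   \;\equiv\;
   \forall x\,(\lnot\mathrm{Black}(x) \to \lnot\mathrm{Raven}(x)) \]
% A white shoe is a nonblack nonraven, so it confirms the contrapositive
% and hence, by the equivalence, "All ravens are black".
```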

Early criticism

During the 1830s and 1840s, the French Auguste Comte and the British J S Mill were the leading philosophers of science. Debating in the 1840s, J S Mill claimed that science proceeds by inductivism, whereas William Whewell, also British, claimed that it proceeds by hypotheticodeductivism.

Whewell

William Whewell found the "inductive sciences" not so simple, but, amid the climate of esteem for inductivism, described "superinduction". Whewell proposed recognition of "the peculiar import of the term Induction", as "there is some Conception superinduced upon the facts", that is, "the Invention of a new Conception in every inductive inference". Rarely spotted by Whewell's predecessors, such mental inventions rapidly evade notice. Whewell explains,

"Although we bind together facts by superinducing upon them a new Conception, this Conception, once introduced and applied, is looked upon as inseparably connected with the facts, and necessarily implied in them. Having once had the phenomena bound together in their minds in virtue of the Conception, men can no longer easily restore them back to detached and incoherent condition in which they were before they were thus combined".

Once one observes the facts, "there is introduced some general conception, which is given, not by the phenomena, but by the mind". Whewell called this "colligation", uniting the facts with a "hypothesis"—an explanation—that is an "invention" and a "conjecture". In fact, one can colligate the facts via multiple, conflicting hypotheses. So the next step is testing the hypothesis. Whewell seeks, ultimately, four signs: coverage, abundance, consilience, and coherence.

First, the idea must explain all phenomena that prompted it. Second, it must predict more phenomena, too. Third, in consilience, it must be discovered to encompass phenomena of a different type. Fourth, the idea must nest in a theoretical system that, not framed all at once, developed over time and yet became simpler meanwhile. On these criteria, the colligating idea is naturally true, or probably so. Although devoting several chapters to "methods of induction" and mentioning a "logic of induction", Whewell stressed that the colligating "superinduction" lacks rules and cannot be trained. Whewell also held that Bacon, not a strict inductivist, "held the balance, with no partial or feeble hand, between phenomena and ideas".

Peirce

As Kant had noted in 1787, the theory of deductive inference had not progressed since antiquity. In the 1870s, C S Peirce and Gottlob Frege, unbeknownst to one another, revolutionized deductive logic through vast efforts identifying it with mathematical proof. An American who originated pragmatism—or, since 1905, pragmaticism, distinguished from more recent appropriations of his original term—Peirce recognized induction, too, but continuously insisted on a third type of inference that Peirce variously termed abduction, or retroduction, or hypothesis, or presumption. Later philosophers gave Peirce's abduction, and so on, the synonym inference to the best explanation, or IBE. Many philosophers of science later espousing scientific realism have maintained that IBE is how scientists develop approximately true scientific theories about nature.

Inductivist fall

After defeat of National Socialism via World War II in 1945, logical positivists lost their revolutionary zeal and led Western academia's philosophy departments to develop the niche philosophy of science, researching riddles of scientific method, theories, knowledge, and so on. The movement shifted, thus, into a milder variant better termed logical empiricism, though still a neopositivism, led principally by Rudolf Carnap, Hans Reichenbach, and Carl Hempel.

Amid increasingly apparent contradictions in neopositivism's central tenets—the verifiability principle, the analytic/synthetic division, and the observation/theory gap—Hempel in 1965 abandoned the program for a far wider conception of "degrees of significance". This signaled neopositivism's official demise. Neopositivism became mostly maligned, while credit for its fall generally has gone to W V O Quine and to Thomas S Kuhn, although its "murder" had been prematurely confessed to by Karl R Popper in the 1930s.

Fuzziness

Willard Van Orman Quine's 1951 paper "Two dogmas of empiricism"—explaining semantic holism, whereby any term's meaning draws from the speaker's beliefs about the whole world—cast Hume's fork, which posed the analytic/synthetic division as unbridgeable, as itself untenable. Among verificationism's greatest internal critics, Carl Hempel had recently concluded that the verifiability criterion, too, is untenable, as it would cast not only religious assertions and metaphysical statements, but even scientific laws of universal type as cognitively meaningless.

In 1958, Norwood Hanson's book Patterns of Discovery subverted the putative gap between observational terms and theoretical terms, a putative gap whereby direct observation would permit neutral comparison of rival theories. Hanson explains that even direct observations, the scientific facts, are laden with theory, which guides the collection, sorting, prioritization, and interpretation of direct observations, and even shapes the researcher's ability to apprehend a phenomenon. Meanwhile, even as to general knowledge, Quine's thesis eroded foundationalism, which retreated to modesty.

Revolutions

The Structure of Scientific Revolutions, by Thomas Kuhn, 1962, was first published in the International Encyclopedia of Unified Science—a project begun by logical positivists—and somehow, at last, unified the empirical sciences by withdrawing the physics model, and scrutinizing them via history and sociology. Lacking such heavy use of mathematics and of logic's formal language—an approach introduced by the Vienna Circle's Rudolf Carnap in the 1920s—Kuhn's book, powerful and persuasive, used natural language open to laypersons.

Structure explains science as puzzlesolving toward a vision projected by the "ruling class" of a scientific specialty's community, whose "unwritten rulebook" dictates acceptable problems and solutions, altogether normal science. The scientists reinterpret ambiguous data, discard anomalous data, and try to stuff nature into the box of their shared paradigm—a theoretical matrix or fundamental view of nature—until compatible data become scarce, anomalies accumulate, and scientific "crisis" ensues. Newly in training, some young scientists defect to revolutionary science, which, simultaneously explaining both the normal data and the anomalous data, resolves the crisis by setting a new "exemplar" that contradicts normal science.

Kuhn explains that rival paradigms, having incompatible languages, are incommensurable. Trying to resolve conflict, scientists talk past each other, as even direct observations—for example, that the Sun is "rising"—get fundamentally conflicting interpretations. Some working scientists convert by a perspectival shift that—to their astonishment—snaps the new paradigm, suddenly obvious, into view. Others, never attaining such gestalt switch, remain holdouts, committed for life to the old paradigm. One by one, holdouts die. Thus, the new exemplar—the new, unwritten rulebook—settles in the new normal science. The old theoretical matrix becomes so shrouded by the meanings of terms in the new theoretical matrix that even philosophers of science misread the old science.

And thus, Kuhn explains, a revolution in science is fulfilled. Kuhn's thesis critically destabilized confidence in foundationalism, which was generally, although erroneously, presumed to be one of logical empiricism's key tenets. As logical empiricism was extremely influential in the social sciences, Kuhn's ideas were rapidly adopted by scholars in disciplines well outside of the natural sciences, where Kuhn's analysis occurs. Kuhn's thesis in turn was attacked, however, even by some of logical empiricism's opponents. In Structure's 1970 postscript, Kuhn asserted, mildly, that science at least lacks an algorithm. On that point, even Kuhn's critics agreed. Reinforcing Quine's assault on logical empiricism, Kuhn ushered American and English academia into postpositivism or postempiricism.

Critical rationalism

Karl Popper's 1959 book The Logic of Scientific Discovery, originally published in German in 1934, reached readers of English at a time when logical empiricism, with its ancestrally verificationist program, was so dominant that a book reviewer mistook it for a new version of verificationism. Instead, Popper's philosophy, later called critical rationalism, fundamentally refuted verificationism. Popper's demarcation principle of falsifiability grants a theory the status of scientific—simply, being empirically testable—not the status of meaningful, a status that Popper did not aim to arbitrate. Popper found no scientific theory either verifiable or, as in Carnap's "liberalization of empiricism", confirmable, and found unscientific, metaphysical, ethical, and aesthetic statements often rich in meaning while also underpinning or fueling science as the origin of scientific theories. The only confirmations particularly relevant are those of risky predictions, such as ones conventionally predicted to fail.

Postpositivism

In 1967, historian of philosophy John Passmore concluded, "Logical positivism is dead, or as dead as a philosophical movement ever becomes". Logical positivism, or logical empiricism, or verificationism, or, as the overarching term for this whole movement, neopositivism soon became philosophy of science's bogeyman.

Kuhn's influential thesis was soon attacked for portraying science as irrational—cultural relativism similar to religious experience. Postpositivism's emblem became Popper's view of human knowledge as hypothetical, continually growing, always tentative, open to criticism and revision. But then even Popper became unpopular, allegedly unrealistic.

Problem of induction

In 1945, Bertrand Russell had proposed enumerative induction as an "independent logical principle", one "incapable of being inferred either from experience or from other logical principles, and that without this principle, science is impossible". And yet in 1963, Karl Popper declared, "Induction, i.e. inference based on many observations, is a myth. It is neither a psychological fact, nor a fact of ordinary life, nor one of scientific procedure". Popper's 1972 book Objective Knowledge opens, "I think I have solved a major philosophical problem: the problem of induction".

Popper's schema of theory evolution is a superficially stepwise but otherwise cyclical process: Problem1 → Tentative Solution → Critical Test → Error Elimination → Problem2. The tentative solution is improvised, an imaginative leap unguided by inductive rules, and the resulting universal law is deductive, an entailed consequence of all the included explanatory considerations. Popper calls enumerative induction, then, "a kind of optical illusion" that shrouds steps of conjecture and refutation during a problem shift. Still, debate continued over the problem of induction, or whether it even poses a problem to science.

Some have argued that although inductive inference is often obscured by language—as in news reporting that experiments have proved a substance to be safe—and that enumerative induction ought to be tempered by proper clarification, inductive inference is used liberally in science, that science requires it, and that Popper is obviously wrong. Actually, there are strong arguments on both sides. Enumerative induction obviously occurs as a summary conclusion, but its literal operation is unclear, as it may, as Popper explains, reflect deductive inference from an underlying, unstated explanation of the observations.

In a 1965 paper now classic, Gilbert Harman explains enumerative induction as a masked effect of what C S Peirce had termed abduction, that is, inference to the best explanation, or IBE. Philosophers of science who espouse scientific realism have usually maintained that IBE is how scientists develop, about the putative mind-independent world, scientific theories approximately true. Thus, calling Popper obviously wrong—since scientists use induction in effort to "prove" their theories true—reflects conflicting semantics. By now, enumerative induction has been shown to exist, but is found rarely, as in programs of machine learning in artificial intelligence. Likewise, machines can be programmed to operate on probabilistic inference of near certainty. Yet sheer enumerative induction is overwhelmingly absent from science conducted by humans. Although much talked of, IBE proceeds by humans' imaginations and creativity without rules of inference, and IBE's discussants provide nothing resembling such rules.
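A toy sketch, hypothetical and tied to no particular machine-learning system, shows what sheer enumerative induction looks like as a program:

```python
# Toy sketch of sheer enumerative induction (hypothetical example):
# from finitely many observed instances that all share a property,
# generalize an unrestricted universal rule.
def enumerative_induction(observations):
    """Return a universal rule if every observed instance confirms it."""
    if observations and all(color == "black" for _, color in observations):
        return "All ravens are black"   # the rule outruns the evidence
    return "no universal rule induced"

observed = [("raven 1", "black"), ("raven 2", "black"), ("raven 3", "black")]
print(enumerative_induction(observed))
# A single future nonblack raven would refute the rule by modus tollens,
# illustrating Popper's point about the tacit conjecture behind the summary.
```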

Logical bogeymen

Popperian falsificationism, too, became widely criticized and soon unpopular among philosophers of science. Still, Popper has been the only philosopher of science often praised by scientists. On the other hand, likened to economists of the 19th century who took circuitous, protracted measures to deflect falsification of their own preconceived principles, the verificationists—that is, the logical positivists—became identified as pillars of scientism, allegedly asserting strict inductivism, as well as foundationalism, to ground all empirical sciences on a foundation of direct sensory experience. Rehashing neopositivism's alleged failures became a popular tactic of subsequent philosophers before launching arguments for their own views, often built atop misrepresentations and outright falsehoods about neopositivism. Not seeking to overhaul and regulate empirical sciences or their practices, the neopositivists had sought to analyze and understand them, and thereupon overhaul philosophy to scientifically organize human knowledge.

Logical empiricists indeed conceived the unity of science to network all special sciences and to reduce the special sciences' laws—by stating boundary conditions, supplying bridge laws, and heeding the deductivenomological model—to, at least in principle, the fundamental science, that is, fundamental physics. And Rudolf Carnap sought to formalize inductive logic to confirm universal laws through probability as "degree of confirmation". Yet the Vienna Circle had pioneered nonfoundationalism, a legacy especially of its member Otto Neurath, whose coherentism—the main alternative to foundationalism—likened science to a boat that scientists must rebuild at sea without ever touching shore. And neopositivists did not seek rules of inductive logic to regulate scientific discovery or theorizing, but to verify or confirm laws and theories once scientists pose them. Practicing what Popper had preached—conjectures and refutations—neopositivism simply ran its course. So its chief rival, Popper, initially a contentious misfit, emerged from interwar Vienna vindicated.

Scientific anarchy

In the early 1950s, studying philosophy of quantum mechanics under Popper at the London School of Economics, Paul Feyerabend found falsificationism to be not a breakthrough but rather obvious, and thus the controversy over it to suggest instead endemic poverty in the academic discipline philosophy of science. And yet, there witnessing Popper's attacks on inductivism—"the idea that theories can be derived from, or established on the basis of, facts"—Feyerabend was impressed by a Popper talk at the British Society for the Philosophy of Science. Popper showed that higher-level laws, far from being reducible to laws supposedly more fundamental, often conflict with them.

Popper's prime example, already made by the French classical physicist and philosopher of science Pierre Duhem decades earlier, was Kepler's laws of planetary motion, long famed to be, and yet not actually, reducible to Newton's law of universal gravitation. For Feyerabend, the sham of inductivism was pivotal. Feyerabend investigated, eventually concluding that even in the natural sciences, the unifying method is Anything goes—often rhetoric, circular reasoning, propaganda, deception, and subterfuge—methodological lawlessness, scientific anarchy. At persistent claims that faith in induction is a necessary precondition of reason, Feyerabend's 1987 book sardonically bids Farewell to Reason.

Research programmes

Imre Lakatos deemed Popper's falsificationism neither practiced by scientists nor even realistically practical, but held Kuhn's paradigms of science to be more monopolistic than actual. Lakatos found multiple, vying research programmes to coexist, taking turns at leading in scientific progress.

A research programme stakes a hard core of principles, such as the Cartesian rule No action at a distance, that resists falsification, deflected by a protective belt of malleable theories that advance the hard core via theoretical progress, spreading the hard core into new empirical territories.

Corroborating the new theoretical claims is empirical progress, making the research programme progressive—or else it degenerates. But even an eclipsed research programme may linger, Lakatos finds, and can resume progress by later revisions to its protective belt.

In any case, Lakatos concluded inductivism to be rather farcical and never in the history of science actually practiced. Lakatos alleged that Newton had fallaciously posed his own research programme as inductivist in order to legitimize it publicly.

Research traditions

Lakatos's putative methodology of scientific research programmes was criticized by sociologists of science and by some philosophers of science, too, as being too idealized and omitting scientific communities' interplay with the wider society's social configurations and dynamics. Philosopher of science Larry Laudan argued that the stable elements are not research programmes, but rather are research traditions.

Inductivist heir

By the 21st century's turn, Bayesianism had become the heir of inductivism.

Cognitive science

From Wikipedia, the free encyclopedia

[Figure] The fields that contributed to the birth of cognitive science, including linguistics, neuroscience, artificial intelligence, philosophy, anthropology, and psychology.

Cognitive science is the interdisciplinary, scientific study of the mind and its processes with input from linguistics, psychology, neuroscience, philosophy, computer science/artificial intelligence, and anthropology. It examines the nature, the tasks, and the functions of cognition (in a broad sense). Cognitive scientists study intelligence and behavior, with a focus on how nervous systems represent, process, and transform information. Mental faculties of concern to cognitive scientists include language, perception, memory, attention, reasoning, and emotion; to understand these faculties, cognitive scientists borrow from fields such as linguistics, psychology, artificial intelligence, philosophy, neuroscience, and anthropology. The typical analysis of cognitive science spans many levels of organization, from learning and decision to logic and planning; from neural circuitry to modular brain organization. One of the fundamental concepts of cognitive science is that "thinking can best be understood in terms of representational structures in the mind and computational procedures that operate on those structures."

The goal of cognitive science is to understand and formulate the principles of intelligence with the hope that this will lead to a better comprehension of the mind and of learning. The cognitive sciences began as an intellectual movement in the 1950s often referred to as the cognitive revolution.

History

The cognitive sciences began as an intellectual movement in the 1950s, called the cognitive revolution. Cognitive science has a prehistory traceable back to ancient Greek philosophical texts (see Plato's Meno and Aristotle's De Anima); Modernist philosophers such as Descartes, David Hume, Immanuel Kant, Benedict de Spinoza, Nicolas Malebranche, Pierre Cabanis, Leibniz and John Locke, rejected scholasticism while mostly having never read Aristotle, and they were working with an entirely different set of tools and core concepts than those of the cognitive scientist.

The modern culture of cognitive science can be traced back to the early cyberneticists in the 1930s and 1940s, such as Warren McCulloch and Walter Pitts, who sought to understand the organizing principles of the mind. McCulloch and Pitts developed the first variants of what are now known as artificial neural networks, models of computation inspired by the structure of biological neural networks.
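A McCulloch-Pitts unit is, in essence, a binary threshold device; the following minimal sketch (with illustrative weights and thresholds, not their original notation) shows the idea:

```python
# Minimal sketch of a McCulloch-Pitts style unit: binary inputs, fixed
# weights, and a threshold; the unit fires (returns 1) only if the
# weighted sum of its inputs reaches the threshold.
def mcculloch_pitts(inputs, weights, threshold):
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Such threshold units can realize simple logic gates:
AND = lambda a, b: mcculloch_pitts([a, b], [1, 1], threshold=2)
OR = lambda a, b: mcculloch_pitts([a, b], [1, 1], threshold=1)
print(AND(1, 1), AND(1, 0), OR(1, 0), OR(0, 0))  # prints: 1 0 1 0
```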

Another precursor was the early development of the theory of computation and the digital computer in the 1940s and 1950s. Kurt Gödel, Alonzo Church, Alan Turing, and John von Neumann were instrumental in these developments. The modern computer, or Von Neumann machine, would play a central role in cognitive science, both as a metaphor for the mind, and as a tool for investigation.

The first instance of cognitive science experiments being done at an academic institution took place at MIT Sloan School of Management, established by J.C.R. Licklider working within the psychology department and conducting experiments using computer memory as models for human cognition.

In 1959, Noam Chomsky published a scathing review of B. F. Skinner's book Verbal Behavior. At the time, Skinner's behaviorist paradigm dominated the field of psychology within the United States. Most psychologists focused on functional relations between stimulus and response, without positing internal representations. Chomsky argued that in order to explain language, we needed a theory like generative grammar, which not only attributed internal representations but characterized their underlying order.
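The contrast can be illustrated with a toy generative grammar (a hypothetical fragment, not an example drawn from Chomsky): a small set of internal rewrite rules characterizes the ordered structure behind the sentences it generates, rather than any stimulus-response pairing.

```python
import random

# Toy generative grammar (hypothetical fragment): rewrite rules generate
# sentences from internal structure rather than from stimulus and response.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["child"], ["dog"]],
    "V":  [["sees"], ["chases"]],
}

def generate(symbol="S"):
    if symbol not in GRAMMAR:          # terminal word
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    return [word for part in expansion for word in generate(part)]

print(" ".join(generate()))  # e.g. "the dog chases the child"
```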

The term cognitive science was coined by Christopher Longuet-Higgins in his 1973 commentary on the Lighthill report, which concerned the then-current state of artificial intelligence research. In the same decade, the journal Cognitive Science and the Cognitive Science Society were founded. The founding meeting of the Cognitive Science Society was held at the University of California, San Diego in 1979, which resulted in cognitive science becoming an internationally visible enterprise. In 1972, Hampshire College started the first undergraduate education program in Cognitive Science, led by Neil Stillings. In 1982, with assistance from Professor Stillings, Vassar College became the first institution in the world to grant an undergraduate degree in Cognitive Science. In 1986, the first Cognitive Science Department in the world was founded at the University of California, San Diego.

In the 1970s and early 1980s, as access to computers increased, artificial intelligence research expanded. Researchers such as Marvin Minsky would write computer programs in languages such as LISP to attempt to formally characterize the steps that human beings went through, for instance, in making decisions and solving problems, in the hope of better understanding human thought, and also in the hope of creating artificial minds. This approach is known as "symbolic AI".

Eventually the limits of the symbolic AI research program became apparent. For instance, it seemed unrealistic to comprehensively list human knowledge in a form usable by a symbolic computer program. The late 1980s and 1990s saw the rise of neural networks and connectionism as a research paradigm. On this view, often attributed to James McClelland and David Rumelhart, the mind could be characterized as a set of complex associations, represented as a layered network. Critics argue that there are some phenomena which are better captured by symbolic models, and that connectionist models are often so complex as to have little explanatory power. Recently, symbolic and connectionist models have been combined, making it possible to take advantage of both forms of explanation. While both connectionism and symbolic approaches have proven useful for testing various hypotheses and exploring approaches to understanding aspects of cognition and lower-level brain functions, neither is biologically realistic, and both therefore suffer from a lack of neuroscientific plausibility. Connectionism has proven useful for exploring computationally how cognition emerges in development and occurs in the human brain, and has provided alternatives to strictly domain-specific/domain-general approaches. For example, scientists such as Jeff Elman, Liz Bates, and Annette Karmiloff-Smith have posited that networks in the brain emerge from the dynamic interaction between them and environmental input.

Principles

Levels of analysis

A central tenet of cognitive science is that a complete understanding of the mind/brain cannot be attained by studying only a single level. An example would be the problem of remembering a phone number and recalling it later. One approach to understanding this process would be to study behavior through direct observation, or naturalistic observation. A person could be presented with a phone number and be asked to recall it after some delay; then the accuracy of the response could be measured. Another approach would be to study the firings of individual neurons while a person is trying to remember the phone number. Neither of these experiments on its own would fully explain how the process of remembering a phone number works. Even if the technology to map out every neuron in the brain in real time were available, and it were known when each neuron fired, it would still be impossible to know how a particular firing of neurons translates into the observed behavior. Thus an understanding of how these two levels relate to each other is imperative. Francisco Varela, in The Embodied Mind: Cognitive Science and Human Experience, argues that "the new sciences of the mind need to enlarge their horizon to encompass both lived human experience and the possibilities for transformation inherent in human experience". On the classic cognitivist view, this can be provided by a functional level account of the process. Studying a particular phenomenon from multiple levels creates a better understanding of the processes that occur in the brain to give rise to a particular behavior. Marr gave a famous description of three levels of analysis (a brief illustration follows the list below):

  1. The computational theory, specifying the goals of the computation;
  2. Representation and algorithms, giving a representation of the inputs and outputs and the algorithms which transform one into the other; and
  3. The hardware implementation, or how algorithm and representation may be physically realized.
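As a rough, hypothetical illustration of the distinction between the first two levels, the Python sketch below solves the same computational-level problem (producing a sorted sequence) with two different representations and algorithms; the hardware level, how either procedure is physically realized, is left unmodeled. The task and both procedures are invented for illustration and are not drawn from Marr's own examples.

```python
# Illustrative sketch of Marr's first two levels using a toy task (sorting).
# Computational theory: the goal is a sorted sequence of the input numbers.
# Representation and algorithm: two different procedures achieve the same goal.
# The hardware level (how either runs on silicon or neurons) is not modeled here.

def insertion_sort(xs):
    """One algorithm: build the output by inserting each item in place."""
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] < x:
            i += 1
        out.insert(i, x)
    return out

def counting_sort(xs, max_value=9):
    """A different algorithm and representation: tally counts per value."""
    counts = [0] * (max_value + 1)
    for x in xs:
        counts[x] += 1
    return [v for v in range(max_value + 1) for _ in range(counts[v])]

data = [3, 1, 4, 1, 5]
# Same computational-level result, different algorithmic-level stories.
assert insertion_sort(data) == counting_sort(data) == [1, 1, 3, 4, 5]
```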

Interdisciplinary nature

Cognitive science is an interdisciplinary field with contributors from various fields, including psychology, neuroscience, linguistics, philosophy of mind, computer science, anthropology and biology. Cognitive scientists work collectively in the hope of understanding the mind and its interactions with the surrounding world, much like other sciences do. The field regards itself as compatible with the physical sciences and uses the scientific method as well as simulation or modeling, often comparing the output of models with aspects of human cognition. As in psychology, there is some doubt whether there is a unified cognitive science, which has led some researchers to prefer 'cognitive sciences' in the plural.

Many, but not all, who consider themselves cognitive scientists hold a functionalist view of the mind: the view that mental states and processes should be explained by their function, that is, by what they do. According to the multiple realizability account of functionalism, even non-human systems such as robots and computers can be ascribed cognition.

Cognitive science: the term

The term "cognitive" in "cognitive science" is used for "any kind of mental operation or structure that can be studied in precise terms" (Lakoff and Johnson, 1999). This conceptualization is very broad, and should not be confused with how "cognitive" is used in some traditions of analytic philosophy, where "cognitive" has to do only with formal rules and truth conditional semantics.

The earliest entries for the word "cognitive" in the OED take it to mean roughly "pertaining to the action or process of knowing". The first entry, from 1586, shows the word was at one time used in the context of discussions of Platonic theories of knowledge. Most in cognitive science, however, presumably do not believe their field is the study of anything as certain as the knowledge sought by Plato.

Scope

Cognitive science is a large field, and covers a wide array of topics on cognition. However, it should be recognized that cognitive science has not always been equally concerned with every topic that might bear on the nature and operation of minds. Classical cognitivists largely de-emphasized or avoided social and cultural factors, embodiment, emotion, consciousness, animal cognition, and comparative and evolutionary psychology. However, with the decline of behaviorism, internal states such as affects and emotions, as well as awareness and covert attention, became approachable again. For example, situated and embodied cognition theories take into account the current state of the environment as well as the role of the body in cognition. With the newfound emphasis on information processing, observable behavior was no longer the hallmark of psychological theory; the modeling or recording of mental states took its place.

Below are some of the main topics that cognitive science is concerned with. This is not an exhaustive list. See List of cognitive science topics for a list of various aspects of the field.

Artificial intelligence

Artificial intelligence (AI) involves the study of cognitive phenomena in machines. One of the practical goals of AI is to implement aspects of human intelligence in computers. Computers are also widely used as a tool with which to study cognitive phenomena. Computational modeling uses simulations to study how human intelligence may be structured. (See § Computational modeling.)

There is some debate in the field as to whether the mind is best viewed as a huge array of small but individually feeble elements (i.e. neurons), or as a collection of higher-level structures such as symbols, schemes, plans, and rules. The former view uses connectionism to study the mind, whereas the latter emphasizes symbolic artificial intelligence. One way to view the issue is whether it is possible to accurately simulate a human brain on a computer without accurately simulating the neurons that make up the human brain.

Attention

Attention is the selection of important information. The human mind is bombarded with millions of stimuli and it must have a way of deciding which information to process. Attention is sometimes seen as a spotlight, meaning one can only shine the light on a particular set of information. Experiments that support this metaphor include the dichotic listening task (Cherry, 1957) and studies of inattentional blindness (Mack and Rock, 1998). In the dichotic listening task, subjects are presented with two different messages, one in each ear, and told to focus on only one of the messages. At the end of the experiment, when asked about the content of the unattended message, subjects cannot report it.

Bodily processes related to cognition

Embodied cognition approaches to cognitive science emphasize the role of body and environment in cognition. This includes both neural and extra-neural bodily processes, and factors that range from affective and emotional processes, to posture, motor control, proprioception, and kinaesthesis, to autonomic processes that involve heartbeat and respiration, to the role of the enteric gut microbiome. It also includes accounts of how the body engages with or is coupled to social and physical environments. 4E (embodied, embedded, extended and enactive) cognition includes a broad range of views about brain-body-environment interaction, from causal embeddedness to stronger claims about how the mind extends to include tools and instruments, as well as the role of social interactions, action-oriented processes, and affordances. 4E theories range from those closer to classic cognitivism (so-called "weak" embodied cognition) to stronger extended and enactive versions that are sometimes referred to as radical embodied cognitive science.

Knowledge and processing of language

A well known example of a phrase structure tree. This is one way of representing human language that shows how different components are organized hierarchically.
 

The ability to learn and understand language is an extremely complex process. Language is acquired within the first few years of life, and all humans under normal circumstances are able to acquire language proficiently. A major driving force in theoretical linguistics is discovering the nature that language must have in the abstract in order to be learned in such a fashion. Some of the driving research questions in studying how the brain itself processes language include: (1) To what extent is linguistic knowledge innate or learned?, (2) Why is it more difficult for adults to acquire a second language than it is for infants to acquire their first language?, and (3) How are humans able to understand novel sentences?

The study of language processing ranges from the investigation of the sound patterns of speech to the meaning of words and whole sentences. Linguistics often divides language processing into orthography, phonetics, phonology, morphology, syntax, semantics, and pragmatics. Many aspects of language can be studied from each of these components and from their interaction.

The study of language processing in cognitive science is closely tied to the field of linguistics. Linguistics was traditionally studied as a part of the humanities, including studies of history, art and literature. In the last fifty years or so, more and more researchers have studied knowledge and use of language as a cognitive phenomenon, the main problems being how knowledge of language can be acquired and used, and what precisely it consists of. Linguists have found that, while humans form sentences in ways apparently governed by very complex systems, they are remarkably unaware of the rules that govern their own speech. Thus linguists must resort to indirect methods to determine what those rules might be, if indeed rules as such exist. In any event, if speech is indeed governed by rules, they appear to be opaque to any conscious consideration.
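The phrase-structure tree pictured above can be thought of as a nested data structure. The Python sketch below shows one hypothetical way to encode such a tree and recover the word string from it; the tiny category labels and example sentence are invented for illustration and are not drawn from any particular grammatical theory.

```python
# Toy sketch: a phrase-structure tree represented as nested tuples.
# The tiny grammar and example sentence are invented for illustration only;
# real syntactic theories differ considerably in their category inventories.

tree = ("S",
        ("NP", ("Det", "the"), ("N", "dog")),
        ("VP", ("V", "chased"), ("NP", ("Det", "the"), ("N", "cat"))))

def leaves(node):
    """Recover the word string by collecting the tree's leaf nodes."""
    label, *children = node
    if len(children) == 1 and isinstance(children[0], str):
        return [children[0]]
    return [word for child in children for word in leaves(child)]

print(" ".join(leaves(tree)))  # -> "the dog chased the cat"
```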

Learning and development

Learning and development are the processes by which we acquire knowledge and information over time. Infants are born with little or no knowledge (depending on how knowledge is defined), yet they rapidly acquire the ability to use language, walk, and recognize people and objects. Research in learning and development aims to explain the mechanisms by which these processes might take place.

A major question in the study of cognitive development is the extent to which certain abilities are innate or learned. This is often framed in terms of the nature and nurture debate. The nativist view emphasizes that certain features are innate to an organism and are determined by its genetic endowment. The empiricist view, on the other hand, emphasizes that certain abilities are learned from the environment. Although clearly both genetic and environmental input is needed for a child to develop normally, considerable debate remains about how genetic information might guide cognitive development. In the area of language acquisition, for example, some (such as Steven Pinker) have argued that specific information containing universal grammatical rules must be contained in the genes, whereas others (such as Jeffrey Elman and colleagues in Rethinking Innateness) have argued that Pinker's claims are biologically unrealistic. They argue that genes determine the architecture of a learning system, but that specific "facts" about how grammar works can only be learned as a result of experience.

Memory

Memory allows us to store information for later retrieval. Memory is often thought of as consisting of both a long-term and short-term store. Long-term memory allows us to store information over prolonged periods (days, weeks, years). We do not yet know the practical limit of long-term memory capacity. Short-term memory allows us to store information over short time scales (seconds or minutes).

Memory is also often grouped into declarative and procedural forms. Declarative memory—grouped into subsets of semantic and episodic forms of memory—refers to our memory for facts and specific knowledge, specific meanings, and specific experiences (e.g. "Are apples food?", or "What did I eat for breakfast four days ago?"). Procedural memory allows us to remember actions and motor sequences (e.g. how to ride a bicycle) and is often dubbed implicit knowledge or memory.

Cognitive scientists study memory just as psychologists do, but tend to focus more on how memory bears on cognitive processes, and on the interrelationship between cognition and memory. One example is the question of what mental processes a person goes through to retrieve a long-lost memory; another is what differentiates the cognitive process of recognition (seeing hints of something before remembering it, or memory in context) from recall (retrieving a memory, as in "fill-in-the-blank").

Perception and action

The Necker cube, an example of an optical illusion
 
An optical illusion. The square A is exactly the same shade of gray as square B. See checker shadow illusion.
 

Perception is the ability to take in information via the senses and process it in some way. Vision and hearing are two dominant senses that allow us to perceive the environment. Some questions in the study of visual perception, for example, include: (1) How are we able to recognize objects?, and (2) Why do we perceive a continuous visual environment, even though we only see small bits of it at any one time? One tool for studying visual perception is to look at how people process optical illusions. The Necker cube pictured above is an example of a bistable percept: the cube can be interpreted as being oriented in two different directions.

The study of haptic (tactile), olfactory, and gustatory stimuli also falls within the domain of perception.

Action is taken to refer to the output of a system. In humans, this is accomplished through motor responses. Spatial planning and movement, speech production, and complex motor movements are all aspects of action.

Consciousness

Consciousness is the awareness of external objects and of experiences within oneself. It gives the mind the ability to experience or feel a sense of self.

Research methods

Many different methodologies are used in cognitive science. As the field is highly interdisciplinary, research often cuts across multiple areas of study, drawing on research methods from psychology, neuroscience, computer science and systems theory.

Behavioral experiments

In order to have a description of what constitutes intelligent behavior, one must study behavior itself. This type of research is closely tied to that in cognitive psychology and psychophysics. By measuring behavioral responses to different stimuli, one can understand something about how those stimuli are processed. Lewandowski & Strohmetz (2009) reviewed a collection of innovative uses of behavioral measurement in psychology including behavioral traces, behavioral observations, and behavioral choice. Behavioral traces are pieces of evidence that indicate behavior occurred, but the actor is not present (e.g., litter in a parking lot or readings on an electric meter). Behavioral observations involve the direct witnessing of the actor engaging in the behavior (e.g., watching how close a person sits next to another person). Behavioral choices are when a person selects between two or more options (e.g., voting behavior, choice of a punishment for another participant).

  • Reaction time. The time between the presentation of a stimulus and an appropriate response can indicate differences between two cognitive processes, and can indicate some things about their nature. For example, if in a search task the reaction times vary proportionally with the number of elements, this suggests that the cognitive process of searching involves serial rather than parallel processing (a small simulation sketch follows this list).
  • Psychophysical responses. Psychophysical experiments are an old psychological technique, which has been adopted by cognitive psychology. They typically involve making judgments of some physical property, e.g. the loudness of a sound. Correlation of subjective scales between individuals can show cognitive or sensory biases as compared to actual physical measurements. Some examples include:
    • sameness judgments for colors, tones, textures, etc.
    • threshold differences for colors, tones, textures, etc.
  • Eye tracking. This methodology is used to study a variety of cognitive processes, most notably visual perception and language processing. The fixation point of the eyes is linked to an individual's focus of attention. Thus, by monitoring eye movements, we can study what information is being processed at a given time. Eye tracking allows us to study cognitive processes on extremely short time scales. Eye movements reflect online decision making during a task, and they provide us with some insight into the ways in which those decisions may be processed.
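To make the reaction-time logic above concrete, the Python sketch below simulates hypothetical serial and parallel search tasks. The baseline times, per-item cost, and noise level are invented parameters for illustration only and are not drawn from any particular experiment; the point is simply that a roughly linear growth of reaction time with set size is the signature of serial inspection.

```python
# Minimal simulation of the serial-vs-parallel search logic described above.
# The per-item inspection cost, baseline time, and noise level are invented
# parameters for illustration; they do not come from any particular study.
import random

def serial_search_rt(set_size, base_ms=400, per_item_ms=50, noise_ms=20):
    """RT grows roughly linearly with the number of items inspected."""
    return base_ms + per_item_ms * set_size + random.gauss(0, noise_ms)

def parallel_search_rt(set_size, base_ms=450, noise_ms=20):
    """RT is roughly flat: all items are processed at once."""
    return base_ms + random.gauss(0, noise_ms)

for n in (4, 8, 16):
    serial = sum(serial_search_rt(n) for _ in range(200)) / 200
    parallel = sum(parallel_search_rt(n) for _ in range(200)) / 200
    print(f"set size {n:2d}: serial ~{serial:4.0f} ms, parallel ~{parallel:4.0f} ms")
```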

Brain imaging

Image of the human head with the brain. The arrow indicates the position of the hypothalamus.

Brain imaging involves analyzing activity within the brain while performing various tasks. This allows us to link behavior and brain function to help understand how information is processed. Different types of imaging techniques vary in their temporal (time-based) and spatial (location-based) resolution. Brain imaging is often used in cognitive neuroscience.

  • Single-photon emission computed tomography and positron emission tomography. SPECT and PET use radioactive isotopes, which are injected into the subject's bloodstream and taken up by the brain. By observing which areas of the brain take up the radioactive isotope, we can see which areas of the brain are more active than other areas. PET has similar spatial resolution to fMRI, but it has extremely poor temporal resolution.
  • Electroencephalography. EEG measures the electrical fields generated by large populations of neurons in the cortex by placing a series of electrodes on the scalp of the subject. This technique has an extremely high temporal resolution, but a relatively poor spatial resolution.
  • Functional magnetic resonance imaging. fMRI measures the relative amount of oxygenated blood flowing to different parts of the brain. More oxygenated blood in a particular region is assumed to correlate with an increase in neural activity in that part of the brain. This allows us to localize particular functions within different brain regions. fMRI has moderate spatial and temporal resolution.
  • Optical imaging. This technique uses infrared transmitters and receivers to measure the amount of light reflected by blood near different areas of the brain. Since oxygenated and deoxygenated blood reflect light in different amounts, we can study which areas are more active (i.e., those that have more oxygenated blood). Optical imaging has moderate temporal resolution, but poor spatial resolution. It also has the advantage that it is extremely safe and can be used to study infants' brains.
  • Magnetoencephalography. MEG measures magnetic fields resulting from cortical activity. It is similar to EEG, except that it has improved spatial resolution since the magnetic fields it measures are not as blurred or attenuated by the scalp, meninges and so forth as the electrical activity measured in EEG is. MEG uses SQUID sensors to detect tiny magnetic fields.

Computational modeling

An artificial neural network with two layers.

Computational models require a mathematically and logically formal representation of a problem. Computer models are used in the simulation and experimental verification of different specific and general properties of intelligence. Computational modeling can help us understand the functional organization of a particular cognitive phenomenon. Approaches to cognitive modeling can be categorized as: (1) symbolic, focusing on abstract mental functions of an intelligent mind by means of symbols; (2) subsymbolic, focusing on the neural and associative properties of the human brain; and (3) approaches that cross the symbolic–subsymbolic border, including hybrid models.

  • Symbolic modeling evolved from computer science paradigms using the technologies of knowledge-based systems, as well as from a philosophical perspective (e.g. "Good Old-Fashioned Artificial Intelligence" (GOFAI)). Such models were developed by the first cognitive researchers and later used in information engineering for expert systems. Since the early 1990s, symbolic modeling has been generalized in systemics for the investigation of functional human-like intelligence models, such as personoids, and, in parallel, developed as the SOAR environment. Recently, especially in the context of cognitive decision-making, symbolic cognitive modeling has been extended to the socio-cognitive approach, which includes social and organizational cognition interrelated with a sub-symbolic non-conscious layer.
  • Subsymbolic modeling includes connectionist/neural network models. Connectionism relies on the idea that the mind/brain is composed of simple nodes and that its problem-solving capacity derives from the connections between them; neural nets are textbook implementations of this approach (a minimal sketch follows this list). Some critics of this approach feel that while these models come closer to biological reality as a representation of how the system works, they lack explanatory power because, even in systems endowed with simple connection rules, the emerging complexity makes them less interpretable at the connection level than they apparently are at the macroscopic level.
  • Other approaches gaining in popularity include (1) dynamical systems theory, (2) mapping symbolic models onto connectionist models (neural-symbolic integration or hybrid intelligent systems), and (3) Bayesian models, which are often drawn from machine learning.
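As a minimal sketch of the subsymbolic approach mentioned above, the Python example below trains a small two-layer feedforward network on XOR with plain gradient descent. The layer sizes, learning rate, and iteration count are arbitrary illustrative choices and do not represent any particular cognitive model; the aim is only to show how problem-solving capacity can emerge from weighted connections between simple nodes.

```python
# Minimal subsymbolic (connectionist) sketch: a two-layer feedforward network
# trained by gradient descent on XOR. Layer sizes, learning rate, and epochs
# are arbitrary illustrative choices, not a claim about any cognitive model.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden weights
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    h = sigmoid(X @ W1 + b1)          # hidden activations
    out = sigmoid(h @ W2 + b2)        # network output
    # Backpropagate the squared-error gradient through both layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out
    b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h
    b1 -= 0.5 * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # should approach [0, 1, 1, 0]; results vary with initialization
```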

All of the above approaches tend either to be generalized into integrated computational models of a synthetic/abstract intelligence (i.e., a cognitive architecture), in order to be applied to the explanation and improvement of individual and social/organizational decision-making and reasoning, or to focus on single simulative programs (or microtheories/"middle-range" theories) modelling specific cognitive faculties (e.g. vision, language, categorization, etc.).

Neurobiological methods

Research methods borrowed directly from neuroscience and neuropsychology can also help us to understand aspects of intelligence. These methods allow us to understand how intelligent behavior is implemented in a physical system.

Key findings

Cognitive science has given rise to models of human cognitive bias and risk perception, and has been influential in the development of behavioral finance, part of economics. It has also given rise to a new theory of the philosophy of mathematics (related to denotational mathematics), and many theories of artificial intelligence, persuasion and coercion. It has made its presence known in the philosophy of language and epistemology as well as constituting a substantial wing of modern linguistics. Fields of cognitive science have been influential in understanding the brain's particular functional systems (and functional deficits) ranging from speech production to auditory processing and visual perception. It has made progress in understanding how damage to particular areas of the brain affects cognition, and it has helped to uncover the root causes and results of specific dysfunctions, such as dyslexia, anopia, and hemispatial neglect.

Notable researchers

Name Year of birth Year of contribution Contribution(s)
David Chalmers 1966 1995 Dualism, hard problem of consciousness
Daniel Dennett 1942 1987 Offered a computational systems perspective (Multiple drafts model)
John Searle 1932 1980 Chinese room
Douglas Hofstadter 1945 1979 Gödel, Escher, Bach
Jerry Fodor 1935 1968, 1975 Functionalism
Alan Baddeley 1934 1974 Baddeley's model of working memory
Marvin Minsky 1927 1970s, early 1980s Wrote computer programs in languages such as LISP to attempt to formally characterize the steps that human beings go through, such as making decisions and solving problems
Christopher Longuet-Higgins 1923 1973 Coined the term cognitive science
Noam Chomsky 1928 1959 Published a review of B.F. Skinner's book Verbal Behavior which began cognitivism against then-dominant behaviorism
George Miller 1920 1956 Wrote about the capacities of human thinking through mental representations
Herbert Simon 1916 1956 Co-created Logic Theory Machine and General Problem Solver with Allen Newell, EPAM (Elementary Perceiver and Memorizer) theory, organizational decision-making
John McCarthy 1927 1955 Coined the term artificial intelligence and organized the famous Dartmouth conference in Summer 1956, which started AI as a field
McCulloch and Pitts  1930s–1940s Developed early artificial neural networks
J. C. R. Licklider 1915  Established MIT Sloan School of Management
Lila R. Gleitman 1929 1970s-2010s Wide-ranging contributions to understanding the cognition of language acquisition, including syntactic bootstrapping theory
Eleanor Rosch 1938 1976 Development of the Prototype Theory of categorisation
Philip N. Johnson-Laird 1936 1980 Introduced the idea of mental models in cognitive science
Dedre Gentner 1944 1983 Development of the Structure-mapping Theory of analogical reasoning
Allen Newell 1927 1990 Development of the field of Cognitive architecture in cognitive modelling and artificial intelligence
Annette Karmiloff-Smith 1938 1992 Integrating neuroscience and computational modelling into theories of cognitive development
David Marr 1945 1990 Proponent of the Three-Level Hypothesis of levels of analysis of computational systems
Peter Gärdenfors 1949 2000 Creator of the conceptual space framework used in cognitive modelling and artificial intelligence.
Linda B. Smith 1951 1993 Together with Esther Thelen, created a dynamical systems approach to understanding cognitive development

Some of the more recognized names in cognitive science are usually either the most controversial or the most cited. Within philosophy, some familiar names include Daniel Dennett, who writes from a computational systems perspective, John Searle, known for his controversial Chinese room argument, and Jerry Fodor, who advocates functionalism.

Others include David Chalmers, who advocates Dualism and is also known for articulating the hard problem of consciousness, and Douglas Hofstadter, famous for writing Gödel, Escher, Bach, which questions the nature of words and thought.

In the realm of linguistics, Noam Chomsky and George Lakoff have been influential (both have also become notable as political commentators). In artificial intelligence, Marvin Minsky, Herbert A. Simon, and Allen Newell are prominent.

Popular names in the discipline of psychology include George A. Miller, James McClelland, Philip Johnson-Laird, Lawrence Barsalou, Vittorio Guidano, Howard Gardner and Steven Pinker. Anthropologists Dan Sperber, Edwin Hutchins, Bradd Shore, James Wertsch and Scott Atran have been involved in collaborative projects with cognitive and social psychologists, political scientists and evolutionary biologists in attempts to develop general theories of culture formation, religion, and political association.

Computational theories (with models and simulations) have also been developed, by David Rumelhart, James McClelland and Philip Johnson-Laird.

Epistemics

Epistemics is a term coined in 1969 at the University of Edinburgh with the foundation of its School of Epistemics. Epistemics is to be distinguished from epistemology in that epistemology is the philosophical theory of knowledge, whereas epistemics signifies the scientific study of knowledge.

Christopher Longuet-Higgins has defined it as "the construction of formal models of the processes (perceptual, intellectual, and linguistic) by which knowledge and understanding are achieved and communicated." In his 1978 essay "Epistemics: The Regulative Theory of Cognition", Alvin I. Goldman claims to have coined the term "epistemics" to describe a reorientation of epistemology. Goldman maintains that his epistemics is continuous with traditional epistemology and the new term is only to avoid opposition. Epistemics, in Goldman's version, differs only slightly from traditional epistemology in its alliance with the psychology of cognition; epistemics stresses the detailed study of mental processes and information-processing mechanisms that lead to knowledge or beliefs.

In the mid-1980s, the School of Epistemics was renamed as The Centre for Cognitive Science (CCS). In 1998, CCS was incorporated into the University of Edinburgh's School of Informatics.

Microbial biodegradation

From Wikipedia, the free encyclopedia

Microbial biodegradation is the use of bioremediation and biotransformation methods to harness the naturally occurring ability of microbial xenobiotic metabolism to degrade, transform or accumulate environmental pollutants, including hydrocarbons (e.g. oil), polychlorinated biphenyls (PCBs), polyaromatic hydrocarbons (PAHs), heterocyclic compounds (such as pyridine or quinoline), pharmaceutical substances, radionuclides and metals.

Interest in the microbial biodegradation of pollutants has intensified in recent years, and recent major methodological breakthroughs have enabled detailed genomic, metagenomic, proteomic, bioinformatic and other high-throughput analyses of environmentally relevant microorganisms, providing new insights into biodegradative pathways and the ability of organisms to adapt to changing environmental conditions.

Biological processes play a major role in the removal of contaminants and take advantage of the catabolic versatility of microorganisms to degrade or convert such compounds. In environmental microbiology, genome-based global studies are increasing the understanding of metabolic and regulatory networks, as well as providing new information on the evolution of degradation pathways and molecular adaptation strategies to changing environmental conditions.

Aerobic biodegradation of pollutants

The increasing amount of bacterial genomic data provides new opportunities for understanding the genetic and molecular bases of the degradation of organic pollutants. Aromatic compounds are among the most persistent of these pollutants, and lessons can be learned from the recent genomic studies of Burkholderia xenovorans LB400 and Rhodococcus sp. strain RHA1, two of the largest bacterial genomes completely sequenced to date. These studies have helped expand our understanding of bacterial catabolism, non-catabolic physiological adaptation to organic compounds, and the evolution of large bacterial genomes.

First, the metabolic pathways from phylogenetically diverse isolates are very similar with respect to overall organization. Thus, as originally noted in pseudomonads, a large number of "peripheral aromatic" pathways funnel a range of natural and xenobiotic compounds into a restricted number of "central aromatic" pathways. Nevertheless, these pathways are genetically organized in genus-specific fashions, as exemplified by the β-ketoadipate and Paa pathways. Comparative genomic studies further reveal that some pathways are more widespread than initially thought. Thus, the Box and Paa pathways illustrate the prevalence of non-oxygenolytic ring-cleavage strategies in aerobic aromatic degradation processes. Functional genomic studies have been useful in establishing that even organisms harboring high numbers of homologous enzymes seem to contain few examples of true redundancy. For example, the multiplicity of ring-cleaving dioxygenases in certain rhodococcal isolates may be attributed to the cryptic aromatic catabolism of different terpenoids and steroids.

Finally, analyses have indicated that recent genetic flux appears to have played a more significant role in the evolution of some large genomes, such as LB400's, than others. However, the emerging trend is that the large gene repertoires of potent pollutant degraders such as LB400 and RHA1 have evolved principally through more ancient processes. That this is true in such phylogenetically diverse species is remarkable and further suggests the ancient origin of this catabolic capacity.

Anaerobic biodegradation of pollutants

Anaerobic microbial mineralization of recalcitrant organic pollutants is of great environmental significance and involves intriguing novel biochemical reactions. In particular, hydrocarbons and halogenated compounds have long been doubted to be degradable in the absence of oxygen, but the isolation of hitherto unknown anaerobic hydrocarbon-degrading and reductively dehalogenating bacteria during the last decades provided ultimate proof for these processes in nature. While such research involved mostly chlorinated compounds initially, recent studies have revealed reductive dehalogenation of bromine and iodine moieties in aromatic pesticides. Other reactions, such as biologically induced abiotic reduction by soil minerals, have been shown to deactivate relatively persistent aniline-based herbicides far more rapidly than observed in aerobic environments.

Many novel biochemical reactions were discovered that enable the respective metabolic pathways, but progress in the molecular understanding of these bacteria was rather slow, since genetic systems are not readily applicable for most of them. However, with the increasing application of genomics in the field of environmental microbiology, a new and promising perspective is now at hand to obtain molecular insights into these new metabolic properties. Several complete genome sequences were determined during the last few years from bacteria capable of anaerobic organic pollutant degradation. The ~4.7 Mb genome of the facultatively denitrifying Aromatoleum aromaticum strain EbN1 was the first to be determined for an anaerobic hydrocarbon degrader (using toluene or ethylbenzene as substrates). The genome sequence revealed about two dozen gene clusters (including several paralogs) coding for a complex catabolic network for anaerobic and aerobic degradation of aromatic compounds. The genome sequence forms the basis for current detailed studies on the regulation of pathways and enzyme structures.

Further genomes of anaerobic hydrocarbon-degrading bacteria were recently completed for the iron-reducing species Geobacter metallireducens (accession nr. NC_007517) and the perchlorate-reducing Dechloromonas aromatica (accession nr. NC_007298), but these have not yet been evaluated in formal publications. Complete genomes were also determined for bacteria capable of anaerobic degradation of halogenated hydrocarbons by halorespiration: the ~1.4 Mb genomes of Dehalococcoides ethenogenes strain 195 and Dehalococcoides sp. strain CBDB1 and the ~5.7 Mb genome of Desulfitobacterium hafniense strain Y51. Characteristic for all these bacteria is the presence of multiple paralogous genes for reductive dehalogenases, implicating a wider dehalogenating spectrum of the organisms than previously known. Moreover, genome sequences have provided unprecedented insights into the evolution of reductive dehalogenation and differing strategies for niche adaptation.

Recently, it has become apparent that some organisms, including Desulfitobacterium chlororespirans, originally evaluated for halorespiration on chlorophenols, can also use certain brominated compounds, such as the herbicide bromoxynil and its major metabolite as electron acceptors for growth. Iodinated compounds may be dehalogenated as well, though the process may not satisfy the need for an electron acceptor.

Bioavailability, chemotaxis, and transport of pollutants

Bioavailability, or the amount of a substance that is physicochemically accessible to microorganisms, is a key factor in the efficient biodegradation of pollutants. O'Loughlin et al. (2000) showed that, with the exception of kaolinite clay, most soil clays and cation exchange resins attenuated biodegradation of 2-picoline by Arthrobacter sp. strain R1, as a result of adsorption of the substrate to the clays. Chemotaxis, or the directed movement of motile organisms towards or away from chemicals in the environment, is an important physiological response that may contribute to effective catabolism of molecules in the environment. In addition, mechanisms for the intracellular accumulation of aromatic molecules via various transport mechanisms are also important.

Oil biodegradation

General overview of microbial biodegradation of petroleum oil by microbial communities. Some microorganisms, such as A. borkumensis, are able to use hydrocarbons as their source of carbon in metabolism. They are able to oxidize the environmentally harmful hydrocarbons while producing harmless products, following the general (schematic, unbalanced) equation CnHn + O2 → H2O + CO2. In the figure, carbon is represented as yellow circles, oxygen as pink circles, and hydrogen as blue circles. This type of specialized metabolism allows these microbes to thrive in areas affected by oil spills and makes them important in the elimination of environmental pollutants.
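The caption's formula is only a schematic shorthand. For a generic saturated hydrocarbon (alkane), the complete aerobic oxidation can be written in balanced form as below; the alkane formula CnH2n+2 is an assumption standing in for the mixture of hydrocarbons actually present in crude oil.

```latex
% Balanced complete oxidation of a generic alkane (illustrative form;
% crude oil is a mixture, so n varies across its components).
\[
\mathrm{C}_n\mathrm{H}_{2n+2} \;+\; \tfrac{3n+1}{2}\,\mathrm{O}_2
  \;\longrightarrow\; n\,\mathrm{CO}_2 \;+\; (n+1)\,\mathrm{H}_2\mathrm{O}
\]
```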

Petroleum oil contains aromatic compounds that are toxic to most life forms. Episodic and chronic pollution of the environment by oil causes major disruption to the local ecological environment. Marine environments in particular are especially vulnerable, as oil spills near coastal regions and in the open sea are difficult to contain and make mitigation efforts more complicated. In addition to pollution through human activities, approximately 250 million litres of petroleum enter the marine environment every year from natural seepages. Despite its toxicity, a considerable fraction of petroleum oil entering marine systems is eliminated by the hydrocarbon-degrading activities of microbial communities, in particular by a recently discovered group of specialists, the hydrocarbonoclastic bacteria (HCB). Alcanivorax borkumensis was the first HCB to have its genome sequenced. In addition to hydrocarbons, crude oil often contains various heterocyclic compounds, such as pyridine, which appear to be degraded by similar mechanisms to hydrocarbons.

Cholesterol biodegradation

Many synthetic steroid compounds, such as some sex hormones, frequently appear in municipal and industrial wastewaters, acting as environmental pollutants with strong metabolic activities that negatively affect ecosystems. Since these compounds are common carbon sources for many different microorganisms, their aerobic and anaerobic mineralization has been extensively studied. The interest of these studies lies in the biotechnological applications of sterol-transforming enzymes for the industrial synthesis of sex hormones and corticoids. Very recently, the catabolism of cholesterol has acquired particular relevance because it is involved in the infectivity of the pathogen Mycobacterium tuberculosis (Mtb). Mtb causes tuberculosis, and it has been demonstrated that novel enzyme architectures have evolved to bind and modify steroid compounds such as cholesterol in this organism and in other steroid-utilizing bacteria as well. These new enzymes might be of interest for their potential in the chemical modification of steroid substrates.

Analysis of waste biotreatment

Sustainable development requires the promotion of environmental management and a constant search for new technologies to treat the vast quantities of wastes generated by increasing anthropogenic activities. Biotreatment, the processing of wastes using living organisms, is an environmentally friendly, relatively simple and cost-effective alternative to physico-chemical clean-up options. Confined environments, such as bioreactors, have been engineered to overcome the physical, chemical and biological limiting factors of biotreatment processes in highly controlled systems. The great versatility in the design of confined environments allows the treatment of a wide range of wastes under optimized conditions. To perform a correct assessment, it is necessary to consider various microorganisms having a variety of genomes and expressed transcripts and proteins, and a great number of analyses are often required. With traditional genomic techniques, such assessments are limited and time-consuming. However, several high-throughput techniques originally developed for medical studies can be applied to assess biotreatment in confined environments.

Metabolic engineering and biocatalytic applications

The study of the fate of persistent organic chemicals in the environment has revealed a large reservoir of enzymatic reactions with great potential in preparative organic synthesis, which has already been exploited for a number of oxygenases on pilot and even industrial scale. Novel catalysts can be obtained from metagenomic libraries and DNA sequence-based approaches. Our increasing capabilities in adapting the catalysts to specific reactions and process requirements by rational and random mutagenesis broaden the scope for application in the fine chemical industry, but also in the field of biodegradation. In many cases, these catalysts need to be exploited in whole-cell bioconversions or in fermentations, calling for system-wide approaches to understanding strain physiology and metabolism, and for rational approaches to the engineering of whole cells, as increasingly put forward in the area of systems biotechnology and synthetic biology.

Fungal biodegradation

In the ecosystem, different substrates are attacked at different rates by consortia of organisms from different kingdoms. Aspergillus and other moulds play an important role in these consortia because they are adept at recycling starches, hemicelluloses, celluloses, pectins and other sugar polymers. Some aspergilli are capable of degrading more refractory compounds such as fats, oils, chitin, and keratin. Maximum decomposition occurs when there is sufficient nitrogen, phosphorus and other essential inorganic nutrients. Fungi also provide food for many soil organisms.

For Aspergillus the process of degradation is the means of obtaining nutrients. When these moulds degrade human-made substrates, the process usually is called biodeterioration. Both paper and textiles (cotton, jute, and linen) are particularly vulnerable to Aspergillus degradation. Our artistic heritage is also subject to Aspergillus assault. To give but one example, after Florence, Italy flooded in 1966, 74% of the isolates from a damaged Ghirlandaio fresco in the Ognissanti church were Aspergillus versicolor.

Sea level rise

From Wikipedia, the free encyclopedia

The global average sea ...