
Sunday, May 14, 2023

Technology

A steam turbine with the case opened, an example of energy technology

Technology is the application of knowledge for achieving practical goals in a reproducible way. The word technology can also mean the products resulting from such efforts, including both tangible tools such as utensils or machines, and intangible ones such as software. Technology plays a critical role in science, engineering, and everyday life.

Technological advancements have led to significant changes in society. The earliest known technology is the stone tool, used during prehistoric times, followed by the control of fire, which contributed to the growth of the human brain and the development of language during the Ice Age. The invention of the wheel in the Bronze Age allowed greater travel and the creation of more complex machines. More recent technological inventions, including the printing press, telephone, and the Internet, have lowered barriers to communication and ushered in the knowledge economy.

While technology contributes to economic development and improves human prosperity, it can also have negative impacts like pollution and resource depletion, and can cause social harms like technological unemployment resulting from automation. As a result, there are ongoing philosophical and political debates about the role and use of technology, the ethics of technology, and ways to mitigate its downsides.

Etymology

Technology is a term dating back to the early 17th century that meant 'systematic treatment' (from Greek Τεχνολογία, from the Greek: τέχνη, romanized: tékhnē, lit. 'craft, art' and -λογία, 'study, knowledge'). It is predated in use by the Ancient Greek word tékhnē, used to mean 'knowledge of how to make things', which encompassed activities like architecture.

Starting in the 19th century, continental Europeans began using the terms Technik (German) or technique (French) to refer to a 'way of doing', which included all technical arts, such as dancing, navigation, or printing, whether or not they required tools or instruments. At the time, Technologie (German and French) referred either to the academic discipline studying the "methods of arts and crafts", or to the political discipline "intended to legislate on the functions of the arts and crafts." Since the distinction between Technik and Technologie is absent in English, both were translated as technology. The term was previously uncommon in English and mostly referred to the academic discipline, as in the Massachusetts Institute of Technology.

In the 20th century, as a result of scientific progress and the Second Industrial Revolution, technology stopped being considered a distinct academic discipline and took on its current-day meaning: the systemic use of knowledge to practical ends.

History

Prehistoric

A person holding a hand axe

Tools were initially developed by hominids through observation and trial and error. Around 2 Mya (million years ago), they learned to make the first stone tools by hammering flakes off a pebble, forming a sharp hand axe. This practice was refined 75 kya (thousand years ago) into pressure flaking, enabling much finer work.

The discovery of fire was described by Charles Darwin as "possibly the greatest ever made by man". Archaeological, dietary, and social evidence point to "continuous [human] fire-use" at least 1.5 Mya. Fire, fueled with wood and charcoal, allowed early humans to cook their food to increase its digestibility, improving its nutrient value and broadening the number of foods that could be eaten. The cooking hypothesis proposes that the ability to cook promoted an increase in hominid brain size, though some researchers find the evidence inconclusive. Archaeological evidence of hearths was dated to 790 kya; researchers believe this is likely to have intensified human socialization and may have contributed to the emergence of language.

Other technological advances made during the Paleolithic era include clothing and shelter. No consensus exists on the approximate time of adoption of either technology, but archaeologists have found evidence of clothing from 90–120 kya and of shelter from 450 kya. As the Paleolithic era progressed, dwellings became more sophisticated and elaborate; as early as 380 kya, humans were constructing temporary wood huts. Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa around 200 kya, initially moving to Eurasia.

Neolithic

Photo of Neolithic tools on display
An array of Neolithic artifacts, including bracelets, axe heads, chisels, and polishing tools

The Neolithic Revolution (or First Agricultural Revolution) brought about an acceleration of technological innovation, and a consequent increase in social complexity. The invention of the polished stone axe was a major advance that allowed large-scale forest clearance and farming. Use of polished stone axes increased greatly in the Neolithic, but they had originally been used in the preceding Mesolithic in some areas such as Ireland. Agriculture fed larger populations, and the transition to sedentism allowed for the simultaneous raising of more children, as infants no longer needed to be carried around by nomads. Additionally, children could contribute labor to the raising of crops more readily than they could participate in hunter-gatherer activities.

With this increase in population and availability of labor came an increase in labor specialization. What triggered the progression from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer, is not specifically known; however, the emergence of increasingly hierarchical social structures and specialized labor, of trade and war amongst adjacent cultures, and the need for collective action to overcome environmental challenges such as irrigation, are all thought to have played a role.

Continuing improvements led to the furnace and bellows and provided, for the first time, the ability to smelt and forge gold, copper, silver, and lead – native metals found in relatively pure form in nature. The advantages of copper tools over stone, bone, and wooden tools were quickly apparent to early humans, and native copper was probably used from near the beginning of Neolithic times (about 10 kya). Native copper does not naturally occur in large amounts, but copper ores are quite common and some of them produce metal easily when burned in wood or charcoal fires. Eventually, the working of metals led to the discovery of alloys such as bronze and brass (about 4,000 BCE). The first use of iron alloys such as steel dates to around 1,800 BCE.

Ancient

Photo of an early wooden wheel
The wheel was invented c. 4,000 BCE.

After harnessing fire, humans discovered other forms of energy. The earliest known use of wind power is the sailing ship; the earliest record of a ship under sail is that of a Nile boat dating to around 7,000 BCE. From prehistoric times, Egyptians likely used the power of the annual flooding of the Nile to irrigate their lands, gradually learning to regulate much of it through purposely built irrigation channels and "catch" basins. The ancient Sumerians in Mesopotamia used a complex system of canals and levees to divert water from the Tigris and Euphrates rivers for irrigation.

Archaeologists estimate that the wheel was invented independently and concurrently in Mesopotamia (in present-day Iraq), the Northern Caucasus (Maykop culture), and Central Europe. Time estimates range from 5,500 to 3,000 BCE with most experts putting it closer to 4,000 BCE. The oldest artifacts with drawings depicting wheeled carts date from about 3,500 BCE. More recently, the oldest-known wooden wheel in the world was found in the Ljubljana Marsh of Slovenia.

The invention of the wheel revolutionized trade and war. It did not take long to discover that wheeled wagons could be used to carry heavy loads. The ancient Sumerians used a potter's wheel and may have invented it. A stone pottery wheel found in the city-state of Ur dates to around 3,429 BCE, and even older fragments of wheel-thrown pottery have been found in the same area. Fast (rotary) potters' wheels enabled early mass production of pottery, but it was the use of the wheel as a transformer of energy (through water wheels, windmills, and even treadmills) that revolutionized the application of nonhuman power sources. The first two-wheeled carts were derived from travois and were first used in Mesopotamia and Iran in around 3,000 BCE.

The oldest known constructed roadways are the stone-paved streets of the city-state of Ur, dating to c. 4,000 BCE, and timber roads leading through the swamps of Glastonbury, England, dating to around the same period. The first long-distance road, which came into use around 3,500 BCE, spanned 2,400 km from the Persian Gulf to the Mediterranean Sea, but was not paved and was only partially maintained. In around 2,000 BCE, the Minoans on the Greek island of Crete built a 50 km road leading from the palace of Gortyn on the south side of the island, through the mountains, to the palace of Knossos on the north side of the island. Unlike the earlier road, the Minoan road was completely paved.

Photograph of the Pont du Gard in France, one of the most famous ancient Roman aqueducts

Ancient Minoan private homes had running water. A bathtub virtually identical to modern ones was unearthed at the Palace of Knossos. Several Minoan private homes also had toilets, which could be flushed by pouring water down the drain. The ancient Romans had many public flush toilets, which emptied into an extensive sewage system. The primary sewer in Rome was the Cloaca Maxima; construction began on it in the sixth century BCE and it is still in use today.

The ancient Romans also had a complex system of aqueducts, which were used to transport water across long distances. The first Roman aqueduct was built in 312 BCE. The eleventh and final ancient Roman aqueduct was built in 226 CE. Put together, the Roman aqueducts extended over 450 km, but less than 70 km of this was above ground and supported by arches.

Pre-modern

Innovations continued through the Middle Ages with the introduction of silk production (in Asia and later Europe), the horse collar, and horseshoes. Simple machines (such as the lever, the screw, and the pulley) were combined into more complicated tools, such as the wheelbarrow, windmills, and clocks. A system of universities, including Oxford and Cambridge, developed and spread scientific ideas and practices.

The Renaissance era produced many innovations, including the introduction of the movable type printing press to Europe, which facilitated the communication of knowledge. Technology became increasingly influenced by science, beginning a cycle of mutual advancement.

Modern

Photo of a Ford Model T on a road
The automobile revolutionized personal transportation.

Starting in the United Kingdom in the 18th century, the discovery of steam power set off the Industrial Revolution, which saw wide-ranging technological discoveries, particularly in the areas of agriculture, manufacturing, mining, metallurgy, and transport, and the widespread application of the factory system. This was followed a century later by the Second Industrial Revolution which led to rapid scientific discovery, standardization, and mass production. New technologies were developed, including sewage systems, electricity, light bulbs, electric motors, railroads, automobiles, and airplanes. These technological advances led to significant developments in medicine, chemistry, physics, and engineering. They were accompanied by consequential social change, with the introduction of skyscrapers accompanied by rapid urbanization. Communication improved with the invention of the telegraph, the telephone, the radio, and television.

The 20th century brought a host of innovations. In physics, the discovery of nuclear fission in the Atomic Age led to both nuclear weapons and nuclear power. Computers were invented and later shifted from analog to digital in the Digital Revolution. Information technology, particularly optical fiber and optical amplifiers, led to the birth of the Internet, which ushered in the Information Age. The Space Age began with the launch of Sputnik 1 in 1957 and continued with crewed missions to the Moon in the 1960s. Organized efforts to search for extraterrestrial intelligence have used radio telescopes to detect signs of technology use, or technosignatures, given off by alien civilizations. In medicine, new technologies were developed for diagnosis (CT, PET, and MRI scanning), treatment (like the dialysis machine, defibrillator, pacemaker, and a wide array of new pharmaceutical drugs), and research (like interferon cloning and DNA microarrays).

Complex manufacturing and construction techniques and organizations are needed to make and maintain more modern technologies, and entire industries have arisen to develop succeeding generations of increasingly complex tools. Modern technology increasingly relies on training and education: its designers, builders, maintainers, and users often require sophisticated general and specific training. Moreover, these technologies have become so complex that entire fields have developed to support them, including engineering, medicine, and computer science; and other fields have become more complex, such as construction, transportation, and architecture.

Impact

Technological change is the largest cause of long-term economic growth. Throughout human history, energy production was the main constraint on economic development, and new technologies allowed humans to significantly increase the amount of available energy. First came fire, which made edible a wider variety of foods, and made it less physically demanding to digest them. Fire also enabled smelting, and the use of tin, copper, and iron tools, used for hunting or tradesmanship. Then came the agricultural revolution: humans no longer needed to hunt or gather to survive, and began to settle in towns and cities, forming more complex societies, with militaries and more organized forms of religion.

Technologies have contributed to human welfare through increased prosperity, improved comfort and quality of life, and medical progress, but they can also disrupt existing social hierarchies, cause pollution, and harm individuals or groups.

Recent years have brought about a rise in social media's cultural prominence, with potential repercussions on democracy and on economic and social life. Early on, the internet was seen as a "liberation technology" that would democratize knowledge, improve access to education, and promote democracy. Modern research has turned to investigating the internet's downsides, including disinformation, polarization, hate speech, and propaganda.

Since the 1970s, technology's impact on the environment has been criticized, leading to a surge in investment in solar, wind, and other forms of clean energy.

Social

Jobs

Photo of a car assembly line, with numerous robots
A Volkswagen electric car being built with Siemens automation technology

Since the invention of the wheel, technologies have helped increase humans' economic output. Past automation has both substituted for and complemented labor; machines replaced humans at some lower-paying jobs (for example in agriculture), but this was compensated by the creation of new, higher-paying jobs. Studies have found that computers did not create significant net technological unemployment. Because artificial intelligence is far more capable than earlier computing, and is still in its infancy, it is not known whether it will follow the same trend; the question has been debated at length among economists and policymakers. A 2017 survey found no clear consensus among economists on whether AI would increase long-term unemployment. According to the World Economic Forum's "The Future of Jobs Report 2020", AI is predicted to replace 85 million jobs worldwide and create 97 million new jobs by 2025. A study of the U.S. from 1990 to 2007 by MIT economist Daron Acemoglu showed that the addition of one robot for every 1,000 workers decreased the employment-to-population ratio by 0.2%, or about 3.3 workers, and lowered wages by 0.42%. Concerns about technology replacing human labor, however, are long-standing. As US president Lyndon Johnson said in 1964, upon signing the bill establishing the National Commission on Technology, Automation, and Economic Progress, “Technology is creating both new opportunities and new obligations for us, opportunity for greater productivity and progress; obligation to be sure that no workingman, no family must pay an unjust price for progress.”
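
The Acemoglu figures quoted above reduce to simple per-robot coefficients, so a back-of-the-envelope calculation can show what they imply for a hypothetical labor market. The sketch below is only an illustration of that arithmetic (the function name and example numbers are invented here), not the study's econometric model.

    # Rough per-robot coefficients taken from the Acemoglu study cited above
    # (one robot per 1,000 workers: about -0.2 pp employment-to-population ratio,
    # about 3.3 fewer workers employed per robot, and about 0.42% lower wages).
    # Illustrative arithmetic only, not the study's actual model.
    def robot_impact_estimate(workers: int, robots_added: int) -> dict:
        robots_per_thousand = robots_added / (workers / 1000)
        return {
            "employment_ratio_change_pp": -0.2 * robots_per_thousand,
            "estimated_jobs_displaced": 3.3 * robots_added,
            "wage_change_percent": -0.42 * robots_per_thousand,
        }

    # Example: a labor market of 500,000 workers adding 1,000 robots (2 per 1,000 workers)
    print(robot_impact_estimate(workers=500_000, robots_added=1_000))
    # {'employment_ratio_change_pp': -0.4, 'estimated_jobs_displaced': 3300.0, 'wage_change_percent': -0.84}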

Security

With the growing reliance on technology, security and privacy concerns have grown along with it. Billions of people use online payment methods such as WeChat Pay, PayPal, Alipay, and many more to transfer money. Although security measures are in place, some criminals are able to bypass them. In March 2022, North Korean hackers stole over $600 million worth of cryptocurrency from the owner of the game Axie Infinity and then used Blender.io, a mixer that helped hide their cryptocurrency exchanges, to launder over $20.5 million of it. Because of this, the U.S. Treasury Department sanctioned Blender.io, the first time it had taken action against a mixer, to try to crack down on North Korean hackers. The privacy of cryptocurrency has been debated. Although many customers like the privacy of cryptocurrency, many also argue that it needs more transparency and stability.

Environmental

Technology has had both negative and positive environmental impacts, with the positive efforts usually being attempts to reverse earlier damage: the creation of pollution and the attempt to undo it, deforestation and its reversal, and oil spills and their cleanup. All of these have had a significant impact on the Earth's environment. As technology has advanced, so has its negative environmental impact, most notably the release of greenhouse gases such as methane and carbon dioxide into the atmosphere, which drives the greenhouse effect, gradually heating the Earth and causing global warming.

Pollution

Pollution, the presence of contaminants in an environment that causes adverse effects, could have been present as early as the Inca empire. They used a lead sulfide flux in the smelting of ores, along with the use of a wind-drafted clay kiln, which released lead into the atmosphere and the sediment of rivers.

Philosophy

Philosophy of technology is a branch of philosophy that studies the "practice of designing and creating artifacts", and the "nature of the things so created." It emerged as a discipline over the past two centuries, and has grown "considerably" since the 1970s. The humanities philosophy of technology is concerned with the "meaning of technology for, and its impact on, society and culture".

Initially, technology was seen as an extension of the human organism that replicated or amplified bodily and mental faculties. Marx framed it as a tool used by capitalists to oppress the proletariat, but believed that technology would be a fundamentally liberating force once it was "freed from societal deformations". Second-wave philosophers like Ortega later shifted their focus from economics and politics to "daily life and living in a techno-material culture," arguing that technology could oppress "even the members of the bourgeoisie who were its ostensible masters and possessors." Third-stage philosophers like Don Ihde and Albert Borgmann represent a turn toward de-generalization and empiricism, and considered how humans can learn to live with technology.

Early scholarship on technology was split between two arguments: technological determinism, and social construction. Technological determinism is the idea that technologies cause unavoidable social changes. It usually encompasses a related argument, technological autonomy, which asserts that technological progress follows a natural progression and cannot be prevented. Social constructivists argue that technologies follow no natural progression, and are shaped by cultural values, laws, politics, and economic incentives. Modern scholarship has shifted towards an analysis of sociotechnical systems, "assemblages of things, people, practices, and meanings", looking at the value judgments that shape technology.

Cultural critic Neil Postman distinguished tool-using societies from technological societies and from what he called "technopolies," societies that are dominated by an ideology of technological and scientific progress to the detriment of other cultural practices, values, and world views. Herbert Marcuse and John Zerzan suggest that technological society will inevitably deprive us of our freedom and psychological health.

Ethics

The ethics of technology is an interdisciplinary subfield of ethics that analyzes technology's ethical implications and explores ways to mitigate the potential negative impacts of new technologies. There is a broad range of ethical issues revolving around technology, from specific areas of focus affecting professionals working with technology to broader social, ethical, and legal issues concerning the role of technology in society and everyday life.

Prominent debates have surrounded genetically modified organisms, the use of robotic soldiers, algorithmic bias, and the issue of aligning AI behavior with human values.

Technology ethics encompasses several key fields. Bioethics looks at ethical issues surrounding biotechnologies and modern medicine, including cloning, human genetic engineering, and stem cell research. Computer ethics focuses on issues related to computing. Cyberethics explores internet-related issues like intellectual property rights, privacy, and censorship. Nanoethics examines issues surrounding the alteration of matter at the atomic and molecular level in various disciplines including computer science, engineering, and biology. And engineering ethics deals with the professional standards of engineers, including software engineers and their moral responsibilities to the public.

A wide branch of technology ethics is concerned with the ethics of artificial intelligence: it includes robot ethics, which deals with ethical issues involved in the design, construction, use, and treatment of robots, as well as machine ethics, which is concerned with ensuring the ethical behavior of artificially intelligent agents. Within the field of AI ethics, significant yet-unsolved research problems include AI alignment (ensuring that AI behaviors are aligned with their creators' intended goals and interests) and the reduction of algorithmic bias. Some researchers have warned against the hypothetical risk of an AI takeover, and have advocated for the use of AI capability control in addition to AI alignment methods.

Other fields of ethics have had to contend with technology-related issues, including military ethics, media ethics, and educational ethics.

Futures studies

Futures studies is the systematic and interdisciplinary study of social and technological progress. It aims to quantitatively and qualitatively explore the range of plausible futures and to incorporate human values in the development of new technologies. More generally, futures researchers are interested in improving "the freedom and welfare of humankind". It relies on a thorough quantitative and qualitative analysis of past and present technological trends, and attempts to rigorously extrapolate them into the future. Science fiction is often used as a source of ideas. Futures research methodologies include survey research, modeling, statistical analysis, and computer simulations.

Existential risk

Existential risk researchers analyze risks that could lead to human extinction or civilizational collapse, and look for ways to build resilience against them. Relevant research centers include the Cambridge Center for the Study of Existential Risk, and the Stanford Existential Risk Initiative. Future technologies may contribute to the risks of artificial general intelligence, biological warfare, nuclear warfare, nanotechnology, anthropogenic climate change, global warming, or stable global totalitarianism, though technologies may also help us mitigate asteroid impacts and gamma-ray bursts. In 2019 philosopher Nick Bostrom introduced the notion of a vulnerable world, "one in which there is some level of technological development at which civilization almost certainly gets devastated by default", citing the risks of a pandemic caused by bioterrorists, or an arms race triggered by the development of novel armaments and the loss of mutual assured destruction. He invites policymakers to question the assumptions that technological progress is always beneficial, that scientific openness is always preferable, or that they can afford to wait until a dangerous technology has been invented before they prepare mitigations.

Emerging technologies

Photo of a scientist looking at a microscope pointed at a petri dish
Experimental 3D printing of muscle tissue

Emerging technologies are novel technologies whose development or practical applications are still largely unrealized. They include nanotechnology, biotechnology, robotics, 3D printing, blockchains, and artificial intelligence.

In 2005, futurist Ray Kurzweil claimed the next technological revolution would rest upon advances in genetics, nanotechnology, and robotics, with robotics being the most impactful of the three. Genetic engineering will allow far greater control over human biological nature through a process called directed evolution. Some thinkers believe that this may shatter our sense of self, and have urged renewed public debate exploring the issue more thoroughly; others fear that directed evolution could lead to eugenics or extreme social inequality. Nanotechnology will grant us the ability to manipulate matter "at the molecular and atomic scale", which could allow us to reshape ourselves and our environment in fundamental ways. Nanobots could be used within the human body to destroy cancer cells or form new body parts, blurring the line between biology and technology. Autonomous robots have undergone rapid progress, and are expected to replace humans at many dangerous tasks, including search and rescue, bomb disposal, firefighting, and war.

Estimates on the advent of artificial general intelligence vary, but half of machine learning experts surveyed in 2018 believe that AI will "accomplish every task better and more cheaply" than humans by 2063, and automate all human jobs by 2140. This expected technological unemployment has led to calls for increased emphasis on computer science education and debates about universal basic income. Some political science experts predict that this could lead to a rise in extremism, while others see it as an opportunity to usher in a post-scarcity economy.

Movements

Appropriate technology

Some segments of the 1960s hippie counterculture grew to dislike urban living and developed a preference for locally autonomous, sustainable, and decentralized technology, termed appropriate technology. This later influenced hacker culture and technopaganism.

Technological utopianism

Technological utopianism refers to the belief that technological development is a moral good, which can and should bring about a utopia, that is, a society in which laws, governments, and social conditions serve the needs of all its citizens. Examples of techno-utopian goals include post-scarcity economics, life extension, mind uploading, cryonics, and the creation of artificial superintelligence. Major techno-utopian movements include transhumanism and singularitarianism.

The transhumanism movement is founded upon the "continued evolution of human life beyond its current human form" through science and technology, informed by "life-promoting principles and values." The movement gained wider popularity in the early 21st century.

Singularitarians believe that machine superintelligence will "accelerate technological progress" by orders of magnitude and "create even more intelligent entities ever faster", which may lead to a pace of societal and technological change that is "incomprehensible" to us. This event horizon is known as the technological singularity.

Major figures of techno-utopianism include Ray Kurzweil and Nick Bostrom. Techno-utopianism has attracted both praise and criticism from progressive, religious, and conservative thinkers.

Anti-technology backlash

Luddites smashing a power loom in 1812

Technology's central role in our lives has drawn concerns and backlash. The backlash against technology is not a uniform movement and encompasses many heterogeneous ideologies.

The earliest known revolt against technology was Luddism, a pushback against early automation in textile production. Automation had resulted in a need for fewer workers, a process known as technological unemployment.

Between the 1970s and 1990s, American terrorist Ted Kaczynski carried out a series of bombings across America and published the Unabomber Manifesto denouncing technology's negative impacts on nature and human freedom. The essay resonated with a large part of the American public. It was partly inspired by Jacques Ellul's The Technological Society.

Some subcultures, like the off-the-grid movement, advocate a withdrawal from technology and a return to nature. The ecovillage movement seeks to reestablish harmony between technology and nature.

Relation to science and engineering

Drawing of Lavoisier conducting an experiment in front of onlookers
Antoine Lavoisier experimenting with combustion generated by amplified sunlight

Engineering is the process by which technology is developed. It often requires problem-solving under strict constraints. Technological development is "action-oriented", while scientific knowledge is fundamentally explanatory. Polish philosopher Henryk Skolimowski framed it thus: "science concerns itself with what is, technology with what is to be."

The direction of causality between scientific discovery and technological innovation has been debated by scientists, philosophers and policymakers. Because innovation is often undertaken at the edge of scientific knowledge, most technologies are not derived from scientific knowledge, but instead from engineering, tinkering and chance. For example, in the 1940s and 1950s, when knowledge of turbulent combustion or fluid dynamics was still crude, jet engines were invented through "running the device to destruction, analyzing what broke [...] and repeating the process". Scientific explanations often follow technological developments rather than preceding them. Many discoveries also arose from pure chance, like the discovery of penicillin as a result of accidental lab contamination. Since the 1960s, the assumption that government funding of basic research would lead to the discovery of marketable technologies has lost credibility. Probabilist Nassim Taleb argues that national research programs that implement the notions of serendipity and convexity through frequent trial and error are more likely to lead to useful innovations than research that aims to reach specific outcomes.

Despite this, modern technology is increasingly reliant on deep, domain-specific scientific knowledge. In 1975, there was an average of one citation of scientific literature in every three patents granted in the U.S.; by 1989, this increased to an average of one citation per patent. The average was skewed upwards by patents related to the pharmaceutical industry, chemistry, and electronics. A 2021 analysis shows that patents that are based on scientific discoveries are on average 26% more valuable than equivalent non-science-based patents.

Other animal species

Photo of a gorilla walking hip-deep in a pond, holding a stick
This adult gorilla uses a branch as a walking stick to gauge the water's depth.

The use of basic technology is also a feature of non-human animal species. Tool use was once considered a defining characteristic of the genus Homo. This view was supplanted after discovering evidence of tool use among chimpanzees and other primates, dolphins, and crows. For example, researchers have observed wild chimpanzees using basic foraging tools such as pestles and levers, using leaves as sponges, and using tree bark or vines as probes to fish for termites. West African chimpanzees use stone hammers and anvils for cracking nuts, as do capuchin monkeys of Boa Vista, Brazil. Tool use is not the only form of animal technology use; for example, beaver dams, built with wooden sticks or large stones, are a technology with "dramatic" impacts on river habitats and ecosystems.

Popular culture

The relationship of humanity with technology has been explored in science-fiction literature, for example in Brave New World, A Clockwork Orange, Nineteen Eighty-Four, Isaac Asimov's essays, and in movies like Minority Report, Total Recall, Gattaca, and Inception. It has spawned the dystopian and futuristic cyberpunk genre, which juxtaposes futuristic technology with societal collapse, dystopia, or decay. Notable cyberpunk works include William Gibson's novel Neuromancer and movies like Blade Runner and The Matrix.

Relativism


Relativism is a family of philosophical views which deny claims to objectivity within a particular domain and assert that valuations in that domain are relative to the perspective of an observer or the context in which they are assessed. There are many different forms of relativism, with a great deal of variation in scope and differing degrees of controversy among them. Moral relativism encompasses the differences in moral judgments among people and cultures. Epistemic relativism holds that there are no absolute principles regarding normative belief, justification, or rationality, and that there are only relative ones. Alethic relativism (also factual relativism) is the doctrine that there are no absolute truths, i.e., that truth is always relative to some particular frame of reference, such as a language or a culture (cultural relativism). Some forms of relativism also bear a resemblance to philosophical skepticism. Descriptive relativism seeks to describe the differences among cultures and people without evaluation, while normative relativism evaluates the truthfulness of views within a given framework.

Forms of relativism

Anthropological versus philosophical relativism

Anthropological relativism refers to a methodological stance, in which the researcher suspends (or brackets) his or her own cultural prejudice while trying to understand beliefs or behaviors in their contexts. This has become known as methodological relativism, and concerns itself specifically with avoiding ethnocentrism or the application of one's own cultural standards to the assessment of other cultures. This is also the basis of the so-called "emic" and "etic" distinction, in which:

  • An emic or insider account of behavior is a description of a society in terms that are meaningful to the participant or actor's own culture; an emic account is therefore culture-specific, and typically refers to what is considered "common sense" within the culture under observation.
  • An etic or outsider account is a description of a society by an observer, in terms that can be applied to other cultures; that is, an etic account is culturally neutral, and typically refers to the conceptual framework of the social scientist. (This is complicated when it is scientific research itself that is under study, or when there is theoretical or terminological disagreement within the social sciences.)

Philosophical relativism, in contrast, asserts that the truth of a proposition depends on the metaphysical or theoretical frame, the instrumental method, or the context in which the proposition is expressed, or on the person, group, or culture that interprets it.

Methodological relativism and philosophical relativism can exist independently from one another, but most anthropologists base their methodological relativism on that of the philosophical variety.

Descriptive versus normative relativism

The concept of relativism also has importance both for philosophers and for anthropologists in another way. In general, anthropologists engage in descriptive relativism ("how things are" or "how things seem"), whereas philosophers engage in normative relativism ("how things ought to be"), although there is some overlap (for example, descriptive relativism can pertain to concepts, normative relativism to truth).

Descriptive relativism assumes that certain cultural groups have different modes of thought, standards of reasoning, and so forth, and it is the anthropologist's task to describe, but not to evaluate the validity of these principles and practices of a cultural group. It is possible for an anthropologist in his or her fieldwork to be a descriptive relativist about some things that typically concern the philosopher (e.g., ethical principles) but not about others (e.g., logical principles). However, the descriptive relativist's empirical claims about epistemic principles, moral ideals and the like are often countered by anthropological arguments that such things are universal, and much of the recent literature on these matters is explicitly concerned with the extent of, and evidence for, cultural or moral or linguistic or human universals.

The fact that the various species of descriptive relativism are empirical claims may tempt the philosopher to conclude that they are of little philosophical interest, but there are several reasons why this isn't so. First, some philosophers, notably Kant, argue that certain sorts of cognitive differences between human beings (or even all rational beings) are impossible, so such differences could never be found to obtain in fact, an argument that places a priori limits on what empirical inquiry could discover and on what versions of descriptive relativism could be true. Second, claims about actual differences between groups play a central role in some arguments for normative relativism (for example, arguments for normative ethical relativism often begin with claims that different groups in fact have different moral codes or ideals). Finally, the anthropologist's descriptive account of relativism helps to separate the fixed aspects of human nature from those that can vary, and so a descriptive claim that some important aspect of experience or thought does (or does not) vary across groups of human beings tells us something important about human nature and the human condition.

Normative relativism concerns normative or evaluative claims that modes of thought, standards of reasoning, or the like are only right or wrong relative to a framework. ‘Normative’ is meant in a general sense, applying to a wide range of views; in the case of beliefs, for example, normative correctness equals truth. This does not mean, of course, that framework-relative correctness or truth is always clear, the first challenge being to explain what it amounts to in any given case (e.g., with respect to concepts, truth, epistemic norms). Normative relativism (say, in regard to normative ethical relativism) therefore implies that things (say, ethical claims) are not simply true in themselves, but only have truth values relative to broader frameworks (say, moral codes). (Many normative ethical relativist arguments run from premises about ethics to conclusions that assert the relativity of truth values, bypassing general claims about the nature of truth, but it is often more illuminating to consider the type of relativism under question directly.)

Legal relativism

In English common law, two (perhaps three) separate standards of proof are recognized: proof beyond reasonable doubt, applied in criminal cases, and proof on the balance of probabilities, applied in civil cases.

Related and contrasting positions

Relationism is the theory that there are only relations between individual entities, and no intrinsic properties. Despite the similarity in name, it is held by some to be a position distinct from relativism—for instance, because "statements about relational properties [...] assert an absolute truth about things in the world". On the other hand, others wish to equate relativism, relationism, and even relativity, which is a precise theory of relationships between physical objects. Nevertheless, "This confluence of relativity theory with relativism became a strong contributing factor in the increasing prominence of relativism".

Whereas previous investigations of science only sought sociological or psychological explanations of failed scientific theories or pathological science, the 'strong programme' is more relativistic, assessing scientific truth and falsehood equally in a historic and cultural context.

Criticisms

A common argument against relativism suggests that it inherently refutes itself: the statement "all is relative" is either a relative statement or an absolute one. If it is relative, then this statement does not rule out absolutes. If the statement is absolute, on the other hand, then it provides an example of an absolute statement, proving that not all truths are relative. However, this argument against relativism only applies to relativism that positions truth as relative, i.e. epistemological/truth-value relativism. More specifically, it is only extreme forms of epistemological relativism that can come in for this criticism, as there are many epistemological relativists who posit that some aspects of what is regarded as factually "true" are not universal, yet still accept that other universal truths exist (e.g. gas laws or moral laws).

Another argument against relativism posits a Natural Law. Simply put, the physical universe works under basic principles: the "Laws of Nature". Some contend that a natural Moral Law may also exist, as argued, for example, by Immanuel Kant in Critique of Practical Reason and Richard Dawkins in The God Delusion (2006), and as addressed by C. S. Lewis in Mere Christianity (1952). Dawkins said, "I think we face an equal but much more sinister challenge from the left, in the shape of cultural relativism - the view that scientific truth is only one kind of truth and it is not to be especially privileged". Philosopher Hilary Putnam, among others, states that some forms of relativism make it impossible to believe one is in error. If there is no truth beyond an individual's belief that something is true, then an individual cannot hold their own beliefs to be false or mistaken. A related criticism is that relativizing truth to individuals destroys the distinction between truth and belief.

Views

Philosophical

Ancient

Ancient India

Ancient Indian philosophers Mahavira (c. 599 – c. 527 BC) and Nagarjuna (c. 150 – c. 250 CE) made contributions to the development of relativist philosophy.

Sophism

Sophists are considered the founding fathers of relativism in Western philosophy. Elements of relativism emerged among the Sophists in the 5th century BC. Notably, it was Protagoras who coined the phrase, "Man is the measure of all things: of things which are, that they are, and of things which are not, that they are not." The thinking of the Sophists is mainly known through their opponent, Plato. In a paraphrase from Plato's dialogue Theaetetus, Protagoras said: "What is true for you is true for you, and what is true for me is true for me."

Modern

Bernard Crick

Bernard Crick, a British political scientist and advocate of relativism, suggested in In Defence of Politics (1962) that moral conflict between people is inevitable. He thought that only ethics can resolve such conflict, and when that occurs in public it results in politics. Accordingly, Crick saw the process of dispute resolution, harms reduction, mediation or peacemaking as central to all of moral philosophy. He became an important influence on feminists and later on the Greens.

Paul Feyerabend

Philosopher of science Paul Feyerabend is often considered to be a relativist, although he denied being one.

Feyerabend argued that modern science suffers from being methodologically monistic (the belief that only a single methodology can produce scientific progress). Feyerabend summarises his case in Against Method with the phrase "anything goes".

In an aphorism [Feyerabend] often repeated, "potentially every culture is all cultures". This is intended to convey that world views are not hermetically closed, since their leading concepts have an "ambiguity" - better, an open-endedness - which enables people from other cultures to engage with them. [...] It follows that relativism, understood as the doctrine that truth is relative to closed systems, can get no purchase. [...] For Feyerabend, both hermetic relativism and its absolutist rival [realism] serve, in their different ways, to "devalue human existence". The former encourages that unsavoury brand of political correctness which takes the refusal to criticise "other cultures" to the extreme of condoning murderous dictatorship and barbaric practices. The latter, especially in its favoured contemporary form of "scientific realism", with the excessive prestige it affords to the abstractions of "the monster 'science'", is in bed with a politics which likewise disdains variety, richness and everyday individuality - a politics which likewise "hides" its norms behind allegedly neutral facts, "blunts choices and imposes laws".

Thomas Kuhn

Thomas Kuhn's philosophy of science, as expressed in The Structure of Scientific Revolutions, is often interpreted as relativistic. He claimed that, as well as progressing steadily and incrementally ("normal science"), science undergoes periodic revolutions or "paradigm shifts", leaving scientists working in different paradigms with difficulty in even communicating. Thus the truth of a claim, or the existence of a posited entity, is relative to the paradigm employed. However, it is not necessary for him to embrace relativism, because every paradigm presupposes the prior one, building upon itself through history; this yields a fundamental, incremental, and referential structure of development which is not relative but, again, fundamental.

From these remarks, one thing is however certain: Kuhn is not saying that incommensurable theories cannot be compared - what they can’t be is compared in terms of a system of common measure. He very plainly says that they can be compared, and he reiterates this repeatedly in later work, in a (mostly in vain) effort to avert the crude and sometimes catastrophic misinterpretations he suffered from mainstream philosophers and post-modern relativists alike.

But Kuhn rejected the accusation of being a relativist later in his postscript:

scientific development is ... a unidirectional and irreversible process. Later scientific theories are better than earlier ones for solving puzzles ... That is not a relativist's position, and it displays the sense in which I am a convinced believer in scientific progress.

Some have argued that one can also read Kuhn's work as essentially positivist in its ontology: the revolutions he posits are epistemological, lurching toward a presumably 'better' understanding of an objective reality through the lens presented by the new paradigm. However, a number of passages in Structure do indeed appear to be distinctly relativist, and to directly challenge the notion of an objective reality and the ability of science to progress towards an ever-greater grasp of it, particularly through the process of paradigm change.

In the sciences there need not be progress of another sort. We may, to be more precise, have to relinquish the notion, explicit or implicit, that changes of paradigm carry scientists and those who learn from them closer and closer to the truth.
We are all deeply accustomed to seeing science as the one enterprise that draws constantly nearer to some goal set by nature in advance. But need there be any such goal? Can we not account for both science’s existence and its success in terms of evolution from the community’s state of knowledge at any given time? Does it really help to imagine that there is some one full, objective, true account of nature and that the proper measure of scientific achievement is the extent to which it brings us closer to that ultimate goal?

George Lakoff and Mark Johnson

George Lakoff and Mark Johnson define relativism in Metaphors We Live By as the rejection of both subjectivism and metaphysical objectivism in order to focus on the relationship between them, i.e. the metaphor by which we relate our current experience to our previous experience. In particular, Lakoff and Johnson characterize "objectivism" as a "straw man", and, to a lesser degree, criticize the views of Karl Popper, Kant and Aristotle.

Robert Nozick

In his book Invariances, Robert Nozick expresses a complex set of theories about the absolute and the relative. He thinks the absolute/relative distinction should be recast in terms of an invariant/variant distinction, where there are many things a proposition can be invariant with regard to or vary with. He thinks it is coherent for truth to be relative, and speculates that it might vary with time. He thinks necessity is an unobtainable notion, but can be approximated by robust invariance across a variety of conditions—although we can never identify a proposition that is invariant with regard to everything. Finally, he is not particularly warm to one of the most famous forms of relativism, moral relativism, preferring an evolutionary account.

Joseph Margolis

Joseph Margolis advocates a view he calls "robust relativism" and defends it in his books Historied Thought, Constructed World, Chapter 4 (California, 1995) and The Truth about Relativism (Blackwell, 1991). He opens his account by stating that our logics should depend on what we take to be the nature of the sphere to which we wish to apply our logics. Holding that there can be no distinctions which are not "privileged" between the alethic, the ontic, and the epistemic, he maintains that a many-valued logic just might be the most apt for aesthetics or history, since in these practices we are loath to hold to simple binary logic; and he also holds that many-valued logic is relativistic. (This is perhaps an unusual definition of "relativistic". Compare with his comments on "relationism".) To say that "True" and "False" are mutually exclusive and exhaustive judgements on Hamlet, for instance, really does seem absurd. A many-valued logic—with values such as "apt", "reasonable", "likely", and so on—seems intuitively more applicable to interpreting Hamlet. Where apparent contradictions arise between such interpretations, we might call the interpretations "incongruent", rather than dubbing either of them "false", because using many-valued logic implies that a measured value is a mixture of two extreme possibilities. Using fuzzy logic, a subset of many-valued logic, various interpretations can be represented by membership in more than one possible truth set simultaneously. Fuzzy logic is therefore probably the best mathematical structure for understanding "robust relativism" and has been interpreted by Bart Kosko as philosophically related to Zen Buddhism.
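
To make the fuzzy-logic reading above concrete, the short sketch below grades two readings of Hamlet by degrees of membership in several truth sets and reports a clash as "incongruent" rather than "false". The readings, the membership degrees, and the threshold are invented purely for illustration; this is a toy rendering of the idea, not Margolis's own formalism.

    # Each interpretation of Hamlet gets a degree of membership in several truth sets
    # ("apt", "reasonable", "likely") instead of a single True/False verdict.
    # All names and values below are made up for illustration.
    from itertools import combinations

    interpretations = {
        "Hamlet delays out of inward melancholy": {"apt": 0.6, "reasonable": 0.8, "likely": 0.5},
        "Hamlet delays out of political caution": {"apt": 0.5, "reasonable": 0.7, "likely": 0.4},
    }

    def incongruent(a: dict, b: dict, bar: float = 0.5) -> bool:
        # Two readings are "incongruent" (not contradictory) when both are at least
        # moderately reasonable yet cannot jointly be rated fully apt.
        both_reasonable = min(a["reasonable"], b["reasonable"]) >= bar
        jointly_apt = min(a["apt"], b["apt"])
        return both_reasonable and jointly_apt < 1.0

    for (name_a, a), (name_b, b) in combinations(interpretations.items(), 2):
        verdict = "incongruent" if incongruent(a, b) else "compatible"
        print(name_a, "vs", name_b, "->", verdict)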

It was Aristotle who held that relativism implies that we should, sticking with appearances only, end up contradicting ourselves somewhere if we could apply all attributes to all ousiai (beings). Aristotle, however, made non-contradiction dependent upon his essentialism. If his essentialism is false, then so too is his ground for disallowing relativism. (Subsequent philosophers have found other reasons for supporting the principle of non-contradiction.)

Beginning with Protagoras and invoking Charles Sanders Peirce, Margolis shows that the historic struggle to discredit relativism is an attempt to impose an unexamined belief in the world's essentially rigid rule-like nature. Plato and Aristotle merely attacked "relationalism"—the doctrine of true for l or true for k, and the like, where l and k are different speakers or different worlds—or something similar (most philosophers would call this position "relativism"). For Margolis, "true" means true; that is, the alethic use of "true" remains untouched. However, in real world contexts, and context is ubiquitous in the real world, we must apply truth values. Here, in epistemic terms, we might tout court retire "true" as an evaluation and keep "false". The rest of our value-judgements could be graded from "extremely plausible" down to "false". Judgements which on a bivalent logic would be incompatible or contradictory are further seen as "incongruent", although one may well have more weight than the other. In short, relativistic logic is not, or need not be, the bugbear it is often presented to be. It may simply be the best type of logic to apply to certain very uncertain spheres of real experiences in the world (although some sort of logic needs to be applied in order to make that judgement). Those who swear by bivalent logic might simply be the ultimate keepers of the great fear of the flux.

Richard Rorty

Philosopher Richard Rorty has a somewhat paradoxical role in the debate over relativism: he is criticized for his relativistic views by many commentators, but has always denied that relativism applies to much of anybody, it being nothing more than a Platonic scarecrow. Rorty claims, rather, that he is a pragmatist, and that to construe pragmatism as relativism is to beg the question.

'"Relativism" is the traditional epithet applied to pragmatism by realists'
'"Relativism" is the view that every belief on a certain topic, or perhaps about any topic, is as good as every other. No one holds this view. Except for the occasional cooperative freshman, one cannot find anybody who says that two incompatible opinions on an important topic are equally good. The philosophers who get called 'relativists' are those who say that the grounds for choosing between such opinions are less algorithmic than had been thought.'
'In short, my strategy for escaping the self-referential difficulties into which "the Relativist" keeps getting himself is to move everything over from epistemology and metaphysics into cultural politics, from claims to knowledge and appeals to self-evidence to suggestions about what we should try.'

Rorty takes a deflationary attitude to truth, believing there is nothing of interest to be said about truth in general, including the contention that it is generally subjective. He also argues that the notion of warrant or justification can do most of the work traditionally assigned to the concept of truth, and that justification is relative; justification is justification to an audience, for Rorty.

In Contingency, Irony, and Solidarity he argues that the debate between so-called relativists and so-called objectivists is beside the point because they don't have enough premises in common for either side to prove anything to the other.

Nalin de Silva

In his book Mage Lokaya (My World) (1986), Nalin de Silva criticized the basis of the established Western system of knowledge and its propagation, which he refers to as "domination throughout the world". He explained in this book that a mind-independent reality is impossible and that knowledge is not found but constructed. He further introduced and developed the concept of "Constructive Relativism" as the basis on which knowledge is constructed relative to the sense organs, culture, and the mind, completely based on Avidya.

Colin Murray Turbayne

In his final book Metaphors for the Mind: The Creative Mind and Its Origins (1991), Colin Murray Turbayne joins the debate about relativism and realism by providing an analysis of the manner in which Platonic metaphors which were first presented in the procreation model of the Timaeus dialogue have evolved over time to influence the philosophical works of both George Berkeley and Immanuel Kant. In addition, he illustrates the manner in which these ancient Greek metaphors have subsequently evolved to impact the development of the theories of "substance" and "attribute", which in turn have dominated the development of human thought and language in the 20th century.

In his The Myth of Metaphor (1962) Turbayne argues that it is perfectly possible to transcend the limitations which are inherent in such metaphors, including those incorporated within the framework of classical "objective" mechanistic Newtonian cosmology and scientific materialism in general. In Turbayne's view, one can strive to embrace a more satisfactory epistemology by first acknowledging the limitations imposed by such metaphorical systems. This can readily be accomplished by restoring Plato's metaphorical model to its original state in which both "male" and "female" aspects of the mind work in concert within the context of a harmonious balance during the process of creation.

Postmodernism

The term "relativism" often comes up in debates over postmodernism, poststructuralism and phenomenology. Critics of these perspectives often identify advocates with the label "relativism". For example, the Sapir–Whorf hypothesis is often considered a relativist view because it posits that linguistic categories and structures shape the way people view the world. Stanley Fish has defended postmodernism and relativism.

These perspectives do not strictly count as relativist in the philosophical sense, because they express agnosticism about the nature of reality and make epistemological rather than ontological claims. Nevertheless, the term is useful for differentiating them from realists who believe that the purpose of philosophy, science, or literary critique is to locate externally true meanings. Important philosophers and theorists such as Michel Foucault and Max Stirner, and political movements such as post-anarchism or post-Marxism, can also be considered relativist in this sense, though a better term might be social constructivist.

The spread and popularity of this kind of "soft" relativism varies between academic disciplines. It has wide support in anthropology and has a majority following in cultural studies. It also has advocates in political theory and political science, sociology, and continental philosophy (as distinct from Anglo-American analytical philosophy). It has inspired empirical studies of the social construction of meaning such as those associated with labelling theory, which defenders can point to as evidence of the validity of their theories (albeit risking accusations of performative contradiction in the process). Advocates of this kind of relativism often also claim that recent developments in the natural sciences, such as Heisenberg's uncertainty principle, quantum mechanics, chaos theory and complexity theory show that science is now becoming relativistic. However, many scientists who use these methods continue to identify as realist or post-positivist, and some sharply criticize the association.

Religious

Jainism

Mahavira (599-527 BC), the 24th Tirthankara of Jainism, developed a philosophy known as Anekantavada. John Koller describes anekāntavāda as "epistemological respect for view of others" about the nature of existence, whether it is "inherently enduring or constantly changing", but "not relativism; it does not mean conceding that all arguments and all views are equal".

Hinduism

The Hindu religion has no theological difficulty in accepting degrees of truth in other religions. A Rig Vedic hymn states that "Truth is One, though the sages tell it variously" (Ékam sat vipra bahudā vadanti).

Buddhism

Madhyamaka Buddhism, which forms the basis for many Mahayana Buddhist schools, was founded by Nāgārjuna, who taught the idea of relativity. In the Ratnāvalī, he gives the example that shortness exists only in relation to the idea of length. The determination of a thing or object is possible only in relation to other things or objects, especially by way of contrast. He held that the relationship between the ideas of "short" and "long" is not due to intrinsic nature (svabhāva). This idea is also found in the Pali Nikāyas and Chinese Āgamas, in which the idea of relativity is expressed similarly: "That which is the element of light ... is seen to exist on account of [in relation to] darkness; that which is the element of good is seen to exist on account of bad; that which is the element of space is seen to exist on account of form."

Madhyamaka Buddhism discerns two levels of truth: relative and ultimate. The two truths doctrine states that there is relative or conventional, common-sense truth, which describes our daily experience of a concrete world, and ultimate truth, which describes the ultimate reality as sunyata, empty of concrete and inherent characteristics. Conventional truth may be understood, in contrast, as "obscurative truth" or "that which obscures the true nature". It is constituted by the appearances of mistaken awareness. Conventional truth would be the appearance that includes a duality of apprehender and apprehended, and objects perceived within that. Ultimate truth is the phenomenal world free from the duality of apprehender and apprehended.

Sikhism

In Sikhism the Gurus (spiritual teachers) have propagated the message of "many paths" leading to the one God and ultimate salvation for all souls who tread the path of righteousness. They have supported the view that proponents of all faiths can, by doing good and virtuous deeds and by remembering the Lord, certainly achieve salvation. Students of the Sikh faith are told to accept all leading faiths as possible vehicles for attaining spiritual enlightenment, provided the faithful study, ponder and practice the teachings of their prophets and leaders. The holy book of the Sikhs, the Sri Guru Granth Sahib, says: "Do not say that the Vedas, the Bible and the Koran are false. Those who do not contemplate them are false" (Guru Granth Sahib, page 1350). It later states: "The seconds, minutes, and hours, days, weeks and months, and the various seasons originate from the one Sun; O Nanak, in just the same way, the many forms originate from the Creator" (Guru Granth Sahib, pages 12–13).

Catholicism

The Catholic Church, especially under John Paul II and Pope Benedict XVI, has identified relativism as one of the most significant problems for faith and morals today.

According to the Church and to some theologians, relativism, as a denial of absolute truth, leads to moral license and a denial of the possibility of sin and of God. Whether moral or epistemological, relativism constitutes a denial of the capacity of the human mind and reason to arrive at truth. Truth, according to Catholic theologians and philosophers (following Aristotle), consists of adequatio rei et intellectus, the correspondence of mind and reality. Another way of putting it is that the mind has the same form as reality. This means that when the form of the computer in front of someone (its type, color, shape, capacity, and so on) is also the form in their mind, then what they know is true, because their mind corresponds to objective reality.

The denial of an absolute reference, of an axis mundi, denies God, who equates to Absolute Truth, according to these Christian theologians. They link relativism to secularism and to the obstruction of religion in human life.

Leo XIII

Pope Leo XIII (1810–1903) was the first known Pope to use the word relativism in the encyclical Humanum genus (1884). Leo XIII condemned Freemasonry and claimed that its philosophical and political system was largely based on relativism.

John Paul II

John Paul II, in Veritatis Splendor, writes:

As is immediately evident, the crisis of truth is not unconnected with this development. Once the idea of a universal truth about the good, knowable by human reason, is lost, inevitably the notion of conscience also changes. Conscience is no longer considered in its primordial reality as an act of a person's intelligence, the function of which is to apply the universal knowledge of the good in a specific situation and thus to express a judgment about the right conduct to be chosen here and now. Instead, there is a tendency to grant to the individual conscience the prerogative of independently determining the criteria of good and evil and then acting accordingly. Such an outlook is quite congenial to an individualist ethic, wherein each individual is faced with his own truth, different from the truth of others. Taken to its extreme consequences, this individualism leads to a denial of the very idea of human nature.

In Evangelium Vitae (The Gospel of Life), he says:

Freedom negates and destroys itself, and becomes a factor leading to the destruction of others, when it no longer recognizes and respects its essential link with the truth. When freedom, out of a desire to emancipate itself from all forms of tradition and authority, shuts out even the most obvious evidence of an objective and universal truth, which is the foundation of personal and social life, then the person ends up by no longer taking as the sole and indisputable point of reference for his own choices the truth about good and evil, but only his subjective and changeable opinion or, indeed, his selfish interest and whim.

Benedict XVI

In April 2005, in his homily during Mass prior to the conclave which would elect him as Pope, then Cardinal Joseph Ratzinger talked about the world "moving towards a dictatorship of relativism":

How many winds of doctrine we have known in recent decades, how many ideological currents, how many ways of thinking. The small boat of thought of many Christians has often been tossed about by these waves – thrown from one extreme to the other: from Marxism to liberalism, even to libertinism; from collectivism to radical individualism; from atheism to a vague religious mysticism; from agnosticism to syncretism, and so forth. Every day new sects are created and what Saint Paul says about human trickery comes true, with cunning which tries to draw those into error (cf Ephesians 4, 14). Having a clear Faith, based on the Creed of the Church, is often labeled today as a fundamentalism. Whereas, relativism, which is letting oneself be tossed and "swept along by every wind of teaching", looks like the only attitude acceptable to today's standards. We are moving towards a dictatorship of relativism which does not recognize anything as certain and which has as its highest goal one's own ego and one's own desires. However, we have a different goal: the Son of God, true man. He is the measure of true humanism. Being an "Adult" means having a faith which does not follow the waves of today's fashions or the latest novelties. A faith which is deeply rooted in friendship with Christ is adult and mature. It is this friendship which opens us up to all that is good and gives us the knowledge to judge true from false, and deceit from truth.

On June 6, 2005, Pope Benedict XVI told educators:

Today, a particularly insidious obstacle to the task of education is the massive presence in our society and culture of that relativism which, recognizing nothing as definitive, leaves as the ultimate criterion only the self with its desires. And under the semblance of freedom it becomes a prison for each one, for it separates people from one another, locking each person into his or her own 'ego'.

Then, during World Youth Day in August 2005, he also traced to relativism the problems produced by the communist and sexual revolutions, and offered a counterargument.

In the last century we experienced revolutions with a common programme–expecting nothing more from God, they assumed total responsibility for the cause of the world in order to change it. And this, as we saw, meant that a human and partial point of view was always taken as an absolute guiding principle. Absolutizing what is not absolute but relative is called totalitarianism. It does not liberate man, but takes away his dignity and enslaves him. It is not ideologies that save the world, but only a return to the living God, our Creator, the Guarantor of our freedom, the Guarantor of what is really good and true.

Cogito, ergo sum

From Wikipedia, the free encyclopedia

The Latin cogito, ergo sum, usually translated into English as "I think, therefore I am", is the "first principle" of René Descartes's philosophy. He originally published it in French as je pense, donc je suis in his 1637 Discourse on the Method, so as to reach a wider audience than Latin would have allowed. It later appeared in Latin in his Principles of Philosophy, and a similar phrase also featured prominently in his Meditations on First Philosophy. The dictum is also sometimes referred to as the cogito. As Descartes explained in a margin note, "we cannot doubt of our existence while we doubt." In the posthumously published The Search for Truth by Natural Light, he expressed this insight as dubito, ergo sum, vel, quod idem est, cogito, ergo sum ("I doubt, therefore I am — or what is the same — I think, therefore I am"). Antoine Léonard Thomas, in a 1765 essay in honor of Descartes, presented it as dubito, ergo cogito, ergo sum ("I doubt, therefore I think, therefore I am").

Descartes's statement became a fundamental element of Western philosophy, as it purported to provide a certain foundation for knowledge in the face of radical doubt. While other knowledge could be a figment of imagination, deception, or mistake, Descartes asserted that the very act of doubting one's own existence served—at minimum—as proof of the reality of one's own mind; there must be a thinking entity—in this case the self—for there to be a thought.

One critique of the dictum, first suggested by Pierre Gassendi, is that it presupposes that there is an "I" which must be doing the thinking. According to this line of criticism, the most that Descartes was entitled to say was that "thinking is occurring", not that "I am thinking".

In Descartes's writings

Descartes first wrote the phrase in French in his 1637 Discourse on the Method. He referred to it in Latin without explicitly stating the familiar form of the phrase in his 1641 Meditations on First Philosophy. The earliest written record of the phrase in Latin is in his 1644 Principles of Philosophy, where, in a margin note (see below), he provides a clear explanation of his intent: "[W]e cannot doubt of our existence while we doubt". Fuller forms of the phrase are attributable to other authors.

Discourse on the Method

The phrase first appeared (in French) in Descartes's 1637 Discourse on the Method, in the first paragraph of its fourth part.

Meditations on First Philosophy

In 1641, Descartes published (in Latin) Meditations on First Philosophy, in which he referred to the proposition, though not explicitly as "cogito, ergo sum", in Meditation II.

Principles of Philosophy

In 1644, Descartes published (in Latin) his Principles of Philosophy, where the phrase "ego cogito, ergo sum" appears in Part 1, article 7. His margin note for that article explains: "[W]e cannot doubt of our existence while we doubt."

The Search for Truth by Natural Light

Descartes, in a lesser-known posthumously published work written around 1647 and titled La Recherche de la Vérité par La Lumiere Naturale (The Search for Truth by Natural Light), provides his only known phrasing of the cogito as cogito, ergo sum and admits that his insight is also expressible as dubito, ergo sum.

Other forms

The proposition is sometimes given as dubito, ergo cogito, ergo sum. This form was penned by the French literary critic, Antoine Léonard Thomas, in an award-winning 1765 essay in praise of Descartes, where it appeared as "Puisque je doute, je pense; puisque je pense, j'existe" ('Since I doubt, I think; since I think, I exist'). With rearrangement and compaction, the passage translates to "I doubt, therefore I think, therefore I am," or in Latin, "dubito, ergo cogito, ergo sum." This aptly captures Descartes's intent as expressed in his posthumously published La Recherche de la Vérité par La Lumiere Naturale as noted above: I doubt, therefore I am — or what is the same — I think, therefore I am.

A further expansion, dubito, ergo cogito, ergo sum—res cogitans ("…—a thinking thing"), extends the cogito with Descartes's statement in the subsequent Meditation, "Ego sum res cogitans, id est dubitans, affirmans, negans, pauca intelligens, multa ignorans, volens, nolens, imaginans etiam et sentiens…" ("I am a thinking [conscious] thing, that is, a being who doubts, affirms, denies, knows a few objects, and is ignorant of many, who loves, hates, wills, refuses, who imagines likewise, and perceives"). This has been referred to as "the expanded cogito."

Translation

"I am thinking" vs. "I think"

While the Latin cōgitō may be translated rather easily as "I think/ponder/visualize", the French je pense does not indicate whether the verb form corresponds to the English simple present or the progressive aspect. Technically speaking, the French form pense by itself can result from several different conjugations of the verb penser (to think): it could mean "I think... (something)"/"He thinks... (something)", "I think."/"He thinks.", or even "You (must) think... (something)", so a wider context, or a pronoun, is needed to understand the meaning. In the case of je pense, the pronoun je ("I") is already included, but this still leaves the question of whether "I think..." or "I think." is intended. Translation therefore needs a larger context to determine aspect.

Following John Lyons (1982), Vladimir Žegarac notes, "The temptation to use the simple present is said to arise from the lack of progressive forms in Latin and French, and from a misinterpretation of the meaning of cogito as habitual or generic" (cf. gnomic aspect). Also following Lyons, Ann Banfield writes, "In order for the statement on which Descartes's argument depends to represent certain knowledge,… its tense must be a true present—in English, a progressive,… not as 'I think' but as 'I am thinking', in conformity with the general translation of the Latin or French present tense in such nongeneric, nonstative contexts." Or in the words of Simon Blackburn, "Descartes's premise is not 'I think' in the sense of 'I ski', which can be true even if you are not at the moment skiing. It is supposed to be parallel to 'I am skiing'."

The similar translation "I am thinking, therefore I exist" of Descartes's correspondence in French ("je pense, donc je suis") appears in The Philosophical Writings of Descartes by Cottingham et al. (1988).

The earliest known translation as "I am thinking, therefore I am" is from 1872 by Charles Porterfield Krauth.

Fumitaka Suzuki writes "Taking consideration of Cartesian theory of continuous creation, which theory was developed especially in the Meditations and in the Principles, we would assure that 'I am thinking, therefore I am/exist' is the most appropriate English translation of 'ego cogito, ergo sum'."

"I exist" vs. "I am"

Alexis Deodato S. Itao notes that cogito, ergo sum is "literally 'I think, therefore I am'." Others differ: 1) "[A] precise English translation will read as 'I am thinking, therefore I exist'"; and 2) "[S]ince Descartes ... emphasized that existence is such an important 'notion,' a better translation is 'I am thinking, therefore I exist.'"

Punctuation

Descartes wrote this phrase as such only once, in the posthumously published lesser-known work noted above, The Search for Truth by Natural Light. It appeared there mid-sentence, uncapitalized, and with a comma. (Commas were not used in Classical Latin but were a regular feature of scholastic Latin, the Latin Descartes "had learned in a Jesuit college at La Flèche.") Most modern reference works show it with a comma, but it is often presented without a comma in academic work and in popular usage. In Descartes's Principia Philosophiae, the proposition appears as ego cogito, ergo sum.

Interpretation

As put succinctly by Krauth (1872), "That cannot doubt which does not think, and that cannot think which does not exist. I doubt, I think, I exist."

The phrase cogito, ergo sum is not used in Descartes's Meditations on First Philosophy but the term "the cogito" is used to refer to an argument from it. In the Meditations, Descartes phrases the conclusion of the argument as "that the proposition, I am, I exist, is necessarily true whenever it is put forward by me or conceived in my mind" (Meditation II). George Henry Lewes says Descartes "has told us that [his objective] was to find a starting point from which to reason—to find an irreversible certainty. And where did he find this? In his own consciousness. Doubt as I may, I cannot doubt of my own existence, because my very doubts reveal to me a something which doubts. You may call this an assumption, if you will; I point out the fact as one above and beyond all logic; which logic can neither prove nor disprove; but which must always remain an irreversible certainty, and as such a fitting basis of philosophy."

At the beginning of the second meditation, having reached what he considers to be the ultimate level of doubt—his argument from the existence of a deceiving god—Descartes examines his beliefs to see if any have survived the doubt. In his belief in his own existence, he finds that it is impossible to doubt that he exists. Even if there were a deceiving god (or an evil demon), one's belief in their own existence would be secure, for there is no way one could be deceived unless one existed in order to be deceived.

But I have convinced myself that there is absolutely nothing in the world, no sky, no earth, no minds, no bodies. Does it now follow that I, too, do not exist? No. If I convinced myself of something [or thought anything at all], then I certainly existed. But there is a deceiver of supreme power and cunning who deliberately and constantly deceives me. In that case, I, too, undoubtedly exist, if he deceives me; and let him deceive me as much as he can, he will never bring it about that I am nothing, so long as I think that I am something. So, after considering everything very thoroughly, I must finally conclude that the proposition, I am, I exist, is necessarily true whenever it is put forward by me or conceived in my mind. (AT VII 25; CSM II 16–17)

There are three important notes to keep in mind here. First, he claims only the certainty of his own existence from the first-person point of view — he has not proved the existence of other minds at this point. This is something that has to be thought through by each of us for ourselves, as we follow the course of the meditations. Second, he does not say that his existence is necessary; he says that if he thinks, then necessarily he exists (see the instantiation principle). Third, this proposition "I am, I exist" is held true not based on a deduction (as mentioned above) or on empirical induction but on the clarity and self-evidence of the proposition. Descartes does not use this first certainty, the cogito, as a foundation upon which to build further knowledge; rather, it is the firm ground upon which he can stand as he works to discover further truths. As he puts it:

Archimedes used to demand just one firm and immovable point in order to shift the entire earth; so I too can hope for great things if I manage to find just one thing, however slight, that is certain and unshakable. (AT VII 24; CSM II 16)

According to many Descartes specialists, including Étienne Gilson, Descartes's goal in establishing this first truth is to demonstrate the capacity of his criterion, the immediate clarity and distinctiveness of self-evident propositions, to establish true and justified propositions despite having adopted a method of generalized doubt. As a consequence of this demonstration, Descartes considers science and mathematics to be justified to the extent that their proposals are established on a similarly immediate clarity, distinctiveness, and self-evidence that presents itself to the mind. The originality of Descartes's thinking, therefore, lies not so much in expressing the cogito (a feat accomplished by other predecessors, as we shall see) as in using the cogito to demonstrate the most fundamental epistemological principle: that science and mathematics are justified by relying on clarity, distinctiveness, and self-evidence. In the Prolegomenon to his Principia philosophiae cartesianae, Baruch Spinoza identified the "cogito, ergo sum", that is, the "ego sum cogitans" ("I am a thinking being"), as the thinking substance, in keeping with his ontological interpretation.

Predecessors

Although the idea expressed in cogito, ergo sum is widely attributed to Descartes, he was not the first to mention it. Plato spoke about the "knowledge of knowledge" (Greek: νόησις νοήσεως, nóesis noéseos), and Aristotle explains the idea at length:

But if life itself is good and pleasant…and if one who sees is conscious that he sees, one who hears that he hears, one who walks that he walks and similarly for all the other human activities there is a faculty that is conscious of their exercise, so that whenever we perceive, we are conscious that we perceive, and whenever we think, we are conscious that we think, and to be conscious that we are perceiving or thinking is to be conscious that we exist... (Nicomachean Ethics, 1170a 25 ff.)

The Cartesian statement was interpreted to be an Aristotelian syllogism where the premise that all thinkers are also beings is not made explicit.
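
Read this way, the inference can be set out as a simple first-order schema. The following is a minimal sketch in LaTeX notation; the predicate symbols T ("thinks") and E ("exists") and the constant i ("I") are illustrative choices, not notation drawn from Descartes or his commentators:

\[
\frac{\forall x\,\bigl(T(x) \rightarrow E(x)\bigr) \qquad T(i)}{E(i)}
\]

The major premise, that whatever thinks also is, is exactly the step that this syllogistic reading treats as left implicit in "cogito, ergo sum".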

In the late sixth or early fifth century BC, Parmenides is quoted as saying "For to be aware and to be are the same". (Fragment B3)

In the early fifth century AD, Augustine of Hippo in De Civitate Dei (book XI, 26) affirmed his certain knowledge of his own existence, and added: "So far as these truths are concerned, I do not at all fear the arguments of the Academics when they say, What if you are mistaken? For if I am mistaken, I exist." This formulation (si fallor, sum) is sometimes called the Augustinian cogito. In 1640, Descartes wrote to thank Andreas Colvius (a friend of Descartes's mentor, Isaac Beeckman) for drawing his attention to Augustine:

I am obliged to you for drawing my attention to the passage of St Augustine relevant to my I am thinking, therefore I exist. I went today to the library of this town to read it, and I do indeed find that he does use it to prove the certainty of our existence. He goes on to show that there is a certain likeness of the Trinity in us, in that we exist, we know that we exist, and we love the existence and the knowledge we have. I, on the other hand, use the argument to show that this I which is thinking is an immaterial substance with no bodily element. These are two very different things. In itself it is such a simple and natural thing to infer that one exists from the fact that one is doubting that it could have occurred to any writer. But I am very glad to find myself in agreement with St Augustine, if only to hush the little minds who have tried to find fault with the principle.

Another predecessor was Avicenna's "Floating Man" thought experiment on human self-awareness and self-consciousness.

The 8th-century Hindu philosopher Adi Shankara wrote, in a similar fashion, that no one thinks "I am not", arguing that one's existence cannot be doubted, as there must be someone there to do the doubting. The central idea of cogito, ergo sum is also the topic of the Mandukya Upanishad.

The Spanish philosopher Gómez Pereira, in his 1554 work De Inmortalitate Animae, wrote "nosco me aliquid noscere, & quidquid noscit, est, ergo ego sum" ("I know that I know something; anyone who knows, is; therefore I am").

Critique

Use of "I"

In Descartes, The Project of Pure Enquiry, Bernard Williams provides a history and full evaluation of this issue. The first to raise the "I" problem was Pierre Gassendi, who, in his Disquisitio Metaphysica, as noted by Saul Fisher, "points out that recognition that one has a set of thoughts does not imply that one is a particular thinker or another. …[T]he only claim that is indubitable here is the agent-independent claim that there is cognitive activity present."

The objection, as presented by Georg Lichtenberg, is that rather than supposing an entity that is thinking, Descartes should have said: "thinking is occurring." That is, whatever the force of the cogito, Descartes draws too much from it; the existence of a thinking thing, the reference of the "I," is more than the cogito can justify. Friedrich Nietzsche criticized the phrase in that it presupposes that there is an "I", that there is such an activity as "thinking", and that "I" know what "thinking" is. He suggested a more appropriate phrase would be "it thinks" wherein the "it" could be an impersonal subject as in the sentence "It is raining."

Kierkegaard

The Danish philosopher Søren Kierkegaard calls the phrase a tautology in his Concluding Unscientific Postscript. He argues that the cogito already presupposes the existence of "I", and therefore concluding with existence is logically trivial. Kierkegaard's argument can be made clearer if one extracts the premise "I think" into the premises "'x' thinks" and "I am that 'x'", where "x" is used as a placeholder in order to disambiguate the "I" from the thinking thing.

Here, the cogito has already assumed the "I"'s existence as that which thinks. For Kierkegaard, Descartes is merely "developing the content of a concept", namely that the "I", which already exists, thinks. As Kierkegaard argues, the proper logical flow of argument is that existence is already assumed or presupposed in order for thinking to occur, not that existence is concluded from that thinking.
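
Kierkegaard's reading can be displayed schematically. This is a minimal sketch using the placeholder notation from the passage above, not a formalization Kierkegaard himself gives:

\[
\text{(1) } x \text{ thinks.} \qquad \text{(2) I am that } x. \qquad \therefore \text{ I exist.}
\]

Because premise (2) already identifies an existing "I" with the thinker x, the conclusion adds nothing that was not presupposed, which on this reading is why the cogito counts as a tautology.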

Williams

Bernard Williams claims that what we are dealing with when we talk of thought, or when we say "I am thinking," is something conceivable from a third-person perspective—namely objective "thought-events" in the former case, and an objective thinker in the latter. He argues, first, that it is impossible to make sense of "there is thinking" without relativizing it to something. However, this something cannot be Cartesian egos, because it is impossible to differentiate objectively between things just on the basis of the pure content of consciousness. The obvious problem is that, through introspection, or our experience of consciousness, we have no way of moving to conclude the existence of any third-personal fact, to conceive of which would require something above and beyond just the purely subjective contents of the mind.

Heidegger

As a critic of Cartesian subjectivity, Heidegger sought to ground human subjectivity in death as that certainty which individualizes and authenticates our being. As he wrote in 1925 in History of the Concept of Time:

This certainty, that "I myself am in that I will die," is the basic certainty of Dasein itself. It is a genuine statement of Dasein, while cogito sum is only the semblance of such a statement. If such pointed formulations mean anything at all, then the appropriate statement pertaining to Dasein in its being would have to be sum moribundus [I am in dying], moribundus not as someone gravely ill or wounded, but insofar as I am, I am moribundus. The MORIBUNDUS first gives the SUM its sense.

John Macmurray

The Scottish philosopher John Macmurray rejects the cogito outright in order to place action at the center of a philosophical system he entitles the Form of the Personal. "We must reject this, both as standpoint and as method. If this be philosophy, then philosophy is a bubble floating in an atmosphere of unreality." The reliance on thought creates an irreconcilable dualism between thought and action in which the unity of experience is lost, thus dissolving the integrity of our selves, and destroying any connection with reality. In order to formulate a more adequate cogito, Macmurray proposes the substitution of "I do" for "I think," ultimately leading to a belief in God as an agent to whom all persons stand in relation.

Quantum cryptography

From Wikipedia, the free encyclopedia