
Friday, July 26, 2024

Information Age

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Information_Age
 
Third Industrial Revolution
1947–Present
A laptop connects to the Internet to display information from Wikipedia; long-distance communication between computer systems is a hallmark of the Information Age

The Information Age (also known as the Third Industrial Revolution, Computer Age, Digital Age, Silicon Age, New Media Age, Internet Age, or the Digital Revolution) is a historical period that began in the mid-20th century. It is characterized by a rapid shift from traditional industries, as established during the Industrial Revolution, to an economy centered on information technology. The onset of the Information Age has been linked to the development of the transistor in 1947 and the optical amplifier in 1957. These technological advances have had a significant impact on the way information is processed and transmitted.

According to the United Nations Public Administration Network, the Information Age was formed by capitalizing on computer microminiaturization advances, which led to modernized information systems and internet communications as the driving force of social evolution.

There is ongoing debate over whether, and when, the Third Industrial Revolution ended and the Fourth Industrial Revolution began, with proposed dates ranging from 2000 to 2020.

History

The digital revolution converted technology from analog format to digital format. By doing this, it became possible to make copies that were identical to the original. In digital communications, for example, repeating hardware was able to amplify the digital signal and pass it on with no loss of information in the signal. Of equal importance to the revolution was the ability to easily move the digital information between media, and to access or distribute it remotely.

One turning point of the revolution was the change from analog to digitally recorded music. During the 1980s the digital format of optical compact discs gradually replaced analog formats, such as vinyl records and cassette tapes, as the popular medium of choice.

Previous inventions

Humans have manufactured tools for counting and calculating since ancient times, such as the abacus, astrolabe, equatorium, and mechanical timekeeping devices. More complicated devices started appearing in the 1600s, including the slide rule and mechanical calculators. By the early 1800s, the First Industrial Revolution had produced mass-market calculators like the arithmometer and the enabling technology of the punch card. Charles Babbage proposed a mechanical general-purpose computer called the Analytical Engine, but it was never successfully built, and was largely forgotten by the 20th century and unknown to most of the inventors of modern computers.

The Second Industrial Revolution in the last quarter of the 19th century developed useful electrical circuits and the telegraph. In the 1880s, Herman Hollerith developed electromechanical tabulating and calculating devices using punch cards and unit record equipment, which became widespread in business and government.

Meanwhile, various analog computer systems used electrical, mechanical, or hydraulic systems to model problems and calculate answers. These included an 1872 tide-predicting machine, differential analysers, perpetual calendar machines, the Deltar for water management in the Netherlands, network analyzers for electrical systems, and various machines for aiming military guns and bombs. The construction of problem-specific analog computers continued in the late 1940s and beyond, with FERMIAC for neutron transport, Project Cyclone for various military applications, and the Phillips Machine for economic modeling.

Building on the complexity of the Z1 and Z2, German inventor Konrad Zuse used electromechanical systems to complete the Z3 in 1941, the world's first working programmable, fully automatic digital computer. Also during World War II, Allied engineers constructed electromechanical bombes to break German Enigma machine encoding. The base-10 electromechanical Harvard Mark I was completed in 1944, and was to some degree improved with inspiration from Charles Babbage's designs.

1947–1969: Origins

A Pennsylvania state historical marker in Philadelphia cites the creation of ENIAC, the "first all-purpose digital computer", in 1946 as the beginning of the Information Age.

In 1947, the first working transistor, the germanium-based point-contact transistor, was invented by John Bardeen and Walter Houser Brattain while working under William Shockley at Bell Labs. This led the way to more advanced digital computers. From the late 1940s, universities, military, and businesses developed computer systems to digitally replicate and automate previously manually performed mathematical calculations, with the LEO being the first commercially available general-purpose computer.

Digital communication became economical for widespread adoption after the invention of the personal computer in the 1970s. Claude Shannon, a Bell Labs mathematician, is credited for having laid out the foundations of digitalization in his pioneering 1948 article, A Mathematical Theory of Communication.

Other important technological developments included the invention of the monolithic integrated circuit chip by Robert Noyce at Fairchild Semiconductor in 1959 (made possible by the planar process developed by Jean Hoerni), the first successful metal–oxide–semiconductor field-effect transistor (MOSFET, or MOS transistor) by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959, and the development of the complementary MOS (CMOS) process by Frank Wanlass and Chih-Tang Sah at Fairchild in 1963.

In 1962 AT&T deployed the T-carrier for long-haul pulse-code modulation (PCM) digital voice transmission. The T1 format carried 24 pulse-code modulated, time-division multiplexed speech signals, each encoded in a 64 kbit/s stream, plus 8 kbit/s of framing information that facilitated synchronization and demultiplexing at the receiver. Over the subsequent decades the digitisation of voice became the norm for all but the last mile (where analogue continued to be the norm right into the late 1990s).
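As a quick arithmetic check, the channel and framing rates quoted above add up to the familiar 1.544 Mbit/s T1 line rate. A minimal sketch in Python, using only the figures from the paragraph:

channels = 24
channel_rate_kbit = 64          # kbit/s per PCM voice channel
framing_kbit = 8                # kbit/s of framing overhead
t1_rate_kbit = channels * channel_rate_kbit + framing_kbit
print(t1_rate_kbit)             # 1544 kbit/s, i.e. the 1.544 Mbit/s T1 line rate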

Following the development of MOS integrated circuit chips in the early 1960s, MOS chips reached higher transistor density and lower manufacturing costs than bipolar integrated circuits by 1964. MOS chips further increased in complexity at a rate predicted by Moore's law, leading to large-scale integration (LSI) with hundreds of transistors on a single MOS chip by the late 1960s. The application of MOS LSI chips to computing was the basis for the first microprocessors, as engineers began recognizing that a complete computer processor could be contained on a single MOS LSI chip. In 1968, Fairchild engineer Federico Faggin improved MOS technology with his development of the silicon-gate MOS chip, which he later used to develop the Intel 4004, the first single-chip microprocessor. It was released by Intel in 1971, and laid the foundations for the microcomputer revolution that began in the 1970s.

MOS technology also led to the development of semiconductor image sensors suitable for digital cameras. The first such image sensor was the charge-coupled device, developed by Willard S. Boyle and George E. Smith at Bell Labs in 1969, based on MOS capacitor technology.

1969–1989: Invention of the internet, rise of home computers

A visualization of the various routes through a portion of the Internet (created via The Opte Project)

The public was first introduced to the concepts that led to the Internet when a message was sent over the ARPANET in 1969. Packet switched networks such as ARPANET, Mark I, CYCLADES, Merit Network, Tymnet, and Telenet, were developed in the late 1960s and early 1970s using a variety of protocols. The ARPANET in particular led to the development of protocols for internetworking, in which multiple separate networks could be joined into a network of networks.

The Whole Earth movement of the 1960s advocated the use of new technology.

The 1970s saw the introduction of the home computer, time-sharing computers, the video game console, and the first coin-operated video games; the golden age of arcade video games began with Space Invaders. As digital technology proliferated, and the switch from analog to digital record keeping became the new standard in business, a relatively new job description was popularized: the data entry clerk. Culled from the ranks of secretaries and typists from earlier decades, the data entry clerk's job was to convert analog data (customer records, invoices, etc.) into digital data.

In developed nations, computers achieved semi-ubiquity during the 1980s as they made their way into schools, homes, business, and industry. Automated teller machines, industrial robots, CGI in film and television, electronic music, bulletin board systems, and video games all fueled what became the zeitgeist of the 1980s. Millions of people purchased home computers, making household names of early personal computer manufacturers such as Apple, Commodore, and Tandy. To this day the Commodore 64 is often cited as the best-selling computer of all time, having sold 17 million units (by some accounts) between 1982 and 1994.

In 1984, the U.S. Census Bureau began collecting data on computer and Internet use in the United States; their first survey showed that 8.2% of all U.S. households owned a personal computer in 1984, and that households with children under the age of 18 were nearly twice as likely to own one at 15.3% (middle and upper middle class households were the most likely to own one, at 22.9%). By 1989, 15% of all U.S. households owned a computer, and nearly 30% of households with children under the age of 18 owned one. By the late 1980s, many businesses were dependent on computers and digital technology.

Motorola created the first mobile phone, the Motorola DynaTAC, in 1983. However, this device used analog communication; digital cell phones were not sold commercially until 1991, when the first 2G network was opened in Finland to accommodate the unexpected demand for cell phones that had become apparent in the late 1980s.

Compute! magazine predicted that CD-ROM would be the centerpiece of the revolution, with multiple household devices reading the discs.

The first true digital camera was created in 1988, and the first models were marketed in December 1989 in Japan and in 1990 in the United States. By the mid-2000s, digital cameras had eclipsed traditional film in popularity.

Digital ink was also invented in the late 1980s. Disney's CAPS system (created in 1988) was used for a scene in 1989's The Little Mermaid and for all of its animated films between 1990's The Rescuers Down Under and 2004's Home on the Range.

1989–2005: Invention of the World Wide Web, mainstreaming of the Internet, Web 1.0

Tim Berners-Lee invented the World Wide Web in 1989.

The first public digital HDTV broadcast was of the 1990 World Cup that June; it was shown in 10 theaters in Spain and Italy. However, HDTV did not become a standard until the mid-2000s outside Japan.

The World Wide Web, which had previously been available only to governments and universities, became publicly accessible in 1991. In 1993 Marc Andreessen and Eric Bina introduced Mosaic, the first web browser capable of displaying inline images and the basis for later browsers such as Netscape Navigator and Internet Explorer. Stanford Federal Credit Union was the first financial institution to offer online internet banking services to all of its members, in October 1994. In 1996 OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe. The Internet expanded quickly, and by 1996 it was part of mass culture and many businesses listed websites in their ads. By 1999, almost every country had a connection, and nearly half of Americans and people in several other countries used the Internet on a regular basis. Throughout the 1990s, however, "getting online" entailed complicated configuration, and dial-up was the only connection type affordable by individual users; present-day mass Internet culture was not yet possible.

In 1989, about 15% of all households in the United States owned a personal computer. For households with children, nearly 30% owned a computer in 1989, and in 2000, 65% owned one.

Cell phones became as ubiquitous as computers by the early 2000s, with movie theaters beginning to show ads telling people to silence their phones. They also became much more advanced than phones of the 1990s, most of which only took calls or at most allowed for the playing of simple games.

Text messaging became widely used worldwide in the late 1990s, except in the United States, where it did not become commonplace until the early 2000s.

The digital revolution also became truly global in this period: after revolutionizing society in the developed world in the 1990s, it spread to the masses in the developing world in the 2000s.

By 2000, a majority of U.S. households had at least one personal computer, and a majority had internet access the following year. In 2002, a majority of U.S. survey respondents reported having a mobile phone.

2005–2020: Web 2.0, social media, smartphones, digital TV

In late 2005 the number of Internet users reached 1 billion, and 3 billion people worldwide used cell phones by the end of the decade. HDTV became the standard television broadcasting format in many countries by the end of the decade. In September and December 2006 respectively, Luxembourg and the Netherlands became the first countries to completely transition from analog to digital television. In September 2007, a majority of U.S. survey respondents reported having broadband internet at home. According to estimates from Nielsen Media Research, approximately 45.7 million U.S. households in 2006 (or approximately 40 percent of approximately 114.4 million) owned a dedicated home video game console, and by 2015, 51 percent of U.S. households owned a dedicated home video game console according to an Entertainment Software Association annual industry report. By 2012, over 2 billion people used the Internet, twice the number using it in 2007. Cloud computing had entered the mainstream by the early 2010s. In January 2013, a majority of U.S. survey respondents reported owning a smartphone. By 2016, half of the world's population was connected, and as of 2020 that share had risen to 67%.

Rise in digital technology use

In the late 1980s, less than 1% of the world's technologically stored information was in digital format, while it was 94% in 2007, with more than 99% by 2014.

It is estimated that the world's capacity to store information has increased from 2.6 (optimally compressed) exabytes in 1986, to some 5,000 exabytes in 2014 (5 zettabytes).

1990

  • Cell phone subscribers: 12.5 million (0.25% of world population in 1990)
  • Internet users: 2.8 million (0.05% of world population in 1990)

2000

  • Cell phone subscribers: 1.5 billion (19% of world population in 2002)
  • Internet users: 631 million (11% of world population in 2002)

2010

  • Cell phone subscribers: 4 billion (68% of world population in 2010)
  • Internet users: 1.8 billion (26.6% of world population in 2010)

2020

  • Cell phone subscribers: 4.78 billion (62% of world population in 2020)
  • Internet users: 4.54 billion (59% of world population in 2020)
A university computer lab containing many desktop PCs

Overview of early developments

A timeline of major milestones of the Information Age, from the first message sent by the Internet protocol suite to global Internet access

Library expansion and Moore's law

Library expansion was calculated in 1945 by Fremont Rider to double in capacity every 16 years, if sufficient space were made available. He advocated replacing bulky, decaying printed works with miniaturized microform analog photographs, which could be duplicated on-demand for library patrons and other institutions.

Rider did not foresee, however, the digital technology that would follow decades later to replace analog microform with digital imaging, storage, and transmission media, whereby vast increases in the rapidity of information growth would be made possible through automated, potentially lossless digital technologies. Accordingly, Moore's law, formulated around 1965, predicted that the number of transistors in a dense integrated circuit doubles approximately every two years.
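A minimal sketch, in Python, of the doubling behaviour Moore's law describes; the two-year doubling period stated above is the only input, and the function name is purely illustrative:

def transistors(start_count, years, doubling_period_years=2):
    """Project a transistor count forward, assuming steady doubling."""
    return start_count * 2 ** (years / doubling_period_years)

# Anything doubling every two years grows roughly 32-fold per decade.
print(transistors(1, 10))   # 32.0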

By the early 1980s, along with improvements in computing power, the proliferation of the smaller and less expensive personal computers allowed for immediate access to information and the ability to share and store it. Connectivity between computers within organizations enabled access to greater amounts of information.

Information storage and Kryder's law

Hilbert & López (2011). The World's Technological Capacity to Store, Communicate, and Compute Information. Science, 332(6025), 60–65.

The world's technological capacity to store information grew from 2.6 (optimally compressed) exabytes (EB) in 1986 to 15.8 EB in 1993, over 54.5 EB in 2000, and 295 (optimally compressed) EB in 2007. This is the informational equivalent of less than one 730-megabyte (MB) CD-ROM per person in 1986 (539 MB per person), roughly four CD-ROMs per person in 1993, twelve CD-ROMs per person in the year 2000, and almost sixty-one CD-ROMs per person in 2007. It is estimated that the world's capacity to store information reached 5 zettabytes in 2014, the informational equivalent of 4,500 stacks of printed books from the Earth to the Sun.
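The per-person CD-ROM equivalents above follow from simple division. A rough check, assuming a 2007 world population of about 6.6 billion (a figure not given in the text):

storage_2007_bytes = 295e18          # 295 optimally compressed exabytes
cd_rom_bytes = 730e6                 # 730 MB per CD-ROM
world_pop_2007 = 6.6e9               # assumed world population in 2007
cds_per_person = storage_2007_bytes / cd_rom_bytes / world_pop_2007
print(round(cds_per_person, 1))      # ~61.2, close to the quoted "almost sixty-one"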

The amount of digital data stored appears to be growing approximately exponentially, reminiscent of Moore's law. Kryder's law likewise observes that the amount of storage space available grows approximately exponentially.

Information transmission

The world's technological capacity to receive information through one-way broadcast networks was 432 exabytes of (optimally compressed) information in 1986; 715 (optimally compressed) exabytes in 1993; 1.2 (optimally compressed) zettabytes in 2000; and 1.9 zettabytes in 2007, the information equivalent of 174 newspapers per person per day.

The world's effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of (optimally compressed) information in 1986; 471 petabytes in 1993; 2.2 (optimally compressed) exabytes in 2000; and 65 (optimally compressed) exabytes in 2007, the information equivalent of six newspapers per person per day. In the 1990s, the spread of the Internet caused a sudden leap in access to and ability to share information in businesses and homes globally. A computer that cost $3000 in 1997 would cost $2000 two years later and $1000 the following year, due to the rapid advancement of technology.

Computation

The world's technological capacity to compute information with human-guided general-purpose computers grew from 3.0 × 10^8 MIPS in 1986, to 4.4 × 10^9 MIPS in 1993, to 2.9 × 10^11 MIPS in 2000, and to 6.4 × 10^12 MIPS in 2007. An article featured in the journal Trends in Ecology and Evolution in 2016 reported that:

Digital technology has vastly exceeded the cognitive capacity of any single human being and has done so a decade earlier than predicted. In terms of capacity, there are two measures of importance: the number of operations a system can perform and the amount of information that can be stored. The number of synaptic operations per second in a human brain has been estimated to lie between 10^15 and 10^17. While this number is impressive, even in 2007 humanity's general-purpose computers were capable of performing well over 10^18 instructions per second. Estimates suggest that the storage capacity of an individual human brain is about 10^12 bytes. On a per capita basis, this is matched by current digital storage (5x10^21 bytes per 7.2x10^9 people).
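The per-capita comparison in the quoted passage is plain arithmetic; a minimal sketch using only the figures quoted above:

digital_storage_bytes = 5e21     # ~5 zettabytes of digital storage (figure quoted above)
world_population = 7.2e9         # population figure quoted above
brain_capacity_bytes = 1e12      # ~10^12 bytes, the brain-storage estimate quoted above
per_capita = digital_storage_bytes / world_population
print(per_capita)                           # ~6.9e11 bytes per person
print(per_capita / brain_capacity_bytes)    # ~0.7, i.e. roughly on par with one brain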

Genetic information

Genetic code may also be considered part of the information revolution. Now that sequencing has been computerized, genomes can be rendered and manipulated as data. This started with DNA sequencing, invented by Walter Gilbert and Allan Maxam in 1976–1977 and by Frederick Sanger in 1977, grew steadily with the Human Genome Project (initially conceived by Gilbert), and finally reached practical applications of sequencing, such as gene testing, after Myriad Genetics discovered the BRCA1 breast cancer gene mutation. Sequence data in GenBank has grown from the 606 genome sequences registered in December 1982 to 231 million genomes in August 2021. An additional 13 trillion incomplete sequences are registered in the Whole Genome Shotgun submission database as of August 2021. The information contained in these registered sequences has doubled every 18 months.

Different stage conceptualizations

During rare times in human history, there have been periods of innovation that have transformed human life. The Neolithic Age, the Scientific Age and the Industrial Age all, ultimately, induced discontinuous and irreversible changes in the economic, social and cultural elements of the daily life of most people. Traditionally, these epochs have taken place over hundreds, or in the case of the Neolithic Revolution, thousands of years, whereas the Information Age swept to all parts of the globe in just a few years, as a result of the rapidly advancing speed of information exchange.

Between 7,000 and 10,000 years ago, during the Neolithic period, humans began to domesticate animals, farm grains, and replace stone tools with ones made of metal. These innovations allowed nomadic hunter-gatherers to settle down. Villages formed along the Yangtze River in China in 6,500 B.C., in the Nile River region of Africa, and in Mesopotamia (Iraq) in 6,000 B.C. Cities emerged between 6,000 B.C. and 3,500 B.C. The development of written communication (cuneiform in Sumeria and hieroglyphs in Egypt in 3,500 B.C., writing in Egypt in 2,560 B.C., and in Minoa and China around 1,450 B.C.) enabled ideas to be preserved for extended periods and to spread extensively. In all, Neolithic developments, augmented by writing as an information tool, laid the groundwork for the advent of civilization.

The Scientific Age began in the period between Copernicus's 1543 argument that the planets orbit the Sun and Newton's publication of the laws of motion and gravity in Principia in 1687. This age of discovery continued through the 18th century, accelerated by widespread use of the movable type printing press invented by Johannes Gutenberg.

The Industrial Age began in Great Britain in 1760 and continued into the mid-19th century. The invention of machines such as the mechanical textile weaver by Edmund Cartwright, the rotating-shaft steam engine by James Watt, and the cotton gin by Eli Whitney, along with processes for mass manufacturing, came to serve the needs of a growing global population. The Industrial Age harnessed steam and waterpower to reduce the dependence on animal and human physical labor as the primary means of production. Thus, the core of the Industrial Revolution was the generation and distribution of energy from coal and water to produce steam and, later in the 20th century, electricity.

The Information Age also requires electricity to power the global networks of computers that process and store data. However, what dramatically accelerated the pace of the Information Age's adoption, compared with previous ages, was the speed by which knowledge could be transferred and pervade the entire human family within a few short decades. This acceleration came about with the adoption of a new form of power. Beginning in 1972, engineers devised ways to harness light to convey data through fiber optic cable. Today, light-based optical networking systems at the heart of telecom networks and the Internet span the globe and carry most of the information traffic to and from users and data storage systems.

Three stages of the Information Age

There are different conceptualizations of the Information Age. Some focus on the evolution of information over the ages, distinguishing between the Primary Information Age and the Secondary Information Age. Information in the Primary Information Age was handled by newspapers, radio, and television. The Secondary Information Age was developed by the Internet, satellite television, and mobile phones. The Tertiary Information Age emerged from the interconnection of the media of the Primary Information Age with the media of the Secondary Information Age, as experienced at present.

Stages of development expressed as Kondratiev waves

Others classify it in terms of the well-established Schumpeterian long waves or Kondratiev waves. Here authors distinguish three different long-term metaparadigms, each with different long waves. The first focused on the transformation of material, including stone, bronze, and iron. The second, often referred to as industrial revolution, was dedicated to the transformation of energy, including water, steam, electric, and combustion power. Finally, the most recent metaparadigm aims at transforming information. It started out with the proliferation of communication and stored data and has now entered the age of algorithms, which aims at creating automated processes to convert the existing information into actionable knowledge.

Information in social and economic activities

The main feature of the information revolution is the growing economic, social and technological role of information. Information-related activities did not begin with the Information Revolution. They existed, in one form or another, in all human societies, and eventually developed into institutions, such as the Platonic Academy, Aristotle's Peripatetic school in the Lyceum, the Musaeum and the Library of Alexandria, or the schools of Babylonian astronomy. The Agricultural Revolution and the Industrial Revolution emerged when new informational inputs were produced by individual innovators, or by scientific and technical institutions. During the Information Revolution all these activities are experiencing continuous growth, while other information-oriented activities are emerging.

Information is the central theme of several new sciences, which emerged in the 1940s, including Shannon's (1949) Information Theory and Wiener's (1948) Cybernetics. Wiener stated: "information is information, not matter or energy". This aphorism suggests that information should be considered along with matter and energy as the third constituent part of the Universe; information is carried by matter or by energy. By the 1990s some writers believed that changes implied by the Information Revolution would lead not only to a fiscal crisis for governments but also to the disintegration of all "large structures".

The theory of information revolution

The term information revolution may relate to, or contrast with, such widely used terms as Industrial Revolution and Agricultural Revolution. Note, however, that some may prefer a mentalist to a materialist paradigm. The following fundamental aspects of the theory of information revolution can be given:

  1. The object of economic activities can be conceptualized according to the fundamental distinction between matter, energy, and information. These apply both to the object of each economic activity, as well as within each economic activity or enterprise. For instance, an industry may process matter (e.g. iron) using energy and information (production and process technologies, management, etc.).
  2. Information is a factor of production (along with capital, labor, and land), as well as a product sold in the market, that is, a commodity. As such, it acquires use value and exchange value, and therefore a price.
  3. All products have use value, exchange value, and informational value. The latter can be measured by the information content of the product, in terms of innovation, design, etc.
  4. Industries develop information-generating activities, the so-called Research and Development (R&D) functions.
  5. Enterprises, and society at large, develop the information control and processing functions, in the form of management structures; these are also called "white-collar workers", "bureaucracy", "managerial functions", etc.
  6. Labor can be classified according to the object of labor, into information labor and non-information labor.
  7. Information activities constitute a large, new economic sector, the information sector, along with the traditional primary sector, secondary sector, and tertiary sector, according to the three-sector hypothesis. These should be restated because they are based on the ambiguous definitions made by Colin Clark (1940), who included in the tertiary sector all activities that had not been included in the primary (agriculture, forestry, etc.) and secondary (manufacturing) sectors. The quaternary sector and the quinary sector of the economy attempt to classify these new activities, but their definitions are not based on a clear conceptual scheme, although the latter is considered by some as equivalent to the information sector.
  8. From a strategic point of view, sectors can be defined as information sector, means of production, and means of consumption, thus extending the classical Ricardo-Marx model of the Capitalist mode of production (see Influences on Karl Marx). Marx stressed on many occasions the role of the "intellectual element" in production, but failed to find a place for it in his model.
  9. Innovations are the result of the production of new information, as new products, new methods of production, patents, etc. Diffusion of innovations manifests saturation effects (related term: market saturation), following certain cyclical patterns and creating "economic waves", also referred to as "business cycles". There are various types of waves, such as the Kondratiev wave (54 years), Kuznets swing (18 years), Juglar cycle (9 years) and Kitchin cycle (about 4 years, see also Joseph Schumpeter), distinguished by their nature, duration, and, thus, economic impact.
  10. Diffusion of innovations causes structural-sectoral shifts in the economy, which can be smooth or can create crisis and renewal, a process which Joseph Schumpeter vividly called "creative destruction".

From a different perspective, Irving E. Fang (1997) identified six 'Information Revolutions': writing, printing, mass media, entertainment, the 'tool shed' (which we call 'home' now), and the information highway. In this work the term 'information revolution' is used in a narrow sense, to describe trends in communication media.

Measuring and modeling the information revolution

Porat (1976) measured the information sector in the US using input-output analysis; the OECD has included statistics on the information sector in the economic reports of its member countries. Veneris (1984, 1990) explored the theoretical, economic and regional aspects of the informational revolution and developed a systems dynamics simulation computer model.

These works can be seen as following the path originated with the work of Fritz Machlup, who in his 1962 book "The Production and Distribution of Knowledge in the United States" claimed that the "knowledge industry represented 29% of the US gross national product", which he saw as evidence that the Information Age had begun. He defined knowledge as a commodity and attempted to measure the magnitude of the production and distribution of this commodity within a modern economy. Machlup divided information use into three classes: instrumental, intellectual, and pastime knowledge. He also identified five types of knowledge: practical knowledge; intellectual knowledge, that is, general culture and the satisfying of intellectual curiosity; pastime knowledge, that is, knowledge satisfying non-intellectual curiosity or the desire for light entertainment and emotional stimulation; spiritual or religious knowledge; and unwanted knowledge, accidentally acquired and aimlessly retained.

More recent estimates have reached the following results:

  • the world's technological capacity to receive information through one-way broadcast networks grew at a sustained compound annual growth rate of 7% between 1986 and 2007;
  • the world's technological capacity to store information grew at a sustained compound annual growth rate of 25% between 1986 and 2007 (see the cross-check sketch after this list);
  • the world's effective capacity to exchange information through two-way telecommunication networks grew at a sustained compound annual growth rate of 30% during the same two decades;
  • the world's technological capacity to compute information with the help of humanly guided general-purpose computers grew at a sustained compound annual growth rate of 61% during the same period.
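As a rough cross-check, the 25% storage growth rate above can be recovered from the figures quoted earlier (2.6 EB in 1986, 295 EB in 2007). A minimal sketch:

def cagr(start, end, years):
    """Compound annual growth rate between two values over a span of years."""
    return (end / start) ** (1 / years) - 1

print(round(cagr(2.6, 295, 2007 - 1986) * 100, 1))   # ~25.3%, matching the ~25% figure above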

Economics

Eventually, information and communication technology (ICT), i.e. computers, computerized machinery, fiber optics, communication satellites, the Internet, and other ICT tools, became a significant part of the world economy, as the development of optical networking and microcomputers greatly changed many businesses and industries. Nicholas Negroponte captured the essence of these changes in his 1995 book, Being Digital, in which he discusses the similarities and differences between products made of atoms and products made of bits.

Jobs and income distribution

The Information Age has affected the workforce in several ways, such as compelling workers to compete in a global job market. One of the most evident concerns is the replacement of human labor by computers that can do their jobs faster and more effectively, creating a situation in which individuals who perform tasks that can easily be automated are forced to find employment where their labor is not as disposable. This creates particular problems for workers in industrial cities, where solutions typically involve lowering working time, which is often highly resisted. Thus, individuals who lose their jobs may be pressed to move up into more indispensable professions (e.g. engineers, doctors, lawyers, teachers, professors, scientists, executives, journalists, consultants), whose members are able to compete successfully in the world market and receive (relatively) high wages.

Along with automation, jobs traditionally associated with the middle class (e.g. assembly line, data processing, management, and supervision) have also begun to disappear as a result of outsourcing. Unable to compete with those in developing countries, production and service workers in post-industrial (i.e. developed) societies either lose their jobs through outsourcing, accept wage cuts, or settle for low-skill, low-wage service jobs. In the past, the economic fate of individuals was tied to that of their nation. For example, workers in the United States were once well paid in comparison to those in other countries. With the advent of the Information Age and improvements in communication, this is no longer the case, as workers must now compete in a global job market, in which wages are less dependent on the success or failure of individual economies.

In helping to create a globalized workforce, the internet has also allowed for increased opportunity in developing countries, making it possible for workers in such places to provide in-person services, therefore competing directly with their counterparts in other nations. This competitive advantage translates into increased opportunities and higher wages.

Automation, productivity, and job gain

The Information Age has affected the workforce in that automation and computerization have resulted in higher productivity coupled with net job loss in manufacturing. In the United States, for example, from January 1972 to August 2010, the number of people employed in manufacturing jobs fell from 17,500,000 to 11,500,000 while manufacturing value rose 270%. Although it initially appeared that job loss in the industrial sector might be partially offset by the rapid growth of jobs in information technology, the recession of March 2001 foreshadowed a sharp drop in the number of jobs in the sector. This pattern of decrease in jobs would continue until 2003, and data has shown that, overall, technology creates more jobs than it destroys even in the short run.
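A back-of-envelope reading of the U.S. manufacturing figures above, assuming that a 270% rise in value means the later value is 3.7 times the earlier one (an interpretation of the wording, not a figure from the text):

workers_1972 = 17_500_000
workers_2010 = 11_500_000
value_ratio = 3.7            # assumed reading of "manufacturing value rose 270%"
output_per_worker_ratio = value_ratio * workers_1972 / workers_2010
print(round(output_per_worker_ratio, 1))   # ~5.6x more output per manufacturing worker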

Information-intensive industry

Industry has become more information-intensive while less labor- and capital-intensive. This has important implications for the workforce: workers have become increasingly productive even as the value of their labor decreases. For the system of capitalism itself, the value of labor decreases while the value of capital increases.

In the classical model, investments in human and financial capital are important predictors of the performance of a new venture. However, as demonstrated by Mark Zuckerberg and Facebook, it now seems possible for a group of relatively inexperienced people with limited capital to succeed on a large scale.

Innovations

A visualization of the various routes through a portion of the Internet

The Information Age was enabled by technology developed in the Digital Revolution, which was itself enabled by building on the developments of the Technological Revolution.

Transistors

The onset of the Information Age can be associated with the development of transistor technology. The concept of a field-effect transistor was first theorized by Julius Edgar Lilienfeld in 1925. The first practical transistor was the point-contact transistor, invented by the engineers Walter Houser Brattain and John Bardeen while working for William Shockley at Bell Labs in 1947. This was a breakthrough that laid the foundations for modern technology. Shockley's research team also invented the bipolar junction transistor in 1952. The most widely used type of transistor is the metal–oxide–semiconductor field-effect transistor (MOSFET), invented by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1960. The complementary MOS (CMOS) fabrication process was developed by Frank Wanlass and Chih-Tang Sah in 1963.

Computers

Before the advent of electronics, mechanical computers, like the Analytical Engine in 1837, were designed to provide routine mathematical calculation and simple decision-making capabilities. Military needs during World War II drove development of the first electronic computers, based on vacuum tubes, including the Z3, the Atanasoff–Berry Computer, Colossus computer, and ENIAC.

The invention of the transistor enabled the era of mainframe computers (1950s–1970s), typified by the IBM 360. These large, room-sized computers provided data calculation and manipulation that was much faster than humanly possible, but were expensive to buy and maintain, so were initially limited to a few scientific institutions, large corporations, and government agencies.

The germanium integrated circuit (IC) was invented by Jack Kilby at Texas Instruments in 1958. The silicon integrated circuit was then invented in 1959 by Robert Noyce at Fairchild Semiconductor, using the planar process developed by Jean Hoerni, who was in turn building on Mohamed Atalla's silicon surface passivation method developed at Bell Labs in 1957. Following the invention of the MOS transistor by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959, the MOS integrated circuit was developed by Fred Heiman and Steven Hofstein at RCA in 1962. The silicon-gate MOS IC was later developed by Federico Faggin at Fairchild Semiconductor in 1968. With the advent of the MOS transistor and the MOS IC, transistor technology rapidly improved, and the ratio of computing power to size increased dramatically, giving direct access to computers to ever smaller groups of people.

The first commercial single-chip microprocessor, the Intel 4004, launched in 1971; it was developed by Federico Faggin, using his silicon-gate MOS IC technology, along with Marcian Hoff, Masatoshi Shima and Stan Mazor.

Along with electronic arcade machines and home video game consoles pioneered by Nolan Bushnell in the 1970s, the development of personal computers like the Commodore PET and Apple II (both in 1977) gave individuals access to the computer. However, data sharing between individual computers was either non-existent or largely manual, at first using punched cards and magnetic tape, and later floppy disks.

Data

The first developments for storing data were initially based on photographs, starting with microphotography in 1851 and then microform in the 1920s, with the ability to store documents on film, making them much more compact. Early information theory and Hamming codes were developed about 1950, but awaited technical innovations in data transmission and storage to be put to full use.

Magnetic-core memory was developed from the research of Frederick W. Viehe in 1947 and An Wang at Harvard University in 1949. With the advent of the MOS transistor, MOS semiconductor memory was developed by John Schmidt at Fairchild Semiconductor in 1964. In 1967, Dawon Kahng and Simon Sze at Bell Labs described how the floating gate of an MOS semiconductor device could be used for the cell of a reprogrammable ROM. Following the invention of flash memory by Fujio Masuoka at Toshiba in 1980, Toshiba commercialized NAND flash memory in 1987.

Copper wire cables transmitting digital data connected computer terminals and peripherals to mainframes, and special message-sharing systems leading to email, were first developed in the 1960s. Independent computer-to-computer networking began with ARPANET in 1969. This expanded to become the Internet (coined in 1974). Access to the Internet improved with the invention of the World Wide Web in 1991. The capacity expansion from dense wave division multiplexing, optical amplification and optical networking in the mid-1990s led to record data transfer rates. By 2018, optical networks routinely delivered 30.4 terabits/s over a fiber optic pair, the data equivalent of 1.2 million simultaneous 4K HD video streams.
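The 4K-stream equivalence quoted above implies a per-stream bitrate, recoverable by simple division; a minimal sketch:

fiber_pair_bits_per_s = 30.4e12      # 30.4 terabits per second over one fiber pair
streams = 1.2e6                      # 1.2 million simultaneous 4K video streams
per_stream_mbit = fiber_pair_bits_per_s / streams / 1e6
print(round(per_stream_mbit, 1))     # ~25.3 Mbit/s per stream, a plausible 4K bitrate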

MOSFET scaling, the rapid miniaturization of MOSFETs at a rate predicted by Moore's law, led to computers becoming smaller and more powerful, to the point where they could be carried. During the 1980s–1990s, laptops were developed as a form of portable computer, and personal digital assistants (PDAs) could be used while standing or walking. Pagers, widely used by the 1980s, were largely replaced by mobile phones beginning in the late 1990s, providing mobile networking features to some computers. Now commonplace, this technology is extended to digital cameras and other wearable devices. Starting in the late 1990s, tablets and then smartphones combined and extended these abilities of computing, mobility, and information sharing. Metal–oxide–semiconductor (MOS) image sensors, which first began appearing in the late 1960s, led to the transition from analog to digital imaging, and from analog to digital cameras, during the 1980s–1990s. The most common image sensors are the charge-coupled device (CCD) sensor and the CMOS (complementary MOS) active-pixel sensor (CMOS sensor).

Electronic paper, which has origins in the 1970s, allows digital information to appear as paper documents.

Personal computers

By 1976, there were several firms racing to introduce the first truly successful commercial personal computers. Three machines, the Apple II, Commodore PET 2001, and TRS-80, were all released in 1977, becoming the most popular by late 1978. Byte magazine later referred to Commodore, Apple, and Tandy as the "1977 Trinity". Also in 1977, Sord Computer Corporation released the Sord M200 Smart Home Computer in Japan.

Apple II

April 1977: Apple II.

Steve Wozniak (known as "Woz"), a regular visitor to Homebrew Computer Club meetings, designed the single-board Apple I computer and first demonstrated it there. With specifications in hand and an order for 100 machines at US$500 each from the Byte Shop, Woz and his friend Steve Jobs founded Apple Computer.

About 200 of the machines sold before the company announced the Apple II as a complete computer. It had color graphics, a full QWERTY keyboard, and internal slots for expansion, all mounted in a high-quality, streamlined plastic case. The monitor and I/O devices were sold separately. The original Apple II operating system was only the built-in BASIC interpreter contained in ROM. Apple DOS was added to support the diskette drive; the last version was "Apple DOS 3.3".

Its higher price and lack of floating point BASIC, along with a lack of retail distribution sites, caused it to lag in sales behind the other Trinity machines until 1979, when it surpassed the PET. It was again pushed into 4th place when Atari, Inc. introduced its Atari 8-bit computers.

Despite slow initial sales, the lifetime of the Apple II series was about eight years longer than that of other machines, and so it accumulated the highest total sales. By 1985, 2.1 million had been sold, and more than 4 million Apple IIs had been shipped by the end of production in 1993.

Optical networking

Optical communication plays a crucial role in communication networks. Optical communication provides the transmission backbone for the telecommunications and computer networks that underlie the Internet, the foundation for the Digital Revolution and Information Age.

The two core technologies are the optical fiber and light amplification (the optical amplifier). In 1953, Bram van Heel demonstrated image transmission through bundles of optical fibers with a transparent cladding. The same year, Harold Hopkins and Narinder Singh Kapany at Imperial College succeeded in making image-transmitting bundles with over 10,000 optical fibers, and subsequently achieved image transmission through a 75 cm long bundle which combined several thousand fibers.

Gordon Gould invented the optical amplifier and the laser, and also established the first optical telecommunications company, Optelecom, to design communication systems. The firm was a co-founder of Ciena Corp., the venture that popularized the optical amplifier with the introduction of the first dense wave division multiplexing system. This massive-scale communication technology has emerged as the common basis of all telecommunication networks and, thus, a foundation of the Information Age.

Economy, society and culture

Manuel Castells captures the significance of the Information Age in The Information Age: Economy, Society and Culture when he writes of our global interdependence and the new relationships between economy, state and society, what he calls "a new society-in-the-making." He cautions that just because humans have dominated the material world does not mean that the Information Age is the end of history:

"It is in fact, quite the opposite: history is just beginning, if by history we understand the moment when, after millennia of a prehistoric battle with Nature, first to survive, then to conquer it, our species has reached the level of knowledge and social organization that will allow us to live in a predominantly social world. It is the beginning of a new existence, and indeed the beginning of a new age, The Information Age, marked by the autonomy of culture vis-à-vis the material basis of our existence."

Thomas Chatterton Williams wrote about the dangers of anti-intellectualism in the Information Age in a piece for The Atlantic. Although access to information has never been greater, most information is irrelevant or insubstantial. The Information Age's emphasis on speed over expertise contributes to "superficial culture in which even the elite will openly disparage as pointless our main repositories for the very best that has been thought."

God helmet

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/God_helmet

The God helmet is an experimental apparatus (originally called the Koren helmet) developed by Stanley Koren and neuroscientist Michael Persinger to study creativity, religious experience and the effects of subtle stimulation of the temporal lobes. Reports by participants of a "sensed presence" while wearing the God helmet brought public attention and resulted in several TV documentaries. The device has been used in Persinger's research in the field of neurotheology, the study of the purported neural correlates of religion and spirituality. The apparatus, placed on the head of an experimental subject, generates very weak magnetic fields that Persinger refers to as "complex". Like other neural stimulation with low-intensity magnetic fields, these fields are approximately as strong as those generated by a landline telephone handset or an ordinary hair dryer, but far weaker than those of an ordinary refrigerator magnet and approximately a million times weaker than transcranial magnetic stimulation.

Persinger reports that many subjects have reported "mystical experiences and altered states" while wearing the God Helmet. The foundations of his theory have been criticized in the scientific press. Anecdotal reports by journalists, academics and documentarists have been mixed and several effects reported by Persinger have not yet been independently replicated. One attempt at replication published in the scientific literature reported a failure to reproduce Persinger's effects and the authors speculated that the suggestibility of participants, improper blinding of participants or idiosyncratic methodology could explain Persinger's results. Persinger argues that the replication was technically flawed, but the researchers have stood by their replication. However, one group has published a direct replication of one God Helmet experiment. Other groups have reported no effects at all or have generated similar experiences by using sham helmets, or helmets that are not turned on. The research using sham equipment was marred by the fact that, in one case " ... the data from the ... study (using only a sham headset) had been faked", and "the student ... (who did it) ... was banned from the University."

Development

The God Helmet was not specifically designed to elicit visions of God, but to test several of Persinger's hypotheses about brain function. The first of these is the Vectorial Hemisphericity Hypothesis, which proposes that the human sense of self has two components, one on each side of the brain, that ordinarily work together but in which the left hemisphere is usually dominant. Persinger argues that the two hemispheres make different contributions to a single sense of self, but under certain conditions can appear as two separate 'selves'. Persinger and Koren designed the God Helmet in an attempt to create conditions in which the contributions to the sense of self from both cerebral hemispheres are disrupted.

The second experimental hypothesis was that when communication between the left and right senses of self is disturbed, as they report it is while wearing the God Helmet, the usually-subordinate 'self' in the right hemisphere intrudes into the awareness of the left-hemispheric dominant self, causing what Persinger refers to as "interhemispheric intrusions".

The third hypothesis was that "visitor experiences" could be explained by such "interhemispheric intrusions" caused by a disruption in "vectorial hemisphericity". Persinger theorises that many paranormal experiences, feelings of having lived past lives, felt presences of non-physical beings, ghosts, muses, and other "spiritual beings", are examples of interhemispheric intrusions (an idea originally proposed in 1976 in Julian Jaynes' bicameral mentality hypothesis).

The God Helmet experiments were also intended, though not specifically designed (see above), to validate the idea that religious and mystic experiences are artifacts of temporal lobe function.

The device

Persinger uses a modified snowmobile helmet that incorporates solenoids placed over the temporal lobes. This device produces magnetic fields that Persinger describes as "weak but complex" (1 microtesla). The pattern of fluctuation in these magnetic fields is derived from physiological sources, for example patterns that appear in EEG traces taken from limbic structures. The purpose of exposing the brain to magnetic fields patterned after neurophysiological sources, such as the burst-firing profile of the amygdala, is to enhance the probability of activating the structure from which the signal was derived.

The sessions are conducted with the subject seated in an acoustic chamber. The acoustic chamber is also a Faraday cage, shielding out all EMF emissions and radiation except the Earth's magnetic field. Persinger reports that this shielding allows him to use the apparatus to investigate the effects of geomagnetism on the human brain.

Comparison with TMS

Neither the God Helmet, nor technologies derived from it, are examples of transcranial magnetic stimulation (TMS), which uses magnetic fields on the order of one million times stronger than those used in Persinger's lab. Despite this, Persinger reports similar effect sizes with his apparatus. The magnetic fields employed in TMS and in Persinger's experiments are also very different. TMS uses single, paired, and repetitive pulses of high intensity to penetrate the cranium. In contrast, Persinger's apparatus uses weak complex magnetic signals patterned after physiological processes, such as one derived from limbic burst firing.
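An order-of-magnitude sketch of the comparison above. The helmet field strength comes from the text; the refrigerator-magnet and TMS values are typical textbook figures assumed here for illustration:

helmet_tesla = 1e-6            # ~1 microtesla, the field strength stated in the text
fridge_magnet_tesla = 5e-3     # ~5 millitesla, assumed typical refrigerator magnet
tms_tesla = 1.0                # ~1 tesla, assumed typical TMS peak field
print(tms_tesla / helmet_tesla)            # ~1e6: the "million times weaker" comparison
print(fridge_magnet_tesla / helmet_tesla)  # ~5e3: far weaker than a refrigerator magnet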

Experiences

Most reports from Persinger's lab consist of people sensing "presences"; people often interpreted these as angels, a deceased person known to the subject, or a group of beings of some kind. There have also been reports in which the participant has experienced what they perceive as God. Persinger reports that "at least" 80 percent of his participants experience a presence beside them in the room, and others report less evocative experiences of "another consciousness or sentient being".

Anecdotal reports

The scientist and science writer Richard Dawkins, appearing in the BBC science documentary series Horizon, did not have a 'sensed presence' experience, but instead felt at times 'slightly dizzy', 'quite strange' and had sensations in his limbs and changes in his breathing. He summarised his experience as follows: "It pretty much felt as though I was in total darkness, with a helmet on my head and pleasantly relaxed". Persinger explained Dawkins' limited results in terms of his low score on a psychological scale measuring temporal lobe sensitivity.

In contrast, the experimental psychologist, and former parapsychology researcher, Susan Blackmore said: "When I went to Persinger's lab and underwent his procedures I had the most extraordinary experiences I've ever had… I'll be surprised if it turns out to be a placebo effect."

Jack Hitt, a journalist from Wired magazine, visited Persinger's lab in 1999 and expressed confusion over Persinger's post-stimulation debriefing ("One question: Did the red bulb on the wall grow larger or smaller? There was a red bulb on the wall? I hadn't noticed.") and reported: "Many other questions suggest that there were other experiences I should have had, but to be honest, I didn't. In fact, as transcendental experiences go, on a scale of 1 to 10, Persinger's helmet falls somewhere around, oh, 4. Even though I did have a fairly convincing out-of-body experience, I'm disappointed relative to the great expectations and anxieties I had going in."

Replication attempts and debate

In December 2004 Nature reported that a group of Swedish researchers led by Pehr Granqvist, a psychologist at Uppsala University in Sweden, had attempted to replicate Persinger's experiments under double-blind conditions, and were not able to reproduce the effect. The study was published in Neuroscience Letters in 2005. Granqvist et al. concluded that the presence or absence of the magnetic field had no relationship with any religious or spiritual experience reported by the participants, but was predicted entirely by their suggestibility and personality traits. Persinger, however, took issue with the Swedish attempt to replicate his work. "They didn't replicate it, not even close," he says. He argued that the Swedish group did not expose the subjects to magnetic fields for long enough to produce an effect. Granqvist et al. respond that Persinger agreed with their proposed methodology beforehand and that they stand by their replication.

The theoretical basis for the God helmet, especially the connection between temporal lobe function and mystic experiences, has also been questioned.

Only one group unconnected to Persinger's lab has so far succeeded in replicating the effects of one of Persinger's early studies. They reported that their experiment had ruled out suggestibility as an explanation for Persinger's effects, and that analysis of their subjects' verbal reports revealed significant differences between the speech of subjects and controls, as well as less robust effects for suggestion and expectation.

Other groups have subsequently found that individual differences such as strong belief in the paranormal and magical ideation predict some alterations in consciousness and reported "exceptional experiences" when Persinger et al.'s experimental set-up and procedure are reproduced, but with a sham "God helmet" that is completely inert or a helmet that is turned off. These groups have concluded that psychological factors must have played an important role in prior experiments.

Persinger and colleagues also developed a device nicknamed "The Octopus", which uses solenoids placed around the whole brain in a circle just above the subject's ears. Commercial versions of the God helmet, the Octopus and associated devices are sold by Persinger's research associate Todd Murphy, who reports that his devices are able to modulate emotional states in addition to enhancing meditation and generating altered states. One experiment found no changes in emotional responses to photographs whether the device was on or off, while Persinger and colleagues report significant changes in subjects' EEG during stimulation with a Shakti system. In one report by Persinger's lab, published in the fringe journal NeuroQuantology, these changes were correlated with an out-of-body experience.

One published attempt to test Persinger's theories regarding the psychological effects of environmental magnetic fields used whole-body exposure to magnetic fields and ultrasound in freely moving participants to create a "haunted room" within which it was hoped subjects would sense a "presence". The study found that reports of unusual experiences were unrelated to the presence or absence of "complex" environmental electromagnetic fields similar to Persinger's. The authors speculated that the effects were likely due to suggestibility, though they did not measure it directly.

Thursday, July 25, 2024

Just war theory

From Wikipedia, the free encyclopedia
Saint Augustine was the first clear advocate of just-war theory.

The just war theory (Latin: bellum iustum) is a doctrine, also referred to as a tradition, of military ethics that aims to ensure that a war is morally justifiable through a series of criteria, all of which must be met for a war to be considered just. It has been studied by military leaders, theologians, ethicists and policymakers. The criteria are split into two groups: jus ad bellum ("right to go to war") and jus in bello ("right conduct in war"). The first group of criteria concerns the morality of going to war, and the second concerns moral conduct within war. There have been calls for the inclusion of a third category of just war theory (jus post bellum) dealing with the morality of post-war settlement and reconstruction. The just war theory postulates that war, while terrible (though less so with the right conduct), is not always the worst option. Important responsibilities, undesirable outcomes, or preventable atrocities may justify war.

Opponents of the just war theory may either be inclined to a stricter pacifist standard (proposing that there has never been nor can there ever be a justifiable basis for war) or they may be inclined toward a more permissive nationalist standard (proposing that a war need only serve a nation's interests to be justifiable). In many cases, philosophers state that individuals need not be plagued by a guilty conscience if they are required to fight. A few philosophers extol the virtues of the soldier while also declaring their apprehensions about war itself. A few, such as Rousseau, argue for insurrection against oppressive rule.

The historical aspect, or the "just war tradition", deals with the historical body of rules or agreements that have applied in various wars across the ages. The just war tradition also considers the writings of various philosophers and lawyers through history, and examines both their philosophical visions of war's ethical limits and whether their thoughts have contributed to the body of conventions that have evolved to guide war and warfare.

In the twenty-first century there has been significant debate between traditional just war theorists, who largely support the existing law of war and develop arguments to support it, and revisionists who reject many traditional assumptions, although not necessarily advocating a change in the law.

Origins

Ancient Egypt

A 2017 study found that the just war tradition can be traced back as far as Ancient Egypt. Egyptian ethics of war usually centered on three main ideas: the cosmological role of Egypt, the pharaoh as a divine office and executor of the will of the gods, and the superiority of the Egyptian state and population over all other states and peoples. Egyptian political theology held that the pharaoh had exclusive legitimacy in justly initiating a war, usually claimed to be carrying out the will of the gods. Senusret I, in the Twelfth Dynasty, claimed, "I was nursed to be a conqueror...his [Atum's] son and his protector, he gave me to conquer what he conquered." Later pharaohs also considered their sonship of the god Amun-Re as granting them absolute authority to declare war on the deity's behalf. Pharaohs often visited temples prior to initiating campaigns, where the pharaoh was believed to receive the commands of war from the deities. For example, Kamose claimed that "I went north because I was strong (enough) to attack the Asiatics through the command of Amon, the just of counsels." A stele erected by Thutmose III at the Temple of Amun at Karnak "provides an unequivocal statement of the pharaoh's divine mandate to wage war on his enemies." As the New Kingdom progressed and Egypt heightened its territorial ambition, the invocation of just war increasingly served to justify these efforts. The universal principle of Maat, signifying order and justice, was central to the Egyptian notion of just war, and was understood to place virtually no limits on what Egypt could take, do, or use to secure the ambitions of the state.

India

The Indian Hindu epic, the Mahabharata, offers the first written discussions of a "just war" (dharma-yuddha or "righteous war"). In it, one of five ruling brothers (Pandavas) asks if the suffering caused by war can ever be justified. A long discussion then ensues between the siblings, establishing criteria like proportionality (chariots cannot attack cavalry, only other chariots; no attacking people in distress), just means (no poisoned or barbed arrows), just cause (no attacking out of rage), and fair treatment of captives and the wounded.

In Sikhism, the term dharamyudh describes a war that is fought for just, righteous or religious reasons, especially in defence of one's own beliefs. Though some core tenets in the Sikh religion are understood to emphasise peace and nonviolence, especially before the 1606 execution of Guru Arjan by Mughal emperor Jahangir, military force may be justified if all peaceful means to settle a conflict have been exhausted, thus resulting in a dharamyudh.

East Asian

Chinese philosophy produced a massive body of work on warfare, much of it during the Zhou dynasty, especially the Warring States era. War was justified only as a last resort and only by the rightful sovereign; however, questioning the decision of the emperor concerning the necessity of a military action was not permissible. The success of a military campaign was sufficient proof that the campaign had been righteous.

Japan did not develop its own doctrine of just war, but between the 5th and the 7th centuries it drew heavily from Chinese philosophy, especially Confucian views. As part of the Japanese campaign to take the northeastern part of the island of Honshu, Japanese military action was portrayed as an effort to "pacify" the Emishi people, who were likened to "bandits" and "wild-hearted wolf cubs" and accused of invading Japan's frontier lands.

Ancient Greece and Rome

The notion of just war in Europe originates and is developed first in ancient Greece and then in the Roman Empire.

Aristotle was the first to introduce the concept and its terminology to the Hellenic world, treating war as a last resort that required conduct allowing the restoration of peace. Aristotle argues that the cultivation of a military is necessary and good for the purpose of self-defense, not for conquest: "The proper object of practising military training is not in order that men may enslave those who do not deserve slavery, but in order that first they may themselves avoid becoming enslaved to others" (Politics, Book 7).

In ancient Rome, a "just cause" for war might include the necessity of repelling an invasion, or retaliation for pillaging or a breach of treaty. War was always potentially nefas ("wrong, forbidden"), and risked religious pollution and divine disfavor. A "just war" (bellum iustum) thus required a ritualized declaration by the fetial priests. More broadly, conventions of war and treaty-making were part of the ius gentium, the "law of nations", the customary moral obligations regarded as innate and universal to human beings.

Christian views

Christian theory of the Just War begins around the time of Augustine of Hippo. The Just War theory, with some amendments, is still used by Christians today as a guide to whether or not a war can be justified. War may be necessary and right, even though it may not be good. In the case of a country that has been invaded by an occupying force, war may be the only way to restore justice.

Saint Augustine

Saint Augustine held that individuals should not resort immediately to violence, but that God has given the sword to government for a good reason (based upon Romans 13:4). In Contra Faustum Manichaeum, book 22, sections 69–76, Augustine argues that Christians, as part of a government, need not be ashamed of protecting peace and punishing wickedness when forced to do so by a government. Augustine asserted that this was a personal and philosophical stance: "What is here required is not a bodily action, but an inward disposition. The sacred seat of virtue is the heart."

Nonetheless, he asserted, peacefulness in the face of a grave wrong that could be stopped only by violence would be a sin. Defense of one's self or others could be a necessity, especially when it is authorized by a legitimate authority:

They who have waged war in obedience to the divine command, or in conformity with His laws, have represented in their persons the public justice or the wisdom of government, and in this capacity have put to death wicked men; such persons have by no means violated the commandment, "Thou shalt not kill."

While not breaking down the conditions necessary for war to be just, Augustine nonetheless originated the very phrase itself in his work The City of God:

But, say they, the wise man will wage Just Wars. As if he would not all the rather lament the necessity of just wars, if he remembers that he is a man; for if they were not just he would not wage them, and would therefore be delivered from all wars.

Augustine further taught:

No war is undertaken by a good state except on behalf of good faith or for safety.

J. Mark Mattox writes,

In terms of the traditional notion of jus ad bellum (justice of war, that is, the circumstances in which wars can be justly fought), war is a coping mechanism for righteous sovereigns who would ensure that their violent international encounters are minimal, a reflection of the Divine Will to the greatest extent possible, and always justified. In terms of the traditional notion of jus in bello (justice in war, or the moral considerations which ought to constrain the use of violence in war), war is a coping mechanism for righteous combatants who, by divine edict, have no choice but to subject themselves to their political masters and seek to ensure that they execute their war-fighting duty as justly as possible.

Isidore of Seville

Isidore of Seville writes:

Those wars are unjust which are undertaken without cause. For aside from vengeance or to fight off enemies no just war can be waged. 

Peace and Truce of God

The medieval Peace of God (Latin: pax dei) was a 10th-century mass movement in Western Europe, instigated by the clergy, that granted immunity from violence to non-combatants.

Starting in the 11th century, the Truce of God (Latin: treuga dei) involved Church rules that successfully limited when and where fighting could occur: Catholic forces (e.g. of warring barons) could not fight each other on Sundays, Thursdays, holidays, and during the entirety of Lent and Advent, among other times, severely disrupting the conduct of wars. The 1179 Third Council of the Lateran adopted a version of it for the whole church.

Saint Thomas Aquinas

The just war theory of Thomas Aquinas has had a lasting impact on later generations of thinkers and was part of an emerging consensus in Medieval Europe on just war. In the 13th century Aquinas reflected in detail on peace and war. Aquinas was a Dominican friar who contemplated the teachings of the Bible on peace and war in combination with ideas from Aristotle, Plato, Socrates, Saint Augustine and other philosophers whose writings are part of the Western canon. Aquinas' views on war drew heavily on the Decretum Gratiani, a book the Italian monk Gratian had compiled with passages from the Bible. After its publication in the 12th century, the Decretum Gratiani had been republished with commentary from Pope Innocent IV and the Dominican friar Raymond of Penafort. Other significant influences on Aquinas' just war theory were Alexander of Hales and Henry of Segusio.

In Summa Theologica Aquinas asserted that it is not always a sin to wage war, and he set out criteria for a just war. According to Aquinas, three requirements must be met. Firstly, the war must be waged upon the command of a rightful sovereign. Secondly, the war needs to be waged for just cause, on account of some wrong the attacked have committed. Thirdly, warriors must have the right intent, namely to promote good and to avoid evil. Aquinas came to the conclusion that a just war could be offensive and that injustice should not be tolerated so as to avoid war. Nevertheless, Aquinas argued that violence must only be used as a last resort. On the battlefield, violence was only justified to the extent it was necessary. Soldiers needed to avoid cruelty, and a just war was limited by the conduct of just combatants. Aquinas argued that it was only in the pursuit of justice that the good intention of a moral act could justify negative consequences, including the killing of the innocent during a war.

Renaissance and Christian Humanists

Various Renaissance humanists promoted pacifist views.

  • John Colet famously preached a Lenten sermon before Henry VIII, who was preparing for a war, quoting Cicero: "Better an unjust peace rather than the justest war."
  • Erasmus of Rotterdam wrote numerous works on peace which criticized Just War theory as a smokescreen and added extra limitations, notably The Complaint of Peace and the Treatise on War (Dulce bellum inexpertis).

A leading humanist writer after the Reformation was the legal theorist Hugo Grotius, whose De jure belli ac pacis reconsidered Just War and fighting wars justly.

First World War

At the beginning of the First World War, a group of theologians in Germany published a manifesto that sought to justify the actions of the German government. At the British government's request, Randall Davidson, Archbishop of Canterbury, took the lead in collaborating with a large number of other religious leaders, including some with whom he had differed in the past, to write a rebuttal of the Germans' contentions. Both German and British theologians based themselves on the just war theory, each group seeking to prove that it applied to the war waged by its own side.

Contemporary Catholic doctrine

The just war doctrine of the Catholic Church found in the 1992 Catechism of the Catholic Church, in paragraph 2309, lists four strict conditions for "legitimate defense by military force:"

  • The damage inflicted by the aggressor on the nation or community of nations must be lasting, grave and certain.
  • All other means of putting an end to it must have been shown to be impractical or ineffective.
  • There must be serious prospects of success.
  • The use of arms must not produce evils and disorders graver than the evil to be eliminated.

The Compendium of the Social Doctrine of the Church elaborates on the just war doctrine in paragraphs 500 to 501, while citing the Charter of the United Nations:

If this responsibility justifies the possession of sufficient means to exercise this right to defense, States still have the obligation to do everything possible "to ensure that the conditions of peace exist, not only within their own territory but throughout the world". It is important to remember that "it is one thing to wage a war of self-defense; it is quite another to seek to impose domination on another nation. The possession of war potential does not justify the use of force for political or military objectives. Nor does the mere fact that war has unfortunately broken out mean that all is fair between the warring parties".

The Charter of the United Nations ... is based on a generalized prohibition of a recourse to force to resolve disputes between States, with the exception of two cases: legitimate defence and measures taken by the Security Council within the area of its responsibilities for maintaining peace. In every case, exercising the right to self-defence must respect "the traditional limits of necessity and proportionality".

Therefore, engaging in a preventive war without clear proof that an attack is imminent cannot fail to raise serious moral and juridical questions. International legitimacy for the use of armed force, on the basis of rigorous assessment and with well-founded motivations, can only be given by the decision of a competent body that identifies specific situations as threats to peace and authorizes an intrusion into the sphere of autonomy usually reserved to a State.

Pope John Paul II in an address to a group of soldiers said the following:

Peace, as taught by Sacred Scripture and the experience of men itself, is more than just the absence of war. And the Christian is aware that on earth a human society that is completely and always peaceful is, unfortunately, a utopia and that the ideologies which present it as easily attainable only nourish vain hopes. The cause of peace will not go forward by denying the possibility and the obligation to defend it.

Russian Orthodox Church

The War and Peace section in the Basis of the Social Concept of the Russian Orthodox Church is crucial for understanding the Russian Orthodox Church's attitude towards war. The document offers criteria for distinguishing between an aggressive war, which is unacceptable, and a justified war, attributing the highest moral and sacred value of military acts of bravery to a true believer who participates in a justified war. Additionally, the document considers the just war criteria as developed in Western Christianity to be applicable to Russian Orthodoxy; therefore, the justified war theory in Western theology also applies to the Russian Orthodox Church.

In the same document, it is stated that wars have accompanied human history since the fall of man and, according to the gospel, will continue to accompany it. While recognizing war as evil, the Russian Orthodox Church does not prohibit its members from participating in hostilities if the security of their neighbours and the restoration of trampled justice are at stake. War is considered necessary but undesirable. It is also stated that the Russian Orthodox Church has had profound respect for soldiers who gave their lives to protect the life and security of their neighbours.

Just war tradition

The just war theory, propounded by the medieval Christian philosopher Thomas Aquinas, was developed further by legal scholars in the context of international law. Cardinal Cajetan, the jurist Francisco de Vitoria, the two Jesuit priests Luis de Molina and Francisco Suárez, as well as the humanist Hugo Grotius and the lawyer Luigi Taparelli, were most influential in the formation of a just war tradition. The just war tradition, which was well established by the 19th century, found its practical application in the Hague Peace Conferences (1899 and 1907) and in the founding of the League of Nations in 1920. After the United States Congress declared war on Germany in 1917, Cardinal James Gibbons issued a letter stating that all Catholics were to support the war because "Our Lord Jesus Christ does not stand for peace at any price... If by Pacifism is meant the teaching that the use of force is never justifiable, then, however well meant, it is mistaken, and it is hurtful to the life of our country."

Armed conflicts such as the Spanish Civil War, World War II and the Cold War were, as a matter of course, judged according to the norms (as established in Aquinas' just war theory) by philosophers such as Jacques Maritain, Elizabeth Anscombe and John Finnis.

The first work dedicated specifically to just war was the 15th-century sermon De bellis justis of Stanisław of Skarbimierz (1360–1431), who justified war by the Kingdom of Poland against the Teutonic Knights. Francisco de Vitoria criticized the conquest of America by the Spanish conquistadors on the basis of just-war theory. With Alberico Gentili and Hugo Grotius, just war theory was replaced by international law theory, codified as a set of rules, which today still encompass the points commonly debated, with some modifications.

Just-war theorists combine a moral abhorrence towards war with a readiness to accept that war may sometimes be necessary. The criteria of the just-war tradition act as an aid in determining whether resorting to arms is morally permissible. Just-war theories aim "to distinguish between justifiable and unjustifiable uses of organized armed forces"; they attempt "to conceive of how the use of arms might be restrained, made more humane, and ultimately directed towards the aim of establishing lasting peace and justice".

The just war tradition addresses the morality of the use of force in two parts: when it is right to resort to armed force (the concern of jus ad bellum) and what is acceptable in using such force (the concern of jus in bello).

In 1869 the Russian military theorist Genrikh Antonovich Leer theorized on the advantages and potential benefits of war.

The Soviet leader Vladimir Lenin defined only three types of just war.

But picture to yourselves a slave-owner who owned 100 slaves warring against a slave-owner who owned 200 slaves for a more "just" distribution of slaves. Clearly, the application of the term "defensive" war, or war "for the defense of the fatherland" in such a case would be historically false, and in practice would be sheer deception of the common people, of philistines, of ignorant people, by the astute slaveowners. Precisely in this way are the present-day imperialist bourgeoisie deceiving the peoples by means of "national ideology" and the term "defense of the fatherland" in the present war between slave-owners for fortifying and strengthening slavery.

The anarcho-capitalist scholar Murray Rothbard (1926–1995) stated that "a just war exists when a people tries to ward off the threat of coercive domination by another people, or to overthrow an already-existing domination. A war is unjust, on the other hand, when a people try to impose domination on another people or try to retain an already-existing coercive rule over them."

Jonathan Riley-Smith writes:

The consensus among Christians on the use of violence has changed radically since the crusades were fought. The just war theory prevailing for most of the last two centuries—that violence is an evil that can, in certain situations, be condoned as the lesser of evils—is relatively young. Although it has inherited some elements (the criteria of legitimate authority, just cause, right intention) from the older war theory that first evolved around AD 400, it has rejected two premises that underpinned all medieval just wars, including crusades: first, that violence could be employed on behalf of Christ's intentions for mankind and could even be directly authorized by him; and second, that it was a morally neutral force that drew whatever ethical coloring it had from the intentions of the perpetrators.

Criteria

The just war theory has two sets of criteria, the first establishing jus ad bellum (the right to go to war), and the second establishing jus in bello (right conduct within war).

Jus ad bellum

Competent authority
Only duly constituted public authorities may wage war. "A just war must be initiated by a political authority within a political system that allows distinctions of justice. Dictatorships (e.g. Hitler's regime) or deceptive military actions (e.g. the 1968 US bombing of Cambodia) are typically considered as violations of this criterion. The importance of this condition is key. Plainly, we cannot have a genuine process of judging a just war within a system that represses the process of genuine justice. A just war must be initiated by a political authority within a political system that allows distinctions of justice".
Probability of success
According to this principle, there must be good grounds for concluding that the aims of the just war are achievable. This principle emphasizes that mass violence must not be undertaken if it is unlikely to secure the just cause. This criterion is meant to avoid invasion for invasion's sake and links to the proportionality criterion. One cannot invade if there is no chance of actually winning. However, wars are fought with imperfect knowledge, so one must simply be able to make a logical case that one can win; there is no way to know this in advance. These criteria move the conversation from moral and theoretical grounds to practical grounds. Essentially, this is meant to encourage coalition-building and win the approval of other state actors.
Last resort
The principle of last resort stipulates that all non-violent options must first be exhausted before the use of force can be justified. Diplomatic options, sanctions, and other non-military methods must be attempted or validly ruled out before the engagement of hostilities. Further, in regard to the amount of harm—proportionally—the principle of last resort would support using small intervention forces first and then escalating rather than starting a war with massive force such as carpet bombing or nuclear warfare.
Just cause
The reason for going to war needs to be just and cannot, therefore, be solely for recapturing things taken or punishing people who have done wrong; innocent life must be in imminent danger and intervention must be to protect life. A contemporary view of just cause was expressed in 1993 when the US Catholic Conference said: "Force may be used only to correct a grave, public evil, i.e., aggression or massive violation of the basic human rights of whole populations."

Jus in bello

Once war has begun, just war theory (jus in bello) also directs how combatants should act:

Distinction
Just war conduct should be governed by the principle of distinction. The acts of war should be directed towards enemy combatants, and not towards non-combatants caught in circumstances that they did not create. The prohibited acts include bombing civilian residential areas that include no legitimate military targets, committing acts of terrorism or reprisal against civilians or prisoners of war (POWs), and attacking neutral targets. Moreover, combatants are not permitted to attack enemy combatants who have surrendered, or who have been captured, or who are injured and not presenting an immediate lethal threat, or who are parachuting from disabled aircraft and are not airborne forces, or who are shipwrecked.
Proportionality
Just war conduct should be governed by the principle of proportionality. Combatants must make sure that the harm caused to civilians or civilian property is not excessive in relation to the concrete and direct military advantage anticipated by an attack on a legitimate military objective. This principle is meant to discern the correct balance between the restriction imposed by a corrective measure and the severity of the nature of the prohibited act.
Military necessity
Just war conduct should be governed by the principle of military necessity. An attack or action must be intended to help in the defeat of the enemy; it must be an attack on a legitimate military objective, and the harm caused to civilians or civilian property must be proportional and not excessive in relation to the concrete and direct military advantage anticipated. This principle is meant to limit excessive and unnecessary death and destruction.
Fair treatment of prisoners of war
Enemy combatants who surrendered or who are captured no longer pose a threat. It is therefore wrong to torture them or otherwise mistreat them.
No means malum in se
Combatants may not use weapons or other methods of warfare that are considered evil, such as mass rape, forcing enemy combatants to fight against their own side or using weapons whose effects cannot be controlled (e.g., nuclear/biological weapons).

Ending a war: Jus post bellum

In recent years, some theorists, such as Gary Bass, Louis Iasiello and Brian Orend, have proposed a third category within the just war theory. Jus post bellum concerns justice after a war, including peace treaties, reconstruction, environmental remediation, war crimes trials, and war reparations. It has been added in part to deal with the fact that some hostile actions may take place outside a traditional battlefield. Jus post bellum governs the justice of war termination and peace agreements, as well as the prosecution of war criminals and of publicly labelled terrorists. The idea has largely been added to help decide what should be done with prisoners taken during battle. In a modern context, governments and public opinion invoke jus post bellum to justify the pursuit of labelled terrorists for the safety of the state. On this view, the actual fault lies with the aggressor, who by that aggression forfeits the right to honourable treatment; the theory is thus used to justify how those fighting a war treat prisoners outside of war.

Moral equality of combatants

The moral equality of combatants has been cited in relation to the 2022 Russian invasion of Ukraine. Opponents of MEC argue that soldiers who fight a war of aggression, such as Russian soldiers in Ukraine, are in the wrong.

The moral equality of combatants (MEC) or moral equality of soldiers is the principle that soldiers fighting on both sides of a war are equally honorable, unless they commit war crimes, regardless of whether they fight for a just cause. MEC is a key element underpinning international humanitarian law (IHL)—which applies the rules of war equally to both sides—and traditional just war theory. According to philosopher Henrik Syse, MEC presents a serious quandary because "it makes as little practical sense to ascribe blame to individual soldiers for the cause of the war in which they fight as it makes theoretical sense to hold the fighters on the two sides to be fully morally equal". The moral equality of combatants has been cited in relation to the Israeli–Palestinian conflict or the U.S.-led wars in Iraq and Afghanistan.

Traditional view

MEC as a formal doctrine was articulated in Just and Unjust Wars (1977) by Michael Walzer, although earlier just war theorists such as Augustine and Aquinas argued that soldiers should obey their leaders when fighting. There is dispute over whether early modern just war theory promoted MEC. A full crystallization of MEC could only occur after both jus ad bellum and jus in bello were developed. Proponents of MEC argue that individual soldiers are not well-placed to determine the justness of a war. Walzer, for example, argues that the entire responsibility for an unjust war is borne by military and civilian leaders who choose to go to war, rather than individual soldiers who have little say in the matter.

MEC is one of the underpinnings of international humanitarian law (IHL), which applies equally to both sides regardless of the justice of their cause. In IHL, this principle is known as equality of belligerents. This contradicts the legal principle of ex injuria jus non oritur that no one should be able to derive benefit from their illegal action. British jurist Hersch Lauterpacht articulated the pragmatic basis of belligerent equality, stating: "it is impossible to visualize the conduct of hostilities in which one side would be bound by rules of warfare without benefiting from them and the other side would benefit from them without being bound by them". International law scholar Eliav Lieblich states that the moral responsibility of soldiers who participate in unjust wars is "one of the stickiest problems in the ethics of war".

Revisionist challenge

There is no equivalent to MEC in peacetime circumstances. In 2006, philosopher Jeff McMahan began to contest MEC, arguing that soldiers fighting an unjust or illegal war are not morally equal to those fighting in self-defense. Although revisionists do not favor criminal prosecution of individual soldiers who fight in an unjust war, they argue that individual soldiers should assess the legality or morality of the war they are asked to fight, and refuse if it is an illegal or unjust war. According to the revisionist view, a soldier or officer who knows or strongly suspects that their side is fighting an unjust war has a moral obligation not to fight it, unless refusing would entail capital punishment or some other extreme consequence.

Opponents of MEC—sometimes grouped under the label of revisionist just war theory—nevertheless generally support the belligerent equality principle of IHL on pragmatic grounds. In his 2018 book The Crime of Aggression, Humanity, and the Soldier, law scholar Tom Dannenbaum was one of the first to propose legal reforms based on rejection of MEC. Dannenbaum argued that soldiers who refuse to fight illegal wars should be allowed selective conscientious objection and be accepted as refugees if they have to flee their country. He also argued that soldiers fighting against a war of aggression should be recognized as victims in postwar reparations processes.

Public opinion

A 2019 study found that the majority of Americans endorse the revisionist view on MEC and many are even willing to allow a war crime against noncombatants to go unpunished when committed by soldiers who are fighting a just war. Responding to the study, Walzer cautioned that differently phrased questions might have led to different results.
