
Frontier thesis

From Wikipedia, the free encyclopedia

The Frontier Thesis or Turner's Thesis (also American frontierism) is the argument advanced by historian Frederick Jackson Turner in 1893 that a settler colonial exceptionalism, under the guise of American democracy, was formed by the appropriation of the rugged American frontier. He stressed the process of "winning a wilderness" to extend the frontier line further for U.S. colonization, and the impact this had on pioneer culture and character. A modern simplification describes it as Indigenous land possessing an "American ingenuity" that settlers are compelled to forcibly appropriate to create a cultural identity that differs from that of their European ancestors. Turner's text follows in a long line of thought within the framework of Manifest Destiny established decades earlier. He stressed in this thesis that American democracy was the primary result, along with egalitarianism, a lack of interest in bourgeois or high culture, and violence. "American democracy was born of no theorist's dream; it was not carried in the Susan Constant to Virginia, nor in the Mayflower to Plymouth. It came out of the American forest, and it gained new strength each time it touched a new frontier," said Turner.

In the thesis, the American frontier established liberty by releasing Americans from European mindsets and eroding old, dysfunctional customs. Turner's ideal frontier had no need for standing armies, established churches, aristocrats or nobles; there was no landed gentry who controlled the land or charged heavy rents and fees. Frontier land was practically free for the taking, according to Turner. The Frontier Thesis was first published in a paper entitled "The Significance of the Frontier in American History", delivered to the American Historical Association in 1893 in Chicago. It won Turner wide acclaim among historians and intellectuals. He elaborated on the theme in his advanced history lectures and in a series of essays published over the next 25 years, collected along with his initial paper as The Frontier in American History.

Turner's emphasis on the importance of the frontier in shaping American character influenced the interpretation found in thousands of scholarly histories. By the time Turner died in 1932, 60% of the leading history departments in the U.S. were teaching courses in frontier history along Turnerian lines.

Summary

Turner begins the essay by calling attention to the fact that the western frontier line, which had defined the entirety of American history up to the 1880s, had ended. He elaborates by stating,

Behind institutions, behind constitutional forms and modifications, lie the vital forces that call these organs into life and shape them to meet changing conditions. The peculiarity of American institutions is, the fact that they have been compelled to adapt themselves to the changes of an expanding people to the changes involved in crossing a continent, in winning a wilderness, and in developing at each area of this progress out of the primitive economic and political conditions of the frontier into the complexity of city life.

According to Turner, American progress has repeatedly undergone a cyclical process on the frontier line as society has needed to redevelop with its movement westward. Everything in American history up to the 1880s somehow relates to the western frontier, including slavery. In spite of this, Turner laments, the frontier has received little serious study from historians and economists.

The frontier line, which separates civilization from wilderness, is “the most rapid and effective Americanization” on the continent; it takes the European from across the Atlantic and shapes him into something new. American emigration west is not spurred by government incentives, but rather some “expansive power” inherent within them that seeks to dominate nature. Furthermore, there is a need to escape the confines of the State. The most important aspect of the frontier to Turner is its effect on democracy. The frontier transformed Jeffersonian democracy into Jacksonian democracy. The individualism fostered by the frontier's wilderness created a national spirit complementary to democracy, as the wilderness defies control. Therefore, Andrew Jackson's brand of popular democracy was a triumph of the frontier.

Turner sets up the East and the West as opposing forces; as the West strives for freedom, the East seeks to control it. He cites British attempts to stifle western emigration during the colonial era as an example of eastern control. Even after independence, the eastern coast of the United States sought to control the West. Religious institutions from the eastern seaboard, in particular, battled for possession of the West. The tensions between small churches as a result of this fight, Turner states, exist today because of the religious attempt to master the West, and those effects are worth further study.

American intellect owes its form to the frontier as well. The traits of the frontier are “coarseness and strength combined with acuteness and inquisitiveness; that practical, inventive turn of mind, quick to find expedients; that masterful grasp of material things, lacking in the artistic but powerful to effect great ends; that restless, nervous energy; that dominant individualism, working for good and for evil, and withal that buoyancy and exuberance which comes with freedom.”

Turner concludes the essay by saying that with the end of the frontier, the first period of American history has ended.

Intellectual context

Germanic germ theory

The Frontier Thesis came about at a time when the Germanic germ theory of history was popular. Proponents of the germ theory believed that political habits are determined by innate racial attributes. Americans inherited such traits as adaptability and self-reliance from the Germanic peoples of Europe. According to the theory, the Germanic race appeared and evolved in the ancient Teutonic forests, endowed with a great capacity for politics and government. Their germs were, directly and by way of England, carried to the New World where they were allowed to germinate in the North American forests. In so doing, the Anglo-Saxons and the Germanic people's descendants, being exposed to a forest like their Teutonic ancestors, birthed the free political institutions that formed the foundation of American government.

Historian and ethnologist Hubert Howe Bancroft articulated the latest iteration of the Germanic germ theory in 1890, just three years before Turner's paper. He argued that the "tide of intelligence" had always moved from east to west. According to Bancroft, the Germanic germs had spread across all of Western Europe by the Middle Ages and had reached their height. This Germanic intelligence was halted only by "civil and ecclesiastical restraints" and a lack of "free land." This was Bancroft's explanation for the Dark Ages.

Turner's theory of early American development, which relied on the frontier as a transformative force, starkly opposed the Bancroftian racial determinism. Turner referred to the Germanic germ theory by name in his essay, claiming that “too exclusive attention has been paid by institutional students to the Germanic origins.” Turner believed that historians should focus on the settlers’ struggle with the frontier as the catalyst for the creation of American character, not racial or hereditary traits.

Though Turner's view would win out over the Germanic germ theory's version of Western history, the theory persisted for decades after Turner's thesis enraptured the American Historical Association. In 1946, medieval historian Carl Stephenson published an extended article refuting the Germanic germ theory. Evidently, the belief that the free political institutions of the United States spawned in ancient Germanic forests endured well into the 1940s.

Racial warfare

A similarly race-based interpretation of Western history also occupied the intellectual sphere in the United States before Turner. The racial warfare theory was an emerging belief in the late nineteenth century advocated by Theodore Roosevelt in The Winning of the West. Though Roosevelt would later accept Turner's historiography on the West, calling Turner's work a correction or supplementation of his own, the two accounts certainly contradict each other.

Roosevelt was not entirely unfounded in saying that he and Turner agreed; both Turner and Roosevelt agreed that the frontier had shaped what would become distinctly American institutions and the mysterious entity they each called “national character.” They also agreed that studying the history of the West was necessary to face the challenges to democracy in the late 1890s.

Turner and Roosevelt diverged on the exact aspect of frontier life that shaped the contemporary American. Roosevelt contended that the formation of the American character occurred not with early settlers struggling to survive while learning a foreign land, but “on the cutting edge of expansion” in the early battles with Native Americans in the New World. To Roosevelt, the journey westward was one of nonstop encounters with the “hostile races and cultures” of the New World, forcing the early colonists to defend themselves as they pressed forward. Each side, the Westerners and the native savages, struggled for mastery of the land through violence.

Whereas Turner saw the development of American character occur just behind the frontier line, as the colonists tamed and tilled the land, Roosevelt saw it form in battles just beyond the frontier line. In the end, Turner's view would win out among historians, which Roosevelt would accept.

Evolution

Frederick Jackson Turner, c. 1890

Turner set up an evolutionary model (he had studied evolution with a leading geologist, Thomas Chrowder Chamberlin), using the time dimension of American history, and the geographical space of the land that became the United States. The first settlers who arrived on the east coast in the 17th century acted and thought like Europeans. They adapted to the new physical, economic and political environment in certain ways—the cumulative effect of these adaptations was Americanization.

Successive generations moved further inland, shifting the lines of settlement and wilderness, but preserving the essential tension between the two. European characteristics fell by the wayside and the old country's institutions (e.g., established churches, established aristocracies, standing armies, intrusive government, and highly unequal land distribution) were increasingly out of place. Every generation moved further west and became more American, more democratic, and more intolerant of hierarchy. They also became more violent, more individualistic, more distrustful of authority, less artistic, less scientific, and more dependent on ad-hoc organizations they formed themselves. In broad terms, the further west, the more American the community.

Closed frontier

Turner saw that the land frontier was ending, since the U.S. Census of 1890 had officially stated that the American frontier had broken up.

By 1890, settlement in the American West had reached sufficient population density that the frontier line had disappeared; in 1890 the Census Bureau released a bulletin declaring the closing of the frontier, stating: "Up to and including 1880 the country had a frontier of settlement, but at present the unsettled area has been so broken into by isolated bodies of settlement that there can hardly be said to be a frontier line. In the discussion of its extent, its westward movement, etc., it can not, therefore, any longer have a place in the census reports."

Comparative frontiers

Historians, geographers, and social scientists have studied frontier-like conditions in other countries, with an eye on the Turnerian model. South Africa, Canada, Russia, Brazil, Argentina, and Australia, and even ancient Rome, had long frontiers that were also settled by pioneers. However, these other frontier societies operated in very difficult political and economic environments that made democracy and individualism much less likely to appear, and it was much more difficult to throw off a powerful royalty, standing armies, established churches, and an aristocracy that owned most of the land. The question is whether their frontiers were powerful enough to overcome conservative central forces based in the metropolis. Each nation had quite different frontier experiences. For example, the Dutch Boers in South Africa were defeated in war by Britain. In Australia, "mateship" and working together were valued more than individualism. Alexander Petrov noted that Russia had its own frontier: over centuries, Russians moved across Siberia all the way from the Urals to the Pacific, struggling with nature in ways physically similar to the American movement across North America, yet without developing the social and political characteristics noted by Turner. To the contrary, Siberia, the Russian frontier land, became emblematic of the oppression of Czarist absolute monarchy. This comparison, Petrov suggests, shows that it is far from inevitable that an expanding settlement of wild land will produce American-type cultural and political institutions. Other factors must be taken into consideration, such as the great difference between the British society from which settlers crossed the Atlantic and the Russian society that sent its own pioneers across the Urals.

Impact and influence

Turner's thesis quickly became popular among intellectuals. It explained why the American people and American government were so different from their European counterparts. It was popular among New Dealers—Franklin D. Roosevelt and his top aides thought in terms of finding new frontiers. FDR, in celebrating the third anniversary of Social Security in 1938, advised, "There is still today a frontier that remains unconquered—an America unreclaimed. This is the great, the nation-wide frontier of insecurity, of human want and fear. This is the frontier—the America—we have set ourselves to reclaim." Historians adopted it, especially in studies of the west, but also in other areas, such as the influential work of Alfred D. Chandler Jr. (1918–2007) in business history.

Many believed that the end of the frontier represented the beginning of a new stage in American life and that the United States must expand overseas. However, others viewed this interpretation as the impetus for a new wave in the history of United States imperialism. William Appleman Williams led the "Wisconsin School" of diplomatic historians by arguing that the frontier thesis encouraged American overseas expansion, especially in Asia, during the 20th century. Williams viewed the frontier concept as a tool to promote democracy through both world wars, to endorse spending on foreign aid, and to motivate action against totalitarianism. However, Turner's work, in contrast to Roosevelt's The Winning of the West, places greater emphasis on the development of American republicanism than on territorial conquest. Other historians, who wanted to focus scholarship on minorities, especially Native Americans and Hispanics, began in the 1970s to criticize the frontier thesis because it did not attempt to explain the evolution of those groups. Indeed, their approach was to reject the frontier as an important process and to study the West as a region, ignoring the frontier experience east of the Mississippi River.

Turner never published a major book on the frontier, despite 40 years of research. However, his ideas, presented in his graduate seminars at Wisconsin and Harvard, influenced many areas of historiography. In the history of religion, for example, Boles (1993) notes that William Warren Sweet at the University of Chicago Divinity School, as well as Peter G. Mode (in 1930), argued that churches adapted to the characteristics of the frontier, creating new denominations such as the Mormons, the Church of Christ, the Disciples of Christ, and the Cumberland Presbyterians. The frontier, they argued, shaped uniquely American institutions such as revivals, camp meetings, and itinerant preaching. This view dominated religious historiography for decades. Moos (2002) shows that the black filmmaker and novelist Oscar Micheaux, active from the 1910s to the 1940s, incorporated Turner's frontier thesis into his work. Micheaux promoted the West as a place where blacks could experience less institutionalized forms of racism and earn economic success through hard work and perseverance.

Slatta (2001) argues that the widespread popularization of Turner's frontier thesis influenced popular histories, motion pictures, and novels, which characterize the West in terms of individualism, frontier violence, and rough justice. Disneyland's Frontierland of the mid to late 20th century reflected the myth of rugged individualism that celebrated what was perceived to be the American heritage. The public has ignored academic historians' anti-Turnerian models, largely because they conflict with and often destroy the icons of Western heritage. However, the work of historians during the 1980s and 1990s, some of whom sought to bury Turner's conception of the frontier while others sought to spare the concept but add nuance, has done much to place Western myths in context.

A 2020 study in Econometrica found empirical support for the frontier thesis, showing that frontier experience had a causal impact on individualism.

Early anti-Turnerian thought

Though Turner's work was massively popular in its time and for decades after, it received significant intellectual pushback in the midst of World War II. This quote from Turner's The Frontier in American History is arguably the most famous statement of his work and, to later historians, the most controversial:

American democracy was born of no theorist's dream; it was not carried in the Susan Constant to Virginia, nor in the Mayflower to Plymouth. It came out of the American forest, and it gained new strength each time it touched a new frontier. Not the constitution but free land and an abundance of natural resources open to a fit people, made the democratic type of society in America for three centuries while it occupied its empire.

This assertion's racial overtones concerned historians as Adolf Hitler rose to power in Germany on the "Blood and Soil" ideology, which stoked racial and destructive enthusiasm. An example of this concern is George Wilson Pierson's influential essay on the frontier. He asked why the Turnerian American character was limited to the Thirteen Colonies that went on to form the United States, and why the frontier did not produce that same character among pre-Columbian Native Americans and Spaniards in the New World.

Despite Pierson and other scholars’ work, Turner's influence did not end during World War II or even after the war. Indeed, his influence was felt in American classrooms until the 1970s and 80s.

New frontiers

President John F. Kennedy

Subsequent critics, historians, and politicians have suggested that other 'frontiers,' such as scientific innovation, could serve similar functions in American development. Historians have noted that John F. Kennedy in the early 1960s explicitly called upon the ideas of the frontier. In his acceptance speech upon securing the Democratic Party nomination for U.S. president on July 15, 1960, Kennedy called out to the American people, "I am asking each of you to be new pioneers on that New Frontier. My call is to the young in heart, regardless of age—to the stout in spirit, regardless of party." Mathiopoulos notes that he "cultivated this resurrection of frontier ideology as a motto of progress ('getting America moving') throughout his term of office." He promoted his political platform as the "New Frontier," with a particular emphasis on space exploration and technology. Limerick points out that Kennedy assumed that "the campaigns of the Old Frontier had been successful, and morally justified." The frontier metaphor thus maintained its rhetorical ties to American social progress.

Fermilab

Adrienne Kolb and Lillian Hoddeson argue that during the heyday of Kennedy's "New Frontier," the physicists who built Fermilab explicitly sought to recapture the excitement of the old frontier. They argue that, "Frontier imagery motivates Fermilab physicists, and a rhetoric remarkably similar to that of Turner helped them secure support for their research." Rejecting the East and West coast lifestyles that most scientists preferred, they selected a Chicago suburb on the prairie as the location of the lab. A small herd of American bison was started at the lab's founding to symbolize Fermilab's presence on the frontier of physics and its connection to the American prairie. The bison herd still lives on the grounds of Fermilab. Architecturally, the lab's designers rejected the militaristic design of Los Alamos and Brookhaven as well as the academic architecture of the Lawrence Berkeley National Laboratory and the Stanford Linear Accelerator Center. Instead Fermilab's planners sought to return to Turnerian themes. They emphasized the values of individualism, empiricism, simplicity, equality, courage, discovery, independence, and naturalism in the service of democratic access, human rights, ecological balance, and the resolution of social, economic, and political issues. Milton Stanley Livingston, the lab's associate director, said in 1968, "The frontier of high energy and the infinitesimally small is a challenge to the mind of man. If we can reach and cross this frontier, our generations will have furnished a significant milestone in human history."

Electronic frontier

John Perry Barlow, along with Mitch Kapor, promoted the idea of cyberspace (the realm of telecommunication) as an "electronic frontier" beyond the borders of any physically based government, in which freedom and self-determination could be fully realized. Scholars analyzing the Internet have often cited Frederick Jackson Turner's frontier model. Of special concern is the question whether the electronic frontier will broadly replicate the stages of development of the American land frontier.

Technosignature

Illustration of various types of technosignatures.

A technosignature or technomarker is any measurable property or effect that provides scientific evidence of past or present technology. Technosignatures are analogous to biosignatures, which signal the presence of life, whether intelligent or not. Some authors prefer to exclude radio transmissions from the definition, but such restrictive usage is not widespread. Jill Tarter has proposed that the search for extraterrestrial intelligence (SETI) be renamed "the search for technosignatures". Various types of technosignatures, such as radiation leakage from megascale astroengineering installations such as Dyson spheres, the light from an extraterrestrial ecumenopolis, or Shkadov thrusters with the power to alter the orbits of stars around the Galactic Center, may be detectable with hypertelescopes. Some examples of technosignatures are described in Paul Davies's 2010 book The Eerie Silence, although the terms "technosignature" and "technomarker" do not appear in the book.

In February 2023, astronomers reported, after scanning 820 stars, the detection of 8 possible technosignatures for follow-up studies.

Astroengineering projects

A Dyson sphere, one of the best-known speculative technologies that may generate a technosignature

A Dyson sphere, constructed by life forms dwelling in proximity to a Sun-like star, would cause an increase in the amount of infrared radiation in the star system's emitted spectrum. Hence, Freeman Dyson selected the title "Search for Artificial Stellar Sources of Infrared Radiation" for his 1960 paper on the subject. SETI has adopted these assumptions in its search, looking for such "infrared heavy" spectra from solar analogs. Since 2005, Fermilab has conducted an ongoing survey for such spectra, analyzing data from the Infrared Astronomical Satellite.
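Dyson's reasoning can be made concrete with Wien's displacement law, λ_peak = b / T: a shell that absorbs a Sun-like star's light and re-radiates it as waste heat at roughly habitable temperatures peaks deep in the mid-infrared rather than in the visible. A minimal sketch (the 5772 K and 300 K temperatures are illustrative assumptions, not figures from the searches described here):

```python
# Wien's displacement law: a blackbody at temperature T peaks at
# wavelength lambda_peak = b / T, with b the Wien displacement constant.
WIEN_B = 2.897771955e-3  # Wien displacement constant, m*K

def peak_wavelength_nm(temperature_k: float) -> float:
    """Peak emission wavelength (in nm) of a blackbody at temperature_k."""
    return WIEN_B / temperature_k * 1e9

star_peak = peak_wavelength_nm(5772.0)  # Sun-like photosphere: ~502 nm (visible light)
shell_peak = peak_wavelength_nm(300.0)  # ~room-temperature shell: ~9660 nm (mid-infrared)

print(f"star peak:  {star_peak:.0f} nm")
print(f"shell peak: {shell_peak:.0f} nm")
```

The roughly twenty-fold shift in peak wavelength is why such surveys look for "infrared heavy" spectra around otherwise ordinary solar analogs.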

Identifying one of the many infra-red sources as a Dyson sphere would require improved techniques for discriminating between a Dyson sphere and natural sources. Fermilab discovered 17 "ambiguous" candidates, of which four have been named "amusing but still questionable". Other searches also resulted in several candidates, which remain unconfirmed. In October 2012, astronomer Geoff Marcy, one of the pioneers of the search for extrasolar planets, was given a research grant to search data from the Kepler telescope, with the aim of detecting possible signs of Dyson spheres.

Orbital paths, transit signatures, stellar activity and star-system composition

Shkadov thrusters, with the hypothetical ability to change the orbital paths of stars in order to avoid various dangers to life such as cold molecular clouds or cometary impacts, would also be detectable in a similar fashion to the transiting extrasolar planets searched by Kepler. Unlike planets, though, the thrusters would appear to abruptly stop over the surface of a star rather than crossing it completely, revealing their technological origin. In addition, evidence of targeted extrasolar asteroid mining may also reveal extraterrestrial intelligence (ETI). Furthermore, it has been suggested that information could be hidden within the transit signatures of other planets. Advanced civilizations could "cloak their presence, or deliberately broadcast it, through controlled laser emission". Other characteristics proposed as potential technosignatures (or starting points for detection of clearer signatures) include peculiar orbital periods such as arranging planets in prime number patterns. Coronal and chromospheric activity on stars might be altered. Extraterrestrial civilizations may use free-floating planets (rogue planets) for interstellar transportation with a number of proposed possible technosignatures.
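For scale, the fraction of starlight an opaque occulter blocks follows the same geometric rule used for transiting planets, depth ≈ (R_occulter / R_star)²; what would mark a megastructure as artificial is the shape and timing of the dip, not its depth. A minimal sketch with illustrative radii (not values from any survey cited above):

```python
# Transit depth of an opaque occulter crossing a stellar disk:
# the blocked flux fraction is (R_occulter / R_star)**2 for a central transit.
R_SUN_KM = 695_700.0  # nominal solar radius

def transit_depth(r_occulter_km: float, r_star_km: float = R_SUN_KM) -> float:
    """Fraction of stellar flux blocked by a fully opaque occulter."""
    return (r_occulter_km / r_star_km) ** 2

earth_depth = transit_depth(6_371.0)     # Earth-sized occulter: ~8.4e-5 (0.008%)
jupiter_depth = transit_depth(69_911.0)  # Jupiter-sized occulter: ~1%
```

A Kepler-class photometer can reach depths near the Earth-sized case, which is why planet-hunting missions double as searches for planet-scale artifacts.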

Communication networks

A study suggests that if ETs exist, they may have established communications networks and may already have probes in the Solar System whose communication may be detectable. Studies by John Gertz suggest that flyby (scout) probes might intermittently surveil nascent solar systems, and that permanent probes would communicate with a home base, potentially using triggers and conditions such as the detection of electromagnetic leakage or biosignatures. Gertz also suggests several strategies for detecting local ET probes, such as detecting emitted optical messages, and finds that, because of interstellar networks of communications nodes, the search for deliberate interstellar signals – as is common in SETI – may be futile. The architecture may consist of nodes separated by sub-light-year distances and strung out between neighboring stars. It may also contain pulsars as beacons, or nodes whose beams are modulated by mechanisms that could be searched for. Moreover, a study suggests prior searches would not have detected cost-effective electromagnetic signal beacons.

Planetary analysis

Artificial heat and light

Lights from cities and infrastructure on Earth at night from space

Various astronomers, including Avi Loeb of the Harvard-Smithsonian Center for Astrophysics and Edwin L. Turner of Princeton University, have proposed that artificial light from extraterrestrial planets, such as that originating from cities, industries, and transport networks, could be detected and would signal the presence of an advanced civilization. Such approaches, though, assume that the radiant energy generated by a civilization would be relatively clustered and could therefore be detected easily.

Light and heat detected from planets must be distinguished from natural sources to conclusively prove the existence of intelligent life on a planet. For example, NASA's 2012 Black Marble experiment showed that significant stable light and heat sources on Earth, such as chronic wildfires in arid Western Australia, originate from uninhabited areas and are naturally occurring. The proposed LUVOIR A may be able to detect city lights twelve times those of Earth on Proxima b in 300 hours.

Atmospheric analysis

Artist's illustration of an advanced ET civilization with industrial pollution

Atmospheric analysis of planetary atmospheres, as is already done on various Solar System bodies and in a rudimentary fashion on several hot Jupiter extrasolar planets, may reveal the presence of chemicals produced by technological civilizations. For example, atmospheric emissions from human technology use on Earth, including nitrogen dioxide and chlorofluorocarbons, are detectable from space. Artificial air pollution may therefore be detectable on extrasolar planets and on Earth via "atmospheric SETI" – including NO2 pollution levels and with telescopic technology close to today. Such technosignatures may consist not of the detection of the level of one specific chemical but simultaneous detections of levels of multiple specific chemicals in atmospheres.

However, there remains a possibility of mis-detection; for example, the atmosphere of Titan has detectable signatures of complex chemicals that are similar to what on Earth are industrial pollutants, though not the byproduct of civilisation. Some SETI scientists have proposed searching for artificial atmospheres created by planetary engineering to produce habitable environments for colonisation by an ETI.

Extraterrestrial artifacts, influence and spacecraft

Spacecraft

The IKAROS light sail of 2010

Interstellar spacecraft may be detectable from hundreds to thousands of light-years away through various forms of radiation, such as the photons emitted by an antimatter rocket or cyclotron radiation from the interaction of a magnetic sail with the interstellar medium. Such a signal would be easily distinguishable from a natural signal and could hence firmly establish the existence of extraterrestrial life, were it to be detected. In addition, smaller Bracewell probes within the Solar System itself may also be detectable by means of optical or radio searches. Self-replicating spacecraft or their communications networks could potentially be detectable within our Solar System or in nearby star systems, if they are located there. Such technologies, or their footprints, could be in Earth orbit, on the Moon, or on Earth itself.
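As a rough illustration of the magnetic-sail case, the non-relativistic electron cyclotron frequency is f_c = eB / (2π m_e), so for microgauss-scale interstellar fields the emission falls at very low radio frequencies. A sketch, where the field strength is an assumed illustrative value rather than one taken from the literature above:

```python
import math

E_CHARGE = 1.602176634e-19    # elementary charge, C
M_ELECTRON = 9.1093837015e-31  # electron mass, kg

def cyclotron_frequency_hz(b_tesla: float) -> float:
    """Non-relativistic electron cyclotron frequency f_c = e*B / (2*pi*m_e)."""
    return E_CHARGE * b_tesla / (2.0 * math.pi * M_ELECTRON)

# Assumed interstellar magnetic field of ~5 microgauss (5e-10 T):
f_ism = cyclotron_frequency_hz(5e-10)  # ~14 Hz: very low-frequency radio
```

Relativistic particles deflected by a sail would radiate at harmonics well above this base frequency, but the scaling shows why such signatures are discussed at the low end of the radio spectrum.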

Satellites

A less advanced technology, and one closer to humanity's current technological level, is the Clarke Exobelt proposed by astrophysicist Hector Socas-Navarro of the Instituto de Astrofisica de Canarias. This hypothetical belt would be formed by all the artificial satellites occupying geostationary/geosynchronous orbits around an exoplanet. Early simulations suggested that a very dense satellite belt, requiring only a moderately more advanced civilization than ours, would be detectable with existing technology in the light curves of transiting exoplanets, but subsequent analysis has questioned this result, suggesting that exobelts detectable by current and upcoming missions will be very rare.

Extraterrestrial influence or activity on Earth

It has been suggested that once extraterrestrials arrive "at a new home, such life will almost certainly create technosignatures (because it used technology to get there), and some fraction of them may also eventually give rise to a new biosphere". Microorganism DNA may have been used for self-replicating messages. See also: DNA digital data storage

On exoplanets

Low- or high-albedo installations such as solar panels may also be detectable, although distinguishing artificial megastructures from high- and low-albedo natural surfaces (e.g., bright ice caps) may prove unfeasible.

Scientific projects searching for technosignatures

Major technosignatures as outlined in a 2021 scientific review.

One of the first attempts to search for Dyson Spheres was made by Vyacheslav Slysh from the Russian Space Research Institute in Moscow in 1985 using data from the Infrared Astronomical Satellite (IRAS).

Another search for technosignatures, c. 2001, involved an analysis of data from the Compton Gamma Ray Observatory for traces of antimatter, which, besides one "intriguing spectrum probably not related to SETI", came up empty.

As of 2005, Fermilab was conducting a survey for such spectra by analyzing data from IRAS. Identifying any of the many infrared sources as a Dyson Sphere would require improved techniques for discriminating between a Dyson Sphere and natural sources. Fermilab discovered 17 ambiguous candidates, of which four were described as "amusing but still questionable". Other searches have also produced several candidates, which remain unconfirmed.

In a 2005 paper, Luc Arnold proposed a means of detecting planetary-sized artifacts from their distinctive transit light-curve signatures. He showed that such a technosignature was within the reach of space missions aimed at detecting exoplanets by the transit method, such as the then-active CoRoT and Kepler projects. The principle of the detection remains applicable for future exoplanet missions.

In 2012, a trio of astronomers led by Jason Wright started a two-year search for Dyson Spheres, aided by grants from the Templeton Foundation.

In 2013, Geoff Marcy received funding to use data from the Kepler Telescope to search for Dyson Spheres and interstellar communication using lasers, and Lucianne Walkowicz received funding to detect artificial signatures in stellar photometry.

Starting in 2016, astronomer Jean-Luc Margot of UCLA has been searching for technosignatures with large radio telescopes.

Vanishing stars

In 2016, it was proposed that vanishing stars are a plausible technosignature. A pilot project searching for vanishing stars was carried out, finding one candidate object. In 2019, the Vanishing & Appearing Sources during a Century of Observations (VASCO) project began more general searches for vanishing and appearing stars and other astrophysical transients. After analyzing 15% of the image data, the project identified 100 red transients of "most likely natural origin". In 2020, the VASCO collaboration started a citizen science project to vet images of many thousands of candidate objects, carried out in close collaboration with schools and amateur associations, mainly in African countries. The VASCO project has been referred to as "perhaps the most general artefact search to date". In 2021, VASCO's principal investigator Beatriz Villarroel received a L'Oréal-UNESCO prize in Sweden for the project. In June 2021, the collaboration published the discovery of nine light sources seemingly appearing and vanishing simultaneously in an old photographic plate from 1950, a finding that no known natural phenomenon readily explains. The group carefully indicated that either nuclear fallout from unlisted atomic bomb tests had contaminated the plates or that a new celestial phenomenon might be responsible; for example, the high spatial density of transients could be explained by artificial, reflective objects in high orbits around Earth in 1950. Continued studies are lending further support to the authenticity of the phenomenon in multiple transients. See also: Diminished reality (the reverse of augmented reality).
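At its core, a vanishing/appearing-source search cross-matches source catalogues from two epochs and flags objects present in one but absent from the other. The toy sketch below uses identifiers as stand-ins for sky positions; a real survey such as VASCO matches by coordinates with a positional tolerance, so this is only an illustration of the bookkeeping.

```python
# Compare two epochs of a (hypothetical) source catalogue and flag
# objects that vanished or appeared between them.
def compare_epochs(old_catalog, new_catalog):
    old, new = set(old_catalog), set(new_catalog)
    return {
        "vanished": sorted(old - new),    # on the old plates, gone now
        "appeared": sorted(new - old),    # new sources with no old counterpart
        "persistent": sorted(old & new),  # seen in both epochs
    }

# Invented example identifiers:
old_plates_1950 = {"src_001", "src_002", "src_003", "src_004"}
modern_survey = {"src_001", "src_003", "src_004", "src_005"}

result = compare_epochs(old_plates_1950, modern_survey)
print(result["vanished"])  # ['src_002']
print(result["appeared"])  # ['src_005']
```

Candidates flagged this way would then be vetted (by citizen scientists, in VASCO's case) to rule out plate defects and other artifacts.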

Organization of novel projects

Methods and ancillary benefits of the search for various technosignatures.

In June 2020, NASA awarded its first SETI-specific grant in three decades. The grant funds the first NASA-funded search for technosignatures other than radio waves from advanced extraterrestrial civilizations, including the creation and population of an online technosignature library. A 2021 scientific review produced by the partly NASA-sponsored online workshop TechnoClimes 2020 classified possible optimal mission concepts for the search for technosignatures. It evaluates signatures using a metric of how far humanity is from being able to develop each signature's required technology – a comparison with contemporary human technology footprints – together with associated methods of detection and the ancillary benefits of each search for other areas of astronomy. The study's conclusions include a robust rationale for organizing missions to search for artifacts – including probes – within the Solar System.

In 2021, astronomers proposed a sequence of "verification checks for narrowband technosignature signals" after concluding that technosignature candidate BLC1 could be the result of a form of local radiofrequency interference.

Capabilities for detecting technosignatures with recent, ongoing, and future missions and facilities. Cells colored green indicate that the signature could be detectable for at least one stellar system and that at least one peer-reviewed publication has evaluated its detectability.

It has been suggested that observatories on the Moon could be more successful. In 2022, scientists provided an overview of the capabilities of ongoing, recent, past, planned and proposed missions and observatories for detecting various alien technosignatures.

Implications of detection

Steven J. Dick states that there are generally no established principles for dealing with successful SETI detections. Detections of technosignatures may have ethical implications – for instance, conveying information relevant to astroethics and machine ethics (e.g., about machines' applied ethical values) – or include information about alien societies, histories, or fates, which may vary depending on the type, prevalence, and form of the detected signature's technology. Moreover, various types of information about detected technosignatures, and their distribution or dissemination, may have varying implications that also depend on time and context.

Robotic process automation

From Wikipedia, the free encyclopedia

Robotic process automation (RPA) is a form of business process automation that is based on software robots (bots) or artificial intelligence (AI) agents. It is sometimes referred to as software robotics (not to be confused with robot software).

In traditional workflow automation tools, a software developer produces a list of actions to automate a task and interface to the back end system using internal application programming interfaces (APIs) or dedicated scripting language. In contrast, RPA systems develop the action list by watching the user perform that task in the application's graphical user interface (GUI), and then perform the automation by repeating those tasks directly in the GUI. This can lower the barrier to the use of automation in products that might not otherwise feature APIs for this purpose.
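The contrast above can be sketched in a few lines: an RPA tool records GUI-level actions and replays them, rather than calling a back-end API. The `Screen` class and action names below are invented for illustration; a real RPA product would drive actual application windows through the display or accessibility layer.

```python
# A recorded action list, as an RPA tool might capture it from a user.
RECORDED_ACTIONS = [
    ("click", "menu:File>Open"),
    ("type", "field:CustomerId", "C-1042"),
    ("click", "button:Search"),
    ("type", "field:Notes", "processed by bot"),
    ("click", "button:Save"),
]

class Screen:
    """Stands in for an application's GUI (a deliberate simplification)."""
    def __init__(self):
        self.fields = {}  # text typed into named fields
        self.clicks = []  # clicked widgets, in order

    def click(self, target):
        self.clicks.append(target)

    def type(self, target, text):
        self.fields[target] = text

def replay(screen, actions):
    """Perform each recorded action against the GUI, in order."""
    for action, target, *args in actions:
        getattr(screen, action)(target, *args)

gui = Screen()
replay(gui, RECORDED_ACTIONS)
print(gui.fields["field:CustomerId"])  # C-1042
```

The key point the sketch captures is that the automation never touches an API: it reproduces exactly the clicks and keystrokes a human would make.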

RPA tools have strong technical similarities to graphical user interface testing tools. These tools also automate interactions with the GUI, and often do so by repeating a set of demonstration actions performed by a user. RPA tools differ from such systems in that they allow data to be handled in and between multiple applications, for instance, receiving email containing an invoice, extracting the data, and then typing that into a bookkeeping system.
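The invoice example above involves extracting structured fields from unstructured text before handing them to another application. A minimal sketch of that extraction step, using invented field names and an invented e-mail body, might look like this:

```python
import re

EMAIL_BODY = """\
Dear team,
Please find attached invoice INV-2024-0133.
Amount due: 1,250.00 EUR by 2024-07-31.
"""

def extract_invoice(text):
    """Pull invoice fields out of an e-mail body with regular expressions,
    producing a record a bookkeeping system could ingest."""
    invoice = re.search(r"\b(INV-\d{4}-\d{4})\b", text)
    amount = re.search(r"Amount due:\s*([\d,]+\.\d{2})\s*([A-Z]{3})", text)
    due = re.search(r"by\s+(\d{4}-\d{2}-\d{2})", text)
    return {
        "invoice_id": invoice.group(1) if invoice else None,
        "amount": float(amount.group(1).replace(",", "")) if amount else None,
        "currency": amount.group(2) if amount else None,
        "due_date": due.group(1) if due else None,
    }

record = extract_invoice(EMAIL_BODY)
print(record["invoice_id"], record["amount"])  # INV-2024-0133 1250.0
```

In a full RPA flow, the resulting record would then be typed into the bookkeeping system's GUI rather than posted to an API.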

Historic evolution

The typical benefits of robotic automation include reduced cost; increased speed, accuracy, and consistency; improved quality and scalability of production. Automation can also provide extra security, especially for sensitive data and financial services.

As a form of automation, the concept has been around for a long time in the form of screen scraping, which can be traced back to early forms of malware. However, RPA is much more extensible, consisting of API integration into other enterprise applications, connectors into ITSM systems, terminal services and even some types of AI (e.g. Machine Learning) services such as image recognition. It is considered to be a significant technological evolution in the sense that new software platforms are emerging which are sufficiently mature, resilient, scalable and reliable to make this approach viable for use in large enterprises (who would otherwise be reluctant due to perceived risks to quality and reputation).

A principal barrier to the adoption of self-service is often technological: it may not always be feasible or economically viable to retrofit new interfaces onto existing systems. Moreover, organisations may wish to layer a variable and configurable set of process rules on top of the system interfaces which may vary according to market offerings and the type of customer. This only adds to the cost and complexity of the technological implementation. Robotic automation software provides a pragmatic means of deploying new services in this situation, where the robots simply mimic the behaviour of humans to perform the back-end transcription or processing. The relative affordability of this approach arises from the fact that no new IT transformation or investment is required; instead the software robots simply leverage greater use out of existing IT assets.

Use

The hosting of RPA services also aligns with the metaphor of a software robot, with each robotic instance having its own virtual workstation, much like a human worker. The robot uses keyboard and mouse controls to take actions and execute automations. Normally all of these actions take place in a virtual environment and not on screen; the robot does not need a physical screen to operate, rather it interprets the screen display electronically. The scalability of modern solutions based on architectures such as these owes much to the advent of virtualization technology, without which the scalability of large deployments would be limited by the available capacity to manage physical hardware and by the associated costs. The implementation of RPA in business enterprises has shown dramatic cost savings when compared to traditional non-RPA solutions.

There are, however, several risks with RPA. Criticism includes the risk of stifling innovation and of creating a more complex maintenance environment for existing software, which must now account for graphical user interfaces being used in ways they were not intended to be.

Impact on employment

According to Harvard Business Review, most operations groups adopting RPA have promised their employees that automation would not result in layoffs. Instead, workers have been redeployed to do more interesting work. One academic study highlighted that knowledge workers did not feel threatened by automation: they embraced it and viewed the robots as team-mates. The same study highlighted that, rather than resulting in a lower "headcount", the technology was deployed in such a way as to achieve more work and greater productivity with the same number of people.

Conversely, however, some analysts proffer that RPA represents a threat to the business process outsourcing (BPO) industry. The thesis behind this notion is that RPA will enable enterprises to "repatriate" processes from offshore locations into local data centers, with the benefit of this new technology. The effect, if true, will be to create high-value jobs for skilled process designers in onshore locations (and within the associated supply chain of IT hardware, data center management, etc.) but to decrease the available opportunity to low-skilled workers offshore. On the other hand, this discussion appears to be healthy ground for debate as another academic study was at pains to counter the so-called "myth" that RPA will bring back many jobs from offshore.

RPA actual use

  • Banking and finance process automation
  • Mortgage and lending processes
  • Customer care automation
  • eCommerce merchandising operations
  • Social media marketing
  • Optical character recognition applications
  • Data extraction process
  • Fixed automation process

Impact on society

Academic studies project that RPA, among other technological trends, is expected to drive a new wave of productivity and efficiency gains in the global labour market. Although not directly attributable to RPA alone, Oxford University conjectures that up to 35% of all jobs might be automated by 2035.

There are geographic implications to the trend in robotic automation. In the example above where an offshored process is "repatriated" under the control of the client organization (or even displaced by a Business Process Outsourcer) from an offshore location to a data centre, the impact will be a deficit in economic activity to the offshore location and an economic benefit to the originating economy. On this basis, developed economies – with skills and technological infrastructure to develop and support a robotic automation capability – can be expected to achieve a net benefit from the trend.

In a TEDx talk hosted by University College London (UCL), entrepreneur David Moss explains that digital labour in the form of RPA is likely to revolutionize the cost model of the services industry by driving the price of products and services down, while simultaneously improving the quality of outcomes and creating increased opportunity for the personalization of services.

In a separate TEDx talk in 2019, Japanese business executive and former CIO of Barclays bank Koichi Hasegawa noted that digital robots can have a positive effect on society if we start using robots with empathy to help every person. He provides a case study of two Japanese insurance companies – Sompo Japan and Aioi – both of which introduced bots to speed up insurance pay-outs in past massive disaster incidents.

Meanwhile, Professor Willcocks, author of the LSE paper cited above, speaks of increased job satisfaction and intellectual stimulation, characterising the technology as having the ability to "take the robot out of the human", a reference to the notion that robots will take over the mundane and repetitive portions of people's daily workload, leaving them to be used in more interpersonal roles or to concentrate on the remaining, more meaningful, portions of their day.

A 2021 study observing the effects of robotization in Europe also found that the gender pay gap increased at a rate of 0.18% for every 1% increase in the robotization of a given industry.

Unassisted RPA

Unassisted RPA, or RPAAI, is the next generation of RPA related technologies. Technological advancements around artificial intelligence allow a process to be run on a computer without needing input from a user.

Hyperautomation

Hyperautomation is the application of advanced technologies such as RPA, artificial intelligence, machine learning (ML), and process mining to augment workers and automate processes in ways that are significantly more impactful than traditional automation capabilities: the combination of multiple automation tools to deliver work.

Gartner’s report notes that this trend was kicked off with robotic process automation (RPA). The report notes that, "RPA alone is not hyperautomation. Hyperautomation requires a combination of tools to help support replicating pieces of where the human is involved in a task."

Outsourcing

Back office clerical processes outsourced by large organisations - particularly those sent offshore - tend to be simple and transactional in nature, requiring little (if any) analysis or subjective judgement. This would seem to make an ideal starting point for organizations beginning to adopt robotic automation for the back office. Whether client organisations choose to take outsourced processes back "in house" from their Business Process Outsourcing (BPO) providers, thus representing a threat to the future of the BPO business, or whether the BPOs implement such automations on their clients' behalf may well depend on a number of factors.

Conversely however, a BPO provider may seek to effect some form of client lock-in by means of automation. By removing cost from a business operation, where the BPO provider is considered to be the owner of the intellectual property and physical implementation of a robotic automation solution (perhaps in terms of hardware, ownership of software licences, etc.), the provider can make it very difficult for the client to take a process back "in house" or elect a new BPO provider. This effect occurs as the associated cost savings made through automation would - temporarily at least - have to be reintroduced to the business whilst the technical solution is reimplemented in the new operational context.

The geographically agnostic nature of software means that new business opportunities may arise for those organisations that have a political or regulatory impediment to offshoring or outsourcing. A robotised automation can be hosted in a data centre in any jurisdiction and this has two major consequences for BPO providers. Firstly, for example, a sovereign government may not be willing or legally able to outsource the processing of tax affairs and security administration. On this basis, if robots are compared to a human workforce, this creates a genuinely new opportunity for a "third sourcing" option, after the choices of onshore vs. offshore. Secondly, and conversely, BPO providers have previously relocated outsourced operations to different political and geographic territories in response to changing wage inflation and new labor arbitrage opportunities elsewhere. By contrast, a data centre solution would seem to offer a fixed and predictable cost base that, if sufficiently low in cost on a robot vs. human basis, would seem to eliminate any potential need or desire to continually relocate operational bases.

Examples

  • Voice recognition and digital dictation software linked to join up business processes for straight through processing without manual intervention
  • Specialised Remote Infrastructure Management software featuring automated investigation and resolution of problems, using robots for the first line IT support
  • Chatbots used by internet retailers and service providers to service customer requests for information. Also used by companies to service employee requests for information from internal databases
  • Presentation layer automation software, increasingly used by Business Process Outsourcers to displace human labor
  • IVR systems incorporating intelligent interaction with callers

Aggregate data

From Wikipedia, the free encyclopedia
A diagram showing the basic meaning of aggregate data, which is a combination of individual data.

Aggregate data is high-level data which is acquired by combining individual-level data. For instance, the output of an industry is an aggregate of the firms’ individual outputs within that industry. Aggregate data are applied in statistics, data warehouses, and in economics.

There is a distinction between aggregate data and individual data. Aggregate data refers to individual data that are averaged by geographic area, by year, by service agency, or by other means. Individual data are disaggregated individual results and are used to conduct analyses for estimation of subgroup differences.

Aggregate data are mainly used by researchers and analysts, policymakers, banks, and administrators, for multiple reasons: to evaluate policies, recognise trends and patterns in processes, gain relevant insights, and assess current measures for strategic planning. Aggregate data collected from various sources are used in areas of study such as comparative political analysis and scientific meta-analyses of aggregate patient data (APD), as well as for medical and educational purposes. Aggregate data are widely used, but they also have limitations, including the risk of drawing inaccurate inferences and false conclusions, termed the 'ecological fallacy': the error of drawing conclusions about the relationships between two quantitative variables at the individual level from relationships observed at the aggregate level.

Applications

In statistics, aggregate data are data combined from several measurements. When data is aggregated, groups of observations are replaced with summary statistics based on those observations.
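The replacement of observations with summary statistics can be shown concretely. The sketch below groups made-up individual income records by region and keeps only a count and a mean per group, using only the standard library:

```python
from statistics import mean

# Invented individual-level observations:
observations = [
    {"region": "North", "income": 42_000},
    {"region": "North", "income": 38_000},
    {"region": "South", "income": 55_000},
    {"region": "South", "income": 61_000},
    {"region": "South", "income": 49_000},
]

def aggregate(rows, key, value):
    """Replace groups of observations with summary statistics
    (here: group size and group mean)."""
    groups = {}
    for row in rows:
        groups.setdefault(row[key], []).append(row[value])
    return {k: {"n": len(v), "mean": mean(v)} for k, v in groups.items()}

# Prints a per-region count and mean; the individual rows are gone.
print(aggregate(observations, "region", "income"))
```

Note that once this step is taken, the individual rows can no longer be recovered from the output, which is exactly the information loss discussed in the Limitations section.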

In a data warehouse, the use of aggregate data dramatically reduces the time needed to query large data sets. Developers pre-summarise regularly used queries, such as weekly sales, across several dimensions – for example, by item hierarchy or geographical hierarchy.
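The speed-up comes from answering the frequent query once, at load time, and storing the result. The sketch below illustrates the idea with an invented fact table; later queries become a lookup into the small aggregate table instead of a scan over every transaction.

```python
# One row per transaction (the large "fact table" of the warehouse);
# all names and figures are invented for illustration.
sales_fact = [
    {"week": 1, "region": "North", "amount": 120.0},
    {"week": 1, "region": "North", "amount": 80.0},
    {"week": 1, "region": "South", "amount": 200.0},
    {"week": 2, "region": "North", "amount": 50.0},
]

def build_weekly_sales(fact_rows):
    """Run once at load time; the result is stored as an aggregate table."""
    summary = {}
    for row in fact_rows:
        key = (row["week"], row["region"])
        summary[key] = summary.get(key, 0.0) + row["amount"]
    return summary

weekly_sales = build_weekly_sales(sales_fact)

# A later query is now a dictionary lookup, not a scan of all rows:
print(weekly_sales[(1, "North")])  # 200.0
```

Real warehouses implement the same trade-off with materialized views or aggregate tables, trading storage and load-time work for query speed.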

In economics, aggregate data or data aggregates are high-level data composed from a multitude or combination of other, more individual data, such as the gross domestic product (GDP) of a country, which aggregates the output of its firms and industries.

Major users

Researchers and analysts

Researchers use aggregate data to understand the prevailing ethos, evaluate the essence of social realities and social organisation, identify primary issues of concern in research, and supply projections about the nature of social issues. Aggregate data are useful when researchers are interested in investigating the relationships between two distinct variables at the aggregate level, or the connections between an aggregate variable and an individual-level characteristic. Researchers have also used aggregate data to critically evaluate the policies, practices, and precepts of systems, investigating their relevance and efficacy.

Policymakers

Aggregate data are used by governments to develop more effective policies because they serve as a measure of how aware a government is of the demands and needs of its citizens, and of how effectively it maintains social order. For example, governments around the world have made use of aggregate mobile location data for analysis in response to COVID-19. Aggregate mobile location data can provide insights into the effectiveness of social-distancing measures launched by governments. Governments also use aggregate data to identify possible "hot spots" and the potential for transmission.

As well as projecting the effectiveness of government policies, aggregate data analyses are also undertaken to evaluate the nature, extent, trend, and pattern of a specific phenomenon or process, with the aim of devising strategies, preparing short- or long-term policies, and taking efficacious and relevant measures for control or prevention. Policymakers also utilise financial aggregates data in evaluating companies' and households' economic and financial activities, because these data help to identify risks to financial stability. Policymakers can employ aggregate data to better understand developments in a country's economic and financial conditions.

Banks

Banks collect aggregated data from a significant number of customers and then anonymise the data by eliminating personal information. The main reasons for banks to use aggregate data are to estimate economic trends and to gain insights into customer clusters. Banks are not permitted to share customers' personal data, but aggregate data can be shared with banks' business customers and accessed by other partners who use the same platform to acquire aggregate information.

In Australia, the Commonwealth Bank provides its business clients anonymised data related to their customers which are derived from card transactions. The ANZ also provides its business customers with anonymised data which is gathered from millions of merchant terminal transactions and ANZ card transactions.

In the UK, the Integrated Urgent Care Aggregate Data Collection (IUC ADC) provides comprehensive information about IUC activity, its performance, as well as its service demand. Its data are sourced from the lead data providers responsible for offering integrated urgent care services in England. The National Health Service (NHS) under the Department of Health and Social Care (DHSC) in England stated that this collection of aggregate data is going to replace the NHS 111 minimum dataset. It will also be used as a formal source for IUC statistics, as well as to oversee the Key Performance Indicators (KPIs) of the IUC ADC.

Administrators

Empirical data available at the national or regional level are used as sources of reference by administrators and intellectuals, as well as by people concerned about a region's or society's welfare. In particular, administrators utilise aggregate data to assess the current political, religious, social, or other atmosphere of a nation, to track gaps in social responses across time and space, and to set priorities for action. These assessments help administrators evaluate current measures for future strategic planning and provide indicators of effective corrective measures.

Sources and collection methods

Aggregate data can be a composition of various types of writings and records, including biography, autobiography, descriptive accounts and correspondence. For example, a researcher collects, collates, or compiles aggregate data through utilising multiple mechanisms of social research, including inventory, interview, an opinionnaire, and a questionnaire or schedule. Official or non-official agencies also collect and compile aggregate data on an ongoing basis through utilising infrastructures available within a department at the field level.

Sources of aggregate data can also be regarded as tools for discovering data. In the US, some aggregate data are presented in the form of tables; examples of sources for these US aggregate data include the United States Census Bureau, the Statistical Abstract of the United States, and Social Explorer. International Monetary Fund data, the World DataBank, and the Penn World Table are examples of transnational and international aggregate data sources.

Use of aggregate data

Comparative political analysis

Aggregate data are used in comparative political analysis because analysts focus not only on the behaviour of individuals but also on the behaviour of areal units, such as electoral constituencies and nations. In analyses of political activity, significant data – such as those related to industrialisation, urbanization, and mass communication networks – are not readily expressed at the individual level; they are expressed in per capita terms in order to control for variations in the population size of areal units. Aggregate data are widely available because demographic, socio-economic, and political data are collected and published by nations. This helps researchers and analysts carry out longer trend studies and allows them to bring changes and developments into deeper focus.

APD scientific meta-analyses

Factors including the need for time, considerable resources, and wide international cooperation have impeded the use of individual patient data (IPD) meta-analysis, leading most published meta-analyses to rely upon aggregate patient data (APD). To acquire data from all trials and all patients, aggregate patient data are collected from completed studies presented at professional meetings, published in the medical literature, or supplied directly by individual investigators. Aggregated patient data are utilised by users including the Cochrane Collaboration, the United States Preventive Services Task Force, and multiple professional societies to support clinical practice guidelines. Aggregate patient data are also used in time-to-event meta-analyses, as the results can inform investigators about whether it is worthwhile to proceed to more meta-analyses based on resource-intensive individual patient data.

Other uses

Health care

In a health information system, aggregate data are the integration of data concerning numerous patients; a particular patient cannot be traced from them. These aggregated data are only counts, such as of cases of tuberculosis, malaria, or other diseases. Health facilities use such aggregated statistics to generate reports and indicators and to undertake strategic planning in their health systems. In contrast, patient-based data are individual data related to a single patient, including name, age, diagnosis, and medical history, and are mainly used to track a patient's progress over time, such as how the patient responds to a particular treatment.

The COVID-19 Data Archive, also called COVID-ARC, aggregates data from studies around the globe. Researchers can access the discoveries of international colleagues and forge collaborations to facilitate the fight against the disease. More generally, using aggregated healthcare data allows health care providers to unlock actionable clinical insights, for instance when thorough views of clinical data or continuous patient records become possible.

Education

Aggregate data such as aggregate school-level demographic data and aggregate school-level achievement data are used in experimental analysis to assess the relationships between student achievement and school-level interventions. Aggregate data can also be used in non-experimental analysis such as regression discontinuity analysis and interrupted time-series analysis. Individual-level data are not required in these non-experimental analyses. For example, interrupted time-series analysis estimates the impact brought by a school-level program through comparing a school’s achievement before and after the program is launched where individual-level data are not necessary.
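The interrupted time-series comparison described above can be reduced to its bare bones: compare mean school-level achievement before and after the program starts. The scores and intervention year below are invented, and a real analysis would also model the pre-existing trend rather than just the difference in means.

```python
from statistics import mean

# (year, school mean test score); the program is assumed to start in 2019.
school_scores = [
    (2016, 61.0), (2017, 62.5), (2018, 61.5),
    (2019, 66.0), (2020, 67.5), (2021, 68.0),
]
INTERVENTION_YEAR = 2019

def before_after_change(series, start_year):
    """Difference in mean achievement after vs before the intervention,
    using only school-level (aggregate) data."""
    before = [score for year, score in series if year < start_year]
    after = [score for year, score in series if year >= start_year]
    return mean(after) - mean(before)

print(before_after_change(school_scores, INTERVENTION_YEAR))  # about +5.5 points
```

Note that nothing in the computation requires individual-level records, which is the point made in the text.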

Limitations

During the process of averaging units within some cluster or country, information is lost, which increases the probability of drawing inaccurate inferences. Information loss occurs because aggregation ignores individual variation, as if it were only statistical noise or measurement error. Inferences also vary depending on whether individual firm data or aggregated data are used for analysis: for instance, calculating country averages does not account for firm-specific variables such as firm size, firm age, or firm-ownership concentration, whereas calculating individual averages does. Differences therefore exist between results generated from aggregate data and from individual data.
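A small numerical illustration makes the point vivid: the relationship seen between group means can have the opposite sign of the relationship in the underlying individual data. The numbers below are constructed purely for the demonstration.

```python
def correlation(xs, ys):
    """Pearson correlation coefficient, standard library only."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Two groups; within each group, y falls as x rises...
group_a = [(1, 10), (2, 9), (3, 8)]
group_b = [(11, 20), (12, 19), (13, 18)]

# ...but group B sits higher in both x and y, so pooling the individuals
# (or correlating the group means) yields a positive relationship.
individuals = group_a + group_b
xs = [x for x, _ in individuals]
ys = [y for _, y in individuals]

print(correlation(*zip(*group_a)))  # ≈ -1.0 within group A
print(correlation(xs, ys))          # strongly positive overall
```

This sign reversal is exactly the trap Robinson's ecological fallacy warns about: an aggregate-level relationship need not hold at the individual level.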

There is also the problem of the 'ecological fallacy', a concept introduced by Robinson (1950). The term refers to the fact that the variability around individual-level means is significantly different from the variability around the aggregate means. Aggregate concepts express things other than the individual equivalents of the aggregate data, which means that individual-level conclusions cannot be drawn from them. Although aggregate data have wider applicability than individual-level data, it is more challenging for researchers to analyse subgroup results when aggregate data are used; eventually, individual information may also be required. Growth modelling and longitudinal modelling based on aggregate data are also difficult because variables can vary over time.

Other types of aggregate data

Financial aggregates data

Financial aggregates data is a type of aggregate data about credit and the money supply in Australia, utilised by policymakers in evaluating households' and companies' economic and financial activities.

Credit aggregates

Credit aggregates are measurements of households' and businesses' borrowings from financial intermediaries. The amount of funds borrowed by businesses for purposes including project investment, asset purchases, or cash-flow management is also measured using credit aggregates.

Monetary aggregates

Monetary aggregates are measurements of the money or 'money-like' instruments of the banking system owed to businesses and households. An example of a 'money-like' instrument is a deposit in a bank account.

Census aggregate data

In the UK, census aggregate data are data generated as outputs from the United Kingdom censuses. They provide information about the socio-economic and demographic characteristics of the country’s population. They are a compilation of aggregated, or summarised, calculations of the number of individuals, household residents, or families in particular geographic areas with specific characteristics, or compounds of characteristics, taken from the subjects of people and places, populations, families, health, ethnicity and religion, housing and work.

Aggregate data are used as components of the UK censuses’ outputs. They are obtained from analysis on the information given in the census returns. The census aggregate data are used to compare and describe population characteristics across various locations in the UK because they are able to provide comparable information at a range of geographical levels over the entire UK. Census aggregate data are also utilised in the academic sector for teaching and research purposes, as well as for site location and marketing in the private sector.

Romance (love)

From Wikipedia, the free encyclopedia