
Sunday, April 25, 2021

Critical theory

From Wikipedia, the free encyclopedia

Critical theory (also capitalized as Critical Theory) is an approach to social philosophy that focuses on reflective assessment and critique of society and culture in order to reveal and challenge power structures. With origins in sociology and literary criticism, it argues that social problems are influenced and created more by societal structures and cultural assumptions than by individual and psychological factors. Maintaining that ideology is the principal obstacle to human liberation, critical theory was established as a school of thought primarily by the Frankfurt School theoreticians Herbert Marcuse, Theodor Adorno, Walter Benjamin, Erich Fromm, and Max Horkheimer. Horkheimer described a theory as critical insofar as it seeks "to liberate human beings from the circumstances that enslave them."

In sociology and political philosophy, "Critical Theory" means the Western-Marxist philosophy of the Frankfurt School, developed in Germany in the 1930s and drawing on the ideas of Karl Marx and Sigmund Freud. Though a "critical theory" or a "critical social theory" may have similar elements of thought, capitalizing Critical Theory as if it were a proper noun stresses the intellectual lineage specific to the Frankfurt School.

Modern critical theory has also been influenced by György Lukács and Antonio Gramsci, as well as second-generation Frankfurt School scholars, notably Jürgen Habermas. In Habermas's work, critical theory transcended its theoretical roots in German idealism and progressed closer to American pragmatism. Concern for social "base and superstructure" is one of the remaining Marxist philosophical concepts in much contemporary critical theory.

Postmodern critical theory analyzes the fragmentation of cultural identities in order to challenge modernist-era constructs such as metanarratives, rationality, and universal truths, while politicizing social problems "by situating them in historical and cultural contexts, to implicate themselves in the process of collecting and analyzing data, and to relativize their findings."

Overview

Max Horkheimer first defined critical theory (German: Kritische Theorie) in his 1937 essay "Traditional and Critical Theory", as a social theory oriented toward critiquing and changing society as a whole, in contrast to traditional theory oriented only toward understanding or explaining it. Wanting to distinguish critical theory as a radical, emancipatory form of Marxist philosophy, Horkheimer critiqued both the model of science put forward by logical positivism, and what he and his colleagues saw as the covert positivism and authoritarianism of orthodox Marxism and Communism. He described a theory as critical insofar as it seeks "to liberate human beings from the circumstances that enslave them." Critical theory involves a normative dimension, either by criticizing society in terms of some general theory of values or norms (oughts), or by criticizing society in terms of its own espoused values (i.e. immanent critique).

The core concepts of critical theory are that it should:

  • be directed at the totality of society in its historical specificity (i.e., how it came to be configured at a specific point in time), and
  • improve understanding of society by integrating all the major social sciences, including geography, economics, sociology, history, political science, anthropology, and psychology.

Kant and Marx

This version of "critical" theory derives from the use of the term critique by Immanuel Kant in his Critique of Pure Reason and from Marx, on the premise that Das Kapital is a "critique of political economy".

In Kant's transcendental idealism, critique means examining and establishing the limits of the validity of a faculty, type, or body of knowledge, especially by accounting for the limitations of that knowledge system's fundamental, irreducible concepts.

Kant's notion of critique has been associated with the overturning of false, unprovable, or dogmatic philosophical, social, and political beliefs. His critique of reason involved the critique of dogmatic theological and metaphysical ideas and was intertwined with the enhancement of ethical autonomy and the Enlightenment critique of superstition and irrational authority. Ignored by many in "critical realist" circles is that Kant's immediate impetus for writing Critique of Pure Reason was to address problems raised by David Hume's skeptical empiricism, which, in attacking metaphysics, employed reason and logic to argue against the knowability of the world and common notions of causation. Kant, by contrast, held a priori metaphysical claims to be requisite: if anything is to be said to be knowable, it would have to be established upon abstractions distinct from perceivable phenomena.

Marx explicitly developed the notion of critique into the critique of ideology, linking it with the practice of social revolution, as stated in the 11th of his Theses on Feuerbach: "The philosophers have only interpreted the world, in various ways; the point is to change it."

Adorno and Horkheimer

One of the distinguishing characteristics of critical theory, as Theodor W. Adorno and Max Horkheimer elaborated in their Dialectic of Enlightenment (1947), is an ambivalence about the ultimate source or foundation of social domination, an ambivalence that gave rise to the "pessimism" of the new critical theory about the possibility of human emancipation and freedom. This ambivalence was rooted in the historical circumstances in which the work was originally produced, particularly the rise of Nazism, state capitalism, and culture industry as entirely new forms of social domination that could not be adequately explained in the terms of traditional Marxist sociology.

For Adorno and Horkheimer, state intervention in the economy had effectively abolished the traditional tension between Marxism's "relations of production" and "material productive forces" of society. The market (as an "unconscious" mechanism for the distribution of goods) had been replaced by centralized planning.

Contrary to Marx's prediction in the Preface to a Contribution to the Critique of Political Economy, this shift did not lead to "an era of social revolution" but to fascism and totalitarianism. As such, critical theory was left, in Habermas's words, without "anything in reserve to which it might appeal, and when the forces of production enter into a baneful symbiosis with the relations of production that they were supposed to blow wide open, there is no longer any dynamism upon which critique could base its hope." For Adorno and Horkheimer, this posed the problem of how to account for the apparent persistence of domination in the absence of the very contradiction that, according to traditional critical theory, was the source of domination itself.

Habermas

In the 1960s, Habermas, a proponent of critical social theory, raised the epistemological discussion to a new level in his Knowledge and Human Interests (1968) by identifying critical knowledge as based on principles that differentiate it from both the natural sciences and the humanities, through its orientation to self-reflection and emancipation. Although unsatisfied with Adorno and Horkheimer's thought in Dialectic of Enlightenment, Habermas shares the view that, in the form of instrumental rationality, the era of modernity marks a move away from the liberation of enlightenment and toward a new form of enslavement. In Habermas's work, critical theory transcended its theoretical roots in German idealism and progressed closer to American pragmatism.

Habermas's ideas about the relationship between modernity and rationalization are in this sense strongly influenced by Max Weber. He further dissolved the elements of critical theory derived from Hegelian German idealism, though his epistemology remains broadly Marxist. Perhaps his two most influential ideas are the concepts of the public sphere and communicative action, the latter arriving partly as a reaction to new post-structural or so-called "postmodern" challenges to the discourse of modernity. Habermas engaged in regular correspondence with Richard Rorty, and a strong sense of philosophical pragmatism may be felt in his thought, which frequently traverses the boundaries between sociology and philosophy.

In academia

Postmodern critical social theory

Focusing on language, symbolism, communication, and social construction, critical theory has been applied in the social sciences as a critique of social construction and postmodern society.

While modernist critical theory (as described above) concerns itself with "forms of authority and injustice that accompanied the evolution of industrial and corporate capitalism as a political-economic system", postmodern critical theory politicizes social problems "by situating them in historical and cultural contexts, to implicate themselves in the process of collecting and analyzing data, and to relativize their findings." Meaning itself is seen as unstable due to social structures' rapid transformation. As a result, research focuses on local manifestations rather than broad generalizations.

Postmodern critical research is also characterized by the crisis of representation, which rejects the idea that a researcher's work is an "objective depiction of a stable other." Instead, many postmodern scholars have adopted "alternatives that encourage reflection about the 'politics and poetics' of their work. In these accounts, the embodied, collaborative, dialogic, and improvisational aspects of qualitative research are clarified."

The term critical theory is often appropriated when an author works in sociological terms, yet attacks the social or human sciences, thus attempting to remain "outside" those frames of inquiry. Michel Foucault has been described as one such author. Jean Baudrillard has also been described as a critical theorist to the extent that he was an unconventional and critical sociologist; this appropriation is similarly casual, holding little or no relation to the Frankfurt School. In contrast, Habermas is one of the key critics of postmodernism.

Communication studies

From the 1960s and 1970s onward, language, symbolism, text, and meaning came to be seen as the theoretical foundation for the humanities, through the influence of Ludwig Wittgenstein, Ferdinand de Saussure, George Herbert Mead, Noam Chomsky, Hans-Georg Gadamer, Roland Barthes, Jacques Derrida and other thinkers in linguistic and analytic philosophy, structural linguistics, symbolic interactionism, hermeneutics, semiology, linguistically oriented psychoanalysis (Jacques Lacan, Alfred Lorenzer), and deconstruction.

When, in the 1970s and 1980s, Habermas redefined critical social theory as a study of communication, with communicative competence and communicative rationality on the one hand, and distorted communication on the other, the two versions of critical theory began to overlap to a much greater degree than before.

Pedagogy

Critical theorists have widely credited Paulo Freire for the first applications of critical theory to education/pedagogy, considering his best-known work to be Pedagogy of the Oppressed, a seminal text in what is now known as the philosophy and social movement of critical pedagogy. Dedicated to the oppressed and based on his own experience helping Brazilian adults learn to read and write, Freire includes a detailed Marxist class analysis in his exploration of the relationship between the colonizer and the colonized. In the book, he calls traditional pedagogy the "banking model of education", because it treats the student as an empty vessel to be filled with knowledge. He argues that pedagogy should instead treat the learner as a co-creator of knowledge.

In contrast to the banking model, the teacher in the critical-theory model is not the dispenser of all knowledge, but a participant who learns with and from the students—in conversation with them, even as they learn from the teacher. The goal is to liberate the learner from an oppressive construct of teacher versus student, a dichotomy analogous to colonizer and colonized. It is not enough for the student to analyze societal power structures and hierarchies, to merely recognize imbalance and inequity; critical theory pedagogy must also empower the learner to reflect and act on that reflection to challenge an oppressive status quo.

Criticism

While critical theorists have often been called Marxist intellectuals, their tendency to denounce some Marxist concepts and to combine Marxian analysis with other sociological and philosophical traditions has resulted in accusations of revisionism by classical, orthodox, and analytical Marxists, and by Marxist–Leninist philosophers. Martin Jay has said that the first generation of critical theory is best understood not as promoting a specific philosophical agenda or ideology, but as "a gadfly of other systems."

Critical theory has been criticized for not offering any clear road map to political action (praxis), often explicitly repudiating any solutions (as with Marcuse's "Great Refusal", which promoted abstaining from engaging in active political change).

A primary criticism of the theory is that it is anti-scientific, both for its lack of the use of the scientific method, and for its assertion that science is a tool used for oppression of marginalized groups of people.

 

Digital humanities

From Wikipedia, the free encyclopedia
 
Example of a textual analysis program being used to study a novel, with Jane Austen's Pride and Prejudice in Voyant Tools

Digital humanities (DH) is an area of scholarly activity at the intersection of computing or digital technologies and the disciplines of the humanities. It includes the systematic use of digital resources in the humanities, as well as the analysis of their application. DH can be defined as new ways of doing scholarship that involve collaborative, transdisciplinary, and computationally engaged research, teaching, and publishing. It brings digital tools and methods to the study of the humanities with the recognition that the printed word is no longer the main medium for knowledge production and distribution.

By producing and using new applications and techniques, DH makes new kinds of teaching and research possible, while at the same time studying and critiquing how these impact cultural heritage and digital culture. Thus, a distinctive feature of DH is its cultivation of a two-way relationship between the humanities and the digital: the field both employs technology in the pursuit of humanities research and subjects technology to humanistic questioning and interrogation, often simultaneously.

Definition

The definition of the digital humanities is being continually formulated by scholars and practitioners. Since the field is constantly growing and changing, specific definitions can quickly become outdated or unnecessarily limit future potential. The second volume of Debates in the Digital Humanities (2016) acknowledges the difficulty in defining the field: "Along with the digital archives, quantitative analyses, and tool-building projects that once characterized the field, DH now encompasses a wide range of methods and practices: visualizations of large image sets, 3D modeling of historical artifacts, 'born digital' dissertations, hashtag activism and the analysis thereof, alternate reality games, mobile makerspaces, and more. In what has been called 'big tent' DH, it can at times be difficult to determine with any specificity what, precisely, digital humanities work entails."

Historically, the digital humanities developed out of humanities computing and has become associated with other fields, such as humanistic computing, social computing, and media studies. In concrete terms, the digital humanities embraces a variety of topics, from curating online collections of primary sources (primarily textual) to the data mining of large cultural data sets to topic modeling. Digital humanities incorporates both digitized (remediated) and born-digital materials and combines the methodologies from traditional humanities disciplines (such as rhetoric, history, philosophy, linguistics, literature, art, archaeology, music, and cultural studies) and social sciences, with tools provided by computing (such as hypertext, hypermedia, data visualisation, information retrieval, data mining, statistics, text mining, digital mapping), and digital publishing. Related subfields of digital humanities have emerged, such as software studies, platform studies, and critical code studies. Fields that parallel the digital humanities include new media studies and information science, as well as media theory of composition, game studies (particularly in areas related to digital humanities project design and production), and cultural analytics.
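To make one of the methods named above (topic modeling) concrete, here is a minimal, hedged sketch using scikit-learn on a tiny invented corpus; the documents, the choice of two topics, and all parameter values are illustrative assumptions rather than part of any project described in this article.

    # Minimal topic-modeling sketch (illustrative only).
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = [  # toy corpus standing in for a large cultural data set
        "the king rode to the castle with his knights",
        "the ship sailed across the stormy sea at dawn",
        "knights defended the castle walls against the siege",
        "sailors feared the sea and prayed for calm winds",
    ]

    # Build a document-term matrix of word counts.
    vectorizer = CountVectorizer(stop_words="english")
    dtm = vectorizer.fit_transform(docs)

    # Fit a two-topic LDA model (two topics chosen only for this toy data).
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    lda.fit(dtm)

    # Show the most heavily weighted words per topic.
    terms = vectorizer.get_feature_names_out()
    for i, weights in enumerate(lda.components_):
        top = [terms[j] for j in weights.argsort()[-5:][::-1]]
        print(f"topic {i}: {', '.join(top)}")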

The Digital Humanities Stack (from Berry and Fagerjord, Digital Humanities: Knowledge and Critique in a Digital Age)

Berry and Fagerjord have suggested that a way to reconceptualise digital humanities could be through a "digital humanities stack". They argue that "this type of diagram is common in computation and computer science to show how technologies are 'stacked' on top of each other in increasing levels of abstraction. Here, [they] use the method in a more illustrative and creative sense of showing the range of activities, practices, skills, technologies and structures that could be said to make up the digital humanities, with the aim of providing a high-level map." Indeed, the "diagram can be read as the bottom levels indicating some of the fundamental elements of the digital humanities stack, such as computational thinking and knowledge representation, and then other elements that later build on these."

In practical terms, a major distinction within digital humanities is the focus on the data being processed. For processing textual data, digital humanities builds on a long and extensive history of digital edition, computational linguistics and natural language processing and developed an independent and highly specialized technology stack (largely culminating in the specifications of the Text Encoding Initiative). This part of the field is thus sometimes set apart from Digital Humanities in general as 'digital philology' or 'computational philology'. For the analysis and digital edition of objects or artifacts, different technologies are required.

History

Digital humanities descends from the field of humanities computing, whose origins reach back to the 1940s and '50s, in the pioneering work of Jesuit scholar Roberto Busa, which began in 1946, and of English professor Josephine Miles, beginning in the early 1950s. In collaboration with IBM, Busa and his team created a computer-generated concordance to Thomas Aquinas' writings known as the Index Thomisticus. Other scholars began using mainframe computers to automate tasks like word-searching, sorting, and counting, which was much faster than processing information from texts with handwritten or typed index cards. In the decades which followed, archaeologists, classicists, historians, literary scholars, and a broad array of humanities researchers in other disciplines applied emerging computational methods to transform humanities scholarship.

As Tara McPherson has pointed out, the digital humanities also inherit practices and perspectives developed through many artistic and theoretical engagements with electronic screen culture beginning in the late 1960s and 1970s. These range from research developed by organizations such as SIGGRAPH to creations by artists such as Charles and Ray Eames and the members of E.A.T. (Experiments in Art and Technology). The Eameses and E.A.T. explored nascent computer culture and intermediality in creative works that dovetailed technological innovation with art.

The first specialized journal in the digital humanities was Computers and the Humanities, which debuted in 1966. The Computer Applications and Quantitative Methods in Archaeology (CAA) association was founded in 1973. The Association for Literary and Linguistic Computing (ALLC) and the Association for Computers and the Humanities (ACH) were then founded in 1977 and 1978, respectively.

Soon, there was a need for a standardized protocol for tagging digital texts, and the Text Encoding Initiative (TEI) was developed. The TEI project was launched in 1987 and published the first full version of the TEI Guidelines in May 1994. TEI helped shape the field of electronic textual scholarship and contributed to the development of Extensible Markup Language (XML), a tag scheme now standard for digital editing. Researchers also began experimenting with databases and hypertextual editing, which are structured around links and nodes, as opposed to the standard linear convention of print. In the nineties, major digital text and image archives emerged at centers of humanities computing in the U.S. (e.g. the Women Writers Project, the Rossetti Archive, and The William Blake Archive), which demonstrated the sophistication and robustness of text-encoding for literature. The advent of personal computing and the World Wide Web meant that Digital Humanities work could become less centered on text and more on design. The multimedia nature of the internet has allowed Digital Humanities work to incorporate audio, video, and other components in addition to text.
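As a small illustration of TEI-style text encoding, the sketch below parses a tiny, made-up TEI fragment with Python's standard library; the fragment is not drawn from any real edition, though the element names and namespace follow the TEI Guidelines.

    # Parse a minimal, invented TEI fragment (illustrative only).
    import xml.etree.ElementTree as ET

    TEI_NS = {"tei": "http://www.tei-c.org/ns/1.0"}

    sample = """<TEI xmlns="http://www.tei-c.org/ns/1.0">
      <teiHeader>
        <fileDesc>
          <titleStmt><title>Sample poem</title></titleStmt>
          <publicationStmt><p>Unpublished demonstration file.</p></publicationStmt>
          <sourceDesc><p>Born-digital example.</p></sourceDesc>
        </fileDesc>
      </teiHeader>
      <text>
        <body>
          <lg type="stanza">
            <l>First illustrative line</l>
            <l>Second illustrative line</l>
          </lg>
        </body>
      </text>
    </TEI>"""

    root = ET.fromstring(sample)
    title = root.find(".//tei:title", TEI_NS).text
    lines = [l.text for l in root.findall(".//tei:l", TEI_NS)]
    print(title)
    print(lines)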

The terminological change from "humanities computing" to "digital humanities" has been attributed to John Unsworth, Susan Schreibman, and Ray Siemens who, as editors of the anthology A Companion to Digital Humanities (2004), tried to prevent the field from being viewed as "mere digitization." Consequently, the hybrid term has created an overlap between fields like rhetoric and composition, which use "the methods of contemporary humanities in studying digital objects," and digital humanities, which uses "digital technology in studying traditional humanities objects". The use of computational systems and the study of computational media within the humanities, arts and social sciences more generally has been termed the 'computational turn'.

In 2006 the National Endowment for the Humanities (NEH) launched the Digital Humanities Initiative (renamed the Office of Digital Humanities in 2008), which led to widespread adoption of the term "digital humanities" in the United States.

Digital humanities emerged from its former niche status and became "big news" at the 2009 MLA convention in Philadelphia, where digital humanists made "some of the liveliest and most visible contributions" and had their field hailed as "the first 'next big thing' in a long time."

Values and methods

Although digital humanities projects and initiatives are diverse, they often reflect common values and methods. These can help in understanding this hard-to-define field.

Values

  • Critical and theoretical
  • Iterative and experimental
  • Collaborative and distributed
  • Multimodal and performative
  • Open and accessible

Methods

  • Enhanced critical curation
  • Augmented editions and fluid textuality
  • Scale: the law of large numbers
  • Distant/close, macro/micro, surface/depth
  • Cultural analytics, aggregation, and data-mining
  • Visualization and data design
  • Locative investigation and thick mapping
  • The animated archive
  • Distributed knowledge production and performative access
  • Humanities gaming
  • Code, software, and platform studies
  • Database documentaries
  • Repurposable content and remix culture
  • Pervasive infrastructure
  • Ubiquitous scholarship

In keeping with the value of being open and accessible, many digital humanities projects and journals are open access and/or under Creative Commons licensing, showing the field's "commitment to open standards and open source." Open access is designed to enable anyone with an internet-connected device to view a website or read an article without having to pay, as well as to share content with the appropriate permissions.

Digital humanities scholars use computational methods either to answer existing research questions or to challenge existing theoretical paradigms, generating new questions and pioneering new approaches. One goal is to systematically integrate computer technology into the activities of humanities scholars, as is done in contemporary empirical social sciences. Yet despite the significant trend in digital humanities towards networked and multimodal forms of knowledge, a substantial amount of digital humanities focuses on documents and text in ways that differentiate the field's work from digital research in media studies, information studies, communication studies, and sociology. Another goal of digital humanities is to create scholarship that transcends textual sources. This includes the integration of multimedia, metadata, and dynamic environments (see The Valley of the Shadow project at the University of Virginia, the Vectors Journal of Culture and Technology in a Dynamic Vernacular at the University of Southern California, or Digital Pioneers projects at Harvard).

A growing number of researchers in digital humanities are using computational methods for the analysis of large cultural data sets such as the Google Books corpus. Examples of such projects were highlighted by the Humanities High Performance Computing competition sponsored by the Office of Digital Humanities in 2008, and by the Digging Into Data challenge organized in 2009 and 2011 by the NEH in collaboration with the NSF, and in partnership with JISC in the UK and SSHRC in Canada. In addition to books, historical newspapers can also be analyzed with big data methods. The analysis of vast quantities of historical newspaper content has shown how periodic structures can be automatically discovered, and a similar analysis was performed on social media. As part of the big data revolution, gender bias, readability, content similarity, reader preferences, and even mood have been analyzed using text mining methods over millions of documents, including historical documents written in literary Chinese.

Digital humanities is also involved in the creation of software, providing "environments and tools for producing, curating, and interacting with knowledge that is 'born digital' and lives in various digital contexts." In this context, the field is sometimes known as computational humanities.

Narrative network of US Elections 2012

Tools

Digital humanities scholars use a variety of digital tools for their research, which may take place in an environment as small as a mobile device or as large as a virtual reality lab. Environments for "creating, publishing and working with digital scholarship include everything from personal equipment to institutes and software to cyberspace." Some scholars use advanced programming languages and databases, while others use less complex tools, depending on their needs. DiRT (Digital Research Tools Directory) offers a registry of digital research tools for scholars. TAPoR (Text Analysis Portal for Research) is a gateway to text analysis and retrieval tools. An accessible, free example of an online textual analysis program is Voyant Tools, which only requires the user to copy and paste either a body of text or a URL and then click the 'reveal' button to run the program. There is also an online list of online or downloadable Digital Humanities tools that are largely free, aimed toward helping students and others who lack access to funding or institutional servers. Free, open source web publishing platforms like WordPress and Omeka are also popular tools.
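To give a rough sense of what such text-analysis tools compute, the sketch below counts the most frequent words in a plain-text file; it is a simplified Python illustration, not Voyant's own code, and the file name and the small stopword list are placeholder assumptions.

    # Simple word-frequency analysis, in the spirit of tools like Voyant.
    import re
    from collections import Counter

    STOPWORDS = {"the", "and", "of", "to", "a", "in", "that", "it", "is", "was"}

    def top_words(text, n=10):
        """Return the n most frequent non-stopword tokens in the text."""
        tokens = re.findall(r"[a-z']+", text.lower())
        counts = Counter(t for t in tokens if t not in STOPWORDS)
        return counts.most_common(n)

    if __name__ == "__main__":
        # "novel.txt" is a placeholder for any plain-text corpus file.
        with open("novel.txt", encoding="utf-8") as f:
            for word, count in top_words(f.read()):
                print(word, count)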

Projects

Digital humanities projects are more likely than traditional humanities work to involve a team or a lab, which may be composed of faculty, staff, graduate or undergraduate students, information technology specialists, and partners in galleries, libraries, archives, and museums. Credit and authorship are often given to multiple people to reflect this collaborative nature, which is different from the sole authorship model in the traditional humanities (and more like the natural sciences).

There are thousands of digital humanities projects, ranging from small-scale ones with limited or no funding to large-scale ones with multi-year financial support. Some are continually updated while others may not be due to loss of support or interest, though they may still remain online in either a beta version or a finished form. The following are a few examples of the variety of projects in the field:

Digital archives

The Women Writers Project (begun in 1988) is a long-term research project to make pre-Victorian women writers more accessible through an electronic collection of rare texts. The Walt Whitman Archive (begun in the 1990s) sought to create a hypertext and scholarly edition of Whitman's works and now includes photographs, sounds, and the only comprehensive current bibliography of Whitman criticism. The Emily Dickinson Archive (begun in 2013) is a collection of high-resolution images of Dickinson's poetry manuscripts as well as a searchable lexicon of over 9,000 words that appear in the poems.

Example of network analysis as an archival tool at the League of Nations.

The Slave Societies Digital Archive (formerly Ecclesiastical and Secular Sources for Slave Societies), directed by Jane Landers and hosted at Vanderbilt University, preserves endangered ecclesiastical and secular documents related to Africans and African-descended peoples in slave societies. This Digital Archive currently holds 500,000 unique images, dating from the 16th to the 20th centuries, and documents the history of between 6 and 8 million individuals. They are the most extensive serial records for the history of Africans in the Atlantic World and also include valuable information on the indigenous, European, and Asian populations who lived alongside them.

Librarians and archivists play an important part in digital humanities projects because their role has recently expanded to cover digital curation, which is critical for preserving, promoting, and providing access to digital collections, as well as for bringing a scholarly orientation to digital humanities projects. A specific example is initiatives in which archivists help scholars and academics build their projects through their experience in evaluating, implementing, and customizing metadata schemas for library collections.

The initiatives at the National Autonomous University of Mexico are another example of digital humanities projects. These include the digitization of 17th-century manuscripts, an electronic corpus of Mexican history from the 16th to 19th century, and the visualization of pre-Hispanic archaeological sites in 3-D.

Cultural analytics

"Cultural analytics" refers to the use of computational method for exploration and analysis of large visual collections and also contemporary digital media. The concept was developed in 2005 by Lev Manovich who then established the Cultural Analytics Lab in 2007 at Qualcomm Institute at California Institute for Telecommunication and Information (Calit2). The lab has been using methods from the field of computer science called Computer Vision many types of both historical and contemporary visual media—for example, all covers of Time magazine published between 1923 and 2009, 20,000 historical art photographs from the collection in Museum of Modern Art (MoMA) in New York, one million pages from Manga books, and 16 million images shared on Instagram in 17 global cities. Cultural analytics also includes using methods from media design and data visualization to create interactive visual interfaces for exploration of large visual collections e.g., Selfiecity and On Broadway.

Cultural analytics research is also addressing a number of theoretical questions. How can we "observe" giant cultural universes of both user-generated and professional media content created today, without reducing them to averages, outliers, or pre-existing categories? How can work with large cultural data help us question our stereotypes and assumptions about cultures? What new theoretical cultural concepts and models are required for studying global digital culture with its new mega-scale, speed, and connectivity?

The term "cultural analytics" (or "culture analytics") is now used by many other researchers, as exemplified by two academic symposiums, a four-month long research program at UCLA that brought together 120 leading researchers from university and industry labs, an academic peer-review Journal of Cultural Analytics: CA established in 2016, and academic job listings.

Textual mining, analysis, and visualization

WordHoard (begun in 2004) is a free application that enables scholarly but non-technical users to read and analyze, in new ways, deeply tagged texts, including the canon of Early Greek epic, Chaucer, Shakespeare, and Spenser. The Republic of Letters (begun in 2008) seeks to visualize the social network of Enlightenment writers through an interactive map and visualization tools. Network analysis and data visualization are also used for reflections on the field itself – researchers may produce network maps of social media interactions or infographics from data on digital humanities scholars and projects.

Network analysis: graph of Digital Humanities Twitter users.
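The sketch below is a minimal illustration of this kind of network mapping, using the networkx library on a tiny, invented interaction graph; the account names and edges are placeholders, not data from any real study.

    # Toy network analysis of social-media-style interactions (illustrative only).
    import networkx as nx

    interactions = [  # placeholder edges: (account, account)
        ("scholar_a", "scholar_b"),
        ("scholar_a", "scholar_c"),
        ("scholar_b", "scholar_c"),
        ("scholar_c", "scholar_d"),
    ]

    g = nx.Graph()
    g.add_edges_from(interactions)

    # Degree centrality highlights the most connected accounts in the map.
    for node, score in sorted(nx.degree_centrality(g).items(),
                              key=lambda kv: kv[1], reverse=True):
        print(f"{node}: {score:.2f}")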

Analysis of macroscopic trends in cultural change

Culturomics is a form of computational lexicology that studies human behavior and cultural trends through the quantitative analysis of digitized texts. Researchers data mine large digital archives to investigate cultural phenomena reflected in language and word usage. The term is an American neologism first described in a 2010 Science article, "Quantitative Analysis of Culture Using Millions of Digitized Books", co-authored by Harvard researchers Jean-Baptiste Michel and Erez Lieberman Aiden.

A 2017 study published in the Proceedings of the National Academy of Sciences of the United States of America compared the trajectories of n-grams over time in the digitised books from the 2010 Science article with those found in a large corpus of regional newspapers from the United Kingdom over the course of 150 years. The study went on to use more advanced natural language processing techniques to discover macroscopic trends in history and culture, including gender bias, geographical focus, technology, and politics, along with accurate dates for specific events.
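The core culturomics measurement, the relative frequency of a term per year across a dated corpus, can be sketched as follows; the four "documents" here are an invented stand-in for millions of digitized books or newspaper pages.

    # Yearly relative frequency of a term across a dated toy corpus.
    from collections import Counter, defaultdict

    corpus = [  # (year, text) pairs; illustrative only
        (1900, "the steam engine and the telegraph changed the city"),
        (1900, "a quiet year in the village after the harvest"),
        (1950, "the television and the telephone reached every home"),
        (1950, "the television schedule filled the evening"),
    ]

    def yearly_relative_frequency(corpus, term):
        """Occurrences of `term` per token, grouped by year."""
        totals, hits = defaultdict(int), defaultdict(int)
        for year, text in corpus:
            tokens = text.lower().split()
            totals[year] += len(tokens)
            hits[year] += Counter(tokens)[term]
        return {year: hits[year] / totals[year] for year in sorted(totals)}

    print(yearly_relative_frequency(corpus, "television"))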

Online publishing

The Stanford Encyclopedia of Philosophy (begun in 1995) is a dynamic reference work of terms, concepts, and people from philosophy maintained by scholars in the field. MLA Commons offers an open peer-review site (where anyone can comment) for their ongoing curated collection of teaching artifacts in Digital Pedagogy in the Humanities: Concepts, Models, and Experiments (2016). The Debates in the Digital Humanities platform contains volumes of the open-access book of the same title (2012 and 2016 editions) and allows readers to interact with material by marking sentences as interesting or adding terms to a crowdsourced index.

Wikimedia projects

Some research institutions work with the Wikimedia Foundation or volunteers of the community, for example, to make freely licensed media files available via Wikimedia Commons or to link or load data sets with Wikidata. Text analysis has been performed on the contribution history of articles on Wikipedia or its sister projects.
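As a small starting point for that kind of contribution-history analysis, the sketch below fetches recent revision metadata for one article from the public MediaWiki API; the article title and the number of revisions requested are arbitrary example values, and error handling is omitted for brevity.

    # Fetch recent revision metadata for a Wikipedia article (illustrative sketch).
    import requests

    API = "https://en.wikipedia.org/w/api.php"
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": "Digital humanities",   # example article
        "rvprop": "timestamp|user|comment",
        "rvlimit": 20,                    # example revision count
        "format": "json",
    }

    data = requests.get(API, params=params, timeout=30).json()
    page = next(iter(data["query"]["pages"].values()))
    for rev in page["revisions"]:
        print(rev["timestamp"], rev.get("user", "?"), rev.get("comment", ""))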

Criticism

In 2012, Matthew K. Gold identified a range of perceived criticisms of the field of digital humanities: "a lack of attention to issues of race, class, gender, and sexuality; a preference for research-driven projects over pedagogical ones; an absence of political commitment; an inadequate level of diversity among its practitioners; an inability to address texts under copyright; and an institutional concentration in well-funded research universities". Similarly, Berry and Fagerjord have argued that digital humanities should "focus on the need to think critically about the implications of computational imaginaries, and raise some questions in this regard. This is also to foreground the importance of the politics and norms that are embedded in digital technology, algorithms and software. We need to explore how to negotiate between close and distant readings of texts and how micro-analysis and macro-analysis can be usefully reconciled in humanist work." Alan Liu has argued, "while digital humanists develop tools, data, and metadata critically, therefore (e.g., debating the 'ordered hierarchy of content objects' principle; disputing whether computation is best used for truth finding or, as Lisa Samuels and Jerome McGann put it, 'deformance'; and so on) rarely do they extend their critique to the full register of society, economics, politics, or culture." Some of these concerns have given rise to the emergent subfield of Critical Digital Humanities (CDH):

"Some key questions include: how do we make the invisible become visible in the study of software? How is knowledge transformed when mediated through code and software? What are the critical approaches to Big Data, visualization, digital methods, etc.? How does computation create new disciplinary boundaries and gate-keeping functions? What are the new hegemonic representations of the digital – 'geons', 'pixels', 'waves', visualization, visual rhetorics, etc.? How do media changes create epistemic changes, and how can we look behind the 'screen essentialism' of computational interfaces? Here we might also reflect on the way in which the practice of making-visible also entails the making-invisible – computation involves making choices about what is to be captured. "

Negative publicity

Lauren F. Klein and Gold note that appearances of the digital humanities in public media are often critical in tone. Armand Leroi, writing in The New York Times, discusses the contrast between the algorithmic analysis of themes in literary texts and the work of Harold Bloom, who qualitatively and phenomenologically analyzes the themes of literature over time. Leroi questions whether the digital humanities can provide a truly robust analysis of literature and social phenomena or offer a novel alternative perspective on them. The literary theorist Stanley Fish claims that the digital humanities pursue a revolutionary agenda and thereby undermine the conventional standards of "pre-eminence, authority and disciplinary power." However, digital humanities scholars note that "Digital Humanities is an extension of traditional knowledge skills and methods, not a replacement for them. Its distinctive contributions do not obliterate the insights of the past, but add and supplement the humanities' long-standing commitment to scholarly interpretation, informed research, structured argument, and dialogue within communities of practice".

Some have hailed the digital humanities as a solution to the apparent problems within the humanities, namely a decline in funding, a repeat of debates, and a fading set of theoretical claims and methodological arguments. Adam Kirsch, writing in the New Republic, calls this the "False Promise" of the digital humanities. While the rest of the humanities and many social science departments are seeing a decline in funding or prestige, the digital humanities has been seeing increasing funding and prestige. Burdened with the problems of novelty, the digital humanities is discussed as either a revolutionary alternative to the humanities as it is usually conceived or as simply new wine in old bottles. Kirsch believes that digital humanities practitioners act as marketers rather than scholars, attesting to the grand capacity of their research more than actually performing new analysis and, when they do, performing only trivial parlor tricks of research. This form of criticism has been repeated by others, such as Carl Straumsheim, writing in Inside Higher Education, who calls it a "Digital Humanities Bubble". Later in the same publication, Straumsheim alleges that the digital humanities is a 'Corporatist Restructuring' of the Humanities. Some see the alliance of the digital humanities with business as a positive turn that causes the business world to pay more attention, thus bringing needed funding and attention to the humanities. If the field were not burdened by the title of digital humanities, it could escape the allegations that it is elitist and unfairly funded.

Black box

There has also been critique of the use of digital humanities tools by scholars who do not fully understand what happens to the data they input and place too much trust in the "black box" of software that cannot be sufficiently examined for errors. Johanna Drucker, a professor at UCLA Department of Information Studies, has criticized the "epistemological fallacies" prevalent in popular visualization tools and technologies (such as Google's n-gram graph) used by digital humanities scholars and the general public, calling some network diagramming and topic modeling tools "just too crude for humanistic work." The lack of transparency in these programs obscures the subjective nature of the data and its processing, she argues, as these programs "generate standard diagrams based on conventional algorithms for screen display...mak[ing] it very difficult for the semantics of the data processing to be made evident."

Diversity

There has also been some recent controversy among practitioners of digital humanities around the role that race and/or identity politics plays. Tara McPherson attributes some of the lack of racial diversity in digital humanities to the modality of UNIX and computers themselves. An open thread on DHpoco.org recently garnered well over 100 comments on the issue of race in digital humanities, with scholars arguing about the amount that racial (and other) biases affect the tools and texts available for digital humanities research. McPherson posits that there needs to be an understanding and theorizing of the implications of digital technology and race, even when the subject for analysis appears not to be about race.

Amy E. Earhart criticizes what has become the new digital humanities "canon" in the shift from websites using simple HTML to the use of the TEI and visuals in textual recovery projects. Works that had previously been lost or excluded were afforded a new home on the internet, but much of the same marginalizing practices found in traditional humanities also took place digitally. According to Earhart, there is a "need to examine the canon that we, as digital humanists, are constructing, a canon that skews toward traditional texts and excludes crucial work by women, people of color, and the LGBTQ community."

Issues of access

Practitioners in digital humanities are also failing to meet the needs of users with disabilities. George H. Williams argues that universal design is imperative for practitioners to increase usability because "many of the otherwise most valuable digital resources are useless for people who are—for example—deaf or hard of hearing, as well as for people who are blind, have low vision, or have difficulty distinguishing particular colors." In order to provide accessibility and productive universal design successfully, it is important to understand why and how users with disabilities use digital resources, while remembering that all users approach their informational needs differently.

Cultural criticism

Digital humanities have been criticized not only for ignoring traditional questions of lineage and history in the humanities, but also for lacking the fundamental cultural criticism that defines the humanities. However, it remains to be seen whether or not the humanities have to be tied to cultural criticism, per se, in order to be the humanities. The sciences might imagine the digital humanities as a welcome improvement over the non-quantitative methods of the humanities and social sciences.

Difficulty of evaluation

As the field matures, there has been a recognition that the standard model of academic peer review may not be adequate for digital humanities projects, which often involve website components, databases, and other non-print objects. Evaluation of quality and impact thus requires a combination of old and new methods of peer review. One response has been the creation of the DHCommons Journal, which accepts non-traditional submissions, especially mid-stage digital projects, and provides an innovative model of peer review more suited to the multimedia, transdisciplinary, and milestone-driven nature of digital humanities projects. Other professional humanities organizations, such as the American Historical Association and the Modern Language Association, have developed guidelines for evaluating academic digital scholarship.

Lack of focus on pedagogy

The 2012 edition of Debates in the Digital Humanities recognized that pedagogy was the "neglected 'stepchild' of DH" and included an entire section on teaching the digital humanities. Part of the reason is that grants in the humanities are geared more toward research with quantifiable results than toward teaching innovations, which are harder to measure. In recognition of the need for more scholarship in this area, the edited volume Digital Humanities Pedagogy was published, offering case studies and strategies that address how to teach digital humanities methods in various disciplines.

Post-scarcity economy

From Wikipedia, the free encyclopedia

Post-scarcity is a theoretical economic situation in which most goods can be produced in great abundance with minimal human labor needed, so that they become available to all very cheaply or even freely. Post-scarcity does not mean that scarcity has been eliminated for all goods and services, but that all people can easily have their basic survival needs met along with some significant proportion of their desires for goods and services. Writers on the topic often emphasize that some commodities will remain scarce in a post-scarcity society.

In the paper "The Post-Scarcity World of 2050–2075" the authors assert that the current age is one of scarcity resulting from negligent behavior (as regards the future) of the 19th and 20th centuries. The period between 1975 and 2005 was characterized by relative abundance of resources (oil, water, energy, food, credit, among others) which boosted industrialization and development in the Western economies. An increased demand of resources combined with a rising population led to resource exhaustion. In part, the ideas developed about post-scarcity are motivated by analyses that posit that capitalism leverages scarcity.

One of the main signs of the scarcity periods is the increase and fluctuation of prices. To deal with that situation, advances in technology come into play, driving an efficient use of resources to the extent that costs will be considerably reduced (almost everything will be free). Consequently, the authors claim that the period between 2050 and 2075 will be a post-scarcity age in which scarcity will no longer exist.

Models

Speculative technology

Today, futurists who speak of "post-scarcity" suggest economies based on advances in automated manufacturing technologies, often including the idea of self-replicating machines and the adoption of the division of labour, which in theory could produce nearly all goods in abundance, given adequate raw materials and energy.

More speculative forms of nanotechnology such as molecular assemblers or nanofactories, which do not currently exist, raise the possibility of devices that can automatically manufacture any specified goods given the correct instructions and the necessary raw materials and energy, and many nanotechnology enthusiasts have suggested it will usher in a post-scarcity world.

In the more near-term future, the increasing automation of physical labor using robots is often discussed as a means of creating a post-scarcity economy.

Increasingly versatile forms of rapid prototyping machines, and a hypothetical self-replicating version of such a machine known as a RepRap, have also been predicted to help create the abundance of goods needed for a post-scarcity economy. Advocates of self-replicating machines such as Adrian Bowyer, the creator of the RepRap project, argue that once a self-replicating machine is designed, anyone who owns one can make more copies to sell (and would also be free to ask for a lower price than other sellers), so market competition will naturally drive the cost of such machines down to the bare minimum needed to make a profit,[16][17] in this case just above the cost of the physical materials and energy that must be fed into the machine as input; the same should go for any other goods the machine can build.

Even with fully automated production, limitations on the number of goods produced would arise from the availability of raw materials and energy, as well as ecological damage associated with manufacturing technologies. Advocates of technological abundance often argue for more extensive use of renewable energy and greater recycling in order to prevent future drops in availability of energy and raw materials, and reduce ecological damage. Solar energy in particular is often emphasized, as the cost of solar panels continues to drop (and could drop far more with automated production by self-replicating machines), and advocates point out that the total solar power striking the Earth's surface annually exceeds our civilization's current annual power usage by a factor of thousands.
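A rough back-of-the-envelope calculation suggests why the "factor of thousands" claim is plausible; the constants below are approximate public figures (treated here as assumptions), and the estimate ignores losses other than reflection back to space.

    # Back-of-the-envelope: absorbed solar power vs. human power use.
    import math

    SOLAR_CONSTANT = 1361      # W/m^2 at the top of the atmosphere (approx.)
    EARTH_RADIUS = 6.371e6     # m (approx.)
    ALBEDO = 0.3               # fraction of sunlight reflected back to space (approx.)
    WORLD_POWER_USE = 1.8e13   # W, roughly 18 TW of primary energy demand (approx.)

    cross_section = math.pi * EARTH_RADIUS ** 2               # sunlight-intercepting area
    absorbed = SOLAR_CONSTANT * cross_section * (1 - ALBEDO)  # power not reflected away

    print(f"absorbed solar power: {absorbed:.2e} W")
    print(f"ratio to human use:   {absorbed / WORLD_POWER_USE:,.0f}x")  # on the order of thousands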

Advocates also sometimes argue that the energy and raw materials available could be greatly expanded by looking to resources beyond the Earth. For example, asteroid mining is sometimes discussed as a way of greatly reducing scarcity for many useful metals such as nickel. While early asteroid mining might involve manned missions, advocates hope that eventually humanity could have automated mining done by self-replicating machines. If this were done, then the only capital expenditure would be a single self-replicating unit (whether robotic or nanotechnological), after which the units could replicate themselves at no further cost, limited only by the available raw materials needed to build more.

Marxism

Karl Marx, in a section of his Grundrisse that came to be known as the "Fragment on Machines", argued that the transition to a post-capitalist society combined with advances in automation would allow for significant reductions in labor needed to produce necessary goods, eventually reaching a point where all people would have significant amounts of leisure time to pursue science, the arts, and creative activities; a state some commentators later labeled as "post-scarcity". Marx argued that capitalism—the dynamic of economic growth based on capital accumulation—depends on exploiting the surplus labor of workers, but a post-capitalist society would allow for:

The free development of individualities, and hence not the reduction of necessary labour time so as to posit surplus labour, but rather the general reduction of the necessary labour of society to a minimum, which then corresponds to the artistic, scientific etc. development of the individuals in the time set free, and with the means created, for all of them.

Marx's concept of a post-capitalist communist society involves the free distribution of goods made possible by the abundance provided by automation. The fully developed communist economic system is postulated to develop from a preceding socialist system. Marx held the view that socialism—a system based on social ownership of the means of production—would enable progress toward the development of fully developed communism by further advancing productive technology. Under socialism, with its increasing levels of automation, an increasing proportion of goods would be distributed freely.

Marx did not believe in the elimination of most physical labor through technological advancements alone in a capitalist society, because he believed capitalism contained within it certain tendencies which countered increasing automation and prevented it from developing beyond a limited point, so that manual industrial labor could not be eliminated until the overthrow of capitalism. Some commentators on Marx have argued that at the time he wrote the Grundrisse, he thought that the collapse of capitalism due to advancing automation was inevitable despite these counter-tendencies, but that by the time of his major work Capital: Critique of Political Economy he had abandoned this view, and came to believe that capitalism could continually renew itself unless overthrown.

Post-Scarcity Anarchism

Murray Bookchin, in his 1971 essay collection Post-Scarcity Anarchism, outlines an economy based on social ecology, libertarian municipalism, and an abundance of fundamental resources, arguing that post-industrial societies have the potential to be developed into post-scarcity societies. For Bookchin, such development would enable "the fulfillment of the social and cultural potentialities latent in a technology of abundance".

Bookchin claims that the expanded production made possible by the technological advances of the twentieth century was pursued for market profit and at the expense of the needs of humans and of ecological sustainability. The accumulation of capital can no longer be considered a prerequisite for liberation, and the notion that obstructions such as the state, social hierarchy, and vanguard political parties are necessary in the struggle for freedom of the working classes can be dispelled as a myth.

Fiction

  • The Mars trilogy by Kim Stanley Robinson. Over three novels, Robinson charts the terraforming of Mars as a human colony and the establishment of a post-scarcity society.
  • The Culture novels by Iain M. Banks are centered on a post-scarcity economy where technology is advanced to such a degree that all production is automated, and there is no use for money or property (aside from personal possessions with sentimental value). People in the Culture are free to pursue their own interests in an open and socially-permissive society. The society has been described by some commentators as "communist-bloc" or "anarcho-communist". Banks' close friend and fellow science fiction writer Ken MacLeod has said that The Culture can be seen as a realization of Marx's communism, but adds that "however friendly he was to the radical left, Iain had little interest in relating the long-range possibility of utopia to radical politics in the here and now. As he saw it, what mattered was to keep the utopian possibility open by continuing technological progress, especially space development, and in the meantime to support whatever policies and politics in the real world were rational and humane."
  • The Rapture of the Nerds by Cory Doctorow and Charles Stross takes place in a post-scarcity society and involves "disruptive" technology. The title is a derogatory term for the technological singularity coined by SF author Ken MacLeod.
  • Con Blomberg's 1959 short story "Sales Talk" depicts a post-scarcity society in which society incentivizes consumption to reduce the burden of overproduction. To further reduce production, virtual reality is used to fulfill peoples' needs to create.
  • The 24th-century human society of Star Trek: The Next Generation and Star Trek: Deep Space Nine has been labeled a post-scarcity society due to the ability of the fictional "replicator" technology to synthesize a wide variety of goods nearly instantaneously, along with dialogue such as Captain Picard's statement in the film Star Trek: First Contact that "The acquisition of wealth is no longer the driving force of our lives. We work to better ourselves and the rest of humanity." By the 22nd century, money had been rendered obsolete on Earth.
  • Cory Doctorow's novel Walkaway presents a modern take on the idea of post-scarcity. With the advent of 3D printing – and especially the ability to use these to fabricate even better fabricators – and with machines that can search for and reprocess waste or discarded materials, the protagonists no longer have need of regular society for the basic essentials of life, such as food, clothing and shelter.

Technological utopianism

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Technological_utopianism

Technological utopianism (often called techno-utopianism or technoutopianism) is any ideology based on the premise that advances in science and technology could and should bring about a utopia, or at least help to fulfill one or another utopian ideal.

A techno-utopia is therefore an ideal society, in which laws, government, and social conditions are solely operating for the benefit and well-being of all its citizens, set in the near- or far-future, as advanced science and technology will allow these ideal living standards to exist; for example, post-scarcity, transformations in human nature, the avoidance or prevention of suffering and even the end of death.

Technological utopianism is often connected with other discourses presenting technologies as agents of social and cultural change, such as technological determinism or media imaginaries.

Techno-utopianism does not disregard the problems that technology may cause, but it firmly holds that technology allows humankind to make social, economic, political, and cultural advancements. Overall, technological utopianism views technology's impacts as overwhelmingly positive.

In the late 20th and early 21st centuries, several ideologies and movements, such as the cyberdelic counterculture, the Californian Ideology, transhumanism, and singularitarianism, have emerged promoting a form of techno-utopia as a reachable goal. Cultural critic Imre Szeman argues technological utopianism is an irrational social narrative because there is no evidence to support it. He concludes that it shows the extent to which modern societies place faith in narratives of progress and technology overcoming things, despite all evidence to the contrary.

History

From the 19th to mid-20th centuries

Karl Marx believed that science and democracy were the right and left hands of what he called the move from the realm of necessity to the realm of freedom. He argued that advances in science helped delegitimize the rule of kings and the power of the Christian Church.

19th-century liberals, socialists, and republicans often embraced techno-utopianism. Radicals like Joseph Priestley pursued scientific investigation while advocating democracy. Robert Owen, Charles Fourier, and Henri de Saint-Simon in the early 19th century inspired communalists with their visions of a future scientific and technological evolution of humanity using reason. Radicals seized on Darwinian evolution to validate the idea of social progress. Edward Bellamy's socialist utopia in Looking Backward, which inspired hundreds of socialist clubs in the late-19th-century United States and a national political party, was as highly technological as Bellamy could imagine. For Bellamy and the Fabian socialists, socialism was to be brought about as a painless corollary of industrial development.

Marx and Engels saw more pain and conflict involved, but agreed about the inevitable end. Marxists argued that the advance of technology laid the groundwork not only for the creation of a new society, with different property relations, but also for the emergence of new human beings reconnected to nature and themselves. At the top of the agenda for empowered proletarians was "to increase the total productive forces as rapidly as possible". The 19th and early 20th century Left, from social democrats to communists, were focused on industrialization, economic development and the promotion of reason, science, and the idea of progress.

Some technological utopians promoted eugenics. Holding that in studies of families, such as the Jukes and Kallikaks, science had proven that many traits such as criminality and alcoholism were hereditary, many advocated the sterilization of those displaying negative traits. Forcible sterilization programs were implemented in several states in the United States.

H.G. Wells in works such as The Shape of Things to Come promoted technological utopianism.

The horrors of the 20th century – namely Fascist and Communist dictatorships and the world wars – caused many to abandon optimism. The Holocaust, as Theodor Adorno underlined, seemed to shatter the ideal of Condorcet and other thinkers of the Enlightenment, which commonly equated scientific progress with social progress.

From late 20th and early 21st centuries

The Goliath of totalitarianism will be brought down by the David of the microchip.

— Ronald Reagan, The Guardian, 14 June 1989

A movement of techno-utopianism began to flourish again in the dot-com culture of the 1990s, particularly on the West Coast of the United States and especially around Silicon Valley. The Californian Ideology was a set of beliefs combining bohemian and anti-authoritarian attitudes from the counterculture of the 1960s with techno-utopianism and support for libertarian economic policies. It was reflected in, reported on, and even actively promoted in the pages of Wired magazine, which was founded in San Francisco in 1993 and served for a number of years as the "bible" of its adherents.

This form of techno-utopianism reflected a belief that technological change revolutionizes human affairs, and that digital technology in particular – of which the Internet was but a modest harbinger – would increase personal freedom by freeing the individual from the rigid embrace of bureaucratic big government. "Self-empowered knowledge workers" would render traditional hierarchies redundant; digital communications would allow them to escape the modern city, an "obsolete remnant of the industrial age".

Similar forms of "digital utopianism" have often entered the political messages of parties and social movements that point to the Web, or more broadly to new media, as harbingers of political and social change. Adherents claim that it transcends conventional "right/left" distinctions in politics by rendering politics obsolete. However, techno-utopianism has disproportionately attracted adherents from the libertarian right of the political spectrum; techno-utopians therefore often exhibit hostility toward government regulation and a belief in the superiority of the free-market system. Prominent "oracles" of techno-utopianism included George Gilder and Kevin Kelly, an editor of Wired who also published several books.

During the late 1990s dot-com boom, when the speculative bubble gave rise to claims that an era of "permanent prosperity" had arrived, techno-utopianism flourished, typically among the small percentage of the population who were employees of Internet startups and/or owned large quantities of high-tech stocks. With the subsequent crash, many of these dot-com techno-utopians had to rein in some of their beliefs in the face of the clear return of traditional economic reality.

In the late 1990s, and especially during the first decade of the 21st century, technorealism and techno-progressivism emerged among advocates of technological change as critical alternatives to techno-utopianism. However, technological utopianism persists in the 21st century as a result of new technological developments and their impact on society. For example, several technology journalists and social commentators, such as Mark Pesce, have interpreted the WikiLeaks phenomenon and the United States diplomatic cables leak in early December 2010 as a precursor to, or an incentive for, the creation of a techno-utopian transparent society. Cyber-utopianism, a term first coined by Evgeny Morozov, is another manifestation of this, in particular in relation to the Internet and social networking.

Principles

Bernard Gendron, a professor of philosophy at the University of Wisconsin–Milwaukee, defines the four principles of modern technological utopians in the late 20th and early 21st centuries as follows:

  1. We are presently undergoing a (post-industrial) revolution in technology;
  2. In the post-industrial age, technological growth will be sustained (at least);
  3. In the post-industrial age, technological growth will lead to the end of economic scarcity;
  4. The elimination of economic scarcity will lead to the elimination of every major social evil.

Media theorist Douglas Rushkoff presents several claims surrounding the basic principles of technological utopianism:

  1. Technology reflects and encourages the best aspects of human nature, fostering “communication, collaboration, sharing, helpfulness, and community.”
  2. Technology improves our interpersonal communication, relationships, and communities. Early Internet users shared their knowledge of the Internet with others around them.
  3. Technology democratizes society. The expansion of access to knowledge and skills led to the connection of people and information. The broadening of freedom of expression created “the online world...in which we are allowed to voice our own opinions.” The reduction of the inequalities of power and wealth meant that everyone has an equal status on the internet and is allowed to do as much as the next person.
  4. Technology inevitably progresses. The interactivity that came from the inventions of the TV remote control, video game joystick, computer mouse and computer keyboard allowed for much more progress.
  5. Unforeseen impacts of technology are positive. As more people discovered the Internet, they took advantage of being linked to millions of people, and turned the Internet into a social revolution. The government released it to the public, and its “social side effect… [became] its main feature.”
  6. Technology increases efficiency and consumer choice. The creation of the TV remote, video game joystick, and computer mouse liberated these technologies and allowed users to manipulate and control them, giving them many more choices.
  7. New technology can solve the problems created by old technology. Social networks and blogs were created out of the collapse of dot-com bubble businesses’ attempts to run pyramid schemes on users.

Criticisms

Critics claim that techno-utopianism's identification of social progress with scientific progress is a form of positivism and scientism. Critics of modern libertarian techno-utopianism point out that it tends to focus on "government interference" while dismissing the positive effects of the regulation of business. They also point out that it has little to say about the environmental impact of technology, and that its ideas have little relevance for much of the rest of the world, which is still relatively poor (see global digital divide).

In his 2010 study System Failure: Oil, Futurity, and the Anticipation of Disaster, Canada Research Chairholder in cultural studies Imre Szeman argues that technological utopianism is one of the social narratives that prevent people from acting on the knowledge they have concerning the effects of oil on the environment.

In a controversial article, "Techno-Utopians Are Mugged by Reality", The Wall Street Journal explores the idea of curtailing free speech by shutting down social media to stop violence. After looting spread across several British cities in 2011, British Prime Minister David Cameron argued that the government should have the ability to shut down social media during crime sprees so that the situation could be contained. A poll asked whether Twitter users would prefer to let the service be closed temporarily or to keep it open so they could chat about the television show The X Factor; the report showed that every tweet opted for The X Factor. The negative social effect highlighted here is that society is so attached to technology that it cannot be parted from it even for the greater good. While many techno-utopians would like to believe that digital technology serves the greater good, it can also be used to bring harm to the public.

Other criticisms of a techno-utopia concern the human element. Critics suggest that a techno-utopia may lessen human contact, leading to a more distant society. Another concern is the degree of reliance a society may place on its technologies in these techno-utopian settings. These criticisms are sometimes referred to as a technological anti-utopian view or a techno-dystopia.

Even today, the negative social effects of a technological utopia can be seen. Mediated communication such as phone calls, instant messaging, and text messaging are steps towards a utopian world in which one can easily contact another regardless of time or location. However, mediated communication removes many of the cues that help convey a message. As it stands today, most text, email, and instant messages offer fewer nonverbal cues about the speaker's feelings than face-to-face encounters do, so mediated communication can easily be misconstrued and the intended message not properly conveyed. With the absence of tone, body language, and environmental context, the chance of a misunderstanding is much higher, rendering the communication ineffective. Indeed, mediated technology can be seen from a dystopian view because it can be detrimental to effective interpersonal communication. These criticisms apply only to messages that are prone to misinterpretation, as not every text-based communication requires contextual cues, and the limitations of lacking tone and body language in text-based communication are likely to be mitigated by video and augmented-reality forms of digital communication.

 

Aspect-oriented programming

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Asp...