Sunday, August 1, 2021

Phenomenology (philosophy)

From Wikipedia, the free encyclopedia

Phenomenology (from Greek phainómenon "that which appears" and lógos "study") is the philosophical study of the structures of experience and consciousness. As a philosophical movement it was founded in the early years of the 20th century by Edmund Husserl and was later expanded upon by a circle of his followers at the universities of Göttingen and Munich in Germany. It then spread to France, the United States, and elsewhere, often in contexts far removed from Husserl's early work.

Phenomenology is not a unified movement; rather, different authors share a family resemblance while also differing in many significant ways. Gabriella Farina states:

A unique and final definition of phenomenology is dangerous and perhaps even paradoxical as it lacks a thematic focus. In fact, it is not a doctrine, nor a philosophical school, but rather a style of thought, a method, an open and ever-renewed experience having different results, and this may disorient anyone wishing to define the meaning of phenomenology.

Phenomenology, in Husserl's conception, is primarily concerned with the systematic reflection on and study of the structures of consciousness and the phenomena that appear in acts of consciousness. Phenomenology can be clearly differentiated from the Cartesian method of analysis which sees the world as objects, sets of objects, and objects acting and reacting upon one another.

Husserl's conception of phenomenology has been criticized and developed not only by him but also by students and colleagues such as Edith Stein, Max Scheler, Roman Ingarden, and Dietrich von Hildebrand, by existentialists such as Nicolai Hartmann, Gabriel Marcel, Maurice Merleau-Ponty, and Jean-Paul Sartre, by hermeneutic philosophers such as Martin Heidegger, Hans-Georg Gadamer, and Paul Ricoeur, by later French philosophers such as Jean-Luc Marion, Michel Henry, Emmanuel Levinas, and Jacques Derrida, by sociologists such as Alfred Schütz and Eric Voegelin, and by Christian philosophers, such as Dallas Willard.

Overview

In its most basic form, phenomenology attempts to create conditions for the objective study of topics usually regarded as subjective: consciousness and the content of conscious experiences such as judgements, perceptions, and emotions. Although phenomenology seeks to be scientific, it does not attempt to study consciousness from the perspective of clinical psychology or neurology. Instead, it seeks through systematic reflection to determine the essential properties and structures of experience.

There are several assumptions behind phenomenology that help explain its foundations:

  1. Phenomenologists reject the concept of objective research. They prefer bracketing assumptions through a process called phenomenological epoché.
  2. They believe that analyzing daily human behavior can provide one with a greater understanding of nature.
  3. They assert that persons should be explored, because persons can be understood through the unique ways they reflect the society they live in.
  4. Phenomenologists prefer to gather "capta", or conscious experience, rather than traditional data.
  5. They consider phenomenology to be oriented toward discovery, and therefore they research using methods that are far less restrictive than in other sciences.

Husserl derived many important concepts central to phenomenology from the works and lectures of his teachers, the philosophers and psychologists Franz Brentano and Carl Stumpf. An important element of phenomenology that Husserl borrowed from Brentano is intentionality (often described as "aboutness"), the notion that consciousness is always consciousness of something. The object of consciousness is called the intentional object, and this object is constituted for consciousness in many different ways, through, for instance, perception, memory, retention and protention, signification, etc. Throughout these different intentionalities, though they have different structures and different ways of being "about" the object, an object is still constituted as the identical object; consciousness is directed at the same intentional object in direct perception as it is in the immediately following retention of this object and the eventual remembering of it.

Though many of the phenomenological methods involve various reductions, phenomenology is, in essence, anti-reductionistic; the reductions are mere tools to better understand and describe the workings of consciousness, not to reduce any phenomenon to these descriptions. In other words, when a reference is made to a thing's essence or idea, or when the constitution of an identical coherent thing is specified by describing what one "really" sees as being only these sides and aspects, these surfaces, it does not mean that the thing is only and exclusively what is described here: the ultimate goal of these reductions is to understand how these different aspects are constituted into the actual thing as experienced by the person experiencing it. Phenomenology is a direct reaction to the psychologism and physicalism of Husserl's time.

Although previously employed by Georg Wilhelm Friedrich Hegel in his Phenomenology of Spirit, it was Husserl's adoption of this term (c. 1900) that propelled it into becoming the designation of a philosophical school. As a philosophical perspective, phenomenology is its method, though the specific meaning of the term varies according to how it is conceived by a given philosopher. As envisioned by Husserl, phenomenology is a method of philosophical inquiry that rejects the rationalist bias that has dominated Western thought since Plato in favor of a method of reflective attentiveness that discloses the individual's "lived experience." Loosely rooted in an epistemological device, with Sceptic roots, called epoché, Husserl's method entails the suspension of judgment while relying on the intuitive grasp of knowledge, free of presuppositions and intellectualizing. Sometimes depicted as the "science of experience," the phenomenological method is rooted in intentionality, i.e. Husserl's theory of consciousness (developed from Brentano). Intentionality represents an alternative to the representational theory of consciousness, which holds that reality cannot be grasped directly because it is available only through perceptions of reality that are representations of it in the mind. Husserl countered that consciousness is not "in" the mind; rather, consciousness is conscious of something other than itself (the intentional object), whether the object is a substance or a figment of imagination (i.e., the real processes associated with and underlying the figment). Hence the phenomenological method relies on the description of phenomena as they are given to consciousness, in their immediacy.

According to Maurice Natanson (1973, p. 63), "The radicality of the phenomenological method is both continuous and discontinuous with philosophy's general effort to subject experience to fundamental, critical scrutiny: to take nothing for granted and to show the warranty for what we claim to know." In practice, it entails an unusual combination of discipline and detachment to bracket theoretical explanations and second-hand information while determining one's "naïve" experience of the matter. (To "bracket" in this sense means to provisionally suspend or set aside some idea as a way to facilitate the inquiry by focusing only on its most significant components.) The phenomenological method serves to momentarily erase the world of speculation by returning the subject to his or her primordial experience of the matter, whether the object of inquiry is a feeling, an idea, or a perception. According to Husserl the suspension of belief in what we ordinarily take for granted or infer by conjecture diminishes the power of what we customarily embrace as objective reality. According to Rüdiger Safranski (1998, 72), "[Husserl's and his followers'] great ambition was to disregard anything that had until then been thought or said about consciousness or the world [while] on the lookout for a new way of letting the things [they investigated] approach them, without covering them up with what they already knew."

Martin Heidegger modified Husserl's conception of phenomenology because of what Heidegger perceived as Husserl's subjectivist tendencies. Whereas Husserl conceived humans as having been constituted by states of consciousness, Heidegger countered that consciousness is peripheral to the primacy of one's existence (i.e., the mode of being of Dasein), which cannot be reduced to one's consciousness of it. From this angle, one's state of mind is an "effect" rather than a determinant of existence, including those aspects of existence of which one is not conscious. By shifting the center of gravity from consciousness (psychology) to existence (ontology), Heidegger altered the subsequent direction of phenomenology. As one consequence of Heidegger's modification of Husserl's conception, phenomenology became increasingly relevant to psychoanalysis. Whereas Husserl gave priority to a depiction of consciousness that was fundamentally alien to the psychoanalytic conception of the unconscious, Heidegger offered a way to conceptualize experience that could accommodate those aspects of one's existence that lie on the periphery of sentient awareness.

Etymology

Phenomenology has at least three main meanings in philosophical history: one in the writings of G. W. F. Hegel, another in the writings of Edmund Husserl (c. 1900), and a third, succeeding Husserl's work, in the writings of his former research assistant Martin Heidegger (1927).

  • For G. W. F. Hegel, phenomenology is a philosophical (philosophischen) and scientific (wissenschaftliche) study of phenomena (what presents itself to us in conscious experience) as a means to finally grasp the absolute, logical, ontological and metaphysical Spirit (Absolute Spirit) that is essential to phenomena. This has been called dialectical phenomenology.
  • For Edmund Husserl, phenomenology is "the reflective study of the essence of consciousness as experienced from the first-person point of view." Phenomenology takes the intuitive experience of phenomena (whatever presents itself in phenomenological reflection) as its starting point and tries to extract from it the essential features of experiences and the essence of what we experience. When generalized to the essential features of any possible experience, this has been called transcendental phenomenology. Husserl's view was based on aspects of the work of Franz Brentano and was developed further by philosophers such as Maurice Merleau-Ponty, Max Scheler, Edith Stein, Dietrich von Hildebrand and Emmanuel Levinas.

Although the term "phenomenology" was used occasionally in the history of philosophy before Husserl, modern use ties it more explicitly to his particular method. Following is a list of important thinkers, in rough chronological order, who used the term "phenomenology" in a variety of ways, with brief comments on their contributions:

  • Friedrich Christoph Oetinger (1702–1782), German pietist, used the term for the study of the "divine system of relations".
  • Johann Heinrich Lambert (1728–1777), mathematician, physicist and philosopher, known for the theory of appearances underlying empirical knowledge.
  • Immanuel Kant (1724–1804), in the Critique of Pure Reason, distinguished between objects as phenomena, which are objects as shaped and grasped by human sensibility and understanding, and objects as things-in-themselves or noumena, which do not appear to us in space and time and about which we can make no legitimate judgments.
  • G. W. F. Hegel (1770–1831) challenged Kant's doctrine of the unknowable thing-in-itself, and declared that by knowing phenomena more fully we can gradually arrive at a consciousness of the absolute and spiritual truth of Divinity, most notably in his Phenomenology of Spirit, published in 1807.
  • Carl Stumpf (1848–1936), student of Brentano and mentor to Husserl, used "phenomenology" to refer to an ontology of sensory contents.
  • Edmund Husserl (1859–1938) established phenomenology at first as a kind of "descriptive psychology" and later as a transcendental and eidetic science of consciousness. He is considered to be the founder of contemporary phenomenology.
  • Max Scheler (1874–1928) developed further the phenomenological method of Edmund Husserl and extended it to include a reduction of the scientific method. He influenced the thinking of Pope John Paul II, Dietrich von Hildebrand, and Edith Stein.
  • Martin Heidegger (1889–1976) criticized Husserl's theory of phenomenology and attempted to develop a theory of ontology that led him to his original theory of Dasein, the non-dualistic human being.
  • Alfred Schütz (1899–1959) developed a phenomenology of the social world on the basis of everyday experience that has influenced major sociologists such as Harold Garfinkel, Peter Berger, and Thomas Luckmann.
  • Francisco Varela (1946–2001), Chilean philosopher and biologist, who developed the basis for experimental phenomenology and neurophenomenology.

Later usage is mostly based on or (critically) related to Husserl's introduction and use of the term. This branch of philosophy differs from others in that it tends to be more "descriptive" than "prescriptive".

Varieties

The Encyclopedia of Phenomenology (Kluwer Academic Publishers, 1997) features separate articles on the following seven types of phenomenology:

  1. Transcendental constitutive phenomenology studies how objects are constituted in transcendental consciousness, setting aside questions of any relation to the natural world.
  2. Naturalistic constitutive phenomenology studies how consciousness constitutes things in the world of nature, assuming with the natural attitude that consciousness is part of nature.
  3. Existential phenomenology studies concrete human existence, including our experience of free choice and/or action in concrete situations.
  4. Generative historicist phenomenology studies how meaning—as found in our experience—is generated in historical processes of collective experience over time.
  5. Genetic phenomenology studies the emergence/genesis of meanings of things within one's own stream of experience.
  6. Hermeneutical phenomenology (also hermeneutic phenomenology or post-phenomenology/postphenomenology elsewhere; see hermeneutics) studies interpretive structures of experience. This approach was introduced in Martin Heidegger's early work.
  7. Realistic phenomenology (also realist phenomenology elsewhere) studies the structure of consciousness and intentionality as "it occurs in a real world that is largely external to consciousness and not somehow brought into being by consciousness."

The contrast between "constitutive phenomenology" (German: konstitutive Phänomenologie; also static phenomenology (statische Phänomenologie) or descriptive phenomenology (beschreibende Phänomenologie)) and "genetic phenomenology" (genetische Phänomenologie; also phenomenology of genesis (Phänomenologie der Genesis)) is due to Husserl.

Modern scholarship also recognizes the existence of the following varieties: late Heidegger's transcendental hermeneutic phenomenology (see transcendental philosophy and a priori), Maurice Merleau-Ponty's embodied phenomenology (see embodied cognition), Michel Henry's material phenomenology (also based on embodied cognition), Alva Noë's analytic phenomenology (see analytic philosophy), J. L. Austin's linguistic phenomenology, and Paul Crowther's post-analytic phenomenology.

Concepts

Intentionality

Intentionality refers to the notion that consciousness is always the consciousness of something. The word itself should not be confused with the "ordinary" use of the word intentional, but should rather be taken as playing on the etymological roots of the word. Originally, intention referred to a "stretching out" ("in tension," from Latin intendere), and in this context it refers to consciousness "stretching out" towards its object. However, one should be careful with this image: there is not some consciousness first that, subsequently, stretches out to its object; rather, consciousness occurs as the simultaneity of a conscious act and its object.

Intentionality is often summed up as "aboutness." Whether this something that consciousness is about is in direct perception or in fantasy is inconsequential to the concept of intentionality itself; whatever consciousness is directed at, that is what consciousness is conscious of. This means that the object of consciousness doesn't have to be a physical object apprehended in perception: it can just as well be a fantasy or a memory. Consequently, these "structures" of consciousness, i.e., perception, memory, fantasy, etc., are called intentionalities.

The term "intentionality" originated with the Scholastics in the medieval period and was resurrected by Brentano who in turn influenced Husserl's conception of phenomenology, who refined the term and made it the cornerstone of his theory of consciousness. The meaning of the term is complex and depends entirely on how it is conceived by a given philosopher. The term should not be confused with "intention" or the psychoanalytic conception of unconscious "motive" or "gain".

Intuition

Intuition in phenomenology refers to cases where the intentional object is directly present to the intentionality at play; if the intention is "filled" by the direct apprehension of the object, you have an intuited object. Having a cup of coffee in front of you, for instance, seeing it, feeling it, or even imagining it – these are all filled intentions, and the object is then intuited. The same goes for the apprehension of mathematical formulae or a number. If the object is not given directly as referred to, it is not intuited but still intended, though emptily. Examples of empty intentions can be signitive intentions – intentions that only imply or refer to their objects.

Evidence

In everyday language, we use the word evidence to signify a special sort of relation between a state of affairs and a proposition: State A is evidence for the proposition "A is true." In phenomenology, however, the concept of evidence is meant to signify the "subjective achievement of truth." This is not an attempt to reduce the objective sort of evidence to subjective "opinion," but rather an attempt to describe the structure of having something present in intuition with the addition of having it present as intelligible: "Evidence is the successful presentation of an intelligible object, the successful presentation of something whose truth becomes manifest in the evidencing itself."

Noesis and noema

This pair of terms, quite common in Husserl's phenomenology and derived from the Greek nous (mind), designates respectively the real content, noesis, and the ideal content, noema, of an intentional act (an act of consciousness). The noesis is the part of the act that gives it a particular sense or character (as in judging or perceiving something, loving or hating it, accepting or rejecting it, and so on). This is real in the sense that it is actually part of what takes place in the consciousness (or psyche) of the subject of the act. The noesis is always correlated with a noema; for Husserl, the full noema is a complex ideal structure comprising at least a noematic sense and a noematic core. The correct interpretation of what Husserl meant by the noema has long been controversial, but the noematic sense is generally understood as the ideal meaning of the act and the noematic core as the act's referent or object as it is meant in the act. One element of controversy is whether this noematic object is the same as the actual object of the act (assuming it exists) or is some kind of ideal object.

Empathy and intersubjectivity

In phenomenology, empathy refers to the experience of one's own body as another. While we often identify others with their physical bodies, this type of phenomenology requires that we focus on the subjectivity of the other, as well as our intersubjective engagement with them. In Husserl's original account, this was done by a sort of apperception built on the experiences of your own lived-body. The lived body is your own body as experienced by yourself, as yourself. Your own body manifests itself to you mainly as your possibilities of acting in the world. It is what lets you reach out and grab something, for instance, but it also, and more importantly, allows for the possibility of changing your point of view. This helps you differentiate one thing from another by the experience of moving around it, seeing new aspects of it (often referred to as making the absent present and the present absent), and still retaining the notion that this is the same thing that you saw other aspects of just a moment ago (it is identical). Your body is also experienced as a duality, both as object (you can touch your own hand) and as your own subjectivity (you experience being touched).

The experience of your own body as your own subjectivity is then applied to the experience of another's body, which, through apperception, is constituted as another subjectivity. You can thus recognise the Other's intentions, emotions, etc. This experience of empathy is important in the phenomenological account of intersubjectivity. In phenomenology, intersubjectivity constitutes objectivity (i.e., what you experience as objective is experienced as being intersubjectively available – available to all other subjects. This does not imply that objectivity is reduced to subjectivity nor does it imply a relativist position, cf. for instance intersubjective verifiability).

In the experience of intersubjectivity, one also experiences oneself as being a subject among other subjects, and one experiences oneself as existing objectively for these Others; one experiences oneself as the noema of Others' noeses, or as a subject in another's empathic experience. As such, one experiences oneself as objectively existing subjectivity. Intersubjectivity is also a part in the constitution of one's lifeworld, especially as "homeworld."

Lifeworld

The lifeworld (German: Lebenswelt) is the "world" each one of us lives in. One could call it the "background" or "horizon" of all experience, and it is that on which each object stands out as itself (as different) and with the meaning it can only hold for us. The lifeworld is both personal and intersubjective (it is then called a "homeworld"), and, as such, it does not enclose each one of us in a solus ipse.

Husserl's Logical Investigations (1900/1901)

In the first edition of the Logical Investigations, still under the influence of Brentano, Husserl describes his position as "descriptive psychology." Husserl analyzes the intentional structures of mental acts and how they are directed at both real and ideal objects. The first volume of the Logical Investigations, the Prolegomena to Pure Logic, begins with a devastating critique of psychologism, i.e., the attempt to subsume the a priori validity of the laws of logic under psychology. Husserl establishes a separate field for research in logic, philosophy, and phenomenology, independent of the empirical sciences.

"Pre-reflective self-consciousness" is Shaun Gallagher and Dan Zahavi's term for Husserl's (1900/1901) idea that self-consciousness always involves a self-appearance or self-manifestation (German: Für-sich-selbst-erscheinens) prior to self-reflection, and his idea that the fact that "an appropriate train of sensations or images is experienced, and is in this sense conscious, does not and cannot mean that this is the object of an act of consciousness, in the sense that a perception, a presentation or a judgment is directed upon it".

Husserl's Ideas (1913)

In 1913, some years after the publication of the Logical Investigations, Husserl published Ideas: General Introduction to Pure Phenomenology, a work which introduced some key elaborations that led him to the distinction between the act of consciousness (noesis) and the phenomena at which it is directed (the noemata).

  • "noetic" refers to the intentional act of consciousness (believing, willing, etc.)
  • "noematic" refers to the object or content (noema), which appears in the noetic acts (the believed, wanted, hated, and loved, etc.).

What we observe is not the object as it is in itself, but how, and inasmuch as, it is given in the intentional acts. Knowledge of essences would only be possible by "bracketing" all assumptions about the existence of an external world and the inessential (subjective) aspects of how the object is concretely given to us. This procedure Husserl called epoché.

Husserl concentrated more on the ideal, essential structures of consciousness. As he wanted to exclude any hypothesis on the existence of external objects, he introduced the method of phenomenological reduction to eliminate them. What was left over was the pure transcendental ego, as opposed to the concrete empirical ego.

Transcendental phenomenology is the study of the essential structures that are left in pure consciousness: this amounts in practice to the study of the noemata and the relations among them.

Transcendental phenomenologists include Oskar Becker, Aron Gurwitsch, and Alfred Schütz.

The philosopher Theodor Adorno criticised Husserl's concept of phenomenological epistemology in his metacritique Against Epistemology, which is anti-foundationalist in its stance.

Realism

After Husserl's publication of the Ideas in 1913, many phenomenologists took a critical stance towards his new theories. Especially the members of the Munich group distanced themselves from his new transcendental phenomenology and preferred the earlier realist phenomenology of the first edition of the Logical Investigations.

Realist phenomenologists include Edith Stein, Adolf Reinach, Alexander Pfänder, Johannes Daubert [de], Max Scheler, Roman Ingarden, Nicolai Hartmann, and Dietrich von Hildebrand.

Existentialism

Existential phenomenology differs from transcendental phenomenology by its rejection of the transcendental ego. Merleau-Ponty objects to the ego's transcendence of the world, which for Husserl leaves the world spread out and completely transparent before consciousness. Heidegger thinks of a conscious being as always already in the world. Transcendence is maintained in existential phenomenology to the extent that the method of phenomenology must take a presuppositionless starting point – transcending claims about the world arising from, for example, natural or scientific attitudes or theories of the ontological nature of the world.

While Husserl thought of philosophy as a scientific discipline that had to be founded on a phenomenology understood as epistemology, Martin Heidegger held a radically different view. Heidegger himself states their differences this way:

For Husserl, the phenomenological reduction is the method of leading phenomenological vision from the natural attitude of the human being whose life is involved in the world of things and persons back to the transcendental life of consciousness and its noetic-noematic experiences, in which objects are constituted as correlates of consciousness. For us, phenomenological reduction means leading phenomenological vision back from the apprehension of a being, whatever may be the character of that apprehension, to the understanding of the Being of this being (projecting upon the way it is unconcealed).

According to Heidegger, philosophy was not at all a scientific discipline, but more fundamental than science itself. According to him, science is only one way of knowing the world, with no special access to truth. Furthermore, the scientific mindset itself is built on a much more "primordial" foundation of practical, everyday knowledge. Husserl was skeptical of this approach, which he regarded as quasi-mystical, and it contributed to the divergence in their thinking.

Instead of taking phenomenology as prima philosophia or a foundational discipline, Heidegger took it as a metaphysical ontology: "being is the proper and sole theme of philosophy... this means that philosophy is not a science of beings but of being." Yet to confuse phenomenology and ontology is an obvious error. Phenomena are not the foundation or Ground of Being. Neither are they appearances, for, as Heidegger argues in Being and Time, an appearance is "that which shows itself in something else," while a phenomenon is "that which shows itself in itself."

While for Husserl, in the epoché, being appeared only as a correlate of consciousness, for Heidegger being is the starting point. While for Husserl we would have to abstract from all concrete determinations of our empirical ego, to be able to turn to the field of pure consciousness, Heidegger claims that "the possibilities and destinies of philosophy are bound up with man's existence, and thus with temporality and with historicality."

However, ontological being and existential being are different categories, so Heidegger's conflation of these categories is, according to Husserl's view, the root of Heidegger's error. Husserl charged Heidegger with raising the question of ontology but failing to answer it, instead switching the topic to the Dasein, the only being for whom Being is an issue. That is neither ontology nor phenomenology, according to Husserl, but merely abstract anthropology. In other words, as a non-existentialist searching for essences, Husserl dismissed as "abstract anthropology" the existentialism implicit in Heidegger's distinction between beings qua existents, as things in reality, and their Being as it unfolds in Dasein's own reflections on its being-in-the-world, wherein being becomes present to us, that is, is unconcealed.

Existential phenomenologists include: Martin Heidegger (1889–1976), Hannah Arendt (1906–1975), Karl Jaspers (1883–1969), Emmanuel Levinas (1906–1995), Gabriel Marcel (1889–1973), Jean-Paul Sartre (1905–1980), Paul Ricoeur (1913–2005) and Maurice Merleau-Ponty (1908–1961).

Eastern thought

Some researchers in phenomenology (in particular in reference to Heidegger's legacy) see possibilities of establishing dialogues with traditions of thought outside of so-called Western philosophy, particularly with respect to East Asian thinking, despite perceived differences between "Eastern" and "Western" thought. Furthermore, it has been claimed that a number of elements within phenomenology (mainly Heidegger's thought) have some resonance with Eastern philosophical ideas, particularly with Zen Buddhism and Taoism. According to Tomonobu Imamichi, the concept of Dasein was inspired – although Heidegger remained silent on this – by Okakura Kakuzo's concept of das-in-der-Welt-sein (being in the world) expressed in The Book of Tea to describe Zhuangzi's philosophy, which Imamichi's teacher had offered to Heidegger in 1919, after having studied with him the year before.

There are also recent signs of the reception of phenomenology (and Heidegger's thought in particular) within scholarly circles focused on studying the impetus of metaphysics in the history of ideas in Islam and Early Islamic philosophy such as in the works of the Lebanese philosopher Nader El-Bizri; perhaps this is tangentially due to the indirect influence of the tradition of the French Orientalist and phenomenologist Henri Corbin, and later accentuated through El-Bizri's dialogues with the Polish phenomenologist Anna-Teresa Tymieniecka.

In addition, the work of Jim Ruddy in the field of comparative philosophy, combined the concept of "transcendental ego" in Husserl's phenomenology with the concept of the primacy of self-consciousness in the work of Sankaracharya. In the course of this work, Ruddy uncovered a wholly new eidetic phenomenological science, which he called "convergent phenomenology." This new phenomenology takes over where Husserl left off, and deals with the constitution of relation-like, rather than merely thing-like, or "intentional" objectivity.

Approaches to technology

James Moor has argued that computers show up policy vacuums that require new thinking and the establishment of new policies. Others have argued that the resources provided by classical ethical theory, such as utilitarianism, consequentialism, and deontological ethics, are more than enough to deal with all the ethical issues emerging from our design and use of information technology.

For the phenomenologist the 'impact view' of technology as well as the constructivist view of the technology/society relationships is valid but not adequate (Heidegger 1977, Borgmann 1985, Winograd and Flores 1987, Ihde 1990, Dreyfus 1992, 2001). They argue that these accounts of technology, and the technology/society relationship, posit technology and society as if speaking about the one does not immediately and already draw upon the other for its ongoing sense or meaning. For the phenomenologist, society and technology co-constitute each other; they are each other's ongoing condition, or possibility for being what they are. For them technology is not just the artifact. Rather, the artifact already emerges from a prior 'technological' attitude towards the world (Heidegger 1977).

Heidegger's

For Heidegger the essence of technology is the way of being of modern humans—a way of conducting themselves towards the world—that sees the world as something to be ordered and shaped in line with projects, intentions and desires—a 'will to power' that manifests itself as a 'will to technology'. Heidegger claims that there were other times in human history, a pre-modern time, when humans did not orient themselves towards the world in a technological way, that is, did not treat it simply as a resource for their purposes.

However, according to Heidegger this 'pre-technological' age (or mood) is one where humans' relation with the world and artifacts, their way of being disposed, was poetic and aesthetic rather than technological (enframing). There are many who disagree with Heidegger's account of the modern technological attitude as the 'enframing' of the world. For example, Andrew Feenberg argues that Heidegger's account of modern technology is not borne out in contemporary everyday encounters with technology. Christian Fuchs has written on the anti-Semitism rooted in Heidegger's view of technology.

Dreyfus'

In critiquing the artificial intelligence (AI) programme, Hubert Dreyfus (1992) argues that the way skill development came to be understood in the past was wrong. This, he argues, is the model that the early artificial intelligence community uncritically adopted. In opposition to this view, he argues, with Heidegger, that what we observe when we learn a new skill in everyday practice is in fact the opposite. We most often start with explicit rules or preformulated approaches and then move to a multiplicity of particular cases, as we become an expert. His argument draws directly on Heidegger's account in "Being and Time" of humans as beings that are always already situated in-the-world. As humans 'in-the-world', we are already experts at going about everyday life, at dealing with the subtleties of every particular situation; that is why everyday life seems so obvious. Thus, the intricate expertise of everyday activity is forgotten and taken for granted by AI as an assumed starting point. What Dreyfus highlighted in his critique of AI was the fact that technology (AI algorithms) does not make sense by itself. It is the assumed, and forgotten, horizon of everyday practice that makes technological devices and solutions show up as meaningful. If we are to understand technology we need to 'return' to the horizon of meaning that made it show up as the artifacts we need, want and desire. We need to consider how these technologies reveal (or disclose) us.

 

Protein dynamics

From Wikipedia, the free encyclopedia

Proteins are generally thought to adopt unique structures determined by their amino acid sequences. However, proteins are not strictly static objects, but rather populate ensembles of (sometimes similar) conformations. Transitions between these states occur on a variety of length scales (tenths of Å to nm) and time scales (ns to s), and have been linked to functionally relevant phenomena such as allosteric signaling and enzyme catalysis.

The study of protein dynamics is most directly concerned with the transitions between these states, but can also involve the nature and equilibrium populations of the states themselves. These two perspectives—kinetics and thermodynamics, respectively—can be conceptually synthesized in an "energy landscape" paradigm: highly populated states and the kinetics of transitions between them can be described by the depths of energy wells and the heights of energy barriers, respectively.
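The well-depth/barrier-height picture can be made quantitative with Boltzmann statistics and a transition-state-style rate estimate. The following sketch is illustrative only; the free-energy values and the Eyring-like prefactor are hypothetical, not taken from the article:

```python
import math

# Two conformational states A and B separated by a free-energy barrier,
# treated with Boltzmann statistics and a transition-state-theory-style
# rate estimate. All numerical values are hypothetical.
R = 8.314e-3       # gas constant, kJ/(mol*K)
T = 300.0          # temperature, K
dG = 5.0           # free-energy difference G_B - G_A, kJ/mol (well depths)
dG_barrier = 40.0  # barrier height for A -> B, kJ/mol

# Relative equilibrium populations follow from the well depths...
p_B_over_A = math.exp(-dG / (R * T))

# ...while the transition rate is governed by the barrier height
# (Eyring-like prefactor kB*T/h ~ 6.2e12 s^-1 at 300 K).
prefactor = 6.2e12
k_A_to_B = prefactor * math.exp(-dG_barrier / (R * T))

print(f"population ratio B/A: {p_B_over_A:.3f}")
print(f"rate A->B: {k_A_to_B:.2e} s^-1")
```

Note how the two quantities decouple: well depths set how much of each state is populated at equilibrium, while barrier heights alone set how quickly the states interconvert.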

Kinesin walking on a microtubule. It is a molecular biological machine that uses protein domain dynamics on nanoscales

Local flexibility: atoms and residues

Portions of protein structures often deviate from the equilibrium state. Some such excursions are harmonic, such as stochastic fluctuations of chemical bonds and bond angles. Others are anharmonic, such as sidechains that jump between separate discrete energy minima, or rotamers.

Evidence for local flexibility is often obtained from NMR spectroscopy. Flexible and potentially disordered regions of a protein can be detected using the random coil index. Flexibility in folded proteins can be identified by analyzing the spin relaxation of individual atoms in the protein. Flexibility can also be observed in very high-resolution electron density maps produced by X-ray crystallography, particularly when diffraction data is collected at room temperature instead of the traditional cryogenic temperature (typically near 100 K). Information on the frequency distribution and dynamics of local protein flexibility can be obtained using Raman and optical Kerr-effect spectroscopy in the terahertz frequency domain.
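One simple quantitative handle on local flexibility in crystal structures is the per-atom B-factor, which encodes an atom's mean-square displacement. The sketch below uses made-up B-factor values; the conversion is the standard isotropic relation B = (8π²/3)·⟨u²⟩:

```python
import math

# Hypothetical per-atom B-factors (in A^2), e.g. as parsed from a PDB file.
# The isotropic B-factor relates to mean-square displacement <u^2> via
# B = (8*pi^2/3) * <u^2>, so RMSF = sqrt(3*B / (8*pi^2)).
b_factors = [12.5, 18.0, 45.2, 80.1]

def rmsf_from_b(b):
    """Root-mean-square fluctuation (A) from an isotropic B-factor (A^2)."""
    return math.sqrt(3.0 * b / (8.0 * math.pi ** 2))

for b in b_factors:
    print(f"B = {b:5.1f} A^2 -> RMSF = {rmsf_from_b(b):.2f} A")
```

Larger B-factors thus translate directly into larger positional fluctuations, which is why flexible loops and termini typically show elevated B-factors relative to the protein core.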

Regional flexibility: intra-domain multi-residue coupling

A network of alternative conformations in catalase (Protein Data Bank code: 1gwe) with diverse properties. Multiple phenomena define the network: van der Waals interactions (blue dots and line segments) between sidechains, a hydrogen bond (dotted green line) through a partial-occupancy water (brown), coupling through the locally mobile backbone (black), and perhaps electrostatic forces between the Lys (green) and nearby polar residues (blue: Glu, yellow: Asp, purple: Ser). This particular network is distal from the active site and is therefore putatively not critical for function.

Many residues are in close spatial proximity in protein structures. This is true for most residues that are contiguous in the primary sequence, but also for many that are distal in sequence yet are brought into contact in the final folded structure. Because of this proximity, these residues' energy landscapes become coupled through various biophysical phenomena such as hydrogen bonds, ionic bonds, and van der Waals interactions (see figure). Transitions between states for such sets of residues therefore become correlated.

This is perhaps most obvious for surface-exposed loops, which often shift collectively to adopt different conformations in different crystal structures (see figure). However, coupled conformational heterogeneity is also sometimes evident in secondary structure. For example, consecutive residues and residues offset by 4 in the primary sequence often interact in α helices. Also, residues offset by 2 in the primary sequence point their sidechains toward the same face of β sheets and are close enough to interact sterically, as are residues on adjacent strands of the same β sheet. Some of these conformational changes are induced by post-translational modifications in protein structure, such as phosphorylation and methylation.
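The sequence offsets described above can be enumerated directly. This short sketch uses a made-up ten-residue sequence to list the i/i+4 pairs that can interact in an α-helix and the i/i+2 pairs whose sidechains point toward the same face of a β-strand:

```python
# Illustration of the sequence offsets mentioned above: in an alpha-helix,
# residue i packs against residue i+4; in a beta-strand, sidechains of
# residues i and i+2 point toward the same face. The sequence is
# hypothetical, chosen only for the example.
seq = "MKVLAGQETF"

# (index_i, index_j, residue_i, residue_j) tuples, zero-based indices.
helix_contacts = [(i, i + 4, seq[i], seq[i + 4]) for i in range(len(seq) - 4)]
same_face_beta = [(i, i + 2, seq[i], seq[i + 2]) for i in range(len(seq) - 2)]

print("helix i/i+4 pairs:", helix_contacts[:3])
print("beta same-face pairs:", same_face_beta[:3])
```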

An "ensemble" of 44 crystal structures of hen egg white lysozyme from the Protein Data Bank, showing that different crystallization conditions lead to different conformations for various surface-exposed loops and termini (red arrows).

When these coupled residues form pathways linking functionally important parts of a protein, they may participate in allosteric signaling. For example, when a molecule of oxygen binds to one subunit of the hemoglobin tetramer, that information is allosterically propagated to the other three subunits, thereby enhancing their affinity for oxygen. In this case, the coupled flexibility in hemoglobin allows for cooperative oxygen binding, which is physiologically useful because it allows rapid oxygen loading in lung tissue and rapid oxygen unloading in oxygen-deprived tissues (e.g. muscle).
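Cooperative oxygen binding of the kind described here is commonly summarized by the Hill equation. The sketch below uses typical textbook values for hemoglobin (P50 ≈ 26 mmHg, Hill coefficient ≈ 2.8); the numbers are illustrative, not taken from the article:

```python
# Cooperative binding summarized with the Hill equation:
# saturation Y = pO2^n / (P50^n + pO2^n). A Hill coefficient n > 1
# reflects the allosteric coupling between subunits.
P50 = 26.0   # pO2 at half-saturation, mmHg (typical textbook value)
n = 2.8      # Hill coefficient for hemoglobin (cooperative, n > 1)

def saturation(pO2, p50=P50, hill_n=n):
    """Fractional oxygen saturation at a given partial pressure (mmHg)."""
    return pO2 ** hill_n / (p50 ** hill_n + pO2 ** hill_n)

# Near-complete loading at lung pO2 (~100 mmHg), substantial unloading
# at working-muscle pO2 (~20 mmHg):
print(f"lungs:  {saturation(100.0):.2f}")
print(f"muscle: {saturation(20.0):.2f}")
```

The steep sigmoidal transition between those two pressures is exactly the physiological benefit described above: a non-cooperative binder (n = 1) would release far less of its oxygen over the same pressure drop.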

Global flexibility: multiple domains

The presence of multiple domains in proteins gives rise to a great deal of flexibility and mobility, leading to protein domain dynamics. Domain motions can be inferred by comparing different structures of a protein (as in the Database of Molecular Motions), or they can be directly observed using spectra measured by neutron spin echo spectroscopy. They can also be suggested by sampling in extensive molecular dynamics trajectories and principal component analysis. Domain motions are important for catalysis, regulatory activity, the transport of metabolites, the formation of protein assemblies, and cellular locomotion.

One of the largest observed domain motions is the 'swivelling' mechanism in pyruvate phosphate dikinase. The phosphohistidine domain swivels between two states in order to bring a phosphate group from the active site of the nucleotide binding domain to that of the phosphoenolpyruvate/pyruvate domain. The phosphate group is moved over a distance of 45 Å by a domain motion of about 100 degrees around a single residue. In enzymes, the closure of one domain onto another captures a substrate by induced fit, allowing the reaction to take place in a controlled way. A detailed analysis by Gerstein led to the classification of two basic types of domain motion: hinge and shear. Only a relatively small portion of the chain, namely the inter-domain linker and side chains, undergoes significant conformational change upon domain rearrangement.
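The principal component analysis route mentioned above can be sketched on a synthetic "trajectory": a dominant collective motion is planted along a random direction and then recovered by diagonalizing the coordinate covariance matrix. Everything about the data below is fabricated purely for illustration:

```python
import numpy as np

# Minimal sketch of extracting a dominant collective motion from a set of
# structures (or MD snapshots) by principal component analysis. The
# "trajectory" is synthetic: 50 frames of 30 atoms oscillating along one
# planted direction plus noise, standing in for a real hinge motion.
rng = np.random.default_rng(0)
n_frames, n_atoms = 50, 30
mode = rng.normal(size=3 * n_atoms)          # hypothetical hinge direction
mode /= np.linalg.norm(mode)
amplitudes = np.sin(np.linspace(0, 2 * np.pi, n_frames))
frames = (amplitudes[:, None] * mode[None, :]
          + 0.05 * rng.normal(size=(n_frames, 3 * n_atoms)))

# Center the coordinates and diagonalize the covariance matrix.
centered = frames - frames.mean(axis=0)
cov = centered.T @ centered / (n_frames - 1)
eigvals, eigvecs = np.linalg.eigh(cov)       # eigenvalues in ascending order

# The top eigenvector should recover the planted motion.
top = eigvecs[:, -1]
overlap = abs(np.dot(top, mode))
explained = eigvals[-1] / eigvals.sum()
print(f"overlap with planted mode: {overlap:.3f}")
print(f"variance explained by PC1: {explained:.2f}")
```

With real trajectories the same recipe is applied to Cartesian coordinates after superposition onto a reference structure; the leading principal components then often correspond to hinge or shear motions of whole domains.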

Hinges by secondary structures

A study by Hayward found that the termini of α-helices and β-sheets form hinges in a large number of cases. Many hinges were found to involve two secondary structure elements acting like the hinges of a door, allowing an opening and closing motion to occur. This can arise when two neighbouring strands within a β-sheet situated in one domain diverge as they join the other domain, the two resulting termini forming the bending regions between the two domains. α-helices that preserve their hydrogen-bonding network when bent were found to behave as mechanical hinges, storing 'elastic energy' that drives the closure of domains for rapid capture of a substrate.

Helical to extended conformation

The interconversion of helical and extended conformations at the site of a domain boundary is not uncommon. In calmodulin, torsion angles change for five residues in the middle of a domain-linking α-helix. The helix is split into two smaller, almost perpendicular helices separated by four residues of an extended strand.

Shear motions

Shear motions involve a small sliding movement of domain interfaces, controlled by the amino acid side chains within the interface. Proteins displaying shear motions often have a layered architecture: stacking of secondary structures. The interdomain linker has merely the role of keeping the domains in close proximity.

Domain motion and functional dynamics in enzymes

The analysis of the internal dynamics of structurally different, but functionally similar enzymes has highlighted a common relationship between the positioning of the active site and the two principal protein sub-domains. In fact, for several members of the hydrolase superfamily, the catalytic site is located close to the interface separating the two principal quasi-rigid domains. Such positioning appears instrumental for maintaining the precise geometry of the active site, while allowing for an appreciable functionally oriented modulation of the flanking regions resulting from the relative motion of the two sub-domains.

Implications for macromolecular evolution

Evidence suggests that protein dynamics are important for function, e.g. enzyme catalysis in DHFR, yet they are also posited to facilitate the acquisition of new functions by molecular evolution. This argument suggests that proteins have evolved to have stable, mostly unique folded structures, but the unavoidable residual flexibility leads to some degree of functional promiscuity, which can be amplified/harnessed/diverted by subsequent mutations.

However, there is growing awareness that intrinsically unstructured proteins are quite prevalent in eukaryotic genomes, casting further doubt on the simplest interpretation of Anfinsen's dogma: "sequence determines structure (singular)". In effect, the new paradigm is characterized by the addition of two caveats: "sequence and cellular environment determine structural ensemble".

 

Nanorobotics

From Wikipedia, the free encyclopedia

Nanorobotics is an emerging technology field creating machines or robots whose components are at or near the scale of a nanometer (10⁻⁹ meters). More specifically, nanorobotics (as opposed to microrobotics) refers to the nanotechnology engineering discipline of designing and building nanorobots, with devices ranging in size from 0.1 to 10 micrometres and constructed of nanoscale or molecular components. The terms nanobot, nanoid, nanite, nanomachine, or nanomite have also been used to describe such devices currently under research and development.

Nanomachines are largely in the research and development phase, but some primitive molecular machines and nanomotors have been tested. An example is a sensor having a switch approximately 1.5 nanometers across, able to count specific molecules in the chemical sample. The first useful applications of nanomachines may be in nanomedicine. For example, biological machines could be used to identify and destroy cancer cells. Another potential application is the detection of toxic chemicals, and the measurement of their concentrations, in the environment. Rice University has demonstrated a single-molecule car developed by a chemical process and including Buckminsterfullerenes (buckyballs) for wheels. It is actuated by controlling the environmental temperature and by positioning a scanning tunneling microscope tip.

Another definition is a robot that allows precise interactions with nanoscale objects, or can manipulate with nanoscale resolution. Such devices are more closely related to microscopy or scanning probe microscopy than to the description of nanorobots as molecular machines. Under the microscopy definition, even a large apparatus such as an atomic force microscope can be considered a nanorobotic instrument when configured to perform nanomanipulation. From this viewpoint, macroscale robots or microrobots that can move with nanoscale precision can also be considered nanorobots.

Nanorobotics theory

According to Richard Feynman, it was his former graduate student and collaborator Albert Hibbs who originally suggested to him (circa 1959) the idea of a medical use for Feynman's theoretical micro-machines. Hibbs suggested that certain repair machines might one day be reduced in size to the point that it would, in theory, be possible to (as Feynman put it) "swallow the surgeon". The idea was incorporated into Feynman's 1959 essay There's Plenty of Room at the Bottom.

Since nano-robots would be microscopic in size, it would probably be necessary for very large numbers of them to work together to perform microscopic and macroscopic tasks. These nano-robot swarms, both those unable to replicate (as in utility fog) and those able to replicate unconstrained in the natural environment (as in grey goo and synthetic biology), are found in many science fiction stories, such as the Borg nano-probes in Star Trek and The Outer Limits episode "The New Breed". Some proponents of nano-robotics, in reaction to the grey goo scenarios that they earlier helped to propagate, hold the view that nano-robots able to replicate outside of a restricted factory environment do not form a necessary part of a purported productive nanotechnology, and that the process of self-replication, were it ever to be developed, could be made inherently safe. They further assert that their current plans for developing and using molecular manufacturing do not in fact include free-foraging replicators.

A detailed theoretical discussion of nanorobotics, including specific design issues such as sensing, power, communication, navigation, manipulation, locomotion, and onboard computation, has been presented in the medical context of nanomedicine by Robert Freitas. Some of these discussions remain at the level of unbuildable generality and do not approach the level of detailed engineering.

Legal and ethical implications

Open technology

A document with a proposal on nanobiotech development using open design technology methods, as in open-source hardware and open-source software, has been addressed to the United Nations General Assembly. According to the document sent to the United Nations, in the same way that open source has in recent years accelerated the development of computer systems, a similar approach should benefit society at large and accelerate nanorobotics development. The use of nanobiotechnology should be established as a human heritage for the coming generations, and developed as an open technology based on ethical practices for peaceful purposes. Open technology is stated as a fundamental key for such an aim.

Nanorobot race

In the same way that technology research and development drove the space race and the nuclear arms race, a race for nanorobots is occurring. There is ample reason to include nanorobots among the emerging technologies: large corporations such as General Electric, Hewlett-Packard, Synopsys, Northrop Grumman and Siemens have recently been working on the development and research of nanorobots; surgeons are getting involved and starting to propose ways to apply nanorobots to common medical procedures; universities and research institutes have been granted government funds exceeding $2 billion for research into nanodevices for medicine; and bankers are investing strategically with the intent to acquire, in advance, rights and royalties on future nanorobot commercialisation. Some aspects of nanorobot litigation, and related issues linked to monopoly, have already arisen. A large number of patents on nanorobots have been granted recently, mostly to patent agents, companies specializing solely in building patent portfolios, and lawyers. After a long series of patents and, eventually, litigation (see, for example, the invention of radio or the war of currents), emerging fields of technology tend to become monopolies, normally dominated by large corporations.

Manufacturing approaches

Manufacturing nanomachines assembled from molecular components is a very challenging task. Because of this level of difficulty, many engineers and scientists continue working cooperatively across multidisciplinary approaches to achieve breakthroughs in this new area of development. The following distinct techniques are currently applied towards manufacturing nanorobots:

Biochip

The joint use of nanoelectronics, photolithography, and new biomaterials provides a possible approach to manufacturing nanorobots for common medical uses, such as surgical instrumentation, diagnosis, and drug delivery. This method of manufacturing at the nanotechnology scale has been in use in the electronics industry since 2008. Practical nanorobots could thus be integrated as nanoelectronic devices, allowing tele-operation and advanced capabilities for medical instrumentation.

Nubots

A nucleic acid robot (nubot) is an organic molecular machine at the nanoscale. DNA structures can provide a means to assemble 2D and 3D nanomechanical devices. DNA-based machines can be activated using small molecules, proteins, and other molecules of DNA. Biological circuit gates based on DNA materials have been engineered as molecular machines to allow in vitro drug delivery for targeted health problems. Such material-based systems would work most closely to a smart biomaterial drug delivery system, while not allowing precise in vivo teleoperation of such engineered prototypes.

Surface-bound systems

Several reports have demonstrated the attachment of synthetic molecular motors to surfaces. These primitive nanomachines have been shown to undergo machine-like motions when confined to the surface of a macroscopic material. The surface anchored motors could potentially be used to move and position nanoscale materials on a surface in the manner of a conveyor belt.

Positional nanoassembly

The Nanofactory Collaboration, founded by Robert Freitas and Ralph Merkle in 2000 and involving 23 researchers from 10 organizations and 4 countries, focuses on developing a practical research agenda specifically aimed at positionally controlled diamond mechanosynthesis and a diamondoid nanofactory that would have the capability of building diamondoid medical nanorobots.

Biohybrids

The emerging field of bio-hybrid systems combines biological and synthetic structural elements for biomedical or robotic applications. The constituent elements of bio-nanoelectromechanical systems (BioNEMS) are of nanoscale size, for example DNA, proteins, or nanostructured mechanical parts. Thiol-ene e-beam resists allow the direct writing of nanoscale features, followed by the functionalization of the natively reactive resist surface with biomolecules. Other approaches use a biodegradable material attached to magnetic particles that allow them to be guided around the body.

Bacteria-based

This approach proposes the use of biological microorganisms, such as the bacteria Escherichia coli and Salmonella typhimurium, which use a flagellum for propulsion. Electromagnetic fields normally control the motion of this kind of biologically integrated device. Chemists at the University of Nebraska have created a humidity gauge by fusing a bacterium to a silicon computer chip.

Virus-based

Retroviruses can be retrained to attach to cells and replace DNA. They go through a process called reverse transcription to deliver their genetic payload in a vector. Usually, these devices are built from the Pol and Gag genes of the virus, which supply the capsid and delivery system. This process is called retroviral gene therapy, and it can re-engineer cellular DNA through the use of viral vectors. The approach has appeared in the form of retroviral, adenoviral, and lentiviral gene delivery systems. Such gene therapy vectors have been used in cats to deliver genes into the genetically modified organism, causing it to display the trait.

3D printing

3D printing is the process by which a three-dimensional structure is built through additive manufacturing. Nanoscale 3D printing involves many of the same processes, incorporated at a much smaller scale. To print a structure at the 5–400 µm scale, the precision of the 3D printing machine needs to be improved greatly. A two-step process, combining 3D printing with laser-etched plates, has been used as an improvement. To achieve nanoscale precision, a laser etching machine first etches the details needed for the segments of the nanorobot into each plate. The plate is then transferred to the 3D printer, which fills the etched regions with the desired nanoparticle. The process is repeated until the nanorobot is built from the bottom up. This approach has two main benefits: it increases the overall accuracy of the printing process, and it has the potential to create functional segments of a nanorobot.

The 3D printer uses a liquid resin, which is hardened at precisely the correct spots by a focused laser beam. The focal point of the laser beam is guided through the resin by movable mirrors and leaves behind a hardened line of solid polymer just a few hundred nanometers wide. This fine resolution enables the creation of intricately structured sculptures as tiny as a grain of sand. The process relies on photoactive resins, which are hardened by the laser at an extremely small scale, and is quick by nanoscale 3D printing standards.

Ultra-small features can also be made with the 3D micro-fabrication technique used in multiphoton photopolymerisation. This approach uses a focused laser to trace the desired 3D object into a block of gel. Because photoexcitation is nonlinear, the gel is cured to a solid only where the laser was focused, and the remaining gel is then washed away. Feature sizes of under 100 nm are easily produced, as well as complex structures with moving and interlocked parts.
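The nonlinearity of multiphoton excitation can be illustrated numerically: because the two-photon curing rate scales with the square of the light intensity, the effective curing spot is narrower than the beam itself. The beam radius used below is hypothetical:

```python
import math

# Two-photon polymerization cures resin at a rate proportional to the
# square of the light intensity, confining curing to the laser focus.
# Sketch: compare the full width at half maximum (FWHM) of a Gaussian
# intensity profile with that of its square. The beam radius is
# illustrative, not a measured value.
w = 400.0  # 1/e^2 beam radius at the focus, in nm (hypothetical)

def intensity(r):
    """Relative Gaussian beam intensity at radial distance r (nm)."""
    return math.exp(-2.0 * r * r / (w * w))

# FWHM of exp(-2 r^2 / w^2): solving I = 1/2 gives r = w*sqrt(ln2 / 2).
fwhm_one_photon = 2.0 * w * math.sqrt(math.log(2.0) / 2.0)
# Two-photon rate ~ I^2 = exp(-4 r^2 / w^2): the FWHM shrinks by sqrt(2).
fwhm_two_photon = fwhm_one_photon / math.sqrt(2.0)

print(f"one-photon FWHM: {fwhm_one_photon:.0f} nm")
print(f"two-photon FWHM: {fwhm_two_photon:.0f} nm")
```

In practice the sub-diffraction feature sizes quoted above come from combining this intensity-squared confinement with a polymerization threshold: only the very center of the squared profile exceeds the dose needed to cure the resin.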

Potential uses

Nanomedicine

Potential uses for nanorobotics in medicine include early diagnosis and targeted drug-delivery for cancer, biomedical instrumentation, surgery, pharmacokinetics, monitoring of diabetes, and health care.

In such plans, future medical nanotechnology is expected to employ nanorobots injected into the patient to perform work at a cellular level. Such nanorobots intended for use in medicine should be non-replicating, as replication would needlessly increase device complexity, reduce reliability, and interfere with the medical mission.

Nanotechnology provides a wide range of new technologies for developing customized means to optimize the delivery of pharmaceutical drugs. Today, harmful side effects of treatments such as chemotherapy are commonly a result of drug delivery methods that do not pinpoint their intended target cells accurately. Researchers at Harvard and MIT, however, have been able to attach special RNA strands, measuring nearly 10 nm in diameter, to nanoparticles filled with a chemotherapy drug. These RNA strands are attracted to cancer cells. When the nanoparticle encounters a cancer cell, it adheres to it and releases the drug into the cancer cell. This directed method of drug delivery has great potential for treating cancer patients while avoiding the negative effects commonly associated with improper drug delivery. The first demonstration of nanomotors operating in living organisms was carried out in 2014 at the University of California, San Diego. MRI-guided nanocapsules are one potential precursor to nanorobots.

Another useful application of nanorobots is assisting in the repair of tissue cells alongside white blood cells. Recruiting inflammatory cells or white blood cells (which include neutrophil granulocytes, lymphocytes, monocytes, and mast cells) to the affected area is the first response of tissues to injury. Because of their small size, nanorobots could attach themselves to the surface of recruited white cells, to squeeze their way out through the walls of blood vessels and arrive at the injury site, where they can assist in the tissue repair process. Certain substances could possibly be used to accelerate the recovery.

The science behind this mechanism is quite complex. Passage of cells across the blood endothelium, a process known as transmigration, is a mechanism involving engagement of cell surface receptors to adhesion molecules, active force exertion and dilation of the vessel walls and physical deformation of the migrating cells. By attaching themselves to migrating inflammatory cells, the robots can in effect "hitch a ride" across the blood vessels, bypassing the need for a complex transmigration mechanism of their own.

As of 2016, in the United States, the Food and Drug Administration (FDA) regulates nanotechnology on the basis of size.

Soutik Betal, during his doctoral research at the University of Texas at San Antonio, developed nanocomposite particles that are controlled remotely by an electromagnetic field. This series of nanorobots, now listed in Guinness World Records, can be used to interact with biological cells. Scientists suggest that this technology can be used for the treatment of cancer.

Cultural references

The Nanites are characters on the TV show Mystery Science Theater 3000. They are self-replicating, bio-engineered organisms that work on the ship and reside in the SOL's computer systems; they made their first appearance in season 8. Nanites are also used in a number of episodes of the Netflix series "Travelers", first appearing in season 1: they can be programmed and injected into injured people to perform repairs.

Nanites also feature in the Rise of Iron 2016 expansion for Destiny, in which SIVA, a self-replicating nanotechnology, is used as a weapon.

Nanites (more often referred to as nanomachines) are frequently referenced in Konami's "Metal Gear" series, where they are used to enhance and regulate abilities and body functions.

In the Star Trek franchise TV shows, nanites serve as an important plot device. Starting with "Evolution" in the third season of The Next Generation, Borg nanoprobes perform the function of maintaining the Borg cybernetic systems, as well as repairing damage to the organic parts of a Borg. They generate new technology inside a Borg when needed, as well as protecting them from many forms of disease.

Nanites play a role in the video game Deus Ex, being the basis of the nano-augmentation technology which gives augmented people superhuman abilities.

Nanites are also mentioned in the Arc of a Scythe book series by Neal Shusterman and are used to heal all nonfatal injuries, regulate bodily functions, and considerably lessen pain.
