
Sunday, September 1, 2024

Computational intelligence

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Computational_intelligence

The expression computational intelligence (CI) usually refers to the ability of a computer to learn a specific task from data or experimental observation. Even though it is commonly considered a synonym of soft computing, there is still no commonly accepted definition of computational intelligence.

Generally, computational intelligence is a set of nature-inspired computational methodologies and approaches for addressing complex real-world problems to which mathematical or traditional modelling is of little use, for a few reasons: the processes might be too complex for mathematical reasoning, they might contain uncertainties, or they might simply be stochastic in nature. Indeed, many real-life problems cannot be translated into binary language (unique values of 0 and 1) for computers to process. Computational Intelligence therefore provides solutions for such problems.

The methods used are close to the human way of reasoning, i.e. they use inexact and incomplete knowledge, and they are able to produce control actions in an adaptive way. CI therefore uses a combination of five main complementary techniques: fuzzy logic, which enables the computer to understand natural language; artificial neural networks, which permit the system to learn from experiential data by operating like a biological one; evolutionary computing, which is based on the process of natural selection; learning theory; and probabilistic methods, which help deal with uncertainty and imprecision.

Beyond those main principles, currently popular approaches include biologically inspired algorithms such as swarm intelligence and artificial immune systems, which can be seen as a part of evolutionary computation, as well as image processing, data mining, natural language processing, and artificial intelligence, which tends to be confused with Computational Intelligence. But although both Computational Intelligence (CI) and Artificial Intelligence (AI) seek similar goals, there is a clear distinction between them.

Computational Intelligence is thus a way of performing like human beings. Indeed, the characteristic of "intelligence" is usually attributed to humans. More recently, many products and items have also claimed to be "intelligent", an attribute directly linked to reasoning and decision making.

History

The notion of Computational Intelligence was first used by the IEEE Neural Networks Council in 1990. This council was founded in the 1980s by a group of researchers interested in the development of biological and artificial neural networks. On November 21, 2001, the IEEE Neural Networks Council became the IEEE Neural Networks Society, which became the IEEE Computational Intelligence Society two years later by including new areas of interest such as fuzzy systems and evolutionary computation, which Dote and Ovaska related to Computational Intelligence in 2011.

But the first clear definition of Computational Intelligence was introduced by Bezdek in 1994: a system is called computationally intelligent if it deals with low-level data such as numerical data, has a pattern-recognition component and does not use knowledge in the AI sense, and additionally when it begins to exhibit computational adaptivity, fault tolerance, speed approaching human-like turnaround, and error rates that approximate human performance.

Bezdek and Marks (1993) clearly differentiated CI from other subsets of AI by arguing that the former is based on soft computing methods, whereas AI is based on hard computing ones.

Differences between Computational Intelligence and other historic approaches to Artificial Intelligence

According to Bezdek (1994), while Computational Intelligence is indeed a subset of Artificial Intelligence, there are two types of machine intelligence: the artificial kind, based on hard computing techniques, and the computational kind, based on soft computing methods, which enable adaptation to many situations. According to Engelbrecht (2007), the algorithmic approaches that have been classified as forming the Computational Intelligence approach to AI - namely fuzzy systems, neural networks, evolutionary computation, swarm intelligence, and artificial immune systems - are called "intelligent algorithms". These, together with logic, deductive reasoning, expert systems, case-based reasoning, and symbolic machine learning systems (the aforementioned "hard" computing approaches), formed the Artificial Intelligence toolset of the time. Of course today, with machine learning, and deep learning in particular, utilizing a breadth of supervised, unsupervised, and reinforcement learning approaches, the AI landscape has been greatly enhanced with novel intelligent approaches.

Hard computing techniques follow binary logic, based on only two values (the Booleans true or false, 0 or 1), on which modern computers are built. One problem with this logic is that our natural language cannot always be translated easily into absolute terms of 0 and 1. Soft computing techniques, based on fuzzy logic, can be useful here. Much closer to the way the human brain works, aggregating data into partial truths, this logic is one of the main exclusive aspects of CI.

From the same contrast between fuzzy and binary logics follow crisp and fuzzy systems. Crisp logic is a part of artificial intelligence principles and consists of either including an element in a set or not, whereas fuzzy systems (CI) enable elements to be partially in a set. Following this logic, each element can be given a degree of membership (from 0 to 1), and not exclusively one of these two values.
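
As a concrete illustration of this difference, here is a minimal Python sketch (not part of the original article) contrasting crisp and fuzzy membership in a set of "tall" people; the height thresholds are illustrative assumptions.

def crisp_tall(height_cm: float) -> int:
    """Crisp logic: a person either is in the set 'tall' (1) or is not (0)."""
    return 1 if height_cm >= 180 else 0

def fuzzy_tall(height_cm: float) -> float:
    """Fuzzy logic: degree of membership in 'tall', ramping from 0 at 160 cm to 1 at 190 cm."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / (190 - 160)

for h in (155, 170, 182, 195):
    print(h, crisp_tall(h), round(fuzzy_tall(h), 2))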

The five main algorithmic approaches of CI and their applications

The main applications of Computational Intelligence include computer science, engineering, data analysis and bio-medicine.

Fuzzy logic

As explained before, fuzzy logic, one of CI's main principles, consists of measurements and process modelling made for real life's complex processes. It can cope with incompleteness, and most importantly ignorance of data in a process model, contrary to Artificial Intelligence, which requires exact knowledge.

This technique tends to apply to a wide range of domains such as control, image processing and decision making. It is also well established in the field of household appliances such as washing machines and microwave ovens. We encounter it too when using a video camera, where it helps stabilize the image while the camera is held unsteadily. Other areas such as medical diagnostics, foreign exchange trading and business strategy selection are among this principle's many applications.

Fuzzy logic is mainly useful for approximate reasoning and doesn't have learning abilities, a much-needed qualification that human beings have: learning from previous mistakes enables them to improve.

Neural networks

This is why CI experts work on the development of artificial neural networks based on biological ones, which can be defined by three main components: the cell body, which processes the information; the axon, which is a device enabling signal conduction; and the synapse, which controls signals. Artificial neural networks are therefore distributed information processing systems, enabling processing and learning from experiential data. Like human beings, they also offer fault tolerance, one of the main assets of this principle.

Concerning its applications, neural networks can be classified into five groups: data analysis and classification, associative memory, clustering and generation of patterns, and control. Generally, this method aims to analyze and classify medical data, perform face and fraud detection, and most importantly deal with nonlinearities of a system in order to control it. Furthermore, neural network techniques share with fuzzy logic ones the advantage of enabling data clustering.

Evolutionary computation

Evolutionary computation can be seen as a family of methods and algorithms for global optimization, which are usually based on a population of candidate solutions. They are inspired by biological evolution and are often summarized as evolutionary algorithms. These include genetic algorithms, evolution strategies, genetic programming and many others. They are considered problem solvers for tasks not solvable by traditional mathematical methods and are frequently used for optimization, including multi-objective optimization.

Learning theory

Still looking for a way of "reasoning" close to humans', learning theory is one of the main approaches of CI. In psychology, learning is the process of bringing together cognitive, emotional and environmental effects and experiences to acquire, enhance or change knowledge, skills, values and world views (Ormrod, 1995; Illeris, 2004). Learning theories help in understanding how these effects and experiences are processed, and then help in making predictions based on previous experience.

Probabilistic methods

Being one of the main elements of fuzzy logic, probabilistic methods, first introduced by Paul Erdős and Joel Spencer (1974), aim to evaluate the outcomes of a computationally intelligent system, mostly defined by randomness. Probabilistic methods therefore bring out the possible solutions to a problem, based on prior knowledge.
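
As a minimal illustration of reasoning from prior knowledge, the following Python sketch applies Bayes' rule to update a belief after an observation; all the probabilities are made-up illustrative values, not taken from the article.

# Minimal Bayes' rule sketch: update belief in a hypothesis from prior knowledge
# and an observation. All numbers here are illustrative assumptions.

prior_faulty = 0.1          # prior probability that a machine is faulty
p_alarm_if_faulty = 0.9     # likelihood of an alarm given a fault
p_alarm_if_ok = 0.2         # false-alarm rate

# P(faulty | alarm) = P(alarm | faulty) * P(faulty) / P(alarm)
p_alarm = p_alarm_if_faulty * prior_faulty + p_alarm_if_ok * (1 - prior_faulty)
posterior_faulty = p_alarm_if_faulty * prior_faulty / p_alarm
print(round(posterior_faulty, 3))   # 0.09 / 0.27 = 0.333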

Impact on university education

According to bibliometric studies, computational intelligence plays a key role in research. All the major academic publishers accept manuscripts in which a combination of fuzzy logic, neural networks and evolutionary computation is discussed. On the other hand, computational intelligence is largely absent from university curricula. The number of technical universities at which students can attend a course is limited. Only British Columbia, the Technical University of Dortmund (involved in the European fuzzy boom) and Georgia Southern University offer courses in this domain.

The reason why major universities are ignoring the topic is that they don't have the resources. The existing computer science courses are so complex that, at the end of the semester, there is no room for fuzzy logic. Sometimes it is taught as a subproject in existing introductory courses, but in most cases universities prefer courses about classical AI concepts based on Boolean logic, Turing machines and toy problems like blocks world.

For a while now, with the rise of STEM education, the situation has changed a bit. There are some efforts in which multidisciplinary approaches are preferred, allowing the student to understand complex adaptive systems. These objectives are discussed only on a theoretical basis; actual university curricula have not yet been adapted.

Saturday, August 31, 2024

Natural computing

From Wikipedia, the free encyclopedia

Natural computing, also called natural computation, is a terminology introduced to encompass three classes of methods: 1) those that take inspiration from nature for the development of novel problem-solving techniques; 2) those that are based on the use of computers to synthesize natural phenomena; and 3) those that employ natural materials (e.g., molecules) to compute. The main fields of research that compose these three branches are artificial neural networks, evolutionary algorithms, swarm intelligence, artificial immune systems, fractal geometry, artificial life, DNA computing, and quantum computing, among others.

Computational paradigms studied by natural computing are abstracted from natural phenomena as diverse as self-replication, the functioning of the brain, Darwinian evolution, group behavior, the immune system, the defining properties of life forms, cell membranes, and morphogenesis. Besides traditional electronic hardware, these computational paradigms can be implemented on alternative physical media such as biomolecules (DNA, RNA), or trapped-ion quantum computing devices.

Dually, one can view processes occurring in nature as information processing. Such processes include self-assembly, developmental processes, gene regulation networks, protein–protein interaction networks, biological transport (active transport, passive transport) networks, and gene assembly in unicellular organisms. Efforts to understand biological systems also include engineering of semi-synthetic organisms, and understanding the universe itself from the point of view of information processing. Indeed, the idea has even been advanced that information is more fundamental than matter or energy. The Zuse-Fredkin thesis, dating back to the 1960s, states that the entire universe is a huge cellular automaton which continuously updates its rules. Recently it has been suggested that the whole universe is a quantum computer that computes its own behaviour. The universe/nature as a computational mechanism is addressed by exploring nature with the help of the ideas of computability, and by studying natural processes as computations (information processing).

Nature-inspired models of computation

The most established "classical" nature-inspired models of computation are cellular automata, neural computation, and evolutionary computation. More recent computational systems abstracted from natural processes include swarm intelligence, artificial immune systems, membrane computing, and amorphous computing. Detailed reviews can be found in many books.

Cellular automata

A cellular automaton is a dynamical system consisting of an array of cells. Space and time are discrete and each of the cells can be in a finite number of states. The cellular automaton updates the states of its cells synchronously according to the transition rules given a priori. The next state of a cell is computed by a transition rule and it depends only on its current state and the states of its neighbors.

Conway's Game of Life is one of the best-known examples of cellular automata, shown to be computationally universal. Cellular automata have been applied to modelling a variety of phenomena such as communication, growth, reproduction, competition, evolution and other physical and biological processes.
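
As an illustration of such transition rules, the following short Python sketch (not from the article) performs one synchronous update of Conway's Game of Life on a small toroidal grid.

def life_step(grid):
    """Return the next generation; each cell depends only on itself and its 8 neighbours."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            neighbours = sum(
                grid[(r + dr) % rows][(c + dc) % cols]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
            )
            # Standard rules: a live cell survives with 2 or 3 neighbours,
            # a dead cell becomes alive with exactly 3 neighbours.
            nxt[r][c] = 1 if neighbours == 3 or (grid[r][c] == 1 and neighbours == 2) else 0
    return nxt

# A "blinker" oscillator on a 5x5 grid.
grid = [[0] * 5 for _ in range(5)]
grid[2][1] = grid[2][2] = grid[2][3] = 1
print(life_step(grid))   # the horizontal line of three live cells flips to a vertical line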

Neural computation

Neural computation is the field of research that emerged from the comparison between computing machines and the human nervous system. This field aims both to understand how the brain of living organisms works (brain theory or computational neuroscience), and to design efficient algorithms based on the principles of how the human brain processes information (Artificial Neural Networks, ANN).

An artificial neural network is a network of artificial neurons. An artificial neuron A is equipped with a function f_A, receives n real-valued inputs x_1, ..., x_n with respective weights w_1, ..., w_n, and it outputs f_A(w_1*x_1 + ... + w_n*x_n). Some neurons are selected to be the output neurons, and the network function is the vectorial function that associates to the n input values the outputs of the m selected output neurons. Note that different choices of weights produce different network functions for the same inputs. Back-propagation is a supervised learning method by which the weights of the connections in the network are repeatedly adjusted so as to minimize the difference between the vector of actual outputs and that of desired outputs. Learning algorithms based on backwards propagation of errors can be used to find optimal weights for a given topology of the network and input-output pairs.
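
To make the description above concrete, here is a minimal Python sketch of a single artificial neuron with a sigmoid activation and a gradient-descent weight update, the one-neuron special case of back-propagation; the training data (the OR function) and the learning rate are illustrative assumptions.

import math, random

def neuron(weights, bias, inputs):
    """Output f(w . x + b) with a sigmoid activation f."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-s))

# Learn the OR function from input-output pairs.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, b, lr = [random.uniform(-1, 1) for _ in range(2)], 0.0, 0.5

for _ in range(2000):
    for x, target in data:
        y = neuron(w, b, x)
        err = y - target                    # derivative of 0.5*(y - target)^2 w.r.t. y
        grad = err * y * (1 - y)            # chain rule through the sigmoid
        w = [wi - lr * grad * xi for wi, xi in zip(w, x)]
        b -= lr * grad

print([round(neuron(w, b, x), 2) for x, _ in data])   # approximately [0, 1, 1, 1]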

Evolutionary computation

Evolutionary computation is a computational paradigm inspired by Darwinian evolution.

An artificial evolutionary system is a computational system based on the notion of simulated evolution. It comprises a constant- or variable-size population of individuals, a fitness criterion, and genetically inspired operators that produce the next generation from the current one. The initial population is typically generated randomly or heuristically, and typical operators are mutation and recombination. At each step, the individuals are evaluated according to the given fitness function (survival of the fittest). The next generation is obtained from selected individuals (parents) by using genetically inspired operators. The choice of parents can be guided by a selection operator which reflects the biological principle of mate selection. This process of simulated evolution eventually converges towards a nearly optimal population of individuals, from the point of view of the fitness function.

The study of evolutionary systems has historically evolved along three main branches: Evolution strategies provide a solution to parameter optimization problems for real-valued as well as discrete and mixed types of parameters. Evolutionary programming originally aimed at creating optimal "intelligent agents" modelled, e.g., as finite state machines. Genetic algorithms applied the idea of evolutionary computation to the problem of finding a (nearly-)optimal solution to a given problem. Genetic algorithms initially consisted of an input population of individuals encoded as fixed-length bit strings, the genetic operators mutation (bit flips) and recombination (combination of a prefix of a parent with the suffix of the other), and a problem-dependent fitness function. Genetic algorithms have been used to optimize computer programs, called genetic programming, and today they are also applied to real-valued parameter optimization problems as well as to many types of combinatorial tasks.
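
The following Python sketch (an illustration, not a reference implementation) shows a genetic algorithm in this classical form: fixed-length bit strings, bit-flip mutation and prefix/suffix recombination. The fitness function (counting 1-bits, "OneMax") and the tournament selection scheme are illustrative assumptions.

import random

LENGTH, POP, GENS = 20, 30, 60

def fitness(bits):
    return sum(bits)                       # problem-dependent fitness (here: count of 1-bits)

def select(pop):
    """Tournament selection of a parent."""
    return max(random.sample(pop, 3), key=fitness)

def recombine(a, b):
    cut = random.randint(1, LENGTH - 1)    # prefix of one parent + suffix of the other
    return a[:cut] + b[cut:]

def mutate(bits, rate=1.0 / LENGTH):
    return [1 - bit if random.random() < rate else bit for bit in bits]

pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(recombine(select(pop), select(pop))) for _ in range(POP)]

print(max(fitness(ind) for ind in pop))    # approaches LENGTH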

Estimation of Distribution Algorithms (EDAs), on the other hand, are evolutionary algorithms that substitute traditional reproduction operators with model-guided ones. Such models are learned from the population by employing machine learning techniques and represented as probabilistic graphical models, from which new solutions can be sampled or generated by guided crossover.
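
As a minimal illustration, the sketch below implements only the simplest, univariate case of such a model-based algorithm (UMDA-style): a per-bit probability model is estimated from the better half of the population and new candidates are sampled from it. The OneMax fitness is again an illustrative assumption.

import random

LENGTH, POP, GENS = 20, 40, 40
pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]

for _ in range(GENS):
    selected = sorted(pop, key=sum, reverse=True)[:POP // 2]     # truncation selection
    # Estimate the marginal probability of a 1 at each position (kept away from 0 and 1).
    probs = [min(0.95, max(0.05, sum(ind[i] for ind in selected) / len(selected)))
             for i in range(LENGTH)]
    # Sample a new population from the learned univariate model.
    pop = [[1 if random.random() < p else 0 for p in probs] for _ in range(POP)]

print(max(sum(ind) for ind in pop))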

Swarm intelligence

Swarm intelligence, sometimes referred to as collective intelligence, is defined as the problem solving behavior that emerges from the interaction of individual agents (e.g., bacteria, ants, termites, bees, spiders, fish, birds) which communicate with other agents by acting on their local environments.

Particle swarm optimization applies this idea to the problem of finding an optimal solution to a given problem by a search through a (multi-dimensional) solution space. The initial set-up is a swarm of particles, each representing a possible solution to the problem. Each particle has its own velocity which depends on its previous velocity (the inertia component), the tendency towards the past personal best position (the nostalgia component), and its tendency towards a global neighborhood optimum or local neighborhood optimum (the social component). Particles thus move through a multidimensional space and eventually converge towards a point between the global best and their personal best. Particle swarm optimization algorithms have been applied to various optimization problems, and to unsupervised learning, game learning, and scheduling applications.
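
A minimal Python sketch of this velocity update is shown below; the objective function (the sphere function), the dimensionality and the coefficient values are illustrative assumptions.

import random

DIM, SWARM, STEPS = 2, 20, 200
W, C1, C2 = 0.7, 1.5, 1.5          # inertia, cognitive ("nostalgia") and social coefficients

def objective(x):                  # minimise the sphere function
    return sum(xi * xi for xi in x)

pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=objective)

for _ in range(STEPS):
    for i in range(SWARM):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (W * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[i][d])     # nostalgia component
                         + C2 * r2 * (gbest[d] - pos[i][d]))       # social component
            pos[i][d] += vel[i][d]
        if objective(pos[i]) < objective(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=objective)

print(objective(gbest))            # approaches 0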

In the same vein, ant algorithms model the foraging behaviour of ant colonies. To find the best path between the nest and a source of food, ants rely on indirect communication by laying a pheromone trail on the way back to the nest if they found food, respectively following the concentration of pheromones if they are looking for food. Ant algorithms have been successfully applied to a variety of combinatorial optimization problems over discrete search spaces.
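
The following sketch illustrates the idea on a tiny travelling-salesman instance in the style of a basic ant system: ants build tours biased by pheromone and inverse distance, and shorter tours deposit more pheromone. The city coordinates and all parameters are illustrative assumptions.

import math, random

cities = [(0, 0), (1, 5), (5, 2), (6, 6), (8, 3)]
N = len(cities)
dist = [[math.dist(a, b) for b in cities] for a in cities]
pheromone = [[1.0] * N for _ in range(N)]
ALPHA, BETA, RHO, ANTS, ITERS = 1.0, 2.0, 0.5, 10, 50

def build_tour():
    tour = [random.randrange(N)]
    while len(tour) < N:
        i = tour[-1]
        choices = [j for j in range(N) if j not in tour]
        weights = [(pheromone[i][j] ** ALPHA) * ((1.0 / dist[i][j]) ** BETA) for j in choices]
        tour.append(random.choices(choices, weights=weights)[0])
    return tour

def length(tour):
    return sum(dist[tour[k]][tour[(k + 1) % N]] for k in range(N))

best = None
for _ in range(ITERS):
    tours = [build_tour() for _ in range(ANTS)]
    # Evaporation, then pheromone deposit inversely proportional to tour length.
    pheromone = [[(1 - RHO) * p for p in row] for row in pheromone]
    for t in tours:
        for k in range(N):
            a, b = t[k], t[(k + 1) % N]
            pheromone[a][b] += 1.0 / length(t)
            pheromone[b][a] += 1.0 / length(t)
    cand = min(tours, key=length)
    if best is None or length(cand) < length(best):
        best = cand

print(best, round(length(best), 2))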

Artificial immune systems

Artificial immune systems (a.k.a. immunological computation or immunocomputing) are computational systems inspired by the natural immune systems of biological organisms.

Viewed as an information processing system, the natural immune system of organisms performs many complex tasks in a parallel and distributed computing fashion. These include distinguishing between self and nonself, neutralization of nonself pathogens (viruses, bacteria, fungi, and parasites), learning, memory, associative retrieval, self-regulation, and fault-tolerance. Artificial immune systems are abstractions of the natural immune system, emphasizing these computational aspects. Their applications include computer virus detection, anomaly detection in a time series of data, fault diagnosis, pattern recognition, machine learning, bioinformatics, optimization, robotics and control.
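
As one concrete example, the sketch below implements negative selection, a classical artificial-immune-system algorithm for anomaly detection: randomly generated detectors that match "self" data are discarded, and the surviving detectors flag non-self points. The data, matching radius and detector count are illustrative assumptions.

import random

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# "Self" samples: normal points clustered around (0.5, 0.5).
self_set = [(0.5 + random.uniform(-0.1, 0.1), 0.5 + random.uniform(-0.1, 0.1))
            for _ in range(100)]
RADIUS = 0.15

# Generate detectors that do NOT match any self sample (negative selection).
detectors = []
while len(detectors) < 200:
    d = (random.random(), random.random())
    if all(distance(d, s) > RADIUS for s in self_set):
        detectors.append(d)

def is_anomalous(point):
    """A point is flagged as non-self if any detector matches it."""
    return any(distance(point, d) <= RADIUS for d in detectors)

print(is_anomalous((0.5, 0.5)))   # normal point -> usually False
print(is_anomalous((0.9, 0.1)))   # far from the self region -> usually True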

Membrane computing

Membrane computing investigates computing models abstracted from the compartmentalized structure of living cells affected by membranes. A generic membrane system (P-system) consists of cell-like compartments (regions) delimited by membranes, that are placed in a nested hierarchical structure. Each membrane-enveloped region contains objects, transformation rules which modify these objects, as well as transfer rules, which specify whether the objects will be transferred outside or stay inside the region. Regions communicate with each other via the transfer of objects. The computation by a membrane system starts with an initial configuration, where the number (multiplicity) of each object is set to some value for each region (multiset of objects). It proceeds by choosing, nondeterministically and in a maximally parallel manner, which rules are applied to which objects. The output of the computation is collected from an a priori determined output region.
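
The following heavily simplified Python sketch illustrates only the core idea of maximally parallel multiset rewriting within a single region; real membrane systems also have nested regions and transfer rules as described above, and the alphabet and rules below are illustrative assumptions.

import random
from collections import Counter

# Each rule consumes one multiset of objects and produces another.
rules = [
    (Counter("ab"), Counter("c")),    # a b -> c
    (Counter("c"),  Counter("aa")),   # c -> a a
]

def step(config):
    """One maximally parallel step: keep firing randomly chosen applicable rules
    until none can fire on the remaining objects, then add all products."""
    remaining, produced = Counter(config), Counter()
    while True:
        applicable = [r for r in rules if all(remaining[o] >= n for o, n in r[0].items())]
        if not applicable:
            break
        lhs, rhs = random.choice(applicable)
        remaining -= lhs
        produced += rhs
    return remaining + produced

config = Counter("aaabbc")
for _ in range(3):
    config = step(config)
    print(dict(config))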

Applications of membrane systems include machine learning, modelling of biological processes (photosynthesis, certain signaling pathways, quorum sensing in bacteria, cell-mediated immunity), as well as computer science applications such as computer graphics, public-key cryptography, approximation and sorting algorithms, and the analysis of various computationally hard problems.

Amorphous computing

In biological organisms, morphogenesis (the development of well-defined shapes and functional structures) is achieved by the interactions between cells guided by the genetic program encoded in the organism's DNA.

Inspired by this idea, amorphous computing aims at engineering well-defined shapes and patterns, or coherent computational behaviours, from the local interactions of a multitude of simple unreliable, irregularly placed, asynchronous, identically programmed computing elements (particles). As a programming paradigm, the aim is to find new programming techniques that would work well for amorphous computing environments. Amorphous computing also plays an important role as the basis for "cellular computing" (see the topics synthetic biology and cellular computing, below).

Morphological computing

The understanding that morphology performs computation is used to analyze the relationship between morphology and control, and to theoretically guide the design of robots with reduced control requirements. It has been used both in robotics and for understanding cognitive processes in living organisms; see morphological computation.

Cognitive computing

Cognitive computing (CC) is a new type of computing, typically with the goal of modelling functions of human sensing, reasoning, and response to stimulus; see cognitive computing.

Cognitive capacities of present-day cognitive computing are far from human level. The same info-computational approach can be applied to other, simpler living organisms. Bacteria are an example of a cognitive system modelled computationally; see the work of Eshel Ben-Jacob on bacterial intelligence.

Synthesizing nature by means of computing

Artificial life

Artificial life (ALife) is a research field whose ultimate goal is to understand the essential properties of living organisms by building, within electronic computers or other artificial media, ab initio systems that exhibit properties normally associated only with living organisms. Early examples include Lindenmayer systems (L-systems), which have been used to model plant growth and development. An L-system is a parallel rewriting system that starts with an initial word and applies its rewriting rules in parallel to all letters of the word.
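
As a small illustration, the Python sketch below runs Lindenmayer's original two-letter system (A -> AB, B -> A), applying the rules in parallel to every letter of the word.

rules = {"A": "AB", "B": "A"}

def l_system(axiom, steps):
    word = axiom
    for _ in range(steps):
        # Parallel rewriting: every letter is replaced simultaneously.
        word = "".join(rules.get(ch, ch) for ch in word)
    return word

for n in range(5):
    print(n, l_system("A", n))
# A, AB, ABA, ABAAB, ABAABABA  (word lengths follow the Fibonacci numbers)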

Pioneering experiments in artificial life included the design of evolving "virtual block creatures" acting in simulated environments with realistic features such as kinetics, dynamics, gravity, collision, and friction. These artificial creatures were selected for their ability to swim, walk, or jump, and they competed for a common limited resource (controlling a cube). The simulation resulted in the evolution of creatures exhibiting surprising behaviour: some developed hands to grab the cube, others developed legs to move towards it. This computational approach was further combined with rapid manufacturing technology to actually build the physical robots that had virtually evolved. This marked the emergence of the field of mechanical artificial life.

The field of synthetic biology explores a biological implementation of similar ideas. Other research directions within the field of artificial life include artificial chemistry as well as traditionally biological phenomena explored in artificial systems, ranging from computational processes such as co-evolutionary adaptation and development, to physical processes such as growth, self-replication, and self-repair.

Nature-inspired novel hardware

All of the computational techniques mentioned above, while inspired by nature, have been implemented until now mostly on traditional electronic hardware. In contrast, the two paradigms introduced here, molecular computing and quantum computing, employ radically different types of hardware.

Molecular computing

Molecular computing (a.k.a. biomolecular computing, biocomputing, biochemical computing, DNA computing) is a computational paradigm in which data is encoded as biomolecules such as DNA strands, and molecular biology tools act on the data to perform various operations (e.g., arithmetic or logical operations).

The first experimental realization of a special-purpose molecular computer was the 1994 breakthrough experiment by Leonard Adleman, who solved a 7-node instance of the Hamiltonian Path Problem solely by manipulating DNA strands in test tubes. DNA computations start from an initial input encoded as a DNA sequence (essentially a sequence over the four-letter alphabet {A, C, G, T}), and proceed by a succession of bio-operations such as cut-and-paste (by restriction enzymes and ligases), extraction of strands containing a certain subsequence (by using Watson-Crick complementarity), copy (by using the polymerase chain reaction, which employs the polymerase enzyme), and read-out. Recent experimental research succeeded in solving more complex instances of NP-complete problems such as a 20-variable instance of 3SAT, and in wet DNA implementations of finite state machines with potential applications to the design of smart drugs.

DNA tile self-assembly of a Sierpinski triangle, starting from a seed obtained by the DNA origami technique

One of the most notable contributions of research in this field is to the understanding of self-assembly. Self-assembly is the bottom-up process by which objects autonomously come together to form complex structures. Instances in nature abound, and include atoms binding by chemical bonds to form molecules, and molecules forming crystals or macromolecules. Examples of self-assembly research topics include self-assembled DNA nanostructures such as Sierpinski triangles or arbitrary nanoshapes obtained using the DNA origami technique, and DNA nanomachines such as DNA-based circuits (binary counter, bit-wise cumulative XOR), ribozymes for logic operations, molecular switches (DNA tweezers), and autonomous molecular motors (DNA walkers).

Theoretical research in molecular computing has yielded several novel models of DNA computing (e.g. splicing systems introduced by Tom Head already in 1987) and their computational power has been investigated. Various subsets of bio-operations are now known to be able to achieve the computational power of Turing machines.

Quantum computing

A quantum computer processes data stored as quantum bits (qubits), and uses quantum mechanical phenomena such as superposition and entanglement to perform computations. A qubit can hold a "0", a "1", or a quantum superposition of these. A quantum computer operates on qubits with quantum logic gates. Through Shor's polynomial-time algorithm for factoring integers, and Grover's algorithm for quantum database search, which has a quadratic time advantage, quantum computers were shown to potentially possess a significant advantage over electronic computers.
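
As a toy illustration of superposition (a classical simulation in NumPy, not a real quantum device), the sketch below applies a Hadamard gate to the basis state |0> and computes the resulting measurement probabilities.

import numpy as np

ket0 = np.array([1.0, 0.0])                      # the basis state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate

state = H @ ket0                                 # (|0> + |1>) / sqrt(2)
probabilities = np.abs(state) ** 2               # Born rule: |amplitude|^2
print(state, probabilities)                      # [0.707 0.707] [0.5 0.5]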

Quantum cryptography is not based on the complexity of the computation, but on the special properties of quantum information, such as the fact that quantum information cannot be measured reliably and any attempt at measuring it results in an unavoidable and irreversible disturbance. A successful open air experiment in quantum cryptography was reported in 2007, where data was transmitted securely over a distance of 144 km. Quantum teleportation is another promising application, in which a quantum state (not matter or energy) is transferred to an arbitrary distant location. Implementations of practical quantum computers are based on various substrates such as ion-traps, superconductors, nuclear magnetic resonance, etc. As of 2006, the largest quantum computing experiment used liquid state nuclear magnetic resonance quantum information processors, and could operate on up to 12 qubits.

Nature as information processing

The dual aspect of natural computation is that it aims to understand nature by regarding natural phenomena as information processing. Already in the 1960s, Zuse and Fredkin suggested the idea that the entire universe is a computational (information processing) mechanism, modelled as a cellular automaton which continuously updates its rules. A recent quantum-mechanical approach by Lloyd suggests that the universe is a quantum computer that computes its own behaviour, while Vedral suggests that information is the most fundamental building block of reality.

The universe/nature as a computational mechanism is further elaborated by exploring nature with the help of the ideas of computability, whilst, based on the idea of nature as a network of networks of information processes on different levels of organization, natural processes are studied as computations (information processing).

The main directions of research in this area are systems biology, synthetic biology and cellular computing.

Systems biology

Computational systems biology (or simply systems biology) is an integrative and qualitative approach that investigates the complex communications and interactions taking place in biological systems. Thus, in systems biology, the focus of the study is the interaction networks themselves and the properties of biological systems that arise due to these networks, rather than the individual components of functional processes in an organism. This type of research on organic components has focused strongly on four different interdependent interaction networks: gene-regulatory networks, biochemical networks, transport networks, and carbohydrate networks.

Gene regulatory networks comprise gene-gene interactions, as well as interactions between genes and other substances in the cell. Genes are transcribed into messenger RNA (mRNA), and then translated into proteins according to the genetic code. Each gene is associated with other DNA segments (promoters, enhancers, or silencers) that act as binding sites for activators or repressors for gene transcription. Genes interact with each other either through their gene products (mRNA, proteins) which can regulate gene transcription, or through small RNA species that can directly regulate genes. These gene-gene interactions, together with genes' interactions with other substances in the cell, form the most basic interaction network: the gene regulatory networks. They perform information processing tasks within the cell, including the assembly and maintenance of other networks. Models of gene regulatory networks include random and probabilistic Boolean networks, asynchronous automata, and network motifs.
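
As a small illustration of the Boolean-network view, the Python sketch below updates three hypothetical "genes" synchronously using made-up regulatory rules; the wiring is an illustrative assumption, not a real circuit.

# State: (gene0, gene1, gene2), each ON (1) or OFF (0).
def update(state):
    g0, g1, g2 = state
    new_g0 = g2                      # gene 2 activates gene 0
    new_g1 = g0 and not g2           # gene 0 activates gene 1 unless gene 2 represses it
    new_g2 = g0 or g1                # either gene 0 or gene 1 activates gene 2
    return (int(new_g0), int(new_g1), int(new_g2))

state = (1, 0, 0)
for t in range(8):
    print(t, state)
    state = update(state)            # the trajectory settles into an attractor (here a fixed point)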

Another viewpoint is that the entire genomic regulatory system is a computational system, a genomic computer. This interpretation allows one to compare human-made electronic computation with computation as it occurs in nature.

A comparison between genomic and electronic computers

Property                                    Genomic computer         Electronic computer
Architecture                                changeable               rigid
Components construction                     on an as-needed basis    from the start
Coordination                                causal coordination      temporal synchrony
Distinction between hardware and software   no                       yes
Transport media                             molecules and ions       wires

In addition, unlike in a conventional computer, robustness in a genomic computer is achieved by various feedback mechanisms by which poorly functional processes are rapidly degraded, poorly functional cells are killed by apoptosis, and poorly functional organisms are out-competed by more fit species.

Biochemical networks refer to the interactions between proteins, which perform various mechanical and metabolic tasks inside a cell. Two or more proteins may bind to each other via their interaction sites and form a dynamic protein complex (complexation). These protein complexes may act as catalysts for other chemical reactions, or may chemically modify each other. Such modifications cause changes to the available binding sites of proteins. There are tens of thousands of proteins in a cell, and they interact with each other. To describe interactions at such a massive scale, Kohn maps were introduced as a graphical notation to depict molecular interactions in succinct pictures. Other approaches to describing protein–protein interactions accurately and succinctly include the use of textual bio-calculus or pi-calculus enriched with stochastic features.

Transport networks refer to the separation and transport of substances mediated by lipid membranes. Some lipids can self-assemble into biological membranes. A lipid membrane consists of a lipid bilayer in which proteins and other molecules are embedded, being able to travel along this layer. Through lipid bilayers, substances are transported between the inside and outside of membranes to interact with other molecules. Formalisms depicting transport networks include membrane systems and brane calculi.

Synthetic biology

Synthetic biology aims at engineering synthetic biological components, with the ultimate goal of assembling whole biological systems from their constituent components. The history of synthetic biology can be traced back to the 1960s, when François Jacob and Jacques Monod discovered the mathematical logic in gene regulation. Genetic engineering techniques, based on recombinant DNA technology, are a precursor of today's synthetic biology which extends these techniques to entire systems of genes and gene products.

Along with the possibility of synthesizing longer and longer DNA strands, the prospect of creating synthetic genomes with the purpose of building entirely artificial synthetic organisms became a reality. Indeed, rapid assembly of chemically synthesized short DNA strands made it possible to generate a 5,386 bp synthetic genome of a virus.

Alternatively, Smith et al. found about 100 genes that can be removed individually from the genome of Mycoplasma genitalium. This discovery paves the way to the assembly of a minimal but still viable artificial genome consisting of the essential genes only.

A third approach to engineering semi-synthetic cells is the construction of a single type of RNA-like molecule with the ability of self-replication. Such a molecule could be obtained by guiding the rapid evolution of an initial population of RNA-like molecules, by selection for the desired traits.

Another effort in this field is towards engineering multi-cellular systems by designing, e.g., cell-to-cell communication modules used to coordinate living bacterial cell populations.

Cellular computing

Computation in living cells (a.k.a. cellular computing, or in-vivo computing) is another approach to understand nature as computation. One particular study in this area is that of the computational nature of gene assembly in unicellular organisms called ciliates. Ciliates store a copy of their DNA containing functional genes in the macronucleus, and another "encrypted" copy in the micronucleus. Conjugation of two ciliates consists of the exchange of their micronuclear genetic information, leading to the formation of two new micronuclei, followed by each ciliate re-assembling the information from its new micronucleus to construct a new functional macronucleus. The latter process is called gene assembly, or gene re-arrangement. It involves re-ordering some fragments of DNA (permutations and possibly inversion) and deleting other fragments from the micronuclear copy. From the computational point of view, the study of this gene assembly process led to many challenging research themes and results, such as the Turing universality of various models of this process. From the biological point of view, a plausible hypothesis about the "bioware" that implements the gene-assembly process was proposed, based on template guided recombination.

Other approaches to cellular computing include developing an in vivo programmable and autonomous finite-state automaton with E. coli; designing and constructing in vivo cellular logic gates and genetic circuits that harness the cell's existing biochemical processes; and the global optimization of stomata aperture in leaves, following a set of local rules resembling a cellular automaton.

Antibody–drug conjugate

From Wikipedia, the free encyclopedia
Schematic structure of an antibody–drug conjugate (ADC)

Antibody–drug conjugates or ADCs are a class of biopharmaceutical drugs designed as a targeted therapy for treating cancer. Unlike chemotherapy, ADCs are intended to target and kill tumor cells while sparing healthy cells. As of 2019, some 56 pharmaceutical companies were developing ADCs.

ADCs are complex molecules composed of an antibody linked to a biologically active cytotoxic (anticancer) payload or drug. Antibody–drug conjugates are an example of bioconjugates and immunoconjugates.

ADCs combine the targeting properties of monoclonal antibodies with the cancer-killing capabilities of cytotoxic drugs, designed to discriminate between healthy and diseased tissue.

Mechanism of action

An anticancer drug is coupled to an antibody that targets a specific tumor antigen (or protein) that, ideally, is only found in or on tumor cells. Antibodies attach themselves to the antigens on the surface of cancerous cells. The biochemical reaction that occurs upon attaching triggers a signal in the tumor cell, which then absorbs, or internalizes, the antibody together with the linked cytotoxin. After the ADC is internalized, the cytotoxin kills the cancer cell. Their targeting ability was believed to limit side effects for cancer patients and to give a wider therapeutic window than other chemotherapeutic agents, although this promise has not yet been fully realized in the clinic.

ADC technologies have been featured in many publications, including scientific journals.

History

The idea of drugs that would target tumor cells and ignore others was conceived in 1900 by German Nobel laureate Paul Ehrlich; he described the drugs as a "magic bullet" due to their targeting properties.

In 2001, Pfizer/Wyeth's drug gemtuzumab ozogamicin (trade name: Mylotarg) was approved based on a study with a surrogate endpoint, through the accelerated approval process. In June 2010, after accumulated data showed no evidence of benefit along with significant toxicity, the U.S. Food and Drug Administration (FDA) forced the company to withdraw it. It was reintroduced into the US market in 2017.

Brentuximab vedotin (trade name: Adcetris, marketed by Seattle Genetics and Millennium/Takeda) was approved for relapsed Hodgkin lymphoma (HL) and relapsed systemic anaplastic large-cell lymphoma (sALCL) by the FDA on August 19, 2011, and received conditional marketing authorization from the European Medicines Agency in October 2012.

Trastuzumab emtansine (ado-trastuzumab emtansine or T-DM1, trade name: Kadcyla, marketed by Genentech and Roche) was approved in February 2013 for the treatment of people with HER2-positive metastatic breast cancer (mBC) who had received prior treatment with trastuzumab and a taxane chemotherapy.

The European Commission approved inotuzumab ozogamicin as a monotherapy for the treatment of adults with relapsed or refractory CD22-positive B-cell precursor acute lymphoblastic leukemia (ALL) on June 30, 2017, under the trade name Besponsa (Pfizer/Wyeth), followed by FDA approval on August 17, 2017.

The first immunology antibody–drug conjugate (iADC), ABBV-3373, showed an improvement in disease activity in a Phase 2a study of patients with rheumatoid arthritis. A study of the second iADC, ABBV-154, to evaluate adverse events and change in disease activity in participants treated with subcutaneous injections of ABBV-154, is ongoing.

In July 2018, Daiichi Sankyo Company, Limited and Glycotope GmbH signed an agreement to combine Glycotope's investigational tumor-associated TA-MUC1 antibody gatipotuzumab with Daiichi Sankyo's proprietary ADC technology in order to develop a gatipotuzumab antibody–drug conjugate.

In 2019, AstraZeneca agreed to pay up to US$6.9 billion to jointly develop DS-8201 with Japan's Daiichi Sankyo. It is intended to replace Herceptin for treating breast cancer. DS-8201 carries eight payloads, compared to the usual four.

Commercial products

Thirteen ADCs have received market approval by the FDA – all for oncotherapies. Belantamab mafodotin is in the process of being withdrawn from US marketing.

FDA Approved ADCs
  • Gemtuzumab ozogamicin (Mylotarg; Pfizer/Wyeth): relapsed acute myelogenous leukemia (AML)
  • Brentuximab vedotin (Adcetris; Seattle Genetics, Millennium/Takeda): Hodgkin lymphoma (HL) and systemic anaplastic large-cell lymphoma (ALCL)
  • Trastuzumab emtansine (Kadcyla; Genentech, Roche): HER2-positive metastatic breast cancer (mBC) following treatment with trastuzumab and a maytansinoid
  • Inotuzumab ozogamicin (Besponsa; Pfizer/Wyeth): relapsed or refractory CD22-positive B-cell precursor acute lymphoblastic leukemia
  • Polatuzumab vedotin (Polivy; Genentech, Roche): relapsed or refractory diffuse large B-cell lymphoma (DLBCL)
  • Enfortumab vedotin (Padcev; Astellas/Seattle Genetics): adult patients with locally advanced or metastatic urothelial cancer who have received a PD-1 or PD-L1 inhibitor and a Pt-containing therapy
  • Trastuzumab deruxtecan (Enhertu; AstraZeneca/Daiichi Sankyo): adult patients with unresectable or metastatic HER2-positive breast cancer who have received two or more prior anti-HER2-based regimens
  • Sacituzumab govitecan (Trodelvy; Immunomedics): adult patients with metastatic triple-negative breast cancer (mTNBC) who have received at least two prior therapies for relapsed or refractory metastatic disease
  • Belantamab mafodotin (Blenrep; GlaxoSmithKline): multiple myeloma patients whose disease has progressed despite prior treatment with an immunomodulatory agent, a proteasome inhibitor and an anti-CD38 antibody
  • Moxetumomab pasudotox (Lumoxiti; AstraZeneca): relapsed or refractory hairy cell leukemia (HCL)
  • Loncastuximab tesirine (Zynlonta; ADC Therapeutics): relapsed or refractory large B-cell lymphoma (including diffuse large B-cell lymphoma (DLBCL) not otherwise specified, DLBCL arising from low-grade lymphoma, and high-grade B-cell lymphoma) after two or more lines of systemic therapy
  • Tisotumab vedotin-tftv (Tivdak; Seagen Inc, Genmab): adult patients with recurrent or metastatic cervical cancer with disease progression on or after chemotherapy
  • Mirvetuximab soravtansine (Elahere; ImmunoGen): adult patients with folate receptor alpha (FRα)-positive, platinum-resistant epithelial ovarian, fallopian tube, or primary peritoneal cancer who have received one to three prior systemic treatment regimens

Components of an ADC

An antibody–drug conjugate consists of 3 components:

  • Antibody - targets the cancer cell surface and may also elicit a therapeutic response.
  • Payload - elicits the desired therapeutic response.
  • Linker - attaches the payload to the antibody and should be stable in circulation, only releasing the payload at the desired target. Multiple approaches to conjugation have been developed for attachment to the antibody and have been reviewed. The drug-to-antibody ratio (DAR) indicates the level of loading of the payload on the ADC.

Payloads

Many of the payloads for oncology ADCs (oADCs) are natural-product based, with some making covalent interactions with their target. Payloads include the microtubulin inhibitors monomethyl auristatin E (MMAE), monomethyl auristatin F (MMAF) and mertansine, the DNA binder calicheamicin, and the topoisomerase 1 inhibitors SN-38 and exatecan, resulting in a renaissance for natural product total synthesis. Glucocorticoid receptor modulators (GRMs) represent the most active payload class for iADCs. Approaches releasing marketed GRM molecules such as dexamethasone and budesonide have been developed. Modified GRM molecules have also been developed that enable the attachment of the linker, with the term "ADCidified" describing the medicinal chemistry process of payload optimization to facilitate linker attachment. Alternatives to small-molecule payloads have also been investigated, for example siRNA.

Linkers

A stable link between the antibody and cytotoxic (anti-cancer) agent is a crucial aspect of an ADC. A stable ADC linker ensures that less of the cytotoxic payload falls off before reaching a tumor cell, improving safety, and limiting dosages.

Linkers are based on chemical motifs including disulfides, hydrazones or peptides (cleavable), or thioethers (noncleavable). Both cleavable and noncleavable linkers have proved to be safe in preclinical and clinical trials. Brentuximab vedotin includes an enzyme-sensitive cleavable linker that delivers the antimicrotubule agent monomethyl auristatin E (MMAE), a synthetic antineoplastic agent, to human-specific CD30-positive malignant cells. MMAE inhibits cell division by blocking the polymerization of tubulin. Because of its high toxicity, MMAE cannot be used as a single-agent chemotherapeutic drug. However, MMAE linked to the anti-CD30 monoclonal antibody cAC10 (CD30 is a cell-membrane protein of the tumor necrosis factor, or TNF, receptor family) was stable in extracellular fluid. It is cleavable by cathepsin and safe for therapy. Trastuzumab emtansine is a combination of the microtubule-formation inhibitor mertansine (DM-1) and the antibody trastuzumab that employs a stable, non-cleavable linker.

The availability of better and more stable linkers has changed the function of the chemical bond. The type of linker, cleavable or noncleavable, lends specific properties to the cytotoxic drug. For example, a non-cleavable linker keeps the drug within the cell. As a result, the entire antibody, linker and cytotoxic (anti-cancer) agent enter the targeted cancer cell where the antibody is degraded into an amino acid. The resulting complex – amino acid, linker and cytotoxic agent – is considered to be the active drug. In contrast, cleavable linkers are detached by enzymes in the cancer cell. The cytotoxic payload can then escape from the targeted cell and, in a process called "bystander killing", attack neighboring cells.

Another type of cleavable linker, currently in development, adds an extra molecule between the cytotoxin and the cleavage site. This allows researchers to create ADCs with more flexibility without changing cleavage kinetics. Researchers are developing a new method of peptide cleavage based on Edman degradation, a method of sequencing amino acids in a peptide. Also under development are site-specific conjugation (TDCs) and novel conjugation techniques to further improve stability and therapeutic index, α emitting immunoconjugates, antibody-conjugated nanoparticles and antibody-oligonucleotide conjugates.

Anything Drug Conjugates

As the antibody–drug conjugate field has matured, a more accurate definition of ADC is now Anything-Drug Conjugate. Alternatives for the antibody targeting component now include multiple smaller antibody fragments like diabodies, Fab, scFv, and bicyclic peptides.

Research

Non-natural amino acids

The first generation uses linking technologies that conjugate drugs non-selectively to cysteine or lysine residues in the antibody, resulting in a heterogeneous mixture. This approach leads to suboptimal safety and efficacy and complicates optimization of the biological, physical and pharmacological properties. Site-specific incorporation of unnatural amino acids generates a site for controlled and stable attachment. This enables the production of homogeneous ADCs with the antibody precisely linked to the drug and controlled ratios of antibody to drug, allowing the selection of a best-in-class ADC. An Escherichia coli-based open cell-free synthesis (OCFS) allows the synthesis of proteins containing site-specifically incorporated non-natural amino acids and has been optimized for predictable high-yield protein synthesis and folding. The absence of a cell wall allows the addition of non-natural factors to the system to manipulate transcription, translation and folding to provide precise protein expression modulation.

Other disease areas

The majority of ADCs under development or in clinical trials are for oncological and hematological indications. This is primarily driven by the inventory of monoclonal antibodies, which target various types of cancer. However, some developers are looking to expand the application to other important disease areas.

Anthropogenic metabolism

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Anthropogenic_metabolism

Anthropogenic metabolism, also referred to as metabolism of the anthroposphere, is a term used in industrial ecology, material flow analysis, and waste management to describe the material and energy turnover of human society. It emerges from the application of systems thinking to the industrial and other man-made activities and it is a central concept of sustainable development. In modern societies, the bulk of anthropogenic (man-made) material flows is related to one of the following activities: sanitation, transportation, habitation, and communication, which were "of little metabolic significance in prehistoric times". Global man-made stocks of steel in buildings, infrastructure, and vehicles, for example, amount to about 25 Gigatonnes (more than three tonnes per person), a figure that is surpassed only by construction materials such as concrete. Sustainable development is closely linked to the design of a sustainable anthropogenic metabolism, which will entail substantial changes in the energy and material turnover of the different human activities. Anthropogenic metabolism can be seen as synonymous to social or socioeconomic metabolism. It comprises both industrial metabolism and urban metabolism.
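
As a quick back-of-the-envelope check of the per-person figure quoted above, assuming a world population of roughly 8 billion (the population value is an assumption, not a figure from the article):

steel_stock_tonnes = 25e9                # ~25 gigatonnes of in-use steel
population = 8e9                         # assumed world population
print(steel_stock_tonnes / population)   # ~3.1 tonnes per person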

Negative effects

In layman's terms, anthropogenic metabolism describes the impact of the modern industrialized world on the planet. Much of this impact involves waste management, ecological footprints, water footprints, and flow analysis (i.e., the rate at which each human depletes the energy around them). Most anthropogenic metabolism happens in developed countries. According to Rosales, "Economic growth is at present the main cause of increased climate change, and climate change is a main mechanism of biodiversity loss; because of this, economic growth is a major catalyst of biodiversity loss."

A water footprint is the amount of water that each person uses in their daily life. Most of the world's water is salt water, which cannot be used in human food or water supplies. Therefore, the freshwater sources that were once plentiful are now being diminished by the anthropogenic metabolism of the growing population. The water footprint encompasses how much fresh water is needed for each consumer's needs. According to J. Allan, "there is a huge impact of water use on stores of surface and groundwater and on flows to which water is returned after use. These impacts are shown to be particularly high for manufacturing industries." For example, there are fewer than 10 economies worldwide with a significant water surplus, but these economies have successfully met, or have the potential to meet, the water deficits of the other 190 economies. Consumers enjoy the delusion of food and water security provided by virtual water trade.

In addition, the ecological footprint is a more economical and land-focused way of looking at human impact. Developed countries tend to have higher ecological footprints, which do not strictly correspond to a country's total population. According to research by Dias de Oliveira, Vaughan and Rykiel, "The Ecological Footprint...is an accounting tool based on two fundamental concepts, sustainability and carrying capacity. It makes it possible to estimate the resource consumption and waste assimilation requirements of a defined human population or economy sector in terms of corresponding productive land area."

One of the major cycles to which humans contribute, with a major impact on climate change, is the nitrogen cycle. This contribution comes from the nitrogen fertilizers that humans use. According to Gruber and Galloway, "The massive acceleration of the nitrogen cycle caused by the production and industrial use of artificial nitrogen fertilizers worldwide has led to a range of environmental problems. Most important is how the availability of nitrogen will affect the capacity of Earth's biosphere to continue absorbing carbon from the atmosphere and to thereby continue helping to mitigate climate change."

The carbon cycle is another major contributor to climate change, primarily through anthropogenic metabolism. Two examples of how humans add carbon to the atmosphere are burning fossil fuels and deforestation. Taking a close look at the carbon cycle, Peng, Thomas and Tian observe that "It is recognized that human activities, such as fossil fuel burning, land-use change, and forest harvesting at a large scale, have resulted in the increase of greenhouse gases in the atmosphere since the onset of the Industrial Revolution. The increasing amounts of greenhouse gases, particularly CO2 in the atmosphere, is believed to have induced climate change and global warming."

The impacts of climate change extend beyond humans. Extinctions of species are forecast because their habitats are being affected; marine animals are one example. There are major impacts on marine systems as a result of anthropogenic metabolism: according to Blaustein, the dramatic findings indicate that "every square kilometer [is] affected by some anthropogenic driver of ecological change".

The negative effects of anthropogenic metabolism are seen through the water footprint, the ecological footprint, the carbon cycle, and the nitrogen cycle. Studies of the marine ecosystem show major impacts by humans, particularly by developed countries, which host more industry and thus more anthropogenic metabolism.

Renaissance philosophy

From Wikipedia, the free encyclopedia