
Monday, March 23, 2026

Paradigm shift

From Wikipedia, the free encyclopedia

A paradigm shift is a fundamental change in the basic concepts and experimental practices of a scientific discipline. It is a concept in the philosophy of science that was introduced and brought into the common lexicon by the American physicist and philosopher Thomas Kuhn. Even though Kuhn restricted the use of the term to the natural sciences, the concept of a paradigm shift has also been used in numerous non-scientific contexts to describe a profound change in a fundamental model or perception of events.

Kuhn presented his notion of a paradigm shift in his influential book The Structure of Scientific Revolutions (1962).

Kuhn contrasts paradigm shifts, which characterize a scientific revolution, with the activity of normal science, which he describes as scientific work done within a prevailing framework or paradigm. Paradigm shifts arise when the dominant paradigm under which normal science operates is rendered incompatible with new phenomena, facilitating the adoption of a new theory or paradigm.

As one commentator summarizes:

Kuhn acknowledges having used the term "paradigm" in two different meanings. In the first one, "paradigm" designates what the members of a certain scientific community have in common, that is to say, the whole of techniques, patents and values shared by the members of the community. In the second sense, the paradigm is a single element of a whole, say for instance Newton's Principia, which, acting as a common model or an example... stands for the explicit rules and thus defines a coherent tradition of investigation. Thus the question is for Kuhn to investigate by means of the paradigm what makes possible the constitution of what he calls "normal science". That is to say, the science which can decide if a certain problem will be considered scientific or not. Normal science does not mean at all a science guided by a coherent system of rules, on the contrary, the rules can be derived from the paradigms, but the paradigms can guide the investigation also in the absence of rules. This is precisely the second meaning of the term "paradigm", which Kuhn considered the most new and profound, though it is in truth the oldest.

History

The nature of scientific revolutions has been studied by modern philosophy since Immanuel Kant used the phrase in the preface to the second edition of his Critique of Pure Reason (1787). Kant used the phrase "revolution of the way of thinking" (Revolution der Denkart) to refer to Greek mathematics and Newtonian physics. In the 20th century, new developments in the basic concepts of mathematics, physics, and biology revitalized interest in the question among scholars.

Original usage

Kuhn used the duck-rabbit ambiguous image, which Wittgenstein famously used to distinguish between the meaning of 'seeing as' and 'seeing that,' to show how a paradigm shift could cause one to see the same information in an entirely different way.

In his 1962 book The Structure of Scientific Revolutions, Kuhn explains the development of paradigm shifts in science into four stages:

  • Normal science – In this stage, which Kuhn sees as most prominent in science, a dominant paradigm is active. This paradigm is characterized by a set of theories and ideas that define what is possible and rational to do, giving scientists a clear set of tools to approach certain problems. Some examples of dominant paradigms that Kuhn gives are: Newtonian physics, caloric theory, and the theory of electromagnetism. Insofar as paradigms are useful, they expand both the scope and the tools with which scientists do research. Kuhn, in discussion of theory-ladenness, stresses that, rather than being monolithic, the paradigms that define normal science can be particular to different people. A chemist and a physicist might operate with different paradigms of what a helium atom is. Under normal science, scientists encounter anomalies that cannot be explained by the universally accepted paradigm within which scientific progress has thereto been made.
  • Extraordinary research – When enough significant anomalies have accrued against a current paradigm, the scientific discipline is thrown into a state of crisis. To address the crisis, scientists push the boundaries of normal science in what Kuhn calls “extraordinary research”, which is characterized by its exploratory nature. Without the structures of the dominant paradigm to depend on, scientists engaging in extraordinary research must produce new theories, thought experiments, and experiments to explain the anomalies. Kuhn sees the practice of this stage – “the proliferation of competing articulations, the willingness to try anything, the expression of explicit discontent, the recourse to philosophy and to debate over fundamentals” – as even more important to science than paradigm shifts.
  • Adoption of a new paradigm – Eventually a new paradigm is formed, which gains its own new followers. For Kuhn, this stage entails both resistance to the new paradigm, and reasons for why individual scientists adopt it. According to Max Planck, "a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it." Because scientists are committed to the dominant paradigm, and paradigm shifts involve gestalt-like changes, Kuhn stresses that paradigms are difficult to change. However, paradigms can gain influence by explaining or predicting phenomena much better than before (i.e., Bohr's model of the atom) or by being more subjectively pleasing. During this phase, proponents for competing paradigms address what Kuhn considers the core of a paradigm debate: whether a given paradigm will be a good guide for future problems – things that neither the proposed paradigm nor the dominant paradigm are capable of solving currently.
  • Aftermath of the scientific revolution – In the long run, the new paradigm becomes institutionalized as the dominant one. Textbooks are written, obscuring the revolutionary process.

Features

Paradigm shifts and progress

A common misinterpretation of paradigms is the belief that the discovery of paradigm shifts and the dynamic nature of science (with its many opportunities for subjective judgments by scientists) are a case for relativism: the view that all kinds of belief systems are equal. Kuhn vehemently denies this interpretation and states that when a scientific paradigm is replaced by a new one, albeit through a complex social process, the new one is always better, not just different.

Incommensurability

These claims of relativism are, however, tied to another claim that Kuhn does at least somewhat endorse: that the language and theories of different paradigms cannot be translated into one another or rationally evaluated against one another—that they are incommensurable. This gave rise to much talk of different peoples and cultures having radically different worldviews or conceptual schemes—so different that whether or not one was better, they could not be understood by one another.

Donald Davidson famously argued against this idea of conceptual relativism, claiming that the notion that any languages or theories could be incommensurable with one another was itself incoherent. If this is correct, Kuhn's claims must be taken in a weaker sense than they often are.

Furthermore, the hold of the Kuhnian analysis on social science has long been tenuous, with multi-paradigmatic approaches widely applied in order to understand complex human behaviour.

Gradualism vs. sudden change

Paradigm shifts tend to be most dramatic in sciences that appear to be stable and mature, as in physics at the end of the 19th century. At that time, physics seemed to be a discipline filling in the last few details of a largely worked-out system.

In The Structure of Scientific Revolutions, Kuhn wrote, "Successive transition from one paradigm to another via revolution is the usual developmental pattern of mature science" (p. 12). Kuhn's idea was itself revolutionary in its time as it caused a major change in the way that academics talk about science. Thus, it could be argued that it caused or was itself part of a "paradigm shift" in the history and sociology of science. However, Kuhn would not recognise such a paradigm shift. In the social sciences, people can still use earlier ideas to discuss the history of science.

Philosophers and historians of science, including Kuhn himself, ultimately accepted a modified version of Kuhn's model, which synthesizes his original view with the gradualist model that preceded it.

Examples

Natural sciences

Some of the "classical cases" of Kuhnian paradigm shifts in science include the Copernican revolution in astronomy and the transition from Newtonian mechanics to relativistic and quantum physics.

Social sciences

In Kuhn's view, the existence of a single reigning paradigm is characteristic of the natural sciences, while philosophy and much of social science were characterized by a "tradition of claims, counterclaims, and debates over fundamentals." Others have applied Kuhn's concept of paradigm shift to the social sciences.

Applied sciences

More recently, paradigm shifts have also been recognised in the applied sciences, for example in medicine's shift toward evidence-based practice.

Other uses

The term "paradigm shift" has found uses in other contexts, representing the notion of a major change in a certain thought pattern—a radical change in personal beliefs, complex systems or organizations, replacing the former way of thinking or organizing with a radically different way of thinking or organizing:

  • M. L. Handa, a professor of sociology in education at O.I.S.E. University of Toronto, Canada, developed the concept of a paradigm within the context of social sciences. He defines what he means by "paradigm" and introduces the idea of a "social paradigm". In addition, he identifies the basic component of any social paradigm. Like Kuhn, he addresses the issue of changing paradigms, the process popularly known as "paradigm shift". In this respect, he focuses on the social circumstances that precipitate such a shift. Relatedly, he addresses how that shift affects social institutions, including the institution of education.
  • The concept has been developed for technology and economics in the identification of new techno-economic paradigms as changes in technological systems that have a major influence on the behaviour of the entire economy (Carlota Perez; earlier work only on technological paradigms by Giovanni Dosi). This concept is linked to Joseph Schumpeter's idea of creative destruction. Examples include the move to mass production and the introduction of microelectronics.
  • Two photographs of the Earth from space, "Earthrise" (1968) and "The Blue Marble" (1972), are thought to have helped to usher in the environmentalist movement, which gained great prominence in the years immediately following distribution of those images.
  • Hans Küng applies Thomas Kuhn's theory of paradigm change to the entire history of Christian thought and theology. He identifies six historical "macromodels": 1) the apocalyptic paradigm of primitive Christianity, 2) the Hellenistic paradigm of the patristic period, 3) the medieval Roman Catholic paradigm, 4) the Protestant (Reformation) paradigm, 5) the modern Enlightenment paradigm, and 6) the emerging ecumenical paradigm. He also discusses five analogies between natural science and theology in relation to paradigm shifts. Küng addresses paradigm change in his books, Paradigm Change in Theology and Theology for the Third Millennium: An Ecumenical View.
  • In the later part of the 1990s, 'paradigm shift' emerged as a buzzword, popularized as marketing speak and appearing more frequently in print and publication. In his book Mind The Gaffe, author Larry Trask advises readers to refrain from using it, and to use caution when reading anything that contains the phrase. It is referred to in several articles and books as abused and overused to the point of becoming meaningless.
  • The concept of technological paradigms has been advanced, particularly by Giovanni Dosi.

Criticism

In a 2015 retrospective on Kuhn, the philosopher Martin Cohen describes the notion of the paradigm shift as a kind of intellectual virus, spreading from hard science to social science and on to the arts and even everyday political rhetoric. Cohen claims that Kuhn himself had only a very hazy idea of what it might mean and, in line with the Austrian philosopher of science Paul Feyerabend, accuses Kuhn of retreating from the more radical implications of his theory: that scientific facts are never really more than opinions whose popularity is transitory and far from conclusive. Cohen says that scientific knowledge is less certain than it is usually portrayed, and that science and knowledge generally are not the 'very sensible and reassuringly solid sort of affair' that Kuhn describes, in which progress involves periodic paradigm shifts where many of the old certainties are abandoned in order to open up new approaches to understanding that scientists would never have considered valid before. He argues that information cascades can distort rational scientific debate. He has focused on health issues, including the example of highly mediatised 'pandemic' alarms, and on why they have eventually turned out to be little more than scares.

One of Kuhn's early critics, Norwood Russell Hanson, criticized the paradigm shift theory as conceptually circular and therefore unfalsifiable.

There is also discussion of whether the concept of paradigm shift, and the way in which theories are evaluated under it, involves a genetic fallacy.

Semantic network


A semantic network, or frame network, is a knowledge base that represents semantic relations between concepts in a network. This is often used as a form of knowledge representation. It is a directed or undirected graph consisting of vertices, which represent concepts, and edges, which represent semantic relations between concepts, mapping or connecting semantic fields. A semantic network may be instantiated as, for example, a graph database or a concept map. Typical standardized semantic networks are expressed as semantic triples.
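As a minimal sketch of the triple form (illustrative concepts and relation names, not any particular standard's vocabulary), a semantic network can be stored as a set of subject–predicate–object triples and queried with a wildcard pattern:

```python
# Minimal semantic-triple store: the network is a set of
# (subject, predicate, object) edges; None acts as a wildcard.
triples = {
    ("canary", "is-a", "bird"),
    ("canary", "color", "yellow"),
    ("bird", "is-a", "vertebrate"),
    ("bird", "has-part", "wings"),
}

def query(s=None, p=None, o=None):
    """Return all triples matching the pattern (None = match anything)."""
    return {t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)}

print(query(s="canary"))           # every edge leaving "canary"
print(query(p="is-a", o="bird"))   # everything that is-a bird
```

The same pattern-matching idea underlies triple stores such as RDF graph databases, though real systems add indexing and a standard query language.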

Semantic networks are used in natural language processing applications such as semantic parsing and word-sense disambiguation. Semantic networks can also be used as a method to analyze large texts and identify the main themes and topics (e.g., of social media posts), to reveal biases (e.g., in news coverage), or even to map an entire research field.

History

The use of semantic networks in logic, and of directed acyclic graphs as a mnemonic tool, dates back centuries. The earliest documented use is the Greek philosopher Porphyry's commentary on Aristotle's categories in the third century AD.

In computing history, "Semantic Nets" for the propositional calculus were first implemented for computers by Richard H. Richens of the Cambridge Language Research Unit in 1956, as an "interlingua" for machine translation of natural languages, although the importance of this work and of the CLRU was only belatedly realized.

Semantic networks were also independently implemented by Robert F. Simmons and Sheldon Klein, using first-order predicate calculus as a base, after they were inspired by a demonstration of Victor Yngve. The "line of research was originated by the first President of the Association [Association for Computational Linguistics], Victor Yngve, who in 1960 had published descriptions of algorithms for using a phrase structure grammar to generate syntactically well-formed nonsense sentences. Sheldon Klein and I about 1962-1964 were fascinated by the technique and generalized it to a method for controlling the sense of what was generated by respecting the semantic dependencies of words as they occurred in text." Other researchers, most notably M. Ross Quillian and others at System Development Corporation, contributed to this work in the early 1960s as part of the SYNTHEX project. Most modern derivatives of the term "semantic network" cite these SDC publications as their background. Later prominent work was done by Allan M. Collins and Quillian (e.g., Collins and Quillian; Collins and Loftus). Still later, in 2006, Hermann Helbig fully described MultiNet.

In the late 1980s, two universities in the Netherlands, Groningen and Twente, jointly began a project called Knowledge Graphs: semantic networks with the added constraint that edges are restricted to a limited set of possible relations, to facilitate algebras on the graph. In the subsequent decades the distinction between semantic networks and knowledge graphs blurred; in 2012, Google gave its knowledge graph the name Knowledge Graph.

The Semantic Link Network has been systematically studied as a social semantics networking method. Its basic model consists of semantic nodes, semantic links between nodes, and a semantic space that defines the semantics of nodes and links and the reasoning rules on semantic links. The systematic theory and model were published in 2004. This research direction traces back to the definition of inheritance rules for efficient model retrieval in 1998 and to the Active Document Framework (ADF). Since 2003, the research has developed toward social semantic networking. This work is a systematic innovation of the age of the World Wide Web and global social networking, rather than an application or simple extension of the semantic net; its purpose and scope differ from those of the semantic net. Rules for reasoning and evolution, and the automatic discovery of implicit links, play an important role in the Semantic Link Network. More recently it has been developed to support Cyber-Physical-Social Intelligence and has been used to create a general summarization method. The self-organised Semantic Link Network has been integrated with a multi-dimensional category space to form a semantic space that supports advanced applications with multi-dimensional abstractions and self-organised semantic links, and it has been verified through text summarisation applications that Semantic Link Networks play an important role in understanding and representation. The Semantic Link Network has also been extended from cyberspace to cyber-physical-social space, where competition and symbiosis relations, and their roles in an evolving society, have been studied under the emerging topic of Cyber-Physical-Social Intelligence.

More specialized forms of semantic networks have been created for specific uses. For example, in 2008, Fawsy Bendeck's PhD thesis formalized the Semantic Similarity Network (SSN), which contains specialized relationships and propagation algorithms to simplify semantic similarity representation and calculation.

Basics of semantic networks

A semantic network is used when one has knowledge that is best understood as a set of concepts that are related to one another.

Most semantic networks are cognitively based. They consist of nodes and arcs, which can be organized into a taxonomic hierarchy. Semantic networks contributed the ideas of spreading activation, inheritance, and nodes as proto-objects.
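The inheritance idea can be sketched as a property lookup that climbs is-a arcs until some node in the taxonomic hierarchy supplies the requested value (hypothetical data, plain Python):

```python
# Property inheritance along is-a arcs: if a node lacks a property,
# look it up on its parent in the taxonomic hierarchy.
network = {
    "canary":     {"is-a": "bird", "color": "yellow"},
    "bird":       {"is-a": "vertebrate", "has-part": "wings"},
    "vertebrate": {"has-part": "spine"},
}

def lookup(node, prop):
    """Return the first value for prop found on node or any is-a ancestor."""
    while node is not None:
        attrs = network.get(node, {})
        if prop in attrs:
            return attrs[prop]
        node = attrs.get("is-a")   # climb one level up the hierarchy
    return None

print(lookup("canary", "has-part"))  # inherited from "bird"
```

Note that the nearest ancestor wins: "canary" inherits has-part wings from "bird" rather than spine from "vertebrate", which is the usual default-override behaviour of inheritance hierarchies.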

Examples

In Lisp

The following code shows an example of a semantic network in the Lisp programming language using an association list.

;; A small semantic network as an association list: each entry pairs a
;; concept with a list of (relation value) attributes.
(defparameter *database*
  '((canary  (is-a bird)
             (color yellow)
             (size small))
    (penguin (is-a bird)
             (movement swim))
    (bird    (is-a vertebrate)
             (has-part wings)
             (reproduction egg-laying))))

To extract all the information about the "canary" type, one would call (assoc 'canary *database*), which returns the canary entry together with its attribute list.

WordNet

An example of a semantic network is WordNet, a lexical database of English. It groups English words into sets of synonyms called synsets, provides short, general definitions, and records the various semantic relations between these synonym sets. Some of the most common semantic relations defined are meronymy (A is a meronym of B if A is part of B), holonymy (B is a holonym of A if B contains A), hyponymy (or troponymy) (A is subordinate of B; A is kind of B), hypernymy (A is superordinate of B), synonymy (A denotes the same as B) and antonymy (A denotes the opposite of B).

WordNet's properties have been studied from a network theory perspective and compared to other semantic networks created from Roget's Thesaurus and from word-association tasks. From this perspective, all three exhibit small-world structure.

Other examples

It is also possible to represent logical descriptions using semantic networks such as the existential graphs of Charles Sanders Peirce or the related conceptual graphs of John F. Sowa. These have expressive power equal to or exceeding standard first-order predicate logic. Unlike WordNet or other lexical or browsing networks, semantic networks using these representations can be used for reliable automated logical deduction. Some automated reasoners exploit the graph-theoretic features of the networks during processing.

Other examples of semantic networks are Gellish models. Gellish English, with its Gellish English dictionary, is a formal language that is defined as a network of relations between concepts and names of concepts. Gellish English is a formal subset of natural English, just as Gellish Dutch is a formal subset of Dutch, whereas multiple languages share the same concepts. Other Gellish networks consist of knowledge models and information models that are expressed in the Gellish language. A Gellish network is a network of (binary) relations between things. Each relation in the network is an expression of a fact that is classified by a relation type. Each relation type itself is a concept that is defined in the Gellish language dictionary. Each related thing is either a concept or an individual thing that is classified by a concept. The definitions of concepts are created in the form of definition models (definition networks) that together form a Gellish Dictionary. A Gellish network can be documented in a Gellish database and is computer interpretable.

SciCrunch is a collaboratively edited knowledge base for scientific resources. It provides unambiguous identifiers (Research Resource IDentifiers, or RRIDs) for software, lab tools, and other resources, and it also provides options to create links between RRIDs and from communities.

Another example of semantic networks, based on category theory, is ologs. Here each type is an object, representing a set of things, and each arrow is a morphism, representing a function. Commutative diagrams are also prescribed to constrain the semantics.

In the social sciences people sometimes use the term semantic network to refer to co-occurrence networks.

Software tools

There are also elaborate types of semantic networks connected with corresponding sets of software tools used for lexical knowledge engineering, like the Semantic Network Processing System (SNePS) of Stuart C. Shapiro or the MultiNet paradigm of Hermann Helbig, especially suited for the semantic representation of natural language expressions and used in several NLP applications.

Semantic networks are used in specialized information retrieval tasks, such as plagiarism detection. They provide information on hierarchical relations in order to employ semantic compression to reduce language diversity and enable the system to match word meanings, independently from sets of words used.
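As a toy illustration of semantic compression (the hypernym table below is invented for the example, not drawn from any real plagiarism-detection system), each word can be mapped to a shared hypernym before two texts are compared, so that related but different word choices still match:

```python
# Toy semantic compression: map words to a common hypernym so that
# texts using different but related words still overlap.
hypernym = {"canary": "bird", "sparrow": "bird",
            "car": "vehicle", "automobile": "vehicle"}

def compress(words):
    """Replace each word with its hypernym when one is known."""
    return [hypernym.get(w, w) for w in words]

a = compress("the canary saw a car".split())
b = compress("the sparrow saw an automobile".split())
overlap = len(set(a) & set(b))
print(a, b, overlap)
```

Without compression the two sentences share only "the" and "saw"; after compression they also share "bird" and "vehicle", which is the reduction in language diversity the paragraph above describes.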

The Knowledge Graph proposed by Google in 2012 is an application of semantic networks in search engines.

Modeling multi-relational data like semantic networks in low-dimensional spaces through forms of embedding has benefits in expressing entity relationships as well as extracting relations from mediums like text. There are many approaches to learning these embeddings, notably using Bayesian clustering frameworks or energy-based frameworks, and more recently, TransE (NIPS 2013). Applications of embedding knowledge base data include Social network analysis and Relationship extraction.
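The core of TransE can be sketched in a few lines: each entity and relation gets a vector, and a triple (h, r, t) is scored by how closely h + r lands on t. The tiny 2-d embeddings below are chosen by hand for illustration; a real system learns them by gradient descent:

```python
# TransE-style scoring: a triple (head, relation, tail) is plausible
# when the tail embedding is close to head + relation.
def score(h, r, t):
    """Negative Euclidean distance ||h + r - t||; higher = more plausible."""
    return -sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t)) ** 0.5

# Hand-picked toy embeddings so that "canary is-a bird" fits exactly.
emb = {"canary": (1.0, 0.0), "bird": (1.0, 1.0), "penguin": (4.0, 0.0)}
is_a = (0.0, 1.0)

print(score(emb["canary"], is_a, emb["bird"]))    # distance 0: perfect fit
print(score(emb["penguin"], is_a, emb["bird"]))   # larger distance: worse fit
```

Training adjusts the embeddings so that observed triples score higher than corrupted ones, which is what makes relation extraction and link prediction possible in the learned space.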

Systems biology


Systems biology is the computational and mathematical analysis and modeling of complex biological systems. It is a biology-based interdisciplinary field of study that focuses on complex interactions within biological systems, using a holistic approach (holism instead of the more traditional reductionism) to biological research. This multifaceted research domain necessitates the collaborative efforts of chemists, biologists, mathematicians, physicists, and engineers to decipher the biology of intricate living systems by merging various quantitative molecular measurements with carefully constructed mathematical models. It offers a comprehensive approach to understanding the complex relationships within biological systems. In contrast to conventional biological studies that typically center on isolated elements, systems biology seeks to combine different biological data to create models that illustrate and elucidate the dynamic interactions within a system. This methodology is essential for understanding the complex networks of genes, proteins, and metabolites that influence cellular activities and the traits of organisms. One of the aims of systems biology is to model and discover emergent properties of cells, tissues and organisms functioning as a system, whose theoretical description is only possible using techniques of systems biology. By exploring how function emerges from dynamic interactions, systems biology bridges the gaps that exist between molecules and physiological processes.

As a paradigm, systems biology is usually defined in antithesis to the so-called reductionist paradigm (biological organisation), although it is consistent with the scientific method. The distinction between the two paradigms is referred to in these quotations: "the reductionist approach has successfully identified most of the components and many of the interactions but, unfortunately, offers no convincing concepts or methods to understand how system properties emerge ... the pluralism of causes and effects in biological networks is better addressed by observing, through quantitative measures, multiple components simultaneously and by rigorous data integration with mathematical models." (Sauer et al.) "Systems biology ... is about putting together rather than taking apart, integration rather than reduction. It requires that we develop ways of thinking about integration that are as rigorous as our reductionist programmes, but different. ... It means changing our philosophy, in the full sense of the term." (Denis Noble)

The central flow of biological information and the corresponding omics fields, emphasizing the systems biology approach of integrating genomics, transcriptomics, proteomics, and metabolomics to link genotype to phenotype.

Systems biology can also be understood as a series of operational protocols used for performing research, namely a cycle composed of theory; analytic or computational modelling to propose specific testable hypotheses about a biological system; experimental validation; and the use of the newly acquired quantitative description of cells or cell processes to refine the computational model or theory. Since the objective is a model of the interactions in a system, the experimental techniques that most suit systems biology are those that are system-wide and attempt to be as complete as possible. Therefore, multi-omics (transcriptomics, metabolomics, proteomics, etc.) and high-throughput techniques are used to collect quantitative data for the construction and validation of models.

A comprehensive systems biology approach necessitates: (i) a thorough characterization of an organism concerning its molecular components, the interactions among these molecules, and how these interactions contribute to cellular functions; (ii) a detailed spatio-temporal molecular characterization of a cell (for example, component dynamics, compartmentalization, and vesicle transport); and (iii) an extensive systems analysis of the cell's 'molecular response' to both external and internal perturbations. Furthermore, the data from (i) and (ii) should be synthesized into mathematical models to test knowledge by generating predictions (hypotheses), uncovering new biological mechanisms, assessing the system's behavior derived from (iii), and ultimately formulating rational strategies for controlling and manipulating cells. To tackle these challenges, systems biology must incorporate methods and approaches from various disciplines that have not traditionally interfaced with one another. The emergence of multi-omics technologies has transformed systems biology by providing extensive datasets that cover different biological layers, including genomics, transcriptomics, proteomics, and metabolomics. These technologies enable the large-scale measurement of biomolecules, leading to a more profound comprehension of biological processes and interactions. Increasingly, methods such as network analysis, machine learning, and pathway enrichment are utilized to integrate and interpret multi-omics data, thereby improving our understanding of biological functions and disease mechanisms.

History

Trends in systems biology research, measured by topics among the 30 most-cited systems biology papers per year, 1992–2013: database-development articles increased over the period; articles about algorithms fluctuated but remained fairly steady; network-properties and software-development articles remained low but rose around the midpoint; articles on metabolic flux analysis decreased. In 1992 the most-cited topics were algorithms, equations, modeling and simulation; by 2012 the most-cited were database-development articles.

Holism vs. Reductionism

It is challenging to trace the origins and beginnings of systems biology. A comprehensive perspective on the human body was central to the medical practices of Greek, Roman, and East Asian traditions, where physicians and thinkers like Hippocrates believed that health and illness were linked to the equilibrium or disruption of bodily fluids known as humors. This holistic perspective persisted in the Western world throughout the 19th and 20th centuries, with prominent physiologists viewing the body as controlled by various systems, including the nervous system, the gastrointestinal system, and the cardiovascular system. In the latter half of the 20th century, however, this way of thinking was largely supplanted by reductionism: To grasp how the body functions properly, one needed to comprehend the role of each component, from tissues and cells to the complete set of intracellular molecular building blocks.

In the 17th century, the triumphs of physics and the advancement of mechanical clockwork prompted a reductionist viewpoint in biology, interpreting organisms as intricate machines made up of simpler elements.

Jan Smuts (1870–1950), naturalist, philosopher, and twice Prime Minister of South Africa, coined the commonly used term holism. Whole systems such as cells, tissues, organisms, and populations were proposed to have unique (emergent) properties; on this view, the behavior of the whole could not be reassembled from the properties of the individual components, and new technologies were necessary to define and understand the behavior of systems.

Even though reductionism and holism are often contrasted with one another, they can be synthesized. One must understand how organisms are built (reductionism), while it is just as important to understand why they are so arranged (systems; holism). Each provides useful insights and answers different questions. However, the study of biological systems requires knowledge about control and design paradigms, as well as principles of structural stability, resilience, and robustness that are not directly inferred from mechanistic information. More profound insight will be gained by employing computer modeling to overcome the complexity in biological systems.

Nevertheless, this perspective was consistently balanced by thinkers who underscored the significance of organization and emergent traits in living systems. The reductionist perspective has achieved remarkable success, and our understanding of biological processes has expanded with incredible speed and intensity. Alongside these extraordinary advancements, however, science gradually came to understand that complete information about molecular components alone would not suffice to elucidate the workings of life: the individual components rarely explain the function of a complex system. It is now commonly recognized that we need approaches for reconstructing integrated systems from their constituent parts and processes if we are to comprehend biological phenomena and manipulate them in a thoughtful, focused way.

Origin of systems biology as a field

In 1968, the term "systems biology" was first introduced at a conference. Researchers in the discipline soon recognized, and the wider public gradually came to appreciate, that computational approaches were necessary to fully articulate the concepts and potential of systems biology. Specifically, these techniques needed to treat biological phenomena as complex, multi-layered, adaptive, and dynamic systems. They had to account for transformations and intricate nonlinearities, thereby allowing the smooth integration of smaller models ("modules") into larger, well-organized assemblies of models within complex settings. It became clear that mathematics and computation were vital for these methods. An acceleration of systems understanding came with the publication of the first ground-breaking text compiling molecular, physiological, and anatomical individuality in animals, which has been described as a revolution.

Initially, the wider scientific community was reluctant to accept the integration of computational methods and control theory into the exploration of living systems, believing that "biology was too complex to apply mathematics." As the new millennium neared, however, this viewpoint underwent a significant and lasting transformation: more scientists began integrating mathematical concepts to understand and solve biological problems. Systems biology is now widely applied in several fields, including agriculture and medicine.

Approaches to systems biology

Top-down approach

Top-down systems biology identifies molecular interaction networks by analyzing the correlated behaviors observed in large-scale 'omics' studies. With the advent of 'omics', this top-down strategy has become a leading approach. It begins with an overarching perspective of the system's behavior – examining everything at once by gathering genome-wide experimental data – and seeks to unveil and understand biological mechanisms at a more granular level, namely the individual components and their interactions. In this framework, the primary goal is to uncover novel molecular mechanisms through a cyclical process that begins with experimental data, moves to data analysis and integration to identify correlations among molecule concentrations, and concludes with the development of hypotheses about the co- and inter-regulation of molecular groups. These hypotheses then generate new predictions of correlations, which can be explored in subsequent experiments or through additional biochemical investigations. The notable advantages of top-down systems biology lie in its potential to provide comprehensive (i.e., genome-wide) insights and its focus on the metabolome, fluxome, transcriptome, and/or proteome. Top-down methods prioritize overall system states as influencing factors in models, along with the computational (or optimality) principles that govern the dynamics of the global system. For instance, while the dynamics of neural motor control emerge from the interactions of millions of neurons, one can also characterize the neural motor system as a kind of feedback control system, which directs a 'plant' (the body) and guides movement by minimizing 'cost functions' (e.g., achieving trajectories with minimal jerk).
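The correlation-driven core of this top-down cycle can be sketched with a toy example. The gene names, effect sizes, and the 0.7 cutoff below are all hypothetical; real analyses operate on thousands of measured transcripts or metabolites, but the logic is the same: measure everything, correlate, then hypothesize co-regulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 50

# Hypothetical expression profiles: geneB is co-regulated with geneA,
# while geneC varies independently.
geneA = rng.normal(10.0, 2.0, n_samples)
geneB = 0.8 * geneA + rng.normal(0.0, 0.5, n_samples)
geneC = rng.normal(10.0, 2.0, n_samples)

data = np.vstack([geneA, geneB, geneC])
names = ["geneA", "geneB", "geneC"]

# Pearson correlation across samples: the raw material of a top-down analysis.
corr = np.corrcoef(data)

# Hypothesize co-regulation for strongly correlated pairs.
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if abs(corr[i, j]) > 0.7:
            print(f"{names[i]} ~ {names[j]}: r = {corr[i, j]:.2f}")
```

The correlated pair would then feed the next turn of the cycle: a follow-up experiment to test whether the two genes really share a regulator.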

Bottom-up approach

Bottom-up systems biology uses molecular techniques to characterize a subsystem in a high degree of mechanistic detail and infers the functional properties that may arise from it. This approach begins with the foundational elements, developing the interactive behavior (rate equation) of each component process (e.g., enzymatic reactions) within a manageable portion of the system. It examines the mechanisms through which functional properties arise from the interactions of known components. These formulations are then combined to understand the behavior of the system. The primary goal of this method is to integrate the pathway models into a comprehensive model representing the entire system – the top, or whole. As research and understanding advance, these models are often expanded by incorporating additional processes in high mechanistic detail.

The bottom-up approach facilitates the integration and translation of drug-specific in vitro findings to the in vivo human context. This encompasses data collected during the early phases of drug development, such as safety evaluations. When assessing cardiac safety, a purely bottom-up modeling and simulation method entails reconstructing the processes that determine exposure, which includes the plasma (or heart tissue) concentration-time profiles and their electrophysiological implications, ideally incorporating hemodynamic effects and changes in contractility. Achieving this necessitates various models, ranging from single-cell to advanced three-dimensional (3D) multiphase models. Information from multiple in vitro systems that serve as stand-ins for the in vivo absorption, distribution, metabolism, and excretion (ADME) processes enables predictions of drug exposure, while in vitro data on drug-ion channel interactions support the translation of exposure to body surface potentials and the calculation of important electrophysiological endpoints. The separation of data related to the drug, system, and trial design, which is characteristic of the bottom-up approach, allows for predictions of exposure-response relationships considering both inter- and intra-individual variability, making it a valuable tool for evaluating drug effects at a population level. Numerous successful instances of applying physiologically based pharmacokinetic (PBPK) modeling in drug discovery and development have been documented in the literature.
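At the simplest end of this modeling spectrum sits the classic one-compartment pharmacokinetic model with first-order absorption and elimination, which already reproduces a plasma concentration-time profile of the kind the bottom-up approach builds on. All parameter values below (dose, bioavailability F, rate constants ka and ke, volume of distribution V) are assumed purely for illustration, not taken from any real drug.

```python
import math

def plasma_conc(t, dose=100.0, F=0.9, ka=1.0, ke=0.2, V=40.0):
    """Plasma concentration (mg/L) at time t (h) for a one-compartment
    model with first-order absorption (ka) and elimination (ke)."""
    return (F * dose * ka) / (V * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# Sample the concentration-time profile every 4 h over one day.
profile = [(t, plasma_conc(t)) for t in range(0, 25, 4)]
cmax_t, cmax = max(profile, key=lambda p: p[1])
```

A full bottom-up cardiac-safety workflow would couple such an exposure model to electrophysiological models of drug-ion channel interactions; this sketch covers only the exposure half.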

Associated disciplines

Overview of signal transduction pathways

Under the interpretation of systems biology as the analysis of large data sets with interdisciplinary tools, a typical application is metabolomics, the study of the complete set of metabolic products (metabolites) in the system at the organism, cell, or tissue level.

Items that may be included in a computer database include:

  • Phenomics: organismal variation in phenotype as it changes during its life span.
  • Genomics: organismal deoxyribonucleic acid (DNA) sequence, including intra-organismal cell-specific variation (e.g., telomere length variation).
  • Epigenomics/epigenetics: organismal and corresponding cell-specific transcriptomic regulating factors not empirically coded in the genomic sequence (e.g., DNA methylation, histone acetylation and deacetylation).
  • Transcriptomics: organismal, tissue, or whole-cell gene expression measurements by DNA microarrays or serial analysis of gene expression.
  • Interferomics: organismal, tissue, or cell-level transcript-correcting factors (e.g., RNA interference).
  • Proteomics: organismal, tissue, or cell-level measurements of proteins and peptides via two-dimensional gel electrophoresis, mass spectrometry, or multi-dimensional protein identification techniques (advanced HPLC systems coupled with mass spectrometry). Subdisciplines include phosphoproteomics, glycoproteomics, and other methods to detect chemically modified proteins.
  • Glycomics: organismal, tissue, or cell-level measurements of carbohydrates.
  • Lipidomics: organismal, tissue, or cell-level measurements of lipids.

The molecular interactions within the cell are also studied; this is called interactomics. A discipline in this field of study is protein–protein interactions, although interactomics includes the interactions of other molecules as well. Related areas include neuroelectrodynamics, in which a computer's or a brain's computing function is studied as a dynamic system together with its (bio)physical mechanisms, and fluxomics, the measurement of the rates of metabolic reactions in a biological system (cell, tissue, or organism).

There are two main approaches to a systems biology problem: top-down and bottom-up. The top-down approach takes as much of the system into account as possible and relies largely on experimental results; the RNA-Seq technique is an example of an experimental top-down approach. Conversely, the bottom-up approach is used to create detailed models while also incorporating experimental data; an example is the use of circuit models to describe a simple gene network.

Various technologies are utilized to capture dynamic changes in mRNA, proteins, and post-translational modifications. Related disciplines include mechanobiology, the study of forces and physical properties at all scales and their interplay with other regulatory mechanisms; biosemiotics, the analysis of the system of sign relations of an organism or other biosystems; and physiomics, the systematic study of the physiome in biology.

Cancer systems biology is an example of the systems biology approach, distinguished by its specific object of study (tumorigenesis and treatment of cancer). It works with specific data (patient samples and high-throughput data, with particular attention to characterizing the cancer genome in patient tumour samples) and tools (immortalized cancer cell lines, mouse models of tumorigenesis, xenograft models, high-throughput sequencing methods, siRNA-based gene knockdown high-throughput screens, and computational modeling of the consequences of somatic mutations and genome instability). The long-term objective of the systems biology of cancer is the ability to better diagnose cancer, classify it, and better predict the outcome of a suggested treatment, which is a basis for personalized cancer medicine and, in the more distant future, a virtual cancer patient. Significant efforts in computational systems biology of cancer have been made in creating realistic multi-scale in silico models of various tumours.

The systems biology approach often involves the development of mechanistic models, such as the reconstruction of dynamic systems from the quantitative properties of their elementary building blocks. For instance, a cellular network can be modelled mathematically using methods coming from chemical kinetics and control theory. Due to the large number of parameters, variables and constraints in cellular networks, numerical and computational techniques are often used (e.g., flux balance analysis).

Other aspects of computer science, informatics, and statistics are also used in systems biology. These include:

  • New forms of computational models, such as the use of process calculi to model biological processes (notable approaches include stochastic π-calculus, BioAmbients, Beta Binders, BioPEPA, and Brane calculus) and constraint-based modeling.
  • Integration of information from the literature, using techniques of information extraction and text mining.
  • Development of online databases and repositories for sharing data and models, approaches to database integration and software interoperability via loose coupling of software, websites, and databases, or commercial suites.
  • Network-based approaches for analyzing high-dimensional genomic data sets. For example, weighted correlation network analysis is often used for identifying clusters (referred to as modules), modeling the relationship between clusters, calculating fuzzy measures of cluster (module) membership, identifying intramodular hubs, and studying cluster preservation in other data sets.
  • Pathway-based methods for omics data analysis, e.g., approaches to identify and score pathways with differential activity of their gene, protein, or metabolite members.

Much of the analysis of genomic data sets also includes identifying correlations. Additionally, as much of the information comes from different fields, the development of syntactically and semantically sound ways of representing biological models is needed.
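The soft-thresholding step at the heart of weighted correlation network analysis can be sketched in a few lines. The synthetic data, the power β = 6, and the 0.2 adjacency cutoff are assumptions for illustration, and connected components of the thresholded graph stand in for the hierarchical clustering of topological overlap that WGCNA itself uses.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60  # samples

# Synthetic data: genes 0-2 form one co-expressed cluster, genes 3-4 another.
base1 = rng.normal(0, 1, n)
base2 = rng.normal(0, 1, n)
expr = np.array([base1 + rng.normal(0, 0.3, n),
                 base1 + rng.normal(0, 0.3, n),
                 base1 + rng.normal(0, 0.3, n),
                 base2 + rng.normal(0, 0.3, n),
                 base2 + rng.normal(0, 0.3, n)])

beta = 6  # soft-thresholding power, as in WGCNA
adjacency = np.abs(np.corrcoef(expr)) ** beta
np.fill_diagonal(adjacency, 0)

def modules(adj, cut=0.2):
    """Crude module detection: connected components of the thresholded graph."""
    n_genes = adj.shape[0]
    seen, out = set(), []
    for s in range(n_genes):
        if s in seen:
            continue
        stack, comp = [s], set()
        while stack:
            g = stack.pop()
            if g in comp:
                continue
            comp.add(g)
            stack.extend(h for h in range(n_genes)
                         if adj[g, h] > cut and h not in comp)
        seen |= comp
        out.append(sorted(comp))
    return out

mods = modules(adjacency)
```

Raising |correlation| to the power β suppresses weak, noisy correlations while preserving strong ones, which is why the two planted clusters separate cleanly.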

Model and its types

Definition

A model serves as a conceptual depiction of objects or processes, highlighting certain characteristics of these items or activities. A model captures only certain facets of reality; however, when created correctly, this limited scope is adequate because the primary goal of modeling is to address specific inquiries. The saying, "essentially, all models are wrong, but some are useful," attributed to the statistician George Box, is a suitable principle for constructing models.

Types of models

  • Boolean Models: These models are also known as logical models and represent biological systems using binary states, allowing for the analysis of gene regulatory networks and signaling pathways. They are advantageous for their simplicity and ability to capture qualitative behaviors.
    This figure shows the tool BooleSim, which is used for simulating and manipulating Boolean models. Panels A and C show a simple synthetic repressilator; panels B and D show the corresponding output (time series) obtained with BooleSim. The boxes represent nodes and the arrows the relationships between them; pointed and blunt arrows indicate promotion and repression of a gene, respectively. Yellow boxes indicate that a gene is switched on and blue boxes that it is switched off.
  • Petri nets (PN): A type of bipartite graph consisting of two kinds of nodes: places and transitions. When a transition fires, tokens are transferred from its input places to its output places; the process is asynchronous and non-deterministic.
  • Polynomial dynamical systems (PDS): An algebraically based approach that represents a specific type of sequential FDS (finite dynamical system) operating over a finite field. Each transition function is an element of a polynomial ring defined over the finite field. It employs rapid techniques from computer algebra and computational algebraic geometry, originating from the Buchberger algorithm, to compute the Gröbner bases of ideals in these rings. An ideal consists of a set of polynomials that remains closed under polynomial combinations.
  • Differential equation models (ODE and PDE): Ordinary differential equations (ODEs) are commonly utilized to represent the temporal dynamics of networks, while partial differential equations (PDEs) are employed to describe behaviors occurring in both space and time, enabling the modeling of pattern formation. These spatiotemporal diffusion–reaction systems demonstrate the emergence of self-organizing patterns, typically articulated by the general local activity principle, which elucidates the factors contributing to complexity and self-organization observed in nature.
  • Bayesian models: These models are commonly referred to as dynamic models. They utilize a probabilistic approach that enables the integration of prior knowledge through Bayes' theorem. A challenge can arise when determining the direction of an interaction.
  • Finite State Linear Model (FSML): This model integrates continuous variables (such as protein concentration) with discrete elements (like promoter regions that have a limited number of states) in modeling.
  • Agent-based models (ABM): Initially created within the fields of social sciences and economics, it models the behavior of individual agents (such as genes, mRNAs (siRNA, miRNA, lncRNA), proteins, and transcription factors) and examines how their interactions influence the larger system, which in this case is the cell.
  • Rule-based models: In this approach, molecular interactions are simulated using local rules that can be applied even in the absence of a specific network structure. The step of inferring the network is not required, allowing these network-free methods to avoid the complex challenges associated with network inference.
  • Piecewise-linear differential equation models (PLDE): The model is composed of a piecewise-linear representation of differential equations using step functions, along with a collection of inequality restrictions for the parameter values.
A simple three-protein negative feedback loop modeled with mass-action kinetic differential equations. Each protein interaction is described by a Michaelis–Menten reaction.
  • Stochastic models: Models utilizing the Gillespie algorithm for addressing the chemical master equation provide the likelihood that a particular molecular species will possess a defined molecular population or concentration at a specified future point in time. The Gillespie method is the most computationally intensive option available. In cases where the number of molecules is low or when modeling the effects of molecular crowding is desired, the stochastic approach is preferred.
    The graph demonstrates the enzymatic conversion of cellulose to glucose over time, where the red line denotes cellulose and the green line denotes glucose, with key enzymes facilitating the process and their concentrations changing as the reaction progresses (time course run in COPASI). This is a typical kinetic profile for a multi-enzyme hydrolysis system.
  • State Space Model (SSM): Linear or non-linear modeling techniques that utilize an abstract state space along with various algorithms, which include Bayesian and other statistical methods, autoregressive models, and Kalman filtering.
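As a concrete instance of the first entry in the list, the three-gene repressilator from the BooleSim figure can be simulated synchronously in a few lines of plain Python. The wiring (A represses B, B represses C, C represses A) and the initial state are assumptions for this sketch.

```python
# Synchronous Boolean update for a three-gene repressilator:
# A represses B, B represses C, C represses A.
def step(state):
    a, b, c = state
    return (not c, not a, not b)  # each gene is ON iff its repressor is OFF

state = (True, False, False)  # start with only A switched on
trajectory = [state]
for _ in range(6):
    state = step(state)
    trajectory.append(state)

# Under synchronous updating the state returns to itself after 6 steps,
# so the Boolean model oscillates with period 6.
```

This captures the qualitative behavior (oscillation) without any rate constants, which is exactly the appeal of Boolean models noted above.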

Creating biological models

Researchers begin by choosing a biological pathway and diagramming all of the protein, gene, and/or metabolic pathways. After determining all of the interactions, mass-action or enzyme kinetic rate laws are used to describe the speed of the reactions in the system. Using mass conservation, the differential equations for the biological system can be constructed. Experiments or parameter fitting can then be done to determine the parameter values to use in the differential equations; these values will be the various kinetic constants required to fully describe the model. The resulting model determines the behavior of species in the biological system and brings new insight into the specific activities of the system. When it is not possible to measure all reaction rates of a system, unknown reaction rates are estimated by simulating the model with the known parameters against a target behavior, which yields possible parameter values.
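This procedure can be illustrated with the three-protein negative feedback loop from the earlier figure: diagram the interactions, write rate laws, assemble ODEs, then integrate. Michaelis–Menten-type production terms and first-order (mass-action) degradation are used here, and all rate constants are assumed for illustration rather than fitted to data.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical rate constants; a real model would fit these to experiments.
k_syn, k_deg, K = 1.0, 0.5, 0.5

def feedback_loop(t, y):
    """Three-protein negative feedback loop: C represses production of A,
    A activates B, B activates C. Production uses Michaelis-Menten-type
    saturating terms; degradation is first-order mass action."""
    a, b, c = y
    da = k_syn * K / (K + c) - k_deg * a   # repressed by C
    db = k_syn * a / (K + a) - k_deg * b   # activated by A
    dc = k_syn * b / (K + b) - k_deg * c   # activated by B
    return [da, db, dc]

sol = solve_ivp(feedback_loop, (0, 50), [1.0, 0.0, 0.0])
a_end, b_end, c_end = sol.y[:, -1]  # concentrations at t = 50
```

With these assumed parameters the loop relaxes to a stable steady state; stronger nonlinearity (higher Hill coefficients) would be needed for sustained oscillations.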

The use of constraint-based reconstruction and analysis (COBRA) methods has become popular among systems biologists to simulate and predict the metabolic phenotypes, using genome-scale models. One of the methods is the flux balance analysis (FBA) approach, by which one can study the biochemical networks and analyze the flow of metabolites through a particular metabolic network, by optimizing the objective function of interest (e.g. maximizing biomass production to predict growth).
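A minimal FBA instance reduces to a linear program: maximize an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. The four-reaction toy network below is invented for illustration; genome-scale models have thousands of reactions but the same mathematical structure.

```python
import numpy as np
from scipy.optimize import linprog

# Toy metabolic network (assumed for illustration):
#   v1: uptake -> A      v2: A -> B
#   v3: A -> C           v4: B + C -> biomass
# Rows = internal metabolites A, B, C; columns = reactions v1..v4.
S = np.array([
    [1, -1, -1,  0],   # A
    [0,  1,  0, -1],   # B
    [0,  0,  1, -1],   # C
])

bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]  # uptake capped at 10
c = [0, 0, 0, -1]  # maximize biomass flux v4 (linprog minimizes)

res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
fluxes = res.x  # optimal steady-state flux distribution
```

Because producing one unit of biomass consumes one unit each of B and C, the optimum splits the capped uptake evenly (v2 = v3 = 5) and achieves a biomass flux of 5.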

Applications of systems biology

Systems biology, an interdisciplinary field that combines biology, data analysis, and mathematical modeling, has revolutionized various sectors, including medicine, agriculture, and environmental science. By integrating omics data (genomics, proteomics, metabolomics, etc.), systems biology provides a holistic understanding of complex biological systems, enabling advancements in drug discovery, crop improvement, and environmental impact assessment, in both industrial and academic research. In agriculture, systems biology is used to identify the genetic and metabolic components of complex characteristics through trait dissection, it aids in the comprehension of plant–pathogen interactions in disease resistance, and it is utilized to enhance nutritional content through metabolic engineering.

Cancer

Approaches to cancer systems biology have made it possible to effectively combine experimental data with computational algorithms and, in some cases, to apply actionable targeted medicines for the treatment of cancer. To apply innovative cancer systems biology techniques and boost their effectiveness in customizing new, individualized cancer treatment modalities, comprehensive multi-omics data acquired through the sequencing of tumor samples and experimental model systems will be crucial.

Cancer systems biology has the potential to provide insights into intratumor heterogeneity and identify therapeutic options. In particular, enhanced cancer systems biology methods that incorporate not only multi-omics data from tumors, but also extensive experimental models derived from patients can assist clinicians in their decision-making processes, ultimately aiming to address treatment failures in cancer.

Drug development

Before the 1990s, phenotypic drug discovery formed the foundation of most research in drug discovery, utilizing cellular and animal disease models to find drugs without focusing on a specific molecular target. However, following the completion of the human genome project, target-based drug discovery has become the predominant approach in contemporary pharmaceutical research for various reasons. Gene knockout and transgenic models enable researchers to investigate and gain insights into the function of targets and the mechanisms by which drugs operate on a molecular level. Target-based assays lend themselves better to high-throughput screening, which simplifies the process of identifying second-generation drugs—those that improve upon first-in-class drugs in aspects such as potency, selectivity, and half-life, especially when combined with structure-based drug design. To do this, researchers utilize the three-dimensional structure of target proteins and computational models of interactions between small molecules and those targets to aid in the identification of superior compounds.

Food safety and quality

The multi-omics technologies of systems biology can also be used in aspects of food quality and safety. High-throughput omics techniques, including genomics, proteomics, and metabolomics, offer valuable insights into the molecular composition of food products, facilitating the identification of critical elements that affect food quality and safety. For example, integrating omics data can enhance the understanding of the metabolic pathways and associated functional gene patterns that contribute to both the nutritional value and safety of food crops. This comprehensive approach supports the creation of food products that are both nutritious and safe, capable of satisfying the increasing global demand.

Environmental systems biology

Genomics examines all genes as an evolving system over time, aiming to understand their interactions and effects on biological pathways, networks, and physiology in a broader context compared to genetics. As a result, genomics holds significant potential for discovering clusters of genes associated with complex disorders, aiding in the comprehension and management of diseases induced by environmental factors.

When exploring the interactions between the environment and the genome as contributors to complex diseases, it is clear that the genome itself cannot be altered for the time being. However, once these interactions are recognized, it is feasible to minimize exposure or adjust lifestyle factors related to the environmental aspect of the disease. Gene-environment interactions can occur through direct associations with active metabolites at certain locations within the genome, potentially leading to mutations that could cause human diseases. Indirect interactions with the human genome can take place through intracellular receptors that function as ligand-activated transcription factors, which modulate gene expression and maintain cellular balance, or with an environmental factor that may produce detrimental effects. This type of environmental-gene interaction could be more straightforward to investigate than direct interactions since there are numerous markers of this kind of interaction that are readily measurable before the disease manifests. Examples of this include the expression of cytochrome P450 genes following exposure to environmental substances, such as the polycyclic aromatic hydrocarbon benzo[a]pyrene, which binds to the Ah receptor.

Technical challenges

One of the main challenges in systems biology is the connection between experimental descriptions, observations, data, models, and the assumptions that stem from them. In essence, systems biology must be understood within an information management framework that encompasses much of the experimental life sciences. Models are created using various languages or representation schemes, each suitable for conveying and reasoning about distinct sets of characteristics; there is no single universal language for systems biology that can adequately cover the diverse phenomena we aim to investigate. This intricate scenario is compounded by two further aspects. Models can be developed in multiple versions over time and by different research teams, and conflicts can occur and observations may be disputed. These unpredictable elements suggest that systems biology is not likely to yield a definitive collection of established models. Instead, we can expect a rich ecosystem of models to develop within a structure that fosters discussion and cooperation among participants. Challenges also exist in verifying constraints and in creating modeling frameworks with robust compositional strategies, which may create a need to handle models that conflict with one another, whether between schemes or across different scales. In the end, the goal could involve the creation of personalized models that reflect differences in physiology, as opposed to universal models of biological processes.

Other challenges arise from the massive amount of data created by high-throughput omics technologies, which places considerable demands on computation and storage. Each omics analysis can result in data files ranging from terabytes to petabytes, requiring strong computational systems and ample storage solutions to manage and process these datasets effectively. The computational requirements are compounded by the need for advanced algorithms that can integrate and analyze diverse, high-dimensional data. Approaches like deep learning and network-based methods have shown promise in tackling these issues, but they also demand significant computational power.

Artificial intelligence (AI) in systems biology

Utilizing AI in Systems Biology enables scientists to uncover novel insights into the intricate relationships present within biological systems, such as those among genes, proteins, and cells. A significant focus within Systems Biology is the application of AI for the analysis of expansive and complex datasets, including multi-omics data produced by high-throughput methods like next-generation sequencing and proteomics. Approaches powered by AI can be employed to detect patterns and correlations within these datasets and to anticipate the behavior of biological systems under varying conditions.

For instance, artificial intelligence can identify genes that are expressed differently across various cancer types or detect small molecules linked to particular disease states. A key difficulty in analyzing multi-omics data is the integration of information from multiple sources. AI can create integrative models that consider the intricate interactions between different types of molecular data. These models may be utilized to uncover new biomarkers or therapeutic targets for diseases, as well as to enhance our understanding of fundamental biological processes. By significantly speeding up our comprehension of complex biological systems, AI has the potential to lead to new treatments and therapies for a range of diseases.
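The simplest baseline for identifying "genes expressed differently across cancer types" is a per-gene two-sample test, which AI-based methods extend with multivariate and nonlinear structure. The synthetic data, effect size, and significance cutoff below are all assumptions for the sketch.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 30  # samples per group

# Synthetic log-expression: gene 0 is up-shifted in tumors, gene 1 is not.
normal = rng.normal(loc=[[10.0], [10.0]], scale=1.0, size=(2, n))
tumor = rng.normal(loc=[[13.0], [10.0]], scale=1.0, size=(2, n))

# Per-gene two-sample t-test between the groups.
pvals = [stats.ttest_ind(normal[g], tumor[g]).pvalue for g in range(2)]
hits = [g for g in range(2) if pvals[g] < 1e-3]  # differentially expressed
```

In practice, thousands of such tests would be run and corrected for multiple testing; integrative AI models go further by combining this signal with other omics layers.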

Structural systems biology is a multidisciplinary field that merges systems biology with structural biology to investigate biological systems at the molecular scale. This domain strives for a thorough understanding of how biological molecules interact and function within cells, tissues, and organisms. The integration of AI in structural systems biology has become increasingly vital for examining extensive and complex datasets and modeling the behavior of biological systems. AI facilitates the analysis of protein–protein interaction networks within structural systems biology. These networks can be explored using graph theory and various mathematical methods, uncovering key characteristics such as hubs and modules. AI can also assist in the discovery of new drugs or therapies by predicting the effect of a drug on a particular biological component or pathway.
