
Tuesday, June 19, 2018

Neuroinformatics

From Wikipedia, the free encyclopedia
Neuroinformatics is a research field concerned with the organization of neuroscience data through the application of computational models and analytical tools. These areas of research are important for the integration and analysis of increasingly large-volume, high-dimensional, and fine-grained experimental data. Neuroinformaticians provide computational tools and mathematical models, and create interoperable databases for clinicians and research scientists. Neuroscience is a heterogeneous field consisting of many varied sub-disciplines (e.g., cognitive psychology, behavioral neuroscience, and behavioral genetics). For our understanding of the brain to continue to deepen, these sub-disciplines must be able to share data and findings in a meaningful way; neuroinformaticians facilitate this.[1]

Neuroinformatics stands at the intersection of neuroscience and information science. Other fields, like genomics, have demonstrated the effectiveness of freely distributed databases and the application of theoretical and computational models for solving complex problems. In neuroinformatics, such facilities allow researchers to confirm their working theories more easily and quantitatively through computational modeling. Additionally, neuroinformatics fosters collaborative research, an important consideration given the field's interest in studying the multi-level complexity of the brain.

Neuroinformatics is applied in three main directions:[2]
  1. the development of tools and databases for management and sharing of neuroscience data at all levels of analysis,
  2. the development of tools for analyzing and modeling neuroscience data,
  3. the development of computational models of the nervous system and neural processes.
In the past decade, as vast amounts of diverse data about the brain were gathered by many research groups, the problem arose of how to integrate the data from thousands of publications in order to enable efficient tools for further research. Biological and neuroscience data are highly interconnected and complex, and their integration by itself represents a great challenge for scientists.

Combining informatics research and brain research benefits both fields. On one hand, informatics facilitates brain data processing and handling by providing new electronic and software technologies for arranging databases, modeling, and communication in brain research. On the other hand, discoveries in the field of neuroscience will drive the development of new methods in information technology (IT).

History

Starting in 1989, the United States National Institute of Mental Health (NIMH), the National Institute on Drug Abuse (NIDA) and the National Science Foundation (NSF) provided the National Academy of Sciences Institute of Medicine with funds to undertake a careful analysis of the need to create databases and share neuroscientific data, and to examine how the field of information technology could create the tools needed for the increasing volume and modalities of neuroscientific data. The positive recommendations were reported in 1991.[3] This positive report enabled NIMH, now directed by Allan Leshner, to create the "Human Brain Project" (HBP), with the first grants awarded in 1993. The HBP was led by Stephen Koslow along with cooperative efforts of other NIH institutes, the NSF, the National Aeronautics and Space Administration and the Department of Energy. The HBP and the grant-funding initiative in this area slightly preceded the explosive expansion of the World Wide Web. From 1993 through 2004 this program grew to over $100 million in funded grants.

Next, Koslow pursued the globalization of the HBP and neuroinformatics through the European Union and the Organisation for Economic Co-operation and Development (OECD), Paris, France. Two particular opportunities occurred in 1996.
  • The first was the existence of the US/European Commission Biotechnology Task Force, co-chaired by Mary Clutter from NSF. Within the mandate of this committee, of which Koslow was a member, the United States/European Commission Committee on Neuroinformatics was established, co-chaired by Koslow from the United States. This committee led the European Commission to initiate support for neuroinformatics in Framework 5, and it has continued to support activities in neuroinformatics research and training.
  • A second opportunity for the globalization of neuroinformatics occurred when the participating governments of the Mega Science Forum (MSF) of the OECD were asked if they had any new scientific initiatives to bring forward for scientific cooperation around the globe. The White House Office of Science and Technology Policy requested that agencies in the federal government meet at NIH to decide whether cooperation was needed that would be of global benefit. The NIH held a series of meetings in which proposals from different agencies were discussed. The proposal recommendation from the U.S. for the MSF was a combination of the NSF and NIH proposals. Jim Edwards of NSF supported databases and data-sharing in the area of biodiversity; Koslow proposed the HBP as a model for sharing neuroscientific data, under the new moniker of neuroinformatics.
The two related initiatives were combined to form the United States proposal on "Biological Informatics". This initiative was supported by the White House Office of Science and Technology Policy and presented at the OECD MSF by Edwards and Koslow. An MSF committee on Biological Informatics was established with two subcommittees: 1. Biodiversity (chair, James Edwards, NSF), and 2. Neuroinformatics (chair, Stephen Koslow, NIH). At the end of two years the Neuroinformatics subcommittee of the Biological Working Group issued a report supporting a global neuroinformatics effort. Koslow then worked with the NIH and the White House Office of Science and Technology Policy to establish a new neuroinformatics working group, which developed specific recommendations to support the more general recommendations of the first report. The Global Science Forum (GSF; renamed from MSF) of the OECD supported this recommendation.

The International Neuroinformatics Coordinating Facility

This working group presented three recommendations to the member governments of the GSF:
  1. National neuroinformatics programs should be continued or initiated in each country, and each country should have a national node both to provide research resources nationally and to serve as the contact point for national and international coordination.
  2. An International Neuroinformatics Coordinating Facility (INCF) should be established. The INCF will coordinate the implementation of a global neuroinformatics network through integration of national neuroinformatics nodes.
  3. A new international funding scheme should be established. This scheme should eliminate national and disciplinary barriers and provide the most efficient approach to global collaborative research and data sharing. In this new scheme, each country will be expected to fund the participating researchers from that country.
The GSF neuroinformatics committee then developed a business plan for the operation, support and establishment of the INCF, which was supported and approved by the GSF Science Ministers at their 2004 meeting. In 2006 the INCF was created and its central office established and set into operation at the Karolinska Institute, Stockholm, Sweden, under the leadership of Sten Grillner. Seventeen countries (Australia, Canada, China, the Czech Republic, Denmark, Finland, France, Germany, India, Italy, Japan, the Netherlands, Norway, Sweden, Switzerland, the United Kingdom and the United States) and the EU Commission established the legal basis for the INCF and the Programme in International Neuroinformatics (PIN). To date, eighteen countries (Australia, Belgium, the Czech Republic, Finland, France, Germany, India, Italy, Japan, Malaysia, the Netherlands, Norway, Poland, the Republic of Korea, Sweden, Switzerland, the United Kingdom and the United States) are members of the INCF. Membership is pending for several other countries.

The goal of the INCF is to coordinate and promote international activities in neuroinformatics. The INCF contributes to the development and maintenance of database and computational infrastructure, and to support mechanisms for neuroscience applications. The system is expected to provide the international research community with access to all freely accessible human brain data and resources. The more general task of the INCF is to provide conditions for developing convenient and flexible applications for neuroscience laboratories in order to improve our knowledge about the human brain and its disorders.

Society for Neuroscience Brain Information Group

On the foundation of all of these activities, Huda Akil, the 2003 president of the Society for Neuroscience (SfN), established the Brain Information Group (BIG) to evaluate the importance of neuroinformatics to neuroscience and specifically to the SfN. Following the report from BIG, SfN also established a neuroinformatics committee.

In 2004, SfN announced the Neuroscience Database Gateway (NDG) as a universal resource for neuroscientists, through which almost any neuroscience database or tool may be reached. The NDG was established with funding from NIDA, NINDS and NIMH. The Neuroscience Database Gateway has transitioned to a new, enhanced platform, the Neuroscience Information Framework (NIF).[4] Funded by the NIH Neuroscience Blueprint, the NIF is a dynamic portal providing access to neuroscience-relevant resources (data, tools, materials) from a single search interface. The NIF builds upon the foundation of the NDG but provides a unique set of tools tailored especially for neuroscientists: a more expansive catalog, the ability to search multiple databases directly from the NIF home page, a custom web index of neuroscience resources, and a neuroscience-focused literature search function.

Collaboration with other disciplines

Neuroinformatics is formed at the intersection of the following fields:
  • biology – molecular data (from genes to cell-specific expression);
  • medicine and anatomy – the structure of synapses and systems-level anatomy;
  • engineering – electrophysiology (from single channels to scalp-surface EEG) and brain imaging;
  • computer science – databases and software tools;
  • mathematical sciences – models;
  • chemistry – neurotransmitters, etc.
Neuroscience uses all of these experimental and theoretical studies to learn about the brain at its various levels. Medical and biological specialists help to identify unique cell types, their elements, and their anatomical connections. The functions of complex organic molecules and structures, including the myriad biochemical, molecular, and genetic mechanisms that regulate and control brain function, are determined by specialists in chemistry and cell biology. Brain imaging determines structural and functional information during mental and behavioral activity. Specialists in biophysics and physiology study the physical processes within neural cells and neuronal networks. The data from these fields of research are analyzed and arranged in databases and neural models in order to integrate the various elements into a sophisticated system; this is the point where neuroinformatics meets the other disciplines.

Neuroscience provides many types of data and information on which neuroinformatics operates; neuroinformatics uses databases, the Internet, and visualization for the storage and analysis of these data.

Research programs and groups

Australia

Neuroimaging & Neuroinformatics, Howard Florey Institute, University of Melbourne
Institute scientists utilize brain imaging techniques, such as magnetic resonance imaging, to reveal the organization of brain networks involved in human thought. Led by Gary Egan.

Canada

McGill Centre for Integrative Neuroscience (MCIN), Montreal Neurological Institute, McGill University
Led by Alan Evans, MCIN conducts computationally-intensive brain research using innovative mathematical and statistical approaches to integrate clinical, psychological and brain imaging data with genetics. MCIN researchers and staff also develop infrastructure and software tools in the areas of image processing, databasing, and high performance computing. The MCIN community, together with the Ludmer Centre for Neuroinformatics and Mental Health, collaborates with a broad range of researchers and increasingly focuses on open data sharing and open science, including for the Montreal Neurological Institute.

Denmark

The THOR Center for Neuroinformatics
Established in April 1998 at the Department of Mathematical Modelling, Technical University of Denmark. Besides pursuing independent research goals, the THOR Center hosts a number of related projects concerning neural networks, functional neuroimaging, multimedia signal processing, and biomedical signal processing.

Germany

The Neuroinformatics Portal Pilot
The project is part of a larger effort to enhance the exchange of neuroscience data, data-analysis tools, and modeling software. The portal is supported by many members of the OECD Working Group on Neuroinformatics. The Portal Pilot is promoted by the German Ministry for Science and Education.
Computational Neuroscience, ITB, Humboldt-University Berlin
This group focuses on computational neurobiology, in particular on the dynamics and signal-processing capabilities of systems with spiking neurons. Led by Andreas VM Herz.
The Neuroinformatics Group in Bielefeld
Active in the field of artificial neural networks since 1989. Current research programmes within the group focus on the improvement of man-machine interfaces, robot force control, eye-tracking experiments, machine vision, virtual reality and distributed systems.

Italy

Laboratory of Computational Embodied Neuroscience (LOCEN)[5]
This group, part of the Institute of Cognitive Sciences and Technologies, Italian National Research Council (ISTC-CNR) in Rome and founded in 2006, is currently led by Gianluca Baldassarre. It has two objectives: (a) understanding the brain mechanisms underlying learning and the expression of sensorimotor behaviour, and the related motivations and higher-level cognition grounded on it, on the basis of embodied computational models; (b) transferring the acquired knowledge to building innovative controllers for autonomous humanoid robots capable of learning in an open-ended fashion on the basis of intrinsic and extrinsic motivations.

Japan

Japan national neuroinformatics resource
The Visiome Platform is a neuroinformatics search service that provides access to mathematical models, experimental data, analysis libraries and related resources. An online portal for neurophysiological data sharing is also available at BrainLiner.jp as part of the MEXT Strategic Research Program for Brain Sciences (SRPBS).
Laboratory for Mathematical Neuroscience, RIKEN Brain Science Institute (Wako, Saitama)
The target of the Laboratory for Mathematical Neuroscience is to establish mathematical foundations for brain-style computations toward the construction of a new type of information science. Led by Shun-ichi Amari.

The Netherlands

Netherlands state program in neuroinformatics
Started in light of the international OECD Global Science Forum, whose aim is to create a worldwide program in neuroinformatics.

Pakistan

NUST-SEECS Neuroinformatics Research Lab[6]
The establishment of the Neuroinformatics Lab at SEECS-NUST has enabled Pakistani researchers and faculty members to participate actively in such efforts, thereby becoming part of the experimentation, simulation, and visualization processes mentioned above. The lab collaborates with leading international institutions to develop highly skilled human resources in the field. It enables neuroscientists and computer scientists in Pakistan to conduct experiments and analyses on collected data using state-of-the-art research methodologies, without investing in establishing experimental neuroscience facilities of their own. The key goal of the lab is to provide state-of-the-art experimental and simulation facilities to all beneficiaries, including higher education institutes, medical researchers and practitioners, and the technology industry.

Switzerland

The Blue Brain Project
The Blue Brain Project was founded in May 2005 and uses an 8,000-processor Blue Gene/L supercomputer developed by IBM. At the time, this was one of the fastest supercomputers in the world.
The project involves:
  • Databases: 3D reconstructed model neurons, synapses, synaptic pathways, microcircuit statistics, computer model neurons, virtual neurons.
  • Visualization: a microcircuit builder and a simulation-results visualizer; 2D, 3D and immersive visualization systems are being developed.
  • Simulation: a simulation environment for large-scale simulations of morphologically complex neurons on 8000 processors of IBM's Blue Gene supercomputer.
  • Simulations and experiments: iterations between large-scale simulations of neocortical microcircuits and experiments in order to verify the computational model and explore predictions.
The mission of the Blue Brain Project is to understand mammalian brain function and dysfunction through detailed simulations. The Blue Brain Project will invite researchers to build their own models of different brain regions in different species and at different levels of detail using Blue Brain Software for simulation on Blue Gene. These models will be deposited in an internet database from which Blue Brain software can extract and connect models together to build brain regions and begin the first whole brain simulations.
The Institute of Neuroinformatics (INI)
Established at the University of Zurich at the end of 1995, the mission of the Institute is to discover the key principles by which brains work and to implement these in artificial systems that interact intelligently with the real world.

United Kingdom

Genes to Cognition Project
A neuroscience research programme that studies genes, the brain and behaviour in an integrated manner. It is engaged in a large-scale investigation of the function of molecules found at the synapse. This is mainly focused on proteins that interact with the NMDA receptor, a receptor for the neurotransmitter glutamate, which is required for processes of synaptic plasticity such as long-term potentiation (LTP). Many of the techniques used are high-throughput in nature, and integrating the various data sources, along with using them to guide experiments, has raised numerous informatics questions. The program is primarily run by Professor Seth Grant at the Wellcome Trust Sanger Institute, but there are many other teams of collaborators across the world.
The CARMEN project[7]
The CARMEN project is a multi-site (11 universities in the United Kingdom) research project aimed at using GRID computing to enable experimental neuroscientists to archive their datasets in a structured database, making them widely accessible for further research, and for modellers and algorithm developers to exploit.
EBI Computational Neurobiology, EMBL-EBI (Hinxton)
The main goal of the group is to build realistic models of neuronal function at various levels, from the synapse to the micro-circuit, based on the precise knowledge of molecule functions and interactions (Systems Biology). Led by Nicolas Le Novère.

United States

Neuroscience Information Framework
The Neuroscience Information Framework (NIF) is an initiative of the NIH Blueprint for Neuroscience Research, which was established in 2004 by the National Institutes of Health. Unlike general search engines, NIF provides deeper access to a more focused set of resources that are relevant to neuroscience, search strategies tailored to neuroscience, and access to content that is traditionally "hidden" from web search engines. The NIF is a dynamic inventory of neuroscience databases, annotated and integrated with a unified system of biomedical terminology (i.e., NeuroLex). NIF supports concept-based queries across multiple scales of biological structure and multiple levels of biological function, making it easier to search for and understand the results (a toy sketch of such a query follows below). NIF also provides a registry through which resource providers can disclose the availability of resources relevant to neuroscience research. NIF is not intended to be a warehouse or repository itself, but a means for disclosing and locating resources available elsewhere via the web.
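To make the idea of a concept-based query concrete, here is a minimal sketch in Python. The mini-ontology and the mock "databases" are hypothetical stand-ins invented for illustration, not the real NIF API or actual NeuroLex content; the point is only that expanding a query term through a terminology system finds records that a plain string match would miss.

# Toy concept-based search: expand a term through a (hypothetical)
# mini-ontology, then match the expanded set against mock databases.
ONTOLOGY = {
    "striatum": {"synonyms": ["neostriatum"],
                 "children": ["caudate nucleus", "putamen"]},
    "caudate nucleus": {"synonyms": [], "children": []},
    "putamen": {"synonyms": [], "children": []},
}

def expand_concept(term):
    """Expand a query term to itself, its synonyms, and its child structures."""
    seen, stack = set(), [term]
    while stack:
        t = stack.pop()
        if t in seen:
            continue
        seen.add(t)
        entry = ONTOLOGY.get(t, {})
        stack.extend(entry.get("synonyms", []))
        stack.extend(entry.get("children", []))
    return seen

DATABASES = {  # hypothetical registered resources
    "connectivity_db": ["putamen projections to globus pallidus", "visual cortex maps"],
    "expression_db": ["gene expression in the neostriatum"],
}

def concept_search(term):
    """Search every registered database with the expanded concept set."""
    terms = expand_concept(term)
    return {db: [rec for rec in records if any(t in rec for t in terms)]
            for db, records in DATABASES.items()}

print(concept_search("striatum"))
# A literal search for "striatum" alone would miss the connectivity record;
# concept expansion matches it via the child structure "putamen".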
Neurogenetics GeneNetwork
GeneNetwork started as a component of the NIH Human Brain Project in 1999, with a focus on the genetic analysis of brain structure and function. This international program consists of tightly integrated genome and phenome data sets for human, mouse, and rat, designed specifically for large-scale systems and network studies relating gene variants to differences in mRNA and protein expression and to differences in CNS structure and behavior. The great majority of data are open access. GeneNetwork has a companion neuroimaging web site, the Mouse Brain Library, which contains high-resolution images for thousands of genetically defined strains of mice.
The Neuronal Time Series Analysis (NTSA)[8]
NTSA Workbench is a set of tools, techniques and standards designed to meet the needs of neuroscientists who work with neuronal time-series data. The goal of this project is to develop an information system that makes the storage, organization, retrieval, analysis and sharing of experimental and simulated neuronal data easier.
The Cognitive Atlas[9]
The Cognitive Atlas is a project developing a shared knowledge base in cognitive science and neuroscience. It comprises two basic kinds of knowledge, tasks and concepts, providing their definitions and properties as well as the relationships between them. An important feature of the site is the ability to cite literature for assertions (e.g., "The Stroop task measures executive control") and to discuss their validity. It contributes to NeuroLex and the Neuroscience Information Framework, allows programmatic access to the database, and is built around semantic web technologies.
Brain Big Data research group at the Allen Institute for Brain Science (Seattle, WA)
Led by Hanchuan Peng,[10] this group has focused on using large-scale imaging, computing and data-analysis techniques to reconstruct single-neuron models and map them in the brains of different animals.

Technologies and developments

The main technological tendencies in neuroinformatics are:
  1. Application of computer science for building databases, tools, and networks in neuroscience;
  2. Analysis and modeling of neuronal systems.
To organize and operate on neural data, scientists need standard terminology and atlases that precisely describe brain structures and their relationships.
  • Neuron tracing and reconstruction is an essential technique for establishing digital models of the morphology of neurons. Such morphology is useful for neuron classification and simulation (see the sketch after this list).
  • BrainML[11] is a system that provides a standard XML metaformat for exchanging neuroscience data.
  • The Biomedical Informatics Research Network (BIRN)[12] is an example of a grid system for neuroscience. BIRN is a geographically distributed virtual community of shared resources offering a vast scope of services to advance the diagnosis and treatment of disease. BIRN allows databases, interfaces and tools to be combined into a single environment.
  • The Budapest Reference Connectome is a web-based 3D visualization tool for browsing connections in the human brain. Nodes and connections are calculated from the MRI datasets of the Human Connectome Project.
  • GeneWays[13] is concerned with cellular morphology and circuits. GeneWays is a system for automatically extracting, analyzing, visualizing and integrating molecular-pathway data from the research literature. The system focuses on interactions between molecular substances and actions, provides a graphical view of the collected information, and allows researchers to review and correct the integrated information.
  • Neocortical Microcircuit Database (NMDB).[14] A database of versatile brain data, from cells to complex structures. Researchers can not only add data to the database but also acquire and edit it.
  • SenseLab.[15] SenseLab is a long-term effort to build integrated, multidisciplinary models of neurons and neural systems. It was founded in 1993 as part of the original Human Brain Project. It is a collection of multilevel neuronal databases and tools, containing six related databases that support experimental and theoretical research on the membrane properties that mediate information processing in nerve cells, using the olfactory pathway as a model system.
  • BrainMaps.org[16] is an interactive high-resolution digital brain atlas using a high-speed database and virtual microscope that is based on over 12 million megapixels of scanned images of several species, including human.
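Reconstructed morphologies of the kind produced by neuron tracing are commonly exchanged as SWC files, a simple plain-text format in which each line describes one node of the neuronal tree (id, structure type, x, y, z, radius, parent id). The following is a minimal Python sketch of a reader, assuming a well-formed file; production tools perform far more validation.

# Minimal reader for the SWC neuron-morphology format (sketch only).
import math

def read_swc(path):
    """Parse an SWC file into a dict mapping node id -> node record."""
    nodes = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip comments and blank lines
            nid, ntype, x, y, z, radius, parent = line.split()[:7]
            nodes[int(nid)] = {
                "type": int(ntype),            # 1 soma, 2 axon, 3 dendrite, ...
                "xyz": (float(x), float(y), float(z)),
                "radius": float(radius),
                "parent": int(parent),         # -1 marks the root node
            }
    return nodes

def total_cable_length(nodes):
    """Sum of parent-child segment lengths, a basic morphometric measure."""
    total = 0.0
    for node in nodes.values():
        p = node["parent"]
        if p != -1 and p in nodes:
            total += math.dist(node["xyz"], nodes[p]["xyz"])
    return total

Morphometrics such as total cable length, branch counts, or tree depth computed this way are typical inputs to the neuron classification and simulation uses mentioned above.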
Another approach to brain mapping is probabilistic atlases obtained from real data from different groups of people, grouped by specific factors such as age, gender, and disease. Probabilistic atlases provide more flexible tools for brain research and allow more reliable and precise results to be obtained than is possible with traditional brain atlases.

Reverse Engineering the Brain


Original link:  http://www.engineeringchallenges.org/challenges/9109.aspx

Summary

The intersection of engineering and neuroscience promises great advances in health care, manufacturing, and communication.

For decades, some of engineering’s best minds have focused their thinking skills on how to create thinking machines — computers capable of emulating human intelligence.

Why should you reverse-engineer the brain?

While some thinking machines have mastered specific narrow skills — playing chess, for instance — general-purpose artificial intelligence (AI) has remained elusive.

Part of the problem, some experts now believe, is that artificial brains have been designed without much attention to real ones. Pioneers of artificial intelligence approached thinking the way that aeronautical engineers approached flying: without much learning from birds. It has turned out, though, that the secrets of how living brains work may offer the best guide to engineering the artificial variety. Discovering those secrets by reverse-engineering the brain promises enormous opportunities for reproducing intelligence the way assembly lines spit out cars or computers.

Figuring out how the brain works will offer rewards beyond building smarter computers. Advances gained from studying the brain may in return pay dividends for the brain itself. Understanding its methods will enable engineers to simulate its activities, leading to deeper insights about how and why the brain works and fails. Such simulations will offer more precise methods for testing potential biotechnology solutions to brain disorders, such as drugs or neural implants. Neurological disorders may someday be circumvented by technological innovations that allow wiring of new materials into our bodies to do the jobs of lost or damaged nerve cells. Implanted electronic devices could help victims of dementia to remember, blind people to see, and crippled people to walk.

Sophisticated computer simulations could also be used in many other applications. Simulating the interactions of proteins in cells would be a novel way of designing and testing drugs, for instance. And simulation capacity will be helpful beyond biology, perhaps in forecasting the impact of earthquakes in ways that would help guide evacuation and recovery plans.

Much of this power to simulate reality effectively will come from increased computing capability rooted in the reverse-engineering of the brain. Learning from how the brain itself learns, researchers will likely improve knowledge of how to design computing devices that process multiple streams of information in parallel, rather than the one-step-at-a-time approach of the basic PC. Another feature of real brains is the vast connectivity of nerve cells, the biological equivalent of computer signaling switches. While nerve cells typically form tens of thousands of connections with their neighbors, traditional computer switches typically possess only two or three. AI systems attempting to replicate human abilities, such as vision, are now being developed with more, and more complex, connections.

What are the applications for this information?

Already, some applications using artificial intelligence have benefited from simulations based on brain reverse-engineering. Examples include AI algorithms used in speech recognition and in machine vision systems in automated factories. More advanced AI software should in the future be able to guide devices that can enter the body to perform medical diagnoses and treatments.

Of potentially even greater impact on human health and well-being is the use of new AI insights for repairing broken brains.  Damage from injury or disease to the hippocampus, a brain structure important for learning and memory, can disrupt the proper electrical signaling between nerve cells that is needed for forming and recalling memories. With knowledge of the proper signaling patterns in healthy brains, engineers have begun to design computer chips that mimic the brain’s own communication skills. Such chips could be useful in cases where healthy brain tissue is starved for information because of the barrier imposed by damaged tissue. In principle, signals from the healthy tissue could be recorded by an implantable chip, which would then generate new signals to bypass the damage. Such an electronic alternate signaling route could help restore normal memory skills to an impaired brain that otherwise could not form them.

“Neural prostheses” have already been put to use in the form of cochlear implants to treat hearing loss and stimulating electrodes to treat Parkinson’s disease. Progress has also been made in developing “artificial retinas,” light-sensitive chips that could help restore vision.

Even more ambitious programs are underway for systems to control artificial limbs. Engineers envision computerized implants capable of receiving the signals from thousands of the brain’s nerve cells and then wirelessly transmitting that information to an interface device that would decode the brain’s intentions. The interface could then send signals to an artificial limb, or even directly to nerves and muscles, giving directions for implementing the desired movements.
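One common approach to the decoding step described above, at least in the published literature, is a linear decoder that maps a vector of recorded firing rates to intended movement. The Python sketch below illustrates the idea on synthetic data only; real brain-machine interfaces use more elaborate decoders and careful calibration.

# Toy linear intention decoder: fit weights mapping firing rates to a
# 2-D cursor velocity by least squares, then decode new activity.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_neurons = 500, 40

# Hidden "true" tuning: each neuron's rate depends linearly on velocity.
true_tuning = rng.normal(size=(n_neurons, 2))
velocity = rng.normal(size=(n_samples, 2))   # intended (vx, vy), synthetic
rates = velocity @ true_tuning.T + 0.5 * rng.normal(size=(n_samples, n_neurons))

# Calibration: solve for decoding weights W so that rates @ W ~ velocity.
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decoding: estimate intended velocity from recorded activity.
decoded = rates @ W
err = np.mean(np.linalg.norm(decoded - velocity, axis=1))
print(f"mean decoding error: {err:.3f}")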

Other research has explored, with some success, implants that could literally read the thoughts of immobilized patients and signal an external computer, giving people unable to speak or even move a way to communicate with the outside world.

What is needed to reverse-engineer the brain?

The progress so far is impressive. But to fully realize the brain’s potential to teach us how to make machines learn and think, further advances are needed in the technology for understanding the brain in the first place. Modern noninvasive methods for simultaneously measuring the activity of many brain cells have provided a major boost in that direction, but details of the brain’s secret communication code remain to be deciphered. Nerve cells communicate by firing electrical pulses that release small molecules called neurotransmitters, chemical messengers that hop from one nerve cell to a neighbor, inducing the neighbor to fire a signal of its own (or, in some cases, inhibiting the neighbor from sending signals). Because each nerve cell receives messages from tens of thousands of others, and circuits of nerve cells link up in complex networks, it is extremely difficult to completely trace the signaling pathways.

Furthermore, the code itself is complex — nerve cells fire at different rates, depending on the sum of incoming messages. Sometimes the signaling is generated in rapid-fire bursts; sometimes it is more leisurely. And much of mental function seems based on the firing of multiple nerve cells around the brain in synchrony. Teasing out and analyzing all the complexities of nerve cell signals, their dynamics, pathways, and feedback loops, presents a major challenge.
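A standard minimal formalization of this summing-and-firing behavior is the leaky integrate-and-fire model: the membrane voltage integrates incoming current, leaks back toward rest, and emits a spike (then resets) when it crosses a threshold. The sketch below uses illustrative parameter values, not measurements; it demonstrates the rate coding described above, in which stronger summed input yields a higher firing rate.

# Leaky integrate-and-fire neuron (illustrative parameters, SI units).
def lif_spike_times(input_current, dt=1e-4, tau=0.02, v_rest=-0.070,
                    v_thresh=-0.050, v_reset=-0.070, resistance=1e8):
    """Simulate a leaky integrate-and-fire neuron; return spike times in s."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(input_current):
        # Voltage leaks toward rest and is driven by the summed input current.
        v += (-(v - v_rest) + resistance * i_in) * dt / tau
        if v >= v_thresh:            # threshold crossed: fire a spike
            spikes.append(step * dt)
            v = v_reset              # reset after the spike
    return spikes

# Stronger summed input produces a higher firing rate.
for amps in (2.5e-10, 5.0e-10):
    spikes = lif_spike_times([amps] * 10000)   # one second of constant drive
    print(f"input {amps:.1e} A -> {len(spikes)} spikes in 1 s")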

Today’s computers have electronic logic gates that are either on or off, but if engineers could replicate neurons’ ability to assume various levels of excitation, they could create much more powerful computing machines. Success toward fully understanding brain activity will, in any case, open new avenues for deeper understanding of the basis for intelligence and even consciousness, no doubt providing engineers with insight into even grander accomplishments for enhancing the joy of living.


Microbiomes of the built environment

From Wikipedia, the free encyclopedia

Microbiomes of the built environment[1][2] is a field of inquiry focusing on the study of the communities of microorganisms found in human-constructed environments (i.e., the built environment). It is also sometimes referred to as "microbiology of the built environment".
This field encompasses studies of any kind of microorganism (e.g., bacteria, archaea, viruses, and various microbial eukaryotes, including yeasts and others sometimes generally referred to as protists) and studies of any kind of built environment, such as buildings, vehicles, and water systems.

A 2016 paper by Brent Stephens[6] highlights some of the key findings of studies of "microbiomes of the indoor environment", emphasizing the growing importance of the field. These key findings include those listed below:
  • "Culture-independent methods reveal vastly greater microbial diversity compared to culture-based methods"
  • "Indoor spaces often harbor unique microbial communities"
  • "Indoor bacterial communities often originate from indoor sources."
  • "Humans are also major sources of bacteria to indoor air"
  • "Building design and operation can influence indoor microbial communities."
The microbiomes of the built environment are being studied for multiple reasons, including how they may impact the health of humans and other organisms occupying the built environment, but also for non-health reasons such as the diagnosis of building properties, forensic applications, impacts on food production, impacts on built-environment function, and more.

Types of Built Environments For Which Microbiomes Have Been Studied

Extensive research has been conducted on individual microbes found in the built environment. More recently there has been a significant expansion in the number of studies examining the communities of microbes (i.e., microbiomes) found in the built environment. Such studies have covered a wide range of types of built environments, including those listed below.

Buildings. Examples include homes,[7][8][9] dormitories,[10] offices,[11][12] hospitals,[13][14][15] operating rooms,[16][17][18] NICUs,[19] classrooms,[20][21] transportation facilities such as train and subway stations,[22][23] food production facilities[24] (e.g., dairies, wineries,[25] cheesemaking facilities,[26][27] sake breweries[28] and beer breweries[29]), aquaria,[30] libraries,[31] cleanrooms,[32][33] zoos, animal shelters, farms, and chicken coops and housing.[34]

Vehicles. Examples include airplanes,[35] ships, trains,[23] automobiles[36] and space vehicles, including the International Space Station,[37] MIR,[38] the Mars Odyssey[39] and the Herschel spacecraft.[40]

Water Systems. Examples include shower heads,[41] children's paddling pools,[42] municipal water systems,[43] drinking water and premise plumbing systems,[44][45][46][47] and saunas.[48]

Other. Examples include art and cultural heritage items,[49] clothing,[50] and household appliances such as dishwashers [51] and washing machines.[52]

Results from Studies of the Microbiomes of the Built Environment

General Biogeography

Overall, the many studies that have been conducted on the microbiomes of the built environment have started to identify some general patterns regarding which microbes are found in various places. For example, Adams et al., in a comparative analysis of ribosomal RNA-based studies in the built environment, found that geography and building type had strong associations with the types of microbes seen in the built environment.[53] Pakpour et al. in 2016 reviewed the patterns relating to the presence of archaea in indoor environments (based on analysis of rRNA gene sequence data).[54]

Human Health and Microbiomes of the Built Environment

Many studies have documented possible human health implications of the microbiomes of the built environment (e.g.,[55] ). Examples include those below.

Newborn colonization. The microbes that colonize newborns come in part from the built environment (e.g., hospital rooms). This appears to be especially true for babies born by C-section (see for example Shin et al. 2016 [56]) and also babies that spend time in a NICU.[19]

Risk of allergy and asthma. The risk of allergy and asthma is correlated with differences in the built-environment microbiome. Some experimental tests (e.g., in mice) have suggested that these correlations may actually be causal (i.e., the differences in the microbiomes may actually lead to differences in the risk of allergy or asthma). Review papers on this topic include Casas et al. 2016[57] and Fujimura and Lynch 2015.[58] Studies of dust in various homes have shown that the microbiome found in the dust is correlated with the risk of children in those homes developing allergy, asthma, or phenotypes connected to these ailments.[59][60][61] The impact of the microbiome of the built environment on the risk of allergy and asthma and other inflammatory or immune conditions is a possible mechanism underlying what is known as the hygiene hypothesis.

Mental health. In a 2015 review Hoisington et al. discuss possible connections between the microbiology of the built environment and human health.[62] The concept presented in this paper is that more and more evidence is accumulating that the human microbiome has some impact on the brain and thus if the built environment either directly or indirectly impacts the human microbiome, this in turn could have impacts on human mental health.

Pathogen transmission. Many pathogens are transmitted in the built environment and may also reside in the built environment for some period of time.[63] Good examples include influenza, Norovirus, Legionella, and MRSA. The study of the transmission and survival of these pathogens is a component of studies of microbiomes of the built environment.

Indoor air quality. The study of indoor air quality and its health impacts is linked at least in part to microbes in the built environment, since they can directly or indirectly affect indoor air quality.

Components of the Built Environment that Likely Impact Microbiomes

A major component of studies of microbiomes of the built environment involves determining how components of the built environment impact microbes and microbial communities. Factors thought to be important include humidity, pH, chemical exposures, temperature, filtration, surface materials, and air flow.[64] There has been an effort to develop standards for what built-environment "metadata" to collect in studies of the microbial communities of the built environment.[65] A 2014 paper reviews the tools available to improve the built-environment data collected in such studies.[66] Examples of the types of built-environment data covered in this review include building characteristics and environmental conditions, HVAC system characteristics and ventilation rates, human occupancy and activity measurements, surface characterizations, and air sampling and aerosol dynamics (a sketch of such a sample record follows).
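As a sketch of what one such metadata record might look like, the Python snippet below uses fields drawn from the categories listed above; the field names, units, and values are hypothetical and do not follow any published standard schema.

# Illustrative sample-metadata record for a built-environment microbiome study.
from dataclasses import dataclass

@dataclass
class BuiltEnvironmentSample:
    sample_id: str
    surface_material: str          # surface characterization
    room_type: str                 # building characteristic
    temperature_c: float           # environmental condition
    relative_humidity_pct: float   # environmental condition
    ventilation_rate_ach: float    # HVAC: air changes per hour
    occupancy_count: int           # human occupancy measurement
    aerosol_pm25_ug_m3: float      # air sampling / aerosol dynamics

sample = BuiltEnvironmentSample(
    sample_id="office-112-desk-01",
    surface_material="laminate",
    room_type="open-plan office",
    temperature_c=22.5,
    relative_humidity_pct=38.0,
    ventilation_rate_ach=4.2,
    occupancy_count=14,
    aerosol_pm25_ug_m3=7.9,
)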

Impact of Microbiomes on the Built Environment

Just as the built environment has an impact on the microbiomes found therein, the microbial communities of the built environment can impact the built environment itself. Examples include degradation of building materials, altering fluid and airflow, generating volatiles, and more.

Possible Uses in Forensics

The microbiome of the built environment has some potential for being used as a feature for forensic studies. Most of these applications are still in the early research phase. For example, it has been shown that people leave behind a somewhat diagnostic microbial signature when they type on computer keyboards,[67] use phones [68] or occupy a room.[10]

Odor and Microbes in the Built Environment

There has been a significant amount of research on the role that microbes play in various odors in the built environment. For example, Diekmann et al. examined the connection between microbes and volatile organic emissions in automobile air-conditioning units.[69] They reported that the types of microbes found were correlated with the bad odors found. Park and Kim examined which microbes found in an automobile air conditioner could produce bad-smelling volatile compounds and identified candidate taxa producing some such compounds.[70]

Methods Used

Many methods are used to study microbes in the built environment. A review of such methods and some of the challenges in using them was published by NIST. Hoisington et al. in 2015 reviewed methods that could be used by building professionals to study the microbiology of the built environment.[71] Methods used in the study of microbes in the built environment include culturing (with subsequent studies of the cultured microbes), microscopy, air, water and surface sampling, chemical analyses, and culture-independent DNA studies such as ribosomal RNA gene PCR and metagenomics (a toy diversity calculation on such sequence-count data follows).
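Culture-independent sequencing studies of this kind typically reduce each sample to a table of read counts per taxon, from which diversity can be summarized. Below is a minimal Python sketch using made-up counts; the Shannon index is one common summary statistic.

# Shannon diversity from per-taxon read counts (counts are hypothetical).
import math

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over taxon proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

kitchen_counts = [410, 250, 120, 80, 40]   # relatively even community
shower_counts = [870, 20, 7, 2, 1]         # dominated by one taxon

print(f"kitchen H' = {shannon_diversity(kitchen_counts):.2f}")  # higher
print(f"shower  H' = {shannon_diversity(shower_counts):.2f}")   # lower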

Microbial biogeography

From Wikipedia, the free encyclopedia
Microbial biogeography is a subset of biogeography, a field that concerns the distribution of organisms across space and time.[1] Although biogeography traditionally focused on plants and larger animals, recent studies have broadened this field to include distribution patterns of microorganisms. This extension of biogeography to smaller scales—known as "microbial biogeography"—is enabled by ongoing advances in genetic technologies.

The aim of microbial biogeography is to reveal where microorganisms live, at what abundance, and why. Microbial biogeography can therefore provide insight into the underlying mechanisms that generate and hinder biodiversity.[2] Microbial biogeography also enables predictions of where certain organisms can survive and how they respond to changing environments, making it applicable to several other fields such as climate change research.

History

Schewiakoff (1893) theorized about the cosmopolitan habitat of free-living protozoans.[3] In 1934, Lourens Baas Becking, based on his own research in California's salt lakes, as well as work by others on salt lakes worldwide,[4] concluded that "everything is everywhere, but the environment selects".[5] Baas Becking attributed the first half of this hypothesis to his colleague Martinus Beijerinck (1913).[6][7]

Baas Becking's hypothesis of cosmopolitan microbial distribution would later be challenged by other works.[8][9][10][11]

Microbial vs macro-organism biogeography

The biogeography of macro-organisms (i.e., plants and animals that can be seen with the naked eye) has been studied since the eighteenth century. For macro-organisms, biogeographical patterns (i.e., which organism assemblages appear in specific places and times) appear to arise from both past and current environments. For example, polar bears live in the Arctic but not the Antarctic, while the reverse is true for penguins; although both polar bears and penguins have adapted to cold climates over many generations (the result of past environments), the distance and warmer climates between the north and south poles prevent these species from spreading to the opposite hemisphere (the result of current environments). This demonstrates the biogeographical pattern known as "isolation by geographic distance", by which the limited ability of a species to physically disperse across space (rather than any selective genetic reasons) restricts the geographical range over which it can be found.

The biogeography of microorganisms (i.e., organisms that cannot be seen with the naked eye, such as fungi and bacteria) is an emerging field enabled by ongoing advancements in genetic technologies, in particular cheaper DNA sequencing with higher throughput that now allows analysis of global datasets on microbial biology at the molecular level. When scientists began studying microbial biogeography, they anticipated a lack of biogeographic patterns due to the high dispersibility and large population sizes of microbes, which were expected to ultimately render geographical distance irrelevant. Indeed, in microbial ecology the oft-repeated saying by Lourens Baas Becking that “everything is everywhere, but the environment selects” has come to mean that as long as the environment is ecologically appropriate, geological barriers are irrelevant.[12] However, recent studies show clear evidence for biogeographical patterns in microbial life, which challenge this common interpretation: the existence of microbial biogeographic patterns disputes the idea that “everything is everywhere” while also supporting the idea that environmental selection includes geography as well as historical events that can leave lasting signatures on microbial communities.[2]

Microbial biogeographic patterns are often similar to those of macro-organisms. Microbes generally follow well-known patterns such as the distance-decay relationship, the abundance-range relationship, and Rapoport's rule.[13][14] This is surprising given the many disparities between microorganisms and macro-organisms, in particular their size (micrometers vs. meters), time between generations (minutes vs. years), and dispersibility (global vs. local). However, important differences between the biogeographical patterns of microorganisms and macro-organisms do exist, and likely result from differences in their underlying biogeographic processes (e.g., drift, dispersal, selection, and mutation). For example, dispersal is an important biogeographical process for both microbes and larger organisms, but small microbes can disperse across much greater ranges and at much greater speeds by traveling through the atmosphere (for larger animals dispersal is much more constrained due to their size).[2] As a result, many microbial species can be found in both northern and southern hemispheres, while larger animals are typically found only at one pole rather than both.[15]
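The distance-decay relationship mentioned above says that community similarity declines with geographic distance. The Python sketch below demonstrates it on purely synthetic data: sites are placed along a transect, each taxon is given a niche center so that nearby sites share taxa, and the log of pairwise Bray-Curtis similarity is regressed against distance; under decay the fitted slope is negative. All names and values here are invented for illustration.

# Synthetic distance-decay demonstration.
import numpy as np

rng = np.random.default_rng(1)
n_sites, n_taxa = 30, 50
positions = rng.uniform(0, 1000, size=n_sites)   # site locations (km)
centers = rng.uniform(0, 1000, size=n_taxa)      # each taxon's niche center

# Abundance falls off with distance from a taxon's niche center, so
# nearby sites share taxa while faraway sites do not.
communities = np.exp(-((positions[:, None] - centers[None, :]) ** 2)
                     / (2 * 200.0 ** 2))

def bray_curtis_similarity(a, b):
    """1 - Bray-Curtis dissimilarity between two abundance vectors."""
    return 2 * np.minimum(a, b).sum() / (a.sum() + b.sum())

dists, sims = [], []
for i in range(n_sites):
    for j in range(i + 1, n_sites):
        dists.append(abs(positions[i] - positions[j]))
        sims.append(bray_curtis_similarity(communities[i], communities[j]))

slope, _ = np.polyfit(dists, np.log(sims), 1)
print(f"distance-decay slope: {slope:.2e} per km")  # negative under decay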

Distinct patterns

Reversed latitudinal diversity gradient

Larger organisms tend to exhibit latitudinal gradients in species diversity, with greater biodiversity existing in the tropics and decreasing toward more temperate polar regions. In contrast, a study on indoor fungal communities[14] found microbial biodiversity to be significantly higher in temperate zones than in the tropics. Interestingly, the same study found that drastically different buildings exhibited the same indoor fungal composition in any given location, with similarity increasing with proximity. Thus, despite human efforts to control indoor climates, outside environments appear to be the strongest determinant of indoor fungal composition.

Bipolar latitude distributions

Certain microbial populations exist in opposite hemispheres at complementary latitudes. These 'bipolar' (or 'antitropical') distributions are much rarer in macro-organisms; although macro-organisms exhibit latitude gradients, 'isolation by geographic distance' prevents bipolar distributions (e.g., polar bears are not found at both poles). In contrast, a study on marine surface bacteria[15] showed not only a latitude gradient but also complementary distributions, with similar populations at both poles, suggesting no "isolation by geographic distance". This is likely due to differences in the underlying biogeographic process of dispersal, as microbes tend to disperse at high rates and over great distances by traveling through the atmosphere.

Seasonal variations

Microbial diversity can exhibit striking seasonal patterns at a single geographical location. This is largely due to dormancy, a microbial feature not seen in larger animals that allows microbial community composition to fluctuate in relative abundance of persistent species (rather than actual species present). This is known as the "seed-bank hypothesis"[16] and has implications for our understanding of ecological resilience and thresholds to change.[17]

Applications

Directed panspermia

Panspermia suggests that life can be distributed throughout outer space via comets, asteroids, and meteoroids. It assumes that life can survive the harsh space environment, which features vacuum conditions, intense radiation, extreme temperatures, and a dearth of available nutrients. Many microorganisms are able to evade such stressors by forming spores or entering a state of low-metabolic dormancy.[18] Studies in microbial biogeography have even shown that the ability of microbes to enter and successfully emerge from dormancy when their respective environmental conditions are favorable contributes to the high levels of microbial biodiversity observed in almost all ecosystems.[19] Thus microbial biogeography can be applied to panspermia, as it predicts that microbes are able to protect themselves from the harsh space environment, emerge from dormancy when conditions are favorable, and take advantage of their dormancy capability to enhance biodiversity wherever they may land.

Directed panspermia is the deliberate transport of microorganisms to colonize another planet. If the aim is to colonize an Earth-like environment, microbial biogeography can inform decisions about the biological payload of such a mission. In particular, microbes exhibit latitudinal ranges according to Rapoport's rule, which states that organisms living at lower latitudes (near the equator) are found within smaller latitude ranges than those living at higher latitudes (near the poles). Thus the ideal biological payload would include widespread, higher-latitude microorganisms that can tolerate a wider range of climates. This is not necessarily the obvious choice, as these widespread organisms are also rare in microbial communities and tend to be weaker competitors when faced with endemic organisms. Still, they can survive in a range of climates and thus would be ideal for inhabiting otherwise lifeless Earth-like planets with uncertain environmental conditions. Extremophiles, although tough enough to withstand the space environment, may not be ideal for directed panspermia, as any given extremophile species requires a very specific climate to survive. However, if the target were closer to Earth, such as a planet or moon in our Solar System, it might be possible to select a specific extremophile species for the well-defined target environment.

Microbial ecology

From Wikipedia, the free encyclopedia

The great plate count anomaly. Counts of cells obtained via cultivation are orders of magnitude lower than those directly observed under the microscope. This is because microbiologists are able to cultivate only a minority of naturally occurring microbes using current laboratory techniques, depending on the environment.[1]

Microbial ecology (or environmental microbiology) is the ecology of microorganisms: their relationship with one another and with their environment. It concerns the three major domains of life—Eukaryota, Archaea, and Bacteria—as well as viruses.[2]

Microorganisms, by their omnipresence, impact the entire biosphere. Microbial life plays a primary role in regulating biogeochemical systems in virtually all of our planet's environments, including some of the most extreme, from frozen environments and acidic lakes to hydrothermal vents at the bottom of the deepest oceans, and some of the most familiar, such as the human small intestine.[3][4] As a consequence of the quantitative magnitude of microbial life (Whitman and coworkers calculated 5.0 × 10^30 cells, eight orders of magnitude greater than the number of stars in the observable universe[5][6]), microbes, by virtue of their biomass alone, constitute a significant carbon sink.[7] Aside from carbon fixation, microorganisms' key collective metabolic processes (including nitrogen fixation, methane metabolism, and sulfur metabolism) control global biogeochemical cycling.[8] The immensity of microorganisms' production is such that, even in the total absence of eukaryotic life, these processes would likely continue unchanged.[9]
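A quick order-of-magnitude check of that comparison, assuming the commonly quoted figure of roughly 10^22 stars in the observable universe: 5.0 × 10^30 / 10^22 = 5 × 10^8, i.e., the calculated number of microbial cells exceeds the number of stars by between eight and nine orders of magnitude.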

History

While microbes have been studied since the seventeenth century, this research was conducted from a primarily physiological rather than an ecological perspective.[10] For instance, Louis Pasteur and his disciples were interested in the problem of microbial distribution both on land and in the ocean.[11] Martinus Beijerinck invented the enrichment culture, a fundamental method for studying microbes from the environment. He is often incorrectly credited with framing the microbial biogeographic idea that "everything is everywhere, but, the environment selects", which was in fact stated by Lourens Baas Becking.[12] Sergei Winogradsky was one of the first researchers to attempt to understand microorganisms outside the medical context, making him among the first students of microbial ecology and environmental microbiology; he discovered chemosynthesis and developed the Winogradsky column in the process.[13]:644

Beijerinck and Winogradsky, however, were focused on the physiology of microorganisms, not on the microbial habitat or their ecological interactions.[10] Modern microbial ecology was launched by Robert Hungate and coworkers, who investigated the rumen ecosystem. The study of the rumen required Hungate to develop techniques for culturing anaerobic microbes, and he also pioneered a quantitative approach to the study of microbes and their ecological activities that differentiated the relative contributions of species and catabolic pathways.[10]

Roles

Microorganisms are the backbone of all ecosystems, but even more so in the zones where photosynthesis is unable to take place because of the absence of light. In such zones, chemosynthetic microbes provide energy and carbon to the other organisms.

Other microbes are decomposers, with the ability to recycle nutrients from other organisms' waste products. These microbes play a vital role in biogeochemical cycles.[14] The nitrogen cycle, the phosphorus cycle, the sulphur cycle and the carbon cycle all depend on microorganisms in one way or another. For example, nitrogen gas, which makes up 78% of the earth's atmosphere, is unavailable to most organisms until it is converted to a biologically available form by the microbial process of nitrogen fixation.

Due to the high level of horizontal gene transfer among microbial communities,[15] microbial ecology is also of importance to studies of evolution.[16]

Symbiosis

Microbes, especially bacteria, often engage in symbiotic relationships (either positive or negative) with other microorganisms or with larger organisms. Although microbes are physically small, their symbiotic relationships are significant in eukaryotic processes and their evolution.[17][18] The types of symbiotic relationship that microbes participate in include mutualism, commensalism, parasitism[19] and amensalism,[20] and these relationships affect ecosystems in many ways.

Mutualism

Mutualism in microbial ecology is a relationship between microbial species, or between microbial species and humans, that allows both sides to benefit.[21] One example is syntrophy, also known as cross-feeding,[20] classically illustrated by "Methanobacterium omelianskii". Although initially thought of as one microbial species, this system is actually two species: an S organism and Methanobacterium bryantii. The S organism provides the bacterium with the H2 that the bacterium needs in order to grow and produce methane.[17][22] The reaction used by the S organism to produce H2 is endergonic (and so thermodynamically unfavorable); however, when coupled to the reaction used by Methanobacterium bryantii to produce methane, the overall reaction becomes exergonic.[17] Thus the two organisms are in a mutualistic relationship which allows them to grow and thrive in an environment that is deadly for either species alone. Lichen is another example of a symbiotic organism.[22]
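The energetics of this coupling can be made concrete with approximate standard free energies from the classic description of this co-culture, in which the S organism oxidizes ethanol (the values below are typical textbook figures and depend on conditions; they are given for illustration):

CH3CH2OH + H2O -> CH3COO- + H+ + 2 H2            dG°' ~ +9.6 kJ  (S organism; endergonic)
4 H2 + CO2 -> CH4 + 2 H2O                        dG°' ~ -131 kJ  (M. bryantii; exergonic)
2 CH3CH2OH + CO2 -> 2 CH3COO- + 2 H+ + CH4       dG°' ~ -112 kJ  (coupled overall: 2 × 9.6 - 131)

Because the methanogen continuously consumes H2 and keeps its concentration very low, the actual in-situ free-energy change of the first reaction also becomes negative, which is what allows the S organism to grow.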

Amensalism

Amensalism (also commonly known as antagonism) is a type of symbiotic relationship in which one species/organism is harmed while the other remains unaffected.[21] One example of such a relationship in microbial ecology is between the microbial species Lactobacillus casei and Pseudomonas taetrolens.[23] When the two co-exist in an environment, Pseudomonas taetrolens shows inhibited growth and decreased production of lactobionic acid (its main product), most likely due to the byproducts created by Lactobacillus casei during its production of lactic acid.[24] However, Lactobacillus casei shows no difference in its behaviour, and as such this relationship can be defined as amensalism.

Microbial resource management

Biotechnology may be used alongside microbial ecology to address a number of environmental and economic challenges. For example, molecular techniques such as community fingerprinting can be used to track changes in microbial communities over time or to assess their biodiversity. Managing the carbon cycle to sequester carbon dioxide and prevent excess methanogenesis is important in mitigating global warming, and the prospects of bioenergy are being expanded by the development of microbial fuel cells. Microbial resource management advocates a more progressive attitude towards disease, whereby biological control agents are favoured over attempts at eradication. Fluxes in microbial communities have to be better characterized for this field's potential to be realised.[25] In addition, there are also clinical implications, as marine microbial symbioses are a valuable source of existing and novel antimicrobial agents, and thus offer another line of inquiry in the evolutionary arms race of antibiotic resistance, a pressing concern for researchers.[26]

In built environment and human interaction

Microbes exist in all areas, including homes, offices, commercial centers, and hospitals. In 2016, the journal Microbiome published a collection of various works studying the microbial ecology of the built environment.[27]

A 2006 study of pathogenic bacteria in hospitals found that their ability to survive varied by type, with some surviving for only a few days while others survived for months.[28]

The lifespan of microbes in the home varies similarly. Generally, bacteria and viruses require a wet environment with a humidity of over 10 percent.[29] E. coli can survive for a few hours to a day.[29] Bacteria which form spores can survive longer, with Staphylococcus aureus surviving potentially for weeks or, in the case of Bacillus anthracis, years.[29]

In the home, pets can be carriers of bacteria; for example, reptiles are commonly carriers of salmonella.[30]

S. aureus is particularly common, and asymptomatically colonizes about 30% of the human population;[31] attempts to decolonize carriers have met with limited success[32] and generally involve mupirocin nasally and chlorhexidine washing, potentially along with vancomycin and cotrimoxazole to address intestinal and urinary tract infections.[33]

Antimicrobials

Some metals, particularly copper and silver, have antimicrobial properties. The use of antimicrobial copper-alloy touch surfaces is a technique that has begun to be used in the 21st century to prevent the transmission of bacteria.[34] Silver nanoparticles have also begun to be incorporated into building surfaces and fabrics, although concerns have been raised about their potential side-effects on human health.

Self-awareness

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Sel...