Thursday, June 14, 2018

Neuroimaging

From Wikipedia, the free encyclopedia


Computed Tomography (CT) of a head, from top to base of the skull

Para-sagittal MRI of the head in a patient with benign familial macrocephaly.

Neuroimaging or brain imaging is the use of various techniques to either directly or indirectly image the structure, function, or pharmacology of the nervous system. It is a relatively new discipline within medicine, neuroscience, and psychology.[1] Physicians who specialize in the performance and interpretation of neuroimaging in the clinical setting are neuroradiologists.

Neuroimaging falls into two broad categories:
  • Structural imaging, which deals with the structure of the nervous system and the diagnosis of gross (large-scale) intracranial disease, such as a tumor, as well as injury.
  • Functional imaging, which is used to diagnose metabolic diseases and lesions on a finer scale and for neurological and cognitive-psychology research. Functional imaging enables, for example, the processing of information by centers in the brain to be visualized directly. Such processing causes the involved area of the brain to increase metabolism and "light up" on the scan. One of the more controversial uses of neuroimaging has been research into "thought identification" or mind-reading.

History

The first chapter of the history of neuroimaging traces back to the Italian neuroscientist Angelo Mosso, who invented the 'human circulation balance', which could non-invasively measure the redistribution of blood during emotional and intellectual activity.[2] However, although briefly mentioned by William James in 1890, the details and precise workings of this balance and the experiments Mosso performed with it remained largely unknown until the recent discovery of the original instrument, as well as Mosso's reports, by Stefano Sandrone and colleagues.[3]

In 1918 the American neurosurgeon Walter Dandy introduced the technique of ventriculography. X-ray images of the ventricular system within the brain were obtained by injection of filtered air directly into one or both lateral ventricles of the brain. Dandy also observed that air introduced into the subarachnoid space via lumbar spinal puncture could enter the cerebral ventricles and also demonstrate the cerebrospinal fluid compartments around the base of the brain and over its surface. This technique was called pneumoencephalography.

In 1927 Egas Moniz introduced cerebral angiography, whereby both normal and abnormal blood vessels in and around the brain could be visualized with great precision.

In the early 1970s, Allan McLeod Cormack and Godfrey Newbold Hounsfield introduced computerized axial tomography (CAT or CT scanning), and ever more detailed anatomic images of the brain became available for diagnostic and research purposes. Cormack and Hounsfield won the 1979 Nobel Prize for Physiology or Medicine for their work. Soon after the introduction of CAT, in the early 1980s the development of radioligands allowed single photon emission computed tomography (SPECT) and positron emission tomography (PET) of the brain.

More or less concurrently, magnetic resonance imaging (MRI or MR scanning) was developed by researchers including Peter Mansfield and Paul Lauterbur, who were awarded the Nobel Prize for Physiology or Medicine in 2003. In the early 1980s MRI was introduced clinically, and during the 1980s a veritable explosion of technical refinements and diagnostic MR applications took place. Scientists soon learned that the large blood flow changes measured by PET could also be imaged by the correct type of MRI. Functional magnetic resonance imaging (fMRI) was born, and since the 1990s, fMRI has come to dominate the brain mapping field due to its low invasiveness, lack of radiation exposure, and relatively wide availability.

In the early 2000s the field of neuroimaging reached the stage where limited practical applications of functional brain imaging became feasible. The main application area is crude forms of brain-computer interface.

Indications

Neuroimaging follows a neurological examination in which a physician has found cause to more deeply investigate a patient who has or may have a neurological disorder.

One of the more common neurological problems which a person may experience is simple syncope.[4][5] In cases of simple syncope in which the patient's history does not suggest other neurological symptoms, the diagnosis includes a neurological examination but routine neurological imaging is not indicated because the likelihood of finding a cause in the central nervous system is extremely low and the patient is unlikely to benefit from the procedure.[5]

Neuroimaging is not indicated for patients with stable headaches which are diagnosed as migraine.[6] Studies indicate that presence of migraine does not increase a patient's risk for intracranial disease.[6] A diagnosis of migraine which notes the absence of other problems, such as papilledema, would not indicate a need for neuroimaging.[6] In the course of conducting a careful diagnosis, the physician should consider whether the headache has a cause other than the migraine and might require neuroimaging.[6]

Another indication for neuroimaging is CT-, MRI- and PET-guided stereotactic surgery or radiosurgery for treatment of intracranial tumors, arteriovenous malformations and other surgically treatable conditions.[7][8][9][10]

Brain imaging techniques

Computed axial tomography

Computed tomography (CT) or computed axial tomography (CAT) scanning uses a series of x-rays of the head taken from many different directions. Typically used for quickly viewing brain injuries, CT scanning uses a computer program that performs a numerical integral calculation (the inverse Radon transform) on the measured x-ray series to estimate how much of an x-ray beam is absorbed in a small volume of the brain. Typically the information is presented as cross sections of the brain.[11]
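To make the reconstruction step concrete, here is a minimal sketch of the same idea using scikit-image: projections of a synthetic phantom are simulated with the forward Radon transform and then inverted with filtered back-projection (iradon). This is an illustration of the principle, not clinical reconstruction code.

    # Sketch of CT-style reconstruction: simulate x-ray projections of a
    # synthetic phantom (forward Radon transform), then recover the slice
    # with filtered back-projection.  Illustrative only.
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon

    image = shepp_logan_phantom()                      # synthetic "head" slice
    angles = np.linspace(0.0, 180.0, 180, endpoint=False)

    sinogram = radon(image, theta=angles)              # simulated projections
    reconstruction = iradon(sinogram, theta=angles)    # inverse Radon transform

    error = np.sqrt(np.mean((reconstruction - image) ** 2))
    print(f"RMS reconstruction error: {error:.4f}")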

Diffuse optical imaging

Diffuse optical imaging (DOI) or diffuse optical tomography (DOT) is a medical imaging modality which uses near infrared light to generate images of the body. The technique measures the optical absorption of haemoglobin, and relies on the absorption spectrum of haemoglobin varying with its oxygenation status. High-density diffuse optical tomography (HD-DOT) has been compared directly to fMRI using response to visual stimulation in subjects studied with both techniques, with reassuringly similar results.[12] HD-DOT has also been compared to fMRI in terms of language tasks and resting state functional connectivity.[13]
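The oxygenation dependence can be made concrete with the modified Beer-Lambert law, which relates optical-density changes at two near-infrared wavelengths to concentration changes of oxy- and deoxy-haemoglobin. The sketch below uses illustrative, uncalibrated numbers for the extinction coefficients, path length, and differential pathlength factor; they are assumptions, not tabulated values.

    # Sketch: recover changes in oxy- and deoxy-haemoglobin concentration
    # from optical-density changes at two wavelengths via the modified
    # Beer-Lambert law.  All numbers below are illustrative assumptions.
    import numpy as np

    # rows: wavelengths (~760 nm, ~850 nm); cols: [HbO2, HbR]
    # illustrative extinction coefficients (cm^-1 per mM)
    E = np.array([[1.5, 3.8],
                  [2.5, 1.8]])
    d = 3.0      # source-detector separation in cm (assumed)
    dpf = 6.0    # differential pathlength factor (assumed)

    delta_od = np.array([0.012, 0.018])   # measured OD changes (made up)

    # delta_od = (E @ delta_c) * d * dpf  ->  solve for concentration changes
    delta_c = np.linalg.solve(E * d * dpf, delta_od)
    print(f"dHbO2 = {delta_c[0]:+.5f} mM, dHbR = {delta_c[1]:+.5f} mM")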

Event-related optical signal

Event-related optical signal (EROS) is a brain-scanning technique which uses infrared light through optical fibers to measure changes in the optical properties of active areas of the cerebral cortex. Whereas techniques such as diffuse optical tomography (DOT) and near infrared spectroscopy (NIRS) measure optical absorption of haemoglobin, and thus are based on blood flow, EROS takes advantage of the scattering properties of the neurons themselves, and thus provides a much more direct measure of cellular activity. EROS can pinpoint activity in the brain within millimeters (spatially) and within milliseconds (temporally). Its biggest downside is the inability to detect activity more than a few centimeters deep. EROS is a new, relatively inexpensive technique that is non-invasive to the test subject. It was developed at the University of Illinois at Urbana-Champaign, where it is now used in the Cognitive Neuroimaging Laboratory of Dr. Gabriele Gratton and Dr. Monica Fabiani.

Magnetic resonance imaging


Sagittal MRI slice at the midline.

Magnetic resonance imaging (MRI) uses magnetic fields and radio waves to produce high quality two- or three-dimensional images of brain structures without use of ionizing radiation (X-rays) or radioactive tracers.

Functional magnetic resonance imaging


Axial MRI slice at the level of the basal ganglia, showing fMRI BOLD signal changes overlaid in red (increase) and blue (decrease) tones.

Functional magnetic resonance imaging (fMRI) and arterial spin labeling (ASL) rely on the paramagnetic properties of oxygenated and deoxygenated hemoglobin to produce images of changing blood flow in the brain associated with neural activity. This allows images to be generated that reflect which brain structures are activated (and how) during performance of different tasks or at resting state. According to the oxygenation hypothesis, changes in oxygen usage in regional cerebral blood flow during cognitive or behavioral activity can be attributed to the neurons of that region, linking their activity directly to the cognitive or behavioral task being performed.

Most fMRI scanners allow subjects to be presented with different visual images, sounds and touch stimuli, and to perform different actions such as pressing a button or moving a joystick. Consequently, fMRI can be used to reveal brain structures and processes associated with perception, thought and action. The resolution of fMRI is about 2-3 millimeters at present, limited by the spatial spread of the hemodynamic response to neural activity. It has largely superseded PET for the study of brain activation patterns. PET, however, retains the significant advantage of being able to identify specific brain receptors (or transporters) associated with particular neurotransmitters through its ability to image radiolabelled receptor "ligands" (receptor ligands are any chemicals that stick to receptors).
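The hemodynamic response mentioned above is commonly modeled by convolving a train of neural events with a canonical double-gamma hemodynamic response function (HRF). The sketch below uses textbook-style gamma parameters, which are an assumption rather than a fitted model; it shows why the measured BOLD signal is a delayed, smoothed version of the underlying neural activity.

    # Sketch: model a BOLD time course as a stimulus train convolved with
    # a canonical double-gamma HRF.  The gamma parameters are common
    # textbook choices, assumed here for illustration.
    import numpy as np
    from scipy.stats import gamma

    dt = 0.1                                          # seconds per sample
    t = np.arange(0, 30, dt)
    hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0    # peak ~5 s, undershoot
    hrf /= hrf.sum()

    stimulus = np.zeros(600)                          # 60 s experiment
    stimulus[[50, 200, 350]] = 1.0                    # three brief neural events

    bold = np.convolve(stimulus, hrf)[: stimulus.size]
    print(f"BOLD peaks ~{(np.argmax(bold) - 50) * dt:.1f} s after the first event")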

As well as research on healthy subjects, fMRI is increasingly used for the medical diagnosis of disease. Because fMRI is exquisitely sensitive to oxygen usage in blood flow, it is extremely sensitive to early changes in the brain resulting from ischemia (abnormally low blood flow), such as the changes which follow stroke. Early diagnosis of certain types of stroke is increasingly important in neurology, since substances which dissolve blood clots may be used in the first few hours after certain types of stroke occur, but are dangerous to use afterwards. Brain changes seen on fMRI may help to make the decision to treat with these agents. fMRI techniques can also determine which of a set of known images a subject is viewing, with between 72% and 90% accuracy where chance would achieve 0.8%.[14][15]
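Such image identification is, at its core, pattern classification over voxel activity. The toy sketch below uses entirely synthetic "voxel" data and a linear classifier; it is not the method of the cited studies, only the general recipe.

    # Toy sketch of fMRI decoding: train a linear classifier to identify
    # which of several known images a subject is viewing from voxel
    # activity patterns.  All data here are synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_images, n_trials, n_voxels = 5, 40, 200

    # each image evokes a distinct mean voxel pattern plus trial noise
    patterns = rng.normal(size=(n_images, n_voxels))
    X = np.repeat(patterns, n_trials, axis=0) + rng.normal(
        scale=2.0, size=(n_images * n_trials, n_voxels))
    y = np.repeat(np.arange(n_images), n_trials)

    scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
    print(f"decoding accuracy: {scores.mean():.2%} (chance = {1 / n_images:.0%})")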

Magnetoencephalography

Magnetoencephalography (MEG) is an imaging technique used to measure the magnetic fields produced by electrical activity in the brain via extremely sensitive devices such as superconducting quantum interference devices (SQUIDs) or spin exchange relaxation-free[16] (SERF) magnetometers. MEG offers a very direct measurement of neural electrical activity (compared to fMRI for example) with very high temporal resolution but relatively low spatial resolution. The advantage of measuring the magnetic fields produced by neural activity is that they are likely to be less distorted by surrounding tissue (particularly the skull and scalp) compared to the electric fields measured by electroencephalography (EEG). Specifically, it can be shown that magnetic fields produced by electrical activity are not affected by the surrounding head tissue when the head is modeled as a set of concentric spherical shells, each being an isotropic homogeneous conductor. Real heads are non-spherical and have largely anisotropic conductivities (particularly white matter and skull). While skull anisotropy has negligible effect on MEG (unlike EEG), white matter anisotropy strongly affects MEG measurements for radial and deep sources.[17] Note, however, that the skull was assumed to be uniformly anisotropic in this study, which is not true for a real head: the absolute and relative thicknesses of the diploë and table layers vary among and within the skull bones. This makes it likely that MEG is also affected by the skull anisotropy,[18] although probably not to the same degree as EEG.

There are many uses for MEG, including assisting surgeons in localizing a pathology, assisting researchers in determining the function of various parts of the brain, neurofeedback, and others.

Positron emission tomography

Positron emission tomography (PET) measures emissions from radioactively labeled metabolically active chemicals that have been injected into the bloodstream. The emission data are computer-processed to produce 2- or 3-dimensional images of the distribution of the chemicals throughout the brain.[19]:57 The positron emitting radioisotopes used are produced by a cyclotron, and chemicals are labeled with these radioactive atoms. The labeled compound, called a radiotracer, is injected into the bloodstream and eventually makes its way to the brain. Sensors in the PET scanner detect the radioactivity as the compound accumulates in various regions of the brain. A computer uses the data gathered by the sensors to create multicolored 2- or 3-dimensional images that show where the compound acts in the brain. Especially useful are a wide array of ligands used to map different aspects of neurotransmitter activity, with by far the most commonly used PET tracer being a labeled form of glucose (see Fludeoxyglucose (18F) (FDG)).

The greatest benefit of PET scanning is that different compounds can show blood flow and oxygen and glucose metabolism in the tissues of the working brain. These measurements reflect the amount of brain activity in the various regions of the brain and allow researchers to learn more about how the brain works. PET scans were superior to all other metabolic imaging methods in terms of resolution and speed of completion (as little as 30 seconds) when they first became available. The improved resolution permitted better study of the brain areas activated by a particular task. The biggest drawback of PET scanning is that because the radioactivity decays rapidly, it is limited to monitoring short tasks.[19]:60 Before fMRI technology came online, PET scanning was the preferred method of functional (as opposed to structural) brain imaging, and it continues to make large contributions to neuroscience.

PET scanning is also used for diagnosis of brain disease, most notably because brain tumors, strokes, and neuron-damaging diseases which cause dementia (such as Alzheimer's disease) all cause great changes in brain metabolism, which in turn causes easily detectable changes in PET scans. PET is probably most useful in early cases of certain dementias (with classic examples being Alzheimer's disease and Pick's disease) where the early damage is too diffuse and makes too little difference in brain volume and gross structure to change CT and standard MRI images enough to reliably differentiate it from the "normal" range of cortical atrophy which occurs with aging (in many but not all persons) and which does not cause clinical dementia.

Single-photon emission computed tomography

Single-photon emission computed tomography (SPECT) is similar to PET and uses gamma ray-emitting radioisotopes and a gamma camera to record data that a computer uses to construct two- or three-dimensional images of active brain regions.[20] SPECT relies on an injection of radioactive tracer, or "SPECT agent," which is rapidly taken up by the brain but does not redistribute. Uptake of SPECT agent is nearly 100% complete within 30 to 60 seconds, reflecting cerebral blood flow (CBF) at the time of injection. These properties of SPECT make it particularly well-suited for epilepsy imaging, which is usually made difficult by problems with patient movement and variable seizure types. SPECT provides a "snapshot" of cerebral blood flow since scans can be acquired after seizure termination (so long as the radioactive tracer was injected at the time of the seizure). A significant limitation of SPECT is its poor resolution (about 1 cm) compared to that of MRI. Today, SPECT machines with dual detector heads are commonly used, although triple-detector-head machines are also available. Tomographic reconstruction (mainly used for functional "snapshots" of the brain) requires multiple projections from detector heads which rotate around the human skull, so some researchers have developed 6- and 11-detector-head SPECT machines to cut imaging time and give higher resolution.[21][22]

Like PET, SPECT also can be used to differentiate different kinds of disease processes which produce dementia, and it is increasingly used for this purpose. Neuro-PET has a disadvantage of requiring use of tracers with half-lives of at most 110 minutes, such as FDG. These must be made in a cyclotron, and are expensive or even unavailable if necessary transport times are prolonged more than a few half-lives. SPECT, however, is able to make use of tracers with much longer half-lives, such as technetium-99m, and as a result, is far more widely available.
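The transport constraint follows from simple exponential decay, A(t) = A0 * 2**(-t / half_life). Using FDG's 110-minute half-life from the text and technetium-99m's standard half-life of roughly 6 hours, a short calculation (with a hypothetical 4-hour delivery) shows why transport time matters far more for PET tracers:

    # Worked example: fraction of a tracer's activity surviving transport.
    # FDG's 110-minute half-life is stated in the text; Tc-99m's ~6-hour
    # half-life is a standard value.  The 4-hour delivery is hypothetical.
    half_lives_min = {"FDG (PET)": 110.0, "Tc-99m (SPECT)": 6.0 * 60}

    transport_min = 4 * 60
    for tracer, t_half in half_lives_min.items():
        remaining = 2 ** (-transport_min / t_half)
        print(f"{tracer}: {remaining:.1%} of activity remains after 4 h")

For the numbers above, only about 22% of the FDG activity survives the trip, versus roughly 63% for technetium-99m, which is the economics behind SPECT's wider availability.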

Cranial ultrasound

Cranial ultrasound is usually only used in babies, whose open fontanelles provide acoustic windows allowing ultrasound imaging of the brain. Advantages include absence of ionising radiation and the possibility of bedside scanning, but the lack of soft-tissue detail means MRI may be preferred for some conditions.

Advantages and Concerns of Neuroimaging Techniques

Functional Magnetic Resonance Imaging (fMRI)

fMRI is commonly classified as minimal-to-moderate risk due to its non-invasiveness compared to other imaging methods. fMRI uses blood oxygenation level dependent (BOLD) contrast to produce its form of imaging. Because BOLD contrast arises from a naturally occurring process in the body, fMRI is often preferred over imaging methods that require radioactive markers to produce similar imaging.[23] A concern in the use of fMRI is its use in individuals with medical implants or devices and metallic items in the body. The strong magnetic field produced by the equipment can cause failure of medical devices and attract metallic objects in the body if they are not properly screened for. Currently, the FDA classifies medical implants and devices into three categories, depending on MR-compatibility: MR-safe (safe in all MR environments), MR-unsafe (unsafe in any MR environment), and MR-conditional (MR-compatible in certain environments, requiring further information).[24]

Computed Tomography (CT) Scan

The CT scan was introduced in the 1970s and quickly became one of the most widely used methods of imaging. A CT scan can be performed in under a second and produce rapid results for clinicians; its ease of use led to an increase in CT scans performed in the United States from 3 million in 1980 to 62 million in 2007. Clinicians oftentimes order multiple scans: in one study of CT scan usage, 30% of individuals underwent at least 3 scans.[26] CT scans can expose patients to levels of radiation 100-500 times higher than traditional x-rays, with higher radiation doses producing better resolution imaging.[27] While easy to use, the increase in CT scan use, especially in asymptomatic patients, is a topic of concern, since patients are exposed to significant levels of radiation.[26]

Positron Emission Tomography (PET)

In PET scans, imaging does not rely on intrinsic biological processes but on a foreign substance injected into the bloodstream and traveling to the brain. Patients are injected with radioisotopes that are metabolized in the brain and emit positrons to produce a visualization of brain activity.[23] The amount of radiation a patient is exposed to in a PET scan is relatively small, comparable to the amount of environmental radiation an individual is exposed to across a year. PET radioisotopes have limited exposure time in the body, as they commonly have very short half-lives (~2 hours) and decay rapidly.[28] Currently, fMRI is a preferred method of imaging brain activity compared to PET, since it does not involve radiation, has higher temporal resolution than PET, and is more readily available in most medical settings.[23]

Magnetoencephalography (MEG) & Electroencephalography (EEG)

The high temporal resolution of MEG and EEG allows these methods to measure brain activity down to the millisecond. Neither MEG nor EEG requires exposing the patient to radiation. EEG electrodes detect electrical signals produced by neurons to measure brain activity, and MEG uses the oscillations in the magnetic field produced by these electrical currents to measure activity. A barrier to widespread usage of MEG is pricing, as MEG systems can cost millions of dollars. EEG is a much more widely used method to achieve such temporal resolution, as EEG systems cost much less than MEG systems. A disadvantage of both EEG and MEG is their poor spatial resolution compared to fMRI.[23]

Criticism and cautions

Some scientists have criticized the brain image-based claims made in scientific journals and the popular press, like the discovery of "the part of the brain responsible" for functions like talents, specific memories, or generating emotions such as love. Many mapping techniques have a relatively low resolution, with hundreds of thousands of neurons in a single voxel. Many functions also involve multiple parts of the brain, meaning that this type of claim is probably both unverifiable with the equipment used and generally based on an incorrect assumption about how brain functions are divided. It may be that most brain functions will only be described correctly after being measured with much more fine-grained measurements that look not at large regions but instead at a very large number of tiny individual brain circuits. Many of these studies also have technical problems, like small sample size or poor equipment calibration, which means they cannot be reproduced; such considerations are sometimes ignored to produce a sensational journal article or news headline. In some cases the brain mapping techniques are used for commercial purposes, lie detection, or medical diagnosis in ways which have not been scientifically validated.[29]

Biomedical engineering

From Wikipedia, the free encyclopedia

Ultrasound representation of the urinary bladder (black butterfly-like shape) and a hyperplastic prostate. An example of practical science and medical science working together.

Example of an approximately 40,000 probe spotted oligo microarray with enlarged inset to show detail.

Biomedical engineering (BME) is the application of engineering principles and design concepts to medicine and biology for healthcare purposes (e.g. diagnostic or therapeutic). This field seeks to close the gap between engineering and medicine, combining the design and problem-solving skills of engineering with medical and biological sciences to advance health care treatment, including diagnosis, monitoring, and therapy.[1] Biomedical engineering has only recently emerged as its own study, as compared to many other engineering fields. Such an evolution is common as a new field transitions from being an interdisciplinary specialization among already-established fields to being considered a field in itself. Much of the work in biomedical engineering consists of research and development, spanning a broad array of subfields (see below). Prominent biomedical engineering applications include the development of biocompatible prostheses, various diagnostic and therapeutic medical devices ranging from clinical equipment to micro-implants, common imaging equipment such as MRIs and EKG/ECGs, regenerative tissue growth, pharmaceutical drugs and therapeutic biologicals.

Bioinformatics

Bioinformatics is an interdisciplinary field that develops methods and software tools for understanding biological data. As an interdisciplinary field of science, bioinformatics combines computer science, statistics, mathematics, and engineering to analyze and interpret biological data.

Bioinformatics is considered both an umbrella term for the body of biological studies that use computer programming as part of their methodology, as well as a reference to specific analysis "pipelines" that are repeatedly used, particularly in the field of genomics. Common uses of bioinformatics include the identification of candidate genes and single-nucleotide polymorphisms (SNPs). Often, such identification is made with the aim of better understanding the genetic basis of disease, unique adaptations, desirable properties (especially in agricultural species), or differences between populations. In a less formal way, bioinformatics also tries to understand the organisational principles within nucleic acid and protein sequences.
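As a concrete, highly simplified illustration of SNP identification, the sketch below reports positions where two pre-aligned DNA sequences differ by a single nucleotide. Real pipelines add read alignment, quality filtering, and statistical genotyping; the sequences here are made up.

    # Minimal sketch of SNP-style comparison: report positions where two
    # pre-aligned DNA sequences differ by a single nucleotide.
    def find_snps(ref: str, sample: str):
        """Yield (position, ref_base, sample_base) for point differences."""
        for i, (a, b) in enumerate(zip(ref, sample)):
            if a != b and a in "ACGT" and b in "ACGT":
                yield i, a, b

    reference = "ATGGCGTACCTAGGCATTA"   # made-up reference sequence
    sample    = "ATGGCATACCTAGGCGTTA"   # made-up sample sequence

    for pos, ref_base, alt_base in find_snps(reference, sample):
        print(f"SNP at position {pos}: {ref_base} -> {alt_base}")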

Biomechanics

Biomechanics is the study of the structure and function of the mechanical aspects of biological systems, at any level from whole organisms to organs, cells and cell organelles,[2] using the methods of mechanics.[3]

Biomaterial

A biomaterial is any matter, surface, or construct that interacts with living systems. As a science, biomaterials is about fifty years old. The study of biomaterials is called biomaterials science or biomaterials engineering. It has experienced steady and strong growth over its history, with many companies investing large amounts of money into the development of new products. Biomaterials science encompasses elements of medicine, biology, chemistry, tissue engineering and materials science.

Biomedical optics

Biomedical optics refers to the interaction of biological tissue and light, and how this can be exploited for sensing, imaging, and treatment.[4]

Tissue engineering

Tissue engineering, like genetic engineering (see below), is a major segment of biotechnology – which overlaps significantly with BME.

One of the goals of tissue engineering is to create artificial organs (via biological material) for patients that need organ transplants. Biomedical engineers are currently researching methods of creating such organs. Researchers have grown solid jawbones[5] and tracheas[6] from human stem cells towards this end. Several artificial urinary bladders have been grown in laboratories and transplanted successfully into human patients.[7] Bioartificial organs, which use both synthetic and biological components, are also a focus area in research, such as hepatic assist devices that use liver cells within an artificial bioreactor construct.[8]


Micromass cultures of C3H-10T1/2 cells at varied oxygen tensions stained with Alcian blue.

Genetic engineering

Genetic engineering, recombinant DNA technology, genetic modification/manipulation (GM) and gene splicing are terms that apply to the direct manipulation of an organism's genes. Unlike traditional breeding, an indirect method of genetic manipulation, genetic engineering utilizes modern tools such as molecular cloning and transformation to directly alter the structure and characteristics of target genes. Genetic engineering techniques have found success in numerous applications. Some examples include the improvement of crop technology (not a medical application, but see biological systems engineering), the manufacture of synthetic human insulin through the use of modified bacteria, the manufacture of erythropoietin in hamster ovary cells, and the production of new types of experimental mice such as the oncomouse (cancer mouse) for research.

Neural engineering

Neural engineering (also known as neuroengineering) is a discipline that uses engineering techniques to understand, repair, replace, or enhance neural systems. Neural engineers are uniquely qualified to solve design problems at the interface of living neural tissue and non-living constructs.

Pharmaceutical engineering

Pharmaceutical engineering is an interdisciplinary science that includes drug engineering, novel drug delivery and targeting, pharmaceutical technology, unit operations of chemical engineering, and pharmaceutical analysis. It may be deemed a part of pharmacy due to its focus on the use of technology on chemical agents to provide better medicinal treatment. The ISPE is an international body that certifies this now rapidly emerging interdisciplinary science.

Medical devices

This is an extremely broad category—essentially covering all health care products that do not achieve their intended results through predominantly chemical (e.g., pharmaceuticals) or biological (e.g., vaccines) means, and do not involve metabolism.

A medical device is intended for use in:
  • the diagnosis of disease or other conditions, or
  • the cure, mitigation, treatment, or prevention of disease.
Some examples include pacemakers, infusion pumps, the heart-lung machine, dialysis machines, artificial organs, implants, artificial limbs, corrective lenses, cochlear implants, ocular prosthetics, facial prosthetics, somato prosthetics, and dental implants.


Biomedical instrumentation amplifier schematic used in monitoring low voltage biological signals, an example of a biomedical engineering application of electronic engineering to electrophysiology.

Stereolithography is a practical example of medical modeling being used to create physical objects. Beyond modeling organs and the human body, emerging engineering techniques are also currently used in the research and development of new devices for innovative therapies,[9] treatments,[10] and patient monitoring[11] of complex diseases.

Medical devices are regulated and classified (in the US) as follows (see also Regulation):
  • Class I devices present minimal potential for harm to the user and are often simpler in design than Class II or Class III devices. Devices in this category include tongue depressors, bedpans, elastic bandages, examination gloves, and hand-held surgical instruments and other similar types of common equipment.
  • Class II devices are subject to special controls in addition to the general controls of Class I devices. Special controls may include special labeling requirements, mandatory performance standards, and postmarket surveillance. Devices in this class are typically non-invasive and include X-ray machines, PACS, powered wheelchairs, infusion pumps, and surgical drapes.
  • Class III devices generally require premarket approval (PMA), a scientific review to ensure the device's safety and effectiveness, in addition to the general controls of Class I. Examples include replacement heart valves, hip and knee joint implants, silicone gel-filled breast implants, implanted cerebellar stimulators, implantable pacemaker pulse generators and endosseous (intra-bone) implants.

Medical imaging

Medical/biomedical imaging is a major segment of medical devices. This area deals with enabling clinicians to directly or indirectly "view" things not visible in plain sight (for example, because of their size or location). This can involve utilizing ultrasound, magnetism, UV, radiology, and other means.


An MRI scan of a human head, an example of a biomedical engineering application of electrical engineering to diagnostic imaging.

Imaging technologies are often essential to medical diagnosis, and are typically the most complex equipment found in a hospital. They include fluoroscopy, magnetic resonance imaging (MRI), nuclear medicine, positron emission tomography (PET), PET-CT scans, projection radiography such as X-rays and CT scans, tomography, ultrasound, optical microscopy, and electron microscopy.

Implants

An implant is a kind of medical device made to replace and act as a missing biological structure (as compared with a transplant, which indicates transplanted biomedical tissue). The surfaces of implants that contact the body might be made of a biomedical material such as titanium, silicone or apatite, depending on which is the most functional. In some cases, implants contain electronics, e.g. artificial pacemakers and cochlear implants. Some implants are bioactive, such as subcutaneous drug delivery devices in the form of implantable pills or drug-eluting stents.


Artificial limbs: The right arm is an example of a prosthesis, and the left arm is an example of myoelectric control.

A prosthetic eye, an example of a biomedical engineering application of mechanical engineering and biocompatible materials to ophthalmology.

Bionics

Artificial body part replacements are one of the many applications of bionics. Concerned with the intricate and thorough study of the properties and function of human body systems, bionics may be applied to solve some engineering problems. Careful study of the different functions and processes of the eyes, ears, and other organs paved the way for improved cameras, television, radio transmitters and receivers, and many other useful tools. These developments have indeed made our lives better, but the best contribution that bionics has made is in the field of biomedical engineering (the building of useful replacements for various parts of the human body). Modern hospitals now have spare parts available to replace body parts badly damaged by injury or disease.[citation needed] Biomedical engineers work hand in hand with doctors to build these artificial body parts.

Clinical engineering

Clinical engineering is the branch of biomedical engineering dealing with the actual implementation of medical equipment and technologies in hospitals or other clinical settings. Major roles of clinical engineers include training and supervising biomedical equipment technicians (BMETs), selecting technological products/services and logistically managing their implementation, working with governmental regulators on inspections/audits, and serving as technological consultants for other hospital staff (e.g. physicians, administrators, I.T., etc.). Clinical engineers also advise and collaborate with medical device producers regarding prospective design improvements based on clinical experiences, as well as monitor the progression of the state of the art so as to redirect procurement patterns accordingly.

Their inherent focus on practical implementation of technology has tended to keep them oriented more towards incremental-level redesigns and reconfigurations, as opposed to revolutionary research & development or ideas that would be many years from clinical adoption; however, there is a growing effort to expand this time-horizon over which clinical engineers can influence the trajectory of biomedical innovation. In their various roles, they form a "bridge" between the primary designers and the end-users, by combining the perspectives of being both 1) close to the point-of-use, while 2) trained in product and process engineering. Clinical engineering departments will sometimes hire not just biomedical engineers, but also industrial/systems engineers to help address operations research/optimization, human factors, cost analysis, etc. Also see safety engineering for a discussion of the procedures used to design safe systems.

Rehabilitation engineering

Rehabilitation engineering is the systematic application of engineering sciences to design, develop, adapt, test, evaluate, apply, and distribute technological solutions to problems confronted by individuals with disabilities. Functional areas addressed through rehabilitation engineering may include mobility, communications, hearing, vision, and cognition, and activities associated with employment, independent living, education, and integration into the community.[1]

While some rehabilitation engineers have master's degrees in rehabilitation engineering, usually a subspecialty of biomedical engineering, most rehabilitation engineers have undergraduate or graduate degrees in biomedical engineering, mechanical engineering, or electrical engineering. A Portuguese university provides an undergraduate degree and a master's degree in Rehabilitation Engineering and Accessibility.[5][7] Qualification to become a rehabilitation engineer in the UK is possible via a university BSc Honours degree course, such as that offered by the Health Design & Technology Institute at Coventry University.[8]

The rehabilitation process for people with disabilities often entails the design of assistive devices, such as walking aids, intended to promote inclusion of their users into the mainstream of society, commerce, and recreation.


Schematic representation of a normal ECG trace showing sinus rhythm; an example of widely used clinical medical equipment (operates by applying electronic engineering to electrophysiology and medical diagnosis).

Regulatory issues

Regulatory requirements have steadily increased in recent decades in response to the many incidents caused by devices to patients. For example, from 2008 to 2011, in the US, there were 119 FDA recalls of medical devices classified as Class I. According to the U.S. Food and Drug Administration (FDA), a Class I recall is associated with "a situation in which there is a reasonable probability that the use of, or exposure to, a product will cause serious adverse health consequences or death".[12]

Regardless of the country-specific legislation, the main regulatory objectives coincide worldwide.[13] For example, in the medical device regulations, a product must be 1) safe and 2) effective, and 3) these properties must hold for all of the manufactured devices.

A product is safe if patients, users and third parties do not run unacceptable risks of physical harm (death, injury, ...) in its intended use. Protective measures have to be introduced on the devices to reduce residual risks to a level that is acceptable when compared with the benefit derived from the use of the device.

A product is effective if it performs as specified by the manufacturer in the intended use. Effectiveness is demonstrated through clinical evaluation, compliance with performance standards or demonstration of substantial equivalence with an already marketed device.

The previous features have to be ensured for all of the manufactured items of the medical device. This requires that a quality system be in place for all the relevant entities and processes that may impact safety and effectiveness over the whole medical device lifecycle.

The medical device engineering area is among the most heavily regulated fields of engineering, and practicing biomedical engineers must routinely consult and cooperate with regulatory law attorneys and other experts. The Food and Drug Administration (FDA) is the principal healthcare regulatory authority in the United States, having jurisdiction over medical devices, drugs, biologics, and combination products. The paramount objectives driving policy decisions by the FDA are the safety and effectiveness of healthcare products, which have to be assured through a quality system in place as specified under the 21 CFR 820 regulation. In addition, because biomedical engineers often develop devices and technologies for "consumer" use, such as physical therapy devices (which are also "medical" devices), these may also be governed in some respects by the Consumer Product Safety Commission. The greatest hurdles tend to be 510(k) "clearance" (typically for Class II devices) or premarket "approval" (typically for drugs and Class III devices).

In the European context, safety, effectiveness and quality are ensured through the "Conformity Assessment", defined as "the method by which a manufacturer demonstrates that its device complies with the requirements of the European Medical Device Directive". The directive specifies different procedures according to the class of the device, ranging from the simple Declaration of Conformity (Annex VII) for Class I devices to EC verification (Annex IV), Production quality assurance (Annex V), Product quality assurance (Annex VI) and Full quality assurance (Annex II). The Medical Device Directive specifies detailed procedures for certification. In general terms, these procedures include tests and verifications that are to be contained in specific deliverables such as the risk management file, the technical file and the quality system deliverables. The risk management file is the first deliverable and conditions the following design and manufacturing steps. The risk management stage shall drive the product so that product risks are reduced to an acceptable level with respect to the benefits expected for the patients from the use of the device. The technical file contains all the documentation data and records supporting medical device certification. The FDA technical file has similar content, although organized in a different structure. The quality system deliverables usually include procedures that ensure quality throughout the whole product life cycle. The same standard (ISO EN 13485) is usually applied for quality management systems in the US and worldwide.


Implants, such as artificial hip joints, are generally extensively regulated due to the invasive nature of such devices.

In the European Union, there are certifying entities named "Notified Bodies", accredited by European Member States. The Notified Bodies must ensure the effectiveness of the certification process for all medical devices apart from the class I devices where a declaration of conformity produced by the manufacturer is sufficient for marketing. Once a product has passed all the steps required by the Medical Device Directive, the device is entitled to bear a CE marking, indicating that the device is believed to be safe and effective when used as intended, and, therefore, it can be marketed within the European Union area.

The different regulatory arrangements sometimes result in particular technologies being developed first for either the U.S. or in Europe depending on the more favorable form of regulation. While nations often strive for substantive harmony to facilitate cross-national distribution, philosophical differences about the optimal extent of regulation can be a hindrance; more restrictive regulations seem appealing on an intuitive level, but critics decry the tradeoff cost in terms of slowing access to life-saving developments.

RoHS II

Directive 2011/65/EU, better known as RoHS 2, is a recast of legislation originally introduced in 2002. The original EU legislation, "Restrictions of Certain Hazardous Substances in Electrical and Electronics Devices" (RoHS Directive 2002/95/EC), was replaced and superseded by 2011/65/EU, published in July 2011 and commonly known as RoHS 2. RoHS seeks to limit the dangerous substances in circulation in electronics products, in particular toxins and heavy metals, which are subsequently released into the environment when such devices are recycled.

The scope of RoHS 2 is widened to include products previously excluded, such as medical devices and industrial equipment. In addition, manufacturers are now obliged to provide conformity risk assessments and test reports – or explain why they are lacking. For the first time, not only manufacturers, but also importers and distributors share a responsibility to ensure Electrical and Electronic Equipment within the scope of RoHS comply with the hazardous substances limits and have a CE mark on their products.

IEC 60601

The International Standard IEC 60601 for home healthcare electro-medical devices defines the requirements for devices used in the home healthcare environment. IEC 60601-1-11 (2010) must now be incorporated into the design and verification of a wide range of home use and point of care medical devices, along with other applicable standards in the IEC 60601 3rd edition series.

The mandatory date for implementation of the EN European version of the standard is June 1, 2013. The US FDA requires the use of the standard on June 30, 2013, while Health Canada recently extended the required date from June 2012 to April 2013. The North American agencies will only require these standards for new device submissions, while the EU will take the more severe approach of requiring all applicable devices placed on the market to comply with the home healthcare standard.

AS/NZS 3551:2012

AS/NZS 3551:2012 is the Australian and New Zealand standard for the management of medical devices. The standard specifies the procedures required to maintain a wide range of medical assets in a clinical setting (e.g. a hospital).[14] The standard is based on the IEC 60601 standards.

The standard covers a wide range of medical equipment management elements, including procurement, acceptance testing, maintenance (electrical safety and preventive maintenance testing) and decommissioning.

Training and certification

Education

Biomedical engineers require considerable knowledge of both engineering and biology, and typically have a bachelor's (B.Tech. or B.S.), master's (M.S., M.Tech., M.S.E., or M.Eng.) or doctoral (Ph.D.) degree in BME (biomedical engineering) or another branch of engineering with considerable potential for BME overlap. As interest in BME increases, many engineering colleges now have a biomedical engineering department or program, with offerings ranging from the undergraduate (B.Tech., B.S., B.Eng. or B.S.E.) to doctoral levels. Biomedical engineering has only recently emerged as its own discipline rather than a cross-disciplinary hybrid specialization of other disciplines, and BME programs at all levels are becoming more widespread, including the Bachelor of Science in Biomedical Engineering, which includes enough biological science content that many students use it as a "pre-med" major in preparation for medical school. The number of biomedical engineers is expected to rise as both a cause and effect of improvements in medical technology.[15]

In the U.S., an increasing number of undergraduate programs are also becoming recognized by ABET as accredited bioengineering/biomedical engineering programs. Over 65 programs are currently accredited by ABET.[16][17]

In Canada and Australia, accredited graduate programs in biomedical engineering are common, for example at universities such as McMaster University, and Ryerson University offers the first Canadian undergraduate BME program, a four-year B.Eng. program.[18][19][20][21] Polytechnique Montréal also offers a bachelor's degree in biomedical engineering.

As with many degrees, the reputation and ranking of a program may factor into the desirability of a degree holder for either employment or graduate admission. The reputation of many undergraduate degrees is also linked to the institution's graduate or research programs, which have some tangible factors for rating, such as research funding and volume, publications and citations. With BME specifically, the ranking of a university's hospital and medical school can also be a significant factor in the perceived prestige of its BME department/program.

Graduate education is a particularly important aspect of BME. While many engineering fields (such as mechanical or electrical engineering) do not require graduate-level training to obtain an entry-level job in the field, the majority of BME positions prefer or even require it.[22] Since most BME-related professions involve scientific research, such as pharmaceutical and medical device development, graduate education is almost a requirement (as undergraduate degrees typically do not provide sufficient research training and experience). This can be either a master's or doctoral degree; while in certain specialties a Ph.D. is notably more common than in others, it is hardly ever the majority (except in academia). In fact, the perceived need for some kind of graduate credential is so strong that some undergraduate BME programs will actively discourage students from majoring in BME without an expressed intention to also obtain a master's degree or apply to medical school afterwards.

Graduate programs in BME, like in other scientific fields, are highly varied, and particular programs may emphasize certain aspects within the field. They may also feature extensive collaborative efforts with programs in other fields (such as the University's Medical School or other engineering divisions), owing again to the interdisciplinary nature of BME. M.S. and Ph.D. programs will typically require applicants to have an undergraduate degree in BME, or another engineering discipline (plus certain life science coursework), or life science (plus certain engineering coursework).

Education in BME also varies greatly around the world. By virtue of its extensive biotechnology sector, its numerous major universities, and relatively few internal barriers, the U.S. has progressed a great deal in its development of BME education and training opportunities. Europe, which also has a large biotechnology sector and an impressive education system, has encountered trouble in creating uniform standards as the European community attempts to supplant some of the national jurisdictional barriers that still exist. Recently, initiatives such as BIOMEDEA have sprung up to develop BME-related education and professional standards.[23] Other countries, such as Australia, are recognizing and moving to correct deficiencies in their BME education.[24] Also, as high technology endeavors are usually marks of developed nations, some areas of the world are prone to slower development in education, including in BME.

Licensure/certification

As with other learned professions, each state has certain (fairly similar) requirements for becoming licensed as a registered Professional Engineer (PE), but in the US such a license is not required to be employed as an engineer in industry in the majority of situations (due to an exception known as the industrial exemption, which effectively applies to the vast majority of American engineers). The US model has generally been to require only those practicing engineers offering engineering services that impact the public welfare, safety, safeguarding of life, health, or property to be licensed, while engineers working in private industry without a direct offering of engineering services to the public or other businesses, education, and government need not be licensed. This is notably not the case in many other countries, where a license is as legally necessary to practice engineering as it is for law or medicine.

Biomedical engineering is regulated in some countries, such as Australia, but registration is typically only recommended and not required.[25]

In the UK, mechanical engineers working in the areas of Medical Engineering, Bioengineering or Biomedical engineering can gain Chartered Engineer status through the Institution of Mechanical Engineers. The Institution also runs the Engineering in Medicine and Health Division.[26] The Institute of Physics and Engineering in Medicine (IPEM) has a panel for the accreditation of MSc courses in Biomedical Engineering and Chartered Engineering status can also be sought through IPEM.

The Fundamentals of Engineering exam, the first (and more general) of two licensure examinations for most U.S. jurisdictions, does now cover biology (although technically not BME). For the second exam, called the Principles and Practices, Part 2, or the Professional Engineering exam, candidates may select a particular engineering discipline's content to be tested on; there is currently no BME option, meaning that any biomedical engineers seeking a license must prepare to take this examination in another category (which does not affect the actual license, since most jurisdictions do not recognize discipline specialties anyway). However, the Biomedical Engineering Society (BMES) is, as of 2009, exploring the possibility of seeking to implement a BME-specific version of this exam to facilitate biomedical engineers pursuing licensure.

Beyond governmental registration, certain private-sector professional/industrial organizations also offer certifications with varying degrees of prominence. One such example is the Certified Clinical Engineer (CCE) certification for Clinical engineers.

Career prospects

In 2012 there were about 19,400 biomedical engineers employed in the US, and the field was predicted to grow by 27% (much faster than average) from 2012 to 2022.[27] Biomedical engineering has the highest percentage of women engineers compared to other common engineering professions.

Quantum biology

From Wikipedia, the free encyclopedia

Quantum biology refers to applications of quantum mechanics and theoretical chemistry to biological objects and problems. Many biological processes involve the conversion of energy into forms that are usable for chemical transformations and are quantum mechanical in nature. Such processes involve chemical reactions, light absorption, formation of excited electronic states, transfer of excitation energy, and the transfer of electrons and protons (hydrogen ions) in chemical processes such as photosynthesis and cellular respiration.[1] Quantum biology may use computations to model biological interactions in light of quantum mechanical effects.[2] Quantum biology is concerned with the influence of non-trivial quantum phenomena,[3] as opposed to the so-called trivial quantum phenomena present in all biology by reduction to fundamental physics.

History

Early pioneers of quantum physics saw applications of quantum mechanics in biological problems. Erwin Schrödinger's 1944 book What is Life? discussed applications of quantum mechanics in biology.[4] Schrödinger introduced the idea of an "aperiodic crystal" that contained genetic information in its configuration of covalent chemical bonds. He further suggested that mutations are introduced by "quantum leaps". Other pioneers, such as Niels Bohr, Pascual Jordan, and Max Delbrück, argued that the quantum idea of complementarity was fundamental to the life sciences.[5] In 1963, Per-Olov Löwdin published proton tunneling as another mechanism for DNA mutation. In his paper, he stated that there is a new field of study called "quantum biology".[6]

Applications

Photosynthesis


Diagram of FMO complex. Light excites electrons in an antenna. The excitation then transfers through various proteins in the FMO complex to the reaction center to further photosynthesis.

Organisms that undergo photosynthesis initially absorb light energy through the process of electron excitation in an antenna. This antenna varies between organisms. Bacteria can use ring-like structures as antennas, whereas plants and other organisms use chlorophyll pigments to absorb photons. This electron excitation creates a separation of charge in a reaction site that is later converted into chemical energy for the cell to use. However, this electron excitation must be transferred in an efficient and timely manner, before the energy is lost in fluorescence.

Various structures are responsible for transferring energy from the antennas to a reaction site. One of the most well studied is the Fenna-Matthews-Olson (FMO) complex in green sulfur bacteria. FT electron spectroscopy studies show an efficiency of above 99% between absorption at the antenna and transfer to the reaction site, with short-lived intermediates.[7] This high efficiency cannot be explained by classical mechanisms such as a diffusion model.

A study published in 2007 claimed the identification of electronic quantum coherence[8] at -196 °C (77 K). A later study further claimed exceptionally long-lived quantum coherence at 4 °C that was further postulated to be responsible for the high efficiency of the excitation transfer between different pigments in the light-harvesting stage of photosynthesis.[9] It was, thus, suggested that nature through evolution had developed a way of protecting quantum coherence to enhance the efficiency of photosynthesis. However, critical follow-up studies question the interpretation of these results and assign the reported signatures of electronic quantum coherence to nuclear dynamics in the chromophores.[10][11][12][13][14] The claims of unexpectedly long coherence times sparked a great deal of research in the quantum physics community to explain their origin. A number of proposals were brought forward trying to explain the claimed long-lived coherence. According to one proposal, if each site within the complex feels its own environmental noise, then because of both quantum coherence and the thermal environment, the electron will not remain in any local minimum but will proceed to the reaction site.[13][15][16] Another proposal is that the rate of quantum coherence combined with electron tunneling creates an energy sink that moves the electron to the reaction site quickly.[17] Other work suggested that symmetries present in the geometric arrangement of the complex may favor efficient energy transfer to the reaction center, in a way that resembles perfect state transfer in quantum networks.[18] However, careful control experiments cast doubt on the interpretation that quantum effects last any longer than one hundred femtoseconds.[19]
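The contrast with classical diffusion can be illustrated with a toy tight-binding model, in which a single excitation hops coherently between coupled sites under a Hamiltonian. The site energies and coupling below are made-up numbers, not FMO parameters; the sketch only demonstrates coherent population transfer in principle.

    # Toy model of coherent excitation transfer: one excitation on a chain
    # of coupled pigment sites, evolved under a tight-binding Hamiltonian.
    # Energies and couplings are made-up numbers, not FMO parameters.
    import numpy as np
    from scipy.linalg import expm

    n_sites = 7
    energies = np.zeros(n_sites)          # degenerate sites (assumption)
    coupling = 1.0                        # nearest-neighbour coupling (assumed)

    H = np.diag(energies)
    for i in range(n_sites - 1):
        H[i, i + 1] = H[i + 1, i] = coupling

    psi0 = np.zeros(n_sites, dtype=complex)
    psi0[0] = 1.0                         # excitation starts at the "antenna" site

    for t in [0.0, 1.0, 2.0, 4.0]:        # time in units of 1/coupling (hbar = 1)
        psi_t = expm(-1j * H * t) @ psi0
        p_end = abs(psi_t[-1]) ** 2
        print(f"t = {t:.1f}: population on final site = {p_end:.3f}")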

Vision

Vision relies on quantized energy in order to convert light signals to an action potential in a process called phototransduction. In phototransduction, a photon interacts with a chromophore in a light receptor. The chromophore absorbs the photon and undergoes photoisomerization. This change in structure induces a change in the structure of the photoreceptor, and the resulting signal transduction pathways lead to a visual signal. The photoisomerization reaction occurs at a rapid rate, in under 200 femtoseconds,[20] and with high yield. Models suggest the use of quantum effects in shaping the ground state and excited state potentials in order to achieve this efficiency.[21]

Enzymatic activity (quantum biochemistry)

Enzymes may use quantum tunneling to transfer electrons over long distances. Tunneling refers to the ability of a particle of small mass to travel through energy barriers. Studies show that long-distance electron transfer between redox centers through quantum tunneling plays important roles in the enzymatic activity of photosynthesis and cellular respiration.[22][23] For example, studies show that long-range electron tunneling on the order of 15-30 Å plays a role in redox reactions in enzymes of cellular respiration.[24] Even though there are such large separations between redox sites within enzymes, electrons successfully transfer in a temperature-independent and distance-dependent manner. This suggests that electrons can tunnel under physiological conditions. Further research is needed to determine whether this specific tunneling is also coherent.
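The strong distance dependence can be estimated with the standard square-barrier tunneling approximation, T ~ exp(-2*kappa*d) with kappa = sqrt(2*m_e*V)/hbar. The 1 eV effective barrier height below is an illustrative assumption, since actual protein barriers vary; the point is the exponential suppression between 15 and 30 Å.

    # Worked estimate of electron tunneling probability vs. distance using
    # the square-barrier approximation.  The 1 eV barrier height is an
    # illustrative assumption; protein barrier heights vary.
    import numpy as np
    from scipy.constants import hbar, m_e, eV, angstrom

    barrier = 1.0 * eV                        # assumed effective barrier height
    kappa = np.sqrt(2 * m_e * barrier) / hbar # decay constant inside barrier

    for d in [15 * angstrom, 30 * angstrom]:
        T = np.exp(-2 * kappa * d)
        print(f"d = {d / angstrom:.0f} A: T ~ {T:.2e}")

Doubling the distance squares the already tiny transmission factor, which is consistent with the observed distance-dependent (and temperature-independent) transfer rates.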

Magnetoreception

Magnetoreception refers to the ability of animals to navigate using the inclination of the magnetic field of the Earth.[25] A possible explanation for magnetoreception is the radical pair mechanism.[26][27] The radical-pair mechanism is well established in spin chemistry,[28][29][30] and was speculated to apply to magnetoreception in 1978 by Schulten et al. In 2000, cryptochrome was proposed as the "magnetic molecule" that could harbor magnetically sensitive radical pairs. Cryptochrome, a flavoprotein found in the eyes of European robins and other animal species, is the only protein known to form photoinduced radical pairs in animals.[25] The function of cryptochrome is diverse across species; however, the photoinduction of radical pairs occurs by exposure to blue light, which excites an electron in a chromophore.[31]

Nevertheless, in the lab, the direction of weak magnetic fields can affect radical pairs' reactivity and therefore can "catalyze" the formation of chemical products. Whether this mechanism applies to magnetoreception and/or quantum biology, that is, whether Earth's magnetic field "catalyzes" the formation of biochemical products by the aid of entangled or non-entangled radical pairs, is doubly undetermined. As to the former, researchers found evidence for the radical-pair mechanism of magnetoreception when European robins, cockroaches, and garden warblers could no longer navigate when exposed to radio-frequency oscillating magnetic fields,[25] which specifically disturb radical-pair chemistry. To empirically suggest the involvement of entanglement, an experiment would need to be devised that could disturb entangled radical pairs without disturbing other radical pairs, or vice versa, which would first need to be demonstrated in a laboratory setting before being applied to magnetoreception.

Other biological applications

Other examples of quantum phenomena in biological systems include olfaction,[32] the conversion of chemical energy into motion,[33] DNA mutation,[6] and Brownian motors in many cellular processes.[34]
