
Thursday, May 22, 2025

Instrumentation

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Instrumentation

Instrumentation is a collective term for measuring instruments, used for indicating, measuring, and recording physical quantities. It is also a field of study covering the art and science of making measurement instruments, involving the related areas of metrology, automation, and control theory. The term has its origins in the art and science of scientific instrument-making.

Instrumentation can refer to devices as simple as direct-reading thermometers, or as complex as multi-sensor components of industrial control systems. Instruments can be found in laboratories, refineries, factories and vehicles, as well as in everyday household use (e.g., smoke detectors and thermostats).

Measurement parameters

Control valve

Instrumentation is used to measure many parameters (physical values), such as pressure, flow, temperature, and level.

History

A local instrumentation panel on a steam turbine

The history of instrumentation can be divided into several phases.

Pre-industrial

Elements of industrial instrumentation have long histories. Scales for comparing weights and simple pointers to indicate position are ancient technologies. Some of the earliest measurements were of time: one of the oldest water clocks was found in the tomb of the ancient Egyptian pharaoh Amenhotep I, buried around 1500 BCE. Water clocks were progressively improved, and by 270 BCE they incorporated the rudiments of an automatic control device.

In 1663 Christopher Wren presented the Royal Society with a design for a "weather clock". A drawing shows meteorological sensors moving pens over paper driven by clockwork. Such devices did not become standard in meteorology for two centuries. The concept has remained virtually unchanged as evidenced by pneumatic chart recorders, where a pressurized bellows displaces a pen. Integrating sensors, displays, recorders, and controls was uncommon until the industrial revolution, limited by both need and practicality.

Early industrial

The evolution of analogue control loop signalling from the pneumatic era to the electronic era

Early systems used direct process connections to local control panels for control and indication, which from the early 1930s saw the introduction of pneumatic transmitters and automatic 3-term (PID) controllers.

The ranges of pneumatic transmitters were defined by the need to control valves and actuators in the field. A signal of 3 to 15 psi (20 to 100 kPa, or 0.2 to 1.0 kg/cm²) became the standard, with 6 to 30 psi occasionally used for larger valves. Transistor electronics, commercialized by the mid-1950s, enabled wiring to replace pipes, initially with a range of 20 to 100 mA at up to 90 V for loop-powered devices, reducing to 4 to 20 mA at 12 to 24 V in more modern systems. A transmitter is a device that produces an output signal, often in the form of a 4–20 mA electrical current signal, although many other options using voltage, frequency, pressure, or Ethernet are possible.
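The "live zero" scaling of a 4–20 mA loop can be sketched in a few lines. This is an illustration only; the function name and range check below are assumptions, not any vendor's API:

```python
def current_to_value(current_ma, lo, hi):
    """Map a 4-20 mA loop current to an engineering value in [lo, hi].

    The 4 mA "live zero" means a reading near 0 mA can be detected
    as a broken loop rather than mistaken for a zero measurement.
    """
    if not 4.0 <= current_ma <= 20.0:
        raise ValueError("out of range: possible open circuit or fault")
    return lo + (current_ma - 4.0) / 16.0 * (hi - lo)

# A 12 mA signal from a hypothetical 0-100 degC transmitter reads mid-scale:
# current_to_value(12.0, 0.0, 100.0) -> 50.0
```

The same linear mapping applies to the 3–15 psi pneumatic standard, with 3 psi as the live zero.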

Instruments attached to a control system provided signals used to operate solenoids, valves, regulators, circuit breakers, relays and other devices. Such devices could control a desired output variable, and provide either remote monitoring or automated control capabilities.

Each instrument company introduced its own standard instrumentation signal, causing confusion until the 4–20 mA range became the standard electronic signal for transmitters and valves. This signal was eventually standardized as ANSI/ISA S50, "Compatibility of Analog Signals for Electronic Industrial Process Instruments", in the 1970s. The transformation of instrumentation from mechanical pneumatic transmitters, controllers, and valves to electronic instruments reduced maintenance costs, as electronic instruments were more dependable than mechanical ones, and increased efficiency and production thanks to their greater accuracy. Pneumatics retained some advantages, remaining favored in corrosive and explosive atmospheres.

Automatic process control

Example of a single industrial control loop, showing continuously modulated control of process flow

In the early years of process control, process indicators and control elements such as valves were monitored by an operator who walked around the unit, adjusting the valves to obtain the desired temperatures, pressures, and flows. As technology evolved, pneumatic controllers were invented and mounted in the field to monitor the process and control the valves, reducing the time operators spent monitoring the process. In later years the controllers themselves were moved to a central room: signals were sent into the control room to monitor the process, and output signals were sent back to the final control element, such as a valve, to adjust the process as needed. These controllers and indicators were mounted on a wall called a control board, and operators stood in front of it, walking back and forth to monitor the process indicators. This again reduced the number of operators needed and the time they spent walking around the units. The most common pneumatic signal range used during these years was 3–15 psig.

Large integrated computer-based systems

Pneumatic "three-term" PID controller, widely used before electronics became reliable, cheap, and safe to use in hazardous areas (Siemens Telepneu example)
A pre-DCS/SCADA era central control room. Whilst the controls are centralised in one place, they are still discrete and not integrated into one system.
A DCS control room where plant information and controls are displayed on computer graphics screens. The operators are seated and can view and control any part of the process from their screens, whilst retaining a plant overview.

Process control of large industrial plants has evolved through many stages. Initially, control would be from panels local to the process plant. However, this required a large manpower resource to attend to these dispersed panels, and there was no overall view of the process. The next logical development was the transmission of all plant measurements to a permanently staffed central control room. Effectively this was the centralization of all the localized panels, with the advantages of lower manning levels and easy overview of the process. Often the controllers were behind the control room panels, and all automatic and manual control outputs were transmitted back to plant.

However, whilst providing a central control focus, this arrangement was inflexible: each control loop had its own controller hardware, and continual operator movement within the control room was required to view different parts of the process. With the coming of electronic processors and graphic displays it became possible to replace these discrete controllers with computer-based algorithms, hosted on a network of input/output racks with their own control processors. These could be distributed around the plant and communicate with the graphic displays in the control room or rooms. The distributed control concept was born.

The introduction of DCSs and SCADA allowed easy interconnection and re-configuration of plant controls such as cascaded loops and interlocks, and easy interfacing with other production computer systems. It enabled sophisticated alarm handling, introduced automatic event logging, removed the need for physical records such as chart recorders, allowed the control racks to be networked and thereby located locally to plant to reduce cabling runs, and provided high level overviews of plant status and production levels.

Application

In some cases, the sensor is a very minor element of the mechanism. Digital cameras and wristwatches might technically meet the loose definition of instrumentation because they record and/or display sensed information. Under most circumstances neither would be called instrumentation, but when used to measure the elapsed time of a race and to document the winner at the finish line, both would be called instrumentation.

Household

A very simple example of an instrumentation system is a mechanical thermostat, used to control a household furnace and thus to control room temperature. A typical unit senses temperature with a bi-metallic strip. It displays temperature by a needle on the free end of the strip. It activates the furnace by a mercury switch. As the switch is rotated by the strip, the mercury makes physical (and thus electrical) contact between electrodes.
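The thermostat's behavior is bang-bang control with hysteresis: the mechanical arrangement switches on below the setpoint and off above it, with a dead band in between. A sketch (the function name, setpoint, and band width are illustrative choices):

```python
def thermostat_step(temp_c, furnace_on, setpoint_c=20.0, band_c=1.0):
    """Bang-bang control with hysteresis, as a mechanical thermostat behaves.

    The dead band keeps the furnace from rapidly cycling on and off
    around the setpoint.
    """
    if temp_c <= setpoint_c - band_c:
        return True          # too cold: switch furnace on
    if temp_c >= setpoint_c + band_c:
        return False         # warm enough: switch furnace off
    return furnace_on        # inside the band: keep the current state
```

The same on/off-with-dead-band logic governs the refrigerator and electric oven examples below.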

Another example of an instrumentation system is a home security system. Such a system consists of sensors (motion detection, switches to detect door openings), simple algorithms to detect intrusion, local control (arm/disarm) and remote monitoring of the system so that the police can be summoned. Communication is an inherent part of the design.

Kitchen appliances use sensors for control.

  • A refrigerator maintains a constant temperature by actuating the cooling system when the temperature becomes too high.
  • An automatic ice machine makes ice until a limit switch is thrown.
  • Pop-up bread toasters allow the time to be set.
  • Non-electronic gas ovens will regulate the temperature with a thermostat controlling the flow of gas to the gas burner. These may feature a sensor bulb sited within the main chamber of the oven. In addition, there may be a safety cut-off flame supervision device: after ignition, the burner's control knob must be held for a short time in order for a sensor to become hot, and permit the flow of gas to the burner. If the safety sensor becomes cold, this may indicate the flame on the burner has become extinguished, and to prevent a continuous leak of gas the flow is stopped.
  • Electric ovens use a temperature sensor and will turn on heating elements when the temperature is too low. More advanced ovens will actuate fans in response to temperature sensors, to distribute heat or to cool.
  • A common toilet refills the water tank until a float closes the valve. The float is acting as a water level sensor.

Automotive

Modern automobiles have complex instrumentation. In addition to displays of engine rotational speed and vehicle linear speed, there are also displays of battery voltage and current, fluid levels, fluid temperatures, distance traveled, and feedback of various controls (turn signals, parking brake, headlights, transmission position). Cautions may be displayed for special problems (fuel low, check engine, tire pressure low, door ajar, seat belt unfastened). Problems are recorded so they can be reported to diagnostic equipment. Navigation systems can provide voice commands to reach a destination. Automotive instrumentation must be cheap and reliable over long periods in harsh environments. There may be independent airbag systems that contain sensors, logic and actuators. Anti-skid braking systems use sensors to control the brakes, while cruise control affects throttle position. A wide variety of services can be provided via communication links on the OnStar system. Autonomous cars (with exotic instrumentation) have been shown.

Aircraft

Early aircraft had a few sensors. "Steam gauges" converted air pressures into needle deflections that could be interpreted as altitude and airspeed. A magnetic compass provided a sense of direction. The displays to the pilot were as critical as the measurements.

A modern aircraft has a far more sophisticated suite of sensors and displays, which are embedded into avionics systems. The aircraft may contain inertial navigation systems, global positioning systems, weather radar, autopilots, and aircraft stabilization systems. Redundant sensors are used for reliability. A subset of the information may be transferred to a crash recorder to aid mishap investigations. Modern pilot displays now include computer displays including head-up displays.

Air traffic control radar is a distributed instrumentation system. The ground part sends an electromagnetic pulse and receives an echo (at least). Aircraft carry transponders that transmit codes on reception of the pulse. The system displays an aircraft map location, an identifier and optionally altitude. The map location is based on sensed antenna direction and sensed time delay. The other information is embedded in the transponder transmission.
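The "sensed time delay" above maps to distance because the pulse travels to the aircraft and back. A minimal sketch of that conversion (the function name is an assumption):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def slant_range_m(round_trip_s):
    """Range from a radar echo: the pulse covers the distance twice,
    so halve the round-trip time before multiplying by c."""
    return C * round_trip_s / 2.0

# A 1 ms round trip corresponds to roughly 150 km of slant range.
```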

Laboratory instrumentation

Among the possible uses of the term is a collection of laboratory test equipment controlled by a computer through an IEEE-488 bus (also known as GPIB, the General Purpose Interface Bus, or HP-IB, the Hewlett-Packard Interface Bus). Laboratory equipment is available to measure many electrical and chemical quantities. Such a collection of equipment might be used to automate the testing of drinking water for pollutants.

Instrumentation engineering

The instrumentation part of a piping and instrumentation diagram will be developed by an instrumentation engineer.

Instrumentation engineering is the engineering specialization focused on the principle and operation of measuring instruments used in the design and configuration of automated systems, in electrical, pneumatic, and other domains, and on the control of the quantities being measured. Instrumentation engineers typically work for industries with automated processes, such as chemical or manufacturing plants, with the goal of improving system productivity, reliability, safety, optimization, and stability. To control the parameters of a process or system, they apply devices such as microprocessors, microcontrollers, and PLCs.

Instrumentation engineering is loosely defined because the required tasks are very domain dependent. An expert in the biomedical instrumentation of laboratory rats has very different concerns than the expert in rocket instrumentation. Common concerns of both are the selection of appropriate sensors based on size, weight, cost, reliability, accuracy, longevity, environmental robustness, and frequency response. Some sensors are literally fired in artillery shells. Others sense thermonuclear explosions until destroyed. Invariably sensor data must be recorded, transmitted or displayed. Recording rates and capacities vary enormously. Transmission can be trivial or can be clandestine, encrypted and low power in the presence of jamming. Displays can be trivially simple or can require consultation with human factors experts. Control system design varies from trivial to a separate specialty.

Instrumentation engineers are responsible for integrating the sensors with the recorders, transmitters, displays or control systems, and producing the Piping and instrumentation diagram for the process. They may design or specify installation, wiring and signal conditioning. They may be responsible for commissioning, calibration, testing and maintenance of the system.

In a research environment it is common for subject matter experts to have substantial instrumentation system expertise. An astronomer knows the structure of the universe and a great deal about telescopes – optics, pointing and cameras (or other sensing elements). That often includes the hard-won knowledge of the operational procedures that provide the best results. For example, an astronomer is often knowledgeable of techniques to minimize temperature gradients that cause air turbulence within the telescope.

Instrumentation technologists, technicians and mechanics specialize in troubleshooting, repairing and maintaining instruments and instrumentation systems.

Typical industrial transmitter signal types

Typical industrial transmitter signals include the 3–15 psi pneumatic signal and the 4–20 mA electronic current loop described above, alongside voltage, frequency, and digital (e.g., Ethernet) alternatives.

Impact of modern development

Ralph Müller (1940) stated, "That the history of physical science is largely the history of instruments and their intelligent use is well known. The broad generalizations and theories which have arisen from time to time have stood or fallen on the basis of accurate measurement, and in several instances new instruments have had to be devised for the purpose. There is little evidence to show that the mind of modern man is superior to that of the ancients. His tools are incomparably better."

Davis Baird has argued that the major change associated with Floris Cohen's identification of a "fourth big scientific revolution" after World War II is the development of scientific instrumentation, not only in chemistry but across the sciences. In chemistry, the introduction of new instrumentation in the 1940s was "nothing less than a scientific and technological revolution" in which classical wet-and-dry methods of structural organic chemistry were discarded, and new areas of research opened up.

As early as 1954, W. A. Wildhack discussed both the productive and destructive potential inherent in process control. The ability to make precise, verifiable and reproducible measurements of the natural world, at levels that were not previously observable, using scientific instrumentation, has "provided a different texture of the world". This instrumentation revolution fundamentally changes human abilities to monitor and respond, as is illustrated in the examples of DDT monitoring and the use of UV spectrophotometry and gas chromatography to monitor water pollutants.

Analytical chemistry

From Wikipedia, the free encyclopedia

Analytical chemistry studies and uses instruments and methods to separate, identify, and quantify matter. In practice, separation, identification or quantification may constitute the entire analysis or be combined with another method. Separation isolates analytes. Qualitative analysis identifies analytes, while quantitative analysis determines the numerical amount or concentration.

Analytical chemistry consists of classical, wet chemical methods and modern instrumental techniques. Classical qualitative methods use separations such as precipitation, extraction, and distillation. Identification may be based on differences in color, odor, melting point, boiling point, solubility, radioactivity or reactivity. Classical quantitative analysis uses changes in mass or volume to quantify amount. Instrumental methods may be used to separate samples using chromatography, electrophoresis or field flow fractionation. Qualitative and quantitative analysis can then be performed, often with the same instrument, exploiting interactions with light, heat, electric fields or magnetic fields. Often the same instrument can separate, identify and quantify an analyte.

Analytical chemistry is also focused on improvements in experimental design, chemometrics, and the creation of new measurement tools. Analytical chemistry has broad applications to medicine, science, and engineering.

History

Gustav Kirchhoff (left) and Robert Bunsen (right)

Analytical chemistry has been important since the early days of chemistry, providing methods for determining which elements and chemicals are present in the object in question. During this period, significant contributions to analytical chemistry included the development of systematic elemental analysis by Justus von Liebig and systematized organic analysis based on the specific reactions of functional groups.

The first instrumental analysis was flame emissive spectrometry developed by Robert Bunsen and Gustav Kirchhoff who discovered rubidium (Rb) and caesium (Cs) in 1860.

Most of the major developments in analytical chemistry took place after 1900. During this period, instrumental analysis became progressively dominant in the field. In particular, many of the basic spectroscopic and spectrometric techniques were discovered in the early 20th century and refined in the late 20th century.

The separation sciences follow a similar time line of development and also became increasingly transformed into high performance instruments. In the 1970s many of these techniques began to be used together as hybrid techniques to achieve a complete characterization of samples.

Starting in the 1970s, analytical chemistry became progressively more inclusive of biological questions (bioanalytical chemistry), whereas it had previously been largely focused on inorganic or small organic molecules. Lasers have been increasingly used as probes and even to initiate and influence a wide variety of reactions. The late 20th century also saw an expansion of the application of analytical chemistry from somewhat academic chemical questions to forensic, environmental, industrial and medical questions, such as in histology.

Modern analytical chemistry is dominated by instrumental analysis. Many analytical chemists focus on a single type of instrument. Academics tend to either focus on new applications and discoveries or on new methods of analysis. The discovery of a chemical present in blood that increases the risk of cancer would be a discovery that an analytical chemist might be involved in. An effort to develop a new method might involve the use of a tunable laser to increase the specificity and sensitivity of a spectrometric method. Many methods, once developed, are kept purposely static so that data can be compared over long periods of time. This is particularly true in industrial quality assurance (QA), forensic and environmental applications. Analytical chemistry plays an increasingly important role in the pharmaceutical industry where, aside from QA, it is used in the discovery of new drug candidates and in clinical applications where understanding the interactions between the drug and the patient are critical.

Classical methods

The presence of copper in this qualitative analysis is indicated by the bluish-green color of the flame

Although modern analytical chemistry is dominated by sophisticated instrumentation, the roots of analytical chemistry and some of the principles used in modern instruments are from traditional techniques, many of which are still used today. These techniques also tend to form the backbone of most undergraduate analytical chemistry educational labs.

Qualitative analysis

Qualitative analysis determines the presence or absence of a particular compound, but not the mass or concentration. By definition, qualitative analyses do not measure quantity.

Chemical tests

There are numerous qualitative chemical tests, for example, the acid test for gold and the Kastle-Meyer test for the presence of blood.

Flame test

Inorganic qualitative analysis generally refers to a systematic scheme to confirm the presence of certain aqueous ions or elements by performing a series of reactions that eliminate a range of possibilities and then confirm suspected ions with a confirming test. Sometimes small carbon-containing ions are included in such schemes. With modern instrumentation, these tests are rarely used but can be useful for educational purposes and in fieldwork or other situations where access to state-of-the-art instruments is not available or expedient.

Quantitative analysis

Quantitative analysis is the measurement of the quantities of particular chemical constituents present in a substance. Quantities can be measured by mass (gravimetric analysis) or volume (volumetric analysis).

Gravimetric analysis

Gravimetric analysis involves determining the amount of material present by weighing the sample before and/or after some transformation. A common example used in undergraduate education is the determination of the amount of water in a hydrate by heating the sample to drive off the water, such that the difference in weight is due to the water lost.
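The hydrate calculation is just a mass difference over the initial mass. A sketch with illustrative numbers (the function name and sample masses are made up for the example):

```python
def water_fraction(mass_before_g, mass_after_g):
    """Mass fraction of water in a hydrate, from weighings
    before and after heating drives off the water."""
    if mass_after_g > mass_before_g:
        raise ValueError("sample cannot gain mass on drying")
    return (mass_before_g - mass_after_g) / mass_before_g

# e.g. 2.500 g of hydrate drying to 1.590 g suggests about 36.4% water by mass
```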

Volumetric analysis

Titration is a family of techniques used to determine the concentration of an analyte. It involves the gradual addition of a measured reactant to an exact volume of the solution being analyzed until some equivalence point is reached. Titrating accurately to either the half-equivalence point or the endpoint allows the chemist to determine the number of moles used, which can then be used to determine the concentration or composition of the analyte. Most familiar to those who have taken chemistry during secondary education is the acid-base titration involving a color-changing indicator, such as phenolphthalein. There are many other types of titrations, for example potentiometric titrations and precipitation titrations. Chemists might also create titration curves, by systematically measuring the pH after each addition, in order to understand different properties of the system.
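At the equivalence point the moles of titrant delivered match the moles of analyte (scaled by the reaction stoichiometry), which gives the unknown concentration directly. A sketch, with illustrative names and a hypothetical 1:1 acid-base example:

```python
def analyte_molarity(titrant_molarity, titrant_volume_ml, analyte_volume_ml, ratio=1.0):
    """Analyte concentration from a titration at the equivalence point.

    `ratio` is moles of analyte per mole of titrant
    (1.0 for a 1:1 reaction such as HCl with NaOH).
    """
    moles_titrant = titrant_molarity * titrant_volume_ml / 1000.0
    return ratio * moles_titrant / (analyte_volume_ml / 1000.0)

# 25.0 mL of 0.100 M NaOH neutralizing 20.0 mL of HCl -> 0.125 M HCl
```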

Instrumental methods

Block diagram of an analytical instrument showing the stimulus and measurement of response

Spectroscopy

Spectroscopy measures the interaction of the molecules with electromagnetic radiation. Spectroscopy consists of many different applications such as atomic absorption spectroscopy, atomic emission spectroscopy, ultraviolet-visible spectroscopy, X-ray spectroscopy, fluorescence spectroscopy, infrared spectroscopy, Raman spectroscopy, dual polarization interferometry, nuclear magnetic resonance spectroscopy, photoemission spectroscopy, Mössbauer spectroscopy and so on.

Mass spectrometry

An accelerator mass spectrometer used for radiocarbon dating and other analysis

Mass spectrometry measures the mass-to-charge ratio of molecules using electric and magnetic fields. In a mass spectrometer, a small amount of sample is ionized and converted to gaseous ions, which are then separated and analyzed according to their mass-to-charge ratios. There are several ionization methods: electron ionization, chemical ionization, electrospray ionization, fast atom bombardment, matrix-assisted laser desorption/ionization, and others. Mass spectrometry is also categorized by the type of mass analyzer: magnetic-sector, quadrupole mass analyzer, quadrupole ion trap, time-of-flight, Fourier transform ion cyclotron resonance, and so on.

Electrochemical analysis

Electroanalytical methods measure the potential (volts) and/or current (amps) in an electrochemical cell containing the analyte. These methods can be categorized according to which aspects of the cell are controlled and which are measured. The four main categories are potentiometry (the difference in electrode potentials is measured), coulometry (the transferred charge is measured over time), amperometry (the cell's current is measured over time), and voltammetry (the cell's current is measured while actively altering the cell's potential).

In short: potentiometry measures potential, coulometry measures charge, amperometry measures current, and voltammetry measures the change in current as the cell's potential is varied.

Thermal analysis

Calorimetry and thermogravimetric analysis measure the interaction of a material and heat.

Separation

Separation of black ink on a thin-layer chromatography plate

Separation processes are used to decrease the complexity of material mixtures. Chromatography, electrophoresis and field flow fractionation are representative of this field.

Chromatographic assays

Chromatography can be used to determine the presence of substances in a sample, as different components in a mixture have different tendencies to adsorb onto the stationary phase or dissolve in the mobile phase. Thus, different components of the mixture move at different speeds. Different components of a mixture can therefore be identified by their respective Rƒ values, the ratio between the migration distance of the substance and the migration distance of the solvent front during chromatography. In combination with instrumental methods, chromatography can be used in the quantitative determination of substances. Chromatography separates the analyte from the rest of the sample so that it may be measured without interference from other compounds. Types of chromatography differ in the media they use to separate the analyte from the sample. In thin-layer chromatography, the analyte mixture moves up and separates along the coated sheet under the volatile mobile phase. In gas chromatography, a carrier gas separates the volatile analytes. A common method of chromatography using a liquid mobile phase is high-performance liquid chromatography.
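The Rƒ calculation is a simple ratio of the two measured distances; the sketch below is illustrative (the function name and the check are assumptions):

```python
def rf_value(substance_distance_cm, solvent_front_distance_cm):
    """Retention factor: how far the spot moved relative to the solvent front.

    Rf is dimensionless and always between 0 and 1, since a spot
    cannot outrun the solvent front.
    """
    if not 0 <= substance_distance_cm <= solvent_front_distance_cm:
        raise ValueError("spot distance must be between 0 and the solvent front")
    return substance_distance_cm / solvent_front_distance_cm

# A spot at 2.5 cm with the solvent front at 5.0 cm gives Rf = 0.5
```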

Hybrid techniques

Combinations of the above techniques produce a "hybrid" or "hyphenated" technique. Several examples are in popular use today and new hybrid techniques are under development. For example, gas chromatography-mass spectrometry, gas chromatography-infrared spectroscopy, liquid chromatography-mass spectrometry, liquid chromatography-NMR spectroscopy, liquid chromatography-infrared spectroscopy, and capillary electrophoresis-mass spectrometry.

Hyphenated separation techniques refer to a combination of two (or more) techniques to detect and separate chemicals from solutions. Most often the other technique is some form of chromatography. Hyphenated techniques are widely used in chemistry and biochemistry. A slash is sometimes used instead of hyphen, especially if the name of one of the methods contains a hyphen itself.

Microscopy

Fluorescence microscope image of two mouse cell nuclei in prophase (scale bar is 5 μm)

The visualization of single molecules, single cells, biological tissues, and nanomaterials is an important and attractive approach in analytical science. Hybridization with other traditional analytical tools is also revolutionizing analytical science. Microscopy can be categorized into three different fields: optical microscopy, electron microscopy, and scanning probe microscopy. Recently, this field has been progressing rapidly because of the rapid development of the computer and camera industries.

Lab-on-a-chip

Lab-on-a-chip devices integrate multiple laboratory functions on a single chip of only millimeters to a few square centimeters in size, and are capable of handling extremely small fluid volumes, down to less than a picoliter.

Errors

Error can be defined as the numerical difference between the observed value and the true value. Experimental error can be divided into two types, systematic error and random error. Systematic error results from a flaw in equipment or in the design of an experiment, while random error results from uncontrolled or uncontrollable variables in the experiment.

In chemical analysis the true value and observed value are related by the equation

  ε = O − T

where

  • ε is the absolute error.
  • T is the true value.
  • O is the observed value.

The error of a measurement is an inverse measure of its accuracy: the smaller the error, the greater the accuracy of the measurement.

Errors can also be expressed relatively. The relative error is the absolute error divided by the true value:

  ε_rel = ε / T

The percent error is the relative error expressed as a percentage:

  %ε = ε_rel × 100

If these values are used in a function, the error of the function's result may also be needed. Let y = f(x₁, …, xₙ) be a function of n measured variables. The propagation of uncertainty must then be calculated in order to know the error in y:

  Δy = √( (∂y/∂x₁)²(Δx₁)² + … + (∂y/∂xₙ)²(Δxₙ)² )
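As a rough illustration of error and the propagation of uncertainty, the partial derivatives can be estimated numerically. The function names and the finite-difference step below are assumptions for the sketch, not a standard recipe:

```python
import math

def absolute_error(observed, true):
    """Absolute error: observed minus true."""
    return observed - true

def relative_error(observed, true):
    """Relative error: absolute error divided by the true value."""
    return absolute_error(observed, true) / true

def propagated_uncertainty(f, values, uncertainties, h=1e-6):
    """First-order propagation of uncertainty:
    sqrt(sum((df/dx_i * dx_i)^2)), with each partial derivative
    estimated by a central finite difference of step h."""
    total = 0.0
    for i, dx in enumerate(uncertainties):
        up = list(values); up[i] += h
        dn = list(values); dn[i] -= h
        dfdx = (f(*up) - f(*dn)) / (2 * h)
        total += (dfdx * dx) ** 2
    return math.sqrt(total)

# Area of a rectangle with a = 2.0 +/- 0.1 and b = 3.0 +/- 0.2:
# the propagated uncertainty is about sqrt((3*0.1)**2 + (2*0.2)**2) = 0.5
```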

Standards

Standard curve

A calibration curve plot showing limit of detection (LOD), limit of quantification (LOQ), dynamic range, and limit of linearity (LOL)

A general method for analysis of concentration involves the creation of a calibration curve. This allows for the determination of the amount of a chemical in a material by comparing the results of an unknown sample to those of a series of known standards. If the concentration of element or compound in a sample is too high for the detection range of the technique, it can simply be diluted in a pure solvent. If the amount in the sample is below an instrument's range of measurement, the method of addition can be used. In this method, a known quantity of the element or compound under study is added, and the difference between the concentration added and the concentration observed is the amount actually in the sample.
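A calibration curve is usually a least-squares line through the standards, inverted to read the unknown's concentration off its signal. A minimal sketch (the function names and the example standards are invented for illustration):

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def concentration(signal, slope, intercept):
    """Invert the calibration line to read an unknown off the curve."""
    return (signal - intercept) / slope

# e.g. standards at 0, 1, 2, 4 ppm with their measured signals give the line;
# an unknown's signal is then converted back to a concentration.
```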

Internal standards

Sometimes an internal standard is added at a known concentration directly to an analytical sample to aid in quantitation. The amount of analyte present is then determined relative to the internal standard as a calibrant. An ideal internal standard is an isotopically enriched analyte which gives rise to the method of isotope dilution.

Standard addition

The method of standard addition is used in instrumental analysis to determine the concentration of a substance (analyte) in an unknown sample by comparison to a set of samples of known concentration, similar to using a calibration curve. Standard addition can be applied to most analytical techniques and is used instead of a calibration curve to solve the matrix effect problem.
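In practice, standard addition fits the measured signal against the amount of standard added and extrapolates to zero signal; the magnitude of the x-intercept is the analyte concentration in the original sample. A sketch with made-up spike data:

```python
# Standard-addition sketch: spikes of known concentration added to aliquots
# of the same sample (all numbers illustrative).
added = [0.0, 1.0, 2.0, 3.0]       # standard concentration added
signal = [2.0, 3.0, 4.0, 5.0]      # measured responses

# Least-squares line: signal = slope * added + intercept.
n = len(added)
mx = sum(added) / n
my = sum(signal) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(added, signal))
         / sum((x - mx) ** 2 for x in added))
intercept = my - slope * mx

# The sample concentration is the magnitude of the x-intercept.
sample_conc = intercept / slope
```

Because the standards are measured in the same matrix as the analyte, the matrix affects sample and standards identically, which is why this sidesteps the matrix effect.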

Signals and noise

One of the most important components of analytical chemistry is maximizing the desired signal while minimizing the associated noise. The analytical figure of merit is known as the signal-to-noise ratio (S/N or SNR).

Noise can arise from environmental factors as well as from fundamental physical processes.
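One common way to estimate S/N from repeated readings is to take the mean as the signal and the standard deviation as the noise. A sketch with illustrative readings:

```python
import statistics

# S/N estimated from replicate readings (illustrative values).
readings = [10.1, 9.9, 10.2, 9.8, 10.0, 10.1, 9.9]

signal = statistics.mean(readings)    # mean signal level
noise = statistics.stdev(readings)    # noise as the sample standard deviation
snr = signal / noise
```
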

Thermal noise

Thermal noise results from the thermal motion of charge carriers (usually electrons) in an electrical circuit. Thermal noise is white noise, meaning that its power spectral density is constant throughout the frequency spectrum.

The root mean square value of the thermal noise voltage in a resistor is given by

  v_rms = √(4 kB T R Δf)

where kB is the Boltzmann constant, T is the temperature, R is the resistance, and Δf is the bandwidth over which the noise is measured.
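As a quick numerical check of the thermal-noise relation, a sketch with illustrative component values:

```python
import math

# Johnson-Nyquist (thermal) noise: v_rms = sqrt(4 * kB * T * R * df).
kB = 1.380649e-23    # Boltzmann constant, J/K
T = 298.0            # temperature, K (room temperature)
R = 1e6              # resistance, ohms (1 megaohm, illustrative)
df = 1e4             # measurement bandwidth, Hz (illustrative)

v_rms = math.sqrt(4 * kB * T * R * df)
print(f"{v_rms * 1e6:.1f} uV rms")   # → 12.8 uV rms
```

Even a passive megaohm resistor contributes microvolts of noise over a 10 kHz bandwidth, which is why high-impedance front ends often limit bandwidth or cool the input stage.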

Shot noise

Shot noise is a type of electronic noise that occurs when the finite number of particles (such as electrons in an electronic circuit or photons in an optical device) is small enough to give rise to statistical fluctuations in a signal.

Shot noise is a Poisson process, and the charge carriers that make up the current follow a Poisson distribution. The root mean square current fluctuation is given by

  i_rms = √(2 e I Δf)

where e is the elementary charge, I is the average current, and Δf is the measurement bandwidth. Shot noise is white noise.
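Plugging illustrative numbers into the shot-noise relation shows how small currents become dominated by it:

```python
import math

# Shot noise: i_rms = sqrt(2 * e * I * df).
e = 1.602176634e-19   # elementary charge, C
I = 1e-9              # average current, A (1 nA, illustrative)
df = 1e3              # measurement bandwidth, Hz (illustrative)

i_rms = math.sqrt(2 * e * I * df)   # ≈ 5.7e-13 A, about 0.06% of I
```
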

Flicker noise

Flicker noise is electronic noise with a 1/ƒ frequency spectrum: as f increases, the noise decreases. Flicker noise arises from a variety of sources, such as impurities in a conductive channel and generation-recombination noise in a transistor due to base current. This noise can be avoided by modulating the signal at a higher frequency, for example through the use of a lock-in amplifier.
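The modulation trick can be illustrated with a toy lock-in detection loop: a weak signal is modulated at a reference frequency (away from the 1/f region), multiplied by a reference sine, and averaged. All values below (frequency, amplitude, noise level) are illustrative:

```python
import math
import random

random.seed(0)
f_ref = 100.0    # modulation/reference frequency, Hz (assumed)
fs = 10000.0     # sampling rate, Hz
n = 20000        # number of samples (2 s of data)
amp = 0.01       # true signal amplitude, buried in much larger noise

acc = 0.0
for i in range(n):
    t = i / fs
    ref = math.sin(2 * math.pi * f_ref * t)
    sample = amp * ref + random.gauss(0, 0.1)   # noise 10x the signal
    acc += sample * ref                          # demodulate with the reference

recovered = 2 * acc / n   # averaging acts as the low-pass filter
```

Despite the noise being ten times larger than the signal, the recovered amplitude lands close to 0.01, because only noise components at the reference frequency survive the averaging.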

Environmental noise

Noise in a thermogravimetric analysis; lower noise in the middle of the plot results from less human activity (and environmental noise) at night

Environmental noise arises from the surroundings of the analytical instrument. Sources of electromagnetic noise are power lines, radio and television stations, wireless devices, compact fluorescent lamps and electric motors. Many of these noise sources are narrow bandwidth and, therefore, can be avoided. Temperature and vibration isolation may be required for some instruments.

Noise reduction

Noise reduction can be accomplished either in computer hardware or software. Examples of hardware noise reduction are the use of shielded cable, analog filtering, and signal modulation. Examples of software noise reduction are digital filtering, ensemble average, boxcar average, and correlation methods.
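Ensemble averaging, one of the software methods listed above, reduces random noise by roughly √n when n repeated scans are averaged. A sketch with synthetic data and an assumed Gaussian noise model:

```python
import random
import statistics

# Ensemble averaging: average n_scans noisy repeats of the same measurement.
random.seed(1)
true_signal = 5.0
n_scans, n_points = 100, 50

avg = [0.0] * n_points
for _ in range(n_scans):
    for i in range(n_points):
        # Each point: true signal plus unit-variance Gaussian noise.
        avg[i] += (true_signal + random.gauss(0, 1.0)) / n_scans

mean_level = statistics.mean(avg)
residual = statistics.pstdev(avg)   # noise remaining after averaging
# With noise std 1.0 and 100 scans, residual ≈ 1.0 / sqrt(100) = 0.1
```
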

Applications

A US Food and Drug Administration scientist uses a portable near-infrared spectroscopy device to inspect lactose for adulteration with melamine

Analytical chemistry has applications in forensic science, bioanalysis, clinical analysis, environmental analysis, and materials analysis. Analytical chemistry research is largely driven by performance (sensitivity, detection limit, selectivity, robustness, dynamic range, linear range, accuracy, precision, and speed) and cost (purchase, operation, training, time, and space). Among the main branches of contemporary analytical atomic spectrometry, the most widespread and universal are optical and mass spectrometry. In the direct elemental analysis of solid samples, the new leaders are laser-induced breakdown and laser ablation mass spectrometry, and the related techniques with transfer of the laser ablation products into inductively coupled plasma. Advances in the design of diode lasers and optical parametric oscillators promote developments in fluorescence and ionization spectrometry, and also in absorption techniques, where the use of optical cavities for increased effective absorption pathlength is expected to expand. The use of plasma- and laser-based methods is increasing. Interest in absolute (standardless) analysis has revived, particularly in emission spectrometry.

Great effort is being put into shrinking analysis techniques to chip size in micro total analysis systems (μTAS), also called lab-on-a-chip. Although there are few examples of such systems that are competitive with traditional analysis techniques, potential advantages include size/portability, speed, and cost. Microscale chemistry also reduces the amounts of chemicals used.

Many developments improve the analysis of biological systems. Examples of rapidly expanding fields in this area are genomics, DNA sequencing and related research in genetic fingerprinting and DNA microarrays; proteomics, the analysis of protein concentrations and modifications, especially in response to various stressors, at various developmental stages, or in various parts of the body; metabolomics, which deals with metabolites; transcriptomics, which deals with mRNA and associated fields; lipidomics, which deals with lipids; peptidomics, which deals with peptides; and metallomics, which deals with metal concentrations and especially with their binding to proteins and other molecules.[citation needed]

Analytical chemistry has played a critical role in areas ranging from basic science to a variety of practical applications, such as biomedical applications, environmental monitoring, quality control in industrial manufacturing, and forensic science.

The recent developments in computer automation and information technologies have extended analytical chemistry into a number of new biological fields. For example, automated DNA sequencing machines were the basis for completing human genome projects leading to the birth of genomics. Protein identification and peptide sequencing by mass spectrometry opened a new field of proteomics. In addition to automating specific processes, there is effort to automate larger sections of lab testing, such as in companies like Emerald Cloud Lab and Transcriptic.

Analytical chemistry has been an indispensable area in the development of nanotechnology. Surface characterization instruments, electron microscopes and scanning probe microscopes enable scientists to visualize atomic structures with chemical characterizations.
