
Saturday, June 25, 2022

Spectroscopy

From Wikipedia, the free encyclopedia
 
An example of spectroscopy: a prism analyses white light by dispersing it into its component colors.

Spectroscopy is the general field of study that measures and interprets the electromagnetic spectra that result from the interaction between electromagnetic radiation and matter as a function of the wavelength or frequency of the radiation. Matter waves and acoustic waves can also be considered forms of radiative energy, and recently gravitational waves have been associated with a spectral signature in the context of the Laser Interferometer Gravitational-Wave Observatory (LIGO).

In simpler terms, spectroscopy is the precise study of color as generalized from visible light to all bands of the electromagnetic spectrum. Historically, spectroscopy originated as the study of the wavelength dependence of the absorption by gas phase matter of visible light dispersed by a prism.

Spectroscopy, primarily in the electromagnetic spectrum, is a fundamental exploratory tool in the fields of astronomy, chemistry, materials science, and physics, allowing the composition, physical structure and electronic structure of matter to be investigated at the atomic, molecular and macro scale, and over astronomical distances. Important applications include biomedical spectroscopy in the areas of tissue analysis and medical imaging.

Introduction

Spectroscopy is a branch of science concerned with the spectra of electromagnetic radiation as a function of its wavelength or frequency measured by spectrographic equipment, and other techniques, in order to obtain information concerning the structure and properties of matter. Spectral measurement devices are referred to as spectrometers, spectrophotometers, spectrographs or spectral analyzers. Most spectroscopic analysis in the laboratory starts with a sample to be analyzed, then a light source is chosen from any desired range of the light spectrum, then the light goes through the sample to a dispersion array (diffraction grating instrument) and is captured by a photodiode. For astronomical purposes, the telescope must be equipped with the light dispersion device. There are various versions of this basic setup that may be employed.

Spectroscopy as a science began with Isaac Newton splitting light with a prism; this early work was called optics. It was therefore originally the study of visible light, which we call color, and under the later work of James Clerk Maxwell it came to include the entire electromagnetic spectrum. Although color is involved in spectroscopy, it is not equated with the color of elements or objects, which involves the absorption and reflection of certain electromagnetic waves that give objects their apparent color to our eyes. Rather, spectroscopy involves splitting light with a prism, diffraction grating, or similar instrument to reveal a particular discrete line pattern, a "spectrum", unique to each type of element. Most elements are first put into a gaseous phase to allow the spectra to be examined, although today other methods can be used on different phases. Each element dispersed by a prism-like instrument displays either an absorption spectrum or an emission spectrum, depending upon whether the element is being cooled or heated.

Until recently all spectroscopy involved the study of line spectra, and most spectroscopy still does. Vibrational spectroscopy is the branch of spectroscopy that studies the spectra of molecular vibrations. However, the latest developments in spectroscopy can sometimes dispense with the dispersion technique. In biochemical spectroscopy, information can be gathered about biological tissue by absorption and light-scattering techniques. Light scattering spectroscopy is a type of reflectance spectroscopy that determines tissue structures by examining elastic scattering. In such a case, it is the tissue that acts as the diffraction or dispersion mechanism.

Spectroscopic studies were central to the development of quantum mechanics, because the first useful atomic models described the spectra of hydrogen; these models include the Bohr model, the Schrödinger equation, and matrix mechanics, all of which can reproduce the spectral lines of hydrogen and therefore provided the basis for discrete quantum jumps that match the discrete hydrogen spectrum. Max Planck's explanation of blackbody radiation also involved spectroscopy, because he compared the wavelength of light, measured with a photometer, to the temperature of a black body. Spectroscopy is used in physical and analytical chemistry because atoms and molecules have unique spectra; these spectra can therefore be used to detect, identify, and quantify atoms and molecules. Spectroscopy is also used in astronomy and in remote sensing on Earth. Most research telescopes have spectrographs. The measured spectra are used to determine the chemical composition and physical properties of astronomical objects (such as temperature, the density of elements in a star, velocity, and black holes). An important use of spectroscopy is in biochemistry, where molecular samples may be analyzed for species identification and energy content.

Theory

The central premise of spectroscopy is that light is made of different wavelengths, each corresponding to a different frequency. The importance of spectroscopy rests on the fact that every element in the periodic table has a unique spectrum, described by the frequencies of light it emits or absorbs, which consistently appear in the same part of the electromagnetic spectrum when that light is diffracted. This opened up an entire field of study of anything that contains atoms, which is all matter; spectroscopy is the key to understanding the atomic properties of all matter, and it opened up many new sub-fields of science. The idea that each atomic element has a unique spectral signature enabled spectroscopy to be used in a broad number of fields, each with a specific goal achieved by different spectroscopic procedures. These spectral lines are so important in so many branches of science that the U.S. National Institute of Standards and Technology (NIST) maintains a public Atomic Spectra Database that is continually updated with more precise measurements.

The broadening of the field of spectroscopy is due to the fact that any part of the electromagnetic spectrum, from the infrared to the ultraviolet, may be used to analyze a sample, with each region revealing different properties of the very same sample. For instance, in chemical analysis the most common types of spectroscopy include atomic spectroscopy, infrared spectroscopy, ultraviolet and visible spectroscopy, Raman spectroscopy, and nuclear magnetic resonance. Nuclear magnetic resonance relies on the concept of resonance and its corresponding resonant frequency. Resonances were first characterized in mechanical systems such as pendulums, whose characteristic frequency of motion was famously noted by Galileo.

Classification of methods

A huge diffraction grating at the heart of the ultra-precise ESPRESSO spectrograph.

Spectroscopy is a sufficiently broad field that many sub-disciplines exist, each with numerous implementations of specific spectroscopic techniques. The various implementations and techniques can be classified in several ways.

Type of radiative energy

The types of spectroscopy are distinguished by the type of radiative energy involved in the interaction. In many applications, the spectrum is determined by measuring changes in the intensity or frequency of this energy. The types of radiative energy studied include electromagnetic radiation, particle beams such as electrons and neutrons, and acoustic and mechanical waves.

Nature of the interaction

The types of spectroscopy also can be distinguished by the nature of the interaction between the energy and the material. These interactions include:

  • Absorption spectroscopy: Absorption occurs when energy from the radiative source is absorbed by the material. Absorption is often determined by measuring the fraction of energy transmitted through the material, with absorption decreasing the transmitted portion.
  • Emission spectroscopy: Emission indicates that radiative energy is released by the material. A material's blackbody spectrum is a spontaneous emission spectrum determined by its temperature. This feature can be measured in the infrared by instruments such as the atmospheric emitted radiance interferometer. Emission can also be induced by other sources of energy such as flames, sparks, electric arcs or electromagnetic radiation in the case of fluorescence.
  • Elastic scattering and reflection spectroscopy determine how incident radiation is reflected or scattered by a material. Crystallography employs the scattering of high energy radiation, such as x-rays and electrons, to examine the arrangement of atoms in proteins and solid crystals.
  • Impedance spectroscopy: Impedance is the ability of a medium to impede or slow the transmittance of energy. For optical applications, this is characterized by the index of refraction.
  • Inelastic scattering phenomena involve an exchange of energy between the radiation and the matter that shifts the wavelength of the scattered radiation. These include Raman and Compton scattering.
  • Coherent or resonance spectroscopy: techniques in which the radiative energy couples two quantum states of the material in a coherent interaction that is sustained by the radiating field. The coherence can be disrupted by other interactions, such as particle collisions and energy transfer, so these techniques often require high-intensity radiation to be sustained. Nuclear magnetic resonance (NMR) spectroscopy is a widely used resonance method, and ultrafast laser spectroscopy is also possible in the infrared and visible spectral regions.
  • Nuclear spectroscopy: methods that use the properties of specific nuclei to probe the local structure of matter, mainly condensed matter, molecules in liquids or frozen liquids, and bio-molecules.

Type of material

Spectroscopic studies are designed so that the radiant energy interacts with specific types of matter.

Atoms

Atomic spectroscopy was the first application of spectroscopy developed. Atomic absorption spectroscopy and atomic emission spectroscopy involve visible and ultraviolet light. These absorptions and emissions, often referred to as atomic spectral lines, are due to electronic transitions of outer shell electrons as they rise and fall from one electron orbit to another. Atoms also have distinct x-ray spectra that are attributable to the excitation of inner shell electrons to excited states.

Atoms of different elements have distinct spectra and therefore atomic spectroscopy allows for the identification and quantitation of a sample's elemental composition. After inventing the spectroscope, Robert Bunsen and Gustav Kirchhoff discovered new elements by observing their emission spectra. Atomic absorption lines are observed in the solar spectrum and referred to as Fraunhofer lines after their discoverer. A comprehensive explanation of the hydrogen spectrum was an early success of quantum mechanics and explained the Lamb shift observed in the hydrogen spectrum, which further led to the development of quantum electrodynamics.

Modern implementations of atomic spectroscopy for studying visible and ultraviolet transitions include flame emission spectroscopy, inductively coupled plasma atomic emission spectroscopy, glow discharge spectroscopy, microwave induced plasma spectroscopy, and spark or arc emission spectroscopy. Techniques for studying x-ray spectra include X-ray spectroscopy and X-ray fluorescence.

Molecules

The combination of atoms into molecules leads to the creation of unique types of energetic states and therefore unique spectra of the transitions between these states. Molecular spectra can be obtained due to electron spin states (electron paramagnetic resonance), molecular rotations, molecular vibration, and electronic states. Rotations are collective motions of the atomic nuclei and typically lead to spectra in the microwave and millimeter-wave spectral regions. Rotational spectroscopy and microwave spectroscopy are synonymous. Vibrations are relative motions of the atomic nuclei and are studied by both infrared and Raman spectroscopy. Electronic excitations are studied using visible and ultraviolet spectroscopy as well as fluorescence spectroscopy.

Studies in molecular spectroscopy led to the development of the first maser and contributed to the subsequent development of the laser.

Crystals and extended materials

The combination of atoms or molecules into crystals or other extended forms leads to the creation of additional energetic states. These states are numerous and therefore have a high density of states. This high density often makes the spectra weaker and less distinct, i.e., broader. For instance, blackbody radiation is due to the thermal motions of atoms and molecules within a material. Acoustic and mechanical responses are due to collective motions as well. Pure crystals, though, can have distinct spectral transitions, and the crystal arrangement also has an effect on the observed molecular spectra. The regular lattice structure of crystals also scatters x-rays, electrons or neutrons allowing for crystallographic studies.

Nuclei

Nuclei also have distinct energy states that are widely separated and lead to gamma ray spectra. Distinct nuclear spin states can have their energy separated by a magnetic field, and this allows for nuclear magnetic resonance spectroscopy.

Other types

Other types of spectroscopy are distinguished by specific applications or implementations.

Applications

UVES is a high-resolution spectrograph on the Very Large Telescope.

There are several applications of spectroscopy in the fields of medicine, physics, chemistry, and astronomy. Taking advantage of the properties of absorbance and, in astronomy, emission, spectroscopy can be used to identify certain states of nature. The use of spectroscopy in so many different fields and for so many different applications has given rise to specialized scientific subfields. Examples include:

  • Determining the atomic structure of a sample (one of the first uses)
  • Studying spectral emission lines of the Sun and distant galaxies (an early and major application in astronomy)
  • Space exploration
  • Cure monitoring of composites using optical fibers.
  • Estimate weathered wood exposure times using near infrared spectroscopy.
  • Measurement of different compounds in food samples by absorption spectroscopy both in visible and infrared spectrum.
  • Measurement of toxic compounds in blood samples
  • Non-destructive elemental analysis by X-ray fluorescence.
  • Electronic structure research with various spectroscopes.
  • Redshift to determine the speed of a distant object
  • Determining the metabolic structure of a muscle
  • Monitoring dissolved oxygen content in freshwater and marine ecosystems
  • Altering the structure of drugs to improve effectiveness
  • Characterization of proteins
  • Respiratory gas analysis in hospitals
  • Finding the physical properties of a distant star or nearby exoplanet using the Relativistic Doppler effect.
  • In-ovo sexing: spectroscopy makes it possible to determine the sex of a chick embryo inside the egg during incubation. The technique was developed by French and German companies; both countries decided to ban chick culling, mostly done through a macerator, in 2022.

History

The history of spectroscopy began with Isaac Newton's optics experiments (1666–1672). According to Andrew Fraknoi and David Morrison, "In 1672, in the first paper that he submitted to the Royal Society, Isaac Newton described an experiment in which he permitted sunlight to pass through a small hole and then through a prism. Newton found that sunlight, which looks white to us, is actually made up of a mixture of all the colors of the rainbow." Newton applied the word "spectrum" to describe the rainbow of colors that combine to form white light and that are revealed when the white light is passed through a prism.

Fraknoi and Morrison state that "In 1802, William Hyde Wollaston built an improved spectrometer that included a lens to focus the Sun's spectrum on a screen. Upon use, Wollaston realized that the colors were not spread uniformly, but instead had missing patches of colors, which appeared as dark bands in the spectrum." During the early 1800s, Joseph von Fraunhofer made experimental advances with dispersive spectrometers that enabled spectroscopy to become a more precise and quantitative scientific technique. Since then, spectroscopy has played and continues to play a significant role in chemistry, physics, and astronomy. Per Fraknoi and Morrison, "Later, in 1815, German physicist Joseph Fraunhofer also examined the solar spectrum, and found about 600 such dark lines (missing colors), which are now known as Fraunhofer lines, or absorption lines."

In quantum mechanical systems, the analogous resonance is a coupling of two quantum mechanical stationary states of one system, such as an atom, via an oscillatory source of energy such as a photon. The coupling of the two states is strongest when the energy of the source matches the energy difference between the two states. The energy E of a photon is related to its frequency ν by E = hν, where h is Planck's constant, and so a spectrum of the system response vs. photon frequency will peak at the resonant frequency or energy. Particles such as electrons and neutrons have a comparable relationship, the de Broglie relations, between their kinetic energy and their wavelength and frequency and therefore can also excite resonant interactions.

Spectra of atoms and molecules often consist of a series of spectral lines, each one representing a resonance between two different quantum states. The explanation of these series, and the spectral patterns associated with them, were one of the experimental enigmas that drove the development and acceptance of quantum mechanics. The hydrogen spectral series in particular was first successfully explained by the Rutherford–Bohr quantum model of the hydrogen atom. In some cases spectral lines are well separated and distinguishable, but spectral lines can also overlap and appear to be a single transition if the density of energy states is high enough. Named series of lines include the principal, sharp, diffuse and fundamental series.
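
As a worked illustration, the Rydberg formula, 1/λ = R(1/n₁² − 1/n₂²), reproduces the visible Balmer lines of hydrogen when n₁ = 2. A minimal Python sketch of that calculation (the value of R and the range of upper levels are the only inputs) might look like this:

    # Sketch: wavelengths of the hydrogen Balmer series from the Rydberg formula.
    # 1/lambda = R * (1/n1**2 - 1/n2**2), with n1 = 2 for the visible (Balmer) series.

    R = 1.097373e7  # Rydberg constant for hydrogen, in 1/m

    def balmer_wavelength_nm(n2, n1=2):
        """Wavelength in nm of the transition n2 -> n1 predicted by the Rydberg formula."""
        inv_wavelength = R * (1.0 / n1**2 - 1.0 / n2**2)   # in 1/m
        return 1e9 / inv_wavelength

    for n2 in range(3, 8):
        print(f"n = {n2} -> 2 : {balmer_wavelength_nm(n2):.1f} nm")
    # Prints roughly 656, 486, 434, 410 and 397 nm, the familiar Balmer lines.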

Curve fitting

From Wikipedia, the free encyclopedia

Fitting of a noisy curve by an asymmetrical peak model, with an iterative process (Gauss–Newton algorithm with variable damping factor α).
Top: raw data and model.
Bottom: evolution of the normalised sum of the squares of the errors.
 

Curve fitting is the process of constructing a curve, or mathematical function, that has the best fit to a series of data points, possibly subject to constraints. Curve fitting can involve either interpolation, where an exact fit to the data is required, or smoothing, in which a "smooth" function is constructed that approximately fits the data. A related topic is regression analysis, which focuses more on questions of statistical inference such as how much uncertainty is present in a curve that is fit to data observed with random errors. Fitted curves can be used as an aid for data visualization, to infer values of a function where no data are available, and to summarize the relationships among two or more variables. Extrapolation refers to the use of a fitted curve beyond the range of the observed data, and is subject to a degree of uncertainty since it may reflect the method used to construct the curve as much as it reflects the observed data.

Fitting functions to data points

Most commonly, one fits a function of the form y=f(x).

Fitting lines and polynomial functions to data points

Polynomial curves fitting points generated with a sine function. The black dotted line is the "true" data, the red line is a first degree polynomial, the green line is second degree, the orange line is third degree and the blue line is fourth degree.

The first degree polynomial equation y = ax + b is a line with slope a. A line will connect any two points, so a first degree polynomial equation is an exact fit through any two points with distinct x coordinates.

If the order of the equation is increased to a second degree polynomial, y = ax² + bx + c, the result will exactly fit a simple curve to three points.

If the order of the equation is increased to a third degree polynomial, y = ax³ + bx² + cx + d, the result will exactly fit four points.

A more general statement would be to say it will exactly fit four constraints. Each constraint can be a point, angle, or curvature (which is the reciprocal of the radius of an osculating circle). Angle and curvature constraints are most often added to the ends of a curve, and in such cases are called end conditions. Identical end conditions are frequently used to ensure a smooth transition between polynomial curves contained within a single spline. Higher-order constraints, such as "the change in the rate of curvature", could also be added. This, for example, would be useful in highway cloverleaf design to understand the rate of change of the forces applied to a car (see jerk), as it follows the cloverleaf, and to set reasonable speed limits, accordingly.

The first degree polynomial equation could also be an exact fit for a single point and an angle while the third degree polynomial equation could also be an exact fit for two points, an angle constraint, and a curvature constraint. Many other combinations of constraints are possible for these and for higher order polynomial equations.

If there are more than n + 1 constraints (n being the degree of the polynomial), the polynomial curve can still be run through those constraints. An exact fit to all constraints is not certain (but might happen, for example, in the case of a first degree polynomial exactly fitting three collinear points). In general, however, some method is then needed to evaluate each approximation. The least squares method is one way to compare the deviations.
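
As a brief illustration of a least-squares fit, the sketch below uses NumPy's polyfit to fit polynomials of increasing degree to noisy points sampled from a sine curve, in the spirit of the figure described above; the sample size and noise level are arbitrary choices made only for the example:

    import numpy as np

    # Sketch: least-squares polynomial fits of increasing degree to noisy data.
    rng = np.random.default_rng(0)
    x = np.linspace(0, 2 * np.pi, 25)
    y = np.sin(x) + rng.normal(scale=0.1, size=x.size)   # noisy samples of a sine curve

    for degree in (1, 2, 3, 4):
        coeffs = np.polyfit(x, y, degree)                # least-squares fit of this degree
        residuals = y - np.polyval(coeffs, x)
        print(f"degree {degree}: sum of squared errors = {np.sum(residuals ** 2):.3f}")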

There are several reasons for preferring an approximate fit when it is possible to simply increase the degree of the polynomial equation and get an exact match:

  • Even if an exact match exists, it does not necessarily follow that it can be readily discovered. Depending on the algorithm used there may be a divergent case, where the exact fit cannot be calculated, or it might take too much computer time to find the solution. This situation might require an approximate solution.
  • The effect of averaging out questionable data points in a sample, rather than distorting the curve to fit them exactly, may be desirable.
  • Runge's phenomenon: high order polynomials can be highly oscillatory. If a curve runs through two points A and B, it would be expected that the curve would run somewhat near the midpoint of A and B, as well. This may not happen with high-order polynomial curves; they may even have values that are very large in positive or negative magnitude. With low-order polynomials, the curve is more likely to fall near the midpoint (it's even guaranteed to exactly run through the midpoint on a first degree polynomial).
  • Low-order polynomials tend to be smooth and high order polynomial curves tend to be "lumpy". To define this more precisely, the maximum number of inflection points possible in a polynomial curve is n-2, where n is the order of the polynomial equation. An inflection point is a location on the curve where it switches from a positive radius to negative. We can also say this is where it transitions from "holding water" to "shedding water". Note that it is only "possible" that high order polynomials will be lumpy; they could also be smooth, but there is no guarantee of this, unlike with low order polynomial curves. A fifteenth degree polynomial could have, at most, thirteen inflection points, but could also have eleven, or nine or any odd number down to one. (Polynomials with even numbered degree could have any even number of inflection points from n - 2 down to zero.)

The degree of the polynomial curve being higher than needed for an exact fit is undesirable for all the reasons listed previously for high order polynomials, but also leads to a case where there are an infinite number of solutions. For example, a first degree polynomial (a line) constrained by only a single point, instead of the usual two, would give an infinite number of solutions. This brings up the problem of how to compare and choose just one solution, which can be a problem for software and for humans, as well. For this reason, it is usually best to choose as low a degree as possible for an exact match on all constraints, and perhaps an even lower degree, if an approximate fit is acceptable.

Relation between wheat yield and soil salinity

Fitting other functions to data points

Other types of curves, such as trigonometric functions (such as sine and cosine), may also be used, in certain cases.

In spectroscopy, data may be fitted with Gaussian, Lorentzian, Voigt and related functions.
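
For example, a single spectral peak can be fitted with a Gaussian by non-linear least squares. The sketch below assumes SciPy's curve_fit and entirely synthetic data; the peak parameters and noise level are invented for illustration:

    import numpy as np
    from scipy.optimize import curve_fit

    # Sketch: fitting a Gaussian peak to synthetic, noisy spectral data.
    def gaussian(x, amplitude, center, width):
        return amplitude * np.exp(-((x - center) ** 2) / (2 * width ** 2))

    x = np.linspace(0, 10, 200)
    rng = np.random.default_rng(1)
    y = gaussian(x, 1.0, 5.0, 0.8) + rng.normal(scale=0.02, size=x.size)

    # Initial guesses for amplitude, center and width; curve_fit refines them iteratively.
    popt, pcov = curve_fit(gaussian, x, y, p0=[0.5, 4.0, 1.0])
    print("fitted amplitude, center, width:", popt)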

In biology, ecology, demography, epidemiology, and many other disciplines, the growth of a population, the spread of infectious disease, etc. can be fitted using the logistic function.

In agriculture the inverted logistic sigmoid function (S-curve) is used to describe the relation between crop yield and growth factors. The blue figure was made by a sigmoid regression of data measured in farm lands. It can be seen that initially, i.e. at low soil salinity, the crop yield declines slowly with increasing soil salinity, while thereafter the decline progresses faster.

Algebraic fit versus geometric fit for curves

For algebraic analysis of data, "fitting" usually means trying to find the curve that minimizes the vertical (y-axis) displacement of a point from the curve (e.g., ordinary least squares). However, for graphical and image applications geometric fitting seeks to provide the best visual fit; which usually means trying to minimize the orthogonal distance to the curve (e.g., total least squares), or to otherwise include both axes of displacement of a point from the curve. Geometric fits are not popular because they usually require non-linear and/or iterative calculations, although they have the advantage of a more aesthetic and geometrically accurate result.

Fitting plane curves to data points

If a function of the form y = f(x) cannot be postulated, one can still try to fit a plane curve.

Other types of curves, such as conic sections (circular, elliptical, parabolic, and hyperbolic arcs) or trigonometric functions (such as sine and cosine), may also be used, in certain cases. For example, trajectories of objects under the influence of gravity follow a parabolic path, when air resistance is ignored. Hence, matching trajectory data points to a parabolic curve would make sense. Tides follow sinusoidal patterns, hence tidal data points should be matched to a sine wave, or the sum of two sine waves of different periods, if the effects of the Moon and Sun are both considered.

For a parametric curve, it is effective to fit each of its coordinates as a separate function of arc length; assuming that data points can be ordered, the chord distance may be used.

Fitting a circle by geometric fit

Circle fitting with the Coope method, the points describing a circle arc, centre (1 ; 1), radius 4.
 
different models of ellipse fitting
 
Ellipse fitting minimising the algebraic distance (Fitzgibbon method).

Coope approaches the problem of trying to find the best visual fit of a circle to a set of 2D data points. The method elegantly transforms the ordinarily non-linear problem into a linear problem that can be solved without using iterative numerical methods, and is hence much faster than previous techniques.
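
The essential step is to rewrite the circle equation (x − a)² + (y − b)² = r² as 2ax + 2by + c = x² + y², with c = r² − a² − b², which is linear in the unknowns and can be solved by ordinary linear least squares. A minimal sketch of this linearised fit, using NumPy and synthetic points on the arc described in the caption above, might look like this:

    import numpy as np

    # Sketch of a Coope-style linearised circle fit:
    # (x - a)**2 + (y - b)**2 = r**2  is rewritten as  2a*x + 2b*y + c = x**2 + y**2,
    # with c = r**2 - a**2 - b**2, which is linear in the unknowns (2a, 2b, c).

    def fit_circle(x, y):
        A = np.column_stack([x, y, np.ones_like(x)])
        rhs = x ** 2 + y ** 2
        (two_a, two_b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
        a, b = two_a / 2.0, two_b / 2.0
        return a, b, np.sqrt(c + a ** 2 + b ** 2)

    # Synthetic points on an arc of the circle with centre (1, 1) and radius 4.
    rng = np.random.default_rng(2)
    theta = np.linspace(0.2, 2.0, 30)
    x = 1 + 4 * np.cos(theta) + rng.normal(scale=0.05, size=theta.size)
    y = 1 + 4 * np.sin(theta) + rng.normal(scale=0.05, size=theta.size)
    print(fit_circle(x, y))   # approximately (1, 1, 4)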

Fitting an ellipse by geometric fit

The above technique is extended to general ellipses by adding a non-linear step, resulting in a method that is fast, yet finds visually pleasing ellipses of arbitrary orientation and displacement.

Fitting surfaces

Note that while this discussion was in terms of 2D curves, much of this logic also extends to 3D surfaces, each patch of which is defined by a net of curves in two parametric directions, typically called u and v. A surface may be composed of one or more surface patches in each direction.

Software

Many statistical packages such as R and numerical software such as gnuplot, GNU Scientific Library, MLAB, Maple, MATLAB, TK Solver 6.0, Scilab, Mathematica, GNU Octave, and SciPy include commands for doing curve fitting in a variety of scenarios. There are also programs specifically written to do curve fitting; they can be found in the lists of statistical and numerical-analysis programs as well as in Category:Regression and curve fitting software.

Forensic chemistry

From Wikipedia, the free encyclopedia
 

Forensic chemistry is the application of chemistry and its subfield, forensic toxicology, in a legal setting. A forensic chemist can assist in the identification of unknown materials found at a crime scene. Specialists in this field have a wide array of methods and instruments to help identify unknown substances. These include high-performance liquid chromatography, gas chromatography-mass spectrometry, atomic absorption spectroscopy, Fourier transform infrared spectroscopy, and thin layer chromatography. The range of different methods is important due to the destructive nature of some instruments and the number of possible unknown substances that can be found at a scene. Forensic chemists prefer using nondestructive methods first, to preserve evidence and to determine which destructive methods will produce the best results.

Along with other forensic specialists, forensic chemists commonly testify in court as expert witnesses regarding their findings. Forensic chemists follow a set of standards that have been proposed by various agencies and governing bodies, including the Scientific Working Group on the Analysis of Seized Drugs. In addition to the standard operating procedures proposed by the group, specific agencies have their own standards regarding the quality assurance and quality control of their results and their instruments. To ensure the accuracy of what they are reporting, forensic chemists routinely check and verify that their instruments are working correctly and are still able to detect and measure various quantities of different substances.

Role in investigations

Aftermath of the Oklahoma City bombing.
Chemists were able to identify the explosive ANFO at the scene of the Oklahoma City bombing.

Forensic chemists' analysis can provide leads for investigators, and they can confirm or refute their suspicions. The identification of the various substances found at the scene can tell investigators what to look for during their search. During fire investigations, forensic chemists can determine if an accelerant such as gasoline or kerosene was used; if so, this suggests that the fire was intentionally set. Forensic chemists can also narrow down the suspect list to people who would have access to the substance used in a crime. For example, in explosive investigations, the identification of RDX or C-4 would indicate a military connection as those substances are military grade explosives. On the other hand, the identification of TNT would create a wider suspect list, since it is used by demolition companies as well as in the military. During poisoning investigations, the detection of specific poisons can give detectives an idea of what to look for when they are interviewing potential suspects. For example, an investigation that involves ricin would tell investigators to look for ricin's precursors, the seeds of the castor oil plant.

Forensic chemists also help to confirm or refute investigators' suspicions in drug or alcohol cases. The instruments used by forensic chemists can detect minute quantities, and accurate measurement can be important in crimes such as driving under the influence as there are specific blood alcohol content cutoffs where penalties begin or increase. In suspected overdose cases, the quantity of the drug found in the person's system can confirm or rule out overdose as the cause of death.

History

Early history

A bottle of strychnine extract was once easily obtainable in apothecaries.

Throughout history, a variety of poisons have been used to commit murder, including arsenic, nightshade, hemlock, strychnine, and curare. Until the early 19th century, there were no methods to accurately determine if a particular chemical was present, and poisoners were rarely punished for their crimes. In 1836, one of the first major contributions to forensic chemistry was introduced by British chemist James Marsh. He created the Marsh test for arsenic detection, which was subsequently used successfully in a murder trial. It was also during this time that forensic toxicology began to be recognized as a distinct field. Mathieu Orfila, the "father of toxicology", made great advancements to the field during the early 19th century. A pioneer in the development of forensic microscopy, Orfila contributed to the advancement of this method for the detection of blood and semen. Orfila was also the first chemist to successfully classify different chemicals into categories such as corrosives, narcotics, and astringents.

The next advancement in the detection of poisons came in 1850 when a valid method for detecting vegetable alkaloids in human tissue was created by chemist Jean Stas. Stas's method was quickly adopted and used successfully in court to convict Count Hippolyte Visart de Bocarmé of murdering his brother-in-law by nicotine poisoning. Stas was able to successfully isolate the alkaloid from the organs of the victim. Stas's protocol was subsequently altered to incorporate tests for caffeine, quinine, morphine, strychnine, atropine, and opium.

The wide range of instrumentation for forensic chemical analysis also began to be developed during this time period. The early 19th century saw the invention of the spectroscope by Joseph von Fraunhofer. In 1859, chemist Robert Bunsen and physicist Gustav Kirchhoff expanded on Fraunhofer's invention. Their experiments with spectroscopy showed that specific substances created a unique spectrum when exposed to specific wavelengths of light. Using spectroscopy, the two scientists were able to identify substances based on their spectrum, providing a method of identification for unknown materials. In 1906 botanist Mikhail Tsvet invented chromatography, an early predecessor of thin layer chromatography, and used it to separate and examine plant pigments such as chlorophyll. The ability to separate mixtures into their individual components allows forensic chemists to examine the parts of an unknown material against a database of known products. By matching the retention factors for the separated components with known values, materials can be identified.

Modernization

A gas chromatography-mass spectrometry instrument that can be used to determine the identity of unknown chemicals.
A GC-MS unit with doors open. The gas chromatograph is on the right and the mass spectrometer is on the left.

Modern forensic chemists rely on numerous instruments to identify unknown materials found at a crime scene. The 20th century saw many advancements in technology that allowed chemists to detect smaller amounts of material more accurately. The first major advancement in this century came during the 1930s with the invention of a spectrometer that could measure the signal produced with infrared (IR) light. Early IR spectrometers used a monochromator and could only measure light absorption in a very narrow wavelength band. It was not until the coupling of an interferometer with an IR spectrometer in 1949 by Peter Fellgett that the complete infrared spectrum could be measured at once. Fellgett also used the Fourier transform, a mathematical method that can break down a signal into its individual frequencies, to make sense of the enormous amount of data received from the complete infrared analysis of a material. Since then, Fourier transform infrared spectroscopy (FTIR) instruments have become critical in the forensic analysis of unknown material because they are nondestructive and extremely quick to use. Spectroscopy was further advanced in 1955 with the invention of the modern atomic absorption (AA) spectrophotometer by Alan Walsh. AA analysis can detect specific elements that make up a sample along with their concentrations, allowing for the easy detection of heavy metals such as arsenic and cadmium.

Advancements in the field of chromatography arrived in 1953 with the invention of the gas chromatograph by Anthony T. James and Archer John Porter Martin, allowing for the separation of volatile liquid mixtures with components which have similar boiling points. Nonvolatile liquid mixtures could be separated with liquid chromatography, but substances with similar retention times could not be resolved until the invention of high-performance liquid chromatography (HPLC) by Csaba Horváth in 1970. Modern HPLC instruments are capable of detecting and resolving substances whose concentrations are as low as parts per trillion.

One of the most important advancements in forensic chemistry came in 1955 with the invention of gas chromatography-mass spectrometry (GC-MS) by Fred McLafferty and Roland Gohlke. The coupling of a gas chromatograph with a mass spectrometer allowed for the identification of a wide range of substances. GC-MS analysis is widely considered the "gold standard" for forensic analysis due to its sensitivity and versatility along with its ability to quantify the amount of substance present. The increase in the sensitivity of instrumentation has advanced to the point that minute impurities within compounds can be detected potentially allowing investigators to trace chemicals to a specific batch and lot from a manufacturer.

Methods

Forensic chemists rely on a multitude of instruments to identify unknown substances found at a scene. Different methods can be used to determine the identity of the same substance, and it is up to the examiner to determine which method will produce the best results. Factors that forensic chemists might consider when performing an examination are the length of time a specific instrument will take to examine a substance and the destructive nature of that instrument. They prefer using nondestructive methods first, to preserve the evidence for further examination. Nondestructive techniques can also be used to narrow down the possibilities, making it more likely that the correct method will be used the first time when a destructive method is used.

Spectroscopy

ATR FTIR spectrum for hexane showing percent transmittance (%T) versus wavenumber (cm−1).

The two main standalone spectroscopy techniques for forensic chemistry are FTIR and AA spectroscopy. FTIR is a nondestructive process that uses infrared light to identify a substance. The attenuated total reflectance sampling technique eliminates the need for substances to be prepared before analysis. The combination of nondestructiveness and zero preparation makes ATR FTIR analysis a quick and easy first step in the analysis of unknown substances. To facilitate the positive identification of the substance, FTIR instruments are loaded with databases that can be searched for known spectra that match the unknown's spectra. FTIR analysis of mixtures, while not impossible, presents specific difficulties due to the cumulative nature of the response. When analyzing an unknown that contains more than one substance, the resulting spectra will be a combination of the individual spectra of each component. While common mixtures have known spectra on file, novel mixtures can be difficult to resolve, making FTIR an unacceptable means of identification. However, the instrument can be used to determine the general chemical structures present, allowing forensic chemists to determine the best method for analysis with other instruments. For example, a methoxy group will result in a peak between 3,030 and 2,950 wavenumbers (cm−1).
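
The library search itself amounts to comparing the unknown's spectrum against each reference spectrum and ranking the candidates by a similarity score. The sketch below is only schematic: it assumes all spectra have been resampled onto a common wavenumber grid, uses a simple cosine similarity as the score, and its library entries are made-up curves for illustration rather than real reference spectra:

    import numpy as np

    # Schematic sketch: rank reference spectra by cosine similarity to an unknown spectrum.
    # Assumes every spectrum is sampled on the same wavenumber grid.

    def cosine_similarity(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def best_matches(unknown, library, top=3):
        """library maps substance name -> spectrum array on the common grid."""
        scores = {name: cosine_similarity(unknown, ref) for name, ref in library.items()}
        return sorted(scores.items(), key=lambda item: item[1], reverse=True)[:top]

    # Made-up "spectra" purely for illustration.
    grid = np.linspace(400, 4000, 500)
    library = {"substance A": np.exp(-((grid - 2950) / 40) ** 2),
               "substance B": np.exp(-((grid - 1700) / 30) ** 2)}
    unknown = library["substance A"] + 0.05 * np.random.default_rng(0).normal(size=grid.size)
    print(best_matches(unknown, library, top=2))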

Atomic absorption spectroscopy (AAS) is a destructive technique that is able to determine the elements that make up the analyzed sample. AAS performs this analysis by subjecting the sample to an extremely high heat source, breaking the atomic bonds of the substance, leaving free atoms. Radiation in the form of light is then passed through the sample forcing the atoms to jump to a higher energy state. Forensic chemists can test for each element by using a corresponding wavelength of light that forces that element's atoms to a higher energy state during the analysis. For this reason, and due to the destructive nature of this method, AAS is generally used as a confirmatory technique after preliminary tests have indicated the presence of a specific element in the sample. The concentration of the element in the sample is proportional to the amount of light absorbed when compared to a blank sample. AAS is useful in cases of suspected heavy metal poisoning such as with arsenic, lead, mercury, and cadmium. The concentration of the substance in the sample can indicate whether heavy metals were the cause of death.
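
That proportionality is normally exploited through a calibration curve: standards of known concentration are measured, a straight line is fitted to absorbance versus concentration, and the unknown concentration is read from the line. A minimal sketch, with invented absorbance values and units, follows:

    import numpy as np

    # Sketch: quantifying an element from a linear calibration curve,
    # assuming absorbance is proportional to concentration.  All numbers are invented.

    standard_conc = np.array([0.0, 1.0, 2.0, 4.0, 8.0])           # standards, e.g. mg/L
    standard_abs = np.array([0.002, 0.051, 0.103, 0.198, 0.405])  # measured absorbances

    slope, intercept = np.polyfit(standard_conc, standard_abs, 1)  # fit A = slope*c + intercept

    unknown_abs = 0.150
    unknown_conc = (unknown_abs - intercept) / slope
    print(f"estimated concentration: {unknown_conc:.2f} mg/L")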

Chromatography

HPLC readout of an Excedrin tablet. Peaks from left to right are acetaminophen, aspirin, and caffeine.

Spectroscopy techniques are useful when the sample being tested is pure, or a very common mixture. When an unknown mixture is being analyzed it must be broken down into its individual parts. Chromatography techniques can be used to break apart mixtures into their components allowing for each part to be analyzed separately.

Thin layer chromatography (TLC) is a quick alternative to more complex chromatography methods. TLC can be used to analyze inks and dyes by extracting the individual components. This can be used to investigate notes or fibers left at the scene, since each company's product is slightly different and those differences can be seen with TLC. The only limiting factor with TLC analysis is the necessity for the components to be soluble in whatever solution is used to carry the components up the analysis plate. This solution is called the mobile phase. The forensic chemist can compare unknowns with known standards by looking at the distance each component travelled. The ratio of the distance travelled by a component to the distance travelled by the solvent front is known as the retention factor (Rf) for that component. If each Rf value matches a known sample, that is an indication of the unknown's identity.
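
Because the retention factor is just a ratio of distances, the comparison against standards is straightforward to express. The sketch below uses invented spot distances, reference Rf values, and a matching tolerance chosen purely for illustration:

    # Sketch: compute TLC retention factors and compare them with known standards.
    # Distances, reference values and tolerance are invented for illustration.

    def retention_factor(component_distance_mm, solvent_front_mm):
        return component_distance_mm / solvent_front_mm

    known_standards = {"dye X": 0.45, "dye Y": 0.72}   # hypothetical reference Rf values
    solvent_front = 80.0                               # mm travelled by the mobile phase

    for spot_distance in (36.0, 58.0):                 # mm travelled by each extracted spot
        rf = retention_factor(spot_distance, solvent_front)
        matches = [name for name, ref in known_standards.items() if abs(rf - ref) < 0.03]
        print(f"Rf = {rf:.2f}, possible match: {matches or 'none'}")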

High-performance liquid chromatography can be used to extract individual components from a mixture dissolved in a solution. HPLC is used for nonvolatile mixtures that would not be suitable for gas chromatography. This is useful in drug analysis where the pharmaceutical is a combination drug since the components would separate, or elute, at different times allowing for the verification of each component. The eluates from the HPLC column are then fed into various detectors that produce a peak on a graph relative to its concentration as it elutes off the column. The most common type of detector is an ultraviolet-visible spectrometer, as the most common items of interest tested with HPLC, pharmaceuticals, have UV absorbance.

Gas chromatography (GC) performs the same function as liquid chromatography, but it is used for volatile mixtures. In forensic chemistry, the most common GC instruments use mass spectrometry as their detector. GC-MS can be used in investigations of arson, poisoning, and explosions to determine exactly what was used. In theory, GC-MS instruments can detect substances whose concentrations are in the femtogram (10⁻¹⁵ g) range. However, in practice, due to signal-to-noise ratios and other limiting factors, such as the age of the individual parts of the instrument, the practical detection limit for GC-MS is in the picogram (10⁻¹² g) range. GC-MS is also capable of quantifying the substances it detects; chemists can use this information to determine the effect the substance would have on an individual. GC-MS instruments need around 1,000 times more of the substance to quantify the amount than they need simply to detect it; the limit of quantification is typically in the nanogram (10⁻⁹ g) range.

Forensic toxicology

Forensic toxicology is the study of the pharmacodynamics, or what a substance does to the body, and pharmacokinetics, or what the body does to the substance. To accurately determine the effect a particular drug has on the human body, forensic toxicologists must be aware of various levels of drug tolerance that an individual can build up as well as the therapeutic index for various pharmaceuticals. Toxicologists are tasked with determining whether any toxin found in a body was the cause of or contributed to an incident, or whether it was at too low a level to have had an effect. While the determination of the specific toxin can be time-consuming due to the number of different substances that can cause injury or death, certain clues can narrow down the possibilities. For example, carbon monoxide poisoning would result in bright red blood while death from hydrogen sulfide poisoning would cause the brain to have a green hue.

Toxicologists are also aware of the different metabolites that a specific drug could break down into inside the body. For example, a toxicologist can confirm that a person took heroin by the presence in a sample of 6-monoacetylmorphine, which only comes from the breakdown of heroin. The constant creation of new drugs, both legal and illicit, forces toxicologists to keep themselves apprised of new research and methods to test for these novel substances. The stream of new formulations means that a negative test result does not necessarily rule out drugs. To avoid detection, illicit drug manufacturers frequently change the chemicals' structure slightly. These compounds are often not detected by routine toxicology tests and can be masked by the presence of a known compound in the same sample. As new compounds are discovered, known spectra are determined and entered into the databases that can be downloaded and used as reference standards. Laboratories also tend to keep in-house databases for the substances they find locally.

Standards

[Table: SWGDRUG analysis categories, with columns Category A, Category B, and Category C]

Guidelines have been set up by various governing bodies regarding the standards that are followed by practicing forensic scientists. For forensic chemists, the international Scientific Working Group for the Analysis of Seized Drugs (SWGDRUG) presents recommendations for the quality assurance and quality control of tested materials. In the identification of unknown samples, protocols have been grouped into three categories based on the probability for false positives. Instruments and protocols in category A are considered the best for uniquely identifying an unknown material, followed by categories B and then C. To ensure the accuracy of identifications, SWGDRUG recommends that multiple tests using different instruments be performed on each sample, and that one category A technique and at least one other technique be used. If a category A technique is not available, or the forensic chemist decides not to use one, SWGDRUG recommends that at least three techniques be used, two of which must be from category B. Combination instruments, such as GC-MS, are considered two separate tests as long as the results are compared to known values individually. For example, the GC elution times would be compared to known values along with the MS spectra. If both of those match a known substance, no further tests are needed.
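
Stated as a rule, the recommendation above is: either at least one category A technique plus at least one other technique, or at least three techniques of which at least two are from category B. The small sketch below encodes that rule; the category labels passed in are assumptions for the example, not an official SWGDRUG assignment:

    # Sketch of the SWGDRUG combination rule described above.  The category
    # assignments supplied by the caller are assumptions for the example,
    # not an official SWGDRUG technique list.

    def meets_swgdrug_recommendation(categories):
        """categories: list of "A", "B" or "C", one entry per technique used."""
        has_a = "A" in categories
        if has_a and len(categories) >= 2:
            return True                                   # one category A plus another technique
        return len(categories) >= 3 and categories.count("B") >= 2

    print(meets_swgdrug_recommendation(["A", "B"]))        # True
    print(meets_swgdrug_recommendation(["B", "B", "C"]))   # True
    print(meets_swgdrug_recommendation(["B", "C"]))        # False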

Standards and controls are necessary in the quality control of the various instruments used to test samples. Due to the nature of their work in the legal system, chemists must ensure that their instruments are working accurately. To do this, known controls are tested consecutively with unknown samples. By comparing the readouts of the controls with their known profiles the instrument can be confirmed to have been working properly at the time the unknowns were tested. Standards are also used to determine the instrument's limit of detection and limit of quantification for various common substances. Calculated quantities must be above the limit of detection to be confirmed as present and above the limit of quantification to be quantified. If the value is below the limit the value is not considered reliable.
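
The reporting logic that follows from these limits can be stated directly: a value below the limit of detection is reported as not detected, a value between the two limits is detected but not quantified, and only a value above the limit of quantification is reported as a quantity. The sketch below uses invented limits purely for illustration:

    # Sketch of the limit-of-detection / limit-of-quantification decision described
    # above.  The numeric limits and units are invented for illustration.

    LIMIT_OF_DETECTION = 0.5       # e.g. ng/mL
    LIMIT_OF_QUANTIFICATION = 2.0  # e.g. ng/mL

    def report(measured):
        if measured < LIMIT_OF_DETECTION:
            return "not detected (below the limit of detection)"
        if measured < LIMIT_OF_QUANTIFICATION:
            return "detected, but below the limit of quantification; not quantified"
        return f"detected and quantified: {measured:.1f} ng/mL"

    for value in (0.2, 1.1, 6.4):
        print(report(value))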

Testimony

The standardized procedures for testimony by forensic chemists are provided by the various agencies that employ the scientists as well as SWGDRUG. Forensic chemists are ethically bound to present testimony in a neutral manner and to be open to reconsidering their statements if new information is found. Chemists should also limit their testimony to areas they have been qualified in regardless of questions during direct or cross-examination.

Individuals called to testify must be able to relay scientific information and processes in a manner that lay individuals can understand. By being qualified as an expert, chemists are allowed to give their opinions on the evidence as opposed to just stating the facts. This can lead to competing opinions from experts hired by the opposing side. Ethical guidelines for forensic chemists require that testimony be given in an objective manner, regardless of what side the expert is testifying for. Forensic experts that are called to testify are expected to work with the lawyer who issued the summons and to assist in their understanding of the material they will be asking questions about.

Education

Forensic chemistry positions require a bachelor's degree or similar in a natural or physical science, as well as laboratory experience in general, organic, and analytical chemistry. Once in the position, individuals are trained in the protocols that are performed at that specific lab until they can prove they are competent to perform all experiments without supervision. Practicing chemists already in the field are expected to pursue continuing education to maintain their proficiency.
