Treatment is by growth hormone replacement using synthetic human growth hormone. The frequency of the condition is unclear. Most cases are initially noticed in children. The genetic forms of this disease are estimated to affect about 1 in 7,000 people. Most types occur equally in males and females, though males are more often diagnosed.
Signs and symptoms
Child
Severe prenatal deficiency of GH, as occurs in congenital hypopituitarism, has little effect on fetal growth. However, prenatal and congenital deficiency can reduce the size of a male's penis, especially when gonadotropins are also deficient. Besides micropenis in males, additional consequences of severe deficiency in the first days of life can include hypoglycemia and exaggerated jaundice (both direct and indirect hyperbilirubinemia).
Even congenital GH deficiency does not usually impair length
growth until after the first few months of life. From late in the first
year until mid-teens, poor growth and/or shortness is the hallmark of
childhood GH deficiency. Growth is not as severely affected in GH
deficiency as in untreated hypothyroidism,
but growth at about half the usual velocity for age is typical. It
tends to be accompanied by delayed physical maturation so that bone maturation and puberty
may be several years delayed. When severe GH deficiency is present from
birth and never treated, adult heights can be as short as 48–65 inches
(122–165 cm).
Severe GH deficiency in early childhood also results in slower muscular development, so that gross motor milestones such as standing, walking, and jumping may be delayed. Body composition (i.e., the relative amounts of bone, muscle, and fat)
is affected in many children with severe deficiency, so that mild to
moderate chubbiness is common (though GH deficiency alone rarely causes
severe obesity). Some severely GH-deficient children have recognizable,
cherubic facial features characterized by maxillary hypoplasia and forehead prominence.
Other features in children include sparse hair growth and frontal recession; pili torti and trichorrhexis nodosa are also sometimes present.
Causes
Growth hormone deficiency in childhood commonly has no identifiable cause (idiopathic); adult-onset GHD is commonly due to pituitary tumours and their treatment or to cranial irradiation.
There are a variety of rare diseases that resemble GH deficiency, with features including childhood growth failure, characteristic facial appearance, delayed bone age, and low insulin-like growth factor-1
(IGF-1) levels. However, GH testing elicits normal or high levels of GH
in the blood, demonstrating that the problem is not due to a deficiency
of GH but rather to a reduced sensitivity to its action. Insensitivity
to GH is traditionally termed Laron dwarfism,
but over the last 15 years many different types of GH resistance have
been identified, primarily involving mutations of the GH binding protein
or receptors.
As an adult ages, it is normal for the pituitary to produce diminishing amounts of GH and many other hormones, particularly the sex steroids.
Physicians, therefore, distinguish between the natural reduction in GH
levels which comes with age, and the much lower levels of "true"
deficiency. Such deficiency almost always has an identifiable cause,
with adult-onset GHD without a definable cause ("idiopathic GH deficiency") being extremely rare. GH does function in adulthood to maintain muscle and bone mass and strength, and has poorly understood effects on cognition and mood.
Diagnosis
Although
GH can be readily measured in a blood sample, testing for GH deficiency
is constrained by the fact that levels are nearly undetectable for most
of the day. This makes simple measurement of GH in a single blood
sample useless for detecting deficiency. Physicians therefore use a
combination of indirect and direct criteria in assessing GHD, including:
Auxologic criteria (defined by body measurements)
Indirect hormonal criteria (IGF levels from a single blood sample)
Direct hormonal criteria (measurement of GH in multiple blood
samples to determine secretory patterns or responses to provocative
testing), in particular:
Subnormal frequency and amplitude of GH secretory peaks when sampled over several hours
Subnormal GH secretion in response to at least two provocative stimuli
Increased IGF-1 levels after a few days of GH treatment
Response to GH treatment
Corroborative evidence of pituitary dysfunction
"Provocative tests" involve giving a dose of an agent that will
normally provoke a pituitary to release a burst of growth hormone. An
intravenous line is established, the agent is given, and small amounts
of blood are drawn at 15-minute intervals over the next hour to
determine if a rise of GH was provoked. Agents which have been used
clinically to stimulate and assess GH secretion are arginine, levodopa, clonidine, epinephrine and propranolol, glucagon, and insulin.
An insulin tolerance test has been shown to be reproducible,
age-independent, and able to distinguish between GHD and normal adults, and so is the test of choice.
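To make the readout concrete, the following is a minimal sketch (in Python) of how such a test could be evaluated: timed samples are reduced to a peak value, which is compared against a caller-supplied cutoff. The sample schedule follows the 15-minute intervals described above, but the example concentrations and any cutoff are illustrative assumptions, since diagnostic thresholds vary by assay, age group, and guideline.

```python
# Minimal sketch: reading out a GH provocative (stimulation) test.
# The cutoff is deliberately a parameter -- thresholds differ by assay,
# age group, and guideline, so none is asserted here.

def peak_gh(samples: dict[int, float]) -> float:
    """Highest GH level among timed samples.

    `samples` maps minutes after the stimulating agent (0, 15, 30, ...)
    to the measured GH concentration.
    """
    return max(samples.values())

def subnormal_response(samples: dict[int, float], cutoff: float) -> bool:
    """True if the stimulated GH peak never reaches the chosen cutoff."""
    return peak_gh(samples) < cutoff

# Hypothetical test: blood drawn at 15-minute intervals over an hour.
test = {0: 0.4, 15: 2.1, 30: 5.8, 45: 4.0, 60: 1.9}
print(peak_gh(test))                        # 5.8
print(subnormal_response(test, cutoff=10))  # True, under this assumed cutoff
```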
Severe GH deficiency in childhood additionally has the following measurable characteristics:
Proportional stature well below that expected for family
heights, although this characteristic may not be present in the case of
familial-linked GH deficiency
Increased growth velocity after a few months of GH treatment
In childhood and adulthood, the diagnosing doctor will look for these
features accompanied by corroboratory evidence of hypopituitarism such
as deficiency of other pituitary hormones, a structurally abnormal
pituitary, or a history of damage to the pituitary. This would confirm
the diagnosis; in the absence of pituitary pathology, further testing
would be required.
Classification
Growth
hormone deficiency can be congenital or acquired in childhood or adult
life. It can be partial or complete. It is usually permanent, but
sometimes transient. It may be an isolated deficiency or occur in
association with deficiencies of other pituitary hormones.
The term hypopituitarism
is often used interchangeably with GH deficiency but more often denotes
GH deficiency plus deficiency of at least one other anterior pituitary
hormone. When GH deficiency (usually with other anterior pituitary
deficiencies) is associated with posterior pituitary hormone deficiency
(usually diabetes insipidus), the condition is termed panhypopituitarism.
Treatment
GH deficiency is treated by replacing GH with daily injections under the skin or into muscle. Until 1985, growth hormone for treatment was obtained by extraction from human pituitary glands collected at autopsy. Since 1985, recombinant human growth hormone (rHGH), a form of human GH produced by genetically engineered bacteria using recombinant DNA technology, has been used instead. In both children and adults, the costs of treatment in terms of money, effort, and impact on day-to-day life are substantial.
Child
GH
treatment is not recommended for children who are not growing despite
having normal levels of growth hormone, and in the UK it is not licensed
for this use.
Children requiring treatment usually receive daily injections of growth
hormone. Most pediatric endocrinologists monitor growth and adjust dose
every 3–6 months and many of these visits involve blood tests and
x-rays. Treatment is usually extended as long as the child is growing,
and lifelong continuation may be recommended for those most severely
deficient. Nearly painless insulin syringes, pen injectors,
or a needle-free delivery system reduce the discomfort. Injection sites
include the biceps, thigh, buttocks, and stomach. Injection sites
should be rotated daily to avoid lipoatrophy. Treatment is expensive,
costing as much as US$10,000 to $40,000 a year in the US.
Adults
GH supplementation is not recommended medically for the physiologic age-related decline in GH/IGF secretion.
It may be appropriate in diagnosed adult-onset deficiency, where a
weekly dose of approximately 25% of that given to children is given.
Lower doses again are called for in the elderly to reduce the incidence
of side effects and maintain age-dependent normal levels of IGF-1.
In many countries, including the UK, the majority view among
endocrinologists is that the failure of treatment to provide any
demonstrable, measurable benefits in terms of outcomes means treatment
is not recommended for all adults with severe GHD, and national guidelines in the UK as set out by NICE suggest three criteria which all need to be met for treatment to be indicated (see the sketch after this list):
Severe GH deficiency, defined as a peak GH response of <9 mU/litre during an insulin tolerance test
Perceived impairment of quality of life, as assessed by questionnaire
Already receiving treatment for other pituitary hormone deficiencies
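As a minimal sketch of how these three criteria combine, the code below encodes them as a single all-or-nothing check. The data structure and field names are illustrative assumptions; only the <9 mU/litre threshold is taken from the guideline text above.

```python
# Minimal sketch of the three NICE criteria above; all must hold.
from dataclasses import dataclass

@dataclass
class AdultGHDAssessment:
    peak_gh_mu_per_litre: float    # peak GH during an insulin tolerance test
    qol_impaired: bool             # perceived QoL impairment, per questionnaire
    treated_for_other_pituitary_deficiencies: bool

def treatment_indicated(a: AdultGHDAssessment) -> bool:
    """True only when all three guideline criteria are met."""
    severe_ghd = a.peak_gh_mu_per_litre < 9.0   # severe GH deficiency
    return (severe_ghd
            and a.qol_impaired
            and a.treated_for_other_pituitary_deficiencies)

print(treatment_indicated(AdultGHDAssessment(5.2, True, True)))   # True
print(treatment_indicated(AdultGHDAssessment(12.0, True, True)))  # False
```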
Where treatment is indicated, duration is dependent upon indication.
Cost of adult treatment in the UK is £3,000–£4,000 annually.
Prognosis
Child
When
treated with GH, a severely deficient child will begin to grow faster
within months. In the first year of treatment, the rate of growth may
increase from half as fast as other children are growing to twice as
fast (e.g., from 1 inch a year to 4 inches, or 2.5 cm to 10 cm). Growth
typically slows in subsequent years, but usually remains above normal so
that over several years a child who had fallen far behind in their
height may grow into the normal height range. Excess adipose tissue may
be reduced.
Adults
GH
treatment can confer a number of measurable benefits to severely
GH-deficient adults, such as enhanced energy and strength, and improved
bone density. Muscle mass may increase at the expense of adipose tissue.
Although adults with hypopituitarism have been shown to have a reduced life expectancy, and a cardiovascular mortality rate more than double that of controls,
treatment has not been shown to improve mortality, although blood lipid
levels do improve. Similarly, although measurements of bone density
improve with treatment, rates of fractures have not been shown to improve.
Effects on quality of life are unproven, with a number of studies
finding that adults with GHD had near-normal indicators of QoL at
baseline (giving little scope for improvement), and many using outdated
dosing strategies. However, it may be that those adults with poor QoL at
the start of treatment do benefit.
Epidemiology
The incidence of idiopathic GHD in infants is about 1 in every 3800 live births,
and rates in older children are rising as more children survive
childhood cancers which are treated with radiotherapy, although exact
rates are hard to obtain. The incidence of genuine adult-onset GHD, normally due to pituitary tumours, is estimated at 10 per million.
History
Like many other 19th century medical terms which lost precise meaning as they gained wider currency, "midget"
as a term for someone with severe proportional shortness acquired
pejorative connotations and is no longer used in a medical context.
Notable modern pop cultural figures with growth hormone deficiency include actor and comedian Andy Milonakis, who has the appearance and voice of an adolescent boy despite being in his 40s. Argentine footballer Lionel Messi was diagnosed at age 10 with growth hormone deficiency and was subsequently treated. TLC reality star Shauna Rae was affected by a growth hormone deficiency caused by treatment for childhood glioblastoma. Oscar-winning actress Linda Hunt was diagnosed with this condition as a teenager.
A fingerprint is an impression left by the friction ridges of a human finger. The recovery of partial fingerprints from a crime scene is an important method of forensic science.
Moisture and grease on a finger result in fingerprints on surfaces such
as glass or metal. Deliberate impressions of entire fingerprints can be
obtained by ink or other substances transferred from the peaks of
friction ridges on the skin to a smooth surface such as paper.
Fingerprint records normally contain impressions from the pad on the
last joint of fingers and thumbs, though fingerprint cards also
typically record portions of lower joint areas of the fingers.
Human fingerprints are detailed, nearly unique, difficult to
alter, and durable over the life of an individual, making them suitable
as long-term markers of human identity. They may be employed by police
or other authorities to identify individuals who wish to conceal their
identity, or to identify people who are incapacitated or deceased and
thus unable to identify themselves, as in the aftermath of a natural
disaster.
Their use as evidence has been challenged by academics, judges
and the media. There are no uniform standards for point-counting
methods, and academics have argued that the error rate in matching fingerprints has not been adequately studied and that fingerprint evidence has no secure statistical foundation.
Research has been conducted into whether experts can objectively focus
on feature information in fingerprints without being misled by
extraneous information, such as context.
Biology
Fingerprints are impressions left on surfaces by the friction ridges on the finger of a human. The matching of two fingerprints is among the most widely used and most reliable biometric techniques. Fingerprint matching considers only the obvious features of a fingerprint.
Fingerprints consist of water (95–99%), as well as organic and inorganic constituents. The organic component is made up of amino acids, proteins, glucose, lactate, urea, pyruvate, fatty acids and sterols. Inorganic ions such as chloride, sodium, potassium and iron are also present.
Other contaminants such as oils found in cosmetics, drugs and their
metabolites and food residues may be found in fingerprint residues.
A friction ridge is a raised portion of the epidermis on the digits (fingers and toes), the palm of the hand or the sole of the foot, consisting of one or more connected ridge units of friction ridge skin. These are sometimes known as "epidermal ridges" which are caused by the underlying interface between the dermal papillae
of the dermis and the interpapillary (rete) pegs of the epidermis.
These unique features are formed at around the 15th week of fetal
development and remain until after death, when decomposition begins.
During the development of the fetus, around the 13th week of pregnancy, a ledge-like formation forms at the bottom of the epidermis beside the dermis. The cells along these ledges begin to proliferate rapidly. This rapid proliferation forms primary and secondary ridges.
Both the primary and secondary ridges act as a template for the outer
layer of the skin to form the friction ridges seen on the surface of the
skin.
These epidermal ridges serve to amplify vibrations triggered, for example, when fingertips brush across an uneven surface, better transmitting the signals to sensory nerves involved in fine texture perception. These ridges may also assist in gripping rough surfaces and may improve surface contact in wet conditions.
Genetics
Consensus within the scientific community suggests that the dermatoglyphic patterns on fingertips are hereditary. The fingerprint patterns between monozygotic twins have been shown to be very similar, whereas dizygotic twins have considerably less similarity. Significant heritability has been identified for 12 dermatoglyphic characteristics. Current models of dermatoglyphic trait inheritance suggest Mendelian transmission with additional effects from either additive or dominant major genes.
Whereas genes determine the general characteristics of patterns
and their type, the presence of environmental factors results in the
slight differentiation of each fingerprint. However, the relative
influences of genetic and environmental effects on fingerprint patterns
are generally unclear. One study has suggested that roughly 5% of the
total variability is due to small environmental effects, although this
was only performed using total ridge count as a metric.
Several models of finger ridge formation mechanisms that lead to the
vast diversity of fingerprints have been proposed. One model suggests
that a buckling instability in the basal cell layer of the fetal epidermis is responsible for developing epidermal ridges. Additionally, blood vessels and nerves may also serve a role in the formation of ridge configurations.
Another model indicates that changes in amniotic fluid surrounding each
developing finger within the uterus cause corresponding cells on each
fingerprint to grow in different microenvironments.
For a given individual, these various factors affect each finger
differently, preventing two fingerprints from being identical while still
retaining similar patterns.
The determination of fingerprint inheritance is made difficult by the vast diversity of phenotypes.
Classification of a specific pattern is often subjective (lack of
consensus on the most appropriate characteristic to measure
quantitatively) which complicates analysis of dermatoglyphic patterns.
Several modes of inheritance have been suggested and observed for
various fingerprint patterns. Total fingerprint ridge count, a commonly
used metric of fingerprint pattern size, has been suggested to have a polygenic mode of inheritance and is influenced by multiple additive genes.
This hypothesis has been challenged by other research, however, which
indicates that ridge counts on individual fingers are genetically
independent and lack evidence to support the existence of additive genes
influencing pattern formation.
Another mode of fingerprint pattern inheritance suggests that the arch
pattern on the thumb and on other fingers is inherited as an autosomal
dominant trait. Further research on the arch pattern has suggested that a major gene or multifactorial inheritance is responsible for arch pattern heritability. A separate model for the development of the whorl pattern indicates that a single gene or group of linked genes contributes to its inheritance.
Furthermore, inheritance of the whorl pattern does not appear to be
symmetric in that the pattern is seemingly randomly distributed among
the ten fingers of a given individual.
In general, comparison of fingerprint patterns between left and right
hands suggests an asymmetry in the effects of genes on fingerprint
patterns, although this observation requires further analysis.
In addition to proposed models of inheritance, specific genes
have been implicated as factors in fingertip pattern formation (their
exact mechanism of influencing patterns is still under research).
Multivariate linkage analysis of finger ridge counts on individual
fingers revealed linkage to chromosome 5q14.1 specifically for the ring, index, and middle fingers. In mice, variants in the gene EVI1 were correlated with dermatoglyphic patterns.
EVI1 expression in humans does not directly influence fingerprint
patterns but does affect limb and digit formation which in turn may play
a role in influencing fingerprint patterns. Genome-wide association studies found single nucleotide polymorphisms within the gene ADAMTS9-AS2 on 3p14.1, which appeared to have an influence on the whorl pattern on all digits. This gene encodes antisense RNA
which may inhibit ADAMTS9, which is expressed in the skin. A model of
how genetic variants of ADAMTS9-AS2 directly influence whorl development
has not yet been proposed.
In February 2023 a study identified WNT, BMP and EDAR
as signaling pathways regulating the formation of primary ridges on
fingerprints, with the first two having an opposite relationship
established by a Turing reaction-diffusion system.
Classification systems
Before computerization, manual filing systems were used in large fingerprint repositories.
A fingerprint classification system groups fingerprints according to
their characteristics and therefore helps in the matching of a
fingerprint against a large database of fingerprints. A query
fingerprint that needs to be matched can therefore be compared with a
subset of fingerprints in an existing database.
Early classification systems were based on the general ridge patterns,
including the presence or absence of circular patterns, of several or
all fingers. This allowed the filing and retrieval of paper records in
large collections based on friction ridge patterns alone. The most
popular systems used the pattern class of each finger to form a numeric
key to assist lookup in a filing system. Fingerprint classification
systems included the Roscher System, the Juan Vucetich System and the Henry Classification System. The Roscher System was developed in Germany and implemented in both Germany and Japan. The Vucetich System was developed in Argentina and implemented throughout South America. The Henry Classification System was developed in India and implemented in most English-speaking countries.
In the Henry Classification System there are three basic fingerprint patterns: loop, whorl, and arch, which constitute 60–65 percent, 30–35 percent, and 5 percent of all fingerprints respectively. There are also more complex classification systems that break down patterns even further, into plain arches or tented arches,
and into loops that may be radial or ulnar, depending on the side of
the hand toward which the tail points. Ulnar loops start on the
pinky-side of the finger, the side closer to the ulna, the lower arm bone. Radial loops start on the thumb-side of the finger, the side closer to the radius.
Whorls may also have sub-group classifications including plain whorls,
accidental whorls, double loop whorls, peacock's eye, composite, and
central pocket loop whorls.
The system used by most experts, although complex, is similar to the Henry Classification System. It consists of five fractions, in which R stands for right, L for left, i for index finger, m for middle finger, t for thumb, r for ring finger, and p for little finger (pinky). The fractions are as follows:
Ri/Rt + Rr/Rm + Lt/Rp + Lm/Li + Lp/Lr
The numbers assigned to each print are based on whether or not they are whorls. A whorl in the first fraction is given a 16, in the second an 8, in the third a 4, in the fourth a 2, and in the last fraction a 1. Arches and loops are assigned values of 0. Lastly, the numbers in the numerator and denominator are added up, using the scheme:
(Ri + Rr + Lt + Lm + Lp)/(Rt + Rm + Rp + Li + Lr)
A 1 is added to both top and bottom, to exclude any possibility of division by zero. For example, if the right ring finger and the left index finger have whorls, the fraction used is (0 + 8 + 0 + 0 + 0 + 1)/(0 + 0 + 0 + 2 + 0 + 1) = 9/3.
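Because the primary classification is purely mechanical, it can be expressed directly in code. The sketch below covers only this primary step of the full system; the function name and boolean-list representation are illustrative assumptions.

```python
# Minimal sketch of the primary classification described above.
# Fraction order: numerators Ri, Rr, Lt, Lm, Lp over denominators
# Rt, Rm, Rp, Li, Lr. Only whorls score; arches and loops count 0.

WHORL_VALUES = [16, 8, 4, 2, 1]  # whorl value in fractions 1 through 5

def henry_primary(numerator_whorls: list[bool],
                  denominator_whorls: list[bool]) -> tuple[int, int]:
    """Return (numerator, denominator) of the primary classification.

    Each argument holds five booleans, True where that finger shows a
    whorl. A 1 is added to both sums, so results range from 1/1 to 32/32.
    """
    num = 1 + sum(v for v, w in zip(WHORL_VALUES, numerator_whorls) if w)
    den = 1 + sum(v for v, w in zip(WHORL_VALUES, denominator_whorls) if w)
    return num, den

# The worked example above: whorls on the right ring finger (second
# fraction, numerator) and left index finger (fourth fraction,
# denominator) give 9/3.
print(henry_primary([False, True, False, False, False],
                    [False, False, False, True, False]))  # (9, 3)
```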
Fingerprint identification
Fingerprint identification, known as dactyloscopy, Ridgeology, or hand print identification, is the process of comparing two instances of friction ridge skin impressions (see Minutiae),
from human fingers or toes, or even the palm of the hand or sole of the
foot, to determine whether these impressions could have come from the
same individual. The flexibility and the randomized formation of the
friction ridges on skin means that no two finger or palm prints are ever
exactly alike in every detail; even two impressions recorded
immediately after each other from the same hand may be slightly
different. Fingerprint identification, also referred to as individualization, involves an expert, or an expert computer system operating under threshold scoring
rules, determining whether two friction ridge impressions are likely to
have originated from the same finger or palm (or toe or sole).
An intentional recording of friction ridges is usually made with black printer's ink
rolled across a contrasting white background, typically a white card.
Friction ridges can also be recorded digitally, usually on a glass
plate, using a technique called Live Scan.
A "latent print" is the chance recording of friction ridges deposited
on the surface of an object or a wall. Latent prints are invisible to
the naked eye, whereas "patent prints" or "plastic prints" are viewable
with the unaided eye. Latent prints are often fragmentary and require
the use of chemical methods, powder,
or alternative light sources in order to be made clear. Sometimes an
ordinary bright flashlight will make a latent print visible.
When friction ridges come into contact with a surface that will take a print, material that is on the friction ridges such as perspiration,
oil, grease, ink, or blood, will be transferred to the surface. Factors
which affect the quality of friction ridge impressions are numerous.
Pliability of the skin, deposition pressure, slippage, the material from
which the surface is made, the roughness of the surface, and the
substance deposited are just some of the various factors which can cause
a latent print to appear differently from any known recording of the
same friction ridges. Indeed, the conditions surrounding every instance
of friction ridge deposition are unique and never duplicated. For these
reasons, fingerprint examiners are required to undergo extensive
training. The scientific study of fingerprints is called dermatoglyphics.
Fingerprinting techniques
Exemplar
Exemplar prints, or known prints, are fingerprints deliberately collected from a subject, whether for purposes of
enrollment in a system or when under arrest for a suspected criminal
offense. During criminal arrests, a set of exemplar prints will normally
include one print taken from each finger that has been rolled from one
edge of the nail to the other, plain (or slap) impressions of each of
the four fingers of each hand, and plain impressions of each thumb.
Exemplar prints can be collected using live scan or by using ink on paper cards.
Latent
In forensic science a partial fingerprint lifted from a surface is called a latent fingerprint.
Moisture and grease on fingers result in latent fingerprints on
surfaces such as glass. But because they are not clearly visible, their
detection may require chemical development through powder dusting, the
spraying of ninhydrin, iodine fuming, or soaking in silver nitrate.
Depending on the surface or the material on which a latent fingerprint
has been found, different methods of chemical development must be used.
Forensic scientists use different techniques for porous surfaces, such as paper, and nonporous surfaces, such as glass, metal or plastic.
Nonporous surfaces require the dusting process, where fine powder and a
brush are used, followed by the application of transparent tape to lift
the latent fingerprint off the surface.
While the police often describe all partial fingerprints found at
a crime scene as latent prints, forensic scientists call partial
fingerprints that are readily visible patent prints. Chocolate,
toner, paint or ink on fingers will result in patent fingerprints.
Fingerprint impressions found on soft material, such as soap, cement or plaster, are called plastic prints by forensic scientists.
Capture and detection
Live scan devices
Fingerprint image acquisition is considered to be the most critical
step in an automated fingerprint authentication system, as it determines
the final fingerprint image quality, which has a drastic effect on the
overall system performance. There are different types of fingerprint
readers on the market, but the basic idea behind each is to measure the
physical difference between ridges and valleys.
All the proposed methods can be grouped into two major families:
solid-state fingerprint readers and optical fingerprint readers. The
procedure for capturing a fingerprint using a sensor consists of rolling or touching the finger onto a sensing area, which according to the
physical principle in use (optical, ultrasonic, capacitive, or
thermal – see § Fingerprint sensors)
captures the difference between valleys and ridges. When a finger
touches or rolls onto a surface, the elastic skin deforms. The quantity
and direction of the pressure applied by the user, the skin conditions
and the projection of an irregular 3D object (the finger) onto a 2D flat
plane introduce distortions, noise, and inconsistencies in the captured
fingerprint image. These problems result in inconsistent and
non-uniform irregularities in the image.
During each acquisition, therefore, the results of the imaging are
different and uncontrollable. The representation of the same fingerprint
changes every time the finger is placed on the sensor plate, increasing
the complexity of any attempt to match fingerprints, impairing the
system performance and consequently, limiting the widespread use of this
biometric technology.
In order to overcome these problems, as of 2010, non-contact or
touchless 3D fingerprint scanners have been developed. Acquiring
detailed 3D information, 3D fingerprint scanners take a digital approach
to the analog process of pressing or rolling the finger. By modelling
the distance between neighboring points, the fingerprint can be imaged
at a resolution high enough to record all the necessary detail.
Fingerprinting on cadavers
The human skin itself, which is a regenerating organ until death, and
environmental factors such as lotions and cosmetics, pose challenges
when fingerprinting a human. Following the death of a human the skin
dries and cools. Fingerprints of dead humans may be obtained during an autopsy.
The collection of fingerprints from a cadaver can be done in varying ways and depends on the condition of the skin. In the case of a cadaver in the later stages of decomposition with dried skin, analysts will boil the skin to recondition and rehydrate it, allowing moisture to flow back into the skin and yielding detailed friction ridges. Another method that has been used is brushing a powder, such as baby powder, over the tips of the fingers. The powder embeds itself into the furrows of the friction ridges, allowing the lifted ridges to be seen.
Latent fingerprint detection
In the 1930s criminal investigators in the United States
first discovered the existence of latent fingerprints on the surfaces
of fabrics, most notably on the insides of gloves discarded by
perpetrators.
Since the late nineteenth century, fingerprint identification
methods have been used by police agencies around the world to identify
suspected criminals as well as the victims of crime. The basis of the
traditional fingerprinting technique is simple. The skin on the palmar
surface of the hands and feet forms ridges, so-called papillary ridges,
in patterns that are unique to each individual and which do not change
over time. Even identical twins (who share their DNA)
do not have identical fingerprints. The best way to render latent
fingerprints visible, so that they can be photographed, can be complex
and may depend, for example, on the type of surfaces on which they have
been left. It is generally necessary to use a "developer", usually a
powder or chemical reagent, to produce a high degree of visual contrast
between the ridge patterns and the surface on which a fingerprint has
been deposited.
Developing agents depend on the presence of organic materials or
inorganic salts for their effectiveness, although the water deposited
may also take a key role. Fingerprints are typically formed from the
aqueous-based secretions of the eccrine glands of the fingers and palms
with additional material from sebaceous glands primarily from the
forehead. This latter contamination results from the common human
behaviors of touching the face and hair. The resulting latent
fingerprints consist usually of a substantial proportion of water with
small traces of amino acids and chlorides mixed with a fatty, sebaceous
component which contains a number of fatty acids and triglycerides.
Detection of a small proportion of reactive organic substances such as
urea and amino acids is far from easy.
Fingerprints at a crime scene may be detected by simple powders, or by chemicals applied in situ.
More complex techniques, usually involving chemicals, can be applied in
specialist laboratories to appropriate articles removed from a crime
scene. With advances in these more sophisticated techniques, some of the
more advanced crime scene investigation services from around the world
were, as of 2010, reporting that 50% or more of the fingerprints
recovered from a crime scene had been identified as a result of
laboratory-based techniques.
Forensic laboratories
Although there are hundreds of reported techniques for fingerprint
detection, many of these are only of academic interest and there are
only around 20 really effective methods which are currently in use in
the more advanced fingerprint laboratories around the world.
Some of these techniques, such as ninhydrin, diazafluorenone and vacuum metal deposition,
show great sensitivity and are used operationally. Some fingerprint
reagents are specific, for example ninhydrin or diazafluorenone reacting
with amino acids. Others, such as ethyl cyanoacrylate
polymerisation, work apparently by water-based catalysis and polymer
growth. Vacuum metal deposition using gold and zinc has been shown to be
non-specific, but can detect fat layers as thin as one molecule.
More mundane methods, such as the application of fine powders,
work by adhesion to sebaceous deposits and possibly aqueous deposits in
the case of fresh fingerprints. The aqueous component of a fingerprint,
whilst initially sometimes making up over 90% of the weight of the
fingerprint, can evaporate quite quickly and may have mostly gone after
24 hours. Following work on the use of argon ion lasers for fingerprint
detection,
a wide range of fluorescence techniques have been introduced, primarily
for the enhancement of chemically developed fingerprints; the inherent
fluorescence of some latent fingerprints may also be detected.
Fingerprints can for example be visualized in 3D and without chemicals
by the use of infrared lasers.
A comprehensive manual of the operational methods of fingerprint
enhancement was last published by the UK Home Office Scientific
Development Branch in 2013 and is used widely around the world.
A technique proposed in 2007 aims to identify an individual's ethnicity, sex, and dietary patterns.
Limitations and implications in a forensic context
One of the main limitations of friction ridge impression evidence at the collection stage is the surface environment, specifically how porous the surface bearing the impression is. With non-porous surfaces the residues of the impression will not be absorbed into the material of the surface, but could be smudged by another surface. With porous surfaces, the residues of the impression will be absorbed into the surface. Either can result in an impression of no value to examiners or the destruction of the friction ridge impression.
The ability of analysts to correctly identify friction ridge patterns and their features depends heavily on the clarity of the impression; the analysis of friction ridges is therefore limited by clarity.
In a court context, many have argued that friction ridge identification and Ridgeology should be classified as opinion evidence and not as fact, and should therefore be assessed as such.
Many have said that friction ridge identification is only legally
admissible today because during the time when it was added to the legal
system, the admissibility standards were quite low.
There are only a limited number of studies that have been conducted to
help confirm the science behind this identification process.
Crime scene investigations
The application of the new scanning Kelvin probe
(SKP) fingerprinting technique, which makes no physical contact with
the fingerprint and does not require the use of developers, has the
potential to allow fingerprints to be recorded whilst still leaving
intact material that could subsequently be subjected to DNA analysis. A
forensically usable prototype was under development at Swansea
University during 2010, in research that was generating significant
interest from the British Home Office
and a number of different police forces across the UK, as well as
internationally. The hope is that this instrument could eventually be
manufactured in sufficiently large numbers to be widely used by forensic
teams worldwide.
Detection of drug use
The secretions, skin oils and dead cells in a human fingerprint contain residues of various chemicals and their metabolites present in the body. These can be detected and used for forensic purposes. For example, the fingerprints of tobacco smokers contain traces of cotinine, a nicotine
metabolite; they also contain traces of nicotine itself. Caution should
be used, as its presence may be caused by mere contact of the finger
with a tobacco product. By treating the fingerprint with gold nanoparticles with attached cotinine antibodies,
and then subsequently with a fluorescent agent attached to cotinine
antibodies, the fingerprint of a smoker becomes fluorescent;
non-smokers' fingerprints stay dark. The same approach, as of 2010, is being tested for use in identifying heavy coffee drinkers, cannabis smokers, and users of various other drugs.
Fingerprints collected at a crime scene, or on items of evidence from a crime, have been used in forensic science
to identify suspects, victims and other persons who touched a surface.
Fingerprint identification emerged as an important system within police
agencies in the late 19th century, when it replaced anthropometric
measurements as a more reliable method for identifying persons having a
prior record, often under a false name, in a criminal record repository.
Fingerprinting has served all governments worldwide during the past 100
years or so to provide identification of criminals. Fingerprints are
the fundamental tool in every police agency for the identification of
people with a criminal history.
The validity of forensic fingerprint evidence has been challenged by academics, judges and the media. In the United States
fingerprint examiners have not developed uniform standards for the
identification of an individual based on matching fingerprints. In some
countries where fingerprints are also used in criminal investigations,
fingerprint examiners are required to match a number of identification points
before a match is accepted. In England 16 identification points are
required and in France 12, to match two fingerprints and identify an
individual. Point-counting methods have been challenged by some
fingerprint examiners because they focus solely on the location of
particular characteristics in fingerprints that are to be matched.
Fingerprint examiners may also uphold the one dissimilarity doctrine,
which holds that if there is one dissimilarity between two
fingerprints, the fingerprints are not from the same finger.
Furthermore, academics have argued that the error rate in matching fingerprints has not been adequately studied and it has even been argued that fingerprint evidence has no secure statistical foundation.
Research has been conducted into whether experts can objectively focus
on feature information in fingerprints without being misled by
extraneous information, such as context.
Fingerprints can theoretically be forged and planted at crime scenes.
Professional certification
Fingerprinting was the basis upon which the first forensic professional organization was formed, the International Association for Identification (IAI), in 1915.
The first professional certification program for forensic scientists
was established in 1977, the IAI's Certified Latent Print Examiner
program, which issued certificates to those meeting stringent criteria
and had the power to revoke certification where an individual's
performance warranted it. Other forensic disciplines have followed suit and established their own certification programs.
History
Antiquity and the medieval period
Fingerprints have been found on ancient clay tablets, seals, and pottery. They have also been found on the walls of Egyptian tombs and on Minoan, Greek, and Chinese pottery. In ancient China officials authenticated government documents with their fingerprints. In about 200 BC fingerprints were used to sign written contracts in Babylon. Fingerprints from 3D-scans of cuneiform tablets are extracted using the GigaMesh Software Framework.
With the advent of silk and paper in China, parties to a legal
contract impressed their handprints on the document. Sometime before 851
CE, an Arab merchant in China, Abu Zayd Hasan, witnessed Chinese
merchants using fingerprints to authenticate loans.
Although ancient peoples probably did not realize that fingerprints could uniquely identify individuals, references from the age of the Babylonian king Hammurabi (reigned 1792–1750 BCE) indicate that law officials would take the fingerprints of people who had been arrested. During China's Qin Dynasty, records have shown that officials took hand prints and foot prints as well as fingerprints as evidence from a crime scene. In 650 the Chinese historian Kia Kung-Yen remarked that fingerprints could be used as a means of authentication. In his Jami al-Tawarikh (Universal History), the Iranian physician Rashid-al-Din Hamadani
(1247–1318) refers to the Chinese practice of identifying people via
their fingerprints, commenting: "Experience shows that no two
individuals have fingers exactly alike."
Europe in the 17th and 18th centuries
From the late 16th century onwards, European academics attempted to
include fingerprints in scientific studies. But plausible conclusions
could be established only from the mid-17th century onwards. In 1686 the
professor of anatomy at the University of Bologna, Marcello Malpighi, identified ridges, spirals and loops in fingerprints left on surfaces. In 1788 the German anatomist Johann Christoph Andreas Mayer was the first European to conclude that fingerprints were unique to each individual. In 1880 Henry Faulds suggested, based on his studies, that fingerprints are unique to a human.
19th century
In 1823 Jan Evangelista Purkyně
identified nine fingerprint patterns. The nine patterns include the
tented arch, the loop, and the whorl, which in modern-day forensics are
considered ridge details. In 1840, following the murder of Lord William Russell, a provincial doctor, Robert Blake Overton, wrote to Scotland Yard suggesting checking for fingerprints. In 1853 the German anatomist Georg von Meissner (1829–1905) studied friction ridges, and in 1858 Sir William James Herschel initiated fingerprinting in India. In 1877 he first instituted the use of fingerprints on contracts and deeds to prevent the repudiation of signatures in Hooghly near Kolkata
and he registered government pensioners' fingerprints to prevent the
collection of money by relatives after a pensioner's death.
In 1880 Henry Faulds,
a Scottish surgeon in a Tokyo hospital, published his first paper on
the usefulness of fingerprints for identification and proposed a method
to record them with printing ink. Returning to Great Britain in 1886, he offered the concept to the Metropolitan Police in London but it was dismissed at that time. Up until the early 1890s police forces in the United States and on the European continent could not reliably identify criminals to track their criminal record. Francis Galton published a detailed statistical model of fingerprint analysis and identification in his 1892 book Finger Prints.
He had calculated that the chance of a "false positive" (two different
individuals having the same fingerprints) was about 1 in 64 billion. In 1892 Juan Vucetich,
an Argentine chief police officer, created the first method of
recording the fingerprints of individuals on file. In that same year, Francisca Rojas
was found in a house with neck injuries, whilst her two sons were found
dead with their throats cut. Rojas accused a neighbour, but despite
brutal interrogation, this neighbour would not confess to the crimes.
Inspector Álvarez, a colleague of Vucetich, went to the scene and found a
bloody thumb mark on a door. When it was compared with Rojas' prints,
it was found to be identical with her right thumb. She then confessed to
the murder of her sons. This was the first known murder case to be solved using fingerprint analysis.
In Kolkata
a Fingerprint Bureau was established in 1897, after the Council of the
Governor General approved a committee report that fingerprints should be
used for the classification of criminal records. The bureau employees Azizul Haque and Hem Chandra Bose
have been credited with the primary development of a fingerprint
classification system eventually named after their supervisor, Sir Edward Richard Henry.
20th century
The French scientist Paul-Jean Coulier developed a method to transfer latent fingerprints on surfaces to paper using iodine fuming. It allowed Scotland Yard in London
to start fingerprinting individuals and identify criminals using
fingerprints in 1901. Soon after, American police departments adopted
the same method and fingerprint identification became a standard
practice in the United States.
The Scheffer case of 1902 is the first case of the identification,
arrest, and conviction of a murderer based upon fingerprint evidence. Alphonse Bertillon
identified the thief and murderer Scheffer, who had previously been
arrested and his fingerprints filed some months before, from the
fingerprints found on a fractured glass showcase, after a theft in a
dentist's apartment where the dentist's employee was found dead. It was proved in court that the fingerprints had been made after the
showcase was broken.
The identification of individuals through fingerprints for law enforcement has been considered essential in the United States since the beginning of the 20th century. Body identification using fingerprints has also been valuable in the aftermath of natural disasters and anthropogenic hazards. In the United States, the FBI manages a fingerprint identification system and database called the Integrated Automated Fingerprint Identification System
(IAFIS), which currently holds the fingerprints and criminal records of
over 51 million criminal record subjects and over 1.5 million civil
(non-criminal) fingerprint records. OBIM,
formerly U.S. VISIT, holds the largest repository of biometric
identifiers in the U.S. government at over 260 million individual
identities.
When it was deployed in 2004, this repository, known as the Automated
Biometric Identification System (IDENT), stored biometric data in the
form of two-finger records. Between 2005 and 2009, the DHS transitioned to a ten-print record standard in order to establish interoperability with IAFIS.
In 1910, Edmond Locard established the first forensic lab in France. Criminals may wear gloves
to avoid leaving fingerprints. However, the gloves themselves can leave
prints that are as unique as human fingerprints. After collecting glove prints, law enforcement can match them to gloves that they have collected as evidence or to prints collected at other crime scenes. In many jurisdictions the act of wearing gloves itself while committing a crime can be prosecuted as an inchoate offense.
The non-governmental organization (NGO) Privacy International
in 2002 made the cautionary announcement that tens of thousands of UK
school children were being fingerprinted by schools, often without the
knowledge or consent of their parents. That same year, the supplier Micro Librarian Systems,
which uses a technology similar to that used in US prisons and the
German military, estimated that 350 schools throughout Britain were
using such systems to replace library cards. By 2007, it was estimated that 3,500 schools were using such systems. Under the United Kingdom Data Protection Act,
schools in the UK do not have to ask parental consent to allow such
practices to take place. Parents opposed to fingerprinting may bring
only individual complaints against schools. In response to a complaint which they are continuing to pursue, in 2010 the European Commission
expressed 'significant concerns' over the proportionality and necessity
of the practice and the lack of judicial redress, indicating that the
practice may break the European Union data protection directive.
In March 2007, the UK government was considering fingerprinting all children aged 11 to 15 and adding the prints to a government database as part of a new passport and ID card
scheme and disallowing opposition for privacy concerns. All
fingerprints taken would be cross-checked against prints from 900,000
unsolved crimes. Shadow Home secretary David Davis called the plan "sinister". The Liberal Democrat home affairs spokesman Nick Clegg criticised "the determination to build a surveillance state behind the backs of the British people". The UK's junior education minister Lord Adonis defended the use of fingerprints by schools, to track school attendance as well as access to school meals and libraries, and reassured the House of Lords
that the children's fingerprints had been taken with the consent of the
parents and would be destroyed once children left the school. An Early Day Motion
which called on the UK Government to conduct a full and open
consultation with stakeholders about the use of biometrics in schools,
secured the support of 85 Members of Parliament (Early Day Motion 686). Following the establishment in the United Kingdom of a Conservative and Liberal Democratic coalition government in May 2010, the UK ID card scheme was scrapped.
Serious concerns about the security implications of using
conventional biometric templates in schools have been raised by a number
of leading IT security experts, one of whom has voiced the opinion that "it is absolutely premature to begin using 'conventional biometrics' in schools".
The vendors of biometric systems claim that their products bring
benefits to schools such as improved reading skills, decreased wait
times in lunch lines and increased revenues.
They do not cite independent research to support this view. One
education specialist wrote in 2007: "I have not been able to find a
single piece of published research which suggests that the use of
biometrics in schools promotes healthy eating or improves reading skills
amongst children... There is absolutely no evidence for such claims".
The Ottawa Police in Canada have advised parents who fear their children may be kidnapped to fingerprint their children.
Absence or mutilation of fingerprints
A very rare medical condition, adermatoglyphia,
is characterized by the absence of fingerprints. Affected persons have
completely smooth fingertips, palms, toes and soles, but no other
medical signs or symptoms. A 2011 study indicated that adermatoglyphia is caused by the improper expression of the protein SMARCAD1. The condition has been called immigration delay disease
by the researchers describing it, because the congenital lack of
fingerprints causes delays when affected persons attempt to prove their
identity while traveling. Only five families with this condition had been described as of 2011.
The anti-cancer medication capecitabine may cause the loss of fingerprints. Swelling of the fingers, such as that caused by bee stings, will in some cases cause the temporary disappearance of fingerprints, though they will return when the swelling recedes.
Since the elasticity of skin decreases with age, many senior citizens
have fingerprints that are difficult to capture. The ridges get
thicker; the height between the top of the ridge and the bottom of the
furrow gets narrow, so there is less prominence.
Fingerprints can be erased permanently and this can potentially
be used by criminals to reduce their chance of conviction. Erasure can
be achieved in a variety of ways including simply burning the
fingertips, using acids and advanced techniques such as plastic surgery. John Dillinger
burned his fingers with acid, but prints taken during a previous arrest
and upon death still exhibited almost complete relation to one another.
Fingerprint verification
Fingerprints can be captured as graphical ridge and valley patterns.
Because of their uniqueness and permanence, fingerprints emerged as the
most widely used biometric identifier in the 2000s. Automated fingerprint verification systems were developed to meet the needs of law enforcement
and their use became more widespread in civilian applications. Despite
being deployed more widely, reliable automated fingerprint verification
remained a challenge and was extensively researched in the context of pattern recognition and image processing.
The uniqueness of a fingerprint can be established by the overall
pattern of ridges and valleys, or the logical ridge discontinuities
known as minutiae. In the 2000s minutiae features were considered
the most discriminating and reliable feature of a fingerprint.
Therefore, the recognition of minutiae features became the most common
basis for automated fingerprint verification. The most widely used
minutiae features used for automated fingerprint verification were the
ridge ending and the ridge bifurcation.
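As a minimal sketch of how such features drive automated verification, the code below counts agreeing minutiae between a stored template and a candidate print. It assumes extraction and alignment have already been carried out, and the distance and angle tolerances are illustrative values, not standards.

```python
# Minimal sketch of minutiae-based comparison. Assumes minutiae are
# already extracted and the two prints aligned; tolerances are
# illustrative assumptions.
import math

# A minutia: (x, y, ridge direction in degrees, "ending" | "bifurcation")
Minutia = tuple[float, float, float, str]

def same_minutia(a: Minutia, b: Minutia,
                 dist_tol: float = 10.0, angle_tol: float = 20.0) -> bool:
    """True when two minutiae agree in type, position, and direction."""
    ax, ay, a_ang, a_kind = a
    bx, by, b_ang, b_kind = b
    close = math.hypot(ax - bx, ay - by) <= dist_tol
    diff = abs(a_ang - b_ang) % 360
    aligned = min(diff, 360 - diff) <= angle_tol
    return a_kind == b_kind and close and aligned

def match_count(template: list[Minutia], candidate: list[Minutia]) -> int:
    """Count template minutiae matched by at least one candidate minutia."""
    return sum(any(same_minutia(t, c) for c in candidate) for t in template)
```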
Patterns
The three basic patterns of fingerprint ridges are the arch, loop, and whorl:
Arch: The ridges enter from one side of the finger, rise in the
center forming an arc, and then exit the other side of the finger.
Loop: The ridges enter from one side of a finger, form a curve, and then exit on that same side.
Whorl: Ridges form circularly around a central point on the finger.
Scientists have found that family members often share the same
general fingerprint patterns, leading to the belief that these patterns
are inherited.
Fingerprint features
Features of fingerprint ridges, called minutiae, include the following (a data-model sketch follows the list):
Ridge ending: The abrupt end of a ridge
Bifurcation: A single ridge dividing in two
Short or independent ridge: A ridge that commences, travels a short distance and then ends
Island or dot: A single small ridge inside a short ridge or ridge ending that is not connected to all other ridges
Lake or ridge enclosure: A single ridge that bifurcates and reunites shortly afterward to continue as a single ridge
Spur: A bifurcation with a short ridge branching off a longer ridge
Bridge or crossover: A short ridge that runs between two parallel ridges
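A feature extractor needs a machine-readable form of this taxonomy. The enumeration below is a minimal sketch that simply names the minutia types listed above; the encoding itself is an illustrative assumption rather than any standard.

```python
# Minimal sketch: the minutia types above as an enumeration a feature
# extractor might emit. Illustrative only; not a standard encoding.
from enum import Enum, auto

class MinutiaType(Enum):
    RIDGE_ENDING = auto()   # abrupt end of a ridge
    BIFURCATION = auto()    # single ridge dividing in two
    SHORT_RIDGE = auto()    # starts, travels a short distance, then ends
    ISLAND_OR_DOT = auto()  # small ridge not connected to other ridges
    LAKE = auto()           # enclosure: ridge bifurcates and reunites
    SPUR = auto()           # short ridge branching off a longer ridge
    BRIDGE = auto()         # short ridge running between two parallel ridges
```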
Fingerprint sensors
Optical scanners take a visual image of the fingerprint using a digital camera.
Capacitive or CMOS scanners use capacitors and thus electric current to form an image of the fingerprint.
Ultrasound fingerprint scanners use high frequency sound waves to penetrate the epidermal (outer) layer of the skin.
Thermal scanners sense the temperature differences on the contact surface, in between fingerprint ridges and valleys.
Consumer electronics login authentication
Since 2000, electronic fingerprint readers have been introduced as consumer electronics security applications. Fingerprint sensors could be used for login authentication
and the identification of computer users. However, some less
sophisticated sensors have been discovered to be vulnerable to quite
simple methods of deception, such as fake fingerprints cast in gels. In 2006, fingerprint sensors gained popularity in the laptop market. Built-in sensors in laptops, such as ThinkPads, VAIO, HP Pavilion and EliteBook laptops, and others also double as motion detectors for document scrolling, like the scroll wheel.
Two of the first smartphone manufacturers to integrate fingerprint recognition into their phones were Motorola with the Atrix 4G in 2011 and Apple with the iPhone 5S on September 10, 2013. One month later, HTC launched the One Max, which also included fingerprint recognition. In April 2014, Samsung released the Galaxy S5, which integrated a fingerprint sensor on the home button.
Following the release of the iPhone 5S
model, a group of German hackers announced on September 21, 2013, that
they had bypassed Apple's new Touch ID fingerprint sensor by
photographing a fingerprint from a glass surface and using that captured
image as verification. The spokesman for the group stated: "We hope
that this finally puts to rest the illusions people have about
fingerprint biometrics. It is plain stupid to use something that you
can't change and that you leave everywhere every day as a security
token." In September 2015, Apple included a new version of the fingerprint scanner in the iPhone home button with the iPhone 6S. The use of the Touch ID fingerprint scanner was optional and could be configured to unlock the screen or pay for mobile apps purchases. Since December 2015, cheaper smartphones with fingerprint recognition have been released, such as the $100 UMI Fair. Samsung introduced fingerprint sensors to its mid-range A series smartphones in 2014.
By 2017 Hewlett Packard, Asus, Huawei, Lenovo and Apple were using fingerprint readers in their laptops. Synaptics says the SecurePad sensor is now available for OEMs to start building into their laptops. In 2018, Synaptics revealed that their in-display fingerprint sensors would be featured on the new Vivo
X21 UD smartphone. This was the first mass-produced fingerprint sensor
to be integrated into the entire touchscreen display, rather than as a
separate sensor.
Algorithms
Matching algorithms are used to compare previously stored templates of fingerprints against candidate fingerprints for authentication
purposes. In order to do this either the original image must be
directly compared with the candidate image or certain features must be
compared.
Pre-processing
Pre-processing enhances the quality of an image by filtering and
removing extraneous noise. The minutiae-based algorithm is only
effective with 8-bit gray scale fingerprint images. One reason for this
is that an 8-bit gray fingerprint image is a fundamental base when
converting the image to a 1-bit image with value 1 for ridges and value 0
for furrows. This process allows for enhanced edge detection so the
fingerprint is revealed in high contrast, with the ridges highlighted in
black and the furrows in white. To further optimize the input image's
quality, two more steps are required: minutiae extraction and false
minutiae removal. The minutiae extraction is carried out by applying a
ridge-thinning algorithm that removes redundant pixels of ridges. As a
result, the thinned ridges of the fingerprint image are marked with a
unique ID to facilitate the conduction of further operations. After the
minutiae extraction, the false minutiae removal is carried out. The lack
of the amount of ink and the cross link among the ridges could cause
false minutiae that led to inaccuracy in fingerprint recognition
process.
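The binarization step described above can be illustrated in a few lines. The sketch below uses a single global threshold for clarity, which is an assumption (practical systems generally threshold locally and adaptively), and it omits the subsequent ridge-thinning and false-minutiae-removal steps.

```python
# Minimal sketch of the binarization step: an 8-bit grayscale image
# becomes a 1-bit image, value 1 for ridges (dark ink) and 0 for furrows.
# A fixed global threshold is an assumption; real systems adapt locally.

def binarize(gray: list[list[int]], threshold: int = 128) -> list[list[int]]:
    """Map 8-bit pixels (0-255) to 1 (ridge) or 0 (furrow)."""
    return [[1 if pixel < threshold else 0 for pixel in row] for row in gray]

image = [
    [ 30, 200,  40, 210],
    [ 25, 190,  35, 205],
    [220,  45, 215,  50],
]
for row in binarize(image):
    print(row)
# [1, 0, 1, 0]
# [1, 0, 1, 0]
# [0, 1, 0, 1]
```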
Pattern-based (or image-based) algorithms
Pattern based algorithms compare the basic fingerprint patterns
(arch, whorl, and loop) between a previously stored template and a
candidate fingerprint. This requires that the images can be aligned in
the same orientation. To do this, the algorithm finds a central point in
the fingerprint image and centers on that. In a pattern-based
algorithm, the template contains the type, size, and orientation of
patterns within the aligned fingerprint image. The candidate fingerprint
image is graphically compared with the template to determine the degree
to which they match.
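The following is a minimal sketch of such a comparison, assuming both prints have already been centered on the detected central point; the field names and tolerance values are illustrative assumptions.

```python
# Minimal sketch of a pattern-based template comparison, assuming both
# images are already aligned on the detected central point.
from dataclasses import dataclass

@dataclass
class PatternTemplate:
    pattern: str        # "arch", "loop", or "whorl"
    size: float         # pattern extent after alignment, e.g. in pixels
    orientation: float  # pattern orientation in degrees after alignment

def pattern_match(stored: PatternTemplate, candidate: PatternTemplate,
                  size_tol: float = 0.15, angle_tol: float = 15.0) -> bool:
    """Type must agree exactly; size within 15%; orientation within 15 deg."""
    if stored.pattern != candidate.pattern:
        return False
    size_ok = abs(stored.size - candidate.size) <= size_tol * stored.size
    diff = abs(stored.orientation - candidate.orientation) % 360
    return size_ok and min(diff, 360 - diff) <= angle_tol

print(pattern_match(PatternTemplate("whorl", 120.0, 30.0),
                    PatternTemplate("whorl", 128.0, 38.0)))  # True
```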
In other species
Some other animals have evolved their own unique prints, especially
those whose lifestyle involves climbing or grasping wet objects; these
include many primates, such as gorillas and chimpanzees, Australian koalas, and aquatic mammal species such as the North American fisher.
According to one study, even with an electron microscope, it can be
quite difficult to distinguish between the fingerprints of a koala and a
human.
In fiction
Mark Twain
Mark Twain's memoir Life on the Mississippi (1883), notable mainly for its account of the author's time on the river, also recounts parts of his later life and includes tall tales
and stories allegedly told to him. Among them is an involved,
melodramatic account of a murder in which the killer is identified by a
thumbprint. Twain's novel Pudd'nhead Wilson, published in 1893, includes a courtroom drama that turns on fingerprint identification.
Crime fiction
The use of fingerprints in crime fiction has kept pace with its use in real-life detection. Sir Arthur Conan Doyle wrote a short story about his celebrated sleuth Sherlock Holmes which features a fingerprint: "The Norwood Builder"
is a 1903 short story set in 1894 and involves the discovery of a
bloody fingerprint which helps Holmes to expose the real criminal and
free his client.
The British detective writer R. Austin Freeman's first Thorndyke novel The Red Thumb-Mark
was published in 1907 and features a bloody fingerprint left on a piece
of paper together with a parcel of diamonds inside a safe-box. These
become the center of a medico-legal investigation led by Dr. Thorndyke, who defends the accused whose fingerprint matches that on the paper, after the diamonds are stolen.
Film and television
In the television series Bonanza (1959–1973) the Chinese character Hop Sing uses his knowledge of fingerprints to free Little Joe from a murder charge.
The 1997 movie Men in Black
required Agent J to remove his ten fingerprints by putting his hands on
a metal ball, an action deemed necessary by the MIB agency to remove
the identity of its agents.
In the 2009 science fiction movie Cold Souls, a mule who smuggles souls
wears latex fingerprints to frustrate airport security terminals. She
can change her identity by simply changing her wig and latex
fingerprints.