A spaceport or cosmodrome is a site for launching or receiving spacecraft, by analogy to a seaport for ships or an airport for aircraft. The word spaceport, and even more so cosmodrome, has traditionally been used for sites capable of launching spacecraft into orbit around Earth or on interplanetary trajectories. However, rocket launch sites for purely sub-orbital flights are sometimes called spaceports as well; in recent years, new and proposed sites for suborbital human flights have frequently been referred to or named "spaceports". Space stations
and proposed future bases on the Moon are sometimes called spaceports,
in particular if intended as a base for further journeys.
The term rocket launch site is used for any facility from which rockets are launched. It may contain one or more launch pads or suitable sites to mount a transportable launch pad. It is typically surrounded by a large safety area, often called a rocket range or missile range.
The range includes the area over which launched rockets are expected to
fly, and within which some components of the rockets may land. Tracking
stations are sometimes located in the range to assess the progress of
the launches.
Major spaceports often include more than one launch complex, which can be rocket launch sites adapted for different types of launch vehicles.
(These sites can be well-separated for safety reasons.) For launch
vehicles with liquid propellant, suitable storage facilities and, in
some cases, production facilities are necessary. On-site processing
facilities for solid propellants are also common.
A spaceport may also include runways for takeoff and landing of aircraft to support spaceport operations, or to enable support of HTHL or HTVL winged launch vehicles.
History
The age of crewed rocket flight was initiated by Fritz von Opel, who piloted the world's first rocket-propelled flight on 30 September 1929; von Opel was the co-designer and financier of the visionary project, which led to actual space flights.
Peenemünde, Germany, where the V-2, the first rocket to reach space (June 1944), was launched
The first rockets to reach space were V-2 rockets launched from Peenemünde, Germany in 1944 during World War II. After the war, 70 complete V-2 rockets were brought to White Sands for test launches, with 47 of them reaching altitudes between 100 km and 213 km.
The world's first spaceport for orbital and human launches, the Baikonur Cosmodrome in southern Kazakhstan, started as a Soviet military rocket range in 1955. It achieved the first orbital flight (Sputnik 1)
in October 1957. The exact location of the cosmodrome was initially kept secret; guesses about its location were misdirected by a name shared with a mining town 320 km away. The position became known outside the Soviet Union only in 1957, after U-2 planes had identified the site by following railway lines in the Kazakh SSR, although Soviet authorities did not confirm the location for decades.
The Baikonur Cosmodrome achieved the first launch of a human into space (Yuri Gagarin) in 1961. The launch complex used, Site 1, has reached a special symbolic significance and is commonly called Gagarin's Start. Baikonur was the primary Soviet cosmodrome, and is still frequently used by Russia under a lease arrangement with Kazakhstan.
In response to the early Soviet successes, the United States
built up a major spaceport complex at Cape Canaveral in Florida. A large
number of uncrewed flights, as well as the early human flights, were
carried out at Cape Canaveral Space Force Station. For the Apollo programme, an adjacent spaceport, Kennedy Space Center, was constructed, and achieved the first crewed mission to the lunar surface (Apollo 11) in July 1969. It was the base for all Space Shuttle launches and most of their runway landings. For details on the launch complexes of the two spaceports, see List of Cape Canaveral and Merritt Island launch sites.
The Guiana Space Centre
in Kourou, French Guiana, is the major European spaceport, with
satellite launches that benefit from the location 5 degrees north of the
equator.
Breaking with tradition, in June 2004, on a runway at Mojave Air and Space Port, California, a human was for the first time launched to space in a privately funded, suborbital spaceflight intended to pave the way for future commercial spaceflights. The spacecraft, SpaceShipOne, was launched by a carrier airplane taking off horizontally.
At Cape Canaveral, SpaceX in 2015 made the first successful landing and recovery of a first stage used in a vertical satellite launch.
Location
Rockets can most easily reach satellite orbits if launched near the equator in an easterly direction, as this maximizes use of the Earth's rotational speed (465 m/s at the equator). Such launches also provide a desirable orientation for arriving at a geostationary orbit. For polar orbits and Molniya orbits this does not apply.
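As a rough illustration (a sketch of the cosine relationship, with approximate site latitudes assumed for the example), the eastward boost a launch site inherits from Earth's rotation falls off with the cosine of its latitude:

```python
import math

EQUATORIAL_SPEED = 465.0  # m/s, Earth's surface rotation speed at the equator

def rotational_boost(latitude_deg: float) -> float:
    """Eastward speed (m/s) contributed by Earth's rotation at a given latitude."""
    return EQUATORIAL_SPEED * math.cos(math.radians(latitude_deg))

# Approximate latitudes, assumed here for illustration
for site, lat in [("Guiana Space Centre", 5.2),
                  ("Cape Canaveral", 28.5),
                  ("Baikonur Cosmodrome", 45.9)]:
    print(f"{site} ({lat} deg N): {rotational_boost(lat):.0f} m/s")
```

A near-equatorial site such as Kourou keeps almost the full 465 m/s, while a mid-latitude site gives up well over 100 m/s of free velocity.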
In principle, advantages of high altitude launch are reduced
vertical distance to travel and a thinner atmosphere for the rocket to
penetrate. However, altitude of the launch site is not a driving factor
in spaceport placement because most of the delta-v for a launch is spent on achieving the required horizontal orbital speed.
The small gain from a few kilometers of extra altitude does not usually offset the logistical costs of ground transport in mountainous terrain.
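To see why altitude is a minor factor, one can compare, in a simplified drag-free sketch, the delta-v equivalent of a few kilometers of launch-site elevation against the horizontal speed needed for low Earth orbit (the 7.8 km/s figure is a commonly used approximation, not from the text above):

```python
import math

g = 9.81            # m/s^2, surface gravity
LEO_SPEED = 7800.0  # m/s, rough horizontal speed for low Earth orbit

def altitude_equivalent_dv(height_m: float) -> float:
    """Speed whose kinetic energy equals the potential energy of the given height."""
    return math.sqrt(2 * g * height_m)

print(f"3 km of elevation ~ {altitude_equivalent_dv(3000):.0f} m/s")            # ~243 m/s
print(f"fraction of LEO speed: {altitude_equivalent_dv(3000) / LEO_SPEED:.1%}")  # ~3%
```

Even a 3 km mountain site is worth only a few percent of the orbital speed the rocket must still supply.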
Many spaceports have been placed at existing military installations, such as intercontinental ballistic missile ranges, which are not always physically ideal sites for launch.
A rocket launch site is built as far as possible away from major
population centers in order to mitigate risk to bystanders should a
rocket experience a catastrophic failure. In many cases a launch site is
built close to major bodies of water to ensure that no components are
shed over populated areas. Typically a spaceport site is large enough
that, should a vehicle explode, it will not endanger human lives or
adjacent launch pads.
Planned sites of spaceports for sub-orbital
tourist spaceflight often make use of existing ground infrastructure,
including runways. The nature of the local view from 100 km (62 mi)
altitude is also a factor to consider.
The establishment of spaceports for tourist trips raises legal issues, which are only beginning to be addressed.
With achieved vertical launches of humans
The
following is a table of spaceports and launch complexes for vertical
launchers with documented achieved launches of humans to space (more
than 100 km (62 mi) altitude). The sorting order is spaceport by
spaceport according to the time of the first human launch.
† Three of the Soyuz missions were uncrewed and are not counted (Soyuz 2, Soyuz 20, Soyuz 34).
‡ STS-51-L (Challenger) failed to reach orbit and is not counted. STS-107 (Columbia) reached orbit and is therefore included in the count (disaster struck on re-entry).
With achieved satellite launches
The
following is a table of spaceports with a documented achieved launch to
orbit. The table is sorted according to the time of the first launch
that achieved satellite orbit insertion. The first column gives the
geographical location. Operations from a different country are indicated
in the fourth column. A launch is counted once even when the payload consists of multiple satellites.
With achieved horizontal launches of humans to 100 km
The
following table shows spaceports with documented achieved launches of
humans to at least 100 km altitude, starting from a horizontal runway.
All the flights were sub-orbital.
Spaceports have been proposed for locations on the Moon, Mars, orbiting the Earth, at Sun-Earth and Earth-Moon Lagrange points, and at other locations in the Solar System. Human-tended outposts on the Moon or Mars, for example, will be spaceports by definition. The 2012 Space Studies Program of the International Space University
studied the economic benefit of a network of spaceports throughout the
solar system beginning from Earth and expanding outwardly in phases,
within its team project Operations And Service Infrastructure for Space
(OASIS). Its analysis claimed that the first phase, placing the "Node 1" spaceport with space tug services in low Earth orbit (LEO), would be commercially profitable and reduce transportation costs to geosynchronous orbit
by as much as 44% (depending on the launch vehicle). The second phase
would add a Node 2 spaceport on the lunar surface to provide services
including lunar ice mining and delivery of rocket propellants back to Node 1. This would enable lunar surface activities and further reduce transportation costs within and out from cislunar space. The third phase would add a Node 3 spaceport on the Martian moon Phobos
to enable refueling and resupply prior to Mars surface landings,
missions beyond Mars, and return trips to Earth. In addition to
propellant mining and refueling, the network of spaceports could provide
services such as power storage and distribution, in-space assembly and
repair of spacecraft, communications relay, shelter, construction and
leasing of infrastructure, maintaining spacecraft positioned for future
use, and logistics.
In evolutionary computation, an initial set of candidate
solutions is generated and iteratively updated. Each new generation is
produced by stochastically removing less desired solutions, and
introducing small random changes. In biological terminology, a population of solutions is subjected to natural selection (or artificial selection) and mutation. As a result, the population will gradually evolve to increase in fitness, in this case the chosen fitness function of the algorithm.
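A minimal sketch of the loop just described (the toy fitness function and parameters are invented for illustration):

```python
import random

def fitness(x: float) -> float:
    """Toy fitness: higher is better, with a peak at x = 3."""
    return -(x - 3.0) ** 2

population = [random.uniform(-10, 10) for _ in range(20)]

for generation in range(100):
    # Selection: stochastically remove less desired solutions
    # (tournament selection: the fitter of two random individuals survives).
    survivors = [max(random.sample(population, 2), key=fitness)
                 for _ in range(len(population))]
    # Mutation: introduce small random changes.
    population = [x + random.gauss(0, 0.1) for x in survivors]

print(max(population, key=fitness))  # converges near 3.0
```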
Evolutionary computation techniques can produce highly optimized
solutions in a wide range of problem settings, making them popular in computer science.
Many variants and extensions exist, suited to more specific families of
problems and data structures. Evolutionary computation is also
sometimes used in evolutionary biology as an in silico experimental procedure to study common aspects of general evolutionary processes.
History
The concept of mimicking evolutionary processes to solve problems originates before the advent of computers, such as when Alan Turing proposed a method of genetic search in 1948. Turing's B-type u-machines resemble primitive neural networks, and connections between neurons were learnt via a sort of genetic algorithm. His P-type u-machines resemble a method for reinforcement learning,
where pleasure and pain signals direct the machine to learn certain
behaviors. However, Turing's paper went unpublished until 1968, and he
died in 1954, so this early work had little to no effect on the field of
evolutionary computation that was to develop.
Evolutionary computing as a field began in earnest in the 1950s and 1960s.
There were several independent attempts to use the process of evolution
in computing at this time, which developed separately for roughly 15
years. Three branches emerged in different places to attain this goal: evolution strategies, evolutionary programming, and genetic algorithms. A fourth branch, genetic programming,
eventually emerged in the early 1990s. These approaches differ in the
method of selection, the permitted mutations, and the representation of
genetic data. By the 1990s, the distinctions between the historic branches had begun to blur, and the term 'evolutionary computing' was coined in 1991 to denote a field spanning all four paradigms.
In 1962, Lawrence J. Fogel initiated research on evolutionary programming in the United States, which was considered an artificial intelligence endeavor. In this system, finite state machines
are used to solve a prediction problem: these machines would be mutated
(adding or deleting states, or changing the state transition rules),
and the best of these mutated machines would be evolved further in
future generations. The final finite state machine may be used to
generate predictions when needed. The evolutionary programming method
was successfully applied to prediction problems, system identification,
and automatic control. It was eventually extended to handle time series
data and to model the evolution of gaming strategies.
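A compact sketch of the idea (the representation and mutation operator here are illustrative; adding and deleting states is omitted for brevity):

```python
import random

# A finite state machine as {state: {input_symbol: (next_state, predicted_output)}}.
STATES, SYMBOLS, OUTPUTS = ["A", "B"], [0, 1], [0, 1]

def mutate(fsm):
    """Change the transition rule for one randomly chosen (state, symbol) pair."""
    child = {s: dict(rules) for s, rules in fsm.items()}
    s, sym = random.choice(STATES), random.choice(SYMBOLS)
    child[s][sym] = (random.choice(STATES), random.choice(OUTPUTS))
    return child

def predict(fsm, start, sequence):
    """Run the machine over an input sequence; its outputs are the predictions."""
    state, predictions = start, []
    for sym in sequence:
        state, out = fsm[state][sym]
        predictions.append(out)
    return predictions

# A random starting machine, mutated once and run on a short input sequence.
fsm = {s: {sym: (random.choice(STATES), random.choice(OUTPUTS)) for sym in SYMBOLS}
       for s in STATES}
print(predict(mutate(fsm), "A", [0, 1, 1, 0]))
```

The best-scoring mutated machines would then be retained as parents for the next generation, as in the selection loop sketched earlier.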
In 1964, Ingo Rechenberg and Hans-Paul Schwefel introduced the paradigm of evolution strategies in Germany. Since traditional gradient descent
techniques produce results that may get stuck in local minima,
Rechenberg and Schwefel proposed that random mutations (applied to all
parameters of some solution vector) may be used to escape these minima.
Child solutions were generated from parent solutions, and the more
successful of the two was kept for future generations. This technique
was first used by the two to successfully solve optimization problems in
fluid dynamics.
Initially, this optimization technique was performed without computers,
instead relying on dice to determine random mutations. By 1965, the
calculations were performed wholly by machine.
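In the spirit of that early two-membered scheme, a (1+1) evolution strategy can be sketched in a few lines (the objective function and step size are invented for illustration):

```python
import random

def loss(v):
    """Toy objective; lower is better, with the optimum at the origin."""
    return sum(x * x for x in v)

parent = [random.uniform(-5, 5) for _ in range(3)]
for _ in range(2000):
    # Random mutation applied to all parameters of the solution vector.
    child = [x + random.gauss(0, 0.1) for x in parent]
    # The more successful of parent and child is kept for the next generation.
    if loss(child) < loss(parent):
        parent = child

print(parent)  # approaches the optimum [0, 0, 0]
```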
John Henry Holland introduced genetic algorithms in the 1960s, and the method was further developed at the University of Michigan in the 1970s.
While the other approaches were focused on solving problems, Holland
primarily aimed to use genetic algorithms to study adaptation and
determine how it may be simulated. Populations of chromosomes,
represented as bit strings, were transformed by an artificial selection
process, selecting for specific 'allele' bits in the bit string. Among
other mutation methods, interactions between chromosomes were used to
simulate the recombination
of DNA between different organisms. While previous methods only tracked
a single optimal organism at a time (having children compete with
parents), Holland's genetic algorithms tracked large populations (having
many organisms compete each generation).
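A sketch of Holland-style chromosomes as bit strings, with one-point recombination between two parents (all parameters and the 'OneMax' toy fitness are invented for illustration):

```python
import random

LENGTH = 16

def fitness(chromosome):
    """Toy 'OneMax' fitness: the count of 1-alleles in the bit string."""
    return sum(chromosome)

def crossover(a, b):
    """Simulate recombination: swap tails at a random cut point."""
    cut = random.randrange(1, LENGTH)
    return a[:cut] + b[cut:]

def mutate(c, rate=0.01):
    """Flip each bit with a small probability."""
    return [bit ^ 1 if random.random() < rate else bit for bit in c]

population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(30)]
for _ in range(50):
    # Artificial selection: the fittest chromosomes become parents.
    parents = sorted(population, key=fitness, reverse=True)[:10]
    population = [mutate(crossover(random.choice(parents), random.choice(parents)))
                  for _ in range(30)]

print(max(fitness(c) for c in population))  # approaches 16
```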
By the 1990s, a new approach to evolutionary computation that came to be called genetic programming emerged, advocated for by John Koza among others. In this class of algorithms, the subject of evolution was itself a program written in a high-level programming language
(there had been some previous attempts as early as 1958 to use machine
code, but they met with little success). For Koza, the programs were Lisp S-expressions,
which can be thought of as trees of sub-expressions. This
representation permits programs to swap subtrees, representing a sort of
genetic mixing. Programs are scored based on how well they complete a
certain task, and the score is used for artificial selection. Sequence
induction, pattern recognition, and planning were all successful
applications of the genetic programming paradigm.
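A minimal sketch of the tree representation and subtree swap (nested tuples stand in for S-expressions here; the operators and demo programs are invented for illustration):

```python
import random

# An expression is "x", a number, or a tuple (op, left, right).
def evaluate(expr, x):
    if expr == "x":
        return x
    if isinstance(expr, (int, float)):
        return expr
    op, a, b = expr
    va, vb = evaluate(a, x), evaluate(b, x)
    return {"+": va + vb, "-": va - vb, "*": va * vb}[op]

def subtrees(expr, path=()):
    """Yield (path, node) for every node in the expression tree."""
    yield path, expr
    if isinstance(expr, tuple):
        for i, child in enumerate(expr[1:], start=1):
            yield from subtrees(child, path + (i,))

def graft(expr, path, donor):
    """Return a copy of expr with the subtree at path replaced by donor."""
    if not path:
        return donor
    parts = list(expr)
    parts[path[0]] = graft(parts[path[0]], path[1:], donor)
    return tuple(parts)

def crossover(a, b):
    """Swap a random subtree of a with a random subtree of b (genetic mixing)."""
    path, _ = random.choice(list(subtrees(a)))
    _, donor = random.choice(list(subtrees(b)))
    return graft(a, path, donor)

parent_a = ("+", ("*", "x", "x"), 1)   # x*x + 1
parent_b = ("-", ("*", 2, "x"), "x")   # 2*x - x
child = crossover(parent_a, parent_b)
print(child, evaluate(child, x=3))
```

Fitness would then be assigned by running each program on a task, with artificial selection applied as before.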
Many other figures played a role in the history of evolutionary
computing, although their work did not always fit into one of the major
historical branches of the field. The earliest computational simulations
of evolution using evolutionary algorithms and artificial life techniques were performed by Nils Aall Barricelli in 1953, with first results published in 1954. Another pioneer in the 1950s was Alex Fraser, who published a series of papers on simulation of artificial selection.
As academic interest grew, dramatic increases in the power of computers
allowed practical applications, including the automatic evolution of
computer programs.
Evolutionary algorithms are now used to solve multi-dimensional
problems more efficiently than software produced by human designers, and
also to optimize the design of systems.
A thorough catalogue with many other recently proposed algorithms has been published in the Evolutionary Computation Bestiary. Many recent algorithms, however, have poor experimental validation.
In this process, there are two main forces that form the basis of evolutionary systems: mutation and recombination (crossover) create the necessary diversity and thereby facilitate novelty, while selection acts as a force increasing quality.
Many aspects of such an evolutionary process are stochastic.
Changed pieces of information due to recombination and mutation are
randomly chosen. On the other hand, selection operators can be either
deterministic, or stochastic. In the latter case, individuals with a
higher fitness have a higher chance to be selected than individuals with a lower fitness, but typically even the weak individuals have a chance to become a parent or to survive.
Genetic algorithms deliver methods to model biological systems and systems biology that are linked to the theory of dynamical systems,
since they are used to predict the future states of the system. This is
just a vivid (but perhaps misleading) way of drawing attention to the
orderly, well-controlled and highly structured character of development
in biology.
However, the use of algorithms and informatics, in particular of computational theory, beyond the analogy to dynamical systems, is also relevant to understand evolution itself.
This view has the merit of recognizing that there is no central
control of development; organisms develop as a result of local
interactions within and between cells. The most promising ideas about
program-development parallels seem to us to be ones that point to an
apparently close analogy between processes within cells, and the
low-level operation of modern computers.
Thus, biological systems are like computational machines that process input information to compute next states, such that biological systems are closer to computation than to classical dynamical systems. Furthermore, following concepts from computational theory, micro processes in biological organisms are fundamentally incomplete and undecidable (in the sense of logical completeness), implying that "there is more than a crude metaphor behind the analogy between cells and computers".
The analogy to computation extends also to the relationship between inheritance systems and biological structure, which is often thought to reveal one of the most pressing problems in explaining the origins of life.
Evolutionary automata, a generalization of Evolutionary Turing machines,
have been introduced in order to investigate more precisely properties
of biological and evolutionary computation. In particular, they make it possible to obtain new results on the expressiveness of evolutionary computation. This confirms the initial result about undecidability of natural evolution and evolutionary algorithms and processes. Evolutionary finite automata, the simplest subclass of evolutionary automata working in terminal mode,
can accept arbitrary languages over a given alphabet, including
non-recursively enumerable (e.g., diagonalization language) and
recursively enumerable but not recursive languages (e.g., language of
the universal Turing machine).
Notable practitioners
The
list of active researchers is naturally dynamic and non-exhaustive. A
network analysis of the community was published in 2007.
An expendable dropsonde used to capture weather data; its telemetry consists of sensors for pressure, temperature, and humidity, and a wireless transmitter to return the captured data to an aircraft.
A saltwater crocodile with a GPS-based satellite transmitter attached to its head for tracking
Telemetry is the in situ collection of measurements or other data at remote points and their automatic transmission to receiving equipment for monitoring. The word is derived from the Greek roots tele, 'remote', and metron, 'measure'. Systems that need external instructions and data to operate require the counterpart of telemetry: telecommand.
Although the term commonly refers to wireless data transfer mechanisms (e.g., using radio, ultrasonic, or infrared systems), it also encompasses data transferred over other media such as a telephone or computer network,
optical link or other wired communications like power line carriers.
Many modern telemetry systems take advantage of the low cost and
ubiquity of GSM networks by using SMS to receive and transmit telemetry data.
A telemeter is a physical device used in telemetry. It consists of a sensor,
a transmission path, and a display, recording, or control device.
Electronic devices are widely used in telemetry and can be wireless or
hard-wired, analog or digital. Other technologies are also possible, such as mechanical, hydraulic and optical.
Telemetry may be commutated to allow the transmission of multiple data streams in a fixed frame.
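As a simple illustration of commutation (the channel names and frame layout are invented for the example), one sample from each data stream is placed into each fixed frame in turn:

```python
# Commutation: interleave several equal-rate channels into fixed frames,
# one word per channel per frame, so they share a single transmission link.
def commutate(channels):
    return [list(frame) for frame in zip(*channels)]

temperature = [20.1, 20.3, 20.2]     # one stream of samples per sensor
pressure    = [101.2, 101.1, 101.3]
humidity    = [45, 46, 44]

for frame in commutate([temperature, pressure, humidity]):
    print(frame)   # e.g. [20.1, 101.2, 45] -- one fixed frame per sampling cycle
```

The receiver, knowing the fixed frame layout, decommutates each word back into its original stream.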
History
The beginnings of industrial telemetry lie in the steam age, although the sensors were not called telemeters at that time. Examples are James Watt's (1736–1819) additions to his steam engines for monitoring from a (near) distance, such as the mercury pressure gauge and the fly-ball governor.
Although the original telemeter referred to a ranging device (the rangefinding telemeter),
by the late 19th century the same term had been in wide use by
electrical engineers applying it to refer to electrically operated devices
measuring many other quantities besides distance (for instance, in the
patent of an "Electric Telemeter Transmitter"). General telemeters included such sensors as the thermocouple (from the work of Thomas Johann Seebeck), the resistance thermometer (by William Siemens based on the work of Humphry Davy), and the electrical strain gauge (based on Lord Kelvin's discovery that conductors under mechanical strain change their resistance) and output devices such as Samuel Morse's telegraph sounder and the relay. In 1889 this led an author in the Institution of Civil Engineers proceedings to suggest that the term for the rangefinder telemeter might be replaced with tacheometer.
In the 1930s use of electrical telemeters grew rapidly. The
electrical strain gauge was widely used in rocket and aviation research
and the radiosonde was invented for meteorological measurements. The advent of World War II gave an impetus to industrial development and henceforth many of these telemeters became commercially viable. Carrying on from rocket research, radio telemetry was used
routinely as space exploration got underway. Spacecraft are in a place
where a physical connection is not possible, leaving radio or other
electromagnetic waves (such as infrared lasers) as the only viable
option for telemetry. During crewed space missions it is used to monitor
not only parameters of the vehicle, but also the health and life
support of the astronauts. During the Cold War telemetry found uses in espionage. US intelligence found that they could monitor the telemetry from Soviet
missile tests by building a telemeter of their own to intercept the
radio signals and hence learn a great deal about Soviet capabilities.
Types of telemeter
Telemeters are the physical devices used in telemetry. Each consists of a sensor, a transmission path, and a display, recording, or control device. Electronic devices are widely used in telemetry and can be wireless or hard-wired, analog or digital. Other technologies are also possible, such as mechanical, hydraulic and optical.
Telemetering information over wire had its origins in the 19th
century. One of the first data-transmission circuits was developed in
1845 between the Russian Tsar's Winter Palace and army headquarters. In 1874, French engineers built a system of weather and snow-depth sensors on Mont Blanc that transmitted real-time information to Paris. In 1901 the American inventor C. Michalke patented the selsyn,
a circuit for sending synchronized rotation information over a
distance. In 1906 a set of seismic stations were built with telemetering
to the Pulkovo Observatory in Russia. In 1912, Commonwealth Edison developed a system of telemetry to monitor electrical loads on its power grid. The Panama Canal (completed 1913–1914) used extensive telemetry systems to monitor locks and water levels.
Wireless telemetry made early appearances in the radiosonde, developed concurrently in 1930 by Robert Bureau in France and Pavel Molchanov in Russia. Molchanov's system modulated temperature and pressure measurements by converting them to wireless Morse code. The German V-2
rocket used a system of primitive multiplexed radio signals called
"Messina" to report four rocket parameters, but it was so unreliable
that Wernher von Braun once claimed it was more useful to watch the rocket through binoculars.
In the US and the USSR, the Messina system was quickly replaced with better systems based, in both cases, on pulse-position modulation (PPM).
Early Soviet missile and space telemetry systems which were developed in
the late 1940s used either PPM (e.g., the Tral telemetry system
developed by OKB-MEI) or pulse-duration modulation
(e.g., the RTS-5 system developed by NII-885). In the United States,
early work employed similar systems, but were later replaced by pulse-code modulation (PCM) (for example, in the Mars probe Mariner 4).
Later Soviet interplanetary probes used redundant radio systems,
transmitting telemetry by PCM on a decimeter band and PPM on a
centimeter band.
Applications
Meteorology
Telemetry has been used by weather balloons for transmitting meteorological data since 1920.
Oil and gas industry
Telemetry
is used to transmit drilling mechanics and formation evaluation
information uphole, in real time, as a well is drilled. These services
are known as Measurement while drilling and Logging while drilling.
Information acquired thousands of feet below ground while drilling is sent through the drilling hole to the surface sensors and the demodulation software. The pressure wave is translated into useful information after digital signal processing and noise filtering. This information is used for formation evaluation, drilling optimization, and geosteering.
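A toy sketch of that decoding step (a moving-average filter and threshold standing in for the real digital signal processing and noise filtering, which are far more sophisticated):

```python
# Illustrative only: recover pulse bits from a noisy pressure trace.
def moving_average(samples, window=5):
    """Simple noise filter: average each sample with its recent neighbours."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def detect_pulses(samples, threshold):
    """Threshold the filtered trace into a bit stream."""
    return [1 if s > threshold else 0 for s in moving_average(samples)]
```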
Motor racing
Telemetry
is a key factor in modern motor racing, allowing race engineers to
interpret data collected during a test or race and use it to properly
tune the car for optimum performance. Systems used in series such as Formula One
have become advanced to the point where the potential lap time of the
car can be calculated, and this time is what the driver is expected to
meet. Examples of measurements on a race car include accelerations (G forces)
in three axes, temperature readings, wheel speed, and suspension
displacement. In Formula One, driver input is also recorded so the team
can assess driver performance and (in case of an accident) the FIA can determine or rule out driver error as a possible cause.
Later developments include two-way telemetry which allows
engineers to update calibrations on the car in real time (even while it
is out on the track). In Formula One, two-way telemetry surfaced in the
early 1990s and consisted of a message display on the dashboard which
the team could update. Its development continued until May 2001, when it
was first allowed on the cars. By 2002, teams were able to change
engine mapping and deactivate engine sensors from the pit while the car
was on the track. For the 2003 season, the FIA banned two-way telemetry from Formula One; however, the technology may be used in other types of racing or on road cars.
One-way telemetry systems have also been applied to radio-controlled (R/C) racing cars to report data from the car's sensors, such as engine RPM, voltage, temperatures, and throttle position.
Transportation
In
the transportation industry, telemetry provides meaningful information
about a vehicle or driver's performance by collecting data from sensors
within the vehicle. This is undertaken for various reasons, ranging from staff compliance monitoring and insurance rating to predictive maintenance.
Telemetry is used to link traffic counter devices to data recorders to measure traffic flows and vehicle lengths and weights.
Telemetry is used by the railway industry for measuring the health of trackage.
This permits optimized and focused predictive and preventative
maintenance. Typically this is done with specialized trains, such as the
New Measurement Train used in the United Kingdom by Network Rail, which can check for track defects, such as problems with gauge, and deformations in the rail. Japan uses similar, but quicker trains, nicknamed Doctor Yellow. Such trains, besides checking the tracks, can also verify whether or not there are any problems with the overhead power supply (catenary), where it's installed. Dedicated rail inspection companies, such as Sperry Rail,
have their own customized rail cars and rail-wheel equipped trucks that
use a variety of methods, including lasers, ultrasound, and induction
(measuring resulting magnetic fields from running electricity into
rails) to find any defects.
Agriculture
Most
activities related to healthy crops and good yields depend on timely
availability of weather and soil data. Therefore, wireless weather
stations play a major role in disease prevention and precision
irrigation. These stations transmit parameters necessary for
decision-making to a base station: air temperature and relative humidity, precipitation and leaf wetness (for disease prediction models), solar radiation and wind speed (to calculate evapotranspiration), and readings from water deficit stress (WDS) leaf sensors and soil moisture sensors (crucial to irrigation decisions).
Because local micro-climates can vary significantly, such data
needs to come from within the crop. Monitoring stations usually transmit
data back by terrestrial radio, although occasionally satellite systems are used. Solar power is often employed to make the station independent of the power grid.
Water management
Telemetry is important in water management, including water quality and stream gauging functions. Major applications include AMR (automatic meter reading), groundwater
monitoring, leak detection in distribution pipelines and equipment
surveillance. Having data available in almost real time allows quick
reactions to events in the field. Telemetry control allows engineers to intervene with assets such as pumps, for example by remotely switching them on or off depending on the circumstances. Watershed telemetry is an effective strategy for implementing a water management system.
Defense, space and resource exploration
Telemetry is used in complex systems such as missiles, RPVs, spacecraft, oil rigs, and chemical plants
since it allows the automatic monitoring, alerting, and record-keeping
necessary for efficient and safe operation. Space agencies such as NASA, ISRO, the European Space Agency (ESA), and other agencies use telemetry and/or telecommand systems to collect data from spacecraft and satellites.
Telemetry is vital in the development of missiles, satellites and
aircraft because the system might be destroyed during or after the
test. Engineers need critical system parameters to analyze (and improve)
the performance of the system. In the absence of telemetry, this data
would often be unavailable.
Space science
Telemetry
is used by crewed or uncrewed spacecraft for data transmission.
Distances of more than 10 billion kilometres have been covered, e.g., by
Voyager 1.
Rocketry
In rocketry, telemetry equipment forms an integral part of the rocket range
assets used to monitor the position and health of a launch vehicle to
determine range safety flight termination criteria (the range's purpose is public safety). Problems include the extreme environment (temperature,
acceleration and vibration), the energy supply, antenna alignment and (at long distances, e.g., in spaceflight) signal travel time.
Flight testing
Today nearly every type of aircraft, missile, or spacecraft carries a wireless telemetry system as it is tested.
Aeronautical mobile telemetry is used for the safety of the pilots and
persons on the ground during flight tests. Telemetry from an on-board flight test instrumentation
system is the primary source of real-time measurement and status
information transmitted during the testing of crewed and uncrewed
aircraft.
Military intelligence
Intercepted telemetry was an important source of intelligence for the United States and UK when Soviet missiles were tested; for this purpose, the United States operated a listening post in Iran.
Eventually, the Russians discovered the United States
intelligence-gathering network and encrypted their missile-test
telemetry signals. Telemetry was also a source for the Soviets, who
operated listening ships in Cardigan Bay to eavesdrop on UK missile tests performed in the area.
Energy monitoring
In factories, buildings and houses, the energy consumption of systems such as HVAC is monitored at multiple locations; related parameters (e.g.,
temperature) are sent via wireless telemetry to a central location. The
information is collected and processed, enabling the most efficient use
of energy. Such systems also facilitate predictive maintenance.
Resource distribution
Many
resources need to be distributed over wide areas. Telemetry is useful
in these cases, since it allows the logistics system to channel
resources where they are needed, as well as provide security for those
assets; principal examples of this are dry goods, fluids, and granular
bulk solids.
Dry goods
Dry goods, such as packaged merchandise, may be remotely monitored, tracked, and inventoried by RFID sensing systems, barcode readers, optical character recognition (OCR) readers, or other sensing devices coupled to telemetry devices, to detect RFID tags, barcode
labels or other identifying markers affixed to the item, its package,
or (for large items and bulk shipments) affixed to its shipping
container or vehicle. This facilitates knowledge of their location, and
can record their status and disposition, as when merchandise with
barcode labels is scanned through a checkout reader at point-of-sale systems in a retail store. Stationary or hand-held barcode or RFID scanners and optical readers with remote communications can be used to expedite inventory tracking and counting in stores, warehouses, shipping terminals, transportation carriers and factories.
Fluids
Fluids
stored in tanks are a principal object of constant commercial telemetry.
This typically includes monitoring of tank farms in gasoline refineries
and chemical plants—and distributed or remote tanks, which must be
replenished when empty (as with gas station storage tanks, home heating
oil tanks, or ag-chemical tanks at farms), or emptied when full (as with
production from oil wells, accumulated waste products, and newly
produced fluids).
Telemetry is used to communicate the variable measurements of flow and
tank level sensors detecting fluid movements and/or volumes by pneumatic, hydrostatic, or differential pressure; tank-confined ultrasonic, radar or Doppler effect echoes; or mechanical or magnetic sensors.
Telemetry of bulk solids is common for tracking and reporting the volume status and condition of grain and livestock feed
bins, powdered or granular food, powders and pellets for manufacturing,
sand and gravel, and other granular bulk solids. While technology
associated with fluid tank monitoring also applies, in part, to granular
bulk solids, reporting of overall container weight, or other gross characteristics and conditions, is sometimes required, owing to bulk solids' more complex and variable physical characteristics.
Medicine/healthcare
Telemetry is used for patients (biotelemetry) who are at risk of abnormal heart activity, generally in a coronary care unit. Telemetry specialists are sometimes used to monitor many patients within a hospital. Such patients are outfitted with measuring, recording and transmitting devices. A data log can be useful in diagnosis of the patient's condition by doctors. An alerting function can alert nurses if the patient is suffering from an acute (or dangerous) condition.
A new and emerging application for telemetry is in the field of neurophysiology, or neurotelemetry. Neurophysiology
is the study of the central and peripheral nervous systems through the
recording of bioelectrical activity, whether spontaneous or stimulated.
In neurotelemetry (NT) the electroencephalogram
(EEG) of a patient is monitored remotely by a registered EEG
technologist using advanced communication software. The goal of
neurotelemetry is to recognize a decline in a patient's condition before
physical signs and symptoms are present.
Neurotelemetry is synonymous with real-time continuous video EEG monitoring
and has application in the epilepsy monitoring unit, neuro ICU,
pediatric ICU and newborn ICU. Due to the labor-intensive nature of continuous EEG monitoring, NT is typically done in larger academic teaching hospitals using in-house programs that include registered EEG technologists, IT support staff, neurologists, neurophysiologists, and monitoring support personnel.
Modern microprocessor speeds, software algorithms and video data
compression allow hospitals to centrally record and monitor continuous
digital EEGs of multiple critically ill patients simultaneously.
Neurotelemetry and continuous EEG monitoring provides dynamic
information about brain function that permits early detection of changes
in neurologic status, which is especially useful when the clinical
examination is limited.
A bumblebee worker with a transponder attached to its back, visiting an oilseed rape flower
Telemetry is used to study wildlife,
and has been useful for monitoring threatened species at the individual
level. Animals under study can be outfitted with instrumentation tags,
which include sensors that measure temperature, diving depth and
duration (for marine animals), speed and location (using GPS or Argos
packages). Telemetry tags can give researchers information about
animal behavior, functions, and their environment. This information is
then either stored (with archival tags) or the tags can send (or
transmit) their information to a satellite or handheld receiving device. Capturing and marking wild animals can put them at some risk, so it is important to minimize these impacts.
Retail
At a 2005 workshop in Las Vegas, a seminar noted the introduction of telemetry equipment which would allow vending machines to communicate sales and inventory data to a route truck or to a headquarters.
This data could be used for a variety of purposes, such as eliminating
the need for drivers to make a first trip to see which items needed to
be restocked before delivering the inventory.
Retailers also use RFID
tags to track inventory and prevent shoplifting. Most of these tags
passively respond to RFID readers (e.g., at the cashier), but active
RFID tags are available which periodically transmit location information
to a base station.
Law enforcement
Telemetry hardware is useful for tracking persons and property in law enforcement. An ankle collar worn by convicts on probation can warn authorities if a person violates the terms of his or her parole, such as by straying from authorized boundaries or visiting an unauthorized location. Telemetry has also enabled bait cars,
where law enforcement can rig a car with cameras and tracking equipment
and leave it somewhere they expect it to be stolen. When stolen the
telemetry equipment reports the location of the vehicle, enabling law
enforcement to deactivate the engine and lock the doors when it is
stopped by responding officers.
Energy providers
In
some countries, telemetry is used to measure the amount of electrical
energy consumed. The electricity meter communicates with a concentrator, and the latter sends the information through GPRS or GSM
to the energy provider's server. Telemetry is also used for the remote
monitoring of substations and their equipment. For data transmission, power line carrier systems operating on frequencies between 30 and 400 kHz are sometimes used.
Falconry
In falconry, "telemetry" means a small radio transmitter carried by a bird of prey that will allow the bird's owner to track it when it is out of sight.
Testing
Telemetry
is used in testing hostile environments which are dangerous to humans.
Examples include munitions storage facilities, radioactive sites,
volcanoes, deep sea, and outer space.
Communications
Telemetry
is used in many battery-operated wireless systems to inform monitoring
personnel when the battery power is reaching a low point and the end
item needs fresh batteries.
Mining
In the
mining industry, telemetry serves two main purposes: the measurement of
key parameters from mining equipment and the monitoring of safety
practices.
The information provided by the collection and analysis of key
parameters allows for root-cause identification of inefficient
operations, unsafe practices and incorrect equipment usage for
maximizing productivity and safety. Further applications of the technology allow for sharing knowledge and best practices across the organization.
Software
In software, telemetry is used to gather data on the use and
performance of applications and application components, e.g. how often
certain features are used, measurements of start-up time and processing
time, hardware, application crashes, and general usage statistics and/or
user behavior. In some cases, very detailed data is reported like
individual window metrics, counts of used features, and individual
function timings.
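A minimal sketch of such a collector (the event names, fields, and endpoint URL are invented for illustration; real telemetry pipelines differ):

```python
import json
import time
import urllib.request

TELEMETRY_URL = "https://telemetry.example.com/v1/events"  # hypothetical endpoint

class Telemetry:
    """Buffers usage events locally and sends them in one batch."""

    def __init__(self, enabled: bool = False):
        self.enabled = enabled   # off by default: opt-in, as discussed below
        self.events = []

    def record(self, name: str, **fields) -> None:
        if self.enabled:
            self.events.append({"event": name, "ts": time.time(), **fields})

    def flush(self) -> None:
        if not (self.enabled and self.events):
            return
        body = json.dumps(self.events).encode("utf-8")
        req = urllib.request.Request(
            TELEMETRY_URL, data=body,
            headers={"Content-Type": "application/json"})
        try:
            urllib.request.urlopen(req, timeout=5)
        except OSError:
            pass   # drop the batch on network failure in this sketch
        self.events.clear()

t = Telemetry(enabled=True)   # enabled only after explicit user consent
t.record("feature_used", feature="export", duration_ms=42)
t.record("app_start", startup_ms=310)
t.flush()
```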
This kind of telemetry can be essential to software developers to
receive data from a wide variety of endpoints that can't possibly all
be tested in-house, as well as getting data on the popularity of certain
features and whether they should be given priority or be considered for
removal. Because software telemetry can easily be used to profile users, raising privacy concerns, telemetry in user software is often a user choice, commonly presented as an opt-in feature (requiring explicit user action to enable it) or as a choice made during the software installation process.
International standards
As
in other telecommunications fields, international standards exist for
telemetry equipment and software. International standards producing
bodies include Consultative Committee for Space Data Systems (CCSDS) for space agencies, Inter-Range Instrumentation Group
(IRIG) for missile ranges, and Telemetering Standards Coordination
Committee (TSCC), an organisation of the International Foundation for
Telemetering.