
Wednesday, November 6, 2024

Rayleigh scattering

From Wikipedia, the free encyclopedia
Rayleigh scattering causes the blue color of the daytime sky and the reddening of the Sun at sunset.

Rayleigh scattering (/ˈreɪli/ RAY-lee) is the scattering or deflection of light, or other electromagnetic radiation, by particles with a size much smaller than the wavelength of the radiation. For light frequencies well below the resonance frequency of the scattering medium (normal dispersion regime), the amount of scattering is inversely proportional to the fourth power of the wavelength (e.g., a blue color is scattered much more than a red color as light propagates through air). The phenomenon is named after the 19th-century British physicist Lord Rayleigh (John William Strutt).

Due to Rayleigh scattering, red and orange colors are more visible during sunset because the blue and violet light has been scattered out of the direct path. The blue light removed from the direct beam is redistributed across the rest of the sky; this same removal of short wavelengths produces dramatically colored sunset skies and the red, nearly monochromatic rainbows sometimes seen at dusk.

Rayleigh scattering results from the electric polarizability of the particles. The oscillating electric field of a light wave acts on the charges within a particle, causing them to move at the same frequency. The particle, therefore, becomes a small radiating dipole whose radiation we see as scattered light. The particles may be individual atoms or molecules; it can occur when light travels through transparent solids and liquids, but is most prominently seen in gases.

Rayleigh scattering of sunlight in Earth's atmosphere causes diffuse sky radiation, which is the reason for the blue color of the daytime and twilight sky, as well as the yellowish to reddish hue of the low Sun. Sunlight is also subject to Raman scattering, which changes the rotational state of the molecules and gives rise to polarization effects.

Scattering by particles with a size comparable to, or larger than, the wavelength of the light is typically treated by the Mie theory, the discrete dipole approximation and other computational techniques. Rayleigh scattering applies to particles that are small with respect to wavelengths of light, and that are optically "soft" (i.e., with a refractive index close to 1). Anomalous diffraction theory applies to optically soft but larger particles.

History

In 1869, while attempting to determine whether any contaminants remained in the purified air he used for infrared experiments, John Tyndall discovered that bright light scattering off nanoscopic particulates was faintly blue-tinted. He conjectured that a similar scattering of sunlight gave the sky its blue hue, but he could not explain the preference for blue light, nor could atmospheric dust explain the intensity of the sky's color.

In 1871, Lord Rayleigh published two papers on the color and polarization of skylight to quantify Tyndall's effect in water droplets in terms of the tiny particulates' volumes and refractive indices. In 1881, with the benefit of James Clerk Maxwell's 1865 proof of the electromagnetic nature of light, he showed that his equations followed from electromagnetism. In 1899, he showed that they applied to individual molecules, with terms containing particulate volumes and refractive indices replaced with terms for molecular polarizability.

Small size parameter approximation

The size of a scattering particle is often parameterized by the ratio

x = 2πr/λ

where r is the particle's radius, λ is the wavelength of the light and x is a dimensionless parameter that characterizes the particle's interaction with the incident radiation such that: Objects with x ≫ 1 act as geometric shapes, scattering light according to their projected area. At the intermediate x ≃ 1 of Mie scattering, interference effects develop through phase variations over the object's surface. Rayleigh scattering applies to the case when the scattering particle is very small (x ≪ 1, with a particle size less than about 1/10 of the wavelength) and the whole surface re-radiates with the same phase. Because the particles are randomly positioned, the scattered light arrives at a particular point with a random collection of phases; it is incoherent, and the resulting intensity is just the sum of the squares of the amplitudes from each particle, therefore proportional to the inverse fourth power of the wavelength and the sixth power of the particle's size. The wavelength dependence is characteristic of dipole scattering, while the volume dependence applies to any scattering mechanism.

In detail, the intensity I of light scattered by any one of the small spheres of radius r and refractive index n from a beam of unpolarized light of wavelength λ and intensity I0 is given by

I = I0 · (1 + cos²θ)/(2R²) · (2π/λ)⁴ · ((n² − 1)/(n² + 2))² · r⁶

where R is the distance to the particle and θ is the scattering angle. Averaging this over all angles gives the Rayleigh scattering cross-section of the particles in air:

σs = (128π⁵/3) · (r⁶/λ⁴) · ((n² − 1)/(n² + 2))²

Here n is the refractive index of the spheres that approximate the molecules of the gas; the index of the gas surrounding the spheres is neglected, an approximation that introduces an error of less than 0.05%.
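
The regime boundaries above can be sketched numerically. A minimal illustration (the 0.18 nm radius for an N₂ molecule is an approximate assumed value, and the regime cutoffs are rough, not sharp):

```python
import math

def size_parameter(radius_m, wavelength_m):
    """Dimensionless size parameter x = 2*pi*r / wavelength."""
    return 2 * math.pi * radius_m / wavelength_m

def regime(x):
    """Rough classification; the transitions are gradual, not sharp."""
    if x < 0.1:
        return "Rayleigh"
    if x < 10:
        return "Mie"
    return "geometric"

# An N2 molecule (radius ~0.18 nm, assumed) in 532 nm green light:
x = size_parameter(0.18e-9, 532e-9)   # ~2e-3, deep in the Rayleigh regime
```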

The fraction of light scattered by scattering particles over a unit travel length (e.g., one meter) is the number of particles per unit volume N times the cross-section. For example, air has a refractive index of 1.0002793 at atmospheric pressure, where there are about 2×10²⁵ molecules per cubic meter; therefore the major constituent of the atmosphere, nitrogen, has a Rayleigh cross-section of 5.1×10⁻³¹ m² at a wavelength of 532 nm (green light). This means that about 10⁻⁵ of the light will be scattered for every meter of travel.
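
These numbers can be reproduced with the molecular form of the cross-section, σ = (24π³/(N²λ⁴))·((n²−1)/(n²+2))². A sketch, assuming a sea-level number density of ≈2.55×10²⁵ m⁻³ (a standard value; the rounder 2×10²⁵ quoted above gives the same order of magnitude):

```python
import math

def rayleigh_cross_section(wavelength, n, N):
    """Rayleigh cross-section per molecule (m^2) from the bulk refractive index."""
    lorentz = (n**2 - 1) / (n**2 + 2)
    return 24 * math.pi**3 / (N**2 * wavelength**4) * lorentz**2

n_air = 1.0002793    # refractive index of air at atmospheric pressure
N = 2.55e25          # molecules per m^3 at sea level (assumed standard value)
lam = 532e-9         # green light, m

sigma = rayleigh_cross_section(lam, n_air, N)   # ~5e-31 m^2
fraction_per_meter = N * sigma                  # ~1e-5, as stated above
```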

The strong wavelength dependence of the scattering (∝ λ⁻⁴) means that shorter (blue) wavelengths are scattered more strongly than longer (red) wavelengths.
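
For concreteness, the λ⁻⁴ law gives the relative strength of blue versus red scattering directly (450 nm and 700 nm are representative wavelengths, chosen for illustration):

```python
# How much more strongly is blue light scattered than red, under the lambda^-4 law?
blue, red = 450e-9, 700e-9
ratio = (red / blue) ** 4   # blue is scattered roughly six times more strongly
```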

From molecules

Figure showing the greater proportion of blue light scattered by the atmosphere relative to red light

The expression above can also be written in terms of individual molecules by expressing the dependence on refractive index in terms of the molecular polarizability α, proportional to the dipole moment induced by the electric field of the light. In this case, the Rayleigh scattering intensity for a single particle is given in CGS units by

I = I0 · (8π⁴α² / (λ⁴R²)) · (1 + cos²θ)

and in SI units (where ε0 is the vacuum permittivity) by

I = I0 · (π²α² / (2ε0²λ⁴R²)) · (1 + cos²θ)

Effect of fluctuations

When the dielectric constant ε of a certain region of volume V is different from the average dielectric constant of the medium ε̄, then any incident light will be scattered according to the following equation:

I = I0 · (π²V²σ_ε² / (2λ⁴R²)) · (1 + cos²θ)

where σ_ε² represents the variance of the fluctuation in the dielectric constant ε.

Cause of the blue color of the sky

Scattered blue light is polarized. The picture on the right is shot through a polarizing filter: the polarizer transmits light that is linearly polarized in a specific direction.

The blue color of the sky is a consequence of three factors:

  • the blackbody spectrum of sunlight coming into the Earth's atmosphere,
  • Rayleigh scattering of that light off oxygen and nitrogen molecules, and
  • the response of the human visual system.

The strong wavelength dependence of the Rayleigh scattering (∝ λ⁻⁴) means that shorter (blue) wavelengths are scattered more strongly than longer (red) wavelengths. This results in the indirect blue and violet light coming from all regions of the sky. The human eye responds to this wavelength combination as if it were a combination of blue and white light.

Some of the scattering can also be from sulfate particles. For years after large Plinian eruptions, the blue cast of the sky is notably brightened by the persistent sulfate load of the stratospheric gases. Some works of the artist J. M. W. Turner may owe their vivid red colours to the eruption of Mount Tambora in his lifetime.

In locations with little light pollution, the moonlit night sky is also blue, because moonlight is reflected sunlight, with a slightly lower color temperature due to the brownish color of the Moon. The moonlit sky is not perceived as blue, however, because at low light levels human vision comes mainly from rod cells that do not produce any color perception (Purkinje effect).

Of sound in amorphous solids

Rayleigh scattering is also an important mechanism of wave scattering in amorphous solids such as glass, and is responsible for acoustic wave damping and phonon damping in glasses and granular matter at low or moderately high temperatures. This is because in glasses at higher temperatures the Rayleigh-type scattering regime is obscured by the anharmonic damping (typically with a ∝ λ⁻² dependence on wavelength), which becomes increasingly important as the temperature rises.

In amorphous solids – glasses – optical fibers

Rayleigh scattering is an important component of the scattering of optical signals in optical fibers. Silica fibers are glasses, disordered materials with microscopic variations of density and refractive index. These give rise to energy losses due to the scattered light, with the following coefficient:

α_scat = (8π³ / (3λ⁴)) · n⁸ p² k Tf β

where n is the refractive index, p is the photoelastic coefficient of the glass, k is the Boltzmann constant, and β is the isothermal compressibility. Tf is a fictive temperature, representing the temperature at which the density fluctuations are "frozen" in the material.
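
Plugging typical silica values into this coefficient reproduces the well-known Rayleigh loss floor of roughly 0.1–0.15 dB/km at 1550 nm. The constants below (n, p, Tf, β) are illustrative literature-style values, assumptions not taken from this article:

```python
import math

# Rayleigh loss coefficient for silica fiber:
# alpha = (8*pi^3 / (3*lambda^4)) * n^8 * p^2 * k * Tf * beta
k = 1.380649e-23   # Boltzmann constant, J/K
n = 1.45           # refractive index of silica (assumed typical value)
p = 0.286          # photoelastic coefficient (assumed typical value)
Tf = 1450.0        # fictive temperature, K (assumed)
beta = 7e-11       # isothermal compressibility, 1/Pa (assumed)
lam = 1550e-9      # telecom wavelength, m

alpha = (8 * math.pi**3 / (3 * lam**4)) * n**8 * p**2 * k * Tf * beta  # 1/m
loss_db_per_km = 10 * math.log10(math.e) * alpha * 1000  # convert Napierian 1/m to dB/km
```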

In porous materials

Rayleigh scattering in opalescent glass: it appears blue from the side, but orange light shines through.

Rayleigh-type λ⁻⁴ scattering can also be exhibited by porous materials. An example is the strong optical scattering by nanoporous materials. The strong contrast in refractive index between pores and solid parts of sintered alumina results in very strong scattering, with light completely changing direction every five micrometers on average. The λ⁻⁴-type scattering is caused by the nanoporous structure (a narrow pore size distribution around ~70 nm) obtained by sintering monodisperse alumina powder.

Tuesday, November 5, 2024

Synthetic data

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Synthetic_data

Synthetic data are artificially generated data rather than produced by real-world events. Typically created using algorithms, synthetic data can be deployed to validate mathematical models and to train machine learning models.

Data generated by a computer simulation can be seen as synthetic data. This encompasses most applications of physical modeling, such as music synthesizers or flight simulators. The output of such systems approximates the real thing, but is fully algorithmically generated.

Synthetic data is used in a variety of fields as a filter for information that would otherwise compromise the confidentiality of particular aspects of the data. In many sensitive applications, datasets theoretically exist but cannot be released to the general public; synthetic data sidesteps the privacy issues that arise from using real consumer information without permission or compensation.

Usefulness

Synthetic data is generated to meet specific needs or certain conditions that may not be found in the original, real data. One of the hurdles in applying up-to-date machine learning approaches to complex scientific tasks is the scarcity of labeled data, a gap effectively bridged by synthetic data that closely replicates real experimental data. This can be useful when designing many systems, from simulations based on theoretical values to database processors, and it helps detect and solve unexpected issues such as information-processing limitations. Synthetic data are often generated to represent the authentic data and allow a baseline to be set. Another benefit of synthetic data is that it protects the privacy and confidentiality of authentic data while still allowing its use in testing systems.

A science article's abstract, quoted below, describes software that generates synthetic data for testing fraud detection systems. "This enables us to create realistic behavior profiles for users and attackers. The data is used to train the fraud detection system itself, thus creating the necessary adaptation of the system to a specific environment." In defense and military contexts, synthetic data is seen as a potentially valuable tool to develop and improve complex AI systems, particularly in contexts where high-quality real-world data is scarce. At the same time, synthetic data together with the testing approach can give the ability to model scenarios that are rare or difficult to capture in real-world data.

History

Scientific modelling of physical systems, which makes it possible to run simulations that estimate, compute, or generate datapoints that have not been observed in actual reality, has a long history that runs concurrent with the history of physics itself. For example, research into synthesis of audio and voice can be traced back to the 1930s and before, driven forward by developments such as the telephone and audio recording. Digitization gave rise to software synthesizers from the 1970s onwards.

In the context of privacy-preserving statistical analysis, the idea of fully synthetic data was introduced by Rubin in 1993. Rubin originally designed this to synthesize the Decennial Census long-form responses for the short-form households. He then released samples that did not include any actual long-form records; in this way he preserved the anonymity of the households. Later that year, the idea of partially synthetic data was created by Little, who used it to synthesize the sensitive values on the public use file.

A 1993 work fitted a statistical model to 60,000 MNIST digits and used it to generate over 1 million examples, which were used to train a LeNet-4 to reach state-of-the-art performance.

In 1994, Fienberg came up with the idea of critical refinement, in which he used a parametric posterior predictive distribution (instead of a Bayes bootstrap) to do the sampling. Later, other important contributors to the development of synthetic data generation were Trivellore Raghunathan, Jerry Reiter, Donald Rubin, John M. Abowd, and Jim Woodcock. Collectively they came up with a solution for how to treat partially synthetic data with missing data. Similarly they came up with the technique of Sequential Regression Multivariate Imputation.

Calculations

Researchers test the framework on synthetic data, which is "the only source of ground truth on which they can objectively assess the performance of their algorithms".

Synthetic data can be generated through the use of random lines having different orientations and starting positions. Datasets can get fairly complicated. A more complicated dataset can be generated with a "synthesizer build": first, use the original data to create a model or equation that fits the data best. This model or equation is called a synthesizer build, and it can be used to generate more data.

Constructing a synthesizer build involves constructing a statistical model. In a linear regression line example, the original data can be plotted, and a best fit linear line can be created from the data. This line is a synthesizer created from the original data. The next step will be generating more synthetic data from the synthesizer build or from this linear line equation. In this way, the new data can be used for studies and research, and it protects the confidentiality of the original data.
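
As a toy illustration of the linear-regression synthesizer build described above, the sketch below fits a least-squares line to made-up "original" data and then samples new synthetic points from the fitted line. All data and names here are hypothetical, invented for illustration:

```python
import random

random.seed(0)
# Made-up "original" data: a noisy line y = 2x + 1.
original = [(x, 2.0 * x + 1.0 + random.gauss(0, 0.5)) for x in range(20)]

# Fit a least-squares line -- this fitted model is the "synthesizer build".
n = len(original)
sx = sum(x for x, _ in original)
sy = sum(y for _, y in original)
sxx = sum(x * x for x, _ in original)
sxy = sum(x * y for x, y in original)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

def synthesize(num_points, noise=0.5):
    """Sample synthetic points from the fitted line, not the original data."""
    return [(x, slope * x + intercept + random.gauss(0, noise))
            for x in (random.uniform(0, 19) for _ in range(num_points))]

synthetic = synthesize(100)   # shareable data that never exposes the originals
```

The synthetic points preserve the statistical relationship (slope near 2, intercept near 1) without containing any original record.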

David Jensen from the Knowledge Discovery Laboratory explains how to generate synthetic data: "Researchers frequently need to explore the effects of certain data characteristics on their data model." To help construct datasets exhibiting specific properties, such as auto-correlation or degree disparity, Proximity can generate synthetic data having one of several types of graph structure: random graphs generated by some random process; lattice graphs having a ring structure; lattice graphs having a grid structure, etc. In all cases, the data generation follows the same two steps:

  1. Generate the empty graph structure.
  2. Generate attribute values based on user-supplied prior probabilities.

Since the attribute values of one object may depend on the attribute values of related objects, the attribute generation process assigns values collectively.

Applications

Fraud detection and confidentiality systems

Testing and training fraud detection and confidentiality systems are devised using synthetic data. Specific algorithms and generators are designed to create realistic data, which then assists in teaching a system how to react to certain situations or criteria. For example, intrusion detection software is tested using synthetic data. This data is a representation of the authentic data and may include intrusion instances that are not found in the authentic data. The synthetic data allows the software to recognize these situations and react accordingly. If synthetic data was not used, the software would only be trained to react to the situations provided by the authentic data and it may not recognize another type of intrusion.

Scientific research

Researchers doing clinical trials or any other research may generate synthetic data to aid in creating a baseline for future studies and testing.

Real data can contain information that researchers may not want released, so synthetic data is sometimes used to protect the privacy and confidentiality of a dataset. Using synthetic data reduces confidentiality and privacy issues since it holds no personal information and cannot be traced back to any individual.

Machine learning

Synthetic data is increasingly being used for machine learning applications: a model is trained on a synthetically generated dataset with the intention of transfer learning to real data. Efforts have been made to enable more data science experiments via the construction of general-purpose synthetic data generators, such as the Synthetic Data Vault. In general, synthetic data has several natural advantages:

  • once the synthetic environment is ready, it is fast and cheap to produce as much data as needed;
  • synthetic data can have perfectly accurate labels, including labeling that may be very expensive or impossible to obtain by hand;
  • the synthetic environment can be modified to improve the model and training;
  • synthetic data can be used as a substitute for certain real data segments that contain, e.g., sensitive information.

This usage of synthetic data has been proposed for computer vision applications, in particular object detection, where the synthetic environment is a 3D model of the object, and learning to navigate environments by visual information.

At the same time, transfer learning remains a nontrivial problem, and synthetic data has not become ubiquitous yet. Research results indicate that adding a small amount of real data significantly improves transfer learning with synthetic data. Advances in generative adversarial networks (GANs) led to the natural idea that one can produce synthetic data and then use it for training. Since at least 2016, such adversarial training has been successfully used to produce synthetic data of sufficient quality to achieve state-of-the-art results in some domains, without even needing to re-mix real data in with the generated synthetic data.

Examples

In 1987, a Navlab autonomous vehicle used 1200 synthetic road images as one approach to training.

In 2021, Microsoft released a database of 100,000 synthetic faces, based on 500 real faces, that it claims "match real data in accuracy".

Optical depth

From Wikipedia, the free encyclopedia
Aerosol Optical Depth (AOD) at 830 nm measured with the same LED sun photometer from 1990 to 2016 at Geronimo Creek Observatory, Texas. Measurements made at or near solar noon when the Sun is not obstructed by clouds. Peaks indicate smoke, dust and smog. Saharan dust events are measured each summer.

In physics, optical depth or optical thickness is the natural logarithm of the ratio of incident to transmitted radiant power through a material. Thus, the larger the optical depth, the smaller the amount of transmitted radiant power through the material. Spectral optical depth or spectral optical thickness is the natural logarithm of the ratio of incident to transmitted spectral radiant power through a material. Optical depth is dimensionless, and in particular is not a length, though it is a monotonically increasing function of optical path length, and approaches zero as the path length approaches zero. The use of the term "optical density" for optical depth is discouraged.

In chemistry, a closely related quantity called "absorbance" or "decadic absorbance" is used instead of optical depth: the common logarithm of the ratio of incident to transmitted radiant power through a material. It is the optical depth divided by ln 10, because of the different logarithm bases used.
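
These two conventions differ only in logarithm base, so A = τ / ln 10. A minimal sketch of the conversion:

```python
import math

def optical_depth(power_in, power_out):
    """Natural log of incident over transmitted radiant power."""
    return math.log(power_in / power_out)

def absorbance(power_in, power_out):
    """Same ratio in base 10 (decadic absorbance)."""
    return math.log10(power_in / power_out)

# If only 10% of the power gets through:
tau = optical_depth(100.0, 10.0)   # ln 10, about 2.303
A = absorbance(100.0, 10.0)        # exactly 1.0
```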

Mathematical definitions

Optical depth

Optical depth of a material, denoted τ, is given by:

τ = ln(Φe^i / Φe^t) = −ln T

where

  • Φe^i is the radiant power received by that material;
  • Φe^t is the radiant power transmitted by that material;
  • T = Φe^t/Φe^i is the transmittance of that material.

The absorbance A is related to optical depth by:

A = τ / ln 10

Spectral optical depth

Spectral optical depth in frequency and spectral optical depth in wavelength of a material, denoted τν and τλ respectively, are given by:

τν = ln(Φe,ν^i / Φe,ν^t) = −ln Tν
τλ = ln(Φe,λ^i / Φe,λ^t) = −ln Tλ

where Φe,ν^i and Φe,λ^i are the spectral radiant powers received, Φe,ν^t and Φe,λ^t are the spectral radiant powers transmitted, and Tν and Tλ are the spectral transmittances.

Spectral absorbance is related to spectral optical depth by:

Aν = τν / ln 10
Aλ = τλ / ln 10

where

  • Aν is the spectral absorbance in frequency;
  • Aλ is the spectral absorbance in wavelength.

Relationship with attenuation

Attenuation

Optical depth measures the attenuation of the transmitted radiant power in a material. Attenuation can be caused by absorption, but also reflection, scattering, and other physical processes. Optical depth of a material is approximately equal to its attenuation when both the absorbance is much less than 1 and the emittance of that material (not to be confused with radiant exitance or emissivity) is much less than the optical depth:

Φe^t + Φe^att = Φe^i + Φe^e

where

  • Φe^t is the radiant power transmitted by that material;
  • Φe^att is the radiant power attenuated by that material;
  • Φe^i is the radiant power received by that material;
  • Φe^e is the radiant power emitted by that material;
  • T = Φe^t/Φe^i is the transmittance of that material;
  • ATT = Φe^att/Φe^i is the attenuation of that material;
  • E = Φe^e/Φe^i is the emittance of that material,

so that T + ATT = 1 + E, and according to the Beer–Lambert law, T = e^(−τ), so:

ATT = 1 − e^(−τ) + E ≈ τ when τ ≪ 1 and E ≪ τ

Attenuation coefficient

Optical depth of a material is also related to its attenuation coefficient by:

τ = ∫₀ˡ α(z) dz

where

  • l is the thickness of that material through which the light travels;
  • α(z) is the attenuation coefficient or Napierian attenuation coefficient of that material at z,

and if α(z) is uniform along the path, the attenuation is said to be a linear attenuation and the relation becomes:

τ = αl

Sometimes the relation is given using the attenuation cross-section of the material, that is, its attenuation coefficient divided by its number density:

τ = ∫₀ˡ σ n(z) dz

where

  • σ is the attenuation cross-section of that material;
  • n(z) is the number density of that material at z,

and if n(z) is uniform along the path, i.e., n(z) = n, the relation becomes:

τ = σnl
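
The integral relation above lends itself to a direct numerical check; the sketch below integrates an attenuation profile with a midpoint sum and recovers both the uniform-path result τ = αl and the transmittance T = e^(−τ). The profile here is arbitrary, chosen only for illustration:

```python
import math

def optical_depth(alpha, length, steps=10000):
    """Midpoint-rule approximation of tau = integral of alpha(z) dz over [0, length]."""
    dz = length / steps
    return sum(alpha((i + 0.5) * dz) for i in range(steps)) * dz

def alpha_uniform(z):
    return 0.5   # attenuation coefficient, 1/m (illustrative value)

tau = optical_depth(alpha_uniform, 2.0)  # uniform case: tau = alpha * l = 1.0
T = math.exp(-tau)                       # Beer-Lambert transmittance, ~0.368
```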

Applications

Atomic physics

In atomic physics, the spectral optical depth of a cloud of atoms can be calculated from the quantum-mechanical properties of the atoms.

Atmospheric sciences

In atmospheric sciences, one often refers to the optical depth of the atmosphere as corresponding to the vertical path from Earth's surface to outer space; at other times the optical path is from the observer's altitude to outer space. The optical depth for a slant path is τ = mτ′, where τ′ refers to a vertical path, m is called the relative airmass, and for a plane-parallel atmosphere it is determined as m = sec θ, where θ is the zenith angle corresponding to the given path. Therefore, τ = τ′ sec θ. The optical depth of the atmosphere can be divided into several components, ascribed to Rayleigh scattering, aerosols, and gaseous absorption. The optical depth of the atmosphere can be measured with a Sun photometer.

The optical depth with respect to the height z within the atmosphere is given by

τ(z) = ka w₁ ρ₀ H e^(−z/H)

and it follows that the total atmospheric optical depth is given by

τ = ka w₁ ρ₀ H

In both equations:

  • ka is the absorption coefficient
  • w1 is the mixing ratio
  • ρ0 is the density of air at sea level
  • H is the scale height of the atmosphere
  • z is the height in question
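
The exponential profile and the plane-parallel slant-path scaling can be combined in a short sketch. The total vertical depth and scale height below are illustrative assumed values, not measurements:

```python
import math

tau_total = 0.3   # total vertical optical depth (assumed illustrative value)
H = 8000.0        # atmospheric scale height, m (typical assumed value)

def tau_above(z):
    """Vertical optical depth from height z to space: tau(z) = tau_total * exp(-z/H)."""
    return tau_total * math.exp(-z / H)

def tau_slant(z, zenith_deg):
    """Slant-path depth for a plane-parallel atmosphere: m = sec(theta)."""
    m = 1.0 / math.cos(math.radians(zenith_deg))  # relative airmass
    return m * tau_above(z)

tau_60 = tau_slant(0.0, 60.0)   # sec(60 deg) = 2, so twice the vertical depth
```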

The optical depth of a plane parallel cloud layer is given by

τ = Qe [ (9π L² H N) / (16 ρl²) ]^(1/3)

where:

  • Qe is the extinction efficiency
  • L is the liquid water path
  • H is the geometrical thickness
  • N is the concentration of droplets
  • ρl is the density of liquid water

So, with a fixed depth and total liquid water path, τ ∝ N^(1/3).

Astronomy

In astronomy, the photosphere of a star is defined as the surface where its optical depth is 2/3. This means that each photon emitted at the photosphere suffers an average of less than one scattering before it reaches the observer. At the temperature at optical depth 2/3, the energy emitted by the star (the original derivation is for the Sun) matches the observed total energy emitted.

Note that the optical depth of a given medium will be different for different colors (wavelengths) of light.

For planetary rings, the optical depth is the negative logarithm of the fraction of light transmitted through the ring when it lies between the source and the observer. This is usually obtained by observation of stellar occultations.
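
The occultation measurement amounts to a one-line computation: if a fraction T of the starlight gets through the ring, then τ = −ln T. A sketch (0.5 is an arbitrary example value, not a measured transmission):

```python
import math

def ring_optical_depth(transmitted_fraction):
    """Normal optical depth from the fraction of starlight transmitted during occultation."""
    return -math.log(transmitted_fraction)

tau = ring_optical_depth(0.5)   # half the light transmitted -> tau = ln 2, about 0.69
```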

Representation of a Lie group

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Representation_of_a_Lie_group...