
Saturday, December 27, 2025

Nanotechnology

From Wikipedia, the free encyclopedia
Fullerene nanogears

Nanotechnology is the manipulation of matter with at least one dimension sized from 1 to 100 nanometers (nm). At this scale, commonly known as the nanoscale, surface area and quantum mechanical effects become important in describing properties of matter. This definition of nanotechnology includes all types of research and technologies that deal with these special properties. It is common to see the plural form "nanotechnologies" as well as "nanoscale technologies" to refer to research and applications whose common trait is scale. An earlier understanding of nanotechnology referred to the particular technological goal of precisely manipulating atoms and molecules for fabricating macroscale products, now referred to as molecular nanotechnology.

Nanotechnology defined by scale includes fields of science such as surface science, organic chemistry, molecular biology, semiconductor physics, energy storage, engineering, microfabrication, and molecular engineering. The associated research and applications range from extensions of conventional device physics to molecular self-assembly, from developing new materials with dimensions on the nanoscale to direct control of matter on the atomic scale.

Nanotechnology may be able to create new materials and devices with diverse applications, such as in nanomedicine, nanoelectronics, the agricultural sector, biomaterials, energy production, and consumer products. However, nanotechnology raises issues, including concerns about the toxicity and environmental impact of nanomaterials, and their potential effects on global economics, as well as various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

Origins

The concepts that seeded nanotechnology were first discussed in 1959 by physicist Richard Feynman in his talk There's Plenty of Room at the Bottom, in which he described the possibility of synthesis via direct manipulation of atoms.

Comparison of nanomaterials sizes

The term "nano-technology" was first used by Norio Taniguchi in 1974, though it was not widely known. Inspired by Feynman's concepts, K. Eric Drexler used the term "nanotechnology" in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which achieved popular success and helped thrust nanotechnology into the public sphere. In it he proposed the idea of a nanoscale "assembler" that would be able to build a copy of itself and of other items of arbitrary complexity with atom-level control. Also in 1986, Drexler co-founded The Foresight Institute to increase public awareness and understanding of nanotechnology concepts and implications.

The emergence of nanotechnology as a field in the 1980s occurred through the convergence of Drexler's theoretical and public work, which developed and popularized a conceptual framework, and experimental advances that drew additional attention to the prospects. In the 1980s, two breakthroughs helped to spark the growth of nanotechnology. First, the invention of the scanning tunneling microscope in 1981 enabled visualization of individual atoms and bonds, and was successfully used to manipulate individual atoms in 1989. The microscope's developers Gerd Binnig and Heinrich Rohrer at IBM Zurich Research Laboratory received a Nobel Prize in Physics in 1986. Binnig, Quate and Gerber also invented the analogous atomic force microscope that year.

Buckminsterfullerene C60, also known as the buckyball, is a representative member of the carbon structures known as fullerenes. Members of the fullerene family are a major subject of research falling under the nanotechnology umbrella.
Harry Kroto (top) won the 1996 Nobel Prize in Chemistry along with Richard Smalley and Robert Curl for their 1985 discovery of buckminsterfullerene, while Sumio Iijima (middle) won the inaugural 2008 Kavli Prize in Nanoscience for his 1991 discovery of carbon nanotubes.

Second, fullerenes (buckyballs) were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry. C60 was not initially described as nanotechnology; the term was used regarding subsequent work with related carbon nanotubes (sometimes called graphene tubes or Bucky tubes) which suggested potential applications for nanoscale electronics and devices. The discovery of carbon nanotubes is attributed to Sumio Iijima of NEC in 1991, for which Iijima won the inaugural 2008 Kavli Prize in Nanoscience.

In the early 2000s, the field garnered increased scientific, political, and commercial attention that led to both controversy and progress. Controversies emerged regarding the definitions and potential implications of nanotechnologies, exemplified by the Royal Society's report on nanotechnology. Challenges were raised regarding the feasibility of applications envisioned by advocates of molecular nanotechnology, which culminated in a public debate between Drexler and Smalley in 2001 and 2003.

Meanwhile, commercial products based on advancements in nanoscale technologies began emerging. These products were limited to bulk applications of nanomaterials and did not involve atomic control of matter. Some examples include the Silver Nano platform for using silver nanoparticles as an antibacterial agent, nanoparticle-based sunscreens, carbon fiber strengthening using silica nanoparticles, and carbon nanotubes for stain-resistant textiles.

Governments moved to promote and fund research into nanotechnology, such as in the United States through the National Nanotechnology Initiative, which formalized a size-based definition of nanotechnology and established research funding, and in Europe via the European Framework Programmes for Research and Technological Development.

By the mid-2000s scientific attention began to flourish. Nanotechnology roadmaps centered on atomically precise manipulation of matter and discussed existing and projected capabilities, goals, and applications.

Fundamental concepts

Nanotechnology is the science and engineering of functional systems at the molecular scale. In its original sense, nanotechnology refers to the projected ability to construct items from the bottom up to make complete, high-performance products.

One nanometer (nm) is one billionth, or 10−9, of a meter. By comparison, typical carbon–carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12–0.15 nm, and DNA's diameter is around 2 nm. On the other hand, the smallest cellular life forms, the bacteria of the genus Mycoplasma, are around 200 nm in length. By convention, nanotechnology is taken as the scale range 1 to 100 nm, following the definition used by the American National Nanotechnology Initiative. The lower limit is set by the size of atoms (hydrogen has the smallest atoms, which have an approximately 0.25 nm kinetic diameter). The upper limit is more or less arbitrary, but is around the size below which phenomena not observed in larger structures start to become apparent and can be made use of. These phenomena make nanotechnology distinct from devices that are merely miniaturized versions of an equivalent macroscopic device; such devices are on a larger scale and come under the description of microtechnology.

To put that scale in another context, the comparative size of a nanometer to a meter is the same as that of a marble to the size of the earth.
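A quick back-of-the-envelope check of these comparisons in Python (the marble diameter below is an assumed illustrative value, not from the text):

```python
# Quick arithmetic check of the scale comparisons above. The marble and Earth
# diameters are illustrative assumptions, not values from the article.
nm = 1e-9                      # one nanometer in meters
marble_d = 0.0127              # assumed marble diameter: ~1.27 cm, in meters
earth_d = 1.2742e7             # mean Earth diameter, in meters

print(f"1 nm / 1 m          = {nm:.1e}")
print(f"marble / Earth      = {marble_d / earth_d:.1e}")   # ~1e-9, same order
print(f"C-C bond (0.15 nm)  = {0.15e-9 / nm:.2f} nm")
print(f"Mycoplasma (200 nm) = {200 * nm:.1e} m")
```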

Two main approaches are used in nanotechnology. In the "bottom-up" approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition. In the "top-down" approach, nano-objects are constructed from larger entities without atomic-level control.

Areas of physics such as nanoelectronics, nanomechanics, nanophotonics and nanoionics have evolved to provide nanotechnology's scientific foundation.

Larger to smaller: a materials perspective

Image of reconstruction on a clean Gold(100) surface, as visualized using scanning tunneling microscopy. The positions of the individual atoms composing the surface are visible.

Several phenomena become pronounced as system size decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the "quantum size effect" in which the electronic properties of solids are altered as particle size is greatly reduced. Such effects do not apply at macro or micro dimensions; however, quantum effects can become significant at nanometer scales. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change compared with macroscopic systems. One example is the increase in surface area to volume ratio, which alters the mechanical, thermal, and catalytic properties of materials. Diffusion and reactions at the nanoscale can be different as well. Systems with fast ion transport are referred to as nanoionics. The mechanical properties of nanosystems are also of interest in research.
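For a sphere, the surface-area-to-volume ratio is 3/r, so it grows rapidly as particles shrink toward the nanoscale; the short sketch below (illustrative radii only) makes the point numerically:

```python
import numpy as np

# Surface-area-to-volume ratio of a sphere is 3/r, so it grows as particles
# shrink toward the nanoscale -- one reason nanomaterial properties diverge
# from bulk behavior. The radii below are illustrative choices.
radii_m = np.array([1e-2, 1e-6, 1e-8, 1e-9])   # 1 cm, 1 um, 10 nm, 1 nm
sa_over_v = 3.0 / radii_m                      # in 1/m

for r, ratio in zip(radii_m, sa_over_v):
    print(f"r = {r:.0e} m  ->  SA/V = {ratio:.1e} m^-1")
```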

Simple to complex: a molecular perspective

Modern synthetic chemistry can prepare small molecules of almost any structure. These methods are used to manufacture a wide variety of useful chemicals such as pharmaceuticals or commercial polymers. This ability raises the question of extending this kind of control to the next-larger level, seeking methods to assemble single molecules into supramolecular assemblies consisting of many molecules arranged in a well-defined manner.

These approaches utilize the concepts of molecular self-assembly and/or supramolecular chemistry to automatically arrange themselves into a useful conformation through a bottom-up approach. The concept of molecular recognition is important: molecules can be designed so that a specific configuration or arrangement is favored due to non-covalent intermolecular forces. The Watson–Crick basepairing rules are a direct result of this, as is the specificity of an enzyme targeting a single substrate, or the specific folding of a protein. Thus, components can be designed to be complementary and mutually attractive so that they make a more complex and useful whole.

Such bottom-up approaches should be capable of producing devices in parallel and be much cheaper than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases. Most useful structures require complex and thermodynamically unlikely arrangements of atoms. Nevertheless, many examples of self-assembly based on molecular recognition exist in biology, most notably Watson–Crick basepairing and enzyme-substrate interactions.
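As a toy illustration of molecular recognition by Watson–Crick basepairing, the sketch below treats pairing as exact complementarity (a deliberate simplification; real hybridization depends on more than this rule):

```python
# Minimal illustration of Watson-Crick pairing (A-T, G-C) as a molecular
# recognition rule: a strand "self-assembles" only with its complement.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand: str) -> str:
    """Return the Watson-Crick complement of a strand."""
    return "".join(PAIR[base] for base in strand)

def binds(a: str, b: str) -> bool:
    """True if every base in a pairs with the corresponding base in b."""
    return len(a) == len(b) and all(PAIR[x] == y for x, y in zip(a, b))

print(complement("ATGC"))          # TACG
print(binds("ATGC", "TACG"))       # True  -- complementary, assembly favored
print(binds("ATGC", "TAGG"))       # False -- mismatch, assembly disfavored
```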

Molecular nanotechnology: a long-term view

A ribosome translating mRNA is a biological machine functioning as a molecular assembler. Protein domain dynamics can now be seen by neutron spin echo spectroscopy

Molecular nanotechnology, sometimes called molecular manufacturing, concerns engineered nanosystems (nanoscale machines) operating on the molecular scale. Molecular nanotechnology is especially associated with molecular assemblers, machines that can produce a desired structure or device atom-by-atom using the principles of mechanosynthesis. Manufacturing in the context of productive nanosystems is not related to conventional technologies used to manufacture nanomaterials such as carbon nanotubes and nanoparticles.

When Drexler independently coined and popularized the term "nanotechnology", he envisioned manufacturing technology based on molecular machine systems. The premise was that molecular-scale biological analogies of traditional machine components demonstrated molecular machines were possible: biology was full of examples of sophisticated, stochastically optimized biological machines.

Drexler and other researchers have proposed that advanced nanotechnology ultimately could be based on mechanical engineering principles, namely, a manufacturing technology based on the mechanical functionality of these components (such as gears, bearings, motors, and structural members) that would enable programmable, positional assembly to atomic specification. The physics and engineering performance of exemplar designs were analyzed in Drexler's book Nanosystems: Molecular Machinery, Manufacturing, and Computation.

In general, assembling devices on the atomic scale requires positioning atoms on other atoms of comparable size and stickiness. Carlo Montemagno's view is that future nanosystems will be hybrids of silicon technology and biological molecular machines. Richard Smalley argued that mechanosynthesis was impossible due to difficulties in mechanically manipulating individual molecules.

This led to an exchange of letters in the American Chemical Society publication Chemical & Engineering News in 2003. Though biology clearly demonstrates that molecular machines are possible, non-biological molecular machines remained in their infancy. Alex Zettl and colleagues at Lawrence Berkeley Laboratories and UC Berkeley constructed at least three molecular devices whose motion is controlled via changing voltage: a nanotube nanomotor, a molecular actuator, and a nanoelectromechanical relaxation oscillator.

Ho and Lee at Cornell University in 1999 used a scanning tunneling microscope to move an individual carbon monoxide molecule (CO) to an individual iron atom (Fe) sitting on a flat silver crystal and chemically bound the CO to the Fe by applying a voltage.

Research

Graphical representation of a rotaxane, useful as a molecular switch
This DNA tetrahedron is an artificially designed nanostructure of the type made in the field of DNA nanotechnology. Each edge of the tetrahedron is a 20 base pair DNA double helix, and each vertex is a three-arm junction.
Rotating view of C60, one kind of fullerene
This device transfers energy from nano-thin layers of quantum wells to nanocrystals above them, causing the nanocrystals to emit visible light.

Nanomaterials

Many areas of science develop or study materials having unique properties arising from their nanoscale dimensions.

Bottom-up approaches

The bottom-up approach seeks to arrange smaller components into more complex assemblies.

  • DNA nanotechnology utilizes Watson–Crick basepairing to construct well-defined structures out of DNA and other nucleic acids.
  • Approaches from the field of "classical" chemical synthesis (inorganic and organic synthesis) aim at designing molecules with well-defined shape (e.g. bis-peptides).
  • More generally, molecular self-assembly seeks to use concepts of supramolecular chemistry, and molecular recognition in particular, to cause single-molecule components to automatically arrange themselves into some useful conformation.
  • Atomic force microscope tips can be used as a nanoscale "write head" to deposit a chemical upon a surface in a desired pattern in a process called dip-pen nanolithography. This technique fits into the larger subfield of nanolithography.
  • Molecular-beam epitaxy allows for bottom-up assemblies of materials, most notably semiconductor materials commonly used in chip and computing applications, stacks, gating, and nanowire lasers.

Top-down approaches

These seek to create smaller devices by using larger ones to direct their assembly.

Functional approaches

Functional approaches seek to develop useful components without regard to how they might be assembled.

Biomimetic approaches

Speculative

These subfields seek to anticipate what inventions nanotechnology might yield, or attempt to propose an agenda along which inquiry could progress. These often take a big-picture view, with more emphasis on societal implications than engineering details.

  • Molecular nanotechnology is a proposed approach that involves manipulating single molecules in finely controlled, deterministic ways. This is more theoretical than the other subfields, and many of its proposed techniques are beyond current capabilities.
  • Nanorobotics considers self-sufficient machines operating at the nanoscale. There are hopes for applying nanorobots in medicine. Nevertheless, progress on innovative materials and patented methodologies has been demonstrated.
  • Productive nanosystems are "systems of nanosystems" that could produce atomically precise parts for other nanosystems, not necessarily using novel nanoscale-emergent properties, but well-understood fundamentals of manufacturing. Because of the discrete (i.e. atomic) nature of matter and the possibility of exponential growth, this stage could form the basis of another industrial revolution. Mihail Roco proposed four states of nanotechnology that seem to parallel the technical progress of the Industrial Revolution, progressing from passive nanostructures to active nanodevices to complex nanomachines and ultimately to productive nanosystems.
  • Programmable matter seeks to design materials whose properties can be easily, reversibly and externally controlled through a fusion of information science and materials science.
  • Due to the popularity and media exposure of the term nanotechnology, the words picotechnology and femtotechnology have been coined in analogy to it, although these are used only informally.

Dimensionality in nanomaterials

Nanomaterials can be classified into 0D, 1D, 2D and 3D nanomaterials. Dimensionality plays a major role in determining the characteristics of nanomaterials, including their physical, chemical, and biological characteristics. With the decrease in dimensionality, an increase in surface-to-volume ratio is observed. This indicates that lower-dimensional nanomaterials have higher surface area compared to 3D nanomaterials. Two-dimensional (2D) nanomaterials have been extensively investigated for electronic, biomedical, drug delivery and biosensor applications.
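A minimal sketch of this classification scheme, assuming a 100 nm nanoscale cut-off and treating dimensionality as the number of dimensions that extend beyond it (the example sizes are invented):

```python
# Toy classifier for the 0D/1D/2D/3D scheme described above: a material's
# dimensionality is taken as the number of its dimensions that extend beyond
# the nanoscale (here 100 nm). Example sizes are illustrative only.
NANOSCALE_MAX_NM = 100.0

def dimensionality(dims_nm):
    """dims_nm: (length, width, height) in nanometers."""
    return sum(1 for d in dims_nm if d > NANOSCALE_MAX_NM)

print(dimensionality((5, 5, 5)))             # 0 -> 0D, e.g. a quantum dot
print(dimensionality((10_000, 20, 20)))      # 1 -> 1D, e.g. a nanowire
print(dimensionality((10_000, 10_000, 1)))   # 2 -> 2D, e.g. a graphene-like sheet
print(dimensionality((1e6, 1e6, 1e6)))       # 3 -> 3D, bulk material
```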

Tools and techniques

Typical AFM setup. A microfabricated cantilever with a sharp tip is deflected by features on a sample surface, much like in a phonograph but on a much smaller scale. A laser beam reflects off the backside of the cantilever into a set of photodetectors, allowing the deflection to be measured and assembled into an image of the surface.

Scanning microscopes

The atomic force microscope (AFM) and the scanning tunneling microscope (STM) are two versions of scanning probes that are used for nano-scale observation. These and other types of scanning probe microscopy achieve much higher resolution than optical or acoustic microscopes, since they are not limited by the wavelengths of sound or light.

The tip of a scanning probe can also be used to manipulate nanostructures (positional assembly). Feature-oriented scanning may be a promising way to implement these nano-scale manipulations via an automatic algorithm. However, this is still a slow process because of the low scanning velocity of the microscope.

The top-down approach anticipates nanodevices that must be built piece by piece in stages, much as manufactured items are made. Scanning probe microscopy is an important technique both for characterization and synthesis. Atomic force microscopes and scanning tunneling microscopes can be used to look at surfaces and to move atoms around. By designing different tips for these microscopes, they can be used for carving out structures on surfaces and to help guide self-assembling structures. By using, for example, feature-oriented scanning approach, atoms or molecules can be moved around on a surface with scanning probe microscopy techniques.

Lithography

Various techniques of lithography, such as optical lithography, X-ray lithography, dip pen lithography, electron beam lithography or nanoimprint lithography offer top-down fabrication techniques where a bulk material is reduced to a nano-scale pattern.

Another group of nano-technological techniques include those used for fabrication of nanotubes and nanowires, those used in semiconductor fabrication such as deep ultraviolet lithography, electron beam lithography, focused ion beam machining, nanoimprint lithography, atomic layer deposition, and molecular vapor deposition, and further including molecular self-assembly techniques such as those employing di-block copolymers.

Bottom-up

In contrast, bottom-up techniques build or grow larger structures atom by atom or molecule by molecule. These techniques include chemical synthesis, self-assembly and positional assembly. Dual-polarization interferometry is one tool suitable for characterization of self-assembled thin films. Another variation of the bottom-up approach is molecular-beam epitaxy, or MBE. Researchers at Bell Telephone Laboratories including John R. Arthur, Alfred Y. Cho, and Art C. Gossard developed and implemented MBE as a research tool in the late 1960s and 1970s. Samples made by MBE were key to the discovery of the fractional quantum Hall effect for which the 1998 Nobel Prize in Physics was awarded. MBE lays down atomically precise layers of atoms and, in the process, builds up complex structures. Important for research on semiconductors, MBE is also widely used to make samples and devices for the newly emerging field of spintronics.

Therapeutic products based on responsive nanomaterials, such as the highly deformable, stress-sensitive transfersome vesicles, are approved for human use in some countries.

Applications

One of the major applications of nanotechnology is in the area of nanoelectronics, with MOSFETs being made of small nanowires ≈10 nm in length.
Nanowire lasers for ultrafast transmission of information in light pulses

As of August 21, 2008, the Project on Emerging Nanotechnologies estimated that over 800 manufacturer-identified nanotech products were publicly available, with new ones hitting the market at a pace of 3–4 per week. Most applications are "first generation" passive nanomaterials that include titanium dioxide in sunscreen, cosmetics, surface coatings, and some food products; carbon allotropes used to produce gecko tape; silver in food packaging, clothing, disinfectants, and household appliances; zinc oxide in sunscreens and cosmetics, surface coatings, paints and outdoor furniture varnishes; and cerium oxide as a fuel catalyst.

In the electric car industry, single wall carbon nanotubes (SWCNTs) address key lithium-ion battery challenges, including energy density, charge rate, service life, and cost. SWCNTs connect electrode particles during the charge/discharge process, preventing premature battery degradation. Their exceptional ability to wrap active material particles enhances electrical conductivity and physical properties, setting them apart from multi-walled carbon nanotubes and carbon black.

Further applications allow tennis balls to last longer, golf balls to fly straighter, and bowling balls to become more durable. Trousers and socks have been infused with nanotechnology so that they last longer and keep the wearer cooler in the summer. Bandages are infused with silver nanoparticles to heal cuts faster. Video game consoles and personal computers may become cheaper, faster, and contain more memory thanks to nanotechnology. Nanotechnology is also being used to build structures for on-chip computing with light, for example for on-chip optical quantum information processing and picosecond transmission of information.

Nanotechnology may have the ability to make existing medical applications cheaper and easier to use in places like doctors' offices and homes. Cars are being manufactured with nanomaterials so that their parts may require fewer metals during manufacturing and less fuel to operate in the future.

Nanoencapsulation involves the enclosure of active substances within carriers. Typically, these carriers offer advantages, such as enhanced bioavailability, controlled release, targeted delivery, and protection of the encapsulated substances. In the medical field, nanoencapsulation plays a significant role in drug delivery. It facilitates more efficient drug administration, reduces side effects, and increases treatment effectiveness. Nanoencapsulation is particularly useful for improving the bioavailability of poorly water-soluble drugs, enabling controlled and sustained drug release, and supporting the development of targeted therapies. These features collectively contribute to advancements in medical treatments and patient care.

Nanotechnology may play a role in tissue engineering. When designing scaffolds, researchers attempt to mimic the nanoscale features of a cell's microenvironment to direct its differentiation down a suitable lineage. For example, when creating scaffolds to support bone growth, researchers may mimic osteoclast resorption pits.

Researchers used DNA origami-based nanobots capable of carrying out logic functions to target drug delivery in cockroaches.

A nano bible (a 0.5 mm² silicon chip) was created by the Technion in order to increase youth interest in nanotechnology.

Implications

One concern is the effect that industrial-scale manufacturing and use of nanomaterials will have on human health and the environment, as suggested by nanotoxicology research. For these reasons, some groups advocate that nanotechnology be regulated. However, regulation might stifle scientific research and the development of beneficial innovations. Public health research agencies, such as the National Institute for Occupational Safety and Health, research potential health effects stemming from exposure to nanoparticles.

Nanoparticle products may have unintended consequences. Researchers have discovered that bacteriostatic silver nanoparticles used in socks to reduce foot odor are released in the wash. These particles are then flushed into the wastewater stream and may destroy bacteria that are critical components of natural ecosystems, farms, and waste treatment processes.

Public deliberations on risk perception in the US and UK carried out by the Center for Nanotechnology in Society found that participants were more positive about nanotechnologies for energy applications than for health applications, with health applications raising moral and ethical dilemmas such as cost and availability.

Experts, including David Rejeski, director of the Woodrow Wilson Center's Project on Emerging Nanotechnologies, testified that commercialization depends on adequate oversight, risk research strategy, and public engagement. As of 2006, Berkeley, California was the only US city to regulate nanotechnology.

Health and environmental concerns

Inhaling airborne nanoparticles and nanofibers may contribute to pulmonary diseases, e.g. fibrosis. Researchers found that when rats breathed in nanoparticles, the particles settled in the brain and lungs, which led to significant increases in biomarkers for inflammation and stress response, and that nanoparticles induce skin aging through oxidative stress in hairless mice.

A two-year study at UCLA's School of Public Health found lab mice consuming nano-titanium dioxide showed DNA and chromosome damage to a degree "linked to all the big killers of man, namely cancer, heart disease, neurological disease and aging".

A Nature Nanotechnology study suggested that some forms of carbon nanotubes could be as harmful as asbestos if inhaled in sufficient quantities. Anthony Seaton of the Institute of Occupational Medicine in Edinburgh, Scotland, who contributed to the article on carbon nanotubes, said "We know that some of them probably have the potential to cause mesothelioma. So those sorts of materials need to be handled very carefully." In the absence of specific regulation from governments, Paull and Lyons (2008) have called for an exclusion of engineered nanoparticles in food. A newspaper article reports that workers in a paint factory developed serious lung disease and nanoparticles were found in their lungs.

Regulation

Calls for tighter regulation of nanotechnology have accompanied a debate related to human health and safety risks. Some regulatory agencies cover some nanotechnology products and processes – by "bolting on" nanotechnology to existing regulations – leaving clear gaps. Davies proposed a road map describing steps to deal with these shortcomings.

Andrew Maynard, chief science advisor to the Woodrow Wilson Center's Project on Emerging Nanotechnologies, reported insufficient funding for human health and safety research, and as a result inadequate understanding of human health and safety risks. Some academics called for stricter application of the precautionary principle, slowing marketing approval, enhanced labelling and additional safety data.

A Royal Society report identified a risk of nanoparticles or nanotubes being released during disposal, destruction and recycling, and recommended that "manufacturers of products that fall under extended producer responsibility regimes such as end-of-life regulations publish procedures outlining how these materials will be managed to minimize possible human and environmental exposure".

Friday, December 26, 2025

Helioseismology

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Helioseismology

Helioseismology is the study of the structure and dynamics of the Sun through its oscillations. These are principally caused by sound waves that are continuously driven and damped by convection near the Sun's surface. It is similar to geoseismology, or asteroseismology, which are respectively the studies of the Earth or stars through their oscillations. While the Sun's oscillations were first detected in the early 1960s, it was only in the mid-1970s that it was realized that the oscillations propagated throughout the Sun and could allow scientists to study the Sun's deep interior. The term was coined by Douglas Gough in the 1990s. The modern field is separated into global helioseismology, which studies the Sun's resonant modes directly, and local helioseismology, which studies the propagation of the component waves near the Sun's surface.

Helioseismology has contributed to a number of scientific breakthroughs. The most notable was to show that the anomaly in the predicted neutrino flux from the Sun could not be caused by flaws in stellar models and must instead be a problem of particle physics. The so-called solar neutrino problem was ultimately resolved by neutrino oscillations. The experimental discovery of neutrino oscillations was recognized by the 2015 Nobel Prize for Physics. Helioseismology also allowed accurate measurements of the quadrupole (and higher-order) moments of the Sun's gravitational potential, which are consistent with General Relativity. The first helioseismic calculations of the Sun's internal rotation profile showed a rough separation into a rigidly-rotating core and differentially-rotating envelope. The boundary layer is now known as the tachocline and is thought to be a key component for the solar dynamo. Although it roughly coincides with the base of the solar convection zone — also inferred through helioseismology — it is conceptually distinct, being a boundary layer in which there is a meridional flow connected with the convection zone and driven by the interplay between baroclinicity and Maxwell stresses.

Helioseismology benefits most from continuous monitoring of the Sun, which began first with uninterrupted observations from near the South Pole over the austral summer. In addition, observations over multiple solar cycles have allowed helioseismologists to study changes in the Sun's structure over decades. These studies are made possible by global telescope networks like the Global Oscillations Network Group (GONG) and the Birmingham Solar Oscillations Network (BiSON), which have been operating for several decades.

Types of solar oscillation

Illustration of a solar pressure mode (p mode) with radial order n=14, angular degree l=20 and azimuthal order m=16. The surface shows the corresponding spherical harmonic. The interior shows the radial displacement computed using a standard solar model. Note that the increase in the speed of sound as waves approach the center of the Sun causes a corresponding increase in the acoustic wavelength.

Solar oscillation modes are interpreted as resonant vibrations of a roughly spherically symmetric self-gravitating fluid in hydrostatic equilibrium. Each mode can then be represented approximately as the product of a function of radius and a spherical harmonic Y_l^m(θ, φ), and consequently can be characterized by the three quantum numbers which label:

  • the number of nodal shells in radius, known as the radial order n;
  • the total number of nodal circles on each spherical shell, known as the angular degree l; and
  • the number of those nodal circles that are longitudinal, known as the azimuthal order m (a short numerical sketch of the angular pattern follows this list).
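The sketch below evaluates the angular pattern Y_l^m that labels a mode by its degree l and azimuthal order m, using SciPy's sph_harm routine and the l = 20, m = 16 values from the figure caption above; the radial order n would label the (unseen) radial function:

```python
import numpy as np
from scipy.special import sph_harm

# The angular pattern of a mode is set by its degree l and azimuthal order m.
# This evaluates the real part of Y_l^m on a small grid; l=20, m=16 match the
# figure caption. SciPy's convention: theta is azimuthal, phi is polar.
l, m = 20, 16
theta = np.linspace(0, 2 * np.pi, 8)    # azimuthal angle
phi = np.linspace(0, np.pi, 5)          # polar angle
T, P = np.meshgrid(theta, phi)

Y = sph_harm(m, l, T, P)                # complex spherical harmonic values
print(np.round(Y.real, 4))              # surface displacement pattern ~ Re(Y)
```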

It can be shown that the oscillations are separated into two categories: interior oscillations and a special category of surface oscillations. More specifically, there are:

Pressure modes (p modes)

Pressure modes are in essence standing sound waves. The dominant restoring force is the pressure (rather than buoyancy), hence the name. All the solar oscillations that are used for inferences about the interior are p modes, with frequencies between about 1 and 5 millihertz and angular degrees ranging from zero (purely radial motion) to very high degree. Broadly speaking, their energy densities vary with radius in inverse proportion to the sound speed, so their resonant frequencies are determined predominantly by the outer regions of the Sun. Consequently it is difficult to infer from them the structure of the solar core.

A propagation diagram for a standard solar model showing where oscillations have a g-mode character (blue) or where dipole modes have a p-mode character (orange). The dashed line shows the acoustic cut-off frequency, computed from more precise modelling, above which modes are not trapped in the star and, roughly speaking, do not resonate.

Gravity modes (g modes)

Gravity modes are confined to convectively stable regions, either the radiative interior or the atmosphere. The restoring force is predominantly buoyancy, and thus indirectly gravity, from which they take their name. They are evanescent in the convection zone, and therefore interior modes have tiny amplitudes at the surface and are extremely difficult to detect and identify. It has long been recognized that measurement of even just a few g modes could substantially increase our knowledge of the deep interior of the Sun. However, no individual g mode has yet been unambiguously measured, although indirect detections have been both claimed and challenged. Additionally, there can be similar gravity modes confined to the convectively stable atmosphere.

Surface gravity modes (f modes)

Surface gravity waves are analogous to waves in deep water, having the property that the Lagrangian pressure perturbation is essentially zero. They are of high degree l, penetrating a characteristic distance R/l, where R is the solar radius. To good approximation, they obey the so-called deep-water-wave dispersion law ω² = g k_h, irrespective of the stratification of the Sun, where ω is the angular frequency, g is the surface gravity and k_h is the horizontal wavenumber, and they tend asymptotically to that relation as the degree l increases.
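A minimal sketch of this dispersion law, using standard values for the solar surface gravity and radius and approximating the horizontal wavenumber as sqrt(l(l+1))/R; the resulting frequencies fall in the observed few-millihertz range:

```python
import numpy as np

# Deep-water-wave dispersion law for f modes, w^2 = g*k_h, with
# k_h ~ sqrt(l(l+1))/R. G_SUN and R_SUN are standard solar values.
G_SUN = 274.0        # surface gravity, m/s^2
R_SUN = 6.96e8       # solar radius, m

def f_mode_frequency_mHz(l):
    k_h = np.sqrt(l * (l + 1)) / R_SUN       # horizontal wavenumber, 1/m
    omega = np.sqrt(G_SUN * k_h)             # angular frequency, rad/s
    return 1e3 * omega / (2 * np.pi)         # cyclic frequency, mHz

for l in (200, 500, 1000):
    print(f"l = {l:4d}  ->  nu_f ~ {f_mode_frequency_mHz(l):.2f} mHz")
```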

What seismology can reveal

The oscillations that have been successfully utilized for seismology are essentially adiabatic. Their dynamics is therefore the action of pressure forces p (plus putative Maxwell stresses) against matter with inertia density ρ, which itself depends upon the relation between them under adiabatic change, usually quantified via the (first) adiabatic exponent γ1. The equilibrium values of the variables p and ρ (together with the dynamically small angular velocity Ω and magnetic field B) are related by the constraint of hydrostatic support, which depends upon the total mass and radius of the Sun. Evidently, the oscillation frequencies depend only on the seismic variables p, ρ, γ1, Ω and B, or any independent set of functions of them. Consequently it is only about these variables that information can be derived directly. The square of the adiabatic sound speed, c² = γ1 p/ρ, is one such commonly adopted function, because that is the quantity upon which acoustic propagation principally depends. Properties of other, non-seismic, quantities, such as helium abundance Y, or main-sequence age, can be inferred only by supplementation with additional assumptions, which renders the outcome more uncertain.
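For concreteness, the adiabatic sound speed can be evaluated directly from its definition; the pressure and density below are rough solar-centre figures used only for illustration:

```python
# The adiabatic sound speed referred to above, c^2 = gamma_1 * p / rho.
# The values below are rough solar-centre figures, for illustration only.
gamma_1 = 5.0 / 3.0         # first adiabatic exponent of an ideal monatomic gas
p = 2.3e16                  # pressure, Pa (approximate solar-centre value)
rho = 1.5e5                 # density, kg/m^3 (approximate solar-centre value)

c = (gamma_1 * p / rho) ** 0.5
print(f"c ~ {c / 1e3:.0f} km/s")   # of order 500 km/s near the centre
```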

Data analysis

Global helioseismology

Power spectrum of the Sun using data from instruments aboard the Solar and Heliospheric Observatory on double-logarithmic axes. The three passbands of the VIRGO/SPM instrument show nearly the same power spectrum. The line-of-sight velocity observations from GOLF are less sensitive to the red noise produced by granulation. All the datasets clearly show the oscillation modes around 3mHz.
Power spectrum of the Sun around where the modes have maximum power, using data from the GOLF and VIRGO/SPM instruments aboard the Solar and Heliospheric Observatory. The low-degree modes (l<4) show a clear comb-like pattern with a regular spacing.
Power spectrum of medium-angular-degree solar oscillations, computed for 144 days of data from the MDI instrument aboard SOHO. The colour scale is logarithmic and saturated at one hundredth the maximum power in the signal, to make the modes more visible. The low-frequency region is dominated by the signal of granulation. As the angular degree increases, the individual mode frequencies converge onto clear ridges, each corresponding to a sequence of low-order modes.

The chief tool for analysing the raw seismic data is the Fourier transform. To good approximation, each mode is a damped harmonic oscillator, for which the power as a function of frequency is a Lorentzian function. Spatially resolved data are usually projected onto desired spherical harmonics to obtain time series which are then Fourier transformed. Helioseismologists typically combine the resulting one-dimensional power spectra into a two-dimensional spectrum.
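A minimal sketch of this analysis chain: a stochastically driven, damped oscillator (a crude stand-in for a single solar mode, with invented frequency, damping, cadence and duration) is Fourier transformed, and its power spectrum peaks near the input mode frequency with a roughly Lorentzian profile:

```python
import numpy as np

# Stochastically driven, damped oscillator as a stand-in for one solar mode.
# Mode frequency, damping rate, cadence and duration are illustrative choices.
rng = np.random.default_rng(0)
dt, n = 20.0, 100_000                 # 20 s cadence, ~23 days of samples
nu0, eta = 3.0e-3, 2e-6               # mode frequency 3 mHz, damping rate 1/s
omega0 = 2 * np.pi * nu0

# Drive the oscillator with white noise (simple semi-implicit Euler scheme).
x, v = 0.0, 0.0
series = np.empty(n)
for i in range(n):
    a = -omega0**2 * x - 2 * eta * v + rng.normal()
    v += a * dt
    x += v * dt
    series[i] = x

freq = np.fft.rfftfreq(n, dt)
power = np.abs(np.fft.rfft(series))**2
print(f"peak at ~{freq[np.argmax(power)] * 1e3:.2f} mHz (input {nu0 * 1e3:.2f} mHz)")
```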

The lower frequency range of the oscillations is dominated by the variations caused by granulation. This must first be filtered out before (or at the same time that) the modes are analysed. Granular flows at the solar surface are mostly horizontal, from the centres of the rising granules to the narrow downdrafts between them. Relative to the oscillations, granulation produces a stronger signal in intensity than line-of-sight velocity, so the latter is preferred for helioseismic observatories.

Local helioseismology

Local helioseismology—a term coined by Charles Lindsey, Doug Braun and Stuart Jefferies in 1993—employs several different analysis methods to make inferences from the observational data.

  • The Fourier–Hankel spectral method was originally used to search for wave absorption by sunspots.
  • Ring-diagram analysis, first introduced by Frank Hill, is used to infer the speed and direction of horizontal flows below the solar surface by observing the Doppler shifts of ambient acoustic waves from power spectra of solar oscillations computed over patches of the solar surface (typically 15° × 15°). Thus, ring-diagram analysis is a generalization of global helioseismology applied to local areas on the Sun (as opposed to half of the Sun). For example, the sound speed and adiabatic index can be compared within magnetically active and inactive (quiet Sun) regions.
  • Time-distance helioseismology aims to measure and interpret the travel times of solar waves between any two locations on the solar surface. Inhomogeneities near the ray path connecting the two locations perturb the travel time between those two points. An inverse problem must then be solved to infer the local structure and dynamics of the solar interior (a minimal cross-correlation sketch follows this list).
  • Helioseismic holography, introduced in detail by Charles Lindsey and Doug Braun for the purpose of far-side (magnetic) imaging, is a special case of phase-sensitive holography. The idea is to use the wavefield on the visible disk to learn about active regions on the far side of the Sun. The basic idea in helioseismic holography is that the wavefield, e.g., the line-of-sight Doppler velocity observed at the solar surface, can be used to make an estimate of the wavefield at any location in the solar interior at any instant in time. In this sense, holography is much like seismic migration, a technique in geophysics that has been in use since the 1940s. As another example, this technique has been used to give a seismic image of a solar flare.
  • In direct modelling, the idea is to estimate subsurface flows from direct inversion of the frequency-wavenumber correlations seen in the wavefield in the Fourier domain. Woodard demonstrated the ability of the technique to recover near-surface flows from the f modes.
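A minimal sketch of the travel-time idea referred to in the time-distance item above, using two synthetic signals and a cross-correlation to recover an imposed delay (all values are invented):

```python
import numpy as np

# Time-distance idea in miniature: cross-correlate the oscillation signals
# recorded at two surface points; the lag of the correlation peak estimates
# the wave travel time between them. Signals and delay are synthetic here.
rng = np.random.default_rng(1)
dt = 45.0                                # cadence in seconds (illustrative)
n = 4096
t = np.arange(n) * dt

wave = np.sin(2 * np.pi * 3e-3 * t) * np.exp(-((t - 40_000) / 8_000) ** 2)
lag_true = 20                            # true delay in samples (900 s)
point_a = wave + 0.2 * rng.normal(size=n)
point_b = np.roll(wave, lag_true) + 0.2 * rng.normal(size=n)

corr = np.correlate(point_b, point_a, mode="full")
lag_est = np.argmax(corr) - (n - 1)      # samples; positive -> b lags a
print(f"estimated travel time ~ {lag_est * dt:.0f} s (true {lag_true * dt:.0f} s)")
```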

Inversion

Introduction

The Sun's oscillation modes represent a discrete set of observations that are sensitive to its continuous structure. This allows scientists to formulate inverse problems for the Sun's interior structure and dynamics. Given a reference model of the Sun, the differences between its mode frequencies and those of the Sun, if small, are weighted averages of the differences between the Sun's structure and that of the reference model. The frequency differences can then be used to infer those structural differences. The weighting functions of these averages are known as kernels.
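A toy version of such an inversion, assuming invented kernels, shell values and noise level, and using ordinary least squares in place of the regularized inversions used in practice:

```python
import numpy as np

# Toy inverse problem: each "mode" supplies a frequency difference that is a
# known weighted average (kernel) of the structure difference in a handful of
# interior shells. With more modes than shells, least squares recovers the
# shell values. All numbers below are invented for illustration.
rng = np.random.default_rng(2)
n_shells, n_modes = 5, 15
shells = np.arange(n_shells)

# Each kernel is a normalized bump peaked at one shell (modes probe different
# depths); three modes share each peak location here.
centers = np.repeat(shells, n_modes // n_shells)
K = np.exp(-((shells[None, :] - centers[:, None]) / 1.0) ** 2)
K /= K.sum(axis=1, keepdims=True)

truth = np.array([0.010, -0.005, 0.002, 0.000, 0.004])   # relative differences
d = K @ truth + 1e-4 * rng.normal(size=n_modes)          # noisy "observations"

estimate, *_ = np.linalg.lstsq(K, d, rcond=None)
print(np.round(truth, 4))
print(np.round(estimate, 4))        # should land close to the true values
```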

Structure

The first inversions of the Sun's structure were made using Duvall's law and later using Duvall's law linearized about a reference solar model. These results were subsequently supplemented by analyses that linearize the full set of equations describing the stellar oscillations about a theoretical reference model and are now a standard way to invert frequency data. The inversions demonstrated differences in solar models that were greatly reduced by implementing gravitational settling: the gradual separation of heavier elements towards the solar centre (and lighter elements to the surface to replace them).

Rotation

The internal rotation profile of the Sun inferred using data from the Helioseismic and Magnetic Imager aboard the Solar Dynamics Observatory. The inner radius has been truncated where the measurements are less certain than 1%, which happens around 3/4 of the way to the core. The dashed line indicates the base of the solar convection zone, which happens to coincide with the boundary at which the rotation profile changes, known as the tachocline.

If the Sun were perfectly spherical, the modes with different azimuthal orders m would have the same frequencies. Rotation, however, breaks this degeneracy, and the mode frequencies differ by rotational splittings that are weighted averages of the angular velocity through the Sun. Different modes are sensitive to different parts of the Sun and, given enough data, these differences can be used to infer the rotation rate throughout the Sun. For example, if the Sun were rotating uniformly throughout, all the p modes would be split by approximately the same amount. Actually, the angular velocity is not uniform, as can be seen at the surface, where the equator rotates faster than the poles. The Sun rotates slowly enough that a spherical, non-rotating model is close enough to reality for deriving the rotational kernels.
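A rough numerical sketch, assuming uniform rotation with an illustrative ~25.4-day period and neglecting the small mode-dependent (Ledoux) correction, so neighbouring azimuthal orders are split by about the rotation frequency:

```python
import numpy as np

# Rough splitting estimate for uniform rotation: neighbouring azimuthal orders
# m differ in frequency by about Omega/2pi (the small mode-dependent Ledoux
# correction is neglected). A ~25.4-day period is used for illustration.
period_days = 25.4
omega = 2 * np.pi / (period_days * 86400)     # angular velocity, rad/s
split_nHz = 1e9 * omega / (2 * np.pi)         # splitting per unit m, in nanohertz

l = 2
for m in range(-l, l + 1):
    print(f"m = {m:+d}: frequency shift ~ {m * split_nHz:+.0f} nHz")
```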

Helioseismology has shown that the Sun has a rotation profile with several features:

  • a rigidly-rotating radiative (i.e. non-convective) zone, though the rotation rate of the inner core is not well known;
  • a thin shear layer, known as the tachocline, which separates the rigidly-rotating interior and the differentially-rotating convective envelope;
  • a convective envelope in which the rotation rate varies both with depth and latitude; and
  • a final shear layer just beneath the surface, in which the rotation rate slows down towards the surface.

Relationship to other fields

Geoseismology

Helioseismology was born from analogy with geoseismology but several important differences remain. First, the Sun lacks a solid surface and therefore cannot support shear waves. From the data analysis perspective, global helioseismology differs from geoseismology by studying only normal modes. Local helioseismology is thus somewhat closer in spirit to geoseismology in the sense that it studies the complete wavefield.

Asteroseismology

Because the Sun is a star, helioseismology is closely related to the study of oscillations in other stars, known as asteroseismology. Helioseismology is most closely related to the study of stars whose oscillations are also driven and damped by their outer convection zones, known as solar-like oscillators, but the underlying theory is broadly the same for other classes of variable star.

The principal difference is that oscillations in distant stars cannot be resolved. Because the brighter and darker sectors of the spherical harmonic cancel out, this restricts asteroseismology almost entirely to the study of low-degree modes. This makes inversion much more difficult but upper limits can still be achieved by making more restrictive assumptions.

History

Solar oscillations were first observed in the early 1960s as a quasi-periodic intensity and line-of-sight velocity variation with a period of about 5 minutes. Scientists gradually realized that the oscillations might be global modes of the Sun and predicted that the modes would form clear ridges in two-dimensional power spectra. The ridges were subsequently confirmed in observations of high-degree modes in the mid 1970s, and mode multiplets of different radial orders were distinguished in whole-disc observations. At a similar time, Jørgen Christensen-Dalsgaard and Douglas Gough suggested the potential of using individual mode frequencies to infer the interior structure of the Sun. They calibrated solar models against the low-degree data, finding two similarly good fits: one with a low helium abundance Y and a correspondingly low neutrino production rate, the other with higher Y and neutrino production rate; earlier envelope calibrations against high-degree frequencies preferred the latter, but the results were not wholly convincing. It was not until Tom Duvall and Jack Harvey connected the two extreme data sets by measuring modes of intermediate degree to establish the quantum numbers associated with the earlier observations that the higher-Y model was established, thereby suggesting at that early stage that the resolution of the neutrino problem must lie in nuclear or particle physics.

New methods of inversion developed in the 1980s, allowing researchers to infer the profiles of sound speed and, less accurately, density throughout most of the Sun, corroborating the conclusion that residual errors in the inference of the solar structure are not the cause of the neutrino problem. Towards the end of the decade, observations also began to show that the oscillation mode frequencies vary with the Sun's magnetic activity cycle.

To overcome the problem of not being able to observe the Sun at night, several groups had begun to assemble networks of telescopes (e.g. the Birmingham Solar Oscillations Network, or BiSON, and the Global Oscillation Network Group) from which the Sun would always be visible to at least one node. Long, uninterrupted observations brought the field to maturity, and the state of the field was summarized in a 1996 special issue of Science magazine. This coincided with the start of normal operations of the Solar and Heliospheric Observatory (SoHO), which began producing high-quality data for helioseismology.

The subsequent years saw the resolution of the solar neutrino problem, and the long seismic observations began to allow analysis of multiple solar activity cycles. The agreement between standard solar models and helioseismic inversions was disrupted by new measurements of the heavy element content of the solar photosphere based on detailed three-dimensional models. Though the results later shifted back towards the traditional values used in the 1990s, the new abundances significantly worsened the agreement between the models and helioseismic inversions. The cause of the discrepancy remains unsolved and is known as the solar abundance problem.

Space-based observations by SoHO have continued and SoHO was joined in 2010 by the Solar Dynamics Observatory (SDO), which has also been monitoring the Sun continuously since its operations began. In addition, ground-based networks (notably BiSON and GONG) continue to operate, providing nearly continuous data from the ground too.
