Wednesday, December 5, 2018

Nanometrology

From Wikipedia, the free encyclopedia

NIST Next-Generation Nanometrology research.
Nanometrology is a subfield of metrology, concerned with the science of measurement at the nanoscale level. Nanometrology plays a crucial role in producing nanomaterials and devices with a high degree of accuracy and reliability in nanomanufacturing.

A challenge in this field is to develop new measurement techniques and standards to meet the needs of next-generation advanced manufacturing, which will rely on nanometer-scale materials and technologies. The needs for measurement and characterization of new sample structures and characteristics far exceed the capabilities of current measurement science. Anticipated advances in emerging U.S. nanotechnology industries will require revolutionary metrology with higher resolution and accuracy than has previously been envisioned.

Introduction

Control of critical dimensions is the most important factor in nanotechnology. Nanometrology today is, to a large extent, based on developments in semiconductor technology. Nanometrology is the science of measurement at the nanoscale level. A nanometer (nm) is 10^-9 m. In nanotechnology, accurate control of the dimensions of objects is important. Typical dimensions of nanosystems range from 10 nm to a few hundred nm, and fabricating such systems requires measurement with a resolution down to 0.1 nm.

A Scanning Electron Microscope

At the nanoscale, due to the small dimensions involved, various new physical phenomena can be observed. For example, when the crystal size is smaller than the electron mean free path, the conductivity of the crystal changes. Another example is the discretization of stresses in the system. Measuring such physical parameters is essential for applying these phenomena to the engineering of nanosystems and their manufacture. Nanometrology encompasses the measurement of length or size, force, mass, electrical and other properties. The problem is how to measure these with reliability and accuracy. The measurement techniques used for macro systems cannot be directly applied to the measurement of parameters in nanosystems. Various techniques based on physical phenomena have been developed that can be used to measure or determine the parameters of nanostructures and nanomaterials. Some of the popular ones are X-ray diffraction, transmission electron microscopy, high-resolution transmission electron microscopy, atomic force microscopy, scanning electron microscopy, field emission scanning electron microscopy, and the Brunauer–Emmett–Teller (BET) method for determining specific surface area.

Nanotechnology is an important field because of its large number of applications, and it has become necessary to develop more precise measurement techniques and globally accepted standards. Hence progress is required in the field of nanometrology.

Development needs

Nanotechnology can be divided into two branches. The first is molecular nanotechnology, which involves bottom-up manufacturing; the second is engineering nanotechnology, which involves the development and processing of materials and systems at the nanoscale. The measurement and manufacturing tools and techniques required for the two branches are slightly different.

Furthermore, nanometrology requirements differ between industry and research institutions. Nanometrology for research has progressed faster than that for industry, mainly because implementing nanometrology in industry is difficult. In research-oriented nanometrology, resolution is important, whereas in industrial nanometrology accuracy is given precedence over resolution. In addition, for economic reasons it is important to keep time costs low in industrial nanometrology, whereas this matters less in research nanometrology. The various measurement techniques available today require a controlled environment, such as a vacuum and a vibration- and noise-free setting. Industrial nanometrology also requires that measurements be more quantitative, with a minimum number of parameters.

Standards

International standards

Metrology standards are objects or ideas that are designated as being authoritative for some accepted reason. Whatever value they possess is useful for comparison to unknowns for the purpose of establishing or confirming an assigned value based on the standard. The execution of measurement comparisons for the purpose of establishing the relationship between a standard and some other measuring device is calibration. The ideal standard is independently reproducible without uncertainty. The worldwide market for products with nanotechnology applications is projected to be at least a couple of hundred billion dollars in the near future.[citation needed] Until recently, there were almost no established, internationally accepted standards for nanotechnology-related fields. The International Organization for Standardization (ISO) Technical Committee 229 on Nanotechnologies has recently published a few standards for terminology and for the characterization of nanomaterials and nanoparticles using measurement tools such as AFM, SEM, interferometers, optoacoustic tools and gas adsorption methods. Certain standards for the measurement of electrical properties have been published by the International Electrotechnical Commission. Important standards yet to be established include those for measuring the thickness of thin films or layers, characterizing surface features, measuring force at the nanoscale, characterizing the critical dimensions of nanoparticles and nanostructures, and measuring physical properties such as conductivity and elasticity.

National standards

Because of the importance of nanotechnology in the future, countries around the world have programmes to establish national standards for nanometrology and nanotechnology. These programmes are run by the national standards agencies of the respective countries. In the United States, the National Institute of Standards and Technology (NIST) has been working on developing new techniques for measurement at the nanoscale and has also established some national standards for nanotechnology. These include standards for nanoparticle characterization, roughness characterization, magnification, and calibration.

Calibration

It is difficult to provide samples with which precision instruments can be calibrated at the nanoscale. Reference or calibration standards are important for ensuring repeatability. But there are no international standards for calibration, and the calibration artefacts supplied by a manufacturer along with its equipment are only good for calibrating that particular equipment. Hence it is difficult to select a universal calibration artefact with which repeatability at the nanoscale can be achieved. Calibration at the nanoscale must also account for external factors such as vibration, noise, and motions caused by thermal drift and creep, as well as internal factors such as the interaction between the artefact and the equipment, all of which can cause significant deviations.

Measurement techniques

In the last 70 years, various techniques for measuring at the nanoscale have been developed, most of them based on physical phenomena observed in particle interactions or forces at the nanoscale. Some of the most commonly used techniques are atomic force microscopy, X-ray diffraction, scanning electron microscopy, transmission electron microscopy, high-resolution transmission electron microscopy, and field emission scanning electron microscopy.

An Atomic Force Microscope
Block diagram of an atomic force microscope.

Atomic force microscopy (AFM) is one of the most common measurement techniques. It can be used to measure topography, grain size, frictional characteristics and various forces. An AFM consists of a silicon cantilever with a sharp tip, with a radius of curvature of a few nanometers, which is used as a probe on the specimen to be measured. The forces acting at the atomic level between the tip and the surface of the specimen cause the tip to deflect, and this deflection is detected using a laser spot reflected onto an array of photodiodes.
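
As a rough illustration of how a measured deflection becomes a force, the sketch below applies Hooke's law to a rectangular cantilever. The Python code and all numerical values (modulus, cantilever geometry, deflection) are illustrative assumptions, not parameters taken from this article.

# Illustrative sketch: converting an AFM cantilever deflection into a force
# via Hooke's law. All numbers are assumed, typical-order values.

E = 169e9        # Young's modulus of silicon, Pa (approximate, assumed)
w = 30e-6        # cantilever width, m (assumed)
t = 2e-6         # cantilever thickness, m (assumed)
L = 200e-6       # cantilever length, m (assumed)

# Stiffness of a rectangular cantilever loaded at its free end:
# k = 3*E*I / L^3 with I = w*t^3/12, i.e. k = E*w*t^3 / (4*L^3)
k = E * w * t**3 / (4 * L**3)

deflection = 2e-9          # measured tip deflection, m (2 nm, assumed)
force = k * deflection     # Hooke's law: F = k * x

print(f"spring constant k = {k:.3f} N/m")
print(f"force on tip      = {force*1e9:.3f} nN")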

A Scanning Tunneling Microscope

Scanning tunneling microscopy (STM) is another commonly used instrument. It measures the 3-D topography of the specimen and is based on the concept of quantum tunneling. When a conducting tip is brought very near to the surface to be examined, a bias (voltage difference) applied between the two can allow electrons to tunnel through the vacuum between them. Measurements are made by monitoring the current as the tip's position scans across the surface, which can then be used to display an image.
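
The extreme height sensitivity of the STM comes from the exponential dependence of the tunneling current on the tip-surface gap, roughly I ∝ exp(-2κd). The sketch below illustrates that scaling under an assumed barrier height (a typical metal work function); it is a simplified textbook model, not an instrument calibration.

import math

# Illustrative sketch of the exponential distance dependence behind STM:
# I(d) ~ exp(-2*kappa*d), with kappa set by the tunneling barrier height.

hbar = 1.054571817e-34      # reduced Planck constant, J*s
m_e  = 9.1093837015e-31     # electron mass, kg
eV   = 1.602176634e-19      # J per eV

phi = 4.5 * eV              # assumed barrier (typical metal work function)
kappa = math.sqrt(2 * m_e * phi) / hbar   # inverse decay length, 1/m

for d_angstrom in (4.0, 5.0, 6.0):
    d = d_angstrom * 1e-10
    rel_current = math.exp(-2 * kappa * d)   # current relative to d = 0
    print(f"gap {d_angstrom:.0f} A -> relative current {rel_current:.2e}")

# The current falls by roughly an order of magnitude per angstrom of gap,
# which is why sub-angstrom height changes are resolvable.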

Another commonly used instrument is the scanning electron microscope (SEM), which, apart from measuring the shape and size of particles and the topography of a surface, can be used to determine the composition of elements and compounds in the sample. In an SEM, the specimen surface is scanned with a high-energy electron beam. The electrons in the beam interact with atoms in the specimen, and the resulting signals, such as back-scattered electrons, transmitted electrons and secondary electrons, are detected. Magnetic lenses are used to remove high-angle electrons.

The instruments mentioned above produce realistic pictures of the surface and are excellent measuring tools for research. Industrial applications of nanotechnology, however, require measurements that are more quantitative. In industrial nanometrology the requirement is for higher accuracy rather than higher resolution, in contrast with research nanometrology.

Nano coordinate measuring machine

A coordinate measuring machine (CMM) that works at the nanoscale would have a smaller frame than a CMM used for macroscale objects, because a small frame can provide the necessary stiffness and stability to achieve nanoscale uncertainties in the x, y and z directions. The probes for such a machine need to be small to enable 3-D measurement of nanometre features from the sides and from inside features such as nanoholes. For accuracy, laser interferometers need to be used. NIST has developed a surface measuring instrument called the Molecular Measuring Machine, which is essentially an STM whose x- and y-axes are read out by laser interferometers. Molecules on the surface can be identified individually, and at the same time the distance between any two molecules can be determined. For measuring with molecular resolution, the measuring times become very long even for a very small surface area. The Ilmenau Machine is another nanomeasuring machine, developed by researchers at the Ilmenau University of Technology.
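
As a minimal illustration of the interferometric axis readout mentioned above: in a double-pass (Michelson-type) arrangement the measurement beam traverses the stage path twice, so each counted fringe corresponds to half a wavelength of motion. The wavelength, the fringe count and the simple fringe-counting readout itself are illustrative assumptions.

# Minimal sketch: turning a laser interferometer fringe count into a
# displacement, assuming a double-pass Michelson-type geometry.

wavelength = 632.8e-9   # He-Ne laser wavelength, m
fringes = 12_345        # counted interference fringes (assumed)

displacement = fringes * wavelength / 2   # lambda/2 of motion per fringe
print(f"displacement = {displacement*1e3:.6f} mm")

# Real instruments interpolate fractional fringes electronically, pushing
# the resolution well below lambda/2.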

A CMM
Dimensional metrology using CMM.

The components of a nano CMM include nanoprobes, control hardware, 3D-nanopositioning platform, and instruments with high resolution and accuracy for linear and angular measurement.

List of some of the measurement techniques

  • Atomic Force Microscopy: a precise mechanical probe is used to analyze surface irregularities.
  • X-Ray Diffraction: a crystalline structure diffracts X-rays; from the angles of the diffracted beams, measurements can be determined.
  • X-Ray Absorption Spectroscopy: core electrons are excited using X-rays, and their transitions are measured.
  • Small Angle X-Ray Scattering
  • Scanning Tunneling Microscopy
  • Transmission Electron Microscopy
  • Capacitance Spectroscopy
  • Polarization Spectroscopy
  • Auger Electron Spectroscopy
  • Raman Spectroscopy
  • Small Angle Neutron Scattering
  • Scanning Electron Microscopy
  • Cyclic Voltammetry
  • Linear Sweep Voltammetry
  • Nuclear Magnetic Resonance
  • Mössbauer Spectroscopy
  • Fourier Transform Infrared Spectroscopy
  • Photoluminescence Spectroscopy
  • Electroluminescence Spectroscopy
  • Differential Scanning Calorimetry
  • Secondary Ion Mass Spectrometry
  • Cathodoluminescence Spectroscopy
  • Electron Energy Loss Spectroscopy
  • Energy Dispersive X-Ray Spectroscopy
  • Four Point Probe and I-V Technique
  • X-Ray Photoelectron Spectroscopy
  • Scanning Near-field Optical Microscopy
  • Single-molecule Spectroscopy
  • Neutron Diffraction
  • Interference Microscopy
  • Laser Interferometry

Traceability

In metrology at the macro scale, achieving traceability is quite easy, and artefacts such as scales, laser interferometers, step gauges and straight edges are used. At the nanoscale, a crystalline surface of highly oriented pyrolytic graphite (HOPG), mica or silicon is considered suitable for use as a calibration artefact for achieving traceability. But it is not always possible to ensure traceability: for example, there is no agreed definition of a straight edge at the nanoscale, and even if the same standard as at the macroscale is adopted, there is no way to calibrate it accurately at the nanoscale. This is because the requisite internationally or nationally accepted reference standards do not always exist. Moreover, the measurement equipment required to ensure traceability has not been developed. The artefacts generally used for traceability are miniaturisations of traditional metrology standards, hence there is a need for establishing nanoscale standards. There is also a need to establish some kind of uncertainty estimation model. Traceability is one of the fundamental requirements for the manufacturing and assembly of products when there are multiple producers.

Tolerance

An IC
Integrated circuit made using the monolithic integration technique.

Tolerance is the permissible limit or limits of variation in dimensions, properties, or conditions without significantly affecting functioning of equipment or a process. Tolerances are specified to allow reasonable leeway for imperfections and inherent variability without compromising performance. In nanotechnology the systems have dimensions in the range of nanometers. Defining tolerances at nanoscale with suitable calibration standards for traceability is difficult for different nanomanufacturing methods. There are various integration techniques developed in the semiconductor industry that are used in nanomanufacturing.

Integration techniques

  • In hetero integration, nanosystems are fabricated directly from compound substrates. Geometric tolerances are required to achieve the functionality of the assembly.
  • In hybrid integration, nanocomponents are placed or assembled on a substrate to fabricate functioning nanosystems. In this technique, the most important control parameter is the positional accuracy of the components on the substrate.
  • In monolithic integration all the fabrication process steps are integrated on a single substrate and hence no mating of components or assembly is required. The advantage of this technique is that the geometric measurements are no longer of primary importance for achieving functionality of nanosystem or control of the fabrication process.

Classification of nanostructures

There are a variety of nanostructures, such as nanocomposites, nanowires, nanopowders, nanotubes, fullerenes, nanofibers, nanocages, nanocrystallites, nanoneedles, nanofoams, nanomeshes, nanoparticles, nanopillars, thin films, nanorods, nanofabrics and quantum dots. The most common way to classify nanostructures is by their dimensions.

A Nanowire
SEM of nanowire.

Dimensional classification

  • Zero-dimensional (0-D): the nanostructure has all dimensions in the nanometer range. Examples: nanoparticles, quantum dots, nanodots.
  • One-dimensional (1-D): one dimension of the nanostructure is outside the nanometer range. Examples: nanowires, nanorods, nanotubes.
  • Two-dimensional (2-D): two dimensions of the nanostructure are outside the nanometer range. Examples: coatings, thin-film multilayers.
  • Three-dimensional (3-D): all three dimensions of the nanostructure are outside the nanometer range. Example: bulk materials.
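
The classification above is mechanical enough to express in a few lines of code. The sketch below counts how many extents of a structure exceed an assumed 100 nm nanoscale cut-off; both the cut-off and the example dimensions are illustrative choices, not values from the article.

# Toy classifier for the dimensional scheme above: count how many of a
# structure's three extents lie outside the (assumed) nanometer range.

NANO_LIMIT = 100e-9  # assumed upper bound of "nanoscale", m

def classify(dims_m):
    """dims_m: (x, y, z) extents of the structure in meters."""
    outside = sum(1 for d in dims_m if d > NANO_LIMIT)
    return f"{outside}-D nanostructure"

print(classify((5e-9, 5e-9, 5e-9)))      # quantum dot  -> 0-D
print(classify((20e-9, 20e-9, 10e-6)))   # nanowire     -> 1-D
print(classify((10e-9, 1e-3, 1e-3)))     # thin film    -> 2-D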

Classification of grain structure

Nanostructures can also be classified on the basis of the grain structure and grain size they are made up of. This is applicable in the case of 2-dimensional and 3-dimensional nanostructures.

Surface area measurement

The B.E.T. method is commonly used to determine the specific surface area of nanopowders. The drop in nitrogen pressure in a closed container, due to adsorption of nitrogen molecules onto the surface of the material placed in the container, is measured. The particles of the nanopowder are assumed to be spherical, giving
D = 6/(ρ*A)
where "D" is the effective diameter, "ρ" is the density and "A" is the specific surface area found from the B.E.T. method.

Green chemistry (updated)

From Wikipedia, the free encyclopedia

Green chemistry, also called sustainable chemistry, is an area of chemistry and chemical engineering focused on the designing of products and processes that minimize the use and generation of hazardous substances. Whereas environmental chemistry focuses on the effects of polluting chemicals on nature, green chemistry focuses on the environmental impact of chemistry, including technological approaches to preventing pollution and reducing consumption of nonrenewable resources.

The overarching goals of green chemistry—namely, more resource-efficient and inherently safer design of molecules, materials, products, and processes—can be pursued in a wide range of contexts.

IUPAC definition

Green chemistry (sustainable chemistry): Design of chemical products and processes that reduce or eliminate the use or generation of substances hazardous to humans, animals, plants, and the environment.

History

Green chemistry emerged from a variety of existing ideas and research efforts (such as atom economy and catalysis) in the period leading up to the 1990s, in the context of increasing attention to problems of chemical pollution and resource depletion. The development of green chemistry in Europe and the United States was linked to a shift in environmental problem-solving strategies: a movement from command and control regulation and mandated reduction of industrial emissions at the "end of the pipe," toward the active prevention of pollution through the innovative design of production technologies themselves. The set of concepts now recognized as green chemistry coalesced in the mid- to late-1990s, along with broader adoption of the term (which prevailed over competing terms such as "clean" and "sustainable" chemistry).

In the United States, the Environmental Protection Agency played a significant early role in fostering green chemistry through its pollution prevention programs, funding, and professional coordination. At the same time in the United Kingdom, researchers at the University of York contributed to the establishment of the Green Chemistry Network within the Royal Society of Chemistry, and the launch of the journal Green Chemistry.

Principles

In 1998, Paul Anastas (who then directed the Green Chemistry Program at the US EPA) and John C. Warner (then of Polaroid Corporation) published a set of principles to guide the practice of green chemistry. The twelve principles address a range of ways to reduce the environmental and health impacts of chemical production, and also indicate research priorities for the development of green chemistry technologies.

The twelve principles of green chemistry are:
  1. Prevention. Preventing waste is better than treating or cleaning up waste after it is created.
  2. Atom economy. Synthetic methods should try to maximize the incorporation of all materials used in the process into the final product.
  3. Less hazardous chemical syntheses. Synthetic methods should avoid using or generating substances toxic to humans and/or the environment.
  4. Designing safer chemicals. Chemical products should be designed to achieve their desired function while being as non-toxic as possible.
  5. Safer solvents and auxiliaries. Auxiliary substances should be avoided wherever possible, and as non-hazardous as possible when they must be used.
  6. Design for energy efficiency. Energy requirements should be minimized, and processes should be conducted at ambient temperature and pressure whenever possible.
  7. Use of renewable feedstocks. Whenever it is practical to do so, renewable feedstocks or raw materials are preferable to non-renewable ones.
  8. Reduce derivatives. Unnecessary generation of derivatives—such as the use of protecting groups—should be minimized or avoided if possible; such steps require additional reagents and may generate additional waste.
  9. Catalysis. Catalytic reagents that can be used in small quantities to repeat a reaction are superior to stoichiometric reagents (ones that are consumed in a reaction).
  10. Design for degradation. Chemical products should be designed so that they do not pollute the environment; when their function is complete, they should break down into non-harmful products.
  11. Real-time analysis for pollution prevention. Analytical methodologies need to be further developed to permit real-time, in-process monitoring and control before hazardous substances form.
  12. Inherently safer chemistry for accident prevention. Whenever possible, the substances in a process, and the forms of those substances, should be chosen to minimize risks such as explosions, fires, and accidental releases.

Trends

Attempts are being made not only to quantify the greenness of a chemical process but also to factor in other variables such as chemical yield, the price of reaction components, safety in handling chemicals, hardware demands, energy profile and ease of product workup and purification. In one quantitative study, the reduction of nitrobenzene to aniline receives 64 points out of 100, marking it as an acceptable synthesis overall, whereas a synthesis of an amide using HMDS is described as only adequate, with a combined 32 points.

Green chemistry is increasingly seen as a powerful tool that researchers must use to evaluate the environmental impact of nanotechnology. As nanomaterials are developed, the environmental and human health impacts of both the products themselves and the processes to make them must be considered to ensure their long-term economic viability.

Examples

Green solvents

Solvents are consumed in large quantities in many chemical syntheses as well as for cleaning and degreasing. Traditional solvents are often toxic or chlorinated. Green solvents, on the other hand, are generally derived from renewable resources and biodegrade to an innocuous, often naturally occurring, product.

Synthetic techniques

Novel or enhanced synthetic techniques can often provide improved environmental performance or enable better adherence to the principles of green chemistry. For example, the 2005 Nobel Prize in Chemistry was awarded to Yves Chauvin, Robert H. Grubbs and Richard R. Schrock for the development of the metathesis method in organic synthesis, with explicit reference to its contribution to green chemistry and "smarter production." A 2005 review identified three key developments in green chemistry in the field of organic synthesis: the use of supercritical carbon dioxide as a green solvent, aqueous hydrogen peroxide for clean oxidations, and the use of hydrogen in asymmetric synthesis. Some further examples of applied green chemistry are supercritical water oxidation, on-water reactions, and dry media reactions.

Bioengineering is also seen as a promising technique for achieving green chemistry goals. A number of important process chemicals can be synthesized in engineered organisms, such as shikimate, a Tamiflu precursor which is fermented by Roche in bacteria. Click chemistry is often cited as a style of chemical synthesis that is consistent with the goals of green chemistry. The concept of 'green pharmacy' has recently been articulated based on similar principles.

Carbon dioxide as blowing agent

In 1996, Dow Chemical won the Greener Reaction Conditions award for its 100% carbon dioxide blowing agent for polystyrene foam production. Polystyrene foam is a common material used in packing and food transportation; seven hundred million pounds are produced each year in the United States alone. Traditionally, CFCs and other ozone-depleting chemicals were used in the production process of the foam sheets, presenting a serious environmental hazard. Flammable, explosive, and in some cases toxic hydrocarbons have also been used as CFC replacements, but they present their own problems. Dow Chemical discovered that supercritical carbon dioxide works equally well as a blowing agent, without the need for hazardous substances, allowing the polystyrene to be more easily recycled. The CO2 used in the process is reused from other industries, so the net carbon released from the process is zero.

Hydrazine

Addressing principle #2 is the Peroxide Process for producing hydrazine without cogenerating salt. Hydrazine is traditionally produced by the Olin Raschig process from sodium hypochlorite (the active ingredient in many bleaches) and ammonia. The net reaction produces one equivalent of sodium chloride for every equivalent of the targeted product hydrazine:
NaOCl + 2 NH3 → H2N-NH2 + NaCl + H2O
In the greener Peroxide process hydrogen peroxide is employed as the oxidant and the side product is water. The net conversion follows:
2 NH3 + H2O2 → H2N-NH2 + 2 H2O
Addressing principle #5, this process does not require auxiliary extracting solvents. Methyl ethyl ketone is used as a carrier for the hydrazine; the intermediate ketazine phase separates from the reaction mixture, facilitating workup without the need for an extracting solvent.
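
Since both balanced equations are given above, the atom economy of each route (principle #2) can be worked out directly as the fraction of total reactant mass that ends up in hydrazine. The short sketch below does this arithmetic with standard molar masses; the helper function is only for illustration.

# Illustrative atom-economy comparison (principle #2) for the two routes
# above, using standard molar masses in g/mol.

def atom_economy(product_mass, reactant_masses):
    """Fraction of reactant mass that ends up in the desired product."""
    return product_mass / sum(reactant_masses)

M_N2H4  = 32.05   # hydrazine
M_NaOCl = 74.44
M_NH3   = 17.03
M_H2O2  = 34.01

# Olin Raschig: NaOCl + 2 NH3 -> N2H4 + NaCl + H2O
raschig = atom_economy(M_N2H4, [M_NaOCl, 2 * M_NH3])

# Peroxide process: 2 NH3 + H2O2 -> N2H4 + 2 H2O
peroxide = atom_economy(M_N2H4, [2 * M_NH3, M_H2O2])

print(f"Raschig  atom economy: {raschig:.1%}")   # ~29.5%
print(f"Peroxide atom economy: {peroxide:.1%}")  # ~47.1%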

1,3-Propanediol

Addressing principle #7 is a green route to 1,3-propanediol, which is traditionally generated from petrochemical precursors. It can be produced from renewable precursors via the bioseparation of 1,3-propanediol using a genetically modified strain of E. coli. This diol is used to make new polyesters for the manufacture of carpets.

Lactide


In 2002, Cargill Dow (now NatureWorks) won the Greener Reaction Conditions Award for their improved method for the polymerization of polylactic acid (PLA). Unfortunately, lactide-based polymers do not perform well and the project was discontinued by Dow soon after the award. Lactic acid is produced by fermenting corn and converted to lactide, the cyclic dimer ester of lactic acid, using an efficient, tin-catalyzed cyclization. The L,L-lactide enantiomer is isolated by distillation and polymerized in the melt to make a crystallizable polymer, which has some applications including textiles and apparel, cutlery, and food packaging. Wal-Mart has announced that it is using/will use PLA for its produce packaging. The NatureWorks PLA process substitutes renewable materials for petroleum feedstocks, does not require the use of hazardous organic solvents typical in other PLA processes, and results in a high-quality polymer that is recyclable and compostable.

Carpet tile backings

In 2003, Shaw Industries selected a combination of polyolefin resins as the base polymer of choice for EcoWorx due to the low toxicity of its feedstocks, superior adhesion properties, dimensional stability, and its ability to be recycled. The EcoWorx compound also had to be designed to be compatible with nylon carpet fiber. Although EcoWorx may be recovered from any fiber type, nylon-6 provides a significant advantage: polyolefins are compatible with known nylon-6 depolymerization methods, whereas PVC interferes with those processes. Nylon-6 chemistry is well known and is not addressed in first-generation production. From its inception, EcoWorx met all of the design criteria necessary to satisfy the needs of the marketplace from a performance, health, and environmental standpoint. Research indicated that separation of the fiber and backing through elutriation, grinding, and air separation proved to be the best way to recover the face and backing components, but an infrastructure for returning postconsumer EcoWorx to the elutriation process was necessary. Research also indicated that postconsumer carpet tile had a positive economic value at the end of its useful life. EcoWorx is recognized by MBDC as a certified cradle-to-cradle design.

Trans and cis fatty acids

Transesterification of fats

In 2005, Archer Daniels Midland (ADM) and Novozymes won the Greener Synthetic Pathways Award for their enzymatic interesterification process. In response to the U.S. Food and Drug Administration (FDA) mandate to label trans-fats on nutritional information by January 1, 2006, Novozymes and ADM worked together to develop a clean, enzymatic process for the interesterification of oils and fats by interchanging saturated and unsaturated fatty acids. The result is commercially viable products without trans-fats. In addition to the human health benefits of eliminating trans-fats, the process reduces the use of toxic chemicals and water, prevents vast amounts of byproducts, and reduces the amount of fats and oils wasted.

Bio-succinic acid

In 2011, the Outstanding Green Chemistry Accomplishments by a Small Business Award went to BioAmber Inc. for the integrated production and downstream applications of bio-based succinic acid. Succinic acid is a platform chemical that is an important starting material in the formulation of everyday products. Traditionally, succinic acid is produced from petroleum-based feedstocks. BioAmber has developed a process and technology that produce succinic acid from the fermentation of renewable feedstocks at lower cost and lower energy expenditure than the petroleum equivalent, while sequestering CO2 rather than emitting it.

Laboratory chemicals

Several laboratory chemicals are controversial from the perspective of green chemistry. The Massachusetts Institute of Technology created a "Green" Alternatives Wizard to help identify alternatives. Ethidium bromide, xylene, mercury, and formaldehyde have been identified as "worst offenders" which have alternatives. Solvents in particular make a large contribution to the environmental impact of chemical manufacturing, and there is a growing focus on introducing greener solvents into the earliest stage of development of these processes: laboratory-scale reaction and purification methods. In the pharmaceutical industry, both GSK and Pfizer have published solvent selection guides for their drug discovery chemists.

Legislation

The EU

In 2007, the EU put into place the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) program, which requires companies to provide data showing that their products are safe. This regulation (1907/2006) not only ensures the assessment of chemicals' hazards and of the risks arising during their use, but also includes measures for banning or restricting/authorising uses of specific substances. ECHA, the EU Chemicals Agency in Helsinki, implements the regulation, whereas enforcement lies with the EU member states.

United States

The U.S. law that governs the majority of industrial chemicals (excluding pesticides, foods, and pharmaceuticals) is the Toxic Substances Control Act (TSCA) of 1976. Examining the role of regulatory programs in shaping the development of green chemistry in the United States, analysts have revealed structural flaws and long-standing weaknesses in TSCA; for example, a 2006 report to the California Legislature concludes that TSCA has produced a domestic chemicals market that discounts the hazardous properties of chemicals relative to their function, price, and performance. Scholars have argued that such market conditions represent a key barrier to the scientific, technical, and commercial success of green chemistry in the U.S., and fundamental policy changes are needed to correct these weaknesses.

Passed in 1990, the Pollution Prevention Act helped foster new approaches for dealing with pollution by preventing environmental problems before they happen.

In 2008, the State of California approved two laws aiming to encourage green chemistry, launching the California Green Chemistry Initiative. One of these statutes required California's Department of Toxic Substances Control (DTSC) to develop new regulations to prioritize "chemicals of concern" and promote the substitution of hazardous chemicals with safer alternatives. The resulting regulations took effect in 2013, initiating DTSC's Safer Consumer Products Program.

Education

Many institutions offer courses and degrees in green chemistry. Examples from across the globe are the Technical University of Denmark and several in the US, e.g. at the Universities of Massachusetts-Boston, Michigan, and Oregon. A masters-level course in Green Technology has been introduced by the Institute of Chemical Technology, India. In the UK, offerings include the University of York, the Department of Chemistry at the University of Leicester, and an MRes in Green Chemistry at Imperial College London. In Spain, universities such as the Universidad de Jaume I and the Universidad de Navarra offer Green Chemistry master courses. There are also websites focusing on green chemistry, such as the Michigan Green Chemistry Clearinghouse at www.migreenchemistry.org.
 
Apart from its Green Chemistry Master courses the Zurich University of Applied Sciences ZHAW presents an exposition and web page "Making chemistry green" for a broader public, illustrating the 12 principles.

Contested definition

There are ambiguities in the definition of green chemistry, and in how it is understood among broader science, policy, and business communities. Even within chemistry, researchers have used the term "green chemistry" to describe a range of work independently of the framework put forward by Anastas and Warner (i.e., the 12 principles). While not all uses of the term are legitimate, many are, and the authoritative status of any single definition is uncertain. More broadly, the idea of green chemistry can easily be linked (or confused) with related concepts like green engineering, environmental design, or sustainability in general. The complexity and multifaceted nature of green chemistry makes it difficult to devise clear and simple metrics. As a result, "what is green" is often open to debate.

Awards

Several scientific societies have created awards to encourage research in green chemistry.
  • Australia's Green Chemistry Challenge Awards overseen by The Royal Australian Chemical Institute (RACI).
  • The Canadian Green Chemistry Medal.
  • In Italy, Green Chemistry activities center around an inter-university consortium known as INCA.
  • In Japan, The Green & Sustainable Chemistry Network oversees the GSC awards program.
  • In the United Kingdom, the Green Chemical Technology Awards are given by Crystal Faraday.
  • In the US, the Presidential Green Chemistry Challenge Awards recognize individuals and businesses.

Metamaterial

From Wikipedia, the free encyclopedia

Negative-index metamaterial array configuration, which was constructed of copper split-ring resonators and wires mounted on interlocking sheets of fiberglass circuit board. The total array consists of 3 by 20×20 unit cells with overall dimensions of 10 mm × 100 mm × 100 mm (0.39 in × 3.94 in × 3.94 in).
 
A metamaterial (from the Greek word μετά meta, meaning "beyond") is a material engineered to have a property that is not found in naturally occurring materials. Metamaterials are made from assemblies of multiple elements fashioned from composite materials such as metals or plastics. The materials are usually arranged in repeating patterns, at scales that are smaller than the wavelengths of the phenomena they influence. Metamaterials derive their properties not from the properties of the base materials, but from their newly designed structures. Their precise shape, geometry, size, orientation and arrangement give them their smart properties, capable of manipulating electromagnetic waves by blocking, absorbing, enhancing, or bending them, to achieve benefits that go beyond what is possible with conventional materials.

Appropriately designed metamaterials can affect waves of electromagnetic radiation or sound in a manner not observed in bulk materials. Those that exhibit a negative index of refraction for particular wavelengths have attracted significant research. These materials are known as negative-index metamaterials.

Potential applications of metamaterials are diverse and include optical filters, medical devices, remote aerospace applications, sensor detection and infrastructure monitoring, smart solar power management, crowd control, radomes, high-frequency battlefield communication and lenses for high-gain antennas, improving ultrasonic sensors, and even shielding structures from earthquakes. Metamaterials offer the potential to create superlenses. Such a lens could allow imaging below the diffraction limit that is the minimum resolution that can be achieved by conventional glass lenses. A form of 'invisibility' was demonstrated using gradient-index materials. Acoustic and seismic metamaterials are also research areas.

Metamaterial research is interdisciplinary and involves such fields as electrical engineering, electromagnetics, classical optics, solid state physics, microwave and antenna engineering, optoelectronics, material sciences, nanoscience and semiconductor engineering.

History

Explorations of artificial materials for manipulating electromagnetic waves began at the end of the 19th century. Some of the earliest structures that may be considered metamaterials were studied by Jagadish Chandra Bose, who in 1898 researched substances with chiral properties. Karl Ferdinand Lindman studied wave interaction with metallic helices as artificial chiral media in the early twentieth century.

Winston E. Kock developed materials that had similar characteristics to metamaterials in the late 1940s. In the 1950s and 1960s, artificial dielectrics were studied for lightweight microwave antennas. Microwave radar absorbers were researched in the 1980s and 1990s as applications for artificial chiral media.

Negative-index materials were first described theoretically by Victor Veselago in 1967. He proved that such materials could transmit light, and showed that the phase velocity could be made anti-parallel to the direction of the Poynting vector. This is contrary to wave propagation in naturally occurring materials.

John Pendry was the first to identify a practical way to make a left-handed metamaterial, a material in which the right-hand rule is not followed. Such a material allows an electromagnetic wave to convey energy (have a group velocity) against its phase velocity. Pendry's idea was that metallic wires aligned along the direction of a wave could provide negative permittivity (dielectric function ε < 0). Natural materials (such as ferroelectrics) already display negative permittivity; the challenge was achieving negative permeability (µ < 0). In 1999 Pendry demonstrated that a split ring (C shape) with its axis placed along the direction of wave propagation could do so. In the same paper, he showed that a periodic array of wires and rings could give rise to a negative refractive index. Pendry also proposed a related negative-permeability design, the Swiss roll.

In 2000, Smith et al. reported the experimental demonstration of functioning electromagnetic metamaterials by horizontally stacking, periodically, split-ring resonators and thin wire structures. A method was provided in 2002 to realize negative-index metamaterials using artificial lumped-element loaded transmission lines in microstrip technology. In 2003, a complex (both real and imaginary parts of) negative refractive index and imaging by a flat lens using left-handed metamaterials were demonstrated. By 2007, experiments that involved negative refractive index had been conducted by many groups. At microwave frequencies, the first, imperfect invisibility cloak was realized in 2006.

Electromagnetic metamaterials

An electromagnetic metamaterial affects electromagnetic waves that impinge on or interact with its structural features, which are smaller than the wavelength. To behave as a homogeneous material accurately described by an effective refractive index, its features must be much smaller than the wavelength.

For microwave radiation, the features are on the order of millimeters. Microwave frequency metamaterials are usually constructed as arrays of electrically conductive elements (such as loops of wire) that have suitable inductive and capacitive characteristics. One microwave metamaterial uses the split-ring resonator.

Photonic metamaterials, structured at the nanometer scale, manipulate light at optical frequencies. To date, subwavelength structures have shown only a few, questionable, results at visible wavelengths. Photonic crystals and frequency-selective surfaces such as diffraction gratings, dielectric mirrors and optical coatings exhibit similarities to subwavelength structured metamaterials. However, these are usually considered distinct from subwavelength structures, as their features are structured for the wavelength at which they function and thus cannot be approximated as a homogeneous material. However, material structures such as photonic crystals are effective in the visible light spectrum. The middle of the visible spectrum has a wavelength of approximately 560 nm (for sunlight); photonic crystal structures are generally half this size or smaller, that is, less than 280 nm.

Plasmonic metamaterials utilize surface plasmons, which are packets of electrical charge that collectively oscillate at the surfaces of metals at optical frequencies.

Frequency selective surfaces (FSS) can exhibit subwavelength characteristics and are known variously as artificial magnetic conductors (AMC) or High Impedance Surfaces (HIS). FSS display inductive and capacitive characteristics that are directly related to their subwavelength structure.

Negative refractive index

A comparison of refraction in a left-handed metamaterial to that in a normal material

Almost all materials encountered in optics, such as glass or water, have positive values for both permittivity ε and permeability µ. However, metals such as silver and gold have negative permittivity at shorter wavelengths. A material that has either (but not both) ε or µ negative, such as a metal supporting surface plasmons, is often opaque to electromagnetic radiation. However, anisotropic materials with only negative permittivity can produce negative refraction due to chirality.

Although the optical properties of a transparent material are fully specified by the parameters εr and µr, the refractive index n is often used in practice, determined from n = √(εrµr). All known non-metamaterial transparent materials possess positive εr and µr. By convention, the positive square root is used for n.

However, some engineered metamaterials have εr < 0 and µr < 0. Because the product εrµr is positive, n is real. Under such circumstances, it is necessary to take the negative square root for n.
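
A small sketch of this sign convention, assuming simple real-valued material parameters: the principal square root is taken for ordinary media, and the negative root when both εr and µr are negative. The function and the values are illustrative, not a general branch-selection routine for lossy media.

import cmath

# Illustrative sketch: n = sqrt(eps_r * mu_r), taking the negative square
# root for a double-negative (eps_r < 0, mu_r < 0) medium.
def refractive_index(eps_r, mu_r):
    n = cmath.sqrt(complex(eps_r * mu_r))  # principal square root
    if eps_r < 0 and mu_r < 0:             # double-negative metamaterial
        n = -n
    return n

print(refractive_index(2.25, 1.0))    # glass-like medium: n = +1.5
print(refractive_index(-2.25, -1.0))  # double-negative medium: n = -1.5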

Video representing negative refraction of light at a uniform planar interface.
The foregoing considerations are simplistic for actual materials, which must have complex-valued εr and µr. The real parts of both εr and µr do not have to be negative for a passive material to display negative refraction. Metamaterials with negative n have numerous interesting properties:
  • Snell's law (n1·sinθ1 = n2·sinθ2) still holds, but as n2 is negative, the rays are refracted on the same side of the normal on entering the material (see the sketch below).
  • Cherenkov radiation points the other way.
  • The time-averaged Poynting vector is antiparallel to the phase velocity. However, for waves (energy) to propagate, a negative µ must be paired with a negative ε in order to satisfy the wave number dependence on the material parameters, k = (ω/c)√(εrµr).
A negative index of refraction derives mathematically from the vector triplet E, H and k.
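
The Snell's-law behavior in the first bullet can be checked numerically. In the sketch below, a ray entering an assumed n = -1.5 medium from vacuum refracts to a negative angle, i.e. onto the same side of the normal as the incident ray; the numbers are illustrative.

import math

# Snell's law with a negative-index medium: n1*sin(t1) = n2*sin(t2).
n1, n2 = 1.0, -1.5          # vacuum into an assumed n = -1.5 metamaterial
theta1 = math.radians(30.0) # angle of incidence

theta2 = math.asin(n1 * math.sin(theta1) / n2)
print(f"refraction angle = {math.degrees(theta2):.1f} deg")  # about -19.5 deg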

For plane waves propagating in electromagnetic metamaterials, the electric field, magnetic field and wave vector follow a left-hand rule, the reverse of the behavior of conventional optical materials.

Classification

Negative index

In negative-index metamaterials (NIM), both permittivity and permeability are negative, resulting in a negative index of refraction. These are also known as double negative metamaterials or double negative materials (DNG). Other terms for NIMs include "left-handed media", "media with a negative refractive index", and "backward-wave media".

In optical materials, if both permittivity ε and permeability µ are positive, wave propagation travels in the forward direction. If both ε and µ are negative, a backward wave is produced. If ε and µ have different polarities, waves do not propagate. 

Mathematically, the two non-propagating cases occupy quadrants II and IV of a coordinate plane in which ε is the horizontal axis and µ is the vertical axis.

To date, only metamaterials exhibit a negative index of refraction.

Single negative

Single negative (SNG) metamaterials have either negative relative permittivity (εr) or negative relative permeability (µr), but not both. They act as metamaterials when combined with a different, complementary SNG, jointly acting as a DNG. 

Epsilon negative media (ENG) display a negative εr while µr is positive. Many plasmas exhibit this characteristic. For example, noble metals such as gold or silver are ENG in the infrared and visible spectra.

Mu-negative media (MNG) display a positive εr and negative µr. Gyrotropic or gyromagnetic materials exhibit this characteristic. A gyrotropic material is one that has been altered by the presence of a quasistatic magnetic field, enabling a magneto-optic effect, a phenomenon in which an electromagnetic wave propagates through such a medium. In such a material, left- and right-rotating elliptical polarizations propagate at different speeds. When light is transmitted through a layer of magneto-optic material, the polarization plane can be rotated, forming a Faraday rotator; the analogous effect upon reflection is known as the magneto-optic Kerr effect (not to be confused with the nonlinear Kerr effect). Two gyrotropic materials with reversed rotation directions of the two principal polarizations are called optical isomers.

Joining a slab of ENG material and a slab of MNG material results in properties such as resonances, anomalous tunneling, transparency and zero reflection. Like negative-index materials, SNGs are innately dispersive, so their εr, µr and refractive index n are functions of frequency.

Bandgap

Electromagnetic bandgap metamaterials (EBG or EBM) control light propagation. This is accomplished either with photonic crystals (PC) or left-handed materials (LHM). PCs can prohibit light propagation altogether. Both classes can allow light to propagate in specific, designed directions and both can be designed with bandgaps at desired frequencies. The period size of EBGs is an appreciable fraction of the wavelength, creating constructive and destructive interference.
 
PCs are distinguished from sub-wavelength structures, such as tunable metamaterials, because a PC derives its properties from its bandgap characteristics. PCs are sized to match the wavelength of light, whereas other metamaterials expose sub-wavelength structure. Furthermore, PCs function by diffracting light. In contrast, metamaterials do not use diffraction.

PCs have periodic inclusions that inhibit wave propagation due to the inclusions' destructive interference from scattering. The photonic bandgap property of PCs makes them the electromagnetic analog of electronic semi-conductor crystals.

EBGs have the goal of creating high quality, low loss, periodic, dielectric structures. An EBG affects photons in the same way semiconductor materials affect electrons. PCs are the perfect bandgap material, because they allow no light propagation. Each unit of the prescribed periodic structure acts like one atom, albeit of a much larger size.

EBGs are designed to prevent the propagation of an allocated bandwidth of frequencies, for certain arrival angles and polarizations. Various geometries and structures have been proposed to fabricate EBG's special properties. In practice it is impossible to build a flawless EBG device.

EBGs have been manufactured for frequencies ranging from a few gigahertz (GHz) to a few terahertz (THz), covering the radio, microwave and mid-infrared frequency regions. EBG application developments include a transmission line, woodpiles made of square dielectric bars and several different types of low-gain antennas.

Double positive medium

Double positive media (DPS) occur in nature, for example naturally occurring dielectrics. Permittivity and magnetic permeability are both positive, and wave propagation is in the forward direction. Artificial materials have been fabricated which combine DPS, ENG and MNG properties.

Bi-isotropic and bianisotropic

Categorizing metamaterials into double or single negative, or double positive, normally assumes that the metamaterial has independent electric and magnetic responses described by ε and µ. However, in many cases, the electric field causes magnetic polarization, while the magnetic field induces electrical polarization, known as magnetoelectric coupling. Such media are denoted as bi-isotropic. Media that exhibit magnetoelectric coupling and that are anisotropic (which is the case for many metamaterial structures), are referred to as bi-anisotropic.

Four material parameters are intrinsic to magnetoelectric coupling of bi-isotropic media. They are the electric (E) and magnetic (H) field strengths, and electric (D) and magnetic (B) flux densities. These parameters are ε, µ, κ and χ or permittivity, permeability, strength of chirality, and the Tellegen parameter respectively. In this type of media, material parameters do not vary with changes along a rotated coordinate system of measurements. In this sense they are invariant or scalar.

The intrinsic magnetoelectric parameters, κ and χ, affect the phase of the wave. The effect of the chirality parameter is to split the refractive index. In isotropic media this results in wave propagation only if ε and µ have the same sign. In bi-isotropic media with χ assumed to be zero, and κ a non-zero value, different results appear. Either a backward wave or a forward wave can occur. Alternatively, two forward waves or two backward waves can occur, depending on the strength of the chirality parameter.

In the general case, the constitutive relations for bi-anisotropic materials read D = εE + ξH and B = ζE + µH, where ε and µ are the permittivity and the permeability tensors, respectively, whereas ξ and ζ are the two magneto-electric tensors. If the medium is reciprocal, permittivity and permeability are symmetric tensors, and ξ = -ζ^T = -iκ^T, where κ is the chiral tensor describing the chiral electromagnetic and reciprocal magneto-electric response and i is the imaginary unit. The chiral tensor can be expressed as κ = (tr(κ)/3)I + N + J, where tr(κ) is the trace of κ, I is the identity matrix, N is a symmetric trace-free tensor, and J is an antisymmetric tensor. Such a decomposition allows us to classify the reciprocal bianisotropic response into three main classes: (i) chiral media (tr(κ) ≠ 0), (ii) pseudochiral media (tr(κ) = 0, N ≠ 0), (iii) omega media (tr(κ) = 0, J ≠ 0).

Generally the chiral and/or bianisotropic electromagnetic response is a consequence of 3D geometrical chirality: 3D chiral metamaterials are composed by embedding 3D chiral structures in a host medium, and they show chirality-related polarization effects such as optical activity and circular dichroism. The concept of 2D chirality also exists: a planar object is said to be chiral if it cannot be superposed onto its mirror image unless it is lifted from the plane. On the other hand, a bianisotropic response can arise from geometrically achiral structures possessing neither 2D nor 3D intrinsic chirality. Plum et al. investigated extrinsic chiral metamaterials, where the magneto-electric coupling results from the geometric chirality of the whole structure, with the radiation wave vector contributing to the overall chiral asymmetry (extrinsic electromagnetic chirality). Rizza et al. suggested 1D chiral metamaterials where the effective chiral tensor is non-vanishing if the system is geometrically one-dimensionally chiral (the mirror image of the entire structure cannot be superposed onto it using translations without rotations).

Chiral

Chiral metamaterials are constructed from chiral materials in which the effective chirality parameter κ is non-zero. This is a potential source of confusion, as the metamaterial literature includes two conflicting uses of the terms left- and right-handed. The first refers to one of the two circularly polarized waves that are the propagating modes in chiral media. The second relates to the triplet of electric field, magnetic field and Poynting vector that arises in negative refractive index media, which in most cases are not chiral.

Wave propagation properties in chiral metamaterials demonstrate that negative refraction can be realized in metamaterials with a strong chirality and positive εr and µr. This is because the refractive index has distinct values for left and right circularly polarized waves, given by

n± = √(εrµr) ± κ

It can be seen that a negative index will occur for one polarization if κ > √(εrµr). In this case, it is not necessary that either or both εr and µr be negative for backward wave propagation.
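
A quick numerical sketch of this condition, with illustrative parameter values: for κ larger than √(εrµr), one circular polarization sees a negative index even though εr and µr are both positive.

import math

# Split refractive index in a chiral medium: n± = sqrt(eps_r*mu_r) ± kappa.
eps_r, mu_r = 1.5, 1.0   # both positive (assumed)
kappa = 1.5              # strong chirality (assumed)

n_avg = math.sqrt(eps_r * mu_r)
n_plus, n_minus = n_avg + kappa, n_avg - kappa

print(f"n+ = {n_plus:.3f}, n- = {n_minus:.3f}")  # n- < 0: negative refraction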

FSS based

Frequency selective surface-based metamaterials block signals in one waveband and pass those at another waveband. They have become an alternative to fixed frequency metamaterials. They allow for optional changes of frequencies in a single medium, rather than the restrictive limitations of a fixed frequency response.

Other types

Elastic

These metamaterials use different parameters to achieve a negative index of refraction in materials that are not electromagnetic. Furthermore, "a new design for elastic metamaterials that can behave either as liquids or solids over a limited frequency range may enable new applications based on the control of acoustic, elastic and seismic waves." They are also called mechanical metamaterials.[citation needed]

Acoustic

Acoustic metamaterials control, direct and manipulate sound in the form of sonic, infrasonic or ultrasonic waves in gases, liquids and solids. As with electromagnetic waves, sonic waves can exhibit negative refraction.

Control of sound waves is mostly accomplished through the bulk modulus β, mass density ρ and chirality. The bulk modulus and density are analogs of permittivity and permeability in electromagnetic metamaterials. Related to this is the mechanics of sound wave propagation in a lattice structure. Also materials have mass and intrinsic degrees of stiffness. Together, these form a resonant system and the mechanical (sonic) resonance may be excited by appropriate sonic frequencies (for example audible pulses).

Structural

Structural metamaterials provide properties such as crushability and light weight. Using projection micro-stereolithography, microlattices can be created using forms much like trusses and girders. Materials four orders of magnitude stiffer than conventional aerogel, but with the same density have been created. Such materials can withstand a load of at least 160,000 times their own weight by over-constraining the materials.

A ceramic nanotruss metamaterial can be flattened and revert to its original state.

Nonlinear

Metamaterials may be fabricated that include some form of nonlinear media, whose properties change with the power of the incident wave. Nonlinear media are essential for nonlinear optics. Most optical materials have a relatively weak response, meaning that their properties change by only a small amount for large changes in the intensity of the electromagnetic field. The local electromagnetic fields of the inclusions in nonlinear metamaterials can be much larger than the average value of the field. Besides, remarkable nonlinear effects have been predicted and observed if the metamaterial effective dielectric permittivity is very small (epsilon-near-zero media). In addition, exotic properties such as a negative refractive index, create opportunities to tailor the phase matching conditions that must be satisfied in any nonlinear optical structure.

Frequency bands

Terahertz

Terahertz metamaterials interact at terahertz frequencies, usually defined as 0.1 to 10 THz. Terahertz radiation lies at the far end of the infrared band, just after the end of the microwave band. This corresponds to millimeter and submillimeter wavelengths, between 3 mm (EHF band) and 0.03 mm (the long-wavelength edge of far-infrared light).
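
The band edges quoted above follow from λ = c/f; the short sketch below checks both ends of the 0.1-10 THz range.

# Convert the terahertz band edges to wavelengths via lambda = c / f.
c = 299_792_458.0  # speed of light, m/s

for f_THz in (0.1, 10.0):
    wavelength = c / (f_THz * 1e12)
    print(f"{f_THz:>5} THz -> {wavelength*1e3:.3f} mm")
# 0.1 THz -> ~3 mm and 10 THz -> ~0.03 mm, matching the band edges above.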

Photonic

Photonic metamaterials interact with optical frequencies (mid-infrared). Their sub-wavelength period distinguishes them from photonic band gap structures.

Tunable

Tunable metamaterials allow arbitrary adjustments to frequency changes in the refractive index. A tunable metamaterial expands beyond the bandwidth limitations of left-handed materials by combining various types of metamaterials.

Plasmonic

Plasmonic metamaterials exploit surface plasmons, which are produced by the interaction of light with metal-dielectric interfaces. Under specific conditions, the incident light couples with the surface plasmons to create self-sustaining, propagating electromagnetic waves known as surface plasmon polaritons.
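
The textbook dispersion relation for such a bound wave is k_spp = k0 * sqrt(eps_m * eps_d / (eps_m + eps_d)); the sketch below evaluates it for gold-like values (an assumption for illustration, not data from this article):

    import cmath, math

    def spp_wavevector(wavelength_m, eps_metal, eps_dielectric):
        # Surface plasmon polariton wavevector at a metal-dielectric
        # interface. A bound mode requires Re(eps_metal) < -eps_dielectric.
        k0 = 2 * math.pi / wavelength_m
        return k0 * cmath.sqrt(eps_metal * eps_dielectric /
                               (eps_metal + eps_dielectric))

    # Gold-like permittivity near 800 nm, with air as the dielectric:
    print(spp_wavevector(800e-9, -24 + 1.5j, 1.0))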

Applications

Metamaterials are under consideration for many applications. Metamaterial antennas are commercially available.

In 2007, one researcher stated that for metamaterial applications to be realized, energy loss must be reduced, materials must be extended into three-dimensional isotropic structures, and production techniques must be industrialized.

Antennas

Metamaterial antennas are a class of antennas that use metamaterials to improve performance. Demonstrations showed that metamaterials could enhance an antenna's radiated power. Materials that can attain negative permeability allow for properties such as small antenna size, high directivity and tunable frequency.

Absorber

A metamaterial absorber manipulates the loss components of a metamaterial's permittivity and magnetic permeability to absorb large amounts of electromagnetic radiation. This is a useful feature for photodetection and solar photovoltaic applications. Loss components are also relevant in applications of negative refractive index (photonic metamaterials, antenna systems) and transformation optics (metamaterial cloaking, celestial mechanics), but are often not exploited in those applications.
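
The underlying idea can be sketched with the normal-incidence reflection coefficient r = (Z - Z0)/(Z + Z0), where the relative impedance is sqrt(mu_r/eps_r): tuning the (lossy) permittivity and permeability to equal values matches the impedance to free space and removes reflection. The values below are illustrative, not from the article:

    import cmath

    def reflection_coefficient(eps_r, mu_r):
        # Normal-incidence reflection at a half-space with relative
        # impedance Z/Z0 = sqrt(mu_r/eps_r). An absorber tunes eps_r
        # and mu_r (loss included) so Z matches free space (r -> 0)
        # while the imaginary parts dissipate the transmitted wave.
        z_rel = cmath.sqrt(mu_r / eps_r)
        return (z_rel - 1) / (z_rel + 1)

    print(abs(reflection_coefficient(2.0 + 1.0j, 2.0 + 1.0j)))  # 0.0, matched
    print(abs(reflection_coefficient(2.0 + 1.0j, 1.0)))         # mismatched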

Superlens

A superlens is a two- or three-dimensional device that uses metamaterials, usually with negative-refraction properties, to achieve resolution beyond the diffraction limit (ideally, infinite resolution). Such behavior is enabled by the capability of double-negative materials to yield a negative phase velocity. The diffraction limit is inherent in conventional optical devices and lenses.
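
For scale, the conventional (Abbe) diffraction limit that a superlens is meant to beat is roughly lambda/(2·NA); a quick evaluation with illustrative numbers:

    def abbe_limit(wavelength_m, numerical_aperture):
        # Smallest feature a conventional lens resolves, ~lambda/(2 NA).
        # A superlens evades this by restoring evanescent waves, which
        # carry sub-wavelength detail but decay before reaching an
        # ordinary lens.
        return wavelength_m / (2 * numerical_aperture)

    print(abbe_limit(500e-9, 1.4))   # ~1.8e-7 m for an oil-immersion lens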

Cloaking devices

Metamaterials are a potential basis for a practical cloaking device. The proof of principle was demonstrated on October 19, 2006. No practical cloaks are publicly known to exist.

RCS (Radar Cross Section) reducing metamaterials

Conventionally, RCS has been reduced either with radar-absorbent material (RAM) or by purpose-shaping the target so that the scattered energy is redirected away from the source. While RAMs function over a narrow frequency band, purpose-shaping limits the aerodynamic performance of the target. More recently, metamaterials and metasurfaces have been synthesized that can redirect the scattered energy away from the source, using either array theory or the generalized Snell's law. This has led to aerodynamically favorable target shapes with reduced RCS.
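
The generalized Snell's law mentioned above reads sin(theta_r) - sin(theta_i) = (lambda/2π)·dPhi/dx for a surface phase gradient dPhi/dx. The sketch below (hypothetical numbers) shows how a gradient steers the reflected lobe away from the specular direction:

    import math

    def anomalous_reflection_deg(theta_i_deg, wavelength_m, phase_gradient):
        # Generalized Snell's law for a phase-gradient metasurface.
        # Steering the reflected lobe away from the source direction is
        # one route to reducing monostatic RCS.
        s = math.sin(math.radians(theta_i_deg)) \
            + wavelength_m * phase_gradient / (2 * math.pi)
        if abs(s) > 1:
            return None   # no propagating reflected lobe (evanescent)
        return math.degrees(math.asin(s))

    # 10 GHz (lambda = 3 cm), normal incidence, 100 rad/m phase gradient:
    print(anomalous_reflection_deg(0.0, 0.03, 100.0))   # ~28.5 degrees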

Seismic protection

Seismic metamaterials counteract the adverse effects of seismic waves on man-made structures.

Sound filtering

Metamaterials textured with nanoscale wrinkles could control sound or light signals, such as changing a material's color or improving ultrasound resolution. Uses include nondestructive material testing, medical diagnostics and sound suppression. The materials can be made through a high-precision, multi-layer deposition process. The thickness of each layer can be controlled within a fraction of a wavelength. The material is then compressed, creating precise wrinkles whose spacing can cause scattering of selected frequencies.
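
A back-of-the-envelope way to see the frequency selectivity is the first-order Bragg condition, lambda = 2d at normal incidence, so a wrinkle spacing d scatters most strongly near f = c/(2d). This is a generic estimate with an invented spacing, not a figure from the article:

    def bragg_frequency(spacing_m, wave_speed=343.0):
        # First-order Bragg scattering from a periodic wrinkle pattern
        # at normal incidence: strongest at f = c / (2 * d). Compressing
        # the stack changes d and retunes the filtered frequency.
        return wave_speed / (2 * spacing_m)

    print(bragg_frequency(1e-2))   # ~17 kHz for a 1 cm spacing in air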

Theoretical models

All materials are made of atoms, which are dipoles. These dipoles modify light's velocity by a factor n (the refractive index). In a split-ring resonator the ring and wire units act as atomic dipoles: the wire acts as a ferroelectric atom, the ring acts as an inductor L, and the open section acts as a capacitor C. The ring as a whole acts as an LC circuit. When an electromagnetic field passes through the ring, an induced current is created, and the generated field is perpendicular to the light's magnetic field. The magnetic resonance results in a negative permeability, and the refractive index is negative as well. (The lens is not truly flat, since the structure's capacitance imposes a slope on the electric induction.)
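
The equivalent-circuit picture gives the ring's resonance directly: f0 = 1/(2π·sqrt(LC)). The values below are illustrative, not taken from the article:

    import math

    def srr_resonance_hz(inductance_h, capacitance_f):
        # Resonance of the split ring's equivalent LC circuit. Near f0
        # the magnetic response is strong enough to drive the effective
        # permeability negative.
        return 1.0 / (2 * math.pi * math.sqrt(inductance_h * capacitance_f))

    # Millimeter-scale ring with L ~ 1 nH and C ~ 1 pF:
    print(srr_resonance_hz(1e-9, 1e-12))   # ~5 GHz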

Several (mathematical) material models describe the frequency response of DNGs (double-negative media). One of these is the Lorentz model, which describes electron motion in terms of a driven, damped harmonic oscillator. The Debye relaxation model applies when the acceleration component of the Lorentz model is small compared to the other components of the equation. The Drude model applies when the restoring-force component is negligible; the coupling coefficient is then generally the plasma frequency. Other component distinctions call for the use of one of these models, depending on polarity or purpose.
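
The three models are one-liners once written down; the sketch below implements the standard forms under the e^(-iωt) time convention (textbook expressions, not formulas quoted in the article):

    def lorentz_eps(omega, omega_p, omega_0, gamma):
        # Lorentz model: driven, damped harmonic oscillator,
        # eps(w) = 1 + wp^2 / (w0^2 - w^2 - i*gamma*w).
        return 1 + omega_p**2 / (omega_0**2 - omega**2 - 1j * gamma * omega)

    def drude_eps(omega, omega_p, gamma):
        # Drude model: Lorentz with the restoring force dropped (w0 -> 0),
        # appropriate for free carriers in a metal.
        return 1 - omega_p**2 / (omega**2 + 1j * gamma * omega)

    def debye_eps(omega, eps_s, eps_inf, tau):
        # Debye relaxation: inertia (acceleration term) neglected,
        # eps(w) = eps_inf + (eps_s - eps_inf) / (1 - i*w*tau).
        return eps_inf + (eps_s - eps_inf) / (1 - 1j * omega * tau)

    # Below the plasma frequency the Drude permittivity is negative:
    print(drude_eps(0.5, 1.0, 0.01))   # ~(-3.0 + 0.08j), omega in units of omega_p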

Three-dimensional composites of metallic or non-metallic inclusions embedded periodically or randomly in a low-permittivity matrix are usually modeled by analytical methods, including mixing formulas and scattering-matrix-based methods. A particle is modeled either by an electric dipole parallel to the electric field or by a pair of crossed electric and magnetic dipoles parallel to the electric and magnetic fields, respectively, of the applied wave. These dipoles are the leading terms in the multipole series. They are the only ones that exist for a homogeneous sphere, whose polarizability can easily be obtained from the Mie scattering coefficients. In general, this procedure is known as the "point-dipole approximation", which is a good approximation for metamaterials consisting of composites of electrically small spheres. Merits of these methods include low calculation cost and mathematical simplicity.
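
For spherical inclusions the point-dipole picture leads to the Maxwell Garnett mixing formula, sketched below (a standard result; the permittivities and fill fraction are invented for illustration):

    def maxwell_garnett(eps_i, eps_h, f):
        # Maxwell Garnett effective permittivity for a dilute suspension
        # of small spheres (permittivity eps_i, volume fraction f) in a
        # host of permittivity eps_h, from the Clausius-Mossotti /
        # point-dipole approximation.
        num = eps_i + 2 * eps_h + 2 * f * (eps_i - eps_h)
        den = eps_i + 2 * eps_h - f * (eps_i - eps_h)
        return eps_h * num / den

    # 10% metallic spheres (eps ~ -10) in a polymer host (eps ~ 2):
    print(maxwell_garnett(-10.0, 2.0, 0.10))   # ~3.5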

Other first-principles techniques for analyzing triply periodic electromagnetic media may be found in the article Computing photonic band structure.

Institutional networks

MURI

The Multidisciplinary University Research Initiative (MURI) encompasses dozens of universities and a few government organizations. Participating universities include UC Berkeley, UC Los Angeles, UC San Diego, the Massachusetts Institute of Technology, and Imperial College London. The sponsors are the Office of Naval Research and the Defense Advanced Research Projects Agency.

MURI supports research that intersects more than one traditional science and engineering discipline to accelerate both research and translation to applications. As of 2009, 69 academic institutions were expected to participate in 41 research efforts.

Metamorphose

The Virtual Institute for Artificial Electromagnetic Materials and Metamaterials ("Metamorphose VI AISBL") is an international association that promotes artificial electromagnetic materials and metamaterials. It organizes scientific conferences, supports specialized journals, creates and manages research programs, provides training programs (including PhD programs and training for industrial partners), and fosters technology transfer to European industry.
