Friday, April 27, 2018

Nanomedicine

From Wikipedia, the free encyclopedia

Nanomedicine is the medical application of nanotechnology.[1] Nanomedicine ranges from the medical applications of nanomaterials and biological devices, to nanoelectronic biosensors, and even possible future applications of molecular nanotechnology such as biological machines. Current problems for nanomedicine involve understanding the issues related to toxicity and environmental impact of nanoscale materials (materials whose structure is on the scale of nanometers, i.e. billionths of a meter).

Functionalities can be added to nanomaterials by interfacing them with biological molecules or structures. The size of nanomaterials is similar to that of most biological molecules and structures; therefore, nanomaterials can be useful for both in vivo and in vitro biomedical research and applications. Thus far, the integration of nanomaterials with biology has led to the development of diagnostic devices, contrast agents, analytical tools, physical therapy applications, and drug delivery vehicles.

Nanomedicine seeks to deliver a valuable set of research tools and clinically useful devices in the near future.[2][3] The National Nanotechnology Initiative expects new commercial applications in the pharmaceutical industry that may include advanced drug delivery systems, new therapies, and in vivo imaging.[4] Nanomedicine research is receiving funding from the US National Institutes of Health Common Fund program, supporting four nanomedicine development centers.[5]

Nanomedicine sales reached $16 billion in 2015, with a minimum of $3.8 billion in nanotechnology R&D being invested every year. Global funding for emerging nanotechnology increased by 45% per year in recent years, with product sales exceeding $1 trillion in 2013.[6] As the nanomedicine industry continues to grow, it is expected to have a significant impact on the economy.

Drug delivery

Nanoparticles (top), liposomes (middle), and dendrimers (bottom) are some nanomaterials being investigated for use in nanomedicine.

Nanotechnology has provided the possibility of delivering drugs to specific cells using nanoparticles.[7] The overall drug consumption and side-effects may be lowered significantly by depositing the active agent in the morbid region only and in no higher dose than needed. Targeted drug delivery is intended to reduce the side effects of drugs with concomitant decreases in consumption and treatment expenses. Drug delivery focuses on maximizing bioavailability both at specific places in the body and over a period of time. This can potentially be achieved by molecular targeting by nanoengineered devices.[8][9] A benefit of using the nanoscale for medical technologies is that smaller devices are less invasive and can possibly be implanted inside the body, and biochemical reaction times are much shorter. These devices are faster and more sensitive than typical drug delivery systems.[10] The efficacy of drug delivery through nanomedicine is largely based upon: a) efficient encapsulation of the drug, b) successful delivery of the drug to the targeted region of the body, and c) successful release of the drug.[citation needed]

Drug delivery systems, lipid-[11] or polymer-based nanoparticles,[12] can be designed to improve the pharmacokinetics and biodistribution of the drug.[13][14][15] However, the pharmacokinetics and pharmacodynamics of nanomedicine are highly variable among different patients.[16] When designed to avoid the body's defence mechanisms,[17] nanoparticles have beneficial properties that can be used to improve drug delivery. Complex drug delivery mechanisms are being developed, including the ability to get drugs through cell membranes and into the cell cytoplasm. Triggered response is one way for drug molecules to be used more efficiently. Drugs are placed in the body and only activate on encountering a particular signal. For example, a drug with poor solubility will be replaced by a drug delivery system where both hydrophilic and hydrophobic environments exist, improving the solubility.[18] Drug delivery systems may also be able to prevent tissue damage through regulated drug release; reduce drug clearance rates; or lower the volume of distribution and reduce the effect on non-target tissue. However, the biodistribution of these nanoparticles is still imperfect due to the host's complex reactions to nano- and microsized materials[17] and the difficulty of targeting specific organs in the body. Nevertheless, much work is ongoing to optimize and better understand the potential and limitations of nanoparticulate systems. While research demonstrates that nanoparticles can improve targeting and distribution, understanding the dangers of nanotoxicity is an important next step for their medical use.[19]

Nanoparticles are under research for their potential to decrease antibiotic resistance or for various antimicrobial uses.[20][21][22] Nanoparticles might also be used to circumvent multidrug resistance (MDR) mechanisms.[7]

Systems under research

Two forms of nanomedicine that have already been tested in mice and are awaiting human testing use gold nanoshells to help diagnose and treat cancer,[23] along with liposomes as vaccine adjuvants and drug transport vehicles.[24][25] Similarly, drug detoxification is another application for nanomedicine which has shown promising results in rats.[26] Advances in lipid nanotechnology have also been instrumental in engineering medical nanodevices and novel drug delivery systems, as well as in developing sensing applications.[27] Other examples include dendrimers and nanoporous materials, and block co-polymers, which form micelles for drug encapsulation.[12]

Polymeric nanoparticles are a competing technology to lipidic (based mainly on phospholipids) nanoparticles. There is an additional risk of toxicity associated with polymers that are not widely studied or understood. The major advantages of polymers are stability, lower cost and predictable characterisation. However, in the patient's body this very stability (slow degradation) is a negative factor. Phospholipids, on the other hand, are membrane lipids (already present in the body and surrounding each cell), have a GRAS (Generally Recognised As Safe) status from the FDA, and are derived from natural sources without any complex chemistry involved. They are not metabolised but rather absorbed by the body, and the degradation products are themselves nutrients (fats or micronutrients).[citation needed]

Proteins and peptides exert multiple biological actions in the human body and have been identified as showing great promise for the treatment of various diseases and disorders. These macromolecules are called biopharmaceuticals. Targeted and/or controlled delivery of these biopharmaceuticals using nanomaterials such as nanoparticles[28] and dendrimers is an emerging field called nanobiopharmaceutics, and these products are called nanobiopharmaceuticals.[citation needed]

Another highly efficient system for microRNA delivery is, for example, nanoparticles formed by the self-assembly of two different microRNAs that are deregulated in cancer.[29]

Another vision is based on small electromechanical systems; nanoelectromechanical systems are being investigated for the active release of drugs and for sensors. Some potentially important applications include cancer treatment with iron nanoparticles or gold shells, and early cancer diagnosis.[30] Nanotechnology is also opening up new opportunities in implantable delivery systems, which are often preferable to the use of injectable drugs, because the latter frequently display first-order kinetics (the blood concentration goes up rapidly, but drops exponentially over time). This rapid rise may cause difficulties with toxicity, and drug efficacy can diminish as the drug concentration falls below the targeted range.[citation needed]

Applications

Some nanotechnology-based drugs that are commercially available or in human clinical trials include:
  • Abraxane, approved by the U.S. Food and Drug Administration (FDA) to treat breast cancer,[31] non-small-cell lung cancer (NSCLC)[32] and pancreatic cancer,[33] is the nanoparticle albumin-bound paclitaxel.
  • Doxil was originally approved by the FDA for use against HIV-related Kaposi's sarcoma. It is now also being used to treat ovarian cancer and multiple myeloma. The drug is encased in liposomes, which helps to extend the life of the drug as it is distributed. Liposomes are self-assembling, spherical, closed colloidal structures composed of lipid bilayers that surround an aqueous space. The liposomes also help to increase the drug's functionality and, in particular, to decrease the damage the drug does to the heart muscle.[34]
  • Onivyde, liposome encapsulated irinotecan to treat metastatic pancreatic cancer, was approved by FDA in October 2015.[35]
  • C-dots (Cornell dots) are the smallest silica-based nanoparticles, with a size of less than 10 nm. The particles are infused with organic dye which lights up with fluorescence. A clinical trial has been underway since 2011 to use the C-dots as a diagnostic tool to assist surgeons in identifying the location of tumor cells.[36]
  • An early-phase clinical trial using the 'Minicell' nanoparticle platform for drug delivery has been conducted on patients with advanced and untreatable cancer. Built from the membranes of mutant bacteria, the minicells were loaded with paclitaxel and coated with cetuximab, an antibody that binds the epidermal growth factor receptor (EGFR), which is often overexpressed in a number of cancers, as a 'homing' device to the tumor cells. The tumor cells recognize the bacteria from which the minicells have been derived, regard them as invading microorganisms, and engulf them. Once inside, the payload of anti-cancer drug kills the tumor cells. Measured at 400 nanometers, the minicell is bigger than synthetic particles developed for drug delivery. The researchers indicated that this larger size gives the minicells a better side-effect profile, because the minicells preferentially leak out of the porous blood vessels around the tumor cells and do not reach the liver, digestive system and skin. This Phase 1 clinical trial demonstrated that the treatment is well tolerated by patients. As a platform technology, the minicell drug delivery system can be used to treat a number of different cancers with different anti-cancer drugs, with the benefit of lower doses and fewer side-effects.[37][38]
  • In 2014, a Phase 3 clinical trial for treating inflammation and pain after cataract surgery, and a Phase 2 trial for treating dry eye disease were initiated using nanoparticle loteprednol etabonate.[39] In 2015, the product, KPI-121 was found to produce statistically significant positive results for the post-surgery treatment.[40]

Cancer

Nanoparticles have a high surface-area-to-volume ratio. This allows many functional groups to be attached to a nanoparticle, which can seek out and bind to certain tumor cells.[47] Additionally, the small size of nanoparticles (10 to 100 nanometers) allows them to preferentially accumulate at tumor sites (because tumors lack an effective lymphatic drainage system).[48] Limitations of conventional cancer chemotherapy include drug resistance, lack of selectivity, and lack of solubility.[46] Nanoparticles have the potential to overcome these problems.[41][49]
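As a minimal illustrative sketch (not from the article; the radii chosen are arbitrary), the surface-area-to-volume argument can be made concrete for ideal spherical particles, where the ratio simplifies to 3/r and therefore grows rapidly as the radius shrinks toward the nanoscale:

import math

def surface_to_volume_ratio(radius_nm: float) -> float:
    """Surface-area-to-volume ratio (in 1/nm) of an ideal sphere."""
    surface = 4.0 * math.pi * radius_nm ** 2
    volume = (4.0 / 3.0) * math.pi * radius_nm ** 3
    return surface / volume  # algebraically equal to 3 / radius_nm

# Hypothetical radii: a 1 micrometre microparticle vs. 50 nm and 5 nm nanoparticles.
for r in (1000.0, 50.0, 5.0):
    print(f"radius {r:7.1f} nm -> SA/V = {surface_to_volume_ratio(r):.3f} nm^-1")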

In photodynamic therapy, a particle is placed within the body and is illuminated with light from the outside. The light is absorbed by the particle, and if the particle is metal, energy from the light will heat the particle and the surrounding tissue. Light may also be used to produce high-energy oxygen molecules which chemically react with and destroy most organic molecules next to them (such as tumors). This therapy is appealing for many reasons. Unlike chemotherapy, it does not leave a "toxic trail" of reactive molecules throughout the body, because it acts only where the light is shined and the particles are present. Photodynamic therapy has potential as a noninvasive procedure for dealing with diseases, growths and tumors. Kanzius RF therapy is one example of such a therapy (nanoparticle hyperthermia).[citation needed] Also, gold nanoparticles have the potential to join numerous therapeutic functions into a single platform, by targeting specific tumor cells, tissues and organs.[50][51]

    Imaging

In vivo imaging is another area where tools and devices are being developed.[52] With nanoparticle contrast agents, images such as ultrasound and MRI scans show a favorable distribution and improved contrast. In cardiovascular imaging, nanoparticles have the potential to aid visualization of blood pooling, ischemia, angiogenesis, atherosclerosis, and focal areas where inflammation is present.[52]

The small size of nanoparticles endows them with properties that can be very useful in oncology, particularly in imaging.[7] Quantum dots (nanoparticles with quantum confinement properties, such as size-tunable light emission), when used in conjunction with MRI (magnetic resonance imaging), can produce exceptional images of tumor sites. Nanoparticles of cadmium selenide (quantum dots) glow when exposed to ultraviolet light. When injected, they seep into cancer tumors. The surgeon can see the glowing tumor and use it as a guide for more accurate tumor removal. These nanoparticles are much brighter than organic dyes and only need one light source for excitation. This means that the use of fluorescent quantum dots could produce a higher-contrast image at a lower cost than today's organic dyes used as contrast media. The downside, however, is that quantum dots are usually made of quite toxic elements, though this concern may be addressed by the use of fluorescent dopants.[53]

Tracking movement can help determine how well drugs are being distributed or how substances are metabolized. It is difficult to track a small group of cells throughout the body, so scientists used to dye the cells. These dyes needed to be excited by light of a certain wavelength in order to light up. While different color dyes absorb different frequencies of light, there was a need for as many light sources as there were cell groups. A way around this problem is with luminescent tags. These tags are quantum dots attached to proteins that penetrate cell membranes.[53] The dots can be random in size, can be made of bio-inert material, and demonstrate the nanoscale property that color is size-dependent. As a result, sizes are selected so that the frequency of light used to make one group of quantum dots fluoresce is an even multiple of the frequency required to make another group fluoresce. Then both groups can be lit with a single light source. Researchers have also found a way to insert nanoparticles[54] into the affected parts of the body so that those parts glow, showing tumor growth or shrinkage as well as organ trouble.[55]

    Sensing

    Nanotechnology-on-a-chip is one more dimension of lab-on-a-chip technology. Magnetic nanoparticles, bound to a suitable antibody, are used to label specific molecules, structures or microorganisms. In particular silica nanoparticles are inert from the photophysical point of view and might accumulate a large number of dye(s) within the nanoparticle shell.[28] Gold nanoparticles tagged with short segments of DNA can be used for detection of genetic sequence in a sample. Multicolor optical coding for biological assays has been achieved by embedding different-sized quantum dots into polymeric microbeads. Nanopore technology for analysis of nucleic acids converts strings of nucleotides directly into electronic signatures.[citation needed]

Sensor test chips containing thousands of nanowires, able to detect proteins and other biomarkers left behind by cancer cells, could enable the detection and diagnosis of cancer in its early stages from a few drops of a patient's blood.[56] Nanotechnology is helping to advance the use of arthroscopes, which are pencil-sized devices with lights and cameras that allow surgeons to operate through smaller incisions. The smaller the incision, the faster the healing time, which is better for patients. Nanotechnology is also helping to find a way to make an arthroscope smaller than a strand of hair.[57]

Research on nanoelectronics-based cancer diagnostics could lead to tests that can be done in pharmacies. The results promise to be highly accurate and the product promises to be inexpensive. Such tests could take a very small amount of blood and detect cancer anywhere in the body in about five minutes, with a sensitivity that is a thousand times better than a conventional laboratory test. These devices are built with nanowires to detect cancer proteins; each nanowire detector is primed to be sensitive to a different cancer marker.[30] The biggest advantage of the nanowire detectors is that they could test for anywhere from ten to one hundred similar medical conditions without adding cost to the testing device.[58] Nanotechnology has also helped to personalize oncology for the detection, diagnosis, and treatment of cancer. Treatment can now be tailored to each individual's tumor for better performance. Researchers have found ways to target the specific part of the body that is being affected by cancer.[59]

    Blood purification

Magnetic microparticles are proven research instruments for the separation of cells and proteins from complex media. The technology is available under names such as Magnetic-activated cell sorting or Dynabeads, among others. More recently it was shown in animal models that magnetic nanoparticles can be used for the removal of various noxious compounds, including toxins, pathogens, and proteins, from whole blood in an extracorporeal circuit similar to dialysis.[60][61] In contrast to dialysis, which works on the principle of size-related diffusion of solutes and ultrafiltration of fluid across a semi-permeable membrane, purification with nanoparticles allows specific targeting of substances. Additionally, larger compounds which are commonly not dialyzable can be removed.[citation needed]

    The purification process is based on functionalized iron oxide or carbon coated metal nanoparticles with ferromagnetic or superparamagnetic properties.[62] Binding agents such as proteins,[61] antibodies,[60] antibiotics,[63] or synthetic ligands[64] are covalently linked to the particle surface. These binding agents are able to interact with target species forming an agglomerate. Applying an external magnetic field gradient allows exerting a force on the nanoparticles. Hence the particles can be separated from the bulk fluid, thereby cleaning it from the contaminants.[65][66]

    The small size (< 100 nm) and large surface area of functionalized nanomagnets leads to advantageous properties compared to hemoperfusion, which is a clinically used technique for the purification of blood and is based on surface adsorption. These advantages are high loading and accessibility of the binding agents, high selectivity towards the target compound, fast diffusion, small hydrodynamic resistance, and low dosage.[67]

This approach offers new therapeutic possibilities for the treatment of systemic infections such as sepsis by directly removing the pathogen. It can also be used to selectively remove cytokines or endotoxins[63] or for the dialysis of compounds which are not accessible by traditional dialysis methods. However, the technology is still in a preclinical phase and the first clinical trials are not expected before 2017.[68]

    Tissue engineering

Nanotechnology may be used as part of tissue engineering to help reproduce, repair, or reshape damaged tissue using suitable nanomaterial-based scaffolds and growth factors. Tissue engineering, if successful, may replace conventional treatments like organ transplants or artificial implants. Nanoparticles such as graphene, carbon nanotubes, molybdenum disulfide and tungsten disulfide are being used as reinforcing agents to fabricate mechanically strong biodegradable polymeric nanocomposites for bone tissue engineering applications. The addition of these nanoparticles to the polymer matrix at low concentrations (~0.2 weight %) leads to significant improvements in the compressive and flexural mechanical properties of polymeric nanocomposites.[69][70] Potentially, these nanocomposites may be used as novel, mechanically strong, lightweight composites for bone implants.[citation needed]

    For example, a flesh welder was demonstrated to fuse two pieces of chicken meat into a single piece using a suspension of gold-coated nanoshells activated by an infrared laser. This could be used to weld arteries during surgery.[71] Another example is nanonephrology, the use of nanomedicine on the kidney.

    Medical devices

Neuro-electronic interfacing is a visionary goal dealing with the construction of nanodevices that will permit computers to be joined and linked to the nervous system. This idea requires the building of a molecular structure that will permit control and detection of nerve impulses by an external computer. A refuelable strategy implies that energy is refilled continuously or periodically with external sonic, chemical, tethered, magnetic, or biological electrical sources, while a nonrefuelable strategy implies that all power is drawn from internal energy storage, which would stop when all the energy is drained. A nanoscale enzymatic biofuel cell for self-powered nanodevices has been developed that uses glucose from biofluids including human blood and watermelons.[72] One limitation of this innovation is the possibility of electrical interference, leakage, or overheating from power consumption. The wiring of the structure is extremely difficult because the structures must be positioned precisely within the nervous system. The structures that will provide the interface must also be compatible with the body's immune system.[73]

    Molecular nanotechnology is a speculative subfield of nanotechnology regarding the possibility of engineering molecular assemblers, machines which could re-order matter at a molecular or atomic scale. Nanomedicine would make use of these nanorobots, introduced into the body, to repair or detect damages and infections. Molecular nanotechnology is highly theoretical, seeking to anticipate what inventions nanotechnology might yield and to propose an agenda for future inquiry. The proposed elements of molecular nanotechnology, such as molecular assemblers and nanorobots are far beyond current capabilities.[1][73][74][75] Future advances in nanomedicine could give rise to life extension through the repair of many processes thought to be responsible for aging. K. Eric Drexler, one of the founders of nanotechnology, postulated cell repair machines, including ones operating within cells and utilizing as yet hypothetical molecular machines, in his 1986 book Engines of Creation, with the first technical discussion of medical nanorobots by Robert Freitas appearing in 1999.[1] Raymond Kurzweil, a futurist and transhumanist, stated in his book The Singularity Is Near that he believes that advanced medical nanorobotics could completely remedy the effects of aging by 2030.[76] According to Richard Feynman, it was his former graduate student and collaborator Albert Hibbs who originally suggested to him (circa 1959) the idea of a medical use for Feynman's theoretical micromachines (see nanotechnology). Hibbs suggested that certain repair machines might one day be reduced in size to the point that it would, in theory, be possible to (as Feynman put it) "swallow the doctor". The idea was incorporated into Feynman's 1959 essay There's Plenty of Room at the Bottom.[77]

    Thursday, April 26, 2018

    Entropy in thermodynamics and information theory

    From Wikipedia, the free encyclopedia

There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by S, of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the information-theoretic entropy, usually expressed as H, developed by Claude Shannon and Ralph Hartley in the 1940s. Shannon commented on the similarity upon publicizing information theory in A Mathematical Theory of Communication.

    This article explores what links there are between the two concepts, and how far they can be regarded as connected.

    Equivalence of form of the defining expressions


    Boltzmann's grave in the Zentralfriedhof, Vienna, with bust and entropy formula.

    The defining expression for entropy in the theory of statistical mechanics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, is of the form:
S = -k_{\text{B}} \sum_i p_i \ln p_i,
where p_i is the probability of the microstate i taken from an equilibrium ensemble.

    The defining expression for entropy in the theory of information established by Claude E. Shannon in 1948 is of the form:
H = -\sum_i p_i \log_b p_i,
where p_i is the probability of the message m_i taken from the message space M, and b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10, and the unit of entropy is shannon (or bit) for b = 2, nat for b = e, and hartley for b = 10.[1]

    Mathematically H may also be seen as an average information, taken over the message space, because when a certain message occurs with probability pi, the information quantity −log(pi) will be obtained.
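As a minimal sketch of the definition above (the probability vector and function name are illustrative, not from the article), the same distribution yields an entropy in shannons, nats or hartleys depending only on the logarithm base b:

import math

def shannon_entropy(probs, base=2):
    """H = -sum_i p_i * log_b(p_i); terms with p_i == 0 contribute nothing."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Illustrative message-space distribution (probabilities must sum to 1).
p = [0.5, 0.25, 0.125, 0.125]

print(shannon_entropy(p, base=2))        # 1.75 shannons (bits)
print(shannon_entropy(p, base=math.e))   # ~1.213 nats
print(shannon_entropy(p, base=10))       # ~0.527 hartleys
# The three values differ only by a unit conversion: one nat is 1/ln(2) ~ 1.443 bits.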

    If all the microstates are equiprobable (a microcanonical ensemble), the statistical thermodynamic entropy reduces to the form, as given by Boltzmann,
S = k_{\text{B}} \ln W,
    where W is the number of microstates that corresponds to the macroscopic thermodynamic state. Therefore S depends on temperature.

    If all the messages are equiprobable, the information entropy reduces to the Hartley entropy
H = \log_b |M|,
    where |M| is the cardinality of the message space M.

    The logarithm in the thermodynamic definition is the natural logarithm. It can be shown that the Gibbs entropy formula, with the natural logarithm, reproduces all of the properties of the macroscopic classical thermodynamics of Rudolf Clausius. (See article: Entropy (statistical views)).

    The logarithm can also be taken to the natural base in the case of information entropy. This is equivalent to choosing to measure information in nats instead of the usual bits (or more formally, shannons). In practice, information entropy is almost always calculated using base 2 logarithms, but this distinction amounts to nothing other than a change in units. One nat is about 1.44 bits.

    For a simple compressible system that can only perform volume work, the first law of thermodynamics becomes
dE = -p\,dV + T\,dS.
    But one can equally well write this equation in terms of what physicists and chemists sometimes call the 'reduced' or dimensionless entropy, σ = S/k, so that
dE = -p\,dV + k_{\text{B}} T\,d\sigma.
    Just as S is conjugate to T, so σ is conjugate to kBT (the energy that is characteristic of T on a molecular scale).

    Theoretical relationship

    Despite the foregoing, there is a difference between the two quantities. The information entropy H can be calculated for any probability distribution (if the "message" is taken to be that the event i which had probability pi occurred, out of the space of the events possible), while the thermodynamic entropy S refers to thermodynamic probabilities pi specifically. The difference is more theoretical than actual, however, because any probability distribution can be approximated arbitrarily closely by some thermodynamic system.[citation needed]

    Moreover, a direct connection can be made between the two. If the probabilities in question are the thermodynamic probabilities pi: the (reduced) Gibbs entropy σ can then be seen as simply the amount of Shannon information needed to define the detailed microscopic state of the system, given its macroscopic description. Or, in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more". To be more concrete, in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes–no questions needed to be answered in order to fully specify the microstate, given that we know the macrostate.

    Furthermore, the prescription to find the equilibrium distributions of statistical mechanics—such as the Boltzmann distribution—by maximising the Gibbs entropy subject to appropriate constraints (the Gibbs algorithm) can be seen as something not unique to thermodynamics, but as a principle of general relevance in statistical inference, if it is desired to find a maximally uninformative probability distribution, subject to certain constraints on its averages. (These perspectives are explored further in the article Maximum entropy thermodynamics.)

The Shannon entropy in information theory is sometimes expressed in units of bits per symbol. The physical entropy may be on a "per quantity" basis (h), which is called "intensive" entropy, as opposed to the usual total entropy, which is called "extensive" entropy. The "shannons" of a message (H) are its total "extensive" information entropy, equal to h times the number of symbols in the message.

    A direct and physically real relationship between h and S can be found by assigning a symbol to each microstate that occurs per mole, kilogram, volume, or particle of a homogeneous substance, then calculating the 'h' of these symbols. By theory or by observation, the symbols (microstates) will occur with different probabilities and this will determine h. If there are N moles, kilograms, volumes, or particles of the unit substance, the relationship between h (in bits per unit substance) and physical extensive entropy in nats is:
S = k_{\text{B}} \ln(2)\, N h
where ln(2) is the conversion factor from the base 2 of Shannon entropy to the natural base e of physical entropy. N h is the amount of information in bits needed to describe the state of a physical system with entropy S. Landauer's principle demonstrates the reality of this by stating that the minimum energy E required (and therefore heat Q generated) by an ideally efficient memory change or logic operation that irreversibly erases or merges N h bits of information will be S times the temperature,
E = Q = T k_{\text{B}} \ln(2)\, N h
    where h is in informational bits and E and Q are in physical Joules. This has been experimentally confirmed.[2]
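As a small numerical sketch of this bound (the 300 K room temperature and the variable names are assumptions for illustration, not values from the article), erasing even very large amounts of information costs remarkably little energy at the Landauer limit:

import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def landauer_heat(n_bits: float, temperature: float = 300.0) -> float:
    """Minimum heat Q = k_B * T * ln(2) * n_bits for irreversibly erasing n_bits."""
    return K_B * temperature * math.log(2) * n_bits

print(landauer_heat(1))        # ~2.87e-21 J to erase a single bit at 300 K
print(landauer_heat(8e9 * 8))  # erasing a hypothetical 8 GB of data: ~1.8e-10 J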

Temperature is a measure of the average kinetic energy per particle in an ideal gas (kelvins = 2/3 × joules/kB), so the J/K units of kB are fundamentally unitless (joules/joules). kB is the conversion factor from energy in 3/2 × kelvins to joules for an ideal gas. If kinetic energy measurements per particle of an ideal gas were expressed as joules instead of kelvins, kB in the above equations would be replaced by 3/2. This shows that S is a true statistical measure of microstates that does not have a fundamental physical unit other than the units of information, in this case "nats", which is just a statement of which logarithm base was chosen by convention.

    Information is physical

    Szilard's engine


    N-atom engine schematic

    A physical thought experiment demonstrating how just the possession of information might in principle have thermodynamic consequences was established in 1929 by Leó Szilárd, in a refinement of the famous Maxwell's demon scenario.

Consider Maxwell's set-up, but with only a single gas particle in a box. If the supernatural demon knows which half of the box the particle is in (equivalent to a single bit of information), it can close a shutter between the two halves of the box, push a piston unopposed into the empty half of the box, and then extract kBT ln 2 joules of useful work when the shutter is opened again. The particle can then be left to isothermally expand back to its original equilibrium occupied volume. In just the right circumstances, therefore, the possession of a single bit of Shannon information (a single bit of negentropy, in Brillouin's term) really does correspond to a reduction in the entropy of the physical system. The global entropy is not decreased, but information-to-energy conversion is possible.
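The kBT ln 2 figure quoted above follows from the isothermal, quasi-static expansion of the one-particle gas from half the box back to the full volume; a sketch of the standard calculation, using the single-particle ideal-gas law p = kBT/V', is:

W = \int_{V/2}^{V} p \, dV' = \int_{V/2}^{V} \frac{k_{\text{B}} T}{V'} \, dV' = k_{\text{B}} T \ln\frac{V}{V/2} = k_{\text{B}} T \ln 2.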

Using a phase-contrast microscope equipped with a high-speed camera connected to a computer as the demon, the principle has actually been demonstrated.[3] In this experiment, information-to-energy conversion is performed on a Brownian particle by means of feedback control; that is, synchronizing the work given to the particle with the information obtained on its position. Computing energy balances for different feedback protocols has confirmed that the Jarzynski equality requires a generalization that accounts for the amount of information involved in the feedback.

    Landauer's principle

    In fact one can generalise: any information that has a physical representation must somehow be embedded in the statistical mechanical degrees of freedom of a physical system.

    Thus, Rolf Landauer argued in 1961, if one were to imagine starting with those degrees of freedom in a thermalised state, there would be a real reduction in thermodynamic entropy if they were then re-set to a known state. This can only be achieved under information-preserving microscopically deterministic dynamics if the uncertainty is somehow dumped somewhere else – i.e. if the entropy of the environment (or the non information-bearing degrees of freedom) is increased by at least an equivalent amount, as required by the Second Law, by gaining an appropriate quantity of heat: specifically kT ln 2 of heat for every 1 bit of randomness erased.

    On the other hand, Landauer argued, there is no thermodynamic objection to a logically reversible operation potentially being achieved in a physically reversible way in the system. It is only logically irreversible operations – for example, the erasing of a bit to a known state, or the merging of two computation paths – which must be accompanied by a corresponding entropy increase. When information is physical, all processing of its representations, i.e. generation, encoding, transmission, decoding and interpretation, are natural processes where entropy increases by consumption of free energy.[4]

    Applied to the Maxwell's demon/Szilard engine scenario, this suggests that it might be possible to "read" the state of the particle into a computing apparatus with no entropy cost; but only if the apparatus has already been SET into a known state, rather than being in a thermalised state of uncertainty. To SET (or RESET) the apparatus into this state will cost all the entropy that can be saved by knowing the state of Szilard's particle.

    Negentropy

Shannon entropy has been related by physicist Léon Brillouin to a concept sometimes called negentropy. In 1953, Brillouin derived a general equation[5] stating that changing an information bit value requires at least kT ln(2) energy. This is the same energy as the work Leo Szilard's engine produces in the idealistic case, which in turn equals the quantity found by Landauer. In his book,[6] he further explored this problem, concluding that any cause of a bit value change (measurement, decision about a yes/no question, erasure, display, etc.) will require the same amount, kT ln(2), of energy. Consequently, acquiring information about a system's microstates is associated with an entropy production, while erasure yields entropy production only when the bit value is changing. Setting up a bit of information in a sub-system originally in thermal equilibrium results in a local entropy reduction. However, there is no violation of the second law of thermodynamics, according to Brillouin, since a reduction in any local system's thermodynamic entropy results in an increase in thermodynamic entropy elsewhere. In this way, Brillouin clarified the meaning of negentropy, which was considered controversial because its earlier understanding could yield a Carnot efficiency higher than one. Additionally, the relationship between energy and information formulated by Brillouin has been proposed as a connection between the number of bits that the brain processes and the energy it consumes.[7]

    In 2009, Mahulikar & Herwig redefined thermodynamic negentropy as the specific entropy deficit of the dynamically ordered sub-system relative to its surroundings.[8] This definition enabled the formulation of the Negentropy Principle, which is mathematically shown to follow from the 2nd Law of Thermodynamics, during order existence.

    Black holes

Stephen Hawking often speaks of the thermodynamic entropy of black holes in terms of their information content.[9] Do black holes destroy information? It appears that there are deep relations between the entropy of a black hole and information loss.[10] See Black hole thermodynamics and Black hole information paradox.

    Quantum theory

    Hirschman showed,[11] cf. Hirschman uncertainty, that Heisenberg's uncertainty principle can be expressed as a particular lower bound on the sum of the classical distribution entropies of the quantum observable probability distributions of a quantum mechanical state, the square of the wave-function, in coordinate, and also momentum space, when expressed in Planck units. The resulting inequalities provide a tighter bound on the uncertainty relations of Heisenberg.

It is meaningful to assign a "joint entropy", because positions and momenta are quantum conjugate variables and are therefore not jointly observable. Mathematically, they have to be treated as a joint distribution. Note that this joint entropy is not equivalent to the Von Neumann entropy, −Tr ρ ln ρ = −⟨ln ρ⟩. Hirschman's entropy is said to account for the full information content of a mixture of quantum states.[12]

    (Dissatisfaction with the Von Neumann entropy from quantum information points of view has been expressed by Stotland, Pomeransky, Bachmat and Cohen, who have introduced a yet different definition of entropy that reflects the inherent uncertainty of quantum mechanical states. This definition allows distinction between the minimum uncertainty entropy of pure states, and the excess statistical entropy of mixtures.[13])

    The fluctuation theorem

    The fluctuation theorem provides a mathematical justification of the second law of thermodynamics under these principles, and precisely defines the limitations of the applicability of that law for systems away from thermodynamic equilibrium.

    Criticism

Criticisms exist of the link between thermodynamic entropy and information entropy.

    The most common criticism is that information entropy cannot be related to thermodynamic entropy because there is no concept of temperature, energy, or the second law, in the discipline of information entropy.[14][15][16][17][18] This can best be discussed by considering the fundamental equation of thermodynamics:
dU = \sum_i F_i\,dx_i
    where the Fi are "generalized forces" and the dxi are "generalized displacements". This is analogous to the mechanical equation dE = F dx where dE is the change in the kinetic energy of an object having been displaced by distance dx under the influence of force F. For example, for a simple gas, we have:
dU = T\,dS - P\,dV + \mu\,dN
    where the temperature (T ), pressure (P ), and chemical potential (µ ) are generalized forces which, when imbalanced, result in a generalized displacement in entropy (S ), volume (-V ) and quantity (N ) respectively, and the products of the forces and displacements yield the change in the internal energy (dU ) of the gas.

    In the mechanical example, to declare that dx is not a geometric displacement because it ignores the dynamic relationship between displacement, force, and energy is not correct. Displacement, as a concept in geometry, does not require the concepts of energy and force for its definition, and so one might expect that entropy may not require the concepts of energy and temperature for its definition. The situation is not that simple, however. In classical thermodynamics, which is the study of thermodynamics from a purely empirical, or measurement point of view, thermodynamic entropy can only be measured by considering energy and temperature. Clausius' statement dS= δQ/T, or, equivalently, when all other effective displacements are zero, dS=dU/T, is the only way to actually measure thermodynamic entropy. It is only with the introduction of statistical mechanics, the viewpoint that a thermodynamic system consists of a collection of particles and which explains classical thermodynamics in terms of probability distributions, that the entropy can be considered separately from temperature and energy. This is expressed in Boltzmann's famous entropy formula S=kB ln(W). Here kB is Boltzmann's constant, and W is the number of equally probable microstates which yield a particular thermodynamic state, or macrostate.

    Boltzmann's equation is presumed to provide a link between thermodynamic entropy S and information entropy H = −Σi pi ln pi = ln(W) where pi=1/W are the equal probabilities of a given microstate. This interpretation has been criticized also. While some say that the equation is merely a unit conversion equation between thermodynamic and information entropy, this is not completely correct.[19] A unit conversion equation will, e.g., change inches to centimeters, and yield two measurements in different units of the same physical quantity (length). Since thermodynamic and information entropy are dimensionally unequal (energy/unit temperature vs. units of information), Boltzmann's equation is more akin to x = c t where x is the distance travelled by a light beam in time t, c being the speed of light. While we cannot say that length x and time t represent the same physical quantity, we can say that, in the case of a light beam, since c is a universal constant, they will provide perfectly accurate measures of each other. (For example, the light-year is used as a measure of distance). Likewise, in the case of Boltzmann's equation, while we cannot say that thermodynamic entropy S and information entropy H represent the same physical quantity, we can say that, in the case of a thermodynamic system, since kB is a universal constant, they will provide perfectly accurate measures of each other.
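A small numerical sketch of this "accurate measures of each other" point (the microstate count W is arbitrary and chosen for illustration): for W equally probable microstates, H in bits and S in J/K differ only by the constant factor kB ln 2.

import math

K_B = 1.380649e-23               # Boltzmann constant, J/K

W = 2 ** 40                      # illustrative number of equally probable microstates
H_bits = math.log2(W)            # information entropy in bits: 40.0
S = K_B * math.log(W)            # thermodynamic entropy in J/K
print(H_bits, S)
print(S / (K_B * math.log(2)))   # dividing out k_B ln 2 recovers H_bits exactly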

    The question then remains whether ln(W) is an information-theoretic quantity. If it is measured in bits, one can say that, given the macrostate, it represents the number of yes/no questions one must ask to determine the microstate, clearly an information-theoretic concept. Objectors point out that such a process is purely conceptual, and has nothing to do with the measurement of entropy. Then again, the whole of statistical mechanics is purely conceptual, serving only to provide an explanation of the "pure" science of thermodynamics.

    Ultimately, the criticism of the link between thermodynamic entropy and information entropy is a matter of terminology, rather than substance. Neither side in the controversy will disagree on the solution to a particular thermodynamic or information-theoretic problem.

    Topics of recent research

    Is information quantized?

    In 1995, Tim Palmer signalled[citation needed] two unwritten assumptions about Shannon's definition of information that may make it inapplicable as such to quantum mechanics:
    • The supposition that there is such a thing as an observable state (for instance the upper face of a dice or a coin) before the observation begins
    • The fact that knowing this state does not depend on the order in which observations are made (commutativity)
    Anton Zeilinger's and Caslav Brukner's article[20] synthesized and developed these remarks. The so-called Zeilinger's principle suggests that the quantization observed in QM could be bound to information quantization (one cannot observe less than one bit, and what is not observed is by definition "random"). Nevertheless, these claims remain quite controversial. Detailed discussions of the applicability of the Shannon information in quantum mechanics and an argument that Zeilinger's principle cannot explain quantization have been published,[21][22][23] that show that Brukner and Zeilinger change, in the middle of the calculation in their article, the numerical values of the probabilities needed to compute the Shannon entropy, so that the calculation makes little sense.

    Extracting work from quantum information in a Szilárd engine

    In 2013, a description was published[24] of a two atom version of a Szilárd engine using Quantum discord to generate work from purely quantum information.[25] Refinements in the lower temperature limit were suggested.[26]

    Algorithmic cooling

    Algorithmic cooling is an algorithmic method for transferring heat (or entropy) from some qubits to others or outside the system and into the environment, thus resulting in a cooling effect. This cooling effect may have usages in initializing cold (highly pure) qubits for quantum computation and in increasing polarization of certain spins in nuclear magnetic resonance.

    Planck length

    From Wikipedia, the free encyclopedia
Planck length
Unit system: Planck units
Unit of: length
Symbol: ℓP
Unit conversions: 1 ℓP is equal to
   SI units: 1.616229(38)×10−35 m
   natural units: 11.706 ℓS; 3.0542×10−25 a0
   imperial/US units: 6.3631×10−34 in

In physics, the Planck length, denoted ℓP, is a unit of length, equal to 1.616229(38)×10−35 metres. It is a base unit in the system of Planck units, developed by physicist Max Planck. The Planck length can be defined from three fundamental physical constants: the speed of light in a vacuum, the Planck constant, and the gravitational constant.

    Value

The Planck length ℓP is defined as:
\ell_{\mathrm{P}} = \sqrt{\frac{\hbar G}{c^{3}}}
    Solving the above will show the approximate equivalent value of this unit with respect to the meter:
1\ \ell_{\mathrm{P}} \approx 1.616\,229(38) \times 10^{-35}\ \mathrm{m}, where c is the speed of light in a vacuum, G is the gravitational constant, and ħ is the reduced Planck constant. The two digits enclosed by parentheses are the estimated standard error associated with the reported numerical value.[1][2]
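As a quick numerical sketch of the defining formula (the constant values below are CODATA-2014-style roundings assumed for illustration):

import math

hbar = 1.054571800e-34  # reduced Planck constant, J*s
G = 6.67408e-11         # Newtonian gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light in vacuum, m/s (exact)

planck_length = math.sqrt(hbar * G / c ** 3)
print(planck_length)    # ~1.616e-35 m, matching the quoted value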

The Planck length is about 10−20 times the diameter of a proton. It can be defined using the radius of the Planck particle.

    Measuring the Planck length

In 2017 it was suggested by E. Haug[3] that the Planck length can be indirectly measured, independent of any knowledge of Newton's gravitational constant, with, for example, the use of a Cavendish apparatus. Further, the relative error in measurements of the Planck length must be exactly half of the relative error in measurements of Newton's gravitational constant (the error measured in percentage terms, also known as the relative standard uncertainty). This is in line with the relative standard uncertainties reported by NIST, which are 4.7×10−5 for the gravitational constant and 2.3×10−5 for the Planck length.
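The factor-of-two relationship between the relative uncertainties follows from simple error propagation on the defining formula, assuming ħ and c contribute negligible uncertainty compared with G:

\frac{\delta \ell_{\mathrm{P}}}{\ell_{\mathrm{P}}} = \frac{1}{2} \frac{\delta G}{G} \approx \frac{1}{2} \times 4.7\times 10^{-5} \approx 2.3\times 10^{-5}, \qquad \text{since } \ell_{\mathrm{P}} = \sqrt{\frac{\hbar G}{c^{3}}}.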

    History

In 1899 Max Planck[4] suggested that there existed some fundamental natural units for length, mass, time and energy. He derived these using dimensional analysis, based only on the Newtonian gravitational constant, the speed of light and the Planck constant. The natural units he derived have since become known as the Planck length, the Planck mass, the Planck time and the Planck energy.

    Theoretical significance

    The Planck length is the scale at which quantum gravitational effects are believed to begin to be apparent, where interactions require a working theory of quantum gravity to be analyzed.[5] The Planck area is the area by which the surface of a spherical black hole increases when the black hole swallows one bit of information.[6]

The Planck length is sometimes misconceived as the minimum length of spacetime, but this is not accepted by conventional physics, as this would require violation or modification of Lorentz symmetry.[5] However, certain theories of loop quantum gravity do attempt to establish a minimum length on the scale of the Planck length, though not necessarily the Planck length itself,[5] or attempt to establish the Planck length as observer-invariant, known as doubly special relativity.[citation needed]

    The strings of string theory are modelled to be on the order of the Planck length.[5][7] In theories of large extra dimensions, the Planck length has no fundamental physical significance, and quantum gravitational effects appear at other scales.[citation needed]

    Planck length and Euclidean geometry

The gravitational field performs zero-point oscillations, and the geometry associated with it also oscillates. The ratio of the circumference to the radius varies near the Euclidean value. The smaller the scale, the greater the deviations from Euclidean geometry. Let us estimate the order of the wavelength of zero-point gravitational oscillations at which the geometry becomes completely unlike Euclidean geometry. The degree of deviation ζ of the geometry from Euclidean geometry in a gravitational field is determined by the ratio of the gravitational potential φ to the square of the speed of light c: ζ = φ/c². When ζ ≪ 1, the geometry is close to Euclidean; for ζ ~ 1, all similarity disappears. The energy of an oscillation of scale l is E = ħν ~ ħc/l (where c/l is of the order of the oscillation frequency). The gravitational potential created by the mass m at this length is φ = Gm/l, where G is the constant of universal gravitation. Instead of m, we must substitute the mass which, according to Einstein's formula, corresponds to the energy E (m = E/c²). We get φ = GE/(l c²) = Għ/(l² c). Dividing this expression by c², we obtain the value of the deviation ζ = Għ/(c³ l²) = ℓP²/l². Equating ζ = 1, we find the length at which Euclidean geometry is completely distorted. It is equal to the Planck length ℓP = √(ħG/c³) ≈ 10−35 m. Here there is quantum foam.

    Visualization

    The size of the Planck length can be visualized as follows: if a particle or dot about 0.005 mm in size (which is the same size as a small grain of silt) were magnified in size to be as large as the observable universe, then inside that universe-sized "dot", the Planck length would be roughly the size of an actual 0.005 mm dot. In other words, a 0.005 mm dot is halfway between the Planck length and the size of the observable universe on a logarithmic scale.[8] All said, the attempt to visualize to an arbitrary scale of a 0.005 mm dot is only for a hinge point. With no fixed frame of reference for time or space, where the spatial units shrink toward infinitesimally small spatial sections and time stretches toward infinity, scale breaks down. Inverted, where space is stretched and time is shrunk, the scale adjusts the other way according to the ratio V-squared/C-squared (Lorentz transformation).[clarification needed]

    Operator (computer programming)

    From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Operator_(computer_programmin...