
Buffer solution

From Wikipedia, the free encyclopedia
 
A buffer solution (more precisely, pH buffer or hydrogen ion buffer) is an aqueous solution consisting of a mixture of a weak acid and its conjugate base, or of a weak base and its conjugate acid. Its pH changes very little when a small amount of strong acid or base is added to it. Buffer solutions are used as a means of keeping pH at a nearly constant value in a wide variety of chemical applications. In nature, there are many systems that use buffering for pH regulation. For example, the bicarbonate buffering system is used to regulate the pH of blood.

Principles of buffering

Simulated titration of an acidified solution of a weak acid (pKa = 4.7) with alkali: addition of hydroxide to an equilibrium mixture of the weak acid, HA, and its conjugate base, A−.

Buffer solutions achieve their resistance to pH change because of the presence of an equilibrium between the acid HA and its conjugate base A−:
HA ⇌ H+ + A−
When some strong acid is added to an equilibrium mixture of the weak acid and its conjugate base, the equilibrium is shifted to the left, in accordance with Le Châtelier's principle. Because of this, the hydrogen ion concentration increases by less than the amount expected for the quantity of strong acid added. Similarly, if strong alkali is added to the mixture the hydrogen ion concentration decreases by less than the amount expected for the quantity of alkali added. The effect is illustrated by the simulated titration of a weak acid with pKa = 4.7. The relative concentration of undissociated acid is shown in blue and of its conjugate base in red. The pH changes relatively slowly in the buffer region, pH = pKa ± 1, centered at pH = 4.7 where [HA] = [A−]. The hydrogen ion concentration decreases by less than the amount expected because most of the added hydroxide ion is consumed in the reaction
OH− + HA → H2O + A−
and only a little is consumed in the neutralization reaction, which is what raises the pH:
OH− + H+ → H2O
Once the acid is more than 95% deprotonated the pH rises rapidly because most of the added alkali is consumed in the neutralization reaction.

Buffer capacity

Buffer capacity, β, is a quantitative measure of the resistance of a buffer solution to pH change on addition of hydroxide ions. It can be defined as follows.
β = dn/d(pH),
where dn is an infinitesimal amount of added base and d(pH) is the resulting infinitesimal change in pH, the cologarithm of the hydrogen ion concentration. With this definition the buffer capacity of a weak acid, with a dissociation constant Ka, can be expressed as:
β = dn/d(pH) = ln(10) · CA·Ka·[H+] / (Ka + [H+])² ≈ 2.303 · CA·Ka·[H+] / (Ka + [H+])²

for pH close to the pKa. CA is the analytical concentration of the acid.[1][2] pH is defined as −log10[H+].

Buffer capacity for a 0.1 M solution of an acid with pKa of 7

For simple buffers there are three regions of raised buffer capacity.
  1. At very low pH the buffer capacity rises exponentially with decreasing pH.
  2. The buffer capacity of a buffering agent is at a local maximum when pH = pKa. It falls to about 33% of the maximum value at pH = pKa ± 1 and to about 12% at pH = pKa ± 1.5. For this reason the useful range is approximately pKa ± 1. Buffer capacity is proportional to the concentration of the buffering agent, CA, so dilute solutions have little buffer capacity.
  3. At very high pH the buffer capacity rises exponentially with increasing pH.
Properties 1 and 3 are independent of the presence or absence of added buffering agents. They are concentration effects and reflect the fact that pH is related to the logarithm of the hydrogen ion concentration.
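The pKa ± 1 rule of thumb above can be checked numerically. The following Python sketch evaluates only the buffering-agent term of the capacity formula; it deliberately omits the water terms responsible for regions 1 and 3, so it describes only the peak around pKa:

```python
import math

def buffer_capacity(pH, pKa, C_A):
    """Buffering-agent term of buffer capacity:
    beta = ln(10) * C_A * Ka * [H+] / (Ka + [H+])^2.
    Water terms (low/high pH regions) are omitted."""
    Ka = 10.0 ** (-pKa)
    H = 10.0 ** (-pH)
    return math.log(10) * C_A * Ka * H / (Ka + H) ** 2

# 0.1 M acid with pKa 7, as in the figure above
beta_max = buffer_capacity(7.0, 7.0, 0.1)   # maximum, at pH = pKa
beta_1 = buffer_capacity(8.0, 7.0, 0.1)     # ~33% of maximum at pKa + 1
beta_15 = buffer_capacity(8.5, 7.0, 0.1)    # ~12% of maximum at pKa + 1.5
```

Evaluating the ratios reproduces the 33% and 12% figures quoted in the text, independent of the concentration chosen.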

Applications

Buffer solutions are necessary to keep the correct pH for enzymes in many organisms to work. Many enzymes work only under very precise conditions; if the pH moves outside of a narrow range, the enzymes slow or stop working and can denature. In many cases denaturation can permanently disable their catalytic activity.[3] A buffer of carbonic acid (H2CO3) and bicarbonate (HCO3−) is present in blood plasma, maintaining a pH between 7.35 and 7.45.

Industrially, buffer solutions are used in fermentation processes and in setting the correct conditions for dyes used in colouring fabrics. They are also used in chemical analysis[2] and calibration of pH meters.

The majority of biological samples that are used in research are made in buffers, especially phosphate buffered saline (PBS) at pH 7.4.

Simple buffering agents

Buffering agent pKa Useful pH range
Citric acid 3.13, 4.76, 6.40 2.1–7.4
Acetic acid 4.8 3.8–5.8
KH2PO4 7.2 6.2–8.2
CHES 9.3 8.3–10.3
Borate 9.24 8.25–10.25
For buffers in acid regions, the pH may be adjusted to a desired value by adding a strong acid such as hydrochloric acid to the buffering agent. For alkaline buffers, a strong base such as sodium hydroxide may be added. Alternatively, a buffer mixture can be made from a mixture of an acid and its conjugate base. For example, an acetate buffer can be made from a mixture of acetic acid and sodium acetate. Similarly an alkaline buffer can be made from a mixture of the base and its conjugate acid.
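As a rough numerical illustration of preparing such a mixture, the Henderson–Hasselbalch approximation (pH ≈ pKa + log10([A−]/[HA]), not derived in this article) relates the acid-to-conjugate-base ratio to the resulting pH. The function names below are illustrative:

```python
import math

def buffer_pH(pKa, conc_base, conc_acid):
    """Henderson-Hasselbalch approximation: pH = pKa + log10([A-]/[HA])."""
    return pKa + math.log10(conc_base / conc_acid)

def base_fraction_for_pH(pKa, target_pH):
    """Fraction of the buffering agent that must be in the conjugate-base
    form to hit a target pH."""
    ratio = 10.0 ** (target_pH - pKa)   # [A-]/[HA]
    return ratio / (1.0 + ratio)

# Acetate buffer: equal parts acetic acid and sodium acetate give pH = pKa.
print(buffer_pH(4.8, 0.1, 0.1))   # 4.8
```

Equal concentrations of acetic acid and sodium acetate give pH = pKa = 4.8, matching the table entry above.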

"Universal" buffer mixtures

By combining substances with pKa values differing by only two or less and adjusting the pH, a wide range of buffers can be obtained. Citric acid is a useful component of a buffer mixture because it has three pKa values, separated by less than two. The buffer range can be extended by adding other buffering agents. The following mixtures (McIlvaine's buffer solutions) have a buffer range of pH 3 to 8.[4]
0.2 M Na2HPO4 (mL) 0.1 M citric acid (mL) pH
20.55 79.45 3.0
38.55 61.45 4.0
51.50 48.50 5.0
63.15 36.85 6.0
82.35 17.65 7.0
97.25 2.75 8.0
A mixture containing citric acid, monopotassium phosphate, boric acid, and diethyl barbituric acid can be made to cover the pH range 2.6 to 12.[5]

Other universal buffers are the Carmody buffer[6] and the Britton–Robinson buffer, developed in 1931.

Common buffer compounds used in biology


Common name pKa at 25 °C Temp. effect dpH/dT (K−1)[7] Mol. weight
TAPS 8.43 −0.018 243.3
Bicine 8.35 −0.018 163.2
Tris 8.07* −0.028 121.14
Tricine 8.05 −0.021 179.2
TAPSO 7.635 n/a 259.3
HEPES 7.48 −0.014 238.3
TES 7.40 −0.020 229.20
MOPS 7.20 −0.015 209.3
PIPES 6.76 −0.008 302.4
Cacodylate 6.27 n/a 138.0
MES 6.15 −0.011 195.2
(*) Tris is a base, the pKa of 8.07 refers to its conjugate acid.

Calculating buffer pH

Monoprotic acids

First write down the equilibrium expression.
HA ⇌ A− + H+
This shows that when the acid dissociates equal amounts of hydrogen ion and anion are produced. The equilibrium concentrations of these three components can be calculated in an ICE table.
ICE table for a monoprotic acid

    [HA]      [A−]    [H+]
I   C0        0       y
C   −x        +x      +x
E   C0 − x    x       x + y
The first row, labelled I, lists the initial conditions: the concentration of acid is C0, initially undissociated, so the concentrations of A− and H+ would be zero; y is the initial concentration of added strong acid, such as hydrochloric acid. If strong alkali, such as sodium hydroxide, is added y will have a negative sign because alkali removes hydrogen ions from the solution. The second row, labelled C for change, specifies the changes that occur when the acid dissociates. The acid concentration decreases by an amount −x and the concentrations of A− and H+ both increase by an amount +x. This follows from the equilibrium expression. The third row, labelled E for equilibrium concentrations, adds together the first two rows and shows the concentrations at equilibrium.

To find x, use the formula for the equilibrium constant in terms of concentrations:
Ka = [H+][A−] / [HA]
Substitute the concentrations with the values found in the last row of the ICE table:
Ka = x(x + y) / (C0 − x)
Simplify to:
x² + (Ka + y)x − KaC0 = 0
With specific values for C0, Ka and y this equation can be solved for x. Assuming that pH = −log10[H+] the pH can be calculated as pH = −log10(x + y).
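A minimal Python sketch of this calculation, taking the positive root of the quadratic (the function name is illustrative):

```python
import math

def monoprotic_pH(C0, pKa, y):
    """pH from the ICE table: solve x^2 + (Ka + y)x - Ka*C0 = 0,
    then pH = -log10(x + y).

    C0 -- analytical concentration of the weak acid
    y  -- concentration of added strong acid (negative for added strong base)
    """
    Ka = 10.0 ** (-pKa)
    b = Ka + y
    x = (-b + math.sqrt(b * b + 4.0 * Ka * C0)) / 2.0   # positive root
    return -math.log10(x + y)

# 0.1 M acetic acid (pKa 4.8) half-neutralized with NaOH (y = -0.05):
# the ICE solution recovers pH close to pKa, as expected for a 1:1 buffer.
ph_buffer = monoprotic_pH(0.1, 4.8, -0.05)
ph_acid = monoprotic_pH(0.1, 4.8, 0.0)   # the plain acid, roughly pH 2.9
```

Note that with y negative (added alkali) the same equation handles buffer preparation by partial neutralization, as described in the section on simple buffering agents.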

Polyprotic acids

% species formation calculated for a 10 millimolar solution of citric acid.

Polyprotic acids are acids that can lose more than one proton. The constant for dissociation of the first proton may be denoted as Ka1 and the constants for dissociation of successive protons as Ka2, etc. Citric acid, H3A, is an example of a polyprotic acid as it can lose three protons.
Equilibrium pKa value
H3A ⇌ H2A− + H+  pKa1 = 3.13
H2A− ⇌ HA2− + H+  pKa2 = 4.76
HA2− ⇌ A3− + H+  pKa3 = 6.40
When the difference between successive pKa values is less than about three there is overlap between the pH range of existence of the species in equilibrium. The smaller the difference, the more the overlap. In the case of citric acid, the overlap is extensive and solutions of citric acid are buffered over the whole range of pH 2.5 to 7.5.

Calculation of the pH with a polyprotic acid requires a speciation calculation to be performed. In the case of citric acid, this entails the solution of the two equations of mass balance
CA = [A3−] + β1[A3−][H+] + β2[A3−][H+]² + β3[A3−][H+]³
CH = [H+] + β1[A3−][H+] + 2β2[A3−][H+]² + 3β3[A3−][H+]³ − Kw/[H+]
CA is the analytical concentration of the acid, CH is the analytical concentration of added hydrogen ions, βq are the cumulative association constants
log β1 = pKa3,  log β2 = pKa2 + pKa3,  log β3 = pKa1 + pKa2 + pKa3
Kw is the constant for self-ionization of water. These are two non-linear simultaneous equations in the two unknown quantities [A3−] and [H+]. Many computer programs are available to do this calculation. The speciation diagram for citric acid was produced with the program HySS.[8]
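A minimal sketch of such a speciation calculation for pure citric acid, using simple bisection rather than a dedicated program like HySS (the concentration and iteration count are illustrative):

```python
import math

# pKa values of citric acid, as tabulated above
pKa1, pKa2, pKa3 = 3.13, 4.76, 6.40
# cumulative association constants: log b1 = pKa3, log b2 = pKa2 + pKa3, etc.
b1 = 10.0 ** pKa3
b2 = 10.0 ** (pKa2 + pKa3)
b3 = 10.0 ** (pKa1 + pKa2 + pKa3)
Kw = 1.0e-14

C_A = 0.010      # 10 mM citric acid
C_H = 3 * C_A    # each H3A molecule supplies three acidic protons

def residual(h):
    """Proton mass balance minus C_H, evaluated at [H+] = h."""
    A = C_A / (1 + b1 * h + b2 * h ** 2 + b3 * h ** 3)   # [A3-] from acid mass balance
    return h + A * (b1 * h + 2 * b2 * h ** 2 + 3 * b3 * h ** 3) - Kw / h - C_H

# The residual increases monotonically with h, so bisection finds the root.
lo, hi = 1e-12, 1.0
for _ in range(100):
    mid = (lo + hi) / 2
    if residual(mid) > 0:
        hi = mid
    else:
        lo = mid

pH = -math.log10((lo + hi) / 2)   # roughly 2.6 for 10 mM citric acid
```

The resulting pH, about 2.6, is consistent with the buffered range of pH 2.5 to 7.5 quoted above for citric acid solutions.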

The Singularity Is Near

From Wikipedia, the free encyclopedia
 
The Singularity Is Near: When Humans Transcend Biology
Cover of The Singularity is Near
Author Raymond Kurzweil
Country United States
Language English
Publisher Viking
Publication date 2005
Pages 652
ISBN 978-0-670-03384-3
OCLC 57201348
Dewey Decimal 153.9
LC Class QP376 .K85
Preceded by The Age of Spiritual Machines
Followed by How to Create a Mind: The Secret of Human Thought Revealed

The Singularity Is Near: When Humans Transcend Biology is a 2005 non-fiction book about artificial intelligence and the future of humanity by inventor and futurist Ray Kurzweil.

The book builds on the ideas introduced in Kurzweil's previous books, The Age of Intelligent Machines (1990) and The Age of Spiritual Machines (1999). This time, however, Kurzweil embraces the term the Singularity, which was popularized by Vernor Vinge in his 1993 essay "The Coming Technological Singularity" more than a decade earlier.[1]

Kurzweil describes his law of accelerating returns, which predicts an exponential increase in technologies like computers, genetics, nanotechnology, robotics and artificial intelligence. Once the Singularity has been reached, Kurzweil says that machine intelligence will be infinitely more powerful than all human intelligence combined. Afterwards he predicts intelligence will radiate outward from the planet until it saturates the universe. The Singularity is also the point at which machine intelligence and humans would merge.

Content

Exponential growth

Kurzweil characterizes evolution throughout all time as progressing through six epochs, each one building on the one before. He says the four epochs which have occurred so far are Physics and Chemistry, Biology and DNA, Brains, and Technology. Kurzweil predicts the Singularity will coincide with the next epoch, The Merger of Human Technology with Human Intelligence. After the Singularity he says the final epoch will occur, The Universe Wakes Up.[2]

Kurzweil explains that evolutionary progress is exponential because of positive feedback; the results of one stage are used to create the next stage. Exponential growth is deceptive, nearly flat at first until it hits what Kurzweil calls "the knee in the curve" then rises almost vertically.[3] In fact Kurzweil believes evolutionary progress is super-exponential because more resources are deployed to the winning process. As an example of super-exponential growth Kurzweil cites the computer chip business. The overall budget for the whole industry increases over time, since the fruits of exponential growth make it an attractive investment; meanwhile the additional budget fuels more innovation which makes the industry grow even faster, effectively an example of "double" exponential growth.[4]

Kurzweil says evolutionary progress looks smooth, but that really it is divided into paradigms, specific methods of solving problems. Each paradigm starts with slow growth, builds to rapid growth, and then levels off. As one paradigm levels off, pressure builds to find or develop a new paradigm. So what looks like a single smooth curve is really a series of smaller S curves.[5] For example, Kurzweil notes that when vacuum tubes stopped getting faster and cheaper, transistors became popular and continued the overall exponential growth.[6]

Kurzweil calls this exponential growth the law of accelerating returns, and he believes it applies to many human-created technologies such as computer memory, transistors, microprocessors, DNA sequencing, magnetic storage, the number of Internet hosts, Internet traffic, decrease in device size, and nanotech citations and patents.[7] Kurzweil cites two historical examples of exponential growth: the Human Genome Project and the growth of the Internet.[8] Kurzweil claims the whole world economy is in fact growing exponentially, although short term booms and busts tend to hide this trend.[9]

Computational capacity

Plot showing Moore's law
An updated version of Moore's Law over 120 Years (based on Kurzweil’s graph). The 7 most recent data points are all NVIDIA GPUs.

A fundamental pillar of Kurzweil's argument is that to get to the Singularity, computational capacity is as much of a bottleneck as other things like quality of algorithms and understanding of the human brain. Moore's Law predicts the capacity of integrated circuits grows exponentially, but not indefinitely. Kurzweil feels the increase in the capacity of integrated circuits will probably slow by the year 2020.[10] He feels confident that a new paradigm will debut at that point to carry on the exponential growth predicted by his law of accelerating returns. Kurzweil describes four paradigms of computing that came before integrated circuits: electromechanical, relay, vacuum tube, and transistors.[10] What technology will follow integrated circuits, to serve as the sixth paradigm, is unknown, but Kurzweil believes nanotubes are the most likely alternative among a number of possibilities:
nanotubes and nanotube circuitry, molecular computing, self-assembly in nanotube circuits, biological systems emulating circuit assembly, computing with DNA, spintronics (computing with the spin of electrons), computing with light, and quantum computing.[11]
Since Kurzweil believes computational capacity will continue to grow exponentially long after Moore's Law ends, it will eventually rival the raw computing power of the human brain. Kurzweil looks at several different estimates of how much computational capacity is in the brain and settles on 10^16 calculations per second and 10^13 bits of memory. He writes that $1,000 will buy computer power equal to a single brain "by around 2020"[12] while by 2045, the onset of the Singularity, he says the same amount of money will buy one billion times more power than all human brains combined today.[13] Kurzweil admits the exponential trend in increased computing power will hit a limit eventually, but he calculates that limit to be trillions of times beyond what is necessary for the Singularity.[14]

The brain

Plot showing the exponential growth of computing
Exponential Growth of Computing

Kurzweil notes that computational capacity alone will not create artificial intelligence. He asserts that the best way to build machine intelligence is to first understand human intelligence. The first step is to image the brain, to peer inside it. Kurzweil claims imaging technologies such as PET and fMRI are increasing exponentially in resolution[15] while he predicts even greater detail will be obtained during the 2020s when it becomes possible to scan the brain from the inside using nanobots.[16] Once the physical structure and connectivity information are known, Kurzweil says researchers will have to produce functional models of sub-cellular components and synapses all the way up to whole brain regions.[17] The human brain is "a complex hierarchy of complex systems, but it does not represent a level of complexity beyond what we are already capable of handling".[18]

Beyond reverse engineering the brain in order to understand and emulate it, Kurzweil introduces the idea of "uploading" a specific brain with every mental process intact, to be instantiated on a "suitably powerful computational substrate". He writes that general modeling requires 10^16 calculations per second and 10^13 bits of memory, but then explains uploading requires additional detail, perhaps as many as 10^19 cps and 10^18 bits. Kurzweil says the technology to do this will be available by 2040.[19] Rather than an instantaneous scan and conversion to digital form, Kurzweil feels humans will most likely experience gradual conversion as portions of their brain are augmented with neural implants, increasing their proportion of non-biological intelligence slowly over time.[20]

Kurzweil believes there is "no objective test that can conclusively determine" the presence of consciousness.[21] Therefore, he says nonbiological intelligences will claim to have consciousness and "the full range of emotional and spiritual experiences that humans claim to have";[22] he feels such claims will generally be accepted.

Genetics, nanotechnology and robotics (AI)

Kurzweil says revolutions in genetics, nanotechnology and robotics will usher in the beginning of the Singularity.[23] Kurzweil feels with sufficient genetic technology it should be possible to maintain the body indefinitely, reversing aging while curing cancer, heart disease and other illnesses.[24] Much of this will be possible thanks to nanotechnology, the second revolution, which entails the molecule by molecule construction of tools which themselves can "rebuild the physical world".[25] Finally, the revolution in robotics will really be the development of strong AI, defined as machines which have human-level intelligence or greater.[26] This development will be the most important of the century, "comparable in importance to the development of biology itself".[27]

Kurzweil concedes that every technology carries with it the risk of misuse or abuse, from viruses and nanobots to out-of-control AI machines. He believes the only countermeasure is to invest in defensive technologies, for example by allowing new genetics and medical treatments, monitoring for dangerous pathogens, and creating limited moratoriums on certain technologies. As for artificial intelligence Kurzweil feels the best defense is to increase the "values of liberty, tolerance, and respect for knowledge and diversity" in society, because "the nonbiological intelligence will be embedded in our society and will reflect our values".[28]

The Singularity

Plot showing the countdown the singularity
Countdown to the Singularity

Kurzweil touches on the history of the Singularity concept, tracing it back to John von Neumann in the 1950s and I. J. Good in the 1960s. He compares his Singularity to a mathematical or astrophysical singularity: while his Singularity is not actually infinite, he says it looks that way from any limited perspective.[29]

During the Singularity, Kurzweil predicts that "human life will be irreversibly transformed"[30] and that humans will transcend the "limitations of our biological bodies and brain".[31] He looks beyond the Singularity to say that "the intelligence that will emerge will continue to represent the human civilization." Further, he feels that "future machines will be human, even if they are not biological".[32]

Kurzweil claims once nonbiological intelligence predominates the nature of human life will be radically altered:[33] there will be radical changes in how humans learn, work, play, and wage war.[34] Kurzweil envisions nanobots which allow people to eat whatever they want while remaining thin and fit, provide copious energy, fight off infections or cancer, replace organs and augment their brains. Eventually people's bodies will contain so much augmentation they'll be able to alter their "physical manifestation at will".[35]

Kurzweil says the law of accelerating returns suggests that once a civilization develops primitive mechanical technologies, it is only a few centuries before they achieve everything outlined in the book, at which point it will start expanding outward, saturating the universe with intelligence. Since people have found no evidence of other civilizations, Kurzweil believes humans are likely alone in the universe. Thus Kurzweil concludes it is humanity's destiny to do the saturating, enlisting all matter and energy in the process.[36][37]

As for individual identities during these radical changes, Kurzweil suggests people think of themselves as an evolving pattern rather than a specific collection of molecules. Kurzweil says evolution moves towards "greater complexity, greater elegance, greater knowledge, greater intelligence, greater beauty, greater creativity, and greater levels of subtle attributes such as love".[38] He says that these attributes, in the limit, are generally used to describe God. That means, he continues, that evolution is moving towards a conception of God and that the transition away from biological roots is in fact a spiritual undertaking.[38]

Predictions

Kurzweil does not include an actual written timeline of the past and future, as he did in The Age of Intelligent Machines and The Age of Spiritual Machines; however, he still makes many specific predictions. Kurzweil writes that by 2010 a supercomputer will have the computational capacity to emulate human intelligence[39] and "by around 2020" this same capacity will be available "for one thousand dollars".[12] After that milestone he expects human brain scanning to contribute to an effective model of human intelligence "by the mid-2020s".[40] These two elements will culminate in computers that can pass the Turing test by 2029.[41] By the early 2030s the amount of non-biological computation will exceed the "capacity of all living biological human intelligence".[42] Finally the exponential growth in computing capacity will lead to the Singularity. Kurzweil spells out the date very clearly: "I set the date for the Singularity—representing a profound and disruptive transformation in human capability—as 2045".[13]

Reception

Analysis

A common criticism of the book relates to the "exponential growth fallacy". As an example, in 1969 man landed on the moon; extrapolating exponential growth from there, one would expect huge lunar bases and manned missions to distant planets by now. Instead, exploration stalled or even regressed after that. Paul Davies writes "the key point about exponential growth is that it never lasts",[43] often due to resource constraints.

Theodore Modis says "nothing in nature follows a pure exponential" and suggests the logistic function is a better fit for "a real growth process". The logistic function looks like an exponential at first but then tapers off and flattens completely. For example, world population and the United States's oil production both appeared to be rising exponentially, but both have leveled off because they were logistic. Kurzweil says "the knee in the curve" is the time when the exponential trend is going to explode, while Modis claims if the process is logistic when you hit the "knee" the quantity you are measuring is only going to increase by a factor of 100 more.[44]
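Modis's point can be illustrated numerically. With hypothetical parameters (the growth rate and ceiling below are arbitrary, chosen only for illustration), a logistic curve is practically indistinguishable from an exponential early on, yet flattens at its ceiling while the exponential keeps climbing:

```python
import math

def exponential(t, r):
    """Pure exponential growth starting at 1: doubles forever at rate r."""
    return math.exp(r * t)

def logistic(t, r, K):
    """Logistic growth starting at 1: looks exponential at first,
    then saturates at the carrying capacity K."""
    return K / (1.0 + (K - 1.0) * math.exp(-r * t))

r, K = 0.5, 1000.0
early_ratio = logistic(2, r, K) / exponential(2, r)   # close to 1: curves agree early
late_logistic = logistic(40, r, K)                    # pinned just below K
late_exponential = exponential(40, r)                 # still exploding
```

This is the crux of the disagreement: observing the early, exponential-looking portion of a process cannot by itself distinguish Kurzweil's unbounded trend from Modis's saturating one.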

While some critics complain that the law of accelerating returns is not a law of nature[43] others question the religious motivations or implications of Kurzweil's Singularity. The buildup towards the Singularity is compared with Judeo-Christian end-of-time scenarios. Beam calls it "a Buck Rogers vision of the hypothetical Christian Rapture".[45] John Gray says "the Singularity echoes apocalyptic myths in which history is about to be interrupted by a world-transforming event".[46]

The radical nature of Kurzweil's predictions is often discussed. Anthony Doerr says that before you "dismiss it as techno-zeal" consider that "every day the line between what is human and what is not quite human blurs a bit more". He lists technology of the day, in 2006, like computers that land supersonic airplanes or in vitro fertility treatments and asks whether brain implants that access the internet or robots in our blood really are that unbelievable.[47]

In regard to reverse engineering the brain, neuroscientist David J. Linden writes that "Kurzweil is conflating biological data collection with biological insight". He feels that data collection might be growing exponentially, but insight is increasing only linearly. For example, the speed and cost of sequencing genomes is also improving exponentially, but our understanding of genetics is growing very slowly. As for nanobots Linden believes the spaces available in the brain for navigation are simply too small. He acknowledges that someday we will fully understand the brain, just not on Kurzweil's timetable.[48]

Reviews

Paul Davies wrote in Nature that The Singularity is Near is a "breathless romp across the outer reaches of technological possibility" while warning that the "exhilarating speculation is great fun to read, but needs to be taken with a huge dose of salt."[43]

Anthony Doerr in The Boston Globe wrote "Kurzweil's book is surprisingly elaborate, smart, and persuasive. He writes clean methodical sentences, includes humorous dialogues with characters in the future and past, and uses graphs that are almost always accessible",[47] while his colleague Alex Beam points out that "Singularitarians have been greeted with hooting skepticism".[45] Janet Maslin in The New York Times wrote "The Singularity is Near is startling in scope and bravado", but says "much of his thinking tends to be pie in the sky". She observes that he's more focused on optimistic outcomes than on the risks.[49]

Film adaptations

In 2006, Barry Ptolemy and his production company Ptolemaic Productions licensed the rights to The Singularity Is Near from Kurzweil. Inspired by the book, Ptolemy directed and produced the film Transcendent Man, which went on to bring more attention to the book.

Kurzweil has also directed his own adaptation, called The Singularity is Near, which mixes documentary with a science-fiction story involving his robotic avatar Ramona's transformation into an artificial general intelligence. It was screened at the World Film Festival, the Woodstock Film Festival, the Warsaw International FilmFest, the San Antonio Film Festival in 2010 and the San Francisco Indie Film Festival in 2011. The movie was released generally on July 20, 2012.[50] It is available on DVD or digital download[51] and a trailer is available.[52]

The 2014 film Lucy is roughly based upon the predictions made by Kurzweil about what the year 2045 will look like, including the immortality of man.[53]

There's Plenty of Room at the Bottom

From Wikipedia, the free encyclopedia
"There's Plenty of Room at the Bottom" was a lecture given by physicist Richard Feynman at an American Physical Society meeting at Caltech on December 29, 1959.[1] Feynman considered the possibility of direct manipulation of individual atoms as a more powerful form of synthetic chemistry than those used at the time. The talk went unnoticed and did not inspire the conceptual beginnings of the field. In the 1990s it was rediscovered and publicised as a seminal event in the field, probably to boost the history of nanotechnology with Feynman's reputation.[citation needed]

Conception

Feynman considered a number of interesting ramifications of a general ability to manipulate matter on an atomic scale. He was particularly interested in the possibilities of denser computer circuitry, and microscopes that could see things much smaller than is possible with scanning electron microscopes. These ideas were later realized by the use of the scanning tunneling microscope, the atomic force microscope and other examples of scanning probe microscopy and storage systems such as Millipede, created by researchers at IBM.

Feynman also suggested that it should be possible, in principle, to make nanoscale machines that "arrange the atoms the way we want", and do chemical synthesis by mechanical manipulation.
He also presented the possibility of "swallowing the doctor", an idea that he credited in the essay to his friend and graduate student Albert Hibbs. This concept involved building a tiny, swallowable surgical robot.

As a thought experiment he proposed developing a set of one-quarter-scale manipulator hands, slaved to the operator's hands, to build one-quarter-scale machine tools analogous to those found in any machine shop. This set of small tools would then be used by the small hands to build and operate ten sets of one-sixteenth-scale hands and tools, and so forth, culminating in perhaps a billion tiny factories to achieve massively parallel operations. He used the analogy of a pantograph as a way of scaling down items. This idea was anticipated in part, down to the microscale, by science fiction author Robert A. Heinlein in his 1942 story Waldo.[2][3]

As the sizes got smaller, some tools would have to be redesigned, because the relative strength of various forces would change: gravity would become unimportant, while surface tension and Van der Waals attraction would become more important. Feynman mentioned these scaling issues during his talk. Nobody has yet attempted to implement this thought experiment, although it has been noted that some types of biological enzymes and enzyme complexes (especially ribosomes) function chemically in a way close to Feynman's vision.[4][5]

Feynman also mentioned in his lecture that it might eventually be better to use glass or plastic, because their greater uniformity would avoid problems at very small scales (metals and crystals are separated into domains where the lattice structure prevails).[6] This could be a good reason to make machines, and also electronics, out of glass and plastic. At present, electronic components are made of both materials. In glass, optical fiber cables amplify light pulses at regular intervals using glass doped with the rare-earth element erbium; the doped glass is spliced into the fiber and pumped by a laser operating at a different frequency.[7] In plastic, field-effect transistors are being made with polythiophene, a plastic invented by Alan J. Heeger et al. that becomes an electrical conductor when oxidized. At this time, a factor of just 20 in electron mobility separates plastic from silicon.[8][9]
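The cascade Feynman described is simple geometric arithmetic: the factory count multiplies while the linear scale shrinks each generation. Below is a toy sketch, assuming (as an illustration, not from the talk's exact wording) that every generation builds ten sets of hands and tools at one-quarter the previous linear scale; the `cascade` function and its parameters are hypothetical.

```python
# Toy model of the cascaded-factories thought experiment.
# Assumptions (illustrative only): each generation builds ten sets of
# hands/tools, each at one-quarter the previous linear scale.

def cascade(target_factories=1_000_000_000, copies=10, scale=0.25):
    """Count generations until the number of factories first
    reaches target_factories; also track the linear scale."""
    count, linear, generations = 1, 1.0, 0
    while count < target_factories:
        count *= copies      # each set builds `copies` smaller sets
        linear *= scale      # each generation is 4x smaller
        generations += 1
    return generations, count, linear

gens, count, linear = cascade()
print(gens, count, linear)
```

Under these assumptions, nine generations of copying already yield Feynman's "billion tiny factories", at a linear scale of roughly a millionth of the original.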

Challenges

At the meeting Feynman concluded his talk with two challenges, offering a prize of $1,000 to the first individual to solve each one. The first challenge was to build a tiny motor; to Feynman's surprise, it was met by November 1960 by William McLellan, a meticulous craftsman, using conventional tools. The motor satisfied the conditions but did not advance the art. The second challenge was to write small enough to fit the entire Encyclopædia Britannica on the head of a pin, by reproducing the text of a book page at 1/25,000 of its linear size. In 1985, Tom Newman, a Stanford graduate student, successfully reduced the first paragraph of A Tale of Two Cities by 1/25,000, and collected the second Feynman prize.[10][11]
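The 1/25,000 figure can be sanity-checked with back-of-envelope arithmetic. In the sketch below, the pin-head diameter (1/16 inch) follows Feynman's lecture, while the 0.2 mm halftone dot and 0.25 nm atomic spacing are assumed round numbers for illustration:

```python
# Back-of-envelope check of the 1/25,000 reduction.
# Assumptions: pin head 1/16 inch across (per the lecture); finest
# printed halftone dot ~0.2 mm; atomic spacing ~0.25 nm in a metal.

INCH_M = 0.0254
pin_diameter = INCH_M / 16          # ~1.6 mm across
dot_printed = 0.2e-3                # ~0.2 mm halftone dot, in metres
scale = 25_000

dot_reduced = dot_printed / scale   # size of the dot after reduction
atom_spacing = 0.25e-9              # assumed metal lattice spacing
atoms_across = dot_reduced / atom_spacing
print(f"reduced dot: {dot_reduced * 1e9:.1f} nm, "
      f"about {atoms_across:.0f} atoms across")
```

Under these assumptions a printed dot shrinks to about 8 nm, a few tens of atoms across, which is why Feynman argued the feat was physically possible rather than forbidden.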

Impact

K. Eric Drexler later took the Feynman concept of a billion tiny factories and added the idea that they could make more copies of themselves, via computer control instead of control by a human operator, in his 1986 book Engines of Creation: The Coming Era of Nanotechnology.

After Feynman's death, scholars studying the historical development of nanotechnology have concluded, based on recollections from many of the people active in the nascent field in the 1980s and 1990s, that his actual role in catalyzing nanotechnology research was limited. Chris Toumey, a cultural anthropologist at the University of South Carolina, has reconstructed the history of the publication and republication of Feynman's talk, along with the record of citations to "Plenty of Room" in the scientific literature.[12][13] In his 2008 article "Reading Feynman into Nanotechnology", Toumey found 11 published versions of "Plenty of Room", plus two instances of a closely related talk by Feynman, "Infinitesimal Machinery", which Feynman called "Plenty of Room, Revisited". Toumey's references also include videotapes of that second talk.

Toumey found that the published versions of Feynman's talk had a negligible influence in the twenty years after it was first published, as measured by citations in the scientific literature, and not much more influence in the decade after the scanning tunneling microscope was invented in 1981. Interest in "Plenty of Room" in the scientific literature then greatly increased in the early 1990s. This is probably because the term "nanotechnology" gained serious attention just before that time, following its use by Drexler in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which cited Feynman, and in a cover article headlined "Nanotechnology", published later that year in a mass-circulation science-oriented magazine, OMNI.[14][15] The journal Nanotechnology was launched in 1989; the famous Eigler–Schweizer experiment, precisely manipulating 35 xenon atoms, was published in Nature in April 1990; and Science had a special issue on nanotechnology in November 1991. These and other developments suggest that the retroactive rediscovery of "Plenty of Room" gave nanotechnology a packaged history providing an early date of December 1959, plus a connection to the charisma and genius of Richard Feynman.

Toumey's analysis also includes comments from distinguished scientists in nanotechnology who say that "Plenty of Room" did not influence their early work; in fact, most of them had not read it until a later date.

Feynman's stature as a Nobel laureate and as an iconic figure in 20th century science surely helped advocates of nanotechnology and provided a valuable intellectual link to the past.[2] More concretely, his stature and concept of atomically precise fabrication played a role in securing funding for nanotechnology research, illustrated by President Clinton's January 2000 speech calling for a Federal program:
My budget supports a major new National Nanotechnology Initiative, worth $500 million. Caltech is no stranger to the idea of nanotechnology, the ability to manipulate matter at the atomic and molecular level. Over 40 years ago, Caltech's own Richard Feynman asked, "What would happen if we could arrange the atoms one by one the way we want them?"[16]
While the version of the Nanotechnology Research and Development Act that was passed by the House in May 2003 called for a study of the technical feasibility of molecular manufacturing, this study was removed to safeguard funding of less controversial research before the Act was passed by the Senate and finally signed into law by President Bush on December 3, 2003.[17]

Fiction byproducts

  • In "The Tree of Time", a short story published in 1964, Damon Knight uses the idea of a barrier that has to be constructed atom by atom (a time barrier, in the story).

Editions

  • Feynman, R.P. (1 March 1992). "There's plenty of room at the bottom (data storage)". Journal of Microelectromechanical Systems. 1 (1): 60–66. doi:10.1109/84.128057. A reprint of the talk.
  • Feynman, R. (1993). "Infinitesimal machinery". Journal of Microelectromechanical Systems. 2 (1): 4–14. doi:10.1109/84.232589. A sequel to his first talk.
