A Medley of Potpourri is just what it says; various thoughts, opinions, ruminations, and contemplations on a variety of subjects.
Thursday, January 2, 2014
Home electricity use in US falling to 2001 levels
Dec 30, 2013 by Jonathan Fahey on phys.org
The average amount of electricity consumed in U.S. homes has fallen to levels last seen more than a decade ago, back when the smartest device in people's pockets was a Palm Pilot and anyone talking about a tablet was probably an archaeologist or a preacher. Because of more energy-efficient housing, appliances and gadgets, power usage is on track to decline in 2013 for the third year in a row, to its lowest point since 2001, even though our lives are more electrified. Here's a look at what has changed since the last time consumption was so low.

BETTER HOMES

In the early 2000s, as energy prices rose, more states adopted or toughened building codes to force builders to better seal homes so heat or air-conditioned air doesn't seep out so fast. That means newer homes waste less energy. Also, insulated windows and other building technologies have dropped in price, making retrofits of existing homes more affordable. In the wake of the financial crisis, billions of dollars in Recovery Act funding was directed toward home-efficiency programs.

BETTER GADGETS

Big appliances such as refrigerators and air conditioners have gotten more efficient thanks to federal energy standards that get stricter every few years as technology evolves. A typical room air conditioner—one of the biggest power hogs in the home—uses 20 percent less electricity per hour of full operation than it did in 2001, according to the Association of Home Appliance Manufacturers.
Read more at: http://phys.org/news/2013-12-home-electricity-falling.html#jCp
In a world first, Japan extracted natural gas from frozen undersea deposits this year.
By Lisa Raffensperger @ http://discovermagazine.com/2014/jan-feb/19-fuel-from-fire-ice
Global fuel supplies may soon be dramatically enlarged thanks to new techniques to tap into huge reserves of natural gas trapped under the seafloor. In March, Japan became the first country to successfully extract methane from frozen undersea deposits called gas hydrates.
These lacy structures of ice, found around the globe buried under permafrost and the ocean floor, have pores filled with highly flammable gas. By some estimates, hydrates could store more than 10 quadrillion cubic feet of harvestable methane — enough to fulfill the present gas needs of the entire United States for the next 400 years.
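As a rough sanity check on that claim, here is a minimal sketch; the hydrate figure is the estimate quoted above, while the U.S. consumption figure of roughly 26 trillion cubic feet per year is an assumed round number from around 2013, not a value from the article:

```python
# Sanity check of the "400 years of U.S. gas needs" claim (illustrative figures).
hydrate_methane_ft3 = 10e15   # >10 quadrillion cubic feet of harvestable methane (estimate above)
us_annual_gas_ft3 = 26e12     # assumed U.S. consumption, ~26 trillion ft^3/year (circa 2013)

years = hydrate_methane_ft3 / us_annual_gas_ft3
print(f"Roughly {years:.0f} years of U.S. supply")  # ~385 years, consistent with "next 400 years"
```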
The following is from Wikipedia, under methane clathrates.
Methane clathrates are common constituents of the shallow marine geosphere; they occur both in deep sedimentary structures and as outcrops on the ocean floor. Methane hydrates are believed to form by migration of gas from depth along geological faults, followed by precipitation, or crystallization, on contact of the rising gas stream with cold sea water. Methane clathrates are also present in deep Antarctic ice cores, and record a history of atmospheric methane concentrations, dating to 800,000 years ago.[4] The ice-core methane clathrate record is a primary source of data for global warming research, along with oxygen and carbon dioxide.
The sedimentary methane hydrate reservoir probably contains 2–10 times the currently known reserves of conventional natural gas, as of 2013.[25] This represents a potentially important future source of hydrocarbon fuel. However, in the majority of sites deposits are thought to be too dispersed for economic extraction.[18] Other problems facing commercial exploitation are detection of viable reserves and development of the technology for extracting methane gas from the hydrate deposits.
A research and development project in Japan is aiming for commercial-scale extraction near Aichi Prefecture by 2016.[26][27] In August 2006, China announced plans to spend 800 million yuan (US$100 million) over the next 10 years to study natural gas hydrates.[28] A potentially economic reserve in the Gulf of Mexico may contain approximately 100 billion cubic metres (3.5×10^12 cubic feet) of gas.
On March 12, 2013, JOGMEC researchers announced that they had successfully extracted natural gas from frozen methane hydrate.[31] In order to extract the gas, specialized equipment was used to drill into and depressurize the hydrate deposits, causing the methane to separate from the ice. The gas was then collected and piped to the surface where it was ignited to prove its presence.[32] According to an industry spokesperson, "It [was] the world's first offshore experiment producing gas from methane hydrate".[31] Previously, gas had been extracted from onshore deposits, but never from offshore deposits which are much more common.[32] The hydrate field from which the gas was extracted is located 50 kilometres (31 mi) from central Japan in the Nankai Trough, 300 metres (980 ft) under the sea.[31][32] A spokesperson for JOGMEC remarked "Japan could finally have an energy source to call its own".[32] The experiment will continue for two weeks before it is determined how efficient the gas extraction process has been.[32] Marine geologist Mikio Satoh remarked "Now we know that extraction is possible. The next step is to see how far Japan can get costs down to make the technology economically viable."[32] Japan estimates that there are at least 1.1 trillion cubic meters of methane trapped in the Nankai Trough, enough to meet the country's needs for more than ten years.[32]
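The same style of sanity check works for the Nankai Trough estimate; Japan's annual natural gas consumption of roughly 110 billion cubic metres is an assumed ballpark figure, not from the article:

```python
# Sanity check of the "more than ten years" claim for the Nankai Trough (illustrative figures).
nankai_methane_m3 = 1.1e12      # at least 1.1 trillion cubic metres of methane (estimate above)
japan_annual_gas_m3 = 1.1e11    # assumed Japanese consumption, ~110 billion m^3/year (circa 2013)

print(f"Roughly {nankai_methane_m3 / japan_annual_gas_m3:.0f} years of supply for Japan")  # ~10
```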
NASA's Cold Fusion Folly Posted by Buzz Skyline at http://physicsbuzz.physicscentral.com/2013/04/nasas-cold-fusion-folly.html
I am sad - horrified really - to learn that some NASA scientists have caught cold fusion madness. As is so often the case with companies and research groups that get involved in this fruitless enterprise, they tend to make their case by first pointing out how nice it would be to have a clean, cheap, safe, effectively limitless source of power. Who could say no to that?
NASA Langley scientists are hoping to build spacecraft powered with cold fusion. Image courtesy of NASA.
NASA, for example, is promoting a cold fusion scheme that they say will power your house and car, and even a space plane that is apparently under development, despite the fact that cold fusion power supplies don't exist yet and almost certainly never will. And if that's not enough, NASA's brand of cold fusion can solve our climate change problems by converting carbon directly into nitrogen.
The one hitch in the plan, unfortunately, is that they're going to have to violate some very well established physics to make it happen. To say the least, I wouldn't count on it.
To be clear, cold fusion does indeed work - provided you use a heavier cousin of the electron, known as a muon, to make it happen. There is no question that muon-catalyzed fusion is a perfectly sound, well-understood process that would be an abundant source of energy, if only we could find or create a cheap source of muons. Unfortunately, it takes way more energy to create the muons that go into muon-catalyzed fusion than comes out of the reaction.
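To make that mismatch concrete, here is a back-of-the-envelope sketch; all three numbers are typical ballpark values from the muon-catalyzed fusion literature (roughly 17.6 MeV per D-T fusion, on the order of 150 catalysis cycles per muon before alpha-sticking ends the chain, and an accelerator cost on the order of 5 GeV to produce each muon), not figures from this post:

```python
# Back-of-the-envelope energy balance for muon-catalyzed fusion (ballpark literature values).
MEV_PER_DT_FUSION = 17.6      # energy released by one deuterium-tritium fusion (MeV)
FUSIONS_PER_MUON = 150        # approximate catalysis cycles before alpha-sticking or muon decay
MEV_PER_MUON_PRODUCED = 5000  # assumed ~5 GeV accelerator energy cost to make one muon

energy_out = MEV_PER_DT_FUSION * FUSIONS_PER_MUON   # ~2,640 MeV returned per muon
print(f"Out: {energy_out:.0f} MeV per muon, in: {MEV_PER_MUON_PRODUCED} MeV")
print(f"Gain factor: {energy_out / MEV_PER_MUON_PRODUCED:.2f}")  # ~0.5: a net energy loss
```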
Cold fusion that doesn't involve muons, on the other hand, doesn't work. In fact, the very same physics principles that make muon-catalyzed fusion possible are the ones that guarantee that the muon-less version isn't possible.
To get around the problem presented by nature and her physical laws, NASA's scientists have joined other cold fusion advocates in rebranding their work under the deceptively scientific moniker LENR (Low Energy Nuclear Reactions), and backing it up with various sketchy theories.
The main theory currently in fashion among cold fusion people is the Widom-Larsen LENR theory, which claims that neutrons can result from interactions with "heavy electrons" and protons in a lump of material in a cold fusion experiment. These neutrons, so the argument goes, can then be absorbed in a material (copper is a popular choice) which becomes unstable and decays to form a lighter material (nickel, assuming you start with copper), giving off energy in the process.
At least one paper argues that Widom and Larsen made some serious errors in their calculations that thoroughly undermine their theory. But even if you assume the Widom-Larsen paper is correct, there should be detectable neutrons produced in cold fusion experiments. (Incidentally, it's primarily because no neutrons were detected in the original cold fusion experiments of Pons and Fleischmann that physicists were first clued in to the fact that no fusion was happening at all.)
Some proponents claim that the neutrons produced in the Widom-Larsen theory are trapped in the sample material and rapidly absorbed by atoms. But because the neutrons are formed at room temperature, they should have energies typical of thermal neutrons, which move on average at about 2,000 meters a second. That means that a large fraction of them should escape the sample and be easily detectable. Those that don't escape but are instead absorbed by atoms would also lead to detectable radiation as the neutron-activated portions of the material decay. Either way, it would be pretty dangerous to be near an experiment like that, if it worked. The fact that cold fusion researchers are alive is fairly good evidence that their experiments aren't doing what they think they're doing.
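That speed is just the Maxwell-Boltzmann most probable speed for a neutron at room temperature, v = sqrt(2kT/m); a quick check:

```python
import math

# Most probable speed of a room-temperature (thermal) neutron: v = sqrt(2*k*T/m).
k_B = 1.380649e-23   # Boltzmann constant (J/K)
T = 293.0            # room temperature (K)
m_n = 1.674927e-27   # neutron mass (kg)

v = math.sqrt(2 * k_B * T / m_n)
print(f"Most probable thermal neutron speed: {v:.0f} m/s")  # ~2,200 m/s, the ballpark quoted above
```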
But if you're willing to believe Widom-Larsen, and you suspend your disbelief long enough to accept that the neutrons exclusively stay in the sample for some reason, and that the energy released as a result doesn't include any radiation, it should still be pretty easy to determine if the experiments work. All you'd have to do is look for nickel in a sample that initially consisted of pure copper. If published proof exists, I haven't found it yet (please send links to peer-reviewed publications, if you've seen something).
Instead, people like NASA's Dennis Bushnell are happy with decidedly unscientific evidence for cold fusion. Among other things, Bushnell notes that " . . . several labs have blown up studying LENR and windows have melted, indicating when the conditions are "right" prodigious amounts of energy can be produced and released."
Of course, chemical reactions can blow things up and melt glass too. There's no reason to conclude nuclear reactions were responsible. And it certainly isn't publishable proof of cold fusion. Considering that most of these experiments involve hydrogen gas and electricity, it's not at all surprising that labs go up in flames on occasion.
On a related note, a recent article in Forbes magazine reported that Lewis Larsen, of the above-mentioned Widom-Larsen theory, claims that measurements of the isotopes of mercury in compact fluorescent bulbs indicate that LENR reactions are taking place in light fixtures everywhere. If only it were true, it would offer serious support for the Widom-Larsen theory.
It's too bad the paper Larsen cites says nothing of the sort. According to an article in Chemical and Engineering News, the scientists who performed the study of gas in fluorescent bulbs were motivated by the knowledge that some mercury isotopes are absorbed in the glass of the bulbs more readily than others. The isotope ratio inside isn't changing because of nuclear reactions, but instead by soaking into the glass at different rates. Sorry Lewis Larsen, nice try.
Chimpanzee–human last common ancestor
From Wikipedia, the free encyclopedia
The chimpanzee–human last common ancestor (CHLCA, CLCA, or C/H LCA) is the last species that humans, bonobos and chimpanzees share as a common ancestor.
In human genetic studies, the CHLCA is useful as an anchor point for calculating single-nucleotide polymorphism (SNP) rates in human populations where chimpanzees are used as an outgroup. The CHLCA is frequently cited as an anchor for molecular time to most recent common ancestor (TMRCA) determination because the two species of the genus Pan, the bonobos and the chimpanzee, are the species most genetically similar to Homo sapiens.
Time estimates
The age of the CHLCA is an estimate. The fossil finds of Ardipithecus kadabba, Sahelanthropus tchadensis, and Orrorin tugenensis are closest in age and expected morphology to the CHLCA and suggest the LCA (last common ancestor) is older than 7 million years. The earliest studies of apes suggested the CHLCA may have been as old as 25 million years; however, protein studies in the 1970s suggested the CHLCA was less than 8 million years in age. Genetic methods based on orangutan/human and gibbon/human LCA times were then used to estimate a chimpanzee/human LCA of 6 million years, and LCA times between 5 and 7 million years ago are currently used in the literature.[note 1]

"One no longer has the option of considering a fossil older than about eight million years as a hominid no matter what it looks like."
—V. Sarich, Background for man[1]
Because chimps and humans share a matrilineal ancestor, establishing the geological age of that last ancestor allows the estimation of the mutation rate. However, fossils of the exact last common ancestor would be an extremely rare find. The CHLCA is frequently cited as an anchor for mt-TMRCA determination because chimpanzees are the species most genetically similar to humans, yet no known fossils represent the CHLCA itself: no proto-chimpanzee or proto-gorilla fossils have been clearly identified. However, Richard Dawkins, in his book The Ancestor's Tale, proposes that robust australopithecines such as Paranthropus are the ancestors of gorillas, whereas some of the gracile australopithecines are the ancestors of chimpanzees (see Homininae).
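To see how that calibration works in practice, here is a minimal molecular-clock sketch; the ~1.2% human-chimp sequence divergence, the 6.5-million-year split time, and the 25-year generation time are illustrative round numbers assumed for the example, not values from this article:

```python
# Molecular-clock sketch: infer a per-site mutation rate from a calibrated split time.
divergence = 0.012          # assumed ~1.2% aligned-sequence difference, human vs. chimp
split_time_years = 6.5e6    # assumed CHLCA age of 6.5 million years

# Both lineages accumulate substitutions independently after the split, hence the factor of 2.
rate_per_site_per_year = divergence / (2 * split_time_years)
print(f"~{rate_per_site_per_year:.1e} substitutions/site/year")   # ~9e-10

generation_years = 25       # assumed human generation time
print(f"~{rate_per_site_per_year * generation_years:.1e} substitutions/site/generation")  # ~2e-8
```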
"In effect, there is now no a priori reason to presume that human-chimpanzee split times are especially recent, and the fossil evidence is now fully compatible with older chimpanzee-human divergence dates [7 to 10 Ma..."
—White et al. (2009), [2]
Some researchers have tried to estimate the age of the CHLCA (TCHLCA) using biopolymer structures which differ slightly between closely related animals. Among these researchers, Allan C. Wilson and Vincent Sarich were pioneers in the development of the molecular clock for humans. Working on protein sequences, they eventually determined that apes were closer to humans than some paleontologists perceived based on the fossil record.[note 2] Later, Vincent Sarich concluded that the TCHLCA was no greater than 8 million years, with a favored range between 4 and 6 million years before present.
This paradigmatic age stuck with molecular anthropology until the late 1990s, when others began questioning the certainty of the assumption. Currently, the estimation of the TCHLCA is less certain, and there is genetic as well as paleontological support for increasing it. A 13 million year TCHLCA is one proposed age.[2][3]
A source of confusion in determining the exact age of the Pan–Homo split is evidence of a more complex speciation process rather than a clean split between the two lineages. Different chromosomes appear to have split at different times, possibly over a period as long as 4 million years, indicating a long, drawn-out speciation process with large-scale hybridization events between the two emerging lineages.[4] The X chromosome in particular shows very little difference between humans and chimpanzees, though this effect may also partly be the result of rapid evolution of the X chromosome in the last common ancestors.[5] Complex speciation and incomplete lineage sorting of genetic sequences also seem to have occurred in the split between our lineage and that of the gorilla, indicating that "messy" speciation is the rule rather than the exception in large-bodied primates.[6][7] Such a scenario would explain why the divergence age between Homo and Pan has varied with the chosen method and why a single date has so far been hard to pin down.
Richard Wrangham argued that the CHLCA was so similar to the chimpanzee (Pan troglodytes) that it should be classified as a member of the genus Pan and called Pan prior.[8]
Notes
1. Studies have pointed to the slowing molecular clock as monkeys evolved into apes and apes evolved into humans. In particular, macaque monkey mtDNA has evolved 30% more rapidly than African ape mtDNA.
2. "If man and old world monkeys last shared a common ancestor 30 million years ago, then man and African apes shared a common ancestor 5 million years ago..." Sarich & Wilson (1971)
References
1. Background for man: readings in physical anthropology, 1971.
2. White TD, Asfaw B, Beyene Y, et al. (October 2009). "Ardipithecus ramidus and the paleobiology of early hominids". Science 326 (5949): 75–86. doi:10.1126/science.1175802. PMID 19810190.
3. Arnason U, Gullberg A, Janke A (December 1998). "Molecular timing of primate divergences as estimated by two nonprimate calibration points". J. Mol. Evol. 47 (6): 718–27. doi:10.1007/PL00006431. PMID 9847414.
4. Patterson N, Richter DJ, Gnerre S, Lander ES, Reich D (June 2006). "Genetic evidence for complex speciation of humans and chimpanzees". Nature 441 (7097): 1103–8. doi:10.1038/nature04789. PMID 16710306.
5. Wakeley J (March 2008). "Complex speciation of humans and chimpanzees". Nature 452 (7184): E3–4. doi:10.1038/nature06805. PMID 18337768.
6. Scally A, Dutheil JY, Hillier LW, et al. (March 2012). "Insights into hominid evolution from the gorilla genome sequence". Nature 483 (7388): 169–75. doi:10.1038/nature10842. PMC 3303130. PMID 22398555.
7. Van Arsdale, A.P. "Go, go, Gorilla genome". The Pleistocene Scene – A.P. Van Arsdale Blog. Retrieved 16 November 2012.
8. De Waal, Frans B. M. (2002-10-15). Tree of Origin: What Primate Behavior Can Tell Us About Human Social Evolution. pp. 124–126. ISBN 9780674010048.
Viewpoint: Human evolution, from tree to braid
31 December 2013 Last updated at 06:54 ET
By Professor Clive Finlayson Director, Gibraltar Museum
If one human evolution paper published in 2013 sticks in my mind above all others, it has to be the wonderful report in the 18 October issue of the journal Science.
The article in question described the beautiful fifth skull from Dmanisi in Georgia. Most commentators and colleagues were full of praise, but controversy soon reared its ugly head.
What was, in my view, a logical conclusion reached by the authors was too much for some researchers to take.
The conclusion of the Dmanisi study was that the variation in skull shape and morphology observed in this small sample, derived from a single population of Homo erectus, matched the entire variation observed among African fossils ascribed to three species - H. erectus, H. habilis and H. rudolfensis.
The five highly variable Dmanisi fossils belonged to a single population of H. erectus, so how could we argue any longer that similar variation among spatially and temporally widely distributed fossils in Africa reflected differences between species? They all had to be the same species.
I have been advocating that the morphological differences observed within fossils typically ascribed to Homo sapiens (the so-called modern humans) and the Neanderthals fall within the variation observable in a single species.
It was not surprising to find that Neanderthals and modern humans interbred, a clear expectation of the biological species concept.
But most people were surprised with that particular discovery, as indeed they were with the fifth skull and many other recent discoveries, for example the "Hobbit" from the Indonesian island of Flores.
It seems that almost every other discovery in palaeoanthropology is reported as a surprise. I wonder when the penny will drop: when we have five pieces of a 5,000-piece jigsaw puzzle, every new bit that we add is likely to change the picture.
Did we really think that having just a minuscule residue of our long and diverse past was enough for us to tell humanity's story?
If the fossils of 1.8 or so million years ago and those of the more recent Neanderthal-modern human era were all part of a single, morphologically diverse, species with a wide geographical range, what is there to suggest that it would have been any different in the intervening periods?
Probably not so different if we take the latest finds from the Altai Mountains in Siberia into account. Denisova Cave has produced yet another surprise, revealing that, not only was there gene flow between Neanderthals, Denisovans and modern humans, but that a fourth player was also involved in the gene-exchange game.
The identity of the fourth player remains unknown but it was an ancient lineage that had been separate for probably over a million years. H. erectus seems a likely candidate. Whatever the name we choose to give this mystery lineage, what these results show is that gene flow was possible not just among contemporaries but also between ancient and more modern lineages.
Just to show how little we really know of the human story, another genetic surprise has confounded palaeoanthropologists. Scientists succeeded in extracting the most ancient mitochondrial DNA so far, from the Sima de los Huesos site in Atapuerca, Spain.
The morphology of these well-known Middle Pleistocene (approximately 400,000 years old) fossils has long been thought to represent a lineage leading to the Neanderthals.
When the results came in, they were actually closer to the 40,000-year-old Denisovans from Siberia. We can speculate on the result, but others have offered enough alternatives for me not to have to add to them.
The conclusion that I derive takes me back to Dmanisi: We have built a picture of our evolution based on the morphology of fossils and it was wrong.
We just cannot place so much taxonomic weight on a handful of skulls when we know how plastic - or easily changeable - skull shape is in humans. And our paradigms must also change.
Some time ago we replaced a linear view of our evolution by one represented by a branching tree. It is now time to replace it with that of an interwoven plexus of genetic lineages that branch out and fuse once again with the passage of time.
This means, of course, that we must abandon, once and for all, views of modern human superiority over archaic (ancient) humans. The terms "archaic" and "modern" lose all meaning as do concepts of modern human replacement of all other lineages.
It also releases us from the deep-rooted shackles that have sought to link human evolution with stone tool-making technological stages - the Stone Ages - even when we have known that these have overlapped with each other for half-a-million years in some instances.
The world of our biological and cultural evolution was far too fluid for us to constrain it into a few stages linked by transitions.
The challenge must now be to try and learn as much as we can of the detail. We have to flesh out the genetic information and this is where archaeology comes into the picture. We may never know how the Denisovans earned a living, after all we have mere fragments of their anatomy at our disposal, let alone other populations that we may not even be aware of.
What we can do is try to understand the spectrum of potential responses of human populations to different environmental conditions and how culture has intervened in these relationships. The Neanderthals will be central to our understanding of the possibilities because they have been so well studied.
A recent paper, for example, supports the view that Neanderthals at La Chapelle-aux-Saints in France intentionally buried their dead, which contrasts with reports of cannibalistic behaviour not far away at El Sidron in northern Spain.
Here we have two very different behavioural patterns within Neanderthals. Similarly, modern humans in south-western Europe painted on cave walls for a limited period but many contemporaries did not. Some Neanderthals, it seems, did it in a completely different way, by selecting raptor feathers of particular colours. Rather than focus on differences between modern humans and Neanderthals, what the examples show is the range of possibilities open to humans (Neanderthals included) in different circumstances.
The future of human origins research will need to focus along three axes:
- further genetic research to clarify the relationship of lineages and the history of humans;
- research using new technology on old archaeological sites, as at La Chapelle; and
- research at sites that currently retain huge potential for new discoveries.
Sites in the latter category are few and far between. In Europe at least, many were excavated during the last century but there are some outstanding examples remaining. Gorham's and Vanguard Caves in Gibraltar, where I work, are among those because they span over 100,000 years of occupation and are veritable repositories of data.
There is another dimension to this story. It seems that the global community is coming round to recognising the value of key sites that document human evolution.
In 2012, the caves on Mount Carmel were inscribed on the Unesco World Heritage List and the UK Government will be putting Gorham's and associated caves on the Rock of Gibraltar forward for similar status in January 2015. It is recognition of the value of these caves as archives of the way of life and the environments of people long gone but who are very much a part of our story.
Prof Clive Finlayson is director of the Gibraltar Museum and author of the book The Improbable Primate.
Earth's temperature could rise by more than 4°C by 2100, claim some scientists.
Research by the University of New South Wales found that the global climate is more affected by carbon dioxide than previously thought.
The scientists believe temperatures could rise by more than 8°C by 2200 if CO2 emissions are not reduced.
By Sarah Griffiths
PUBLISHED: 10:45 EST, 31 December 2013 | UPDATED: 11:25 EST, 31 December 2013
Ref.: http://www.dailymail.co.uk/sciencetech/article-2531706/Earths-temperature-rise-4-C-2100-claim-scientists.html
Global temperatures could soar by at least 4°C by 2100 if carbon dioxide emissions aren’t slashed, new research warns.
Climate scientists claim that temperatures could rise by at least 4°C by 2100 and potentially more than 8°C by 2200, which could have disastrous results for the planet.
The research, published in the journal Nature, found that the global climate is more affected by carbon dioxide than previously thought.
HOW CLOUDS AFFECT THE CLIMATE
Fewer clouds form as the planet warms, so less sunlight is reflected back into space, driving temperatures on Earth higher.
When water evaporates from the oceans, vapour can rise nine miles into the atmosphere to create rain clouds that reflect light, or it can rise just a few miles and drift back down without forming clouds.
The research could also help solve one of the mysteries of climate sensitivity: the role of cloud formation and whether it has a positive or negative effect on global warming.
Researchers now believe that existing climate models significantly overestimate the number of clouds protecting our atmosphere from overheating.
The study suggests that fewer clouds form as the planet warms, so that less sunlight is reflected back into space, driving temperatures up on Earth.
Professor Steven Sherwood, from the University of New South Wales, said: 'Our research has shown climate models indicating a low temperature response to a doubling of carbon dioxide from pre-industrial times are not reproducing the correct processes that lead to cloud formation.'
'When the processes are correct in the climate models, the level of climate sensitivity is far higher.
'Previously, estimates of the sensitivity of global temperature to a doubling of carbon dioxide ranged from 1.5°C to 5°C.
'This new research takes away the lower end of climate sensitivity estimates, meaning that global average temperatures will increase by 3°C to 5°C with a doubling of carbon dioxide.'
Professor Sherwood told The Guardian that a rise of 4°C would likely be 'catastrophic' rather than just dangerous.
'For example, it would make life difficult, if not impossible, in much of the tropics, and would guarantee the eventual melting of the Greenland ice sheet and some of the Antarctic ice sheet,' he said.
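Since climate sensitivity is defined as the equilibrium warming per doubling of CO2, warming grows with the logarithm of concentration: ΔT = S × log2(C/C0). A minimal sketch of what the quoted ranges imply, using the standard pre-industrial baseline of 280 ppm (an assumption beyond the article's own numbers):

```python
import math

# Equilibrium warming implied by a CO2 level: dT = S * log2(C / C0).
C0 = 280.0   # assumed pre-industrial CO2 concentration (ppm), standard baseline
C = 560.0    # a doubling of CO2, the scenario discussed in the article

for S in (1.5, 3.0, 5.0):   # old low-end estimate vs. the revised 3-5 degC range
    dT = S * math.log2(C / C0)
    print(f"Sensitivity {S:.1f} degC/doubling -> {dT:.1f} degC of warming")
```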
COST OF EXTREME WEATHER EVENTS SOARS BY 60 PER CENT IN 30 YEARS
The costs of extreme weather events have risen dramatically, climate scientists warned last week.
The national science academies of EU Member States believe Europe needs to plan for future probabilities of extreme weather, such as heat waves, floods and storms.
Highlighting a 60 per cent rise over the last 30 years in the cost of damage from extreme weather events across Europe, the European Academies' Science Advisory Council (EASAC) warned of the grave economic and social consequences if European policy makers do not use the latest estimates of future droughts, floods and storms in their planning while adapting to global warming and the resulting climate disruption.
The report urges EU nations to prepare for heat waves and think about how to reduce the number of deaths. Flood defence is also an area that requires improvement, as rising sea levels will leave coastal areas at serious risk from storm surges.
Researchers also believe climate research and adaptation plans should be given more priority.
The key to this narrower but higher estimate can be found by looking at the role of water vapour in cloud formation.
When water vapour is taken up by the atmosphere through evaporation, the updraughts can rise up to nine miles (15km) and form clouds that produce heavy rains.
They can, however, also rise just a few kilometres before returning to the surface without forming the rain clouds that reflect light away from the Earth's surface. When updraughts rise only a few kilometres, they reduce total cloud cover, because they pull more vapour away from the higher clouds that would otherwise form.
Researchers found that climate models predicting a lesser rise in the Earth's temperature do not include enough of this lower-level water vapour process. Most models show nearly all updraughts rising to nine miles and forming clouds that reflect more sunlight; as a result, the global temperature in these models is less sensitive in its response to atmospheric carbon dioxide.
When the models are made more realistic, the water vapour is taken to a wider range of heights in the atmosphere, causing fewer clouds to form as the climate warms.
This increases the amount of sunlight and heat entering the atmosphere and as a result increases the sensitivity of our climate to carbon dioxide or any other perturbation.
The result is that when the models are correct, the doubling of carbon dioxide expected in the next 50 years will see a temperature increase of at least 4°C by 2100.
Professor Sherwood said: 'Climate sceptics like to criticise climate models for getting things wrong and we are the first to admit they are not perfect, but what we are finding is that the mistakes are being made by those models that predict less warming, not those that predict more.
'Rises in global average temperatures of this magnitude will have profound impacts on the world and the economies of many countries if we don’t urgently start to curb our emissions.'
Wednesday, January 1, 2014
Jaw-Dropping Views of Saturn Cap 2013 for NASA's Cassini Spacecraft (Photos)
by Stephanie Pappas, SPACE.com Staff Writer | December 30, 2013 10:08am ET
NASA's Cassini spacecraft has capped 2013 with a spectacular new collection of Saturn photos showcasing the planet's beauty, as well as its trademark rings and strange moons.
The newly released Saturn photos by Cassini include two views of Enceladus, Saturn's sixth-largest moon. Enceladus is a winter-appropriate ice world. Geysers at its poles shoot ice particles into space, some of which make it into orbit around Saturn. Some of this space "snow" becomes part of the E ring, Saturn's second outermost ring, which is made of microscopic particles.
Other images highlight Saturn's largest moon, Titan. There are no jolly elves at Titan's north pole; liquid methane and ethane seas appear as splotchy features near the moon's poles. At the south pole, a high-altitude vortex swirls. The hazy orange atmosphere of Titan is thought to resemble the atmosphere of early Earth.
Saturn itself is the celestial tree-topper of this trio, with a wide-angle look at its north pole revealing the planet's hexagonal jet stream and its spinning polar vortex.
"Until Cassini arrived at Saturn, we didn't know about the hydrocarbon lakes of Titan, the active drama of Enceladus' jets, and the intricate patterns at Saturn's poles," Linda Spiller, the Cassini project scientist at NASA Jet's Propulsion Laboratory in Pasadena, Calif., said in a statement on Dec. 23. "Spectacular images like these highlight that Cassini has given us the gift of knowledge, which we have been so excited to share with everyone."
The Cassini-Huygens spacecraft launched in 1997 and arrived at Saturn in 2004. Cassini orbits Saturn, while Huygens, a lander, touched down on Titan in 2005. In July, Cassini beamed back an amazing image of Saturn's rings with Earth as a tiny pinpoint of light in the background.
Cassini's mission is expected to continue through at least 2017, after which it will be decommissioned by a controlled fall through Saturn's atmosphere.
Follow Stephanie Pappas on Twitter and Google+. Follow us @Spacedotcom, Facebook and Google+. Original article on SPACE.com.
The spectacular rings of Saturn cast dark shadows on the ringed planet as the winter season approaches in Saturn's southern hemisphere in this view from the Cassini spacecraft. With the cold season comes a blue hue on Saturn that is likely caused by a drop in ultraviolet sunlight and the haze it produces. This image was taken on July 29, 2013 and released on Dec. 23. Credit: NASA/JPL-Caltech/Space Science Institute
The globe of Saturn, seen here in natural color, is reminiscent of a holiday ornament in this wide-angle view from NASA's Cassini spacecraft. The characteristic hexagonal shape of Saturn's northern jet stream, somewhat yellow here, is visible. At the pole lies a Saturnian version of a high-speed hurricane, eye and all. This image was taken on July 22, 2013 and released on Dec. 23.
Credit: NASA/JPL-Caltech/Space Science Institute
Saturn's largest and second largest moons, Titan and Rhea, appear to be stacked on top of each other in this true-color scene from NASA's Cassini spacecraft released on Dec. 23, 2013. The north polar hood can be seen on Titan appearing as a detached layer at the top of the moon on the top right. This view looks toward the Saturn-facing side of the smaller Rhea.
Credit: NASA/JPL-Caltech/Space Science Institute
Saturn's moon Enceladus, covered in snow and ice, resembles a perfectly packed snowball in this image from NASA's Cassini mission released on Dec. 23, 2013. This view was taken by Cassini on March 10, 2012. It shows the leading side of Enceladus. North on Enceladus is up and rotated 6 degrees to the left.
Credit: NASA/JPL-Caltech/Space Science Institute