
Thursday, January 2, 2014

NASA's Cold Fusion Folly Posted by Buzz Skyline at http://physicsbuzz.physicscentral.com/2013/04/nasas-cold-fusion-folly.html

I am sad - horrified really - to learn that some NASA scientists have caught cold fusion madness. As is so often the case, the companies and research groups that get involved in this fruitless enterprise tend to make their case by first pointing out how nice it would be to have a clean, cheap, safe, effectively limitless source of power. Who could say no to that?
NASA Langley scientists are hoping to build spacecraft powered with cold fusion. Image courtesy of NASA.
Here's a word of caution: anytime anyone, especially a scientist, starts by telling you about glorious, nigh-unbelievable futuristic applications of their idea, be very, very skeptical.

NASA, for example, is promoting a cold fusion scheme that they say will power your house and car, and even a space plane that is apparently under development, despite the fact that cold fusion power supplies don't exist yet and almost certainly never will. And if that's not enough, NASA's brand of cold fusion can supposedly solve our climate change problems by converting carbon directly into nitrogen.

The one hitch in the plan, unfortunately, is that they're going to have to violate some very well established physics to make it happen. To say the least, I wouldn't count on it.

To be clear, cold fusion does indeed work - provided you use a heavier cousin of the electron, known as a muon, to make it happen. There is no question that muon-catalyzed fusion is a perfectly sound, well-understood process that would be an abundant source of energy, if only we could find or create a cheap source of muons. Unfortunately, it takes way more energy to create the muons that go into muon-catalyzed fusion than comes out of the reaction.

Cold fusion that doesn't involve muons, on the other hand, doesn't work. In fact, the very same physics principles that make muon-catalyzed fusion possible are the ones that guarantee that the muon-less version isn't possible.

To get around the problem presented by nature and her physical laws, NASA's scientists have joined other cold fusion advocates in rebranding their work under the deceptively scientific moniker LENR (Low Energy Nuclear Reactions), and backing it up with various sketchy theories.

The main theory currently in fashion among cold fusion people is the Widom-Larsen LENR theory, which claims that neutrons can result from interactions with "heavy electrons" and protons in a lump of material in a cold fusion experiment. These neutrons, so the argument goes, can then be absorbed in a material (copper is a popular choice) which becomes unstable and decays to form a lighter material (nickel, assuming you start with copper), giving off energy in the process.

At least one paper argues that Widom and Larsen made some serious errors in their calculations that thoroughly undermine their theory. But even if you assume the Widom-Larsen paper is correct, then there should be detectable neutrons produced in cold fusion experiments. (Coincidentally, it's primarily because no neutrons were detected in the original cold fusion experiments of Pons and Fleischmann that physicists were first clued into the fact no fusion was happening at all.)

Some proponents claim that the neutrons produced in the Widom-Larsen theory are trapped in the sample material and rapidly absorbed by atoms. But because the neutrons are formed at room temperature, they should have energies typical of thermal neutrons, which move on average at about 2000 meters a second. That means that a large fraction of them should escape the sample and be easily detectable. Those that don't escape, but are instead absorbed by atoms, would also lead to detectable radiation as the neutron-activated portions of the material decay. Either way, it would be pretty dangerous to be near an experiment like that, if it worked. The fact that cold fusion researchers are alive is fairly good evidence that their experiments aren't doing what they think they're doing.
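That quoted speed is easy to check for yourself. Here's a minimal sketch, assuming room temperature (about 293 K) and using the most probable speed of a Maxwell-Boltzmann distribution, which follows from kT = ½mv²:

```python
import math

# Physical constants (SI units)
k_B = 1.380649e-23       # Boltzmann constant, J/K
m_n = 1.67492749804e-27  # neutron mass, kg

T = 293.0  # room temperature in kelvin (assumption)

# Most probable speed for a Maxwell-Boltzmann distribution: kT = (1/2) m v^2
v = math.sqrt(2 * k_B * T / m_n)
print(f"Most probable thermal neutron speed: {v:.0f} m/s")  # roughly 2200 m/s
```

That works out to roughly 2200 m/s, consistent with the figure above: fast enough that escaping neutrons would be very hard to miss.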

But if you're willing to believe Widom-Larsen, and you suspend your disbelief long enough to accept that the neutrons exclusively stay in the sample for some reason, and that the energy released as a result doesn't include any radiation, it should still be pretty easy to determine if the experiments work. All you'd have to do is look for nickel in a sample that initially consisted of pure copper. If published proof exists, I haven't found it yet (please send links to peer-reviewed publications, if you've seen something).

Instead, people like NASA's Dennis Bushnell are happy with decidedly unscientific evidence for cold fusion. Among other things, Bushnell notes that " . . . several labs have blown up studying LENR and windows have melted, indicating when the conditions are "right" prodigious amounts of energy can be produced and released."

Of course, chemical reactions can blow things up and melt glass too. There's no reason to conclude nuclear reactions were responsible. And it certainly isn't publishable proof of cold fusion. Considering that most of these experiments involve hydrogen gas and electricity, it's not at all surprising that labs go up in flames on occasion.

On a related note, a recent article in Forbes magazine reported that Lewis Larsen, of the above-mentioned Widom-Larsen theory, claims that measurements of the isotopes of mercury in compact fluorescent bulbs indicate that LENR reactions are taking place in light fixtures everywhere. If only it were true, it would offer serious support for the Widom-Larsen theory.

It's too bad the paper Larsen cites says nothing of the sort. According to an article in Chemical and Engineering News, the scientists who performed the study of gas in fluorescent bulbs were motivated by the knowledge that some mercury isotopes are absorbed in the glass of the bulbs more readily than others. The isotope ratio inside isn't changing because of nuclear reactions, but instead by soaking into the glass at different rates. Sorry Lewis Larsen, nice try.

Chimpanzee–human last common ancestor

From Wikipedia, the free encyclopedia

The chimpanzee–human last common ancestor (CHLCA, CLCA, or C/H LCA) is the last species that humans, bonobos and chimpanzees share as a common ancestor.

In human genetic studies, the CHLCA is useful as an anchor point for calculating single-nucleotide polymorphism (SNP) rates in human populations where chimpanzees are used as an outgroup. The CHLCA is frequently cited as an anchor for molecular time to most recent common ancestor (TMRCA) determination because the two species of the genus Pan, the bonobos and the chimpanzee, are the species most genetically similar to Homo sapiens.

Time estimates

The age of the CHLCA is an estimate. The fossil finds of Ardipithecus kadabba, Sahelanthropus tchadensis, and Orrorin tugenensis are closest in age and expected morphology to the CHLCA and suggest the LCA (last common ancestor) is older than 7 million years. The earliest studies of apes suggested the CHLCA may have been as old as 25 million years; however, protein studies in the 1970s suggested the CHLCA was less than 8 million years in age. Genetic methods based on Orangutan/Human and Gibbon/Human LCA times were then used to estimate a Chimpanzee/Human LCA of 6 million years, and LCA times between 5 and 7 million years ago are currently used in the literature.[note 1]
One no longer has the option of considering a fossil older than about eight million years as a hominid no matter what it looks like.
—V. Sarich, Background for man[1]

Because chimps and humans share a matrilineal ancestor, establishing the geological age of that last ancestor allows the estimation of the mutation rate. However, fossils of the exact last common ancestor would be an extremely rare find, and no known fossils represent the CHLCA; indeed, no proto-chimpanzee or proto-gorilla fossils have been clearly identified. Richard Dawkins, in his book The Ancestor's Tale, proposes that robust australopithecines such as Paranthropus are the ancestors of gorillas, whereas some of the gracile australopithecines are the ancestors of chimpanzees (see Homininae).
In effect, there is now no a priori reason to presume that human-chimpanzee split times are especially recent, and the fossil evidence is now fully compatible with older chimpanzee-human divergence dates [7 to 10 Ma...
—White et al. (2009), [2]

Some researchers tried to estimate the age of the CHLCA (TCHLCA) using biopolymer structures which differ slightly between closely related animals. Among these researchers, Allan C. Wilson and Vincent Sarich were pioneers in the development of the molecular clock for humans. Working on protein sequences they eventually determined that apes were closer to humans than some paleontologists perceived based on the fossil record.[note 2] Later Vincent Sarich concluded that the TCHLCA was no greater than 8 million years in age, with a favored range between 4 and 6 million years before present.
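The molecular-clock logic behind such estimates can be sketched with representative numbers. The ~1.2% human-chimp sequence divergence and the 10⁻⁹ substitutions per site per year rate below are illustrative assumptions, not values from this article:

```python
# Molecular clock sketch: divergence time T from sequence divergence d and
# per-lineage substitution rate r. Both lineages accumulate substitutions
# independently after the split, so the observed divergence between them is
# d = 2 * r * T, giving T = d / (2 * r).

d = 0.012   # human-chimp sequence divergence (~1.2%, illustrative)
r = 1.0e-9  # substitutions per site per year (illustrative)

T = d / (2 * r)
print(f"Estimated divergence time: {T / 1e6:.1f} million years")  # 6.0 million years
```

With these inputs the clock lands on the ~6-million-year figure mentioned above; the real uncertainty lies in calibrating r, which is why the fossil anchors matter so much.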

This paradigmatic age persisted in molecular anthropology until the late 1990s, when others began questioning the certainty of the assumption. Currently, the estimate of the TCHLCA is less certain, and there is genetic as well as paleontological support for an older TCHLCA. A 13-million-year TCHLCA is one proposed age.[2][3]

A source of confusion in determining the exact age of the Pan–Homo split is evidence of a more complex speciation process rather than a clean split between the two lineages. Different chromosomes appear to have split at different times, possibly over as much as a 4 million year period, indicating a long and drawn out speciation process with large scale hybridization events between the two emerging lineages.[4] In particular, the X chromosome shows very little difference between humans and chimpanzees, though this effect may also partly be the result of rapid evolution of the X chromosome in the last common ancestors.[5] Complex speciation and incomplete lineage sorting of genetic sequences seem also to have happened in the split between our lineage and that of the gorilla, indicating that "messy" speciation is the rule rather than the exception in large-bodied primates.[6][7] Such a scenario would explain why the divergence age between Homo and Pan has varied with the chosen method and why a single point has so far been hard to pin down.

Richard Wrangham argued that the CHLCA was so similar to chimpanzee (Pan troglodytes), that it should be classified as a member of the Pan genus, and called Pan prior.[8]

Notes

  1. Studies have pointed to the slowing molecular clock as monkeys evolved into apes and apes evolved into humans. In particular, Macaque monkey mtDNA has evolved 30% more rapidly than African ape mtDNA.
  2. "If man and old world monkeys last shared a common ancestor 30 million years ago, then man and African apes shared a common ancestor 5 million years ago..." Sarich & Wilson (1971)

References

  1. Background for man: readings in physical anthropology, 1971
  2. White TD, Asfaw B, Beyene Y, et al. (October 2009). "Ardipithecus ramidus and the paleobiology of early hominids". Science 326 (5949): 75–86. doi:10.1126/science.1175802. PMID 19810190.
  3. Arnason U, Gullberg A, Janke A (December 1998). "Molecular timing of primate divergences as estimated by two nonprimate calibration points". J. Mol. Evol. 47 (6): 718–27. doi:10.1007/PL00006431. PMID 9847414.
  4. Patterson N, Richter DJ, Gnerre S, Lander ES, Reich D (June 2006). "Genetic evidence for complex speciation of humans and chimpanzees". Nature 441 (7097): 1103–8. doi:10.1038/nature04789. PMID 16710306.
  5. Wakeley J (March 2008). "Complex speciation of humans and chimpanzees". Nature 452 (7184): E3–4; discussion E4. doi:10.1038/nature06805. PMID 18337768.
  6. Scally A, Dutheil JY, Hillier LW, et al. (March 2012). "Insights into hominid evolution from the gorilla genome sequence". Nature 483 (7388): 169–75. doi:10.1038/nature10842. PMC 3303130. PMID 22398555.
  7. Van Arsdale, A.P. "Go, go, Gorilla genome". The Pleistocene Scene – A.P. Van Arsdale Blog. Retrieved 16 November 2012.
  8. De Waal, Frans B. M. (2002-10-15). Tree of Origin: What Primate Behavior Can Tell Us About Human Social Evolution. pp. 124–126. ISBN 9780674010048.

Viewpoint: Human evolution, from tree to braid



One and the same: what many thought of as three separate species may in fact be just one (skull D4500)

If one human evolution paper published in 2013 sticks in my mind above all others, it has to be the wonderful report in the 18 October issue of the journal Science.

The article in question described the beautiful fifth skull from Dmanisi in Georgia. Most commentators and colleagues were full of praise, but controversy soon reared its ugly head.

What was, in my view, a logical conclusion reached by the authors was too much for some researchers to take.

The conclusion of the Dmanisi study was that the variation in skull shape and morphology observed in this small sample, derived from a single population of Homo erectus, matched the entire variation observed among African fossils ascribed to three species - H. erectus, H. habilis and H. rudolfensis.

The five highly variable Dmanisi fossils belonged to a single population of H. erectus, so how could we argue any longer that similar variation among spatially and temporally widely distributed fossils in Africa reflected differences between species? They all had to be the same species.

I have been advocating that the morphological differences observed within fossils typically ascribed to Homo sapiens (the so-called modern humans) and the Neanderthals fall within the variation observable in a single species.

It was not surprising to find that Neanderthals and modern humans interbred, a clear expectation of the biological species concept.

But most people were surprised with that particular discovery, as indeed they were with the fifth skull and many other recent discoveries, for example the "Hobbit" from the Indonesian island of Flores.

It seems that almost every other discovery in palaeoanthropology is reported as a surprise. I wonder when the penny will drop: when we have five pieces of a 5,000-piece jigsaw puzzle, every new bit that we add is likely to change the picture.

Did we really think that having just a minuscule residue of our long and diverse past was enough for us to tell humanity's story?

If the fossils of 1.8 or so million years ago and those of the more recent Neanderthal-modern human era were all part of a single, morphologically diverse, species with a wide geographical range, what is there to suggest that it would have been any different in the intervening periods?

Probably not so different if we take the latest finds from the Altai Mountains in Siberia into account. Denisova Cave has produced yet another surprise, revealing that, not only was there gene flow between Neanderthals, Denisovans and modern humans, but that a fourth player was also involved in the gene-exchange game.

The identity of the fourth player remains unknown but it was an ancient lineage that had been separate for probably over a million years. H. erectus seems a likely candidate. Whatever the name we choose to give this mystery lineage, what these results show is that gene flow was possible not just among contemporaries but also between ancient and more modern lineages.

Pit of Bones: a femur recovered from the famed "Pit of Bones" site in Spain yielded 400,000-year-old DNA

Just to show how little we really know of the human story, another genetic surprise has confounded palaeoanthropologists. Scientists succeeded in extracting the most ancient mitochondrial DNA so far, from the Sima de los Huesos site in Atapuerca, Spain.

The morphology of these well-known Middle Pleistocene (approximately 400,000 years old) fossils has long been thought to represent a lineage leading to the Neanderthals.

When the results came in, they were actually closer to the 40,000-year-old Denisovans from Siberia. We can speculate on the result, but others have offered enough alternatives for me not to have to add to them.

The conclusion that I derive takes me back to Dmanisi: We have built a picture of our evolution based on the morphology of fossils and it was wrong.

We just cannot place so much taxonomic weight on a handful of skulls when we know how plastic - or easily changeable - skull shape is in humans. And our paradigms must also change.

The Panel of Hands at El Castillo Cave, Spain: old assumptions are being challenged as new thinking emerges

Some time ago we replaced a linear view of our evolution by one represented by a branching tree. It is now time to replace it with that of an interwoven plexus of genetic lineages that branch out and fuse once again with the passage of time.

This means, of course, that we must abandon, once and for all, views of modern human superiority over archaic (ancient) humans. The terms "archaic" and "modern" lose all meaning as do concepts of modern human replacement of all other lineages.

It also releases us from the deep-rooted shackles that have sought to link human evolution with stone tool-making technological stages - the Stone Ages - even when we have known that these have overlapped with each other for half-a-million years in some instances.

The world of our biological and cultural evolution was far too fluid for us to constrain it into a few stages linked by transitions.

The challenge must now be to try and learn as much as we can of the detail. We have to flesh out the genetic information and this is where archaeology comes into the picture. We may never know how the Denisovans earned a living, after all we have mere fragments of their anatomy at our disposal, let alone other populations that we may not even be aware of.

What we can do is try to understand the spectrum of potential responses of human populations to different environmental conditions and how culture has intervened in these relationships. The Neanderthals will be central to our understanding of the possibilities because they have been so well studied.

A recent paper, for example, supports the view that Neanderthals at La Chapelle-aux-Saints in France intentionally buried their dead, which contrasts with reports of cannibalistic behaviour not far away at El Sidron in northern Spain.

Here we have two very different behavioural patterns within Neanderthals. Similarly, modern humans in south-western Europe painted on cave walls for a limited period but many contemporaries did not. Some Neanderthals, it seems, did it in a completely different way, by selecting raptor feathers of particular colours. Rather than focus on differences between modern humans and Neanderthals, what the examples show is the range of possibilities open to humans (Neanderthals included) in different circumstances.

The future of human origins research will need to focus along three axes:

  • further genetic research to clarify the relationship of lineages and the history of humans;
  • research using new technology on old archaeological sites, as at La Chapelle; and
  • research at sites that currently retain huge potential for new discoveries.

Sites in the latter category are few and far between. In Europe at least, many were excavated during the last century but there are some outstanding examples remaining. Gorham's and Vanguard Caves in Gibraltar, where I work, are among those because they span over 100,000 years of occupation and are veritable repositories of data.

There is another dimension to this story. It seems that the global community is coming round to recognising the value of key sites that document human evolution.

In 2012, the caves on Mount Carmel were inscribed on the Unesco World Heritage List and the UK Government will be putting Gorham's and associated caves on the Rock of Gibraltar forward for similar status in January 2015. It is recognition of the value of these caves as archives of the way of life and the environments of people long gone but who are very much a part of our story.

Prof Clive Finlayson is director of the Gibraltar Museum and author of the book The Improbable Primate.

Earth's temperature could rise by more than 4°C by 2100, claim some scientists.

Research by the University of New South Wales found that the global climate is more affected by carbon dioxide than previously thought.
The scientists believe temperatures could rise by more than 8°C by 2200 if CO2 emissions are not reduced.

By Sarah Griffiths
      Global temperatures could soar by at least 4°C by 2100 if carbon dioxide emissions aren’t slashed, new research warns.   
Climate scientists claim that temperatures could rise by at least 4°C by 2100 and potentially more than 8°C by 2200, which could have disastrous results for the planet.
The research, published in the journal Nature, found that the global climate is more affected by carbon dioxide than previously thought.
Scientists added that temperatures could rise by more than 8°C by 2200 if CO2 emissions are not reduced. The research found that the global climate is more affected by carbon dioxide than previously thought

HOW CLOUDS AFFECT THE CLIMATE


Fewer clouds form as the planet warms so that less sunlight is reflected back into space, driving temperature on Earth higher.
When water evaporates from oceans, vapour can rise nine miles into the atmosphere to create rain clouds that reflect light, or can rise just a few miles and drift back down without forming clouds.
While both processes occur in the real world, current climate models place too much emphasis on the amount of clouds that form on a daily basis.
By looking at how clouds form on the planet, scientists are able to create more realistic climate models, which are used to predict future temperatures.
Scientists have long debated how clouds affect global warming.
The research could also solve one of the mysteries of climate sensitivity - the role of cloud formation and whether it has a positive or negative effect on global warming.
Researchers now believe that existing climate models significantly overestimate the number of clouds protecting our atmosphere from overheating.
 
The study suggests that fewer clouds form as the planet warms, so that less sunlight is reflected back into space, driving temperatures up on Earth.
Professor Steven Sherwood, from the University of New South Wales, said: 'Our research has shown climate models indicating a low temperature response to a doubling of carbon dioxide from pre-industrial times are not reproducing the correct processes that lead to cloud formation.'
'When the processes are correct in the climate models, the level of climate sensitivity is far higher.
Protective: Researchers now believe that existing climate models significantly overestimate the number of clouds protecting the atmosphere from overheating

'Previously, estimates of the sensitivity of global temperature to a doubling of carbon dioxide ranged from 1.5°C to 5°C.

'This new research takes away the lower end of climate sensitivity estimates, meaning that global average temperatures will increase by 3°C to 5°C with a doubling of carbon dioxide.'
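The arithmetic behind these sensitivity figures is straightforward: in the standard simplification (an assumption on my part, not something stated in the article), equilibrium warming scales with the logarithm of the CO2 concentration, so for a climate sensitivity S in degrees per doubling, ΔT ≈ S × log2(C/C0). A minimal sketch, taking the conventional pre-industrial baseline of about 280 ppm:

```python
import math

def warming(c_ppm, sensitivity, c0_ppm=280.0):
    """Equilibrium warming (deg C) at CO2 concentration c_ppm, given a
    climate sensitivity in deg C per doubling of CO2 over baseline c0_ppm."""
    return sensitivity * math.log2(c_ppm / c0_ppm)

# A doubling of CO2 (280 -> 560 ppm) yields exactly the sensitivity:
for s in (3.0, 5.0):
    print(f"S = {s} C/doubling -> {warming(560, s):.1f} C of warming")
```

Since log2(560/280) = 1, a doubling gives ΔT = S, which is why the quoted 3°C to 5°C sensitivity range translates directly into a 3°C to 5°C warming figure.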

Professor Sherwood told The Guardian that a rise of 4°C would likely be 'catastrophic' rather than just dangerous.

'For example, it would make life difficult, if not impossible, in much of the tropics, and would guarantee the eventual melting of the Greenland ice sheet and some of the Antarctic ice sheet,' he said.

COST OF EXTREME WEATHER EVENTS SOARS BY 60 PER CENT IN 30 YEARS
The costs of extreme weather events have risen dramatically, climate scientists warned last week.

The national science academies of EU Member States believe Europe needs to plan for future probabilities of extreme weather, such as heat waves, floods and storms.
Highlighting a 60 per cent rise over the last 30 years in the cost of damage from extreme weather events across Europe, the European Academies' Science Advisory Council (EASAC) warned of the grave economic and social consequences if European policy makers do not use the latest estimates of future droughts, floods and storms in their planning while adapting to global warming and the resulting climate disruption.

The report urges EU nations to prepare for heat waves and think about how to reduce the number of deaths. Flood defence is also an area that requires improvement, as rising sea levels will leave coastal areas at serious risk from storm surges.

Researchers also believe climate research and adaptation plans should be given more priority.

The key to this narrower but higher estimate can be found by looking at the role of water vapour in cloud formation.

When water vapour is taken up by the atmosphere through evaporation, the updraughts can rise up to nine miles (15km) and form clouds that produce heavy rains.

The vapour can, however, also rise just a few kilometres before returning to the surface without forming the rain clouds that reflect light away from the Earth's surface.
When the updraughts rise only a few kilometres, they reduce total cloud cover because they pull vapour away from the higher clouds that would otherwise form.

Researchers found that climate models predicting a lesser rise in the Earth's temperature do not include enough of this lower-level water vapour process.

Most models show nearly all updraughts rising to nine miles and forming clouds, reflecting more sunlight; as a result, the global temperature in these models becomes less sensitive in its response to atmospheric carbon dioxide.
The scientists warned that such a rise in temperatures on Earth would lead to droughts (pictured) and make life difficult for people living in the tropics. A hotter planet would also likely lead to the melting of the Greenland ice sheet and some of the Antarctic ice sheet
When the models are made more realistic, the water vapour is taken to a wider range of heights in the atmosphere, causing fewer clouds to form as the climate warms.
This increases the amount of sunlight and heat entering the atmosphere and as a result increases the sensitivity of our climate to carbon dioxide or any other perturbation.

The result is that when the models are correct, the doubling of carbon dioxide expected in the next 50 years will see a temperature increase of at least 4°C by 2100.
Professor Sherwood said: 'Climate sceptics like to criticise climate models for getting things wrong and we are the first to admit they are not perfect, but what we are finding is that the mistakes are being made by those models that predict less warming, not those that predict more.
'Rises in global average temperatures of this magnitude will have profound impacts on the world and the economies of many countries if we don’t urgently start to curb our emissions.'

Wednesday, January 1, 2014

Jaw-Dropping Views of Saturn Cap 2013 for NASA's Cassini Spacecraft (Photos)

by Stephanie Pappas, SPACE.com Staff Writer   |   December 30, 2013 10:08am ET

The bad science checklist of GMO opponents


The Red FlagWhat the GMO opponents say
The ‘scientifically proven’ subterfuge.The GMO refusers love this tactic. They love to state that GMO’s harm humans, in some unknown way, by stating that it is “scientifically proven.” Setting aside the semantic point that science doesn’t “prove” anything, it provides evidence in support or refutation of a hypothesis, and the body of evidence is used to support a scientific principle. Moreover, there just isn’t a “scientific consensus” of any type that shows that GMO products may harm human or environmental health. However, there is a boatload of data that supports the safety of GMO crops
Persecuted prophets and maligned mavericks: The Galileo Gambit.Users of this tactic will try to persuade you that they belong to a tradition of maverick scientists who have been responsible for great advances despite being persecuted by mainstream science. Natural News, the absolute worst scientific source you could find, thinks that Gilles-Eric Séralini, who published what has to be one of the worst articles about GMO effects on a rat, is the martyr for the anti-GMO cause. 
Empty edicts – absence of empirical evidenceThe GMO opponents frequently use this tactic to make claims in the form of bald statements, without supplying us with supporting evidence. You will see it in numerous declarative statements, “this is the way it is” or “this is true” or “I know/believe this” or “everybody knows this.” When you push them on the evidence, they rely on other Red Flag attempts. 
Anecdotes, testimonials and urban legendsAnecdotes are de facto evidence of the pseudoscience pushing crowd. The problem is that anecdotes don’t equal data, and more anecdotes doesn’t equal better data. Our friends at Natural News go over the deep end providing us anecdotes about the dangers of GMO’s.
Charges of conspiracy, collusion and connivanceConspiracy theories are the standard operating procedures of the anti-GMO crowd. And Monsanto conspiracy theories are the best
Stressing status and appealing to authorityAlthough GMO opponents use all logical fallacies, one of their favorites is the Argument from False or Misleading Authority, which is when someone provides an argument from an authority, but on a topic outside of the particular authority’s expertise or on a topic on which the authority is not disinterested. Furthermore, arguments from authority are judged not on the fact that individual is an authority, but on the quality and quantity of evidence supporting the authority’s conclusions. For example, David Suzuki, an eminent zoologist and geneticist is vehemently opposed to GMO’s, yet his quality, let alone quantity, of evidence in support of his belief is underwhelming. 
Devious deception in displaying data: Cherry pickingGMO opponents love Cherry Picking. They will focus on one or two legitimate studies (or worse yet, only a part of the a study), while ignoring the body of evidence. Science does not function by inventing a conclusion and finding only data (or research) that supports the conclusion; in fact, good science examines the peer-reviewed data and find where it leads. Moreover, any cherry picked study that supports the anti-GMO conclusion is never critically analyzed, truly an official mark of good science. For example, the Séralini study I mentioned previously was just horrendous science with amateur errors that would embarrass your local high school science fair. But it’s accepted as the Truth by GMO opponents
Repetition of discredited argumentsIn this tactic, people persist in repeating claims that have been shown over and over to have no foundation. It’s like the Nazi’s Big Lie, basically repeating a lie so often and with such authority that the listener just assumes that it’s true, or that no one would have impudence to actually state a lie. The GMO opponents state so many lies about Monsanto, crops, and how it harms human health that the average listener assumes it must be the truth. Once again, only evidence matters, and it becomes difficult to get the liar (or the person pushing the lie) to provide evidence. 
Duplicity and distraction
This is the False Dichotomy logical fallacy, which states that there are only two possible, and usually opposite, positions from which to choose. You will hear many times from GMO refusers that “either you’re against GMOs or you support Monsanto’s plan to do XYZ.” In fact, there’s a perfectly valid position that Monsanto is a bad company, or even a polluter, but that GMO crops are still safe. The worst part of the False Dichotomy fallacy is that the GMO refusers want you to believe that if one argument is shown false (or true), the other must be true (or false). One form of this argument has even been renamed argumentum ad Monsantium: if you support genetically modified foods, you must love Monsanto.
Wishful thinking – favoring fantasy over fact
We all fall victim to this tactic because we use it on ourselves. We like to believe things that conform with our wishes or desires, even to the extent of ignoring evidence to the contrary. People just want to believe that natural foods (whatever that may be, since many crops were genetically modified 10,000 years ago when we first domesticated many of the most common crops) are somehow better than all other foods, and evidence be damned.
Appeals to ancient wisdom – trusting traditional trickery
In the world of food, somehow there’s a belief that our ancestors ate better and healthier. Some go back 10,000–20,000 years to try to convince everyone that the “Paleolithic diet” is the right one, or that our ancestors somehow ate better, organic foods, or that farmers in the 13th century knew better how to farm. In fact, food is better today because we have better transportation systems, which means less spoilage and generally healthier food. Humans today not only live longer, we live more productive, active lives. Although there are lots of reasons for this (vaccinations, sanitation, medicines), one of them is more and better food. Our ancestors had pests, wars, plagues (which killed laborers), and many other problems that made food worse.
Technobabble and tenuous terminology: the use of pseudoscientific language
In this tactic, people use invented terms that sound “sciencey” or co-opt real science terms and apply them incorrectly. The aforementioned Natural News is the most guilty of this, but it’s one of the fundamental tenets of pseudoscience. There’s a belief among the GMO haters that GMO food will somehow incorporate itself into the human genome. They use all kinds of science terminology to sell this point of view, but on further examination it’s all laughable. One expert on gene therapy explains why: “the reason is that I have experience with working with DNA, human, mouse, and otherwise, including injecting it into tissues and trying to get it to express the protein for which it encodes. This is not a trivial matter. Think of it this way. If it were, gene therapy would be an almost trivial matter. But it’s not. In general, it’s difficult to induce human cells to take up foreign DNA in tissue. Even with viral vectors, it’s hard to get more than a small percentage of cells not only to take up the DNA but to express detectable levels of protein.” Real science.
Conflating correlation with causation: rooster syndrome
This is the infamous Post hoc ergo propter hoc logical fallacy: the belief that because a second event follows the first, the first event must be the cause of the second. So, just so you know, GMOs cause autism. Oh wait, everything causes autism.
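Rooster syndrome is easy to demonstrate numerically. The sketch below (my own toy simulation, not from the post) generates pairs of completely independent series and measures how often they correlate strongly. Trending series, which is what most real-world statistics look like, “correlate” all the time; raw noise almost never does.

```python
import math
import random
from itertools import accumulate

def pearson(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def spurious_fraction(trials=300, n=100, cumulative=True, seed=1):
    """Fraction of trials in which two INDEPENDENT series show |r| > 0.5."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        a = [rng.gauss(0, 1) for _ in range(n)]
        b = [rng.gauss(0, 1) for _ in range(n)]
        if cumulative:
            # Running totals turn noise into trending "random walks",
            # which resemble most real-world time series (sales, diagnoses...).
            a, b = list(accumulate(a)), list(accumulate(b))
        if abs(pearson(a, b)) > 0.5:
            hits += 1
    return hits / trials

print(f"trending series: {spurious_fraction(cumulative=True):.0%} correlate strongly")
print(f"plain noise:     {spurious_fraction(cumulative=False):.0%} correlate strongly")
```

With no causal link whatsoever, a large share of the trending pairs still show a strong correlation, which is exactly how a “GMOs cause autism” chart gets made.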
Straw man: crushing concocted canards
Another favorite logical fallacy of pseudoscience pushers is the Strawman Argument. Remember, logical fallacies flourish when one side of the argument completely lacks evidence. The strawman argument is a method by which one side invents a position or quality of the other side, then proceeds to destroy that invented position. Monsanto, again, is the King Strawman for the anti-GMO crowd. As I mentioned above, there are probably some valid reasons to dislike Monsanto, but the invented belief that Monsanto is ruthless about harming human beings is unsupported by any evidence whatsoever.
Indelible initial impressions: the anchoring effect
Anchoring is the human tendency to rely almost entirely on one piece of evidence or one study, usually one that we encountered early, when making a decision. The aforementioned Séralini study has been used over and over again by anti-GMO forces as “proof” that GMOs cause cancer, even though the evidence was so bad that the scientific community, including individuals who don’t often discuss GMOs, mocked it without remorse.
Perceiving phoney patterns: apophenia
This happens when you convince yourself, or someone tries to convince you, that some data reveal a significant pattern when really the data are random or meaningless.
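Apophenia thrives on how streaky genuine randomness is. A quick simulation (my own illustration, not from the post) shows that a run of six or more identical coin flips, which looks for all the world like a “pattern,” turns up in most random 100-flip sequences:

```python
import random

def longest_run(flips):
    """Length of the longest run of identical outcomes in a sequence."""
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

def streak_fraction(trials=1000, n=100, streak=6, seed=42):
    """Fraction of random n-flip sequences containing a run of `streak` or more."""
    rng = random.Random(seed)
    hits = sum(longest_run([rng.choice("HT") for _ in range(n)]) >= streak
               for _ in range(trials))
    return hits / trials

print(f"{streak_fraction():.0%} of random 100-flip sequences contain a streak of 6+")
```

If a fair coin produces “suspicious” streaks most of the time, a cluster of illnesses in a random population proves nothing by itself.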
Banishing boundaries and pushing panaceas – applying models where they don’t belong
Those who use this tactic take a model that works under certain conditions and try to apply it more widely, to circumstances beyond its scope, where it does not work. Recently, I discussed research that seemed to indicate that GMO rice passed some fitness (in the biological sense) to weedy rice (rice-like grasses which are not agriculturally useful). Except the article didn’t actually show that result (it was poorly done), and some news sources wildly claimed that these results meant GMO crops actually benefit weeds. Setting aside the low quality of the research (and some egregious experimental errors), it is scientifically illogical to apply these results to other genetically modified crops.
Single study syndrome – clutching at convenient confirmation
This tactic shows up when a person with a vested interest in a particular point of view pounces on some new finding that seems to either support or threaten that point of view. It’s usually used in a context where the weight of evidence is against the perpetrator’s view. In other words, it’s a type of bias in which the person ignores all other evidence while trumpeting this one study.
Appeal to nature – the authenticity axiom
GMO opponents push the Appeal to Nature, which is the belief or suggestion that “natural” is always better than “unnatural”; it assumes that “nature” is good and “unnatural” is not. Yoni Freedhoff, an MD and Professor of Family Medicine, recently wrote that believing that nature is good and chemicals are bad “is arrogant because it suggests that the entirety of the natural world has been created purely as a service to humankind – that somehow the earth and everything on it grows simply for our pleasure or our consumption.” There is nothing in nature that is necessarily and inherently better than something invented by mankind, but don’t tell that to the GMO refusers.
The reversed responsibility response – switching the burden of proof
A form of the Argument from Ignorance, this is a logical fallacy in which the arguer deflects a demand for evidence of a claim by demanding that the other side provide evidence to refute the claim. Then, if you cannot refute it, the arguer declares victory: if you can’t prove it’s untrue, it must be true. Or vice versa.
The scary science scenario – science portrayed as evil
Sometimes invoking the precautionary principle, the anti-GMO crowd will often scream that “science,” as if it were an anthropomorphic organism, has ulterior motives. I presume people watch too many movies, which often make scientists out to be evil Dr. Frankensteins rather than life-saving heroes like Jonas Salk or Paul Offit. As I’ve stated before, science has no inherent motive but to understand the natural universe; it is a method to gain information. And the evil recently attributed to “science” is just patently false.
False balance – cultivating counterfeit controversy to create confusion
False balance is an annoying tactic used by the anti-science crowd to make it appear that there’s a debate and that both sides of the debate are essentially equivalent. Many journalists routinely look for a representative of each “side” to include in their stories, even when it is inappropriate. Anti-GMO groups like to exploit this tendency so that their point of view gains undeserved publicity. There is no scientific debate about GMOs.
Confirmation bias – ferreting favourable findings while overlooking opposing observations
Confirmation Bias is a cognitive bias that causes us to search out evidence that supports our point of view while ignoring anything that doesn’t. It is a basic human behavior. The anti-GMO world, no different from any other pseudoscience-pushing group, subjects itself to this type of bias regularly. There are substantially more peer-reviewed articles stating that there are no issues with GMO foods, yet if you read any blog post against GMOs, it will mention only the rare study (cue Séralini again) that supports the anti-GMO point of view. Again, good science takes all the evidence, weighs higher-quality evidence against lower-quality evidence, then decides if there’s enough to support or reject a hypothesis. Real science is not coming to a conclusion and then finding evidence to support it.

If you think that GMO crops are safe and a necessary tool to feed the world, if you think that genetically modified organisms are necessary for medicine, or if you think that a new genetically modified flu vaccine, safer than the old egg-based one, will save more lives, then all of the above will make sense. You will see how the anti-GMO activists use bad science.

If you didn’t have much of an opinion about GMOs, but maybe thought that there was something wrong with them, then understand that nearly everything negative you’ve heard about GMOs is based on logical fallacies and bad science.

If you’re against GMOs because you think science supports you, then you’re no different from the anti-science people who populate the global warming denier community. In fact, if you think you have “science” supporting your nonsense beliefs about GMOs, just understand that you use the same tactics, the same unscientific rubbish, that the global warming deniers use. In other words, you use the same tactics as right wingers, which should make you proud.

Quantum Entanglement to Aid Gravitational Wave Hunt


Detecting the faint ripples in spacetime known as gravitational waves is the primary objective of the Laser Interferometer Gravitational-Wave Observatory (LIGO), a huge collaboration that has been searching for gravitational waves since 2002. Now LIGO scientists have developed a new technique that almost doubles the sensitivity of these detectors by exploiting “squeezed light” and the phenomenon of quantum entanglement.

ANALYSIS: Gravitational Affairs: LIGO’s Little Black Box

LIGO is essentially a giant interferometer. There is a very large mirror hung in such a way as to form an arm, with two more mirrors hung perpendicular to it to form an L-shape when viewed from above. Scientists then pass laser light through a beam splitter, thereby dividing the beam between those two arms, and let the light bounce back and forth a few times before returning to the beam splitter.
LIGO has three such detectors, since it needs to operate at least two at the same time as a control, so they don’t get false positives. A passing gravitational wave will cause ripples in spacetime, which in turn will change the distance measured by a light beam; the amount of light falling on the strategically placed photodetector will vary slightly in response.
The resulting signal tells scientists how the light hitting the photodetector changes over time. LIGO scientists liken the instrument to “a microphone that converts gravitational waves into electrical signals.”

ANALYSIS: Closing in on Gravitational Waves

Here’s the biggest problem facing LIGO: any change in the beams caused by gravitational waves is so tiny, it’s drowned out by a quantum effect called vacuum fluctuations. Per Ars Technica:
Basically, the place where we measure the light coming out of the interferometer is also a place where light enters the interferometer. So, we aren’t adding two light fields together at the beamsplitter. No, we are adding four light fields together. Scientists are not so stupid as to accidentally allow stray light into this device, but nature has its own way of producing strays. The vacuum itself is seething with photons that pop into existence and then disappear again. On average, nothing is there. Unfortunately for LIGO, on average is not good enough.
So improving the sensitivity of LIGO’s detectors is an ongoing quest. And according to physicist and blogger Dave Bacon (a.k.a. The Quantum Pontiff), there was a seminal paper published in 1981 by Carl Caves demonstrating that using so-called squeezed states of light could reduce the inherent uncertainty in interferometers by creating entangled photons between the two mirrors. In Bacon’s words: “We can fight quantum with quantum!”

ANALYSIS: Are We Living in a Hologram?

How To Entangle Photons

When subatomic particles collide, they can become invisibly connected, though they may be physically separated. Even at a distance, they are inextricably interlinked and act like a single object — hence the term “entanglement,” or, as Einstein preferred to call it, “spooky action at a distance.”
This is useful because if you measure the state of one, you will know the state of the other without having to make a second measurement, because the first measurement determines what the properties of the other particle must be as well. Cornell University physicist N. David Mermin has described entanglement as “the closest thing we have to magic.”
WATCH VIDEO: Discovery News investigates how and why the Large Hadron Collider is smashing protons together at record energies.
So disturbances in one part of the universe can instantly affect other, distant parts of the universe, mysteriously bypassing the ubiquitous speed-of-light barrier. Spooky!
There are lots of different ways particles can become entangled, but in every case, both particles must arise from a single “mother” process. It’s a bit like how identical twins emerge from a single fertilized egg, sharing the genetic material between them.

ANALYSIS: We May Not Live in a Hologram After All

For instance, passing a single photon through a special kind of crystal can split that photon into two new “daughter” particles. We’ll call them “green” and “red.” Those particles will be entangled. Energy must be conserved, so both daughter particles have a lower frequency and energy than the original mother particle, but the total energy between them is equal to the mother’s energy.
We have no way of knowing which is the green one and which is the red. We just know that each daughter photon has a 50/50 chance of being one or the other color. But should we chance to see one of the particles and note that it is red, we can instantly conclude that the other must be green.
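The red/green correlation can be sketched with a classical toy model (my own illustration; it reproduces only the correlation statistics described above, not the genuinely quantum behavior of entanglement):

```python
import random

def entangled_pair(rng):
    """Toy daughter-photon pair: each photon looks 50/50 green/red on its own,
    but the two colors in a pair are always perfectly anti-correlated."""
    first = rng.choice(["green", "red"])
    second = "red" if first == "green" else "green"
    return first, second

rng = random.Random(0)
pairs = [entangled_pair(rng) for _ in range(1000)]
greens = sum(first == "green" for first, _ in pairs)
# Seeing one photon's color fixes the partner's; no second measurement needed.
assert all(first != second for first, second in pairs)
print(f"{greens}/1000 first photons were green; every partner had the other color")
```

Each photon by itself is a coin flip, yet knowing one color always determines the other, which is the “measure one, know both” property the article describes.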
Entanglement is a tricky thing, and easily undone by even the slightest interference. That’s why it’s useful in quantum cryptography: the system can detect any “eavesdropper” immediately and know the transmission has been compromised. It now seems likely that gravitational waves could be detected just as easily, by leaving a telltale signature on any entangled particles they encounter.

Squeezing the Light

Physicists have been using light (photons) to probe the mysteries of nature for centuries. But at the quantum scale, uncertainty (a.k.a. quantum noise) gets in the way of gleaning useful information.
Squeezing is a way to increase certainty in one quantity (e.g., position or speed) at the price of decreased certainty in a complementary property. Using special crystals, this squeezing process creates quantum-entangled photons between the interferometer’s mirrors, turning one photon into two.
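The tradeoff can be made concrete for an ideal squeezed vacuum state, whose quadrature variances scale as e^(∓2r) with squeezing parameter r. This is a minimal numeric sketch of my own, in natural units where the unsqueezed vacuum has variance 1/2 in each quadrature:

```python
import math

def squeezed_variances(r):
    """Quadrature variances of an ideal squeezed vacuum state (hbar = 1):
    squeezing shrinks one variance by e^(-2r) and inflates the other
    by e^(+2r), leaving their product at the Heisenberg minimum."""
    return 0.5 * math.exp(-2 * r), 0.5 * math.exp(2 * r)

for r in (0.0, 0.5, 1.0):
    v_minus, v_plus = squeezed_variances(r)
    print(f"r={r}: squeezed={v_minus:.4f}, anti-squeezed={v_plus:.4f}, "
          f"product={v_minus * v_plus:.4f}")
```

The uncertainty product stays pinned at the minimum for every r: squeezing never cheats Heisenberg, it just shifts the noise into the quadrature the measurement doesn’t care about.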
Now you have highly sensitive entangled photons directly in the path of any gravitational waves that happen by. And LIGO scientists have successfully demonstrated that this does, indeed, result in more sensitive detectors, as evidenced in the plot above showing the noise at each frequency in one of the detectors. Per Bacon (again):
The red line shows the reduced noise when squeezed light is used. To get this to work, the squeezed quadrature must be in phase with the amplitude (readout) quadrature of the observatory output light, and this results in path entanglement between the photons in the two beams in the arms of the interferometer. The fluctuations in the photon counts can only be explained by stronger-than-classical correlation among the photons.
“The strange thing is, when you look at it, there’s nothing there, yet this ‘nothing’ which is the vacuum fluctuation can be squeezed and we know it’s real, because it changes the sensitivity of the detector,” physicist David Blair told ABC Science. Blair is director of the Australian International Gravity Wave Research Centre at the University of Western Australia, part of the LIGO collaboration.
LIGO hasn’t reached its full sensitivity yet; that will happen once the planned upgrades for Advanced LIGO are complete. Hopefully, by then, this new “squeezed light” approach can be incorporated into those upgraded detectors. Gravitational waves are a prediction of general relativity. It would be strangely fitting if quantum mechanics ultimately helped detect them.
Image credits: LIGO
