
Tuesday, December 17, 2013

Carl Sagan and Jacob Bronowski Issue Same Words of Warning 25 Years apart.



I'd like to combine this with an old post (Isaac Asimov wrote a lot on this too) and ask, quite seriously, whether we have made serious progress yet.  All comments invited.


Words of Warning from Forty Years Ago by Jacob Bronowski in "The Ascent of Man"

"Knowledge is not a loose-leaf notebook of facts. Above all, it is a responsibility for the integrity of what we are, primarily of what we are as ethical creatures. You cannot possibly maintain that informed integrity if you let other people run the world for you while you yourself continue to live out of a ragbag of morals that come from past beliefs. This is really crucial today. You can see it is pointless to advise people to learn differential equations, or do a course in electronics or computer programming. And yet, fifty years from now, if an understanding of man's origins, his evolution, his history, his progress, is not the commonplace of the schoolbooks, we shall not exist. The commonplace of the schoolbooks of tomorrow is the adventure of today, and that is what we are engaged in."

That was in 1973. It is now 2013, meaning that we have but ten years left to make this vision a reality or we are all in peril. I think Bronowski was pessimistic in his prophecy, but there must be some time period in which it is true. We may have more than a decade, but certainly only decades, before it will be true, which means that we must start now if there is to be any realistic hope. Less than half of Americans accept Darwinian evolution. Many don't trust scientific and technological progress even though they themselves benefit from it.


More Walter Mitty than James Bond: How climate change expert posed as CIA agent for 10 years — RT USA


David Strumfels' comment:  Global warming deniers will probably make much of this, although it isn't relevant to the scientific evidence for global warming.  But it does make one wonder.

More Walter Mitty than James Bond: How climate change expert posed as CIA agent for 10 years

Published time: December 17, 2013 11:21
Edited time: December 17, 2013 17:37
John C. Beale (Photo from flickr.com/oversight)
A leading American expert on climate change and the Environmental Protection Agency’s highest-paid employee deserves to spend 30 months behind bars for lying to his bosses about being a CIA spy to avoid doing his real job, US federal prosecutors say.

Prosecutors described John Beale’s actions as “crimes of massive proportion” that were “offensive” to those who actually work for the CIA, NBC reported.

Beale pled guilty in September, and has been accused of major fraud of almost $1 million in salary and other benefits over a decade.

While Beale claimed to have a James Bond-style lifestyle, in reality he was leading a Walter Mitty-style fantasy double life.

At one point, Beale in fact claimed to be urgently needed in Pakistan because the Taliban was torturing his CIA replacement, according to EPA Assistant Inspector General Patrick Sullivan, who headed the investigation into Beale’s activities.

“Due to recent events that you have probably read about, I am in Pakistan,” Beale wrote to Gina McCarthy, the EPA’s administrator, in an e-mail dated December 18, 2010. “Got the call Thurs and left Fri. Hope to be back for Christmas ….Ho, ho, ho.”

The EPA official also failed to show at his workplace for long stretches of time, for instance, 18 months starting from June 2011. During this period, he did “absolutely no work,” according to Beale’s lawyer.

In 2008, Beale didn’t show up at the EPA for six months, telling his boss that he was part of a special multi-agency election-year project relating to “candidate security.” He billed the government $57,000 for five trips to California that were made purely “for personal reasons,” his lawyer acknowledged. It turned out that Beale’s parents lived there.

He also claimed to be suffering from malaria that he got while serving in Vietnam – another story that turned out to be a lie.

Among his bonuses and travel expenses were first-class trips to London, where he stayed at five-star hotels and racked up thousands in bills for limos and taxis.

However, during most of the time Beale avoided work, he was at his Northern Virginia home, riding bikes, doing housework and reading books, or at a vacation house on Cape Cod – all while he claimed he was at the CIA’s Northern Virginia headquarters.

“He’s never been to Langley (the CIA’s Virginia headquarters),” said Patrick Sullivan. “The CIA has no record of him ever walking through the door.”

At the same time, Beale had been the highest-paid official at the EPA, receiving $206,000 a year.
Beale’s scam was revealed in 2013 when it was noticed that he was still receiving his salary a year and a half after he retired.

When first questioned by EPA officials early in 2013 about his alleged CIA undercover work, Beale brushed them off by saying he couldn’t discuss it, according to Sullivan.

Weeks later, after being confronted again by investigators, Beale admitted to lying, but “didn’t show much remorse” and explained he acted this way to “puff up his own image.”

Through his lawyer, Beale has asked the prosecution for leniency, blaming his behavior on psychological problems.

“With the help of his therapist, Mr. Beale has come to recognize that, beyond the motive of greed, his theft and deception were animated by a highly self-destructive and dysfunctional need to engage in excessively reckless, risky behavior,” attorney John Kern wrote.

The lawyer added that the desire to manipulate people by making up grandiose lies stemmed from Beale’s “insecurities.”

The two sentencing memos and other documents on the trial present new details of the case that’s been branded one of the most audacious and creative federal frauds in history.

“I thought, ‘Oh my God, how could this possibly have happened in this agency?’ I’ve worked for the government for 35 years. I’ve never seen a situation like this,” Patrick Sullivan told NBC News.

One of the most important points raised by the investigation was why it took the EPA administration so long to start looking into Beale’s grandiose stories.

“There’s a certain culture here at the EPA where the mission is the most important thing. They don’t think like criminal investigators. They tend to be very trusting and accepting,” Sullivan said.

It was revealed that Beale publicly retired but kept getting his salary for another year and a half, with his expense vouchers approved by a colleague whose conduct is now being looked into.

Beale is set to be sentenced in Washington on Wednesday.

Brain Neurons Subtract Images, Use Differences: ScienceDaily

ScienceDaily: Your source for the latest research news.  Original source of this article:

http://www.sciencedaily.com/releases/2013/12/131217104240.htm?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+sciencedaily%2Ftop_news%2Ftop_science+%28ScienceDaily%3A+Top+News+--+Top+Science%29&utm_content=FaceBook

Dec. 17, 2013 — Researchers have hitherto assumed that information supplied by the sense of sight was transmitted almost in its entirety from its entry point to higher brain areas, across which visual sensation is generated. "It was therefore a surprise to discover that the data volumes are considerably reduced as early as in the primary visual cortex, the bottleneck leading to the cerebrum," says PD Dr Dirk Jancke from the Institute for Neural Computation at the Ruhr-Universität. "We intuitively assume that our visual system generates a continuous stream of images, just like a video camera. However, we have now demonstrated that the visual cortex suppresses redundant information and saves energy by frequently forwarding image differences."
 
Plus or minus: the brain's two coding strategies
The researchers recorded the neurons' responses to natural image sequences, for example vegetation landscapes or buildings. They created two versions of the images: a complete one and one in which they had systematically removed certain elements, specifically vertical or horizontal contours. If the time elapsing between the individual images was short, i.e. 30 milliseconds, the neurons represented complete image information. That changed when the time elapsing in the sequences was longer than 100 milliseconds. Now, the neurons represented only those elements that were new or missing, namely image differences. "When we analyse a scene, the eyes perform very fast miniature movements in order to register the fine details," explains Nora Nortmann, postgraduate student at the Institute of Cognitive Science at the University of Osnabrück and the RUB work group Optical Imaging. The information regarding those details is forwarded completely and immediately by the primary visual cortex. "If, on the other hand, the time elapsing between the gaze changes is longer, the cortex codes only those aspects in the images that have changed," continues Nora Nortmann. Thus, certain image sections stand out and interesting spots are easier to detect, as the researchers speculate.
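To make the "plus or minus" idea concrete, here is a toy Python sketch of the two coding strategies. It is only an illustration of the principle, not the researchers' actual analysis; the image values and the way the timing thresholds are applied are assumptions for demonstration.

```python
import numpy as np

# Two successive "images" (tiny grayscale patches); the values are invented.
previous_frame = np.array([[0.2, 0.8, 0.2],
                           [0.2, 0.8, 0.2],
                           [0.2, 0.8, 0.2]])   # small patch with a vertical contour
current_frame = np.array([[0.2, 0.8, 0.2],
                          [0.2, 0.8, 0.8],     # one element has changed
                          [0.2, 0.8, 0.2]])

def forwarded_signal(current, previous, interval_ms):
    """Full image for rapid sequences, only the image difference for slower ones."""
    if interval_ms <= 30:
        return current            # complete image information
    return current - previous     # sparse: only what is new or missing

print(forwarded_signal(current_frame, previous_frame, interval_ms=30))   # full frame
print(forwarded_signal(current_frame, previous_frame, interval_ms=120))  # mostly zeros
```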

"Our brain is permanently looking into the future"
This study illustrates how activities of visual neurons are influenced by past events. "The neurons build up a short-term memory that incorporates constant input," explains Dirk Jancke. However, if something changes abruptly in the perceived image, the brain generates a kind of error message on the basis of the past images. Those signals do not reflect the current input, but the way the current input deviates from the expectations. Researchers have hitherto postulated that this so-called predictive coding only takes place in higher brain areas. "We demonstrated that the principle applies for earlier phases of cortical processing, too," concludes Jancke. "Our brain is permanently looking into the future and comparing current input with the expectations that arose based on past situations."
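Jancke's "error message" is essentially what computational neuroscientists call predictive coding. A minimal sketch of the idea, assuming (purely for illustration) that a simple running average stands in for the cortical short-term memory:

```python
import numpy as np

signal = np.array([1.0, 1.0, 1.0, 1.0, 5.0, 1.0, 1.0])  # abrupt change at step 4

prediction = signal[0]   # start with the steady input as the expectation
alpha = 0.5              # how quickly the "short-term memory" adapts
for t, x in enumerate(signal):
    error = x - prediction            # "error message": deviation from expectation
    prediction += alpha * error       # update the expectation from recent input
    print(f"t={t}  input={x:.1f}  error={error:+.2f}  expectation={prediction:.2f}")
```

The error stays near zero while the input matches expectations and spikes only at the abrupt change, which is the flavor of signal described in the study.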

Observing brain activities in millisecond range
In order to monitor the dynamics of neuronal activities in the brain in the millisecond range, the scientists used voltage-dependent dyes. Those substances fluoresce when neurons receive electrical impulses and become active. Thanks to a high-resolution camera system and the subsequent computer-aided analysis, the neuronal activity can be measured across a surface of several square millimetres. The result is a temporally and spatially precise film of transmission processes within neuronal networks.

With Apologies to Christians, Jews, and Yiddish Speakers of all Stripes

Oy!  To the world, this schmuck has come,
To bother us all day!
With all his hymns and sermons,
He's worse than Uncle Hermann,
And on and on he goes,
And on and on he goes,
Until we all have a hole in the head.

Monday, December 16, 2013

A summary of the evidence that most published research is false | Simply Statistics


A summary of the evidence that most published research is false


One of the hottest topics in science centers on two main claims:
  • Most published research is false
  • There is a reproducibility crisis in science
The first claim is often stated in a slightly different way: that most results of scientific experiments do not replicate. I recently got caught up in this debate and I frequently get asked about it.
So I thought I'd do a very brief review of the reported evidence for the two perceived crises. An important point is that all of the scientists below have made the best effort they can to tackle a fairly complicated problem, and this is early days in the study of science-wise false discovery rates. But the take home message is that there is currently no definitive evidence one way or another about whether most results are false.
  1. Paper: Why most published research findings are false. Main idea: People use hypothesis testing to determine if specific scientific discoveries are significant. This significance calculation is used as a screening mechanism in the scientific literature. Under assumptions about the way people perform these tests and report them, it is possible to construct a universe where most published findings are false positive results (a small simulation sketch of this construction appears after this list). Important drawback: The paper contains no real data; it is purely based on conjecture and simulation.
  2. Paper: Drug development: Raise standards for preclinical research. Main idea: Many drugs fail when they move through the development process. Amgen scientists tried to replicate 53 high-profile basic research findings in cancer and could only replicate 6. Important drawback: This is not a scientific paper. The study design, replication attempts, selected studies, and the statistical methods to define "replicate" are not defined. No data is available or provided.
  3. Paper: An estimate of the science-wise false discovery rate and application to the top medical literature. Main idea: The paper collects P-values from published abstracts of papers in the medical literature and uses a statistical method to estimate the false discovery rate proposed in paper 1 above. Important drawback: The paper only collected data from major medical journals and the abstracts. P-values can be manipulated in many ways that could call into question the statistical results in the paper.
  4. Paper: Revised standards for statistical evidence. Main idea: The P-value cutoff of 0.05 is used by many journals to determine statistical significance. This paper proposes an alternative method for screening hypotheses based on Bayes factors. Important drawback: The paper is a theoretical and philosophical argument for simple hypothesis tests. The data analysis recalculates Bayes factors for reported t-statistics and plots the Bayes factor versus the t-test, then makes an argument for why one is better than the other.
  5. Paper: Contradicted and initially stronger effects in highly cited research. Main idea: This paper looks at studies that attempted to answer the same scientific question where the second study had a larger sample size or more robust (e.g. randomized trial) study design. Some effects reported in the second study do not match the results exactly from the first. Important drawback: The title does not match the results. 16% of studies were contradicted (meaning an effect in a different direction), 16% reported a smaller effect size, 44% were replicated and 24% were unchallenged. So 44% + 24% + 16% = 84% were not contradicted. Lack of replication is also not proof of error.
  6. Paper: Modeling the effects of subjective and objective decision making in scientific peer review. Main idea: This paper considers a theoretical model for how referees of scientific papers may behave socially. They use simulations to point out how an effect called "herding" (basically peer-mimicking) may lead to biases in the review process. Important drawback: The model makes major simplifying assumptions about human behavior and supports these conclusions entirely with simulation. No data is presented.
  7. Paper: Repeatability of published microarray gene expression analyses. Main idea: This paper attempts to collect the data used in published papers and to repeat one randomly selected analysis from the paper. For many of the papers the data was either not available or available in a format that made it difficult/impossible to repeat the analysis performed in the original paper. The types of software used were also not clear. Important drawback: This paper was written about 18 data sets in 2005-2006. This is both early in the era of reproducibility and not comprehensive in any way. This says nothing about the rate of false discoveries in the medical literature but does speak to the reproducibility of genomics experiments 10 years ago.
  8. Paper: Investigating variation in replicability: The "Many Labs" replication project (not yet published). Main idea: The idea is to take a bunch of published high-profile results and try to get multiple labs to replicate the results. They successfully replicated 10 out of 13 results and the distribution of results you see is about what you'd expect (see embedded figure below). Important drawback: The paper isn't published yet and it only covers 13 experiments. That being said, this is by far the strongest, most comprehensive, and most reproducible analysis of replication among all the papers surveyed here.
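To see mechanically how the construction in paper 1 can make most "significant" findings false, here is a small simulation sketch. The prior probability, power, and significance cutoff below are arbitrary assumptions chosen for illustration, not numbers taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_hypotheses = 100_000
prior_true = 0.05   # assumed fraction of tested hypotheses that are actually true
power = 0.8         # assumed chance a real effect reaches significance
alpha = 0.05        # significance cutoff used as the publication screen

truly_true = rng.random(n_hypotheses) < prior_true
significant = np.where(truly_true,
                       rng.random(n_hypotheses) < power,   # true positives
                       rng.random(n_hypotheses) < alpha)   # false positives

# If only "significant" findings get published, what fraction of them are false?
false_fraction = np.mean(~truly_true[significant])
print(f"Fraction of significant findings that are false: {false_fraction:.2f}")
# With these assumptions roughly half of the published findings are false positives,
# even though every individual test was performed correctly.
```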
I do think that the reviewed papers are important contributions because they draw attention to real concerns about the modern scientific process. Namely
  • We need more statistical literacy
  • We need more computational literacy
  • We need to require code be published
  • We need mechanisms of peer review that deal with code
  • We need a culture that doesn't use reproducibility as a weapon
  • We need increased transparency in review and evaluation of papers
Some of these have simple fixes (more statistics courses, publishing code); some are much, much harder (changing publication/review culture).
The Many Labs project (Paper 8) points out that statistical research is proceeding in a fairly reasonable fashion. Some effects are overestimated in individual studies, some are underestimated, and some are just about right. Regardless, no single study should stand alone as the last word about an important scientific issue. It obviously won't be possible to replicate every study as intensely as those in the Many Labs project, but this is a reassuring piece of evidence that things aren't as bad as some paper titles and headlines may make it seem.

Many labs data. Blue x's are original effect sizes. Other dots are effect sizes from replication experiments (http://rolfzwaan.blogspot.com/2013/11/what-can-we-learn-from-many-labs.html)
The Many Labs results suggest that the hype about the failures of science is, at the very least, premature. I think an equally important idea is that science has pretty much always worked with some number of false positive and irreplicable studies. This was beautifully described by Jared Horvath in this blog post from the Economist.  I think the take home message is that regardless of the rate of false discoveries, the scientific process has led to amazing and life-altering discoveries.

FDA examining antibacterial soaps, body washes - CNN.com


(CNN) -- Manufacturers of antibacterial hand soap and body wash will be required to prove their products are more effective than plain soap and water in preventing illness and the spread of infection, under a proposed rule announced Monday by the Food and Drug Administration.
Those manufacturers also will be required to prove their products are safe for long-term use, the agency said.
 
"Millions of Americans use antibacterial hand soap and body wash products," the agency said in a statement. "Although consumers generally view these products as effective tools to help prevent the spread of germs, there is currently no evidence that they are any more effective at preventing illness than washing with plain soap and water.
 
"Further, some data suggest that long-term exposure to certain active ingredients used in antibacterial products -- for example, triclosan (liquid soaps) and triclocarban (bar soaps) -- could pose health risks, such as bacterial resistance or hormonal effects."
About 2,000 individual products contain these ingredients, health officials said.
 
"Our goal is, if a company is making a claim that something is antibacterial and in this case promoting the concept that consumers who use these products can prevent the spread of germs, then there ought to be data behind that," said Dr. Sandra Kweder, deputy director of the Office of New Drugs in FDA's Center for Drug Evaluation and Research.
 
"We think that companies ought to have data before they make these claims."
Studies in rats have shown a decrease in thyroid hormones with long-term exposure, she said. Collecting data from humans is "very difficult" because the studies look at a long time period.
 
 
Before the proposed rule is finalized, companies will need to provide data to support their claims, or -- if they do not -- the products will need to be reformulated or relabeled to remain on the market.
"This is a good first step toward getting unsafe triclosan off the market," said Mae Wu, an attorney for the Natural Resources Defense Council. "FDA is finally taking concerns about triclosan seriously. Washing your hands with soap containing triclosan doesn't make them cleaner than using regular soap and water and can carry potential health risks.
 
The FDA first proposed removing triclosan from certain products in 1978, the council said, "but because the agency took no final action, triclosan has been found in more and more soaps."
In 2010, the council said it sued FDA to force it to issue a final rule. The new proposed rule stems from a settlement in that suit, according to the NRDC.
 
The rule is available for public comment for 180 days, with a concurrent one-year period for companies to submit new data and information, followed by a 60-day period for rebuttal comments, according to the FDA.
 
The target deadline is June 2014 for the public comment period, then companies will have until December 2014 to submit data and studies. The FDA wants to finalize the rule and determine whether these products are "generally recognized as safe and effective" by September 2016.
"Antibacterial soaps and body washes are used widely and frequently by consumers in everyday home, work, school and public settings, where the risk of infection is relatively low," said Dr. Janet Woodcock, director of the FDA's Center for Drug Evaluation and Research.
 
"Due to consumers' extensive exposure to the ingredients in antibacterial soaps, we believe there should be a clearly demonstrated benefit from using antibacterial soap to balance any potential risk."
The action is part of FDA's ongoing review of antibacterial active ingredients, the agency said.
Hand sanitizers, wipes and antibacterial products used in health care settings are not affected.
Most hand sanitizers have 60% alcohol or ethanol and are generally recognized as safe when water isn't available, Kweder said. However, health officials still believe washing hands with soap and water is the best method.

Cassini reveals clues about Titan’s hydrocarbon lakes and seas | Science Recorder


P.J. O’Rourke - American Satirist, Journalist and Author | Point of Inquiry


The News About the Universe Isn't Good


Hitchens didn't think rejecting religion would solve everything. But he knew only reason would give us justice by Jeffrey Tayler @ Salon

The real New Atheism: Rejecting religion for a just world
Christopher Hitchens (Credit: AP/Chad Rachman)
 
David Strumfels -- You might prefer the original article at
 
The ever-polemical atheist author Christopher Hitchens died two years ago this month, yet his incisive, erudite diatribes against religion continue to rile the faithful and spark debate. The latest anti-Hitch outburst comes from Sean McElwee, a writer and researcher of public policy who describes himself as “a poorly practicing Christian who reads enough science to be functional at dinner parties.” McElwee calls for a “truce” between believers and nonbelievers. But he stands on the losing side of both public opinion trends and history. According to a Pew poll conducted in 2012, a record number of young Americans – a quarter of those between the ages of 18 and 29 — see themselves as unaffiliated with any religion. Atheists’ ranks are swelling, and believers are finding it increasingly difficult to justify their faith.

McElwee begins by calling the New Atheist movement “a rather disturbing trend” in a country “whose greatest reformer” – Martin Luther King, Jr. – “was a Reverend.” Dr. King won fame as a civil rights leader, not as a religious figure. McElwee would do well to recall the words of Founding Father John Adams: “the Government of the United States of America is not, in any sense, founded on the Christian religion.” McElwee goes on to attribute to New Atheists an unsound premise of his own concoction:
1.  The cause of all human suffering is irrationality
2.  Religion is irrational
3.  Religion is the cause of all human suffering
Hitchens’ most notorious atheistic tome is entitled “God Is Not Great: How Religion Poisons Everything.” But no serious reader could conclude from this book (or from the writings of the other New Atheists — Richard Dawkins, Sam Harris and Daniel Dennett — whom McElwee also hopes to debunk) that he considers religion the sole wellspring of humankind’s woes. Though he derided religion long before and after he published “God Is Not Great,” Hitchens never said any such thing, and no reasonable person would believe it. Are cancer and flesh-eating bacteria manifestations of irrationality? What about wars over territory or natural resources? Poverty and inequality?
Bullying and bulimia? The “classical logical error of post hoc ergo propter hoc” McElwee ascribes to New Atheists simply does not exist.

McElwee then jumps to Hitchens’ (misbegotten) support of the second Iraq war and attempts to press it into service to discredit him in matters of faith. Hitchens, as McElwee correctly notes, opposed the 1991 invasion of Iraq, but when George W. Bush was in office, according to McElwee, Hitchens “decided that, in fact, bombing children was no longer so abhorrent” because the 2003-2011 conflict was to be a “final Armageddon between the forces of rationality and the forces of religion.” No, this was not how Hitchens viewed the second Iraq war. He advocated invading Iraq to overthrow Saddam, who was, he contended, guilty of crimes against humanity, and he (mistakenly) assumed a stable democracy would result from the dictator’s ouster.

Hitchens understood the secular nature of Saddam’s Ba’ath Party, which made all the more puzzling and problematic his stubborn insistence that Saddam was colluding with Al Qaeda. But McElwee then asserts that “the force of rationality and civilization was led by a cabal of religious extremists” – in the Bush administration — which “was of no concern for Hitchens.” George W. Bush was a convert to Evangelical Christianity, which does not necessarily make him a “religious extremist,” and the (mixed) faiths of the Iraq War’s other architects (Dick Cheney, Condoleezza Rice, Paul Wolfowitz, Douglas Feith, et al.) did not fuel their zeal for deposing Saddam.

McElwee proceeds to mischaracterize Hitchens’ post-9/11 worldview as a “war between the good Christian West and the evil Muslim Middle East.” How McElwee can expect us to believe this of Hitchens, who authored a book (“The Missionary Position”) denouncing Mother Teresa as a fraud and relentlessly attacked Christianity, baffles me, as does McElwee’s blindness to his own blunder. Is Hitchens now, according to him, pro-Christian?

McElwee also falsely attributes obscurantist motives to New Atheists. “Might it be better to see jihad as a response to Western colonialism and the upending of Islamic society, rather than the product of religious extremism? The goal of the ‘New Atheists’ is to eliminate centuries of history that Europeans are happy to erase, and render the current conflict as one of reason versus faith rather than what is, exploiter and exploited.”

Stripping jihad of its religious grounds invites nothing but confusion. Jihad in Arabic means “struggle,” but, with respect to Islam, denotes “a struggle in the name of faith,” which includes holy war against infidels waged as a matter of religious duty.  Such jihad is, ipso facto, religious. Informed readers also know that jihadists, in their addresses to the Muslim umma, rail against Western occupation of Islamic lands, “infidel” Western-backed dictators in Muslim countries, and so on — all the while citing passages from the Quran. Hitchens and Dawkins, both Europe-born and versed in their continent’s past – a past replete with religious and political conflicts of all kinds — have never sought to “erase” its history or present “the current conflict” as solely one of “reason versus faith.”
McElwee then tendentiously defines religion so as to paper over its often decisive role in precipitating conflicts. Though he allows that it might “motivate acts of social justice and injustice,” “[r]eligion is both a personal search for truth as well as a communal attempt to discern where we fit in the order of things.” Religion first and foremost consists of unsubstantiated, dogmatically advanced explanations for the cosmos and our place in it, with resulting universally applicable rules of conduct. A good many of these rules – especially those regarding women’s behavior and their (subservient) status vis-à-vis men, and prescriptions for less-than-merciful treatment of gays – are repugnant, retrograde, and arbitrary, based on “sacred texts” espousing “revealed truths” dating back to what the British atheist philosopher Bertrand Russell justly called the “savage ages.” (Islam by no means has a monopoly on such rules – check Leviticus for its catalogue of “crimes”: working on the Sabbath, cursing one’s parents, being the victim of rape – that merit the death penalty.) Just how such “holy” compendia of ahistorical, often macabre fables are supposed to help anyone in a “personal search for truth” mystifies me.

Lacking any alternative, McElwee then tells nonbelievers to lay off the faithful: “any critique of religion that can be made from the outside (by atheists) can be made more persuasively from within religion.” The last time I checked, those “within religion” who denounce religion as untrue, unfounded on fact, irrational by its nature and preying upon our fears, would in fact be atheists. The problem is not, as McElwee says, “the Church’s excesses” – but the Church itself, its backward rules, its reactionary ethos, its groundless assertion of moral authority. The latter is laughable, especially regarding the Catholic Church, in view of the catalogue of crimes – including the persecution of Jews, the Crusades, the Inquisition and silence with respect to Hitler’s Final Solution — for which it bears self-admitted guilt. If one breaks free of the racket of faith, then faith-sanctioned strictures, fantastic tales (human parthenogenesis among them) demanding faith to be believed, to say nothing of the justness of tax exemptions for faith organizations, all appear as entirely human creations that are questionable at best, criminal at worst, and certainly deserving of no kid-glove treatment.

“The impulse to destroy religion will ultimately fail,” McElwee claims. Just what he means by this is unclear. Hitchens spoke out tirelessly against religion but never believed it could be eradicated; rather, he likened it to Camus’ plague-infected rats, scurrying about in humanity’s sewer, ever awaiting a chance to reemerge. Hitchens certainly never foresaw the bizarre scenario McElwee outlines: “Banish Christ and Muhammad and you may end up with religions surrounding the works of Zizek and Sloterdijk (there is already a Journal of Zizek Studies, maybe soon a seminary?).
Humans will always try to find meaning and purpose in their lives, and science will never be able to tell them what it is.” New Atheists have never assigned science such a role. Hitchens himself often recommended the consolations of literature for this purpose. The broader point rationalists make is simple: People, having set aside fairy tales and mandated moral certainties delivered from on high, must seek meaning on their own, seek to order society in ways beneficial for all, and do so with reason as lodestar.

So what is to be done? McElwee trots out the idea of a truce – “one originally proposed by the Catholic church and promoted by the eminent Stephen Jay Gould,” that “Science, the study of the natural world, and religion, the inquiry into the meaning of life (or metaphysics, more broadly) constitute non-overlapping magisteria.” One straightaway must regard as suspect a “truce” advocated by an organization guilty of repressing scientists and opposing the scientific Weltanschauung. And one would be right to be suspicious, according to McElwee’s proposition: “Neither [science nor religion] can invalidate the theories of the other, if such theories are properly within their realm.” Just what the boundaries of those realms are and who decides them have been matters of contention since time immemorial. Just ask Galileo.

McElwee next concludes that “religion (either secular or theological) does not poison all of society and science should not be feared, but rather embraced.”

No one is waiting for McElwee’s green light to “embrace” science, which holds its place among us by virtue of its proven utility, its lab-tested veracity. At this point, McElwee’s second citation of Martin Luther King cannot avail him. “Science deals mainly with facts; religion deals mainly with values. The two are not rivals.” Dr. King’s saying this does not make it so. Faith and reason are fighting for supremacy the world over, and rationalists must make their case with ardor, shying away from no battle. Atheists who wobble in defense of nonbelief would do well to recall 9/11, Baruch Goldstein’s Hebron massacre of Palestinians, the Salem witch trials and violence meted out in the name of religion to “unchaste” women throughout the ages. This is, of course, an incomplete list of atrocities motivated by religion.

The sooner we accord priests, rabbis and imams the same respect we owe fabulists and self-help gurus, the faster we will progress toward a more just, more humane future. Enlightenment must be our goal, and that was what Hitchens advocated above all.
 
                
Jeffrey Tayler is a contributing editor at The Atlantic. His seventh book, "Topless Jihadis -- Inside Femen, the World's Most Provocative Activist Group," will be published on December 20 as an Atlantic ebook. Follow @JeffreyTayler1 on Twitter.

Sunday, December 15, 2013

Standing Up for Sex by Henry Gee

Humans evolved the ability to walk on two legs because it allowed them to more accurately size up prospective mates. Or did they?
 
By Henry Gee | December 1, 2013
UNIVERSITY OF CHICAGO PRESS, OCTOBER 2013
It happened years ago, but the event was so traumatic that I remember it as if it were yesterday: an elderly professor physically pinned me against a wall and berated me for rejecting his paper on why human ancestors got up on their hind legs and walked. “The reason,” frothed the empurpled sage, “was to make it easier for mothers to carry babies close to their chests.” See? So blindingly obvious that anyone, even I, could understand it.

Manuscripts seeking to explain the evolutionary roots of human bipedalism land in my in-box at Nature with monotonous regularity. We became bipeds so that we could carry food, or tools; so we could see farther; so we wouldn’t expose so much of our skin to the Sun; so we could wade better in rivers and lakes. They all make good stories, but they all share the same error—they are explanations after the fact, and, as such, betray a fundamental misunderstanding of how evolution works.

Natural selection, the mechanism of evolution, operates without memory or foresight. It has no intention. It is we who choose to interpret evolutionary purposes as such later on. The features of living things, therefore, do not evolve for any preconceived purpose that we can discern. I explain how such misunderstandings color our understanding of human evolution in my latest book, The Accidental Species: Misunderstandings of Human Evolution.

But none of this stops me having my own go at understanding why humans came to walk on two legs. In my view, it all happened by accident.

Bipedalism is just one of the many peculiarities of human anatomy and behavior that set us apart from our closest relatives, the great apes. We are also much more social than they are, we have unusually large brains, we have much more body fat, and we are much less hairy. The differential distribution of fat and hair happens to be strongly correlated with sexual dimorphism.

On the subject of sex, women’s furless breasts are prominent at all times, not just when women are lactating. Unlike female chimpanzees, our closest living primate relatives, women do not advertise estrus—the time of maximum fertility—by the swelling of the sexual organs. And while we’re talking about advertisement, men have the largest penises, relative to body mass, of any ape. A male gorilla might weigh twice as much as an adult human, but he’s lucky if he ever gets an erection more than an inch long.

But if humans’ prominent breasts and big penises are made obvious by hairlessness, they are made more so by bipedalism, which displays everything for all to see. In which case, standing upright could be a by-product of sexual selection, in which mates choose one another on the basis of features that might represent outward signs of inward genetic health.

Some sexually selected features, though, appear to have been chosen at random when, by chance, a trait in one sex becomes associated with the preference for that trait in the other, leading to runaway positive feedback, survival value be damned. The massive train of the peacock is a good example. It looks flashy and attracts mates, but costs a great deal of energy to make and maintain, and hobbles a peacock trying to flee from predators. The transition to bipedalism might be seen in the same way: it was selected because it better advertised our sexual wares, but hobbled us in other ways.

The imposition of walking upright on a fundamentally quadrupedal design has prompted a thorough reworking of the entire human body, making back pain one of the single biggest causes of worker absenteeism in the world. Rather than an adaptation, bipedalism could be a dreadful kludge, forced on us by sexual selection in defiance of gravity and common sense.

Now, I advance the above more than half in jest. It’s possibly no better or worse than any other idea, but I’m not going to pin anyone against a wall and shout about it.

Henry Gee is a senior editor at Nature, and the author of Jacob’s Ladder: The History of the Human Genome, In Search of Deep Time, and The Science of Middle-earth. Read an excerpt of The Accidental Species.

Splitting water into hydrogen and oxygen using light, nanoparticles


Pseudoscience and psychopathy

hunting pseudoscience in the internet jungle

Guest post on Skeptical Raptor by Matthew Facciani
There has been a news story creating some buzz lately regarding recent claims made by neuroscientist James Fallon, a Professor of Psychiatry at the University of California-Irvine School of Medicine. Dr. Fallon studied the brains of psychopaths for a few years and later saw that his brain was just like those of the psychopaths he studied. Many news outlets are picking up on this story as Dr. Fallon has just released a new book about it as well.

To summarize, Dr. Fallon had received a PET scan of his brain in conjunction with an Alzheimer’s disease study, and subsequently noticed that his PET scan image was eerily similar to PET scan images from the psychopaths he researched. These articles then reported how both the psychopaths and Dr. Fallon had less activity in the frontal and temporal lobes, which he claimed are linked to empathy, self-control, and morality. Beyond the PET scan results, Dr. Fallon mentions how he and his family have the “Warrior Gene,” which is associated with aggressive behavior.

Additionally, he admitted to some history of violence in his family. Despite all of these observations, Dr. Fallon claimed that he has led a normal life without violence. He argued that despite having genes which can promote aggression, a psychopathic brain, and a history of family violence, he did not turn into a psychopath because he did not have a traumatic childhood which could trigger psychopathic tendencies. Of course, his book will probably be a best seller as it brings up interesting questions and discussions about free will and criminal behavior.

As a neuroscientist myself, I was curious about the specifics of Dr. Fallon’s brain imaging research. I searched through several news articles, listened to his NPR interview, and watched a talk he gave on the subject, and was always left puzzled over the lack of details. I searched for ANY details regarding the PET scans Dr. Fallon mentioned, but every article simply had a pretty brain picture with little information. I was curious as to how a single brain scan from an unrelated study could predict psychopathic behavior.

Why didn’t anyone mention the details or link to an article that did? Specifically, I was confused about why Dr. Fallon was comparing his brain scan from an Alzheimer’s study to an unrelated study about psychopaths. Did they do the same task in each study? PET scans measure real-time brain activity, so these brain activations seen in the pretty pictures reflect activity from some task. Despite this empirical data being crucial to make any sort of scientific inference, no article mentioned what the individuals being scanned were actually doing during the PET scan. Dr. Fallon argues that his own brain activity in the regions of the frontal and temporal cortex is lacking, similar to psychopaths, but fails to mention anything more specific in these interviews.

Morality, like any other high-level cognition, is terribly difficult to study in the brain, and there are a significant number of scientific articles trying to tease apart morality’s functional neuroanatomy. Higher-level cognitive processes are often derived from a complicated network of neural activation, which requires careful experimental design to tease apart. A crucial issue with Dr. Fallon’s story is that we can’t even critique such an experimental design because he wasn’t doing any sort of morality study! So to say that less frontal and temporal activity equals less morality is a gross oversimplification to begin with, and there aren’t even any details to support such a claim.

Furthermore, even if Dr. Fallon’s scan were identical to a group of psychopathic brains, it would only prove association, not causation. There could be many factors which create differences in neural activity, and a third variable (exposure to violence, for example) could be the cause. Finally, neuroimaging studies are often based on the results of group analysis. Rarely is a single brain scan discussed in the results. Thus, comparing a single brain scan from one study to an aggregate of brain scans from an entirely different study isn’t just wrong, it’s unethical.
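A quick way to see why one scan compared against a group average tells you so little is a toy simulation. All of the numbers below are invented, and real PET analyses involve far more than a single summary score; this is only meant to show how overlapping group distributions defeat single-subject classification.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented "frontal activity" scores for two groups (arbitrary units).
controls = rng.normal(loc=1.0, scale=0.3, size=30)
psychopaths = rng.normal(loc=0.7, scale=0.3, size=30)  # lower mean, large overlap

single_scan = 0.72   # one individual's value, close to the psychopath group mean

for name, group in [("controls", controls), ("psychopaths", psychopaths)]:
    z = (single_scan - group.mean()) / group.std()
    print(f"{name}: group mean = {group.mean():.2f}, z of single scan = {z:+.2f}")
# The single value sits comfortably inside both distributions, so by itself it
# cannot assign an individual to either group.
```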

This is a classic example of poor scientific journalism, and I believe it became so popular due to widespread deficits in scientific literacy. You don’t have to be a neuroscientist to see that there are huge problems with his story. You simply have to view this story objectively and have a healthy dose of skepticism, without quickly deferring to the authority figure. There are simple questions which are never addressed here. What experiment was being done during each PET scan? If the psychopaths and Dr. Fallon were both completing a morality task and they both had low activity in certain regions, THEN that would be something more tangible. This is simply showing a brain picture and not asking questions. We know that people are much more likely to believe something if there is a brain picture associated with it, and this is further proof.

My intention is not to claim that Dr. Fallon is lying and purposefully simplifying science to make a profit. I would need much more evidence for that. However, I am arguing that the news articles covering his story do not provide enough details to support his claims. I find it rather troubling that no one is even addressing this so I wanted to blog about it. It is also troubling that Dr. Fallon has not been more explicit about the limitations of his findings as he should surely be aware of them as an accomplished neuroscientist. America often ranks pretty poorly in scientific literacy and this is an example of the result. People should at least have a working understanding of the scientific method and not blindly believe an authority figure with an interesting story.

Matthew Facciani is a 3rd year Ph.D. candidate focused on cognitive neuroscience at a major Southern US research university. If you have questions for Mr. Facciani and his critiques, please drop a comment. 

Higgs Boson Gets Nobel Prize, But Physicists Still Don’t Know What It Means

By Adam Mann.  "Adam is a Wired Science staff writer. He lives in Oakland, Ca near a lake and enjoys space, physics, and other sciency things."

More than a year ago, scientists found the Higgs boson. This morning, two physicists who 50 years ago theorized the existence of this particle, which is responsible for conferring mass to all other known particles in the universe, got the Nobel, the highest prize in science.

For all the excitement the award has already generated, finding the Higgs — arguably the most important discovery in more than a generation — has left physicists without a clear roadmap of where to go next. While popular articles often describe how the Higgs might help theorists investigating the weird worlds of string theory, multiple universes, or supersymmetry, the truth is that evidence for these ideas is scant to nonexistent.

No one is sure which of these models, if any, will eventually describe reality. The current picture of the universe, the Standard Model, is supposed to account for all known particles and their interactions. But scientists know that it’s incomplete. Its problems need fixing, and researchers could use some help figuring out how. Some of them look at the data and say that we need to throw out speculative ideas such as supersymmetry and the multiverse, models that look elegant mathematically but are unprovable from an experimental perspective. Others look at the exact same data and come to the opposite conclusion.

“Physics is at a crossroads,” said cosmologist Neil Turok, speaking to a class of young scientists in September at the Perimeter Institute, which he directs. “In a sense we’ve entered a very deep crisis.”

The word “crisis” is a charged one within the physics community, invoking eras such as the early 20th century, when new observations were overturning long-held beliefs about how the universe works. Eventually, a group of young researchers showed that quantum mechanics was the best way to describe reality. Now, as then, many troubling observations leave physicists scratching their heads. Chief among them is the “Hierarchy Problem,” which in its simplest form asks why gravity is approximately 10 quadrillion times weaker than the three other fundamental forces in the universe. Another issue is the existence of dark matter, the unseen, mysterious mass thought to be responsible for strange observations in the rotation of galaxies.
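For readers wondering where the "10 quadrillion" comes from, the usual statement of the hierarchy problem compares the electroweak scale to the Planck scale. A rough back-of-the-envelope sketch (the exact ratio depends on which masses you choose to compare):

```python
# Rough arithmetic behind the "10 quadrillion" figure mentioned above.
planck_scale_gev = 1.22e19      # energy scale where gravity becomes strong (GeV)
electroweak_scale_gev = 246.0   # Higgs-field (electroweak) scale (GeV)

ratio = planck_scale_gev / electroweak_scale_gev
print(f"Planck scale / electroweak scale ≈ {ratio:.1e}")  # ~5e16, i.e. tens of quadrillions
```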

The solution to both these problems might come from the discovery of new particles beyond the Higgs. One theory, supersymmetry, goes beyond the Standard Model to say that every subatomic particle — quarks, electrons, neutrinos, and so on — also has a heavier twin. Some of these new particles might have the right characteristics to account for the influence of dark matter. Engineers built the Large Hadron Collider to see if such new particles exist (and may yet see them once it reaches higher energy in 2014), but so far it hasn’t turned up anything other than the Higgs.

In fact, the Higgs itself has turned out to be part of the issue. The particle was the final piece in the Standard Model puzzle. When scientists discovered it at the LHC, it had a mass of 125 GeV, about 125 times heavier than a proton — exactly what standard physics expected. That was kind of a buzzkill. Though happy to know the Higgs was there, many scientists had hoped it would turn out to be strange, to defy their predictions in some way and give a hint as to which models beyond the Standard Model were correct. Instead, it’s ordinary, perhaps even boring.

All this means that confidence in supersymmetry is dropping like a stone, according to Tommaso Dorigo, a particle physicist at the LHC. In one blog post, he shared a rather pornographic plot showing how the findings of the LHC eliminated part of the evidence for supersymmetry. Later, he wrote that many physicists would have previously bet their reproductive organs on the idea that supersymmetric particles would appear at the LHC. That the accelerator’s experiments have failed to find anything yet “has significantly cooled everybody down,” he wrote.

In fact, when the organizers of a Higgs workshop in Madrid last month asked physicists there if they thought the LHC would eventually find new physics other than the Higgs boson, 41 percent said no. As to how to solve the known problems of the Standard Model, respondents were all over the map. String theory fared the worst, with three-quarters of those polled saying they did not think it is the ultimate answer to a unified physics.

One possibility has been brought up that even physicists don’t like to think about. Maybe the universe is even stranger than they think. Like, so strange that even post-Standard Model models can’t account for it. Some physicists are starting to question whether or not our universe is natural. This cuts to the heart of why our reality has the features that it does: that is, full of quarks and electricity and a particular speed of light.

This problem, the naturalness or unnaturalness of our universe, can be likened to a weird thought experiment. Suppose you walk into a room and find a pencil balanced perfectly vertical on its sharp tip. That would be a fairly unnatural state for the pencil to be in because any small deviation would have caused it to fall down. This is how physicists have found the universe: a bunch of rather well-tuned fundamental constants have been discovered that produce the reality that we see.

A natural explanation would show why the pencil is standing on its end. Perhaps there is a very thin string holding the pencil to the ceiling that you never noticed until you got up close. Supersymmetry is a natural explanation in this regard – it explains the structure of the universe through as-yet-unseen particles.

But suppose that infinite rooms exist with infinite numbers of pencils. While most of the rooms would have pencils that have fallen over, it is almost certain that in at least one room, the pencil would be perfectly balanced. This is the idea behind the multiverse. Our universe is but one of many and it happens to be the one where the laws of physics happen to be in the right state to make stars burn hydrogen, planets form round spheres, and creatures like us evolve on their surface.

The multiverse idea has two strikes against it, though. First, physicists would refer to it as an unnatural explanation because it simply happened by chance. And second, no real evidence for it exists and we have no experiment that could currently test for it.

As of yet, physicists are still in the dark. We can see vague outlines ahead of us but no one knows what form they will take when we reach them. Finding the Higgs has provided the tiniest bit of light. But until more data appears, it won’t be enough.

I Had to Repost Jerry Coyne's Blog (with apologies if I'm not supposed to)

The good and bad of humanity

It is a truism of both religion and biology that humans are simultaneously selfish and altruistic.  The faithful say the selfishness comes from original sin and the goodness from God, while the biologist imputes our selfishness to evolution (for how better can you ensure propagation of your genes than by taking care of yourself and your kin first?); and, as for altruism, cooperation and kindness, they’re probably partly derived from adaptive reciprocal altruism evolved when we lived in small social groups, and partly from  a cultural overlay of expanded cooperation derived from reason (we now see that we don’t occupy any privileged position relative to others in society).
Regardless, I saw both traits demonstrated this week.  Last Saturday afternoon I parked my car in front of my building at work; I usually use it on the weekends and then leave it at work in case I need to use it during the week.  On Wednesday I looked out the window of my lab (I can overlook the car, which is nice) to see a huge dent in the front fender on the driver’s side. Going down to investigate, I saw that it was indeed a large, fresh dent, but I also found a note stuck in my door handle.
The note said this (I’ve redacted names and phone numbers):
“Hi,
I saw the guy hit your left front fender in the snow. It was a [model and make of car redacted], with the Illinois plate [license plate number redacted].   Best of luck.
—name redacted
[phone number of person who wrote note redacted]. That’s all I saw, but feel free to call if you want.”
So while I was enormously peeved that someone had dinged me and run off, I was touched that a passerby took the time to take down the license number and description of the car and leave it for me, along with his phone number.
I called the number, which turned out to belong to a medical student here at the University. He reported that he saw the guy hit my car while backing out in the snow, and then get out of his car and inspect the damage to both his SUV and mine. At that time the student told him, “You know, you should leave a note for the owner.” The dinger said, “Yeah, I guess I should,” but the student suspected he wouldn’t.  So he took out a pen and wrote all the information down on a piece of paper, which he later left on my car when he returned and found no note from the malefactor.
I reported it to my insurance company and the University police, who ran the plates of the car that hit me and identified the owner. They also filed a formal report with the state of Illinois (I guess hit and run, even if it doesn’t hurt someone, violates some law or other).  My insurance company will fix the damage for nearly free (I have to pay a small deductible). I don’t know what will happen to the miscreant who hit me and ran: probably nothing, except that my insurance company will force his to pony up for the damage to my car.
This is about the fourth time this has happened to me in my life, and only once has someone left a note—a woman visiting from California, and the damage was so minor that I didn’t do anything about it. But it’s a truly vile act to damage someone’s property and then abscond without taking responsibility.  They do it because, of course, they think they can get away with it.  But this guy didn’t, thanks to a kind and observant student.
It’s a slow news day, so I’m reporting this, but it does show what we all know: some people are jerks and others go out of their way to be helpful. The next time you’re on the bus and an old person gets on, don’t be one of those who keep their seats or pretend not to notice. Stand up and let the older person sit down.
If you’ve had experiences with really nice strangers, report them below (car-bashing jerks or others can also be reported).

Saturday, December 14, 2013

Scientist: Eruption Of Yellowstone Super Volcano Would Be 2,000 Times The Size Of Mount St. Helens « CBS Las Vegas


Scientists discover secret code hidden within human DNA

What follows is from the RT web site (https://www.facebook.com/RTnews).  Word of warning:  I could find no scientific or other references on the RT site about this, and so can't vouch for its scientific accuracy.
This undated handout illustration shows the DNA double helix (AFP Photo)
Scientists have discovered a secret second code hiding within DNA which instructs cells on how genes are controlled. The amazing discovery is expected to open new doors to the diagnosis and treatment of diseases, according to a new study.

Ever since the genetic code was deciphered over 40 years ago, scientists have believed that it only described how proteins are made. However, the revelation made by the research team led by John Stamatoyannopoulos of the University of Washington indicates that genomes use the genetic code to write two separate languages.

“For over 40 years we have assumed that DNA changes affecting the genetic code solely impact how proteins are made,” said Stamatoyannopoulos, according to the press release. “Now we know that this basic assumption about reading the human genome missed half of the picture.”
Scientists discovered that the second language instructs the cells on how genes are controlled, according to findings published in Science magazine on Friday. The study is part of the Encyclopedia of DNA Elements Project, also known as ENCODE.
DNA (Deoxyribonucleic acid) is a nucleic acid that is the main constituent of the chromosomes of all organisms, except some viruses. DNA is self-replicating, plays a central role in protein synthesis, and is responsible for the transmission of hereditary characteristics from parents to offspring.

The second language remained hidden for so long because one language is written on top of the other, scientists said.

Scientists already knew that the genetic code uses a 64-letter alphabet called codons. The research team discovered that some of the codons can have two meanings – one related to proteins, the other to gene control. Those codons were given the name ‘duons.’
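To make the numbers concrete, here is a minimal Python sketch (my own illustration, not code from the study). It enumerates the 64 possible three-base codons and then shows, with a hypothetical entry, what it would mean for one codon to carry both a protein meaning and a regulatory meaning.

from itertools import product

BASES = "ACGT"

# All possible three-base codons: 4 x 4 x 4 = 64, the "64-letter alphabet" in the article.
codons = ["".join(triplet) for triplet in product(BASES, repeat=3)]
print(len(codons))  # prints 64

# Hypothetical "duon" entry (illustrative only; the real protein and regulatory
# assignments come from the genetic code and the ENCODE data, not from this toy table).
duon_example = {
    "CTG": {
        "protein_meaning": "specifies the amino acid leucine",
        "regulatory_meaning": "may also sit inside a transcription-factor binding site",
    }
}
for codon, meanings in duon_example.items():
    print(codon, "->", meanings["protein_meaning"], "/", meanings["regulatory_meaning"])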

And it’s those duons that are expected to change the way physicians interpret human genomes, and give clues for the treatment of diseases.

“The fact that the genetic code can simultaneously write two kinds of information means that many DNA changes that appear to alter protein sequences may actually cause disease by disrupting gene control programs, or even both mechanisms simultaneously,” said Stamatoyannopoulos.
Speaking about the discovery, Stamatoyannopoulos said that the “new findings highlight that DNA is an incredibly powerful information storage device, which nature has fully exploited in unexpected ways.”

Could the universe collapse TODAY? Physicists claim that risk is ‘more likely than ever and may have already started’


  • Collapse could be down to a subatomic particle known as the Higgs boson
  • Higgs boson is evidence for an energy field that pervades the universe
  • Shift in field will cause particles in it to become billions of times heavier
  • The new weight will squeeze all material into a small, super-hot and heavy ball, and the universe as we know it will cease to exist
By Ellie Zolfagharifard


The universe could be about to collapse and everything in it - including us - will be compressed into a small, hard ball.
The process may already have started somewhere in our cosmos and is eating away at the rest of the universe, according to theoretical physicists.
The mind-bending concept has been around for a while, but now researchers in Denmark claim they have proven it is possible with mathematical equations.
Scientists believe sooner or later a radical shift in the forces of the universe will cause every particle in it to become extremely heavy. The new weight will squeeze all material into a small, super-hot and heavy ball, and the universe as we know it will cease to exist

The basis of the theory is that sooner or later a radical shift in the forces of the universe will cause every particle in it to become extremely heavy.
Everything – every grain of sand, every planet and every galaxy – will become billions of times heavier than it is now.
 
The theory suggests that the new weight will squeeze all material into a small, super-hot and heavy ball, and the universe as we know it will cease to exist.
This violent process is called a ‘phase transition’ and is similar to what happens when, for example, water turns to steam or a magnet heats up and loses its power.

WHAT WOULD CAUSE OUR UNIVERSE TO COLLAPSE?

The collapse of the universe could all be down to a subatomic particle discovered last year known as the Higgs boson.
The Higgs boson particle is a manifestation of an energy field that can be found throughout the universe, called the Higgs field.
The Higgs field is thought to explain why particles have mass.
This Higgs field could exist in two states - one that we feel now - and another that is billions of times denser than what scientists have already observed.
If this ultra-dense Higgs field existed, then a bubble of this state could suddenly appear in a certain place of the universe at a certain time, similar to when you boil water.
The bubble would then expand at the speed of light, entering all space, and turning the Higgs field from the state it’s in now into a new one.
All elementary particles inside the bubble will reach a mass that is much heavier than if they were outside the bubble.
The new weight will squeeze all material into a small, super-hot and heavy ball, and the universe as we know it will cease to exist.
According to something known as the Higgs theory, a phase transition such as this took place one tenth of a billionth of a second after the Big Bang, causing a shift in the fabric of space-time.
During this transition, empty space became filled with an invisible substance that we now call the Higgs field.
Some elementary particles interact with this field, gaining energy in the process, and this intrinsic energy is known as the mass of a particle.
By using mathematical equations, researchers at the University of Southern Denmark have discovered that the Higgs field could exist in two states, just like matter can exist as a liquid or a solid.
In the second state, the Higgs field is billions of times denser than what scientists have already observed.
If this ultra-dense Higgs field exists, then a 'bubble' of this state could suddenly appear in a certain place of the universe at any time, similar to when you boil water.
The bubble would then expand at the speed of light, entering all space, and turning the Higgs field from the state it is in now into a new one.
All elementary particles inside the bubble will reach a mass much heavier than if they were outside the bubble, and they would be pulled together to form supermassive centres.
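As a rough way to picture the two states, here is a toy numerical sketch in Python (my own illustration, not the Danish group's calculation): a field potential with two minima, where the shallower minimum stands in for the state we live in and the deeper one for the far denser state the article describes.

import numpy as np

def toy_potential(phi, a=1.0, b=0.4):
    # A quartic with a cubic tilt, so the two minima sit at different depths.
    return a * phi**4 - b * phi**3 - phi**2

phi = np.linspace(-1.5, 2.0, 4001)
V = toy_potential(phi)

# Locate the local minima on the grid.
minima = [phi[i] for i in range(1, len(phi) - 1) if V[i] < V[i - 1] and V[i] < V[i + 1]]
for m in minima:
    print(f"local minimum at phi = {m:+.2f}, V = {toy_potential(m):.3f}")

# The deeper minimum plays the role of the denser Higgs state: if the field can
# tunnel into it, a bubble of the new state nucleates and expands, as described above.

In the real calculation the shape of the potential is fixed by the measured Higgs and top-quark masses; in this toy version the constants a and b are arbitrary.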
‘Many theories and calculations predict such a phase transition – but there have been some uncertainties in the previous calculations,’ said Jens Krog, PhD student at the University of Southern Denmark.
‘Now we have performed more precise calculations, and we see two things: Yes, the universe will probably collapse, and: A collapse is even more likely than the old calculations predicted.’
The collapse of the universe could all be down to a subatomic particle discovered last year known as the Higgs boson. British physicist Peter Higgs (right) and Belgian physicist Francois Englert (left) received the Nobel prize this year for work on the theory of this particle

‘The phase transition will start somewhere in the universe and spread from there. Maybe the collapse has already started somewhere in the universe and right now it is eating its way into the rest of the universe.
‘Maybe a collapse is starting right now, right here. Or maybe it will start far away from here in a billion years. We do not know.’
The researchers looked at three main equations that underlie the prediction of a phase transition and showed how these equations can be worked out together and interact with each other.
Although the new calculations predict that a collapse is now more likely than ever before, it is also possible that it will not happen at all.
It is a prerequisite for the phase change that the universe consists of the elementary particles that we know today, including the Higgs particle.
If the universe contains undiscovered particles, the whole basis for the prediction of phase change would prove false.


Read more: http://www.dailymail.co.uk/sciencetech/article-2523177/Could-universe-collapse-TODAY-Physicists-claim-risk-likely-started.html#ixzz2nSa5tfV3

Friday, December 13, 2013

Is God Dying? By Michael Shermer, from SciAm

Is God Dying?
The decline of religion and the rise of the “nones”

Cryogenics

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Cryogenics...