
Friday, January 17, 2014

Outwitting the Perfect Pathogen | The Scientist Magazine®

Tuberculosis is exquisitely adapted to the human body. Researchers need a new game plan for beating it. | January 1, 2014

WORLDWIDE PATHOGEN: About one-third of the human population is infected with Mycobacterium tuberculosis (cultures shown above); some 13 million people are actively sick with TB. CDC/GEORGE KUBICA

In 2009, an international consortium of researchers initiated an efficacy trial for a new tuberculosis (TB) vaccine—the first in more than 80 years. With high hopes, a team led by the South African Tuberculosis Vaccine Initiative inoculated 2,797 infants in the country, half with a vaccine called MVA85A and half with a placebo. They followed the children for up to three years and finally announced the result last February. It was not good news (Lancet, 381:1021-28, 2013).
“It did not work,” says Thomas Evans, president and CEO of Aeras, the Rockville, Maryland-based nonprofit that sponsored the trial. The vaccine did not protect children against the deadly disease.

“The whole field was disappointed,” says Robert Ryall, TB vaccine project leader at Sanofi Pasteur, who was not involved in the trial. “And unfortunately the field did not learn much.” The vaccine developers still do not know why MVA85A didn’t work.

The only vaccine currently available in the fight against TB is Bacille Calmette-Guérin (BCG), a live vaccine first used in 1921 and originally derived from a cow tuberculosis strain. Though the exact mechanism of the vaccine’s protection remains unclear, researchers do know that it doesn’t work well: it reduces the risk of a form of TB that is especially lethal to infants, but it does not reliably protect against TB lung infections, which kill more than a million adults worldwide each year.

With every cough or sneeze of an infected individual, TB bacilli fly through the air, and to date they have spread to one-third of the world’s population. In most individuals, Mycobacterium tuberculosis (Mtb) lies dormant, never causing sickness. In others, however, the bacteria cause life-threatening lung infections. Some 13 million people around the world are actively sick with TB, and someone dies of the disease approximately every 20 seconds, according to the World Health Organization (WHO).

“The need for a TB vaccine is enormous,” says David Sherman, a tuberculosis expert at the nonprofit Seattle Biomedical Research Institute. And an inadequate vaccine is not the field’s only problem: the four main drugs currently used to treat tuberculosis are also decades old, take six months to rid the body of the bacilli, and are becoming obsolete due to the spread of multidrug-resistant and extensively drug-resistant TB. Despite the gloomy outlook, many researchers are still plugging away, through pharmaceutical-nonprofit partnerships and redesigned basic research efforts, to achieve a happy ending.

Ancient foe

Tuberculosis has plagued humans for thousands of years. Even ancient Egyptians were ravaged by TB, as evidence from mummies has shown. And over those millennia, Mtb has learned to quietly, carefully live within the human body.

“It’s not just a pathogen; in some ways it’s commensal,” says Evans. “It’s been dealing with the human immune system for a long period of time and knows how to go latent and keep itself transmitted.” Of the roughly 2 billion people infected with Mtb, about 90 percent will never get sick, though they are a vast reservoir of the bacteria, fueling the epidemic. And when illness occurs, unlike many infections that involve an acute sickness as the host’s immune system battles the pathogen, tuberculosis infection resembles a chronic disease. “Everything about the infection is slowed down, frankly, in ways we don’t understand,” says Sherman.

E. coli, for example, replicates so quickly—about once every 20 minutes—that one cell can grow into a colony of a million overnight. Mtb, on the other hand, only doubles once every 20 hours, and would take three weeks to grow a colony of similar size. Additionally, the human immune system produces antibodies against most pathogens in roughly 5 to 7 days. Antibody production against Mtb takes three weeks, likely because the bacteria are slow to travel to the lymph nodes where an adaptive immune response commences. “TB is exquisitely adapted to long-term survival in a human host,” says Sherman.

The current TB drug regimen relies on a six-month treatment of four antibiotics, all discovered in the 1950s and ’60s and which primarily inhibit cell-wall and RNA synthesis. (See illustration.) Worldwide, about 3.6 percent of new TB cases and 20 percent of recurring infections are multidrug resistant, according to the WHO.
Mtb is not just a pathogen; in some ways it’s commensal.
—­Thomas Evans, Aeras
Unfortunately, there isn’t a deep pipeline of drug candidates to fall back on. It wasn’t until December 2012, some 50 years after the last first-in-class approvals, that the US Food and Drug Administration approved a TB drug with a new mechanism of action. Janssen Therapeutics’ bedaquiline (Sirturo) inhibits an ATP synthase enzyme in the bacterium’s cell membrane to prevent the pathogen from generating energy and replicating. (See illustration.) No other anti-TB drugs are close to approval.

TB drug development has been slow for several reasons. For one, the drugs are difficult and expensive to make, and they are primarily needed in developing countries that can’t afford to pay top dollar for a six-month drug regimen. “Working in TB will not drive profit for pharmaceutical companies,” says Manos Perros, head of AstraZeneca’s Boston-based Infection Innovative Medicines Unit. As a result, most recent TB drug development has involved collaborations between big pharma and government institutions or nonprofit advocacy organizations, as well as academia. These are “partnerships that bring resources and funding that make this kind of work, frankly, possible,” says Perros. “This is a space where competitions between pharma and academia are unfruitful.”

Other pharma companies share that sentiment. In February 2013, GlaxoSmithKline (GSK) opened the closely guarded doors of its laboratories to share information with the TB research community about 177 compounds from the company’s pharmaceutical library that appear to inhibit Mtb (ChemMedChem, 8:313-21, 2013). The set of compounds has already been sent to nine groups in the U.K., U.S., Canada, The Netherlands, France, Australia, Argentina, and India, according to GSK spokesperson Melinda Stubbee.

But even with this collaborative attitude, the research community has struggled to develop successful new TB drugs, in part because the bacterium hides latent inside cells such as macrophages, and unpredictably becomes active in different sites in the lung. “TB drug development is extremely challenging because a drug has to kill not only the replicating but the nonreplicating bacteria,” says Feng Wang of the California Institute for Biomedical Research in La Jolla. To tackle this problem, Wang, along with Peter Schultz at Scripps Research Institute, also in La Jolla, and William Jacobs at Albert Einstein College of Medicine in New York, used a novel screening method to test the effect of 70,000 compounds on a biofilm of Mtb that simulates the latent version of the bacterium. One compound popped out of the screen: TCA1 killed both replicating and nonreplicating Mtb (PNAS, 110:E2510-17, 2013). It appeared to attack on two fronts: preventing bacterial cell-wall synthesis and inhibiting a bacterial enzyme involved in cofactor biosynthesis, which is likely what makes it effective against nonreplicating Mtb. (See illustration.) The compound has since proven successful in both acute and chronic animal models of TB, and the team is tweaking the chemistry to try and make it even more potent, says Wang.

Pharmaceutical company AstraZeneca is similarly developing a drug that is active against latent bacteria. AZD5847, a type of antibiotic called an oxazolidinone that is typically used to treat staph infections, is able to reach and kill Mtb lodging inside macrophages. The company is currently testing the drug in a Phase 2 efficacy trial in South Africa involving 75 patients. But developing the compound wasn’t easy, notes Perros. “We’ve been investing for a decade. It really takes a long time.”

Seeking a boost

But even if quick-acting, potent drugs were available, Mtb is so abundant and so well adapted to the human population that the only true path to eradication is not treatment, but prevention. “There’s no endgame without a vaccine,” says Aeras’s Evans. “No matter how much we think we should work on drugs or diagnostics, if we’re not working on vaccines, we’ll never get to our final goal.”

The failure of the MVA85A vaccine trial in South Africa last year was disappointing, but at least a dozen other TB vaccine candidates continue in clinical trials. Most of these reflect one of two general strategies for preventing tuberculosis: improve the existing BCG vaccine or, more commonly, boost its effect with a secondary vaccine. BCG, which is given to infants, primes the immune response early in life, so booster vaccines are usually designed to protect adolescents and adults from later infection. The MVA85A vaccine, for example, was a modified viral vector expressing Mtb antigen 85A designed as a booster to BCG.

Vaccine development, however, is hindered by the lack of cellular or molecular markers that directly correlate with immune protection from TB, making it difficult to predict how well a vaccine might work based on the responses of a handful of individuals. “The only tool we have to make sure a vaccine works is a very large, very expensive field trial,” says Evans. And that high price tag, as in TB drug development, has turned numerous pharmaceutical companies off the pursuit of a TB vaccine.

But with financial and research support from nonprofit partners like Aeras—funded by the Bill & Melinda Gates Foundation, among others—a few companies are still in the game. In collaboration with Aeras, Sanofi Pasteur is developing a BCG booster vaccine that began Phase 1/2a safety trials in South Africa last July. It is a recombinant vaccine made up of two TB proteins fused together and coupled with an adjuvant called IC31, which really “drives the immune response,” says Sanofi’s Ryall. Aeras also has another big-pharma partnership with GSK on a vaccine called M72/AS01e, which has been in Phase 1 and 2 clinical trials since 2004, including an ongoing trial in Taiwan and Estonia. The vaccine combines a GSK recombinant antigen called M72, derived from two tuberculosis-expressed proteins, and a GSK adjuvant called AS01e.

Fresh start

With TB drugs and vaccines still in early clinical phases, some scientists are going back to the basics to see if a better molecular understanding of the bacterium itself could assist these programs. “We need to develop vaccines, and we need to develop products, but as we do, it’s very clear that we need to be learning a lot more about the immunobiology [of TB],” says Evans.

Last July, for example, Sherman and colleagues published the first large-scale map of the bacterium’s regulatory and metabolic networks (Nature, 499:178-83, 2013). The team initially plotted the relationships of 50 Mtb transcription factors, and later, all 200, which control the expression of the rest of the bacterium’s genes. “Our hope is that by looking at it in this different way, we can describe different kinds of drug targets than we have ever done before,” says Sherman.

The team found that Mtb is remarkably well networked: if a mutation or drug stymies one gene or protein, others step in as backups, allowing the bacterium to continue functioning normally. But targeting transcription factors that control whole networks could shut down an entire system, backups and all. One such network already looks like a promising drug target: the transcription factors controlling a group of cell-membrane proteins that pump antibiotics and other drugs out of the cell. Mtb has so many of these pumps that attacking them individually is impractical, but the genes that activate numerous pumps at once make a far more promising target.

The idea that scientists will soon develop new, better TB drugs and vaccines “helps get me up in the morning,” says Sherman. It’s going to take more breakthroughs than are on the immediate horizon, he adds, “but if we keep at it, we will get there.”

From funding agencies to scientific agency | EMBO reports

Collective allocation of science funding as an alternative to peer review

Publicly funded research involves the distribution of a considerable amount of money. Funding agencies such as the US National Science Foundation (NSF), the US National Institutes of Health (NIH) and the European Research Council (ERC) give billions of dollars or euros of taxpayers' money to individual researchers, research teams, universities, and research institutes each year. Taxpayers accordingly expect that governments and funding agencies will spend their money prudently and efficiently.
 
Investing money to the greatest effect is not a challenge unique to research funding agencies and there are many strategies and schemes to choose from. Nevertheless, most funders rely on a tried and tested method in line with the tradition of the scientific community: the peer review of individual proposals to identify the most promising projects for funding. This method has been considered the gold standard for assessing the scientific value of research projects essentially since the end of the Second World War.
However, there is mounting criticism of the use of peer review to direct research funding. High on the list of complaints is the cost, in both time and money. In 2012, for example, the NSF convened more than 17,000 scientists to review 53,556 proposals [1]. Reviewers spend considerable time and effort assessing and rating proposals, only a minority of which are eventually funded. Such a high rejection rate is, of course, frustrating for applicants, who spend an increasing amount of time writing and submitting grant proposals. Overall, the scientific community invests an extraordinary amount of time, energy, and effort in writing and reviewing research proposals, most of which are never funded. This time would be better spent conducting the research in the first place.
 
Peer review may also be subject to biases, inconsistencies, and oversights. The need for review panels to reach consensus may lead to sub‐optimal decisions owing to the inherently stochastic nature of the peer review process. Moreover, in a period where the money available to fund research is shrinking, reviewers may tend to “play it safe” and select proposals that have a high chance of producing results, rather than more challenging and ambitious projects. Additionally, the structuring of funding around calls‐for‐proposals to address specific topics might inhibit serendipitous discovery, as scientists work on problems for which funding happens to be available rather than trying to solve more challenging problems.
 
The scientific community holds peer review in high regard, but it may not actually be the best possible system for identifying and supporting promising science. Many proposals have been made to reform funding systems, ranging from incremental changes to peer review—including careful selection of reviewers [2] and post‐hoc normalization of reviews [3]—to more radical proposals such as opening up review to the entire online population [4] or removing human reviewers altogether by allocating funds through an objective performance measure [5].
We would like to add another alternative inspired by the mathematical models used to search the internet for relevant information: a highly decentralized funding model in which the wisdom of the entire scientific community is leveraged to determine a fair distribution of funding. It would still require human insight and decision‐making, but it would drastically reduce the overhead costs and may alleviate many of the issues and inefficiencies of the proposal submission and peer review system, such as bias, “playing it safe”, or reluctance to support curiosity‐driven research.
 
Our proposed system would require funding agencies to give all scientists within their remit an unconditional, equal amount of money each year. However, each scientist would then be required to pass on a fixed percentage of their previous year's funding to other scientists whom they think would make best use of the money (Fig 1). Every year, then, scientists would receive a fixed basic grant from their funding agency combined with an elective amount of funding donated by their peers. As a result of each scientist having to distribute a given percentage of their previous year's budget to other scientists, money would flow through the scientific community. Scientists who are generally anticipated to make the best use of funding will accumulate more.
Figure 1. Proposed funding system
 
Illustrations of the existing (left) and the proposed (right) funding systems, with reviewers in blue and investigators in red. In most current funding models, like those used by NSF and NIH, investigators write proposals in response to solicitations from funding agencies. These proposals are reviewed by small panels and funding agencies use these reviews to help make funding decisions, providing awards to some investigators. In the proposed system, all scientists are both investigators and reviewers: every scientist receives a fixed amount of funding from the government and discretionary distributions from other scientists, but each is required in turn to redistribute some fraction of the total they received to other investigators.
It may help to illustrate the idea with an example. Suppose that the basic grant is set to US$100,000—this corresponds to roughly the entire 2012 NSF budget divided by the total number of researchers that it funded [1]—and the required fraction that any scientist is required to donate is set to f = 0.5 or 50%. Suppose, then, that Scientist K received a basic grant of $100,000 and $200,000 from her peers, which gave her a funding total of $300,000. In 2013, K can spend 50% of that total sum, $150,000, on her own research program, but must donate 50% to other scientists for their 2014 budget. Rather than painstakingly submitting and reviewing project proposals, K and her colleagues can donate to one another by logging into a centralized website and entering the names of the scientists they choose to donate to and how much each should receive.
 
More formally, suppose that a funding agency's total budget is ty in year y, and that it simply maintains a set of funding accounts for n qualified scientists chosen according to criteria such as academic appointment status, number of publications and other bibliometric indicators, or area of research. The amount of funding in these accounts in year y is represented as a vector αy, where each entry αy(i) corresponds to the amount of funding in the account of scientist i in year y. Each year, the funding agency deposits a fixed amount into each account, equal to the total funding budget divided by the total number of scientists: ty/n. In addition, in each year y scientist i must distribute a fixed fraction f ∈ [0, 1] of the funding he or she received to other scientists. We represent all of these choices by an n × n funding transfer matrix Dy, where Dy(i, j) contains the fraction of his or her funds that scientist i gives to scientist j. By construction, this matrix satisfies three properties: all entries lie between 0 and 1 inclusive; Dy(i, i) = 0, so that no scientist can donate money to him- or herself; and Σj Dy(i, j) = f, so that every scientist donates a fraction f of the previous year's funding to others. The distribution of funding over scientists in year y + 1 is thus expressed by: αy+1(j) = ty+1/n + Σi Dy(i, j) · αy(i), that is, the fixed basic grant plus the donations received from peers.
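This update rule is simple enough to simulate directly. The sketch below uses hypothetical numbers (four scientists, a $400,000 total budget, and the article's f = 0.5); the donation matrix D is an arbitrary illustrative choice, not data from the proposal.

```python
import numpy as np

# Toy parameters (hypothetical): 4 scientists, $400,000 total yearly
# budget, donation fraction f = 0.5 as in the article's example.
n, t, f = 4, 400_000, 0.5

alpha = np.full(n, t / n)  # year-y accounts: everyone starts with the basic grant

# D[i, j]: fraction of scientist i's funds donated to scientist j.
# Each row sums to f, and the diagonal is zero (no self-donation).
D = np.array([
    [0.0, 0.5,  0.0,  0.0],
    [0.2, 0.0,  0.3,  0.0],
    [0.0, 0.25, 0.0,  0.25],
    [0.5, 0.0,  0.0,  0.0],
])
assert np.allclose(D.sum(axis=1), f) and np.allclose(np.diag(D), 0)

# alpha_{y+1}(j) = t/n + sum_i D(i, j) * alpha_y(i)
alpha_next = t / n + D.T @ alpha
print(alpha_next)  # [170000. 175000. 130000. 125000.]
```

Note that the donated money is conserved: the new accounts sum to the fresh budget t plus the fraction f of last year's funding that was passed on.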
 
This form assumes that the portion of a scientist's funding that remains after donation is either spent or stored in a separate research account for later years. An interesting and perhaps necessary modification may be that redistribution pertains to the entirety of funding that a scientist has accumulated over many years, not just the amount received in a particular year. This would ensure that unused funding is gradually re‐injected into the system while still preserving long‐term stability of funding.
 
Network and computer scientists will recognize the general outline of these equations. Google pioneered a similar heuristic approach to rank web pages by transferring “importance” [6] via the web's network of page links; pages that accumulate “importance” rank higher in search results. A similar principle has been successfully used to determine the impact of scientific journals [7] and scholarly authors [8].
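For readers unfamiliar with that heuristic, here is a toy sketch of the PageRank-style iteration the analogy rests on: "importance" flows along links, and a damping term plays a role loosely analogous to the basic grant. The 3-page link graph and the damping factor are illustrative assumptions, not part of the funding proposal.

```python
import numpy as np

# Toy PageRank-style power iteration on a 3-page link graph.
# (Illustrative only; the real algorithm also handles dangling pages, etc.)
d = 0.85                                 # standard damping factor
L = np.array([[0, 1, 1],                 # page 0 links to pages 1 and 2
              [1, 0, 0],                 # page 1 links to page 0
              [1, 1, 0]], dtype=float)   # page 2 links to pages 0 and 1
T = L / L.sum(axis=1, keepdims=True)     # row-stochastic transition matrix

r = np.full(3, 1 / 3)                    # start from uniform "importance"
for _ in range(100):
    r = (1 - d) / 3 + d * (T.T @ r)      # importance flows along in-links

print(r.round(3))                        # the heavily linked-to page 0 ranks highest
```

Unlike this toy example, which converges to a fixed ranking, the funding system described above keeps evolving as the community's donation choices Dy change from year to year.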
 
Instead of attributing “impact” or “relevance”, our approach distributes actual money. We believe that this simple, highly distributed, self‐organizing process can yield sophisticated behavior at a global level. Respected and productive scientists are likely to receive a comparatively large number of donations. They must in turn distribute a fraction of this larger total to others; their high status among scientists thus affords them greater influence over how funding is distributed. The unconditional yearly basic grant in turn ensures stability and gives all scientists greater autonomy for serendipitous discovery, rather than having to chase available funding. As the priorities and preferences of the scientific community change over time, reflected in the values of Dy, the flow of funding will gradually change accordingly. Rather than converging on a stationary distribution, the system will dynamically adjust funding levels to where they are most needed as scientists collectively assess and re‐assess each others' merits. Last but not least, the proposed scheme would fund people instead of projects: it would liberate researchers from peer pressure and funding cycles and would give them much greater flexibility to spend their allocation as they see fit.
 
Of course, funding agencies and governments may still wish or need to play a guiding role, for instance to foster advances in certain areas of national interest or to encourage diversity. This capacity could be included in the outlined system in a number of straightforward ways. Traditional peer‐reviewed, project‐based funding could be continued in parallel. In addition, funding agencies could vary the base funding rate to temporarily inject more money into certain disciplines or research areas. Scientists may be offered the option to donate to special aggregated “large‐scale projects” to support research projects that develop or rely on large‐scale scientific infrastructure. The system could also include some explicit temporal dampening to prevent sudden large changes. Scientists could, for example, be allowed to save surplus funding from previous years in “slush” funds to protect against lean times in the future.
 
In practice, the system will require stringent conflict‐of‐interest rules similar to the ones that have been widely adopted to keep traditional peer review fair and unbiased. For example, scientists might be prevented from donating to themselves, advisors, advisees, close collaborators, or even researchers at their own institution. Funding decisions must remain confidential so scientists can always make unbiased decisions; should groups of people attempt to affect global funding distribution they will lack the information to do so effectively. At the very least, the system will allow funding agencies to confidentially study and monitor the flow of funding in the aggregate; potential abuse such as circular funding schemes can be identified and remediated. This data will furthermore support Science of Science efforts to identify new emerging areas of research and future priorities.
Such an open and dynamic funding system might also induce profound changes in scholarly communication. Scientists and researchers may feel more strongly compelled to openly and freely share results with the public and their community if this attracts the interest of colleagues and therefore potential donors. A “publish or perish” strategy may matter less than clearly and compellingly communicating the outcomes, scientific merit, broader impact, vision, and agenda of one's research programs so as to convince the scientific community to contribute to it.
 
Peer review of proposals has served science well for decades, but perhaps it's time for funding agencies to consider alternative approaches to public funding of research—based on advances in mathematics and modern technology—to optimize their return on investment. The system proposed here requires a fraction of the costs associated with traditional peer review, but may yield comparable or even better results. The savings of financial and human resources could be used to identify new targets of opportunity, to support the translation of scientific results into products and jobs, and to help communicate advances in science and technology.

Acknowledgments

The authors acknowledge support by the National Science Foundation under grant SBE #0914939, the Andrew W. Mellon Foundation, and National Institutes of Health award U01 GM098959.

Footnotes

  • The authors declare that they have no conflict of interest.

It is Time for Greenpeace to be Prosecuted for Crimes Against Humanity

Standing Up for GMOs

  1. Bruce Alberts is President Emeritus of the U.S. National Academy of Sciences and former Editor-in-Chief of Science.
  2. Roger Beachy is a Wolf Prize laureate; President Emeritus of the Donald Danforth Plant Science Center, St. Louis, MO, USA; and former director of the U.S. National Institute of Food and Agriculture.
  3. David Baulcombe is a Wolf Prize laureate and Royal Society Professor in the Department of Plant Sciences of the University of Cambridge, Cambridge, UK. He receives research funding from Syngenta and is a consultant for Syngenta.
  4. Gunter Blobel is a Nobel laureate and the John D. Rockefeller Jr. Professor at the Rockefeller University, New York, NY, USA.
  5. Swapan Datta is Deputy Director General (Crop Science) of the Indian Council of Agricultural Research, New Delhi, India; the Rash Behari Ghosh Chair Professor at Calcutta University, India; and a former scientist at ETH-Zurich, Switzerland, and at IRRI, Philippines.
  6. Nina Fedoroff is a National Medal of Science laureate; a Distinguished Professor at the King Abdullah University of Science and Technology, Thuwal, Saudi Arabia; an Evan Pugh Professor at Pennsylvania State University, University Park, PA, USA; and former President of AAAS.
  7. Donald Kennedy is President Emeritus of Stanford University, Stanford, CA, USA, and former Editor-in-Chief of Science.
  8. Gurdev S. Khush is a World Food Prize laureate, Japan Prize laureate, and former scientist at IRRI, Los Baños, Philippines.
  9. Jim Peacock is a former Chief Scientist of Australia and former Chief of the Division of Plant Industry at the Commonwealth Scientific and Industrial Research Organization, Canberra, Australia.
  10. Martin Rees is President Emeritus of the Royal Society, Fellow of Trinity College, and Emeritus Professor of Cosmology and Astrophysics at the University of Cambridge, Cambridge, UK.
  11. Phillip Sharp is a Nobel laureate; an Institute Professor at the Massachusetts Institute of Technology, Cambridge, MA, USA; and President of AAAS.
Figure credit: IRRI
On 8 August 2013, vandals destroyed a Philippine “Golden Rice” field trial. Officials and staff of the Philippine Department of Agriculture that conduct rice tests for the International Rice Research Institute (IRRI) and the Philippine Rice Research Institute (PhilRice) had gathered for a peaceful dialogue. They were taken by surprise when protesters invaded the compound, overwhelmed police and village security, and trampled the rice. Billed as an uprising of farmers, the destruction was actually carried out by protesters trucked in overnight in a dozen jeepneys.
 
The global scientific community has condemned the wanton destruction of these field trials, gathering thousands of supporting signatures in a matter of days.* If ever there was a clear-cut cause for outrage, it is the concerted campaign by Greenpeace and other nongovernmental organizations, as well as by individuals, against Golden Rice. Golden Rice is a strain that is genetically modified by molecular techniques (and therefore labeled a genetically modified organism or GMO) to produce β-carotene, a precursor of vitamin A. Vitamin A is an essential component of the light-absorbing molecule rhodopsin in the eye. Severe vitamin A deficiency results in blindness, and half of the roughly half-million children who are blinded by it die within a year. Vitamin A deficiency also compromises immune system function, exacerbating many kinds of illnesses. It is a disease of poverty and poor diet, responsible for 1.9 to 2.8 million preventable deaths annually, mostly of children under 5 years old and women.
 
Rice is the major dietary staple for almost half of humanity, but white rice grains lack vitamin A. Research scientists Ingo Potrykus and Peter Beyer and their teams developed a rice variety whose grains accumulate β-carotene. It took them, in collaboration with IRRI, 25 years to develop and test varieties that express sufficient quantities of the precursor that a few ounces of cooked rice can provide enough β-carotene to eliminate the morbidity and mortality of vitamin A deficiency. It took time, as well, to obtain the right to distribute Golden Rice seeds, which contain patented molecular constructs, free of charge to resource-poor farmers.
 
The rice has been ready for farmers to use since the turn of the 21st century, yet it is still not available to them. Escalating requirements for testing have stalled its release for more than a decade. IRRI and PhilRice continue to patiently conduct the required field tests with Golden Rice, despite the fact that these tests are driven by fears of “potential” hazards, with no evidence of actual hazards. Introduced into commercial production over 17 years ago, GM crops have had an exemplary safety record. And precisely because they benefit farmers, the environment, and consumers, GM crops have been adopted faster than any other agricultural advance in the history of humanity.
 
New technologies often evoke rumors of hazard. These generally fade with time when, as in this case, no real hazards emerge. But the anti-GMO fever still burns brightly, fanned by electronic gossip and well-organized fear-mongering that profits some individuals and organizations. We, and the thousands of other scientists who have signed the statement of protest, stand together in staunch opposition to the violent destruction of required tests on valuable advances such as Golden Rice that have the potential to save millions of impoverished fellow humans from needless suffering and death.
  • * B. Chassy et al., “Global scientific community condemns the recent destruction of field trials of Golden Rice in the Philippines”; http://chn.ge/143PyHo (2013).
  • E. Mayo-Wilson et al., Br. Med. J. 343, d5094 (2011).
  • G. Tang et al., Am. J. Clin. Nutr. 96, 658 (2012).

Astrophysics, the Impossible Science -- More Than Quantum Mechanics?

Last week, Nobel Laureate Martinus Veltman gave a talk at the Simons Center. After the talk, a number of people asked him questions about several things he didn’t know much about, including supersymmetry and dark matter. After deflecting a few such questions, he proceeded to go on a brief rant against astrophysics, professing suspicion of the field’s inability to do experiments and making fun of an astrophysicist colleague’s imprecise data. The rant was a rather memorable feat of curmudgeonliness, and apparently typical Veltman behavior. It left several of my astrophysicist friends fuming. For my part, it inspired me to write a positive piece on astrophysics, highlighting something I don’t think is brought up enough.
 
The thing about astrophysics, see, is that astrophysics is impossible.
Imagine, if you will, an astrophysical object. As an example, picture a black hole swallowing a star.
Are you picturing it?
 
Now think about where you’re looking from. Chances are, you’re at some point up above the black hole, watching the star swirl around, seeing something like this:
Where are you in this situation? On a spaceship? Looking through a camera on some probe?
 
Astrophysicists don’t have spaceships that can go visit black holes. Even the longest-ranging probes have barely left the solar system. If an astrophysicist wants to study a black hole swallowing a star, they can’t just look at a view like that. Instead, they look at something like this:
The image on the right is an artist’s idea of what a black hole looks like. The three on the left?
 
They’re what the astrophysicist actually sees. And even that is cleaned up a bit; the raw output can be even more opaque.
 
A black hole swallowing a star? Just a few blobs of light, pixels on screen. You can measure brightness and dimness, filter by color from gamma rays to radio waves, and watch how things change with time. You don’t even get a whole lot of pixels for distant objects. You can’t do experiments, either; you just have to wait for something interesting to happen and try to learn from the results.
 
It’s like staring at the static on a TV screen, day after day, looking for patterns, until you map out worlds and chart out new laws of physics and infer a space orders of magnitude larger than anything anyone’s ever experienced.
 
And naively, that’s just completely and utterly impossible.
And yet…and yet…and yet…it works!
 
Crazy people staring at a screen can’t successfully make predictions about what another part of the screen will look like. They can’t compare results and hone their findings. They can’t demonstrate principles (like General Relativity) that change technology here on Earth. Astrophysics builds on itself, discovery by discovery, in a way that can only be explained by accepting that it really does work (a theme that I’ve had occasion to harp on before).
 
Physics began with astrophysics. Trying to explain the motion of dots in a telescope and objects on the ground with the same rules led to everything we now know about the world. Astrophysics is hard, arguably impossible…but impossible or not, there are people who spend their lives successfully making it work.
 
 
(David Strumfels) -- With a chemistry background, not astrophysics, I have to wonder how quantum mechanics stacks up. To give one example, the hydrogen atom:
 
 
We see the electron orbiting the proton nucleus, an image we probably saw in high school, and the quantized orbits added by Bohr don't alter what we see significantly (though they are a significant addition).  Now, physics teaches us that an object in orbit about another possesses angular momentum -- which means it is changing direction continuously.
 
But the electron here possesses no angular momentum, according to quantum mechanics.  It's worse than that; the electron has no exact position at any time we specify.  It is attracted to the nucleus, yes, but beyond that it could be anywhere in the universe, though most likely close to the nucleus.  I hesitate to go into this further, except to note that the electron occupies well-defined orbitals, which describe its spatial distribution through all space.  The orbitals are squares of the wave function describing the electron, which has a simple formula like this:
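For the ground state (the 1s orbital), that formula is ψ(r) = e^(−r/a₀)/√(πa₀³), where a₀ is the Bohr radius. A minimal numerical sketch (in atomic units, so a₀ = 1) confirms that the square of this wave function, integrated over all space, gives a total probability of 1 -- the electron is somewhere, most likely near the nucleus:

```python
import math

A0 = 1.0  # Bohr radius, set to 1 in atomic units

def psi_1s(r):
    """Ground-state (1s) hydrogen wave function, spherically symmetric."""
    return math.exp(-r / A0) / math.sqrt(math.pi * A0**3)

def norm(n=100000, rmax=40.0):
    """Trapezoid-rule integral of |psi|^2 * 4*pi*r^2 dr from 0 to rmax."""
    dr = rmax / n
    total = 0.0
    for i in range(n + 1):
        r = i * dr
        f = psi_1s(r)**2 * 4 * math.pi * r**2
        weight = 0.5 if i in (0, n) else 1.0
        total += weight * f * dr
    return total

print(round(norm(), 6))  # ~1.0: total probability over all space
```

The probability density peaks at the nucleus and decays exponentially, which is exactly why "it could be anywhere, though most likely close to the nucleus."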
 
 
And this is just the simplest of all atoms, hydrogen.  Try to work out more complicated atoms, and you run up against the three-body problem, meaning there is no exact solution at all.  The same goes for molecules ... you get the idea.
 
In the end I won't judge, because I understand neither astrophysics nor quantum mechanics well enough to draw a comparison.  As for molecules, I can only give a picture, in this case of hemoglobin.  Here there is structure built upon structure, built upon structure -- the final structure being the atomic orbitals of hydrogen and other atoms.
 
 
 
 
 

Are There 'Laws' in Social Science?

by Ross Pomeroy in Think Big 
January 17, 2014, 12:29 PM
This post originally appeared in the Newton blog on RealClearScience.
You can read the original here.

Richard Feynman rarely shied away from debate. When asked for his opinions, he gave them, honestly and openly. In 1981, he put forth this one:

"Social science is an example of a science which is not a science... They follow the forms... but they don't get any laws."

Many modern social scientists will certainly say they've gotten somewhere. They can point to the law of supply and demand or Zipf's law for proof-at-first-glance -- they have the word "law" in their title! The law of supply and demand, of course, states that the market price for a certain good will fluctuate based upon the quantity demanded by consumers and the quantity supplied by producers. Zipf's law statistically models the frequency of words uttered in a given natural language.
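Zipf's law makes a concrete quantitative prediction: the k-th most frequent word appears roughly 1/k as often as the most frequent one. A toy illustration (hypothetical counts, not real corpus data):

```python
def zipf_expected(top_count, rank):
    """Zipf's law: frequency of the word at a given rank is
    roughly the top word's frequency divided by the rank."""
    return top_count / rank

# If the most common word appears 1000 times, Zipf predicts roughly:
predictions = [round(zipf_expected(1000, k)) for k in range(1, 6)]
print(predictions)  # [1000, 500, 333, 250, 200]
```

The striking thing is that this simple inverse-rank pattern shows up, to a good approximation, in corpora across natural languages.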

But are social science "laws" really laws? A scientific law is "a statement based on repeated experimental observations that describes some aspect of the world. A scientific law always applies under the same conditions, and implies that there is a causal relationship involving its elements." The natural and physical sciences are rife with laws. It is, for example, a law that non-linked genes assort independently, or that the total energy of an isolated system is conserved.

But what about the poster child of social science laws: supply and demand? Let's take it apart. Does it imply a causal relationship? Yes, argues MIT professor Harold Kincaid.

"A demand or supply curve graphs how much individuals are willing to produce or buy at any given price. When there is a shift in price, that causes corresponding changes in the amount produced and purchased. A shift in the supply or demand curve is a second causal process – when it gets cheaper to produce some commodity, for example, the amount supplied for each given price may increase."
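Kincaid's causal story can be sketched with a toy linear model (all coefficients hypothetical): demand falls with price, supply rises with it, and the market clears where the two quantities match. Raising the supply intercept, as when production gets cheaper, lowers the equilibrium price and raises the quantity traded:

```python
def equilibrium(d0, d1, s0, s1):
    """Clear a market with demand Qd = d0 - d1*P and supply Qs = s0 + s1*P."""
    price = (d0 - s0) / (d1 + s1)   # solve d0 - d1*P = s0 + s1*P for P
    quantity = d0 - d1 * price
    return price, quantity

p1, q1 = equilibrium(100, 2, 10, 4)   # baseline market
p2, q2 = equilibrium(100, 2, 40, 4)   # cheaper production shifts supply out
print(p1, q1)  # 15.0 70.0
print(p2, q2)  # 10.0 80.0
```

The shift from (p1, q1) to (p2, q2) is exactly Kincaid's "second causal process": the supply curve moves, and price and quantity respond.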

Are there repeated experimental observations for it? Yes, again, says Kincaid (PDF).

"The observational evidence comes from many studies of diverse commodities – ranging from agricultural goods to education to managerial reputations – in different countries over the past 75 years. Changes in price, demand, and supply are followed over time. Study after study finds the proposed connections."

Does supply and demand occur under the same conditions? That is difficult to discern. In the real world, unseen factors lurk behind every observation. Economists can do their best to control variables, but how can we know if the conditions are precisely identical?

Still, supply and demand holds up very well. Has Mr. Feynman been proved wrong? Perhaps. And if social science can produce laws, is it, too, a science? By Feynman's definition, it seems so.

The reason why social science and its purveyors often get such a bad rap has less to do with the rigor of their methods and more to do with the perplexity of their subject matter. Humanity and its cultural constructs are more enigmatic than much of the natural world. Even Feynman recognized this.

"Social problems are very much harder than scientific ones," he noted. Social science itself may be an enterprise doomed, not necessarily to fail, just to never fully succeed. Utilizing science to study something inherently unscientific is a tricky business.

Of lice and men (and chimps): Study tracks pace of molecular evolution

Jan 07, 2014 from Phys.Org

A new study led by Kevin Johnson of the Illinois Natural History Survey (seated, at left), with (from left to right) entomology professor Barry Pittendrigh, animal biology professor Ken Paige and postdoctoral researcher Julie Allen.
 
A new study compares the relative rate of molecular evolution between humans and chimps with that of their lice. The researchers wanted to know whether evolution marches on at a steady pace in all creatures or if subtle changes in genes – substitutions of individual letters of the genetic code – occur more rapidly in some groups than in others.
 
A report of the study appears in the Proceedings of the Royal Society B.
The team chose its study subjects because humans, chimps and their lice share a common historical fate: When the ancestors of humans and chimps went their separate ways, evolutionarily speaking, so did their lice.

"Humans are chimps' closest relatives and chimps are humans' closest relatives – and their lice are each others' closest relatives," said study leader Kevin Johnson, an ornithologist with the Illinois Natural History Survey at the University of Illinois. "Once the hosts were no longer in contact with each other, the parasites were not in contact with each other because they spend their entire life cycle on their hosts."

This fact, a mutual divergence that began at the same point in time (roughly 5 million to 6 million years ago), allowed Johnson and his colleagues to determine whether molecular evolution occurs faster in primates or in their parasites.

Previous studies had looked at the rate of molecular changes between parasites and their hosts, but most focused on single genes in the mitochondria, tiny energy-generating structures outside the nucleus of the cell that are easier to study. The new analysis is the first to look at the pace of molecular change across the genomes of different groups. It compared a total of 1,534 genes shared by the primates and their parasites. To do this, the team had to first assemble a rough sequence of the chimp louse (Pan troglodytes schweinfurthii) genome, the only one of the four organisms for which a full genome sequence was unavailable.

The team also tracked whether changes in gene sequence altered the structure of the proteins for which the genes coded (they looked only at protein-coding genes). For every gene they analyzed, they determined whether sequence changes resulted in a different amino acid being added to a protein at a given location.

They found that – at the scale of random changes to gene sequence – the lice are winning the molecular evolutionary race. This confirmed what previous, more limited studies had hinted at.
"For every single gene we looked at, the lice had more differences (between them) than (were found) between humans and chimps. On average, the parasites had almost 15 times more changes," Johnson said. "Often in parasites you see these faster rates," he said. There have been several hypotheses as to why, he said.

Humans and chimps had a greater percentage of sequence changes that led to changes in protein structure, the researchers found. That means that even though the louse genes are changing at a faster rate, most of those changes are "silent," having no effect on the proteins for which they code. Since these changes make no difference to the life of the organism, they are tolerated, Johnson said. Those sequence changes that actually do change the structure of proteins in lice are likely to be harmful and are being eliminated by natural selection, he said.
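The distinction between "silent" and protein-changing substitutions can be sketched by comparing two aligned coding sequences codon by codon (hypothetical toy sequences and a deliberately tiny codon table, not the study's data):

```python
# Tiny excerpt of the standard genetic code, enough for this toy example.
CODON_TABLE = {
    "TTT": "Phe", "TTC": "Phe",
    "GCT": "Ala", "GCC": "Ala",
    "GAT": "Asp", "GAA": "Glu",
}

def classify_substitutions(seq_a, seq_b):
    """Count silent vs. protein-changing differences between aligned coding sequences."""
    silent = changing = 0
    for i in range(0, len(seq_a), 3):
        codon_a, codon_b = seq_a[i:i+3], seq_b[i:i+3]
        if codon_a == codon_b:
            continue
        if CODON_TABLE[codon_a] == CODON_TABLE[codon_b]:
            silent += 1      # nucleotide changed, protein unchanged
        else:
            changing += 1    # a different amino acid gets added to the protein

    return silent, changing

# TTT->TTC (Phe->Phe, silent), GCT->GCC (Ala->Ala, silent), GAT->GAA (Asp->Glu, changing)
print(classify_substitutions("TTTGCTGAT", "TTCGCCGAA"))  # (2, 1)
```

In the study's terms, the lice accumulate many substitutions of the first kind, while a larger share of the primates' substitutions are of the second kind.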

In humans and chimps, the higher proportion of amino acid changes suggests that some of those genes are under the influence of "positive selection," meaning that the altered proteins give the primates some evolutionary advantage, Johnson said. Most of the genes that changed more quickly or slowly in primates followed the same pattern in their lice, Johnson said.

"The most likely explanation for this is that certain genes are more important for the function of the cell and can't tolerate change as much," Johnson said.

The new study begins to answer fundamental questions about changes at the molecular level that eventually shape the destinies of all organisms, Johnson said.

"Any difference that we see between species at the morphological level almost certainly has a genetic basis, so understanding how different genes are different from each other helps us understand why different species are different from each other," he said. "Fundamentally, we want to know which genetic differences matter, which don't, and why certain genes might change faster than others, leading to those differences."
More information: "Rates of Genomic Divergence in Humans, Chimpanzees and Their Lice," rspb.royalsocietypublishing.org/lookup/doi/10.1098/rspb.2013.2174
Journal reference: Proceedings of the Royal Society B

Read more at: http://phys.org/news/2014-01-lice-men-chimps-tracks-pace.html#jCp



New form of quantum matter: Natural 3D counterpart to graphene discovered

17 hours ago by Lynn Yarris, Phys.org
A topological Dirac semi-metal state is realized at the critical point in the phase transition from a normal insulator to a topological insulator. The + and - signs denote the even and odd parity of the energy bands. Credit: Yulin Chen, Oxford

The discovery of what is essentially a 3D version of graphene – the 2D sheets of carbon through which electrons race at many times the speed at which they move through silicon - promises exciting new things to come for the high-tech industry, including much faster transistors and far more compact hard drives. A collaboration of researchers at the DOE's Lawrence Berkeley National Laboratory (Berkeley Lab) has discovered that sodium bismuthate can exist as a form of quantum matter called a three-dimensional topological Dirac semi-metal (3DTDS). This is the first experimental confirmation of 3D Dirac fermions in the interior or bulk of a material, a novel state that was only recently proposed by theorists.
"A 3DTDS is a natural three-dimensional counterpart to graphene with similar or even better electron mobility and velocity," says Yulin Chen, a physicist with Berkeley Lab's Advanced Light Source (ALS) when he initiated the study that led to this discovery, and now with the University of Oxford. "Because of its 3D Dirac fermions in the bulk, a 3DTDS also features intriguing non-saturating linear magnetoresistance that can be orders of magnitude higher than the GMR materials now used in hard drives, and it opens the door to more efficient optical sensors."
Chen is the corresponding author of a paper in Science reporting the discovery. The paper is titled "Discovery of a Three-dimensional Topological Dirac Semimetal, Na3Bi." Co-authors were Zhongkai Liu, Bo Zhou, Yi Zhang, Zhijun Wang, Hongming Weng, Dharmalingam Prabhakaran, Sung-Kwan Mo, Zhi-Xun Shen, Zhong Fang, Xi Dai and Zahid Hussain.


Two of the most exciting new materials in the world of high technology today are graphene and topological insulators, crystalline materials that are electrically insulating in the bulk but conducting on the surface. Both feature 2D Dirac fermions (fermions that aren't their own antiparticle), which give rise to extraordinary and highly coveted physical properties. Topological insulators also possess a unique electronic structure, in which bulk electrons behave like those in an insulator while surface electrons behave like those in graphene.

Beamline 10.0.1 at Berkeley Lab's Advanced Light Source is optimized for the study of electron structures and correlated electron systems. Credit: Roy Kaltschmidt, Berkeley Lab

"The swift development of graphene and topological insulators has raised questions as to whether there are 3D counterparts and other materials with unusual topology in their electronic structure," says Chen. "Our discovery answers both questions. In the sodium bismuthate we studied, the bulk conduction and valence bands touch only at discrete points and disperse linearly along all three momentum directions to form bulk 3D Dirac fermions. Furthermore, the topology of a 3DTDS electronic structure is also as unique as those of topological insulators."
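Chen's "disperse linearly along all three momentum directions" refers to the Dirac dispersion E = ħ·v·|k|: unlike the quadratic dispersion of ordinary electrons, energy grows in direct proportion to momentum in every direction. A minimal sketch (the Fermi velocity here is an illustrative graphene-like value, not a measurement for Na3Bi):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
V_F = 1.0e6             # Fermi velocity in m/s (illustrative, graphene-like)

def dirac_energy(kx, ky, kz):
    """Linear (Dirac) dispersion in all three momentum directions: E = hbar * v * |k|."""
    k = math.sqrt(kx**2 + ky**2 + kz**2)
    return HBAR * V_F * k

# Energy is directly proportional to momentum: doubling |k| doubles E,
# and the ratio is the same along any direction in momentum space.
e1 = dirac_energy(1e8, 0, 0)
e2 = dirac_energy(2e8, 0, 0)
print(e2 / e1)  # 2.0
```

For an ordinary (parabolic) band the same doubling of momentum would quadruple the energy; the linear relation is the signature of Dirac fermions.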

Read more at: http://phys.org/news/2014-01-quantum-natural-3d-counterpart-graphene.html#jCp

Cryogenics

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Cryogenics...