Sunday 2 September 2012

Physics on the edge of the possible

Solar flare, as depicted in Black Rain by Semiconductor

A fascinating post on Henning Dekant's excellent Wavewatching blog this week adds some depth to August's stories suggesting scientists had found a way to predict solar storms.

Jere Jenkins and Ephraim Fischbach of Purdue University published a paper in Astroparticle Physics presenting evidence that the decay rate of radioactive materials changes in advance of solar flares. They believe this fluctuation could be used to build an early-warning system for potentially destructive solar storms.
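Purely as an illustration of the idea, and not the authors' method, here is a minimal sketch of what such an early-warning monitor might look like: watch a stream of decay counts from a detector and flag when the latest reading drifts well outside its recent baseline. The detector, the hourly sampling and the three-sigma threshold are all assumptions made for the sake of the example.

    import statistics

    def flag_anomaly(counts, window=24, threshold=3.0):
        """Flag when the newest decay count departs from its recent baseline.

        counts    -- sequence of hourly counts from a hypothetical detector
        window    -- number of recent readings used as the baseline
        threshold -- how many standard deviations counts as anomalous
        """
        if len(counts) <= window:
            return False                      # not enough history yet
        baseline = counts[-window - 1:-1]     # the previous 'window' readings
        mean = statistics.mean(baseline)
        spread = statistics.stdev(baseline) or 1.0
        return abs(counts[-1] - mean) / spread > threshold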

The astrophysics community met this with surprise, skepticism and even alarm in some quarters: an early-warning system for solar flares is something of a holy grail in space engineering, but Jenkins and Fischbach appeared to be challenging our fundamental understanding of radioactive decay.

Their latest work builds on earlier research, including a paper published four years ago which presented surprising evidence of a correlation between nuclear decay rates and Earth-Sun distance.

So why is all of this weird? Radioactive elements are unstable and break down over time. As they do this they release energy in the form of radiation. As Dekant notes, "radioactive decay is supposed to be the ultimate random process, immutably governed by an element's half life and nothing else. There is no way to determine when a single radioactive atom will decay, nor any way to speed-up or slow down the process." He emphasises that this is considered to be an "iron clad certainty".
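To put that textbook picture in concrete terms, here is a tiny sketch of the standard model of decay (my own illustration, not taken from the paper): each atom's decay time is random, yet the population as a whole follows the half-life exactly, with no reference to anything happening outside the sample.

    import math
    import random

    half_life = 5.27                  # years (cobalt-60, as an example)
    lam = math.log(2) / half_life     # decay constant

    # Each atom's decay time is random and exponentially distributed;
    # no outside influence appears anywhere in this model.
    decay_times = [random.expovariate(lam) for _ in range(100_000)]

    # Yet the ensemble is entirely predictable: about half survive one half-life.
    survivors = sum(1 for t in decay_times if t > half_life)
    print(survivors / len(decay_times))   # prints roughly 0.5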

Therefore the last thing you'd expect reputable scientists to report is results showing "a discernible pattern in the decay rate of a radioactive element" or "any correlation with outside events". Yet that is precisely what Jenkins and Fischbach have presented in their latest paper. Beyond the practical prospect of an early-warning system for solar storms, this has far-reaching implications for our understanding of radiation in general.

Jenkins' research was inspired by what Jonathan Ball describes as a chance event. Jenkins "was watching television coverage of astronauts spacewalking at the International Space Station. A solar flare erupted and was thought to pose a risk to the astronauts. On checking equipment in his laboratory, he was surprised to discover that the rate of radioactive decay changed before the solar flare."

This led Jenkins and colleagues to develop the hypothesis that radioactive decay rates are influenced by solar activity, possibly via streams of subatomic particles called solar neutrinos. This influence waxes and wanes with seasonal changes in the Earth's distance from the sun, and also during solar flares. The latest paper in Astroparticle Physics provides the evidence for this hypothesis, and as Dekant notes, "the evidence for the reality of this effect is surprisingly good, and that is rather shocking".
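As a back-of-the-envelope way to picture the hypothesis (the numbers below are placeholders, not values from the paper), the claimed effect amounts to a small seasonal wobble sitting on top of ordinary exponential decay:

    import math

    def count_rate(t_days, r0=1000.0, half_life_days=312.0,
                   amplitude=0.001, perihelion_day=3.0):
        """Toy model: decay count rate with a small annual modulation.

        r0             -- initial count rate in counts/s (placeholder)
        half_life_days -- half-life of the source (manganese-54 is ~312 days)
        amplitude      -- assumed ~0.1% fractional size of the seasonal wobble
        perihelion_day -- day of year when Earth is closest to the sun (~3 Jan)
        """
        lam = math.log(2) / half_life_days
        baseline = r0 * math.exp(-lam * t_days)      # ordinary exponential decay
        wobble = 1 + amplitude * math.cos(
            2 * math.pi * (t_days - perihelion_day) / 365.25)
        return baseline * wobble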

Shocking, because:
"It does not fit into any established theory at this time."

Sources:
Wavewatching: http://wavewatching.net/
Astroparticle Physics (journal)
Astronomy Now: http://www.astronomynow.com/news/n1208/15solarflares/

Thanks to Dan Hon for directing me to this.