How's this for spooky action at a distance? The sun, at 93 million miles away, appears to be influencing the decay of radioactive elements inside the Earth, researchers say.
Given what we know about radioactivity and solar neutrinos, this should not happen. It's so bizarre that a couple of scientists at Stanford and Purdue universities believe a previously unknown solar particle may be behind it all.
The big news, according to Stanford's news service, is that the core of the sun -- where nuclear reactions produce neutrinos -- spins more slowly than the surface. That rotation might explain the changing rates of radioactive decay that scientists observed at two separate labs, but it does not explain why decay rates change at all. That violates the laws of physics as we know them.
While examining data on radioactive isotopes, Purdue researchers found disagreement in measured decay rates, which goes against the long-accepted belief that these rates are constant. While searching for an explanation, the scientists came across other research that noted seasonal variation in these decay rates. Apparently radioactivity is stronger in winter than in summer.
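The "constant rate" claim and the reported seasonal wobble are easy to sketch numerically. Below is a minimal, purely illustrative Python model -- not the researchers' analysis -- of the standard exponential decay constant plus a small hypothetical annual modulation peaking near perihelion in early January, when Earth is closest to the sun. The half-life is roughly manganese-54's (~312 days); the 0.1% amplitude is an assumed figure for illustration only.

```python
import math

# Illustrative sketch only (not the researchers' analysis): the standard
# exponential decay constant plus a small hypothetical annual modulation
# tied to the Earth-Sun distance. The 0.1% amplitude is an assumption.

HALF_LIFE_DAYS = 312.0          # roughly manganese-54's half-life
LAMBDA = math.log(2) / HALF_LIFE_DAYS
AMPLITUDE = 1e-3                # hypothetical 0.1% seasonal effect

def decay_rate(day_of_year):
    """Decay constant with a small yearly sinusoidal modulation,
    peaking near perihelion (about January 3), when Earth is closest
    to the sun -- consistent with 'stronger in winter'."""
    phase = 2 * math.pi * (day_of_year - 3) / 365.25
    return LAMBDA * (1 + AMPLITUDE * math.cos(phase))

winter = decay_rate(3)    # near perihelion
summer = decay_rate(185)  # near aphelion
```

Under this toy model the winter rate exceeds the summer rate by about 0.2% peak to peak -- small enough that it is easy to see how such a wobble could go unnoticed for decades.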
A 2006 solar flare suggested the sun was involved somehow. Purdue University nuclear engineer Jere Jenkins noticed the decay rate of a medical isotope dropped during the solar flare, and what's more, the decline started before the flare did. The latter finding could be useful for protecting satellites and astronauts -- if there is a correlation between decay rates and solar activity, changed decay rates could provide early warning of an impending solar storm.
But while that's good news for astronauts, it's bad news for physics.
Peter Sturrock, Stanford emeritus professor of applied physics and an expert on the inner workings of the sun, told the researchers to look for evidence that the changes in radioactive decay vary with the rotation of the sun. The answer was yes, suggesting that neutrinos are responsible.
But how could the nebulous neutrino, which barely interacts with normal matter, be affecting decay rates? No one knows. It might be a previously unknown particle instead.
As Jenkins puts it, "What we're suggesting is that something that doesn't really interact with anything is changing something that can't be changed."
Though disaster movies would have you believe otherwise, we should not yet worry about solar neutrinos warming the core of the Earth. But perhaps we should worry that our understanding of the sun -- and perhaps our understanding of nuclear physics in general -- is a lot weaker than we thought.
Hmm, I saw a movie like this once. People died lol
Maybe it's not a solar particle though.
Perhaps an as-yet-unrecognized force which permeates everything around us is causing it all to happen.
That would explain why this is happening in tandem.
Well, I guess scientists are wrong again. Radioactive decay is not exact, but overall I'm impressed with how close they get. In reality, everything is a mystery to scientists.
I just got the chills.
This immediately calls into question the accuracy of radioactive dating techniques, especially those of extreme age.
I recently learned (through a Huffington Post article) that events in the present can change events in the past. Particles that were forced to choose between behaving as waves or as particles were found to have made that choice in the past based on how they were detected in the present. And it has long been known that entangled particles seem to communicate their quantum state instantly, even across vast distances.
Creationists suggested almost two decades ago that a changing speed of light would explain the seemingly old age of the Earth given by radioactive dating techniques, since the speed of light is a constant that affects radioactive decay rates. After many years of careful measurement, scientists concluded that the speed of light actually is constant. Now, it turns out that radioactive decay rates are changing anyway.
I guess that Creationists should have picked up on the quantum mechanical properties long before now, to suggest that unknown events around the Universe might alter measured decay rates on Earth. I guess that's too tenuous a connection to make, though, without further evidence. Now, we have it.
Wasn't there research contradicting this?
"Data from the power output of the radioisotope thermoelectric generators aboard the Cassini spacecraft are used to test the conjecture that small deviations observed in terrestrial measurements of the exponential radioactive decay law are correlated with the Earth-Sun distance. No significant deviations from exponential decay are observed over a range of 0.7 - 1.6 A.U. A 90% C.L. upper limit of 0.84 x 10^-4 is set on a term in the decay rate of Pu-238 proportional to 1/R^2 and 0.99 x 10^-4 for a term proportional to 1/R."
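For scale, that Cassini limit can be converted into the largest annual wobble it would allow on Earth. A rough back-of-envelope sketch (my own arithmetic, not from the quoted paper):

```python
# Rough arithmetic (mine, not the quoted paper's): translate the Cassini
# 90% C.L. limit on a 1/R^2 term in the Pu-238 decay rate into the
# largest annual modulation it would permit on Earth.

ECCENTRICITY = 0.0167        # Earth's orbital eccentricity
COEFF_LIMIT = 0.84e-4        # quoted upper limit on the 1/R^2 term

r_min = 1 - ECCENTRICITY     # perihelion distance, AU
r_max = 1 + ECCENTRICITY     # aphelion distance, AU

# Peak-to-peak variation of 1/R^2 over Earth's orbit (about 6.7%).
flux_swing = 1 / r_min**2 - 1 / r_max**2

# Maximum peak-to-peak decay-rate change the limit permits.
max_seasonal_effect = COEFF_LIMIT * flux_swing
```

That works out to a few parts per million for Pu-238 -- far smaller than anything you would notice casually in a lab, which is exactly why the Cassini result reads as a contradiction of the seasonal claims.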
from the article
"Purdue University nuclear engineer Jere Jenkins noticed the decay rate of a medical isotope dropped during the solar flare,"
This tells us that if this were true, then the Earth would be even older than carbon dating shows it to be. This isn't good for creationists.
Anyhow, this report does not "immediately call into question" radiocarbon dating, which is a very well understood science.
You clearly have an inquisitive mind and are seeking a better understanding of quantum mechanics. I would advise steering clear of getting that understanding from Huffington Post articles or similar internet sources.
Go down to your local college and take some science classes. The teachers there are always happy to help, and you'll have access to real textbooks.
If you want more information right now, look up Richard Feynman, Erwin Schrödinger, or Werner Heisenberg.
I would imagine that the 'medical isotope' spoken of is molybdenum-99, as reports of supply problems at our one plant, and the breakdown at the Canadian plant, led to increased research to find like properties in other isotopes to ease the Western Hemisphere's ever-increasing dependence on facilities that specialize in just one area of production. Our government's typical criterion of triple redundancy seems to have been overlooked in this critical area of need.

The increase in decay rate in winter makes sense to me, as there is less in the surrounding matter to prevent decay. Cold, near-dead matter should pull energy from an available source at a higher rate than if the matter surrounding the energy source were in a similar state of flux. The greater the disparity, the greater the decay. If this were not true, we would need nowhere near the water we use for long-term storage of radioactive waste products.

Not quite stating the obvious, but this is not a new state of affairs, either. Just because something has been overlooked does not mean all physics discoveries up to this point are suddenly null and void. A warm log is easier to ignite than a frozen one. That's not new either.
Don't atomic clocks work off the decay rates of radioactive elements? If the decay rate changes, wouldn't these clocks be inaccurate? Do I need to reset my atomic clock?
That thought came to my mind as well, but I assume this article is talking about extremely minute discrepancies, or it would have been noticed by now. Since any change violates the theory of constant decay, it's still a big deal. Atomic clocks should still be more accurate than any other chronometer we currently have.
Could you clarify your idea? Measuring radioactive decay at this scale requires a decent bit of lab control to accomplish. Ergo, the discrepancies in radioactive decay rates were between samples at similar temperatures (room temperature, or a set lab temperature). It's also moot because it has been shown that decay rates are constant across different temperatures.
The point about the seasonal difference between summer and winter has nothing to do with atmospheric temperature, but with our proximity to the sun, which is greater during the Northern Hemisphere's winter. That is what leads them to believe the sun is involved: the obvious change in environment that has not been tested is the significant change in distance from the sun.
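To put a number on that proximity argument, here is a quick sketch of how much the solar flux at Earth (neutrinos included) changes between perihelion in early January and aphelion in early July, assuming nothing but the inverse-square law:

```python
# Back-of-envelope: seasonal change in solar neutrino flux at Earth
# from orbital distance alone (inverse-square law).

PERIHELION_AU = 0.9833   # Earth-Sun distance in early January
APHELION_AU = 1.0167     # Earth-Sun distance in early July

# Northern Hemisphere winter-to-summer flux ratio:
ratio = (APHELION_AU / PERIHELION_AU) ** 2
```

The flux comes out about 7% higher in the Northern Hemisphere's winter -- a clean, predictable annual signal, which is why the distance correlation was the natural thing for the researchers to check.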
Obviously someone mentioned radio carbon dating. I'm sure others will. It does not say anything specific.
"Purdue University nuclear engineer Jere Jenkins noticed the decay rate of a medical isotope dropped during the solar flare"
It never says all radioactive elements.
I never said the overall yearly AMOUNT is different.
It says the RATE varies by season and from solar activity.
It also does not say how much it changes. On a subatomic scale, and from the viewpoint of a physicist, a tiny variation from an equation is a huge deal.
These researchers acted irresponsibly when making their findings so vaguely public.
This might be a classic case of correlation vs. causality.
Whatever is affecting the decay might also be affecting the solar activity...
OK, call me crazy, but maybe the sun is affecting the devices they're using to measure radioactivity? That sounds more likely to me.
This phenomenon makes perfect sense from the analogy of saturation. A damp rag put in water absorbs water, a damp rag in air drips - always seeking a constant level of environmental saturation.
Thus, during periods of increased solar radiation saturation, particles would cast off less radiation, while in "dry" periods they would cast off more. So a solar flare or a solar-rich summer would decrease decay, while the more insulated winter would accelerate it.
Of course, if this were the case, it would be another coffin nail in carbon dating, as long periods with increased solar activity would appear much shorter and short periods with decreased solar activity would appear much longer.
That way, radioactive saturation remains stable, while the decay rate of any one atom is variable.
Remember as well that "radiation as a constant" is based on us observing that "radiation is a constant" for what is, in radioactive terms, a ridiculously small amount of time.
@rww, maybe it's time itself that is changing or fluctuating?
Nuclear reactions at the core of the sun produce vastly more particles than any surface flare. This is definitely true of neutrinos and should hold for any 'unknown particle' as well. This means that flare activity by itself could not affect decay rates in the way this article suggests. While the solar-distance correlation may pan out (causality is another matter), I believe the flare-prediction idea, which is based on one measurement of manganese-54 (and no physics), will quickly be dismissed.
A large solar flare might activate a genetic bomb. Nostradamus implied that would occur during this war. IMO
I believe he said that the impurities in salmonella will activate something in chromosome #6, leading to the "wormwood" poison (heart attack). Needless to say, the "confusion" will be high. The Tower of Babel story will be verified. The human plague that Lee was worried about will be cured.
I wonder how the sun could be affecting radioactive decay?
It's a good thing that scientist have proved that solar activity has no effect at all on the climate.
I don't think the scientific community knows much at all about nuclear science, except how to kill people with it. Oh, they can get some useful things from it, i.e., electricity and propulsion for large ships, but any good they can get comes at a price: it's really not worth the risk to use it, because of its lethality if something goes wrong. We have been experimenting with radioactivity for about 100 years now, since Madame Curie died from experimenting with it. We are only 65 years from detonating the first atomic bomb. How could we possibly know very much about it in 65 years of using it, knowing it will kill you if you make one tiny mistake dealing with it? We cannot even store the waste products from it safely.
If we really want to go to Mars in the near future nuclear propulsion is the only way we can do it cheaply, reliably and quickly enough.
Instead of this useless International Space Station, we should build a moon base from which we can launch a mission to Mars. It could be done relatively quickly, and a reactor to propel a spaceship to Mars could be built there. A mission can't be launched from Earth because too much fuel is used reaching escape velocity to leave Earth's gravity.
When we understand anti-matter we can work with it and maybe understand nuclear science well enough to use it here on our own planet.
The neutrino stream could be setting the local 'zero point'. Kind of a local static zone around our star, greater inside the orbital yearly track than around the outside face of the sun in its radiation out from the corona, due to the reciprocal gravity of the planets.
Just because a particle doesn't appear to interact with the rest of the electromagnetic spectrum in some way we've found so far seems like a very good reason to wonder if we aren't assigning too small a role to the possibly much more important neutrino, and it could then just be a matter of something we didn't catch splashing into the sun on the backside. We would still stay more stable than the outside that had been hit. But an increase in neutrino output inside the orbital track is still inevitable, even if more stable than the outer, or splash zone.
If we are talking an increase in decay rate every single year it might be the sun isn't quite round and it has an oscillation that shifts mass, throwing more neutrinos
In my little insanity, the neutrinos would just serve as a local outward effect, getting more space between them but acting as the baseline for decay around our star, literally the grease between particles within the star, with the most neutrinos and fissile matter so the most flux, right? And so it might stand to reason that an increase in local neutrinos could cause a decay increase. Or maybe that's what they said, or it's something different. I'm going to bed, and as long as there aren't too many neutrinos, I won't pass right through the mattress. But if I'm wrong, I just might. Wait, it's not winter yet. I thought I was tired when popsci microwaved me right through my tinfoil cone and made me think about this again. Jeez. I thought Lawrence Livermore already figured this stuff out for us.
I'm not an expert on radiocarbon dating, but from my field of electrical engineering, the first thing that came to mind is that we need to try to take solar cycles and solar flares into account when designing some electronic equipment -- as exciting as a new particle that might redefine physics may seem.
I wonder whether modern high-resolution electronic sampling and the high-impedance circuits used in today's instruments could be contributing to the result. Modern analog front ends still have relatively high noise levels in hardware, which is often hidden behind digital software filtering techniques such as arithmetic means; that filtering is an essential part of the accuracy of modern equipment. It can take a noisy circuit and make the reading extremely accurate and stable under stable conditions, since most noise cycles over a short period and averages to zero offset, unless there is a trapped electric charge in a high-impedance circuit or uncompensated temperature drift. However, these high-impedance, high-resolution designs are more sensitive to everything from electric fields to EMI, even if the software hides it well -- most of the time.
Any unusual noise might produce adverse results when external electromagnetic interference is induced, whether from industrial environments or, in this case, perhaps solar flares or solar cycles, possibly affecting results with deviations ranging from minor to major. I have found many times that temperature compensation improves stability, but at the same time small charges are often trapped in high-impedance circuits, producing offset values that cannot be filtered out by a digital arithmetic mean. With weighing equipment, which often uses high-resolution sampling, this phenomenon of slow drift is very common, especially in industrial setups where EMI is high. Solar flares are a possible source of electromagnetic or even electric fields.
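The averaging point is easy to demonstrate. A minimal sketch with hypothetical numbers: a digital arithmetic-mean filter wipes out zero-mean noise almost entirely, but a small offset, such as a trapped charge or an interference-induced bias, passes straight through.

```python
import random

random.seed(42)  # deterministic for illustration

TRUE_VALUE = 100.0   # hypothetical quantity being measured
OFFSET = 0.05        # hypothetical interference-induced bias

def averaged_reading(offset, n=100_000):
    """Arithmetic mean of n noisy samples. The Gaussian noise is
    zero-mean and averages away; the offset does not."""
    return sum(TRUE_VALUE + offset + random.gauss(0, 1.0)
               for _ in range(n)) / n

clean = averaged_reading(0.0)      # lands very close to 100.0
biased = averaged_reading(OFFSET)  # offset survives the filtering
```

The filtered reading looks rock-stable either way; the bias simply shifts it, which is exactly the kind of slow drift the comment describes.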
Shielding the equipment and trying different types and makes/models of equipment might be a good idea, to see whether the results are consistent. I think anything that samples at finer than 1/10,000 of its capacity, or at low energy/force values, makes noise and interference in all their forms relatively difficult to manage.
In one example, I connected a strain transducer to a high-quality ADC circuit with thick shielding, sampling at a resolution of 1/300,000, in a lab free from wind, temperature-controlled, and with typical EMI levels. With software digital filtering it was stable, and its extreme precision and accuracy put a massive WOW on everyone's faces. But the moment you stepped closer than 4 meters to the experiment, it started drifting like crazy from interference from the human body, which proved that the filtering does not work for all forms of interference. Often the rate of change is the problem. I wonder what a solar flare would have done at the time. Even with good designs, we commonly see drifts in analog-to-digital converters at 1/3,000 under normal conditions in hardware. I'm not saying the equipment can be directly compared; I just hope they gave it their five cents' worth to eliminate such a possibility before making the problem too sophisticated. Large shields can just as well act as large receiving areas for interference.
Solar flares can knock satellites out of order, never mind sensitive lab equipment that requires a controlled environment for reliable results. I would not trust the results of any high-precision electronic instrument during these "electromagnetic storms," especially if the results deviated. Solar cycles also produce variations in background electromagnetic radiation that can cause a slow drift; often temperature and humidity get the blame, but I'm not so sure. It will depend on the particular design.
One question - how come there is no mention here of carbon dating? Even the original article from Stanford mentions it.
It is kind of humorous how the authors of articles here don't hesitate to jump to the most ridiculous conclusions as long as they follow the "scientific" party line. But if some fact begs an obvious question outside those lines, it is not even mentioned.
"This tells us that if this were true, then the Earth would be even older than carbon dating shows it to be. This isn't good for creationists."
Not necessarily. If solar activity increased at one point in history (e.g., post-Flood), it would have slowed the rate of decay. Now, if you try to extrapolate time (measure the age of things) based on the current, slower rate of decay, you would come to the fallacious conclusion that the object you are measuring is billions of years old.
Perhaps Creationists are correct and radioactive decay rates were different in the past. Perhaps, instead of 4.54 billion years, the Earth is 6,000 years old.
I think I'll stick with the Word of God. Science changes, but the Word of God is forever.
"And there shall be signs in the sun, and in the moon, and in the stars; and upon the earth distress of nations, with perplexity; the sea and the waves roaring;" Luke 21:25
the answer to this, as with all questions, is midichlorians
It seems that when the sun is nearer to the Earth, radioactivity increases. This may also be a gravity/space-time phenomenon, as well as a mystery particle.