You can shelve your designs for a warp drive engine (for now) and put the DeLorean back in the garage; it turns out neutrinos may not have broken any cosmic speed limits after all.
Ever since the news came out on September 22 of last year that a team of researchers in Italy had clocked neutrinos traveling faster than the speed of light, the physics world has been abuzz with the potential implications of such a discovery (if it were true, that is). The speed of light has been a cornerstone of modern physics for over a century, an Einstein-established limit that particles (even tricky neutrinos) weren’t supposed to be able to break, not even a little.
Now, according to a breaking news article by Edwin Cartlidge on AAAS’ ScienceInsider, the neutrinos may be cleared of any speed violations.
“According to sources familiar with the experiment, the 60 nanoseconds discrepancy appears to come from a bad connection between a fiber optic cable that connects to the GPS receiver used to correct the timing of the neutrinos’ flight and an electronic card in a computer,” Cartlidge reported.
The original OPERA (Oscillation Project with Emulsion-tRacking Apparatus) experiment had a beam of neutrinos fired from CERN in Geneva, Switzerland, aimed at an underground detector array located 730 km away at the Gran Sasso facility, near L’Aquila, Italy. Researchers were surprised to discover the neutrinos arriving earlier than expected, by a difference of 60 nanoseconds. This would have meant the neutrinos had traveled faster than light speed to get there.
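For a sense of the numbers involved, here is a quick back-of-envelope Python check using the figures in this article (a 730 km baseline and a 60 ns early arrival; the baseline is approximate):

```python
# Back-of-envelope check: a 60 ns early arrival over a 730 km baseline
# implies only a tiny fractional speed excess over light.
C = 299_792_458.0    # speed of light in vacuum, m/s
BASELINE_M = 730e3   # CERN -> Gran Sasso distance (approximate), m
EARLY_S = 60e-9      # reported early arrival, s

light_time = BASELINE_M / C              # expected light travel time
neutrino_time = light_time - EARLY_S     # apparent neutrino travel time
excess = light_time / neutrino_time - 1  # fractional speed excess, v/c - 1

print(f"light travel time: {light_time * 1e3:.3f} ms")
print(f"fractional excess (v/c - 1): {excess:.2e}")
```

The trip takes about 2.4 milliseconds, so the claimed effect was a speed excess of only a few parts in a hundred thousand, which is exactly why a tens-of-nanoseconds timing error in the instrumentation chain is enough to create or erase it.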
Repeated experiments at the facility revealed the same results. When the news was released, the findings seemed to be solid — from a methodological standpoint, anyway.
Shocked at their own results, the OPERA researchers were more than happy to have colleagues check their results, and welcomed other facilities to attempt the same experiment.
Repeated attempts may no longer be needed.
Once the aforementioned fiber optic cable was readjusted, it was found that the speed of data traveling through it matched the 60 nanosecond discrepancy initially attributed to the neutrinos. This could very well explain the subatomic particles’ apparent speed burst.
Case closed? Well… it is science, after all.
“New data,” Cartlidge added, “will be needed to confirm this hypothesis.”
See the original OPERA team paper here.
_______________________
UPDATE 2/22/12 11:48 pm EST: According to a more recent article on Nature’s newsblog, the Science Insider report erroneously attributed the 60 nanosecond discrepancy to loose fiber optic wiring from the GPS unit, based on inside “sources”. OPERA’s statement doesn’t specify as such, “saying instead that its two possible sources of error point in opposite directions and it is still working things out.”
OPERA’s official statement released today is as follows:
“The OPERA Collaboration, by continuing its campaign of verifications on the neutrino velocity measurement, has identified two issues that could significantly affect the reported result. The first one is linked to the oscillator used to produce the events time-stamps in between the GPS synchronizations. The second point is related to the connection of the optical fiber bringing the external GPS signal to the OPERA master clock.
These two issues can modify the neutrino time of flight in opposite directions. While continuing our investigations, in order to unambiguously quantify the effect on the observed result, the Collaboration is looking forward to performing a new measurement of the neutrino velocity as soon as a new bunched beam will be available in 2012. An extensive report on the above mentioned verifications and results will be shortly made available to the scientific committees and agencies.” (via Nature newsblog.)
Oh dear, there go the pseudo quacks down the drain, again…
Universe maintains another secret for now b/c they didn’t hook their VCR up right at CERN 🙂
Why do comets have a tail? It seems that all of us are trying to find ways to dispel the myth (yes) that it’s impossible to travel faster than the speed of light.
I bet you have no solid understanding of the maths and science behind why “c” is the maximum speed.
Before you go claiming that FTL is possible, you should have a grasp of the basics of what science does know.
You’ve made my point better than I ever could. Modern science works very hard to fit everything into its predetermined slot based on previous academically agreed upon theories. Anything or anyone that dares examine data outside of those agreements is given the intellectual elitist response. nyquil762
So in other words you have no clue why “c” exists and why it cannot be broken, in terms of deep scientific understanding and maths.
And because you lack the scientific understanding, you just make up fun stuff that sounds logical and plausible.
I barely made it through calculus. Consequently, I rely on those that have studied the specifics much more than I (perhaps that means you). However, my original comment is validated more proficiently each time you respond. Again, bravo!
Calculus is all you need to try to understand the theory of relativity. Quantum mechanics is a different story.
There is a very good book, “An Illustrated Guide to Relativity” by Tatsu Takeuchi, which takes you step by step into the theory.
This is one of those things where you do not have to rely on other people to study the specifics. It also gives you a good tool to test the theories of other so-called open-minded people, to tell pure BS from something real.
The annoying thing about the theory of relativity is that it is not intuitive. A limit on “c” is not intuitive. It is tempting to compare “c” with the speed of sound, or to confuse “c” with the speed of light (which can become slower than “c” in a medium). It is the temptation to find “human” logic and philosophical reasoning in how the universe operates that leads to pseudoscience, and it is completely untrue in reality.
Just get into the maths and you will not be blocked by how unintuitive “c” really is. It actually becomes very elegant and quite simple.
Thank you for taking the time to educate and understand my position. I’ll take a look at the recommended reading material. nyquil762
Technically incorrect. The speed of light is not a maximum speed limit beyond which nothing can go faster; it is an asymptote, a point of discontinuity at which nothing (except light, of course, and anything else massless) can travel. Nothing with mass can travel at the speed of light because, per Einstein’s theory of relativity, mass is not constant but increases with speed (time and distance also distort the closer you get to the speed of light, but mass is the critical reason you cannot accelerate to light speed, which is the relevant part of this discussion) and becomes infinite at the speed-of-light asymptote. Because of this, it would require an infinite (not merely “a humongous”, but truly infinite) amount of energy to accelerate all the way to the speed of light.
While many people incorrectly presume this to mean that nothing can travel faster than the speed of light, this is not true. We have no concept of how a slower-than-light object could accelerate to (and thus past) light speed, given the infinite mass at light speed. However, there are theoretical objects called tachyons that travel faster than light without violating relativity: they never accelerate through the speed of light, but are created already faster than light and remain so for their entire existence. Interestingly, a hypothetical creature made of tachyons on a starship also made of tachyons would have the inverse problem we do: it could not decelerate all the way down to the speed of light.
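The runaway growth near light speed can be made concrete with the Lorentz factor γ = 1/√(1 − v²/c²), which multiplies a particle’s energy via E = γmc². A short Python sketch (the speeds are purely illustrative):

```python
import math

def lorentz_gamma(beta: float) -> float:
    """Lorentz factor for speed beta = v/c (requires beta < 1)."""
    return 1.0 / math.sqrt(1.0 - beta * beta)

# E = gamma * m * c^2 grows without bound as v approaches c:
for beta in (0.5, 0.9, 0.99, 0.999, 0.999999):
    print(f"v/c = {beta:<9} gamma = {lorentz_gamma(beta):.1f}")
```

Each extra “9” in v/c roughly triples γ; the energy cost diverges as v → c, which is the asymptote described above.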
Traveling faster than the speed of light by delaying information for the wanted duration.
hmm (ponders)
I said right from the start that there must be some instrumentation or software problem.
LC
I think a lot of people did.
…and yet, you are beating your chest because the ‘same team using the same equipment’ has identified possible sources of error…qualified further still that such errors may negate the discrepancy or increase the original “ftl” result…?
I guess research funding must be earned by dogged adherence to an end result.
This is not about trying to shore up the established physics, which in this case is relativity. There is of course a bit of a bias, which tells us that data indicating faster-than-light physics is extraordinary. So in working through this we employ the rule A. C. Doyle penned for Sherlock Holmes: eliminate the most probable causes first, and work your way up to increasingly improbable ones. If you then reach a cause which cannot be eliminated, you have probably found the source, no matter how improbable it initially seemed.
As Torbjörn Larsson indicated, the cause is similar to a weak cable connection. A weak connection introduces an effective capacitor in the transmission line, and the velocity of a wave is determined in part by the impedance of the transmission line. A fiber optic is really the same thing; the main difference is that the wavelength of the transmitted wave is much smaller. As a result there are material differences and so forth, but the basic electromagnetic physics is the same. So in troubleshooting the system the OPERA team set about eliminating the most probable causes: instrumentation flaws and software failures. In the end it turned out to be a fiber optic connection that was bad. That is at the outset far more probable than accepting that the universe permits causality-violating faster-than-light signals.
LC
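For a sense of scale on that delay: assuming a typical telecom fiber group index of about 1.47 (an assumed figure, not from the OPERA report), a 60 ns delay corresponds to roughly a dozen metres of equivalent fiber path, so a bad connector perturbing the timing chain by that order is entirely plausible:

```python
# How much fiber path does a 60 ns delay correspond to?
C = 299_792_458.0   # speed of light in vacuum, m/s
N_FIBER = 1.47      # typical group index for telecom fiber (assumed)
DELAY_S = 60e-9     # the disputed timing discrepancy, s

v_fiber = C / N_FIBER                  # signal speed in the fiber
equivalent_length = v_fiber * DELAY_S  # metres of fiber worth of delay
print(f"60 ns in fiber ~ {equivalent_length:.1f} m of path")
```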
Well, this has really gotten a lot of people thinking, sort of like conceptual art in that regard. It’s got to be clocked very accurately, and that’s the rub.
The cosmological evidence is more compelling.
What they DO have is a very accurate method of determining distance through solid rock. The speed of light remains a very good means of calibration. All this scrutiny on signal propagation can only help the craft.
…exotic and extraordinary…yes, but not impossible, just ir-relative?
I hate to say I told you so, but I originally and repeatedly posted that a measurement problem would be found to be the answer to this discrepancy. Einstein will continue to rest soundly in his grave. It was interesting to see how many jumped off the cliff with the other lemmings. Still it made for some creative problem solving discussions.
Except lemmings don’t jump off cliffs, so your analogy fails.
The weird thing is that if you take out the 60 nanoseconds, the neutrinos, with their tiny bit of mass, are STILL traveling at light speed, which, I believe, STILL flies in the face of Einstein’s postulations, because it’s supposed to take infinite energy to GET to light speed!
Am I missing something?
Yes, you are missing that all neutrino species masses are predicted to be very small. So the initial expectation of the experiment was that the time differential between photons and neutrinos would be within experimental error. (IIRC.)
The SN 1987A neutrinos showed that the race differential between (another species of) neutrinos and photons can be minute even over cosmological distances.
Indeed, the energy that goes into producing these rapid neutrinos is huge.
Yeah, neutrinos have almost no mass and carry no electrical charge. Therefore they do not interact much with the EM spectrum or with matter.
Because neutrinos are very light particles, they will mostly travel at speeds comparable to light. That brings in measurement error as a factor: whatever speed they actually travel at (and they individually travel at different speeds), if one can’t measure the difference in speed compared to light, then the actual speed is only known to be “close to light speed”. And that is expected from theory anyway.
If the result had been that neutrinos travel at double light speed, then there would have been no arguments, because the measurement is sensitive enough to then rule out ‘lower than light speed’. Positive and accurate measurement is the only way to reach confirmation, and the accuracy is now the issue.
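To put numbers on this: for a highly relativistic particle with E ≫ mc², the speed deficit is approximately 1 − v/c ≈ (mc²)²/(2E²). A Python sketch with an assumed neutrino mass scale of 0.1 eV and a beam energy of about 17 GeV (both illustrative figures, not measured values) shows how far below any timing resolution the deficit sits:

```python
# Speed deficit of a relativistic neutrino: 1 - v/c ~ (m c^2)^2 / (2 E^2).
# Mass and energy below are illustrative assumptions, not measured values.
M_NU_EV = 0.1   # assumed neutrino mass scale, eV
E_EV = 17e9     # assumed beam neutrino energy, eV (~17 GeV)

deficit = (M_NU_EV / E_EV) ** 2 / 2.0
print(f"1 - v/c ~ {deficit:.1e}")
```

The deficit comes out around 10⁻²³, more than seventeen orders of magnitude below the few-parts-in-10⁵ sensitivity of the timing measurement, which is why theory expected a dead heat with light.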
Damn, but it was a very nice dream while it lasted….
Warp drive is still a possibility for an advanced civilization. But Einstein shall nevertheless reign forever.
No, the problems with an Alcubierre drive are manifold when you look at the details.
You need more energy than is available within the observable universe. You need non-existent exotic matter. And you hit a paradox: you need the object to travel FTL without the warp bubble in order to insert it, and then to travel FTL with its help.
(Precisely the same happen with GR wormholes, except here the paradox is that the object which attempts to enter the wormhole will be pushed back by it attempting to exit the wormhole.)
Actually, for the warp drive, there are studies which state you need only one Jupiter mass. Enormous, for sure, but attainable. There are many pessimistic papers out there, but they *are* being revised.
But I’m not one to say that we will get it within our lifetime. Certainly no less than 1,000 years from now, and likely more.
If we don’t have a doable (although expensive) design in 100 years I’ll be disappointed. Or would be, if I were alive at that point.
I stand corrected. I haven’t been following the subject since the inherent paradox became apparent to me, even mentioned elsewhere.
So I have to disagree with your claim: a drive can’t be done. And if you mean a bubble in itself, it still can’t be done, because of the requirement for matter that is unobserved and likely non-existent.
The existence of exotic matter is a part of the problem. This matter or field is called exotic because its mass-energy is negative. This means the stress-energy tensor has “time-time” component T^{00} < 0. This is proportional to the Ricci curvature, which in this case has negative Gaussian curvature. This means that gravity acts not to focus geodesics (geometric paths) in spacetime, but to defocus them. An analogue would be to think of spacetime as having an index of refraction: ordinarily this focuses paths in spacetime, but in this extraordinary case it defocuses them. This violates the Hawking-Penrose energy conditions, explicitly here the weak energy condition (WEC). As indicated below, this has two main problems: it leads to closed paths in spacetime, or time travel, and the quantum field theory is not bounded below.
The violation of the WEC means that singularities in spacetime are not enclosed by event horizons. A singularity in a black hole is a spacelike surface where Weyl curvature diverges, which exists in a trapped region behind an event horizon. The violation of the WEC means this singularity can exist in spacetime, it is transformed into a timelike geometry outside of an event horizon. In fact this is one property of a violation of WEC, timelike and spacelike paths or regions can be transformed continuously between each other. The result of this is that closed causal paths can exist, effectively as time loops or time machines, and the concept of causality is lost.
The other problem is that violation of the WEC means the quantum field which defines this exotic matter has no minimal eigenstate. The energy level can endlessly descend a “ladder of states” that has no minimal energy value. This puts quantum mechanics in a pickle. Niels Bohr worked out the early model of the hydrogen atom by looking at quantum, or de Broglie, waves orbiting a nucleus. The minimal wavelength, which we call the S-shell, is the lowest energy state. This prevented the ultraviolet catastrophe predicted for the atom by classical physics. However, if the WEC is violated there is no minimal energy level, so the quantum field can endlessly descend down these states and emit photons, or bosons of some quantum field. It does not take much to see that this is a disaster.
Ultimately we have a question of whether the global principles of physics and cosmology enforce local laws. The physics of a local frame, such as special relativity, has some relationship to the global principles of cosmology. The violation of the WEC, which permits transformation between spacelike and timelike paths or regions, violates the rules of special relativity. So if we had some real physics that illustrated how neutrinos are tachyons, or some euclideanized map to a tiny Alcubierre warp drive bubble, then we really would have a difficult task of making coherent sense of the universe. Understanding the universe is already very tough without this problem. A violation of the WEC would make things hideously more difficult, if not impossible.
LC
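For reference, the weak energy condition discussed above has a compact textbook statement (standard notation, not specific to any of the papers mentioned here):

```latex
% Weak energy condition (WEC): for every timelike four-velocity u^\mu,
T_{\mu\nu}\, u^{\mu} u^{\nu} \;\ge\; 0 ,
% i.e. every observer measures a non-negative local energy density
% \rho = T^{00}. The "exotic" matter an Alcubierre bubble requires
% has T^{00} < 0, which violates this inequality.
```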
Now _that_ is a systematic error. We need confirmation of course.
Something seems fishy about this story. I wouldn’t be surprised if this turns out to be nothing more than a rumor.
Seems to me a short in the wiring would slow down the results, not speed them up.
It is not a short, but optic cabling. It was loose, and in the same way that a bad electrical coupling (increased impedance) slows down a traveling current pulse, it slows down a traveling photonic pulse.
Whether that delay makes the travel time appear faster or slower depends on the details of the setup, which we don’t know.
Not really. If the cable sent a pulse every second, the clock sees this as a second. If then only every second pulse gets through, the clock will think one second has passed when in reality two have. If you are measuring something against a clock that counts slow, s = d/t will tell you that the particle travelled the same distance in a shorter time.
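The commenter’s point in numbers: a minimal Python sketch with hypothetical figures, assuming a clock that registers only every other tick:

```python
# A clock that misses every other tick halves the measured time,
# so the inferred speed s = d/t doubles. Hypothetical numbers.
distance_m = 730e3
true_time_s = 2.435e-3             # actual travel time
measured_time_s = true_time_s / 2  # clock sees only half the ticks

true_speed = distance_m / true_time_s
apparent_speed = distance_m / measured_time_s
print(apparent_speed / true_speed)  # the inferred speed doubles
```

The real fault was of course far subtler than a factor of two, but the direction of the effect is the same: an undercounting timing chain makes the flight look shorter, and hence faster, than it was.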
Hmm, an official quote from the OPERA team makes no mention of an exact 60 ns discrepancy here: http://blogs.nature.com/news/2012/02/faster-than-light-neutrino-measurement-has-two-possible-errors.html In fact, they say there are two potential issues, which affect the measured speed in two different directions. Seems like some inconsistent information here. Before we consign this one to the trash bin, we should wait to see new measurements with these errors taken into account. Chances are good, of course, that one of these errors will explain the anomaly.
Thanks for the heads up!
…lazy neutrinos. It isn’t the first time neutrinos have disappointed me either, as I once hoped their Cherenkov radiation was what made water look blue. Boo-urns 🙁
Updated information above.
In my opinion the most plausible-sounding explanation suggested was that the GPS measurements failed to take into account the relativistic differential caused by the GPS satellite itself, orbiting in a different inertial reference frame, which some calculated to be around the 60 ns time discrepancy.
I am still waiting to see how this pans out but still as skeptical as I was from day one and always have agreed with Torbjorn, LC and others that the physics textbooks don’t need any rewrite just yet.
Wezley
What hyperbole: somebody suggests that it COULD be a loose cable, and the Internet blogs take it as case closed.
And like “lemmings”, readers don’t bother reading past the headline. Since the CERN results were published, “many” scientists have pointed to measurement errors as “possible” causes of skewed results.
Not one of them has repeated the experiment.
Let’s wait until CERN repeats the experiment with the cables plugged in correctly, shall we?
Fermilab is still moving forward with its repeat of the experiment.
Of course this experiment must be repeated, with the correct repairs.
When the story of faster-than-light neutrinos first broke, someone on these very boards suggested GPS abnormalities could be the problem. I don’t remember who it was, but get that person a plane ticket to CERN.
I am surprised that even though there was a problem within the GPS connectivity that produced an error comparable to the discrepancy, as far as my knowledge goes, the timing was also measured using an atomic clock. Was there a problem with the atomic clock measurement as well? Why isn’t that part of the experiment discussed here?