On September 22, an international team of researchers working on the OPERA project at the Gran Sasso research facility released a paper on some potentially physics-shattering findings: beams of neutrinos that had traveled from the CERN facility near Geneva to their detector array outside of Rome at a speed faster than light. (Read more about this here and here.) Not a great deal faster, to be sure – arriving only 60 nanoseconds sooner than expected – but still faster. There’s been a lot of pushback from the scientific community about this announcement, and rightly so: if it does end up being a legitimate finding, it would force us to rework much of what we have known about physics since Einstein’s theory of relativity.
Of course, to those of us not so well-versed in particle physics *raises hand* a lot of this information can quickly become overwhelming, to say the least. Thankfully the folks at Sixty Symbols have recorded this interview with two astrophysicists at the UK’s University of Nottingham. It helps explain some of the finer points of the discovery, what it means and what the science community in general thinks about it. Check it out!
The recent news from the Oscillation Project with Emulsion-tRacking Apparatus (OPERA) neutrino experiment – that neutrinos have been clocked travelling faster than light – made headlines over the last week, and rightly so. There is some very robust infrastructure and measurement equipment involved, which gives the data a certain gravitas.
The researchers had appropriate cause to put their findings up for public scrutiny and peer review – and to their credit have produced a detailed paper on the subject, beyond just the media releases we have seen. Nonetheless, it has been reported that some senior members of the OPERA research team declined to be associated with this paper, considering it all a bit preliminary.
After all, the reported results indicate that the neutrinos crossed a distance of 730 kilometres in 60 nanoseconds less time than light would have taken. But given that light would have taken about 2.4 million nanoseconds to cross the same distance, there is a lot hanging on such a proportionally tiny difference.
It would have been a different story if the neutrinos had been clocked at 1.5x or 2x light speed, but this is more like 1.000025x light speed. And it would have been no surprise to anyone to have found the neutrinos travelling at 99.99% of light speed, given their origin in CERN’s accelerator complex. So confirming that they really are exceeding light speed, but only by a tiny amount, requires supreme confidence in the measuring systems used. And there are reasons to doubt that such confidence is warranted.
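The arithmetic is easy to check for yourself. Here’s a quick back-of-the-envelope sketch in Python, using the round figures quoted above (the 730 km baseline and the 60 nanosecond lead are from the published reports; the rest is just arithmetic):

```python
# Back-of-the-envelope check of the OPERA numbers (illustrative only).
C = 299_792_458.0          # speed of light, m/s
baseline_m = 730_000.0     # CERN to Gran Sasso baseline, ~730 km
early_ns = 60.0            # reported early arrival, nanoseconds

light_time_ns = baseline_m / C * 1e9
speed_ratio = light_time_ns / (light_time_ns - early_ns)

print(f"light travel time: {light_time_ns / 1e3:.0f} microseconds")  # ~2435
print(f"implied neutrino speed: {speed_ratio:.6f} c")                # ~1.000025 c
```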
The distance component of the speed calculation had an error of less than 20 cm over the 730 kilometre path – about 0.00003%, if you like – across the data collection period. That’s not much error, but then the degree to which the neutrinos are claimed to have moved faster than light isn’t that much either.
But the travel time component of the speed calculation is the real question mark here. The release time of the neutrinos could only be inferred from a 10.5 microsecond burst of protons from the CERN Super Proton Synchrotron (SPS), fired at a graphite target, which produces particles that decay into neutrinos headed towards OPERA.
The researchers substantially constrained that potential error (i.e. the 10.5 microseconds) by comparing the time distributions of SPS proton release and neutrino detection at OPERA over repeated trials, to build a probability density function for the time of emission of the neutrinos. But this is really just a long-winded way of saying they could only estimate the likely travel time. And the dependence on GPS satellite links to time-stamp the release and detection steps represents a further source of potential measurement error.
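To get a feel for how a 10.5 microsecond proton burst can still yield a nanosecond-scale timing estimate, here is a deliberately simplified toy model – entirely synthetic data, not the OPERA analysis – in which the proton waveform acts as a template and the best-fit delay is found by maximising the likelihood over thousands of stacked events:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy 10.5 microsecond proton extraction, sampled in 1 ns bins,
# with some structure in its intensity profile.
t = np.arange(0, 10_500)                                       # ns
proton_waveform = 1.0 + 0.3 * np.sin(2 * np.pi * t / 2000.0)   # toy intensity
pdf = proton_waveform / proton_waveform.sum()

# Simulate 16,000 neutrino events, each drawn from the waveform shape
# and delayed by a "true" flight time of ~2.435 ms.
true_delay = 2_435_000                                          # ns
arrivals = rng.choice(t.size, size=16_000, p=pdf) + true_delay

def log_likelihood(shift):
    """Summed log template intensity at each arrival, for a trial delay."""
    idx = arrivals - shift
    ok = (idx >= 0) & (idx < t.size)
    return np.log(proton_waveform[idx[ok]]).sum()

shifts = np.arange(true_delay - 500, true_delay + 500)          # trial delays, ns
best = shifts[np.argmax([log_likelihood(s) for s in shifts])]
print(f"best-fit delay: {best} ns")   # recovers ~2,435,000 ns to within ~10 ns
```

With enough events, the statistical error on the delay shrinks well below the width of the burst – which is the trick OPERA relied on. Of course, that only works if the template shape and the clocks are trusted absolutely.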
It’s also important to note that this was not a race. The 730 kilometre straight-line pathway to OPERA is through the Earth’s crust – which is virtually transparent to neutrinos, but opaque to light. The travel time of light is hence inferred from measuring the path distance. So it was never the case that the neutrinos were seen to beat a light beam across the path distance.
The real problem with the OPERA experiment is that the calculated bettering of light speed is a very tiny margin measured over a relatively short path distance. If the experiment could be repeated by firing at a neutrino detector on the Moon, say, that longer path distance would deliver more robust and more convincing data – since, if the OPERA effect is real, the neutrinos should very obviously reach the Moon quicker than a light beam could.
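Again, the numbers are easy to run. Assuming the quoted ~25 parts-per-million speed excess were real, the expected lead over an Earth-Moon baseline would look like this (mean Earth-Moon distance; a sketch, not a mission proposal):

```python
# Expected neutrino lead time over an Earth-Moon baseline, if the
# OPERA fractional speed excess (~2.5e-5) were real.
C = 299_792_458.0            # speed of light, m/s
moon_m = 384_400_000.0       # mean Earth-Moon distance, metres
excess = 2.5e-5              # fractional speed excess implied by OPERA

light_time_s = moon_m / C                        # ~1.28 s
lead_s = light_time_s * excess / (1 + excess)    # t_light - t_neutrino
print(f"neutrino lead over light: {lead_s * 1e6:.0f} microseconds")  # ~32
```

A lead of roughly 30 microseconds would be vastly easier to separate from clock and distance errors than 60 nanoseconds.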
Until then, it all seems a bit premature to start throwing out the physics textbooks.
Hey. We’re all aware of Einstein’s theories and how gravity affects light. We know it was famously proved during a total solar eclipse – but it’s easy to forget that light can be bent by gravitational influences other than single stars. If it can happen with something as small as a star, then what might occur if you had a huge group of stars? Like a galaxy… Or a group of galaxies!
What’s new in the world of light? Astrophysicists at the Dark Cosmology Centre at the Niels Bohr Institute have now gone around the bend and come up with a method of measuring how outgoing light is affected by the gravity of galaxy clusters. Not only does each individual star and each individual galaxy possess its own gravity, but a galaxy cluster is held together by gravitational attraction as well. Sure, it stands to reason that gravity affects what we see – but there’s even more to it. Redshift…
“It is really wonderful. We live in an era with the technological ability to actually measure such phenomena as cosmological gravitational redshift,” says astrophysicist Radek Wojtak of the Dark Cosmology Centre at the Niels Bohr Institute, University of Copenhagen.
Together with team members Steen Hansen and Jens Hjorth, Wojtak has been collecting light data and measurements from 8,000 galaxy clusters, studying galaxies near the middle of their clusters as well as those that reside at the periphery.
“We could measure small differences in the redshift of the galaxies and see that the light from galaxies in the middle of a cluster had to ‘crawl’ out through the gravitational field, while it was easier for the light from the outlying galaxies to emerge”, explains Radek Wojtak.
The next step in the equation is to measure the entire galaxy cluster’s total mass to arrive at its gravitational potential. Then, using the general theory of relativity, the gravitational redshift can be determined as a function of each galaxy’s location within the cluster.
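The underlying relation is simple: light climbing out of a potential well is redshifted by roughly z ≈ GM/(Rc²). Here’s a minimal sketch with hypothetical round numbers for a rich cluster (the actual analysis stacks thousands of clusters and is far more careful):

```python
# Order-of-magnitude gravitational redshift for light leaving the centre
# of a galaxy cluster, using z ~ GM / (R c^2). Mass and radius are
# hypothetical round numbers, not values from the paper.
G = 6.674e-11                # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0            # speed of light, m/s
M_SUN = 1.989e30             # solar mass, kg

M = 1e14 * M_SUN             # assumed cluster mass
R = 3.086e22                 # ~1 megaparsec, in metres

z_grav = G * M / (R * C**2)
print(f"z_grav ~ {z_grav:.1e}  (~{z_grav * C / 1000:.1f} km/s)")  # ~1.4 km/s
```

A velocity offset of a kilometre or two per second is tiny next to the random motions of galaxies within a cluster, which is exactly why data from 8,000 clusters had to be combined to tease the signal out.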
“It turned out that the theoretical calculations of the gravitational redshift based on the general theory of relativity was in complete agreement with the astronomical observations,” explains Wojtak. “Our analysis of observations of galaxy clusters show that the redshift of the light is proportionally offset in relation to the gravitational influence from the galaxy cluster’s gravity. In that way our observations confirm the theory of relativity.”
Of course, this kind of revelation also has other implications… theoretical dark matter just might play a role in gravitational redshift, too. And don’t forget dark energy. All these hypothetical models need to be taken into account. But, for now, we’re looking at the big picture in a different way.
“Now the general theory of relativity has been tested on a cosmological scale and this confirms that the general theory of relativity works and that means that there is a strong indication for the presence of dark energy”, explains Radek Wojtak.
As Walt Whitman once said, “I open my scuttle at night and see the far-sprinkled systems, And all I see multiplied as high as I can cypher edge but the rim of the farther systems. Wider and wider they spread, expanding, always expanding, Outward and outward and forever outward.”
Nope. A standard candle isn’t the same red, green, blue, yellow and omnipresent pink wax stick that decorates your everyday birthday cake. Until now, a standard candle meant a Cepheid variable star – or, more recently, a Type Ia supernova. But something new happens almost every day in astronomy, doesn’t it? So start thinking about how an active galactic nucleus could be used to determine distance…
“Accurate distances to celestial objects are key to establishing the age and energy density of the Universe and the nature of dark energy.” says Darach Watson (et al). “A distance measure using active galactic nuclei (AGN) has been sought for more than forty years, as they are extremely luminous and can be observed at very large distances.”
So how is it done? As we know, active galactic nuclei are home to supermassive black holes which unleash powerful radiation. When this radiation ionizes nearby gas clouds, the clouds emit their own light signature. With both emissions within range of data-gathering telescopes, all that’s needed is a way to measure the time lag between a variation in the central source and the answering change in the clouds’ emission. The process is called reverberation mapping.
“We use the tight relationship between the luminosity of an AGN and the radius of its broad line region established via reverberation mapping to determine the luminosity distances to a sample of 38 AGN.” says Watson. “All reliable distance measures up to now have been limited to moderate redshift — AGN will, for the first time, allow distances to be estimated to z~4, where variations of dark energy and alternate gravity theories can be probed.”
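In outline, the method turns a time delay into a distance. Here’s a minimal sketch of that logic – note the calibration constant K below is a hypothetical placeholder, not a value from the paper; in practice the radius-luminosity relation is anchored to AGN with independently known distances:

```python
import math

C = 299_792_458.0                      # speed of light, m/s

def agn_luminosity_distance(lag_days, flux_w_m2, K=2.0e6):
    """Sketch of an AGN distance from reverberation mapping.

    lag_days   -- light-echo delay between continuum and broad-line response
    flux_w_m2  -- observed continuum flux, W/m^2
    K          -- hypothetical calibration of L = K * R^2, in W/m^2
    """
    R = C * lag_days * 86_400.0        # broad-line region radius, metres
    L = K * R**2                       # luminosity from the tight R-L relation
    return math.sqrt(L / (4 * math.pi * flux_w_m2))   # luminosity distance

d = agn_luminosity_distance(lag_days=100.0, flux_w_m2=1.0e-15)
print(f"d_L ~ {d / 3.086e22:.0f} Mpc")  # ~1000 Mpc for these toy inputs
```

The appeal is that the same object can be measured again and again, beating the lag uncertainty down over time – something a one-shot supernova can never offer.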
The team hasn’t taken their research “lightly”. It means careful calculations using known factors and repeating the results with other variables thrown into the mix. Even uncertainty…
“The scatter due to observational uncertainty can be reduced significantly. A major advantage held by AGN is that they can be observed repeatedly and the distance to any given object substantially refined.” explains Watson. “The ultimate limit of the accuracy of the method will rely on how the BLR (broad-line emitting region) responds to changes in the luminosity of the central source. The current tight radius-luminosity relationship indicates that the ionisation parameter and the gas density are both close to constant across our sample.”
With the first standard candle we discovered the Universe was expanding. With the second we learned it was accelerating. Now we’re looking back to just 750 million years after the Big Bang. What will tomorrow bring?
A few days ago, the physics world was turned upside down by the announcement of “faster than the speed of light”. The mighty neutrino has struck again, breaking the cosmic speed limit by traveling at a velocity 20 parts per million above light speed. To absolutely verify this occurrence, collaboration is needed from independent sources – and we’re here to give you the latest update.
“This result comes as a complete surprise,” said OPERA spokesperson, Antonio Ereditato of the University of Bern. “After many months of studies and cross checks we have not found any instrumental effect that could explain the result of the measurement. While OPERA researchers will continue their studies, we are also looking forward to independent measurements to fully assess the nature of this observation.”
Since the OPERA measurements go against everything we think we know, it’s more important than ever to verify the findings through independent research.
“When an experiment finds an apparently unbelievable result and can find no artifact of the measurement to account for it, it’s normal procedure to invite broader scrutiny, and this is exactly what the OPERA collaboration is doing, it’s good scientific practice,” said CERN Research Director Sergio Bertolucci. “If this measurement is confirmed, it might change our view of physics, but we need to be sure that there are no other, more mundane, explanations. That will require independent measurements.”
To get the job done, the OPERA Collaboration joined forces with CERN metrology experts and other facilities to establish absolute calibrations. The key parameters – the distance between source and detector, and the neutrinos’ flight time – had to be pinned down with essentially no room for error. In this case, the distance between the origin of the neutrino beam and OPERA was measured with an uncertainty of 20 cm over the 730 km baseline, and the neutrinos’ flight time was determined to an accuracy of better than 10 nanoseconds using high-precision GPS equipment and an atomic clock. Every care was taken to ensure precision.
“We have established synchronization between CERN and Gran Sasso that gives us nanosecond accuracy, and we’ve measured the distance between the two sites to 20 centimetres,” said Dario Autiero, the CNRS researcher who will give this afternoon’s seminar. “Although our measurements have low systematic uncertainty and high statistical accuracy, and we place great confidence in our results, we’re looking forward to comparing them with those from other experiments.”
“The potential impact on science is too large to draw immediate conclusions or attempt physics interpretations. My first reaction is that the neutrino is still surprising us with its mysteries,” said Ereditato. “Today’s seminar is intended to invite scrutiny from the broader particle physics community.”
It’s been a tenet of modern physics for over a century. The speed of light is an unwavering and unbreakable barrier, at least for any form of matter or energy we know of. Nothing in our Universe can travel faster than 299,792 km/s (186,282 miles per second), not even – as the term implies – light itself. It’s the universal constant, the “c” in Einstein’s E = mc², a cosmic speed limit that can’t be broken.
That is, until now.
An international team of scientists at the Gran Sasso research facility outside of Rome announced today that they have clocked neutrinos traveling faster than the speed of light. The neutrinos, subatomic particles with very little mass, arrived in beams emitted from CERN, 730 km (454 miles) away in Switzerland. Over a period of three years, 15,000 bursts of neutrinos were fired from CERN at special detectors located deep underground at Gran Sasso. Where light would have made the trip in 2.4 thousandths of a second, the neutrinos made it there 60 nanoseconds faster – that’s 60 billionths of a second – a tiny difference to us but a huge one to particle physicists!
The implications of such a discovery are staggering, as it would effectively undermine Einstein’s theory of relativity and force a rewrite of the Standard Model of physics.
“We are shocked,” said project spokesman and University of Bern physicist Antonio Ereditato.
“We have high confidence in our results. We have checked and rechecked for anything that could have distorted our measurements but we found nothing. We now want colleagues to check them independently.”
Neutrinos are created naturally from the decay of radioactive materials and from reactions that occur inside stars. They are constantly zipping through space and can pass through solid material easily, with little discernible effect… as you’ve been reading this, billions of neutrinos have already passed through you!
The experiment, called OPERA (Oscillation Project with Emulsion-tRacking Apparatus), is located in Italy’s Gran Sasso facility 1,400 meters (4,593 feet) underground and uses a complex array of electronics and photographic plates to detect the particle beams. Its subterranean location helps shield the experiment from other sources of radiation, such as cosmic rays. Over 750 scientists from 22 countries around the world work there.
Ereditato is confident in the results as they have been consistently measured in over 16,000 events over the past two years. Still, other experiments are being planned elsewhere in an attempt to confirm these remarkable findings. If they are confirmed, we may be looking at a literal breakdown of the modern rules of physics as we know them!
A preprint of the OPERA results will be posted on the physics website arXiv.org.
Well, we’re off to see the Wizard again, my friends. This time it’s to explore the possibilities of primordial black holes colliding with stars and all the implications therein. If this theory is correct, then we should be able to observe the effects of dark matter first-hand – proof that it really does exist – and gain a deeper understanding of the very core of the Universe.
Are primordial black holes candidates for dark matter? Postdoctoral researchers Shravan Hanasoge of Princeton’s Department of Geosciences and Michael Kesden of NYU’s Center for Cosmology and Particle Physics have used computer modeling to visualize a primordial black hole passing through a star. “Stars are transparent to the passage of primordial black holes (PBHs) and serve as seismic detectors for such objects,” says Kesden. “The gravitational field of a PBH squeezes a star and causes it to ring acoustically.”
If primordial black holes do exist, then chances are good that these types of collisions happen within our own galaxy – and frequently. With ever more telescopes and satellites observing the stellar neighborhood, it stands to reason that sooner or later we’re going to catch one of these events. But the most important thing is simply understanding what we’re looking for. The computer model developed by Hanasoge and Kesden can be used with current solar-observation techniques to offer a more precise method for detecting primordial black holes than existing tools.
“If astronomers were just looking at the Sun, the chances of observing a primordial black hole are not likely, but people are now looking at thousands of stars,” Hanasoge said. “There’s a larger question of what constitutes dark matter, and if a primordial black hole were found it would fit all the parameters – they have mass and force so they directly influence other objects in the Universe, and they don’t interact with light. Identifying one would have profound implications for our understanding of the early Universe and dark matter.”
Sure, we haven’t seen dark matter directly. But what we can do is study galaxies hypothesized to have extended dark-matter halos, and watch the effects their gravity has on visible material – like gaseous regions and stellar members. If these new models are correct, primordial black holes should be heavier than other proposed dark-matter particles, and when they collide with a star they should cause a detectable rippling effect.
“If you imagine poking a water balloon and watching the water ripple inside, that’s similar to how a star’s surface appears,” Kesden said. “By looking at how a star’s surface moves, you can figure out what’s going on inside. If a black hole goes through, you can see the surface vibrate.”
Using the Sun as a model, Kesden and Hanasoge calculated the effects a PBH might have and then gave the data to NASA’s Tim Sandstrom. Using the Pleiades supercomputer at the agency’s Ames Research Center in California, the team was then able to create a video simulation of the collisional effect. Below is the clip which shows the vibrations of the Sun’s surface as a primordial black hole — represented by a white trail — passes through its interior.
“It’s been known that as a primordial black hole went by a star, it would have an effect, but this is the first time we have calculations that are numerically precise,” comments Marc Kamionkowski, a professor of physics and astronomy at Johns Hopkins University. “This is a clever idea that takes advantage of observations and measurements already made by solar physics. It’s like someone calling you to say there might be a million dollars under your front doormat. If it turns out to not be true, it cost you nothing to look. In this case, there might be dark matter in the data sets astronomers already have, so why not look?”
Radioactive decay – a random process, right? Well, according to some, maybe not. For several years now a team of physicists from Purdue and Stanford has reviewed isotope decay data across a range of different isotopes and detectors, seeing a non-random pattern and searching for a reason. And now, after eliminating all other causes, the team is ready to declare that the cause is… extraterrestrial.
OK, so it’s suggested to just be the Sun – but cool finding, huh? Well… maybe it’s best to first put on your skeptical goggles before reading through anyone’s claim of discovering new physics.
Now, it’s claimed that there is a certain periodicity to the allegedly variable radioactive decay rates. An annual periodicity suggests a link to the varying distance between the Earth and the Sun over the Earth’s elliptical orbit, while other overlying patterns of periodicity may link to the production of large solar flares and to the 11-year (or 22-year, if you prefer) solar cycle.
However, the alleged variations in decay rates are proportionally tiny, and a good number of critics remain, citing disconfirming evidence against this somewhat radical idea. So before drawing any conclusions here, maybe we need to first consider what exactly good science is:
• Replication – a different laboratory or observatory can collect the same data that you claim to have collected.
• A signal stronger than noise – there is a discrete trend within your data that is statistically distinguishable from the random noise also present in your data.
• A plausible mechanism – for example, if the rate of radioactive decay seems to correlate with the position and magnetic activity of the Sun – why is this so?
• A testable hypothesis – the plausible mechanism proposed should allow you to predict when, or under what circumstances, the effect can be expected to occur again.
The proponents of variable radioactive decay appeal to a range of data sources to meet the replication criterion, but independent groups equally appeal to other data sources which are not consistent with variable radioactive decay. So, there’s still a question mark here – at least until more confirming data comes in, to overwhelm any persisting disconfirming data.
Whether there is a signal stronger than noise is probably the key point of debate. The alleged periodic variations in radioactive decay are proportionally tiny, and it’s not clear whether a compellingly clear signal has been demonstrated.
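The standard tool for hunting a period in unevenly sampled measurements like these is a Lomb-Scargle periodogram. Here’s a sketch on entirely synthetic data – noise plus a tiny annual term – just to illustrate the kind of test at issue (this is not the actual decay data):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)

# ~10 years of irregularly spaced decay-rate residuals: a 0.1% annual
# modulation buried in noise of comparable size (all synthetic).
days = np.sort(rng.uniform(0, 3650, 1000))
annual = 1e-3 * np.sin(2 * np.pi * days / 365.25)
residuals = annual + rng.normal(0, 1e-3, days.size)

periods = np.linspace(100, 700, 2000)      # trial periods, in days
freqs = 2 * np.pi / periods                # lombscargle wants angular frequency
power = lombscargle(days, residuals - residuals.mean(), freqs, normalize=True)

best = periods[np.argmax(power)]
print(f"strongest period: {best:.0f} days, normalized power {power.max():.2f}")
```

The argument then comes down to whether the real data’s peak at one year stands significantly above what noise alone would produce – which is precisely what the critics dispute.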
An accompanying paper outlines the team’s proposed mechanism – although this is not immediately compelling either. They appeal to neutrinos, which are certainly produced in abundance by the Sun, but actually propose a hypothetical form that they call ‘neutrellos’, which necessarily interact with atomic nuclei more strongly than neutrinos are considered to do. This creates a bit of a circular argument – because we think there is an effect currently unknown to science, we propose that it is caused by a particle currently unknown to science.
So, in the context of having allegedly found a periodic variability in radioactive decay, what the proponents need to do is make a prediction – that sometime next year, say at a particular latitude in the northern hemisphere, the radioactive decay of isotope x will measurably alter by amount z compared with an equivalent measurement made, say, six months earlier. And maybe they could collect some neutrellos too.
If that all works out, they could start checking the flight times to Sweden. But one assumes that it won’t be quite that easy.
Surprisingly, rumors still persist in some corners of the Internet that the Large Hadron Collider (LHC) is going to destroy the Earth – even though nearly three years have passed since it was first turned on. This may be because it is not due to be ramped up to full power until 2014 – although it seems more likely that this is just a case of moving the goal posts, since the same doomsayers were initially adamant that the Earth would be destroyed the moment the LHC was switched on, in September 2008.
The story goes that the very high energy collisions engineered by the LHC could jam colliding particles together with such force that their mass would be compressed into a volume less than the Schwarzschild radius required for that mass. In other words, a microscopic black hole would form and then grow in size as it sucked in more matter, until it eventually consumed the Earth.
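It’s worth putting a number on just how “microscopic” this would be. Under ordinary general relativity, the Schwarzschild radius corresponding to the LHC’s entire 14 TeV collision energy sits dozens of orders of magnitude below the Planck length – which is why the scenario has to lean on exotic extra-dimension physics in the first place. A quick sketch:

```python
# Schwarzschild radius for the LHC's full design collision energy (~14 TeV),
# assuming ordinary general relativity (no large extra dimensions).
G = 6.674e-11                 # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0             # speed of light, m/s
EV = 1.602e-19                # joules per electronvolt

E = 14e12 * EV                # 14 TeV in joules
m = E / C**2                  # mass equivalent, ~2.5e-23 kg
r_s = 2 * G * m / C**2
print(f"r_s ~ {r_s:.1e} m")   # ~3.7e-50 m; the Planck length is ~1.6e-35 m
```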
Here’s a brief run-through of why this can’t happen.
1. Microscopic black holes are implausible.
While a teaspoon of neutron star material might weigh several million tons, if you could extract a teaspoon of that material from the neutron star, it would immediately blow out to the volume you might expect several million tons of mass to usually occupy.
Notwithstanding that you can’t physically extract a teaspoon of black hole material from a black hole – if you could, it is reasonable to expect that it would also instantly expand. You can’t maintain these extreme matter densities outside of a region of extreme gravitational compression, which is created by the proper mass of a stellar-scale object.
The hypothetical physics that might allow for the creation of microscopic black holes (large extra dimensions) proposes that gravity gains more force in near-Planck scale dimensions. There is no hard evidence to support this theory – indeed there is a growing level of disconfirming evidence arising from various sources, including the LHC.
High energy particle collisions involve converting momentum energy into heat energy, as well as overcoming the electromagnetic repulsion that normally prevents charged particles from colliding. But the heat energy produced quickly dissipates and the collided particles fragment into sub-atomic shrapnel, rather than fusing together. Particle colliders attempt to mimic conditions similar to the Big Bang, not the insides of massive stars.
2. A hypothetical microscopic black hole couldn’t devour the Earth anyway.
Although whatever goes on inside the event horizon of a black hole is a bit mysterious and unknowable – physics still operates in a conventional fashion outside. The gravitational influence exerted by the mass of a black hole falls away by the inverse square of the distance from it, just like it does for any other celestial body.
The gravitational influence exerted by a microscopic black hole composed of, let’s say, 1000 hyper-compressed protons would be laughably small from a distance of more than its Schwarzschild radius (maybe 10⁻¹⁸ metres). And it would be unable to consume more matter unless it could overcome the forces that hold other matter together – remembering that, of the fundamental forces, gravity is by far the weakest.
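To see just how laughably small, compare the gravitational pull of such a 1000-proton black hole on a nearby proton with the ordinary electrostatic attraction between a proton and an electron at the same atomic-scale separation (illustrative numbers only):

```python
# Gravity vs electromagnetism for a hypothetical 1000-proton black hole.
G = 6.674e-11                 # gravitational constant, m^3 kg^-1 s^-2
K_E = 8.988e9                 # Coulomb constant, N m^2 C^-2
M_P = 1.673e-27               # proton mass, kg
Q_E = 1.602e-19               # elementary charge, C

m_bh = 1000 * M_P             # the hypothetical microscopic black hole
r = 5.3e-11                   # one Bohr radius, metres

f_grav = G * m_bh * M_P / r**2          # its pull on a proton at range r
f_em = K_E * Q_E**2 / r**2              # electron-proton attraction at r
print(f"gravity {f_grav:.1e} N vs electromagnetism {f_em:.1e} N")
print(f"ratio ~ {f_grav / f_em:.0e}")   # ~1e-33: gravity loses, spectacularly
```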
It’s been calculated that if the Earth had the density of solid iron, a hypothetical microscopic black hole in linear motion would be unlikely to encounter an atomic nucleus more than once every 200 kilometres – and if it did, it would encounter a nucleus at least 1,000 times larger in diameter than itself.
So the black hole couldn’t hope to swallow the whole nucleus in one go and, at best, it might chomp a bit off the nucleus in passing – somehow overcoming the strong nuclear force in so doing. The microscopic black hole might have 100 such encounters before its momentum carried it all the way through the Earth and out the other side, at which point it would probably still be a good order of magnitude smaller in size than an uncompressed proton.
And that still leaves the key issue of charge out of the picture. If you could jam multiple positively-charged protons together into such a tiny volume, the resultant object should explode, since the electromagnetic force far outweighs the gravitational force at this scale. You might get around this if an exactly equivalent number of electrons were also added in, but this requires appealing to an implausible level of fine-tuning.
3. What the doomsayers say
When challenged with the standard argument that higher-than-LHC energy collisions occur naturally and frequently as cosmic ray particles collide with Earth’s upper atmosphere, LHC conspiracy theorists refer to the high school physics lesson that two cars colliding head-on is a more energetic event than one car colliding with a brick wall. This is true, to the extent that the two-car collision has twice the kinetic energy of the one-car collision. However, even comparing like with like, the most energetic cosmic ray collisions with the atmosphere have been measured as delivering around 50 times the centre-of-mass energy that will ever be generated by LHC collisions.
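The fair comparison is centre-of-mass energy, and it can be sketched in a few lines. For a cosmic ray proton hitting a stationary proton in the atmosphere, the collision energy grows only as the square root of the beam energy – yet the most energetic cosmic rays ever recorded (around 3×10²⁰ eV) still comfortably outgun the LHC:

```python
import math

# Centre-of-mass energy, sqrt(s), for a beam proton hitting a stationary
# proton: sqrt(s) ~ sqrt(2 * E_beam * m_p c^2) in the ultra-relativistic limit.
M_P_GEV = 0.938                          # proton rest energy, GeV

def cm_energy_fixed_target_gev(e_beam_gev):
    return math.sqrt(2 * e_beam_gev * M_P_GEV)

lhc_sqrt_s_gev = 14_000.0                # LHC design: 14 TeV, head-on
cosmic_ray_gev = 3e11                    # ~3e20 eV, near the highest observed

sqrt_s = cm_energy_fixed_target_gev(cosmic_ray_gev)
print(f"cosmic ray sqrt(s): {sqrt_s / 1e3:.0f} TeV")        # ~750 TeV
print(f"ratio to LHC: {sqrt_s / lhc_sqrt_s_gev:.0f}x")      # ~54x
```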
In response to the argument that a microscopic black hole would pass through the Earth before it could achieve any appreciable mass gain, LHC conspiracy theorists propose that an LHC collision would bring the combined particles to a dead stop and they would then fall passively towards the centre of the Earth with insufficient momentum to carry them out the other side.
This is also implausible. The slightest degree of transverse momentum imparted to the fragments of a head-on collision between two particles travelling at nearly 300,000 kilometres a second will easily give those fragments escape velocity from the Earth (which is only 11.2 kilometres a second at sea level).