Time Reborn: From the Crisis of Physics to the Future of the Universe is one of those books intended to provoke discussion. Right from the first pages, author Lee Smolin, an American theoretical physicist based in Canada who also teaches philosophy, puts forward a position: time is real, not an illusion of human experience (as some other physicists argue).
Smolin, in fact, uses that concept of time as a basis for human free will. If time is real, he writes, this is the result: “Novelty is real. We can create, with our imagination, outcomes not computable from knowledge of the present.”
Physics as philosophy. A powerful statement to make in the opening parts of the book. The only challenge is understanding the rest of it.
Smolin advertises his book as open to the general reader with no background in physics or mathematics, promising that there aren’t even equations to worry about. He also breaks up the involved explanations with wry observations about fatherhood, or with anecdotes from his past.
It works, but you need to be patient. Theoretical physics is so far outside the everyday that at times it took me (with an education focused on journalism and space policy, admittedly) two or three readings of the same passage to understand what was going on.
But as I took my time, a whole world opened up to me.
I found myself understanding more about Einstein’s special and general relativity than I did in readings during high school and university. The book also made me think differently about cosmology (the nature of the universe), especially in relation to biological laws.
While the book is enjoyable, it is probably best not read in isolation, as it is a positional one: a book that gathers information scientifically and analytically, to be sure, but one that does not take a neutral point of view in its conclusions.
We’d recommend also picking up other books, such as the classic A Brief History of Time (by physicist Stephen Hawking), to learn more about the universe and how other scientists think time works.
Balloon-based research on cosmic particles that began over a century ago will get a big boost next year, all the way up to low-Earth orbit. NASA’s Cosmic Ray Energetics and Mass (CREAM) instrument will be sent to the International Space Station, becoming (are you ready for this?) ISS-CREAM. It is specifically designed to detect super-high-energy cosmic rays and to help scientists determine what their mysterious source, or sources, may be.
“The answer is one the world’s been waiting on for 100 years,” said program scientist Vernon Jones.
Read more about this “cool” experiment below:
Cosmic Ray Energetics and Mass (CREAM) will be the first cosmic-ray instrument designed to detect particles at such high energies over such an extended duration in space. Scientists hope to discover whether cosmic rays are accelerated by a single cause, believed to be supernovae. The new research also could determine why fewer cosmic rays are detected at very high energies than are theorized to exist.
“Cosmic rays are energetic particles from outer space,” said Eun-Suk Seo, principal investigator for the CREAM study. “They provide a direct sample of matter from outside the solar system. Measurements have shown that these particles can have energies as high as 100,000 trillion electron volts. This is an enormous energy, far beyond and above any energy that can be generated with manmade accelerators, even the Large Hadron Collider at CERN.”
Researchers also plan to study the decline in cosmic-ray detections, called the spectral “knee,” which occurs at about a thousand trillion electron volts (eV), roughly 2 billion times more energetic than the emissions in a medical nuclear imaging scan. Whatever causes cosmic rays, or filters them as they move through the galaxy, takes a bite out of the population from 1,000 trillion electron volts upwards. Further, the cosmic-ray spectrum extends far beyond what supernovae are believed to be able to produce.
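To put those numbers side by side, here’s a quick back-of-envelope check (a minimal sketch; the 511 keV medical-scan photon and the order-of-10-TeV LHC collision energy are round values I’m assuming, not figures from the CREAM team):

```python
# Back-of-envelope check of the energy comparisons quoted above.
knee_energy_ev   = 1e15      # spectral "knee": about a thousand trillion eV
highest_ray_ev   = 1e20      # "100,000 trillion electron volts"
pet_photon_ev    = 511e3     # gamma ray from a medical PET scan, 511 keV (assumed)
lhc_collision_ev = 1e13      # order of the LHC's ~10 TeV collisions (assumed)

print(f"knee vs. medical scan: {knee_energy_ev / pet_photon_ev:.1e}x")     # ~2e9
print(f"top cosmic ray vs. LHC: {highest_ray_ev / lhc_collision_ev:.0e}x") # ~1e7
```

The first ratio reproduces the article’s “2 billion times” figure; the second shows why no manmade accelerator comes close.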
To tackle these questions, NASA plans to place CREAM aboard the space station as ISS-CREAM. The instrument has already flown six times, for a total of 161 days, on long-duration balloons circling the South Pole, where Earth’s magnetic field lines are essentially vertical.
Energetic particles coming from space were unknown in 1911, when Victor Hess, the 1936 Nobel laureate in physics credited with the discovery of cosmic rays, took to the air to tackle the mystery of why the air became more electrified, or ionized, with altitude. The expectation was that the ionization would weaken as one got farther from Earth. Hess developed sensitive instruments and took them as high as 3.3 miles (5.3 kilometers), establishing that ionization increased up to fourfold with altitude, day or night.
The phenomenon soon gained a popular but confusing name, cosmic rays, from a mistaken theory that they were X-rays or gamma rays, which are electromagnetic radiation, like light. Instead, cosmic rays are high-speed, high-energy particles of matter.
A better understanding of cosmic rays will help scientists finish the work started when Hess unexpectedly turned an earthly question into a stellar riddle. Answering that riddle will help us understand a hidden, fundamental facet of how our galaxy, and perhaps the universe, is built and works.
As particles, cosmic rays cannot be focused the way a telescope focuses light. Instead, researchers detect them by the light and electrical charges produced when the particles slam into matter. Scientists then use detective work to identify the original particle, measuring its electric charge directly and determining its energy from the avalanche of debris particles that create their own overlapping trails.
CREAM does this detective work using an ionization calorimeter designed to make cosmic rays shed their energies. Layers of carbon, tungsten and other materials present well-known nuclear “cross sections” within the stack, while electrical and optical detectors measure the intensity of events as cosmic particles, from hydrogen to iron, crash through the instrument.
Even though CREAM balloon flights reached high altitudes, enough atmosphere remained above to interfere with measurements. The plan to mount the instrument to the exterior of the space station will place it above the obscuring effects of the atmosphere, at an altitude of 250 miles (400 kilometers).
“On what can we now place our hopes of solving the many riddles which still exist as to the origin and composition of cosmic rays?” Hess asked in his 1936 Nobel lecture. ISS-CREAM may finally help answer him.
Cold fusion has been called one of the greatest scientific breakthroughs that may never happen. On the surface, it seems simple: a room-temperature reaction occurring under normal pressure. But it is a nuclear reaction, and figuring it out and getting it to work has not been simple, and any success in this area could ultimately, and seriously, change the world. Despite various claims of victory over the years since the 1920s, none has been replicated consistently and reliably.
But there’s buzz this week of a cold fusion experiment that has been replicated, twice. The tests have reportedly produced excess heat with roughly 10,000 times the energy density and 1,000 times the power density of gasoline.
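Taking that claim at face value, some rough arithmetic shows why it would be so remarkable (a minimal sketch; the gasoline and hydrogen energy densities are textbook round numbers I’m assuming, not values from the tests):

```python
# Why "10,000x gasoline" would rule out chemistry: rough, assumed round numbers.
GASOLINE_J_PER_KG = 46e6    # ~46 MJ/kg, typical for gasoline
HYDROGEN_J_PER_KG = 142e6   # ~142 MJ/kg, about the best any chemical fuel offers

claimed = 10_000 * GASOLINE_J_PER_KG    # energy density implied by the claim
print(f"claimed density: {claimed:.1e} J/kg")                      # ~4.6e11 J/kg
print(f"vs. best chemical fuel: ~{claimed / HYDROGEN_J_PER_KG:.0f}x higher")
```

Any known chemical process falls about three orders of magnitude short of the claimed density, which is why proponents argue that the effect, if real, would have to be nuclear.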
The names involved are familiar in cold fusion circles: Italian entrepreneur Andrea Rossi has been claiming for several years that his E-Cat device produces heat through a process called a Low Energy Nuclear Reaction (LENR), putting out more energy than goes in. In the past, Rossi didn’t allow anyone to verify the device independently, claiming it was an “industrial trade secret.”
But a new paper published on arXiv last week says that seven independent scientists have performed tests of two E-Cat prototypes under controlled conditions, using high-precision instrumentation. Although the authors of the paper wrote that they weren’t allowed to see what was going on inside the sealed steel cylinder reactor, they did state, “Even by the most conservative assumptions as to the errors in the measurements, the result is still one order of magnitude greater than conventional energy sources.”
The team did two tests:
The first test experiment, lasting 96 hours (from Dec. 13th 2012, to Dec. 17th 2012), was carried out by the two first authors of this paper, Levi and Foschi, while the second experiment, lasting for 116 hours (from March 18th 2013, to March 23rd 2013), was carried out by all authors.
Previously, Rossi and his colleague Sergio Focardi have said their device works by infusing hydrogen into nickel, transmuting the nickel into copper and releasing a large amount of heat.
As expected, the paper – which is not peer-reviewed – and Rossi’s work have both been met with lots of skepticism.
Steven Krivit, writing in the New Energy Times, said that the paper by Levi, Foschi et al. doesn’t describe an independent test at all, and that the authors were merely witnesses to a Rossi demonstration.
The folks at the Martin Fleischmann Memorial Project website, a group that facilitates the widespread replication and validation of claims like LENR in an open and scientific manner, say they have an overall positive impression of the paper by Levi and Foschi.
“Our preliminary assessment among the team is that it is a generally good report with no obvious errors or glaring omissions,” they wrote on their website. “It is easily the best evidence to date that Rossi has a working technology, and, if verified openly and widely, this report could be remembered as historic.”
But they also don’t have total confidence in the paper. “It is unfortunate that there are some justified concerns about the independence of the test team, since many of the authors are names that we have seen before in the context of Rossi.” They are also disappointed that none of the authors of the Levi and Foschi paper are willing to present their findings at an upcoming conference.
They also have several other technical questions and criticisms, as do many others.
It’s too soon to say if this latest buzz about cold fusion will amount to anything. Only time and more tests and scrutiny will reveal whether this is anything to get excited about.
What goes up must always come down, right? Well, the European Laboratory for Particle Physics (CERN) wants to test if that principle applies to antimatter.
Antimatter, most simply speaking, is a mirror image of matter. The concept behind it is that every particle of matter has an opposite counterpart, an antiparticle. For example, since the electron is negatively charged, the antielectron (or positron) is positively charged.
This sounds like science fiction, but as NASA says, it is “real stuff.” In past experiments, CERN’s particle accelerator has created antiprotons, positrons and even antihydrogen. Properly harnessed, antimatter could be used for applications ranging from rocketry to medicine, NASA added. But we’ll need to figure out its nature first.
When it comes to sheer wattage, blazars definitely rule. The brightest of the active galactic nuclei, these sources of extreme high-energy gamma rays are usually associated with relativistic jets of material spewing into space, powered by matter falling toward the host galaxy’s central black hole. The farther away they are, the dimmer they should be, right? Not necessarily. According to new observations of blazar PKS 1424+240, its emission spectrum might hold a new twist… one that can’t be readily explained.
David Williams, adjunct professor of physics at UC Santa Cruz, said the findings may indicate something new about the emission mechanisms of blazars, the extragalactic background light, or the propagation of gamma-ray photons over long distances. “There may be something going on in the emission mechanisms of the blazar that we don’t understand,” Williams said. “There are more exotic explanations as well, but it may be premature to speculate at this point.”
The Fermi Gamma-ray Space Telescope was the first instrument to detect gamma rays from PKS 1424+240, and the observation was later confirmed by VERITAS (Very Energetic Radiation Imaging Telescope Array System), a ground-based instrument sensitive to gamma rays in the very-high-energy (VHE) band. These weren’t the only science gadgets in action, however. To help determine the redshift of the blazar, researchers also employed the Hubble Space Telescope’s Cosmic Origins Spectrograph.
To make sense of what they were seeing, the team then set a lower limit on the blazar’s redshift, placing it at a distance of at least 7.4 billion light-years. If that estimate is correct, such a huge distance would mean that the majority of the gamma rays should have been absorbed by the extragalactic background light, but again the answers didn’t add up. For that amount of absorption, the blazar itself would have to be producing a very unexpected emission spectrum.
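For the curious, here’s roughly how a redshift limit becomes a distance. This minimal sketch assumes round-number cosmological parameters along with the paper’s published redshift lower limit of z ≥ 0.6035, so it only lands in the ballpark of the quoted figure:

```python
# Comoving distance for a redshift lower limit, assuming flat Lambda-CDM with
# round-number parameters (H0 = 70 km/s/Mpc, Om = 0.3, OL = 0.7).
from math import sqrt
from scipy.integrate import quad

H0, OM, OL = 70.0, 0.3, 0.7   # Hubble constant; matter and dark-energy densities
C_KM_S = 299_792.458          # speed of light, km/s
Z_MIN = 0.6035                # published redshift lower limit for PKS 1424+240

def inv_E(z: float) -> float:
    """Inverse dimensionless expansion rate 1/E(z) for flat Lambda-CDM."""
    return 1.0 / sqrt(OM * (1 + z)**3 + OL)

integral, _ = quad(inv_E, 0.0, Z_MIN)
d_mpc = (C_KM_S / H0) * integral        # comoving distance in megaparsecs
d_gly = d_mpc * 3.262e6 / 1e9           # 1 Mpc ~ 3.262 million light-years
print(f"comoving distance >= {d_gly:.1f} billion light-years")   # ~7.2
```

With slightly different cosmological parameters, the same integral shifts toward the 7.4 billion light-years quoted above.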
“We’re seeing an extraordinarily bright source which does not display the characteristic emission expected from a very high-energy blazar,” said Amy Furniss, a graduate student at the Santa Cruz Institute for Particle Physics (SCIPP) at UCSC and first author of a paper describing the new findings.
Bright? You bet. In this circumstance it has to shine through the ever-present extragalactic background light (EBL). The whole Universe is filled with this “stellar light pollution”. We know it’s there, produced by countless stars and galaxies, but it’s hard to measure. What we do know is that when a high-energy gamma-ray photon meets a low-energy EBL photon, the two can annihilate, converting into an electron-positron pair and effectively canceling each other out. It stands to reason that the farther a gamma ray has to travel, the more likely it is to encounter the EBL, putting a limit on the distance to which we can detect high-energy gamma-ray sources. Using the new lower limit on the blazar’s distance, the team calculated the expected absorption of very high-energy gamma rays from PKS 1424+240. This should have allowed Furniss’ team to recover an intrinsic gamma-ray emission spectrum for the most distant blazar yet captured, but all it did was confuse the issue: the spectrum simply doesn’t coincide with the emissions expected under current models.
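The pair-production threshold itself is simple enough to sketch: a gamma ray and an EBL photon can only convert into an electron-positron pair if the product of their energies exceeds the square of the electron’s rest energy (for a head-on collision). The 500 GeV example below is my own choice, picked to represent the VHE band:

```python
# Head-on pair-production threshold: E_gamma * E_ebl >= (m_e c^2)^2.
M_E_C2_EV = 0.511e6                     # electron rest energy, eV

def ebl_threshold_ev(gamma_energy_ev: float) -> float:
    """Minimum EBL photon energy that can pair-produce with the gamma ray."""
    return M_E_C2_EV**2 / gamma_energy_ev

e_gamma = 500e9                         # a 500 GeV gamma ray (assumed example)
e_ebl = ebl_threshold_ev(e_gamma)
print(f"threshold EBL photon: {e_ebl:.2f} eV")        # ~0.5 eV
print(f"wavelength: {1.24e-6 / e_ebl * 1e6:.1f} um")  # lambda = hc/E, ~2.4 um
```

In other words, the very-high-energy gamma rays VERITAS sees are absorbed mainly by the EBL’s infrared photons, exactly the component that is hardest to measure directly.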
“We’re finding very high-energy gamma-ray sources at greater distances than we thought we might, and in doing so we’re finding some things we don’t entirely understand,” Williams said. “Having a source at this distance will allow us to better understand how much background absorption there is and test the cosmological models that predict the extragalactic background light.”
Dark matter: it’s invisible, it’s elusive, it’s controversial… and it’s everywhere — in the Universe, yes, but especially in the world of astrophysics, where researchers have been exhaustively trying to reveal its true identity for decades.
Now, scientists with the international Super Cryogenic Dark Matter Search (SuperCDMS) experiment are reporting a signal that could be the first detection of a particle thought to make up dark matter: a weakly-interacting massive particle, or WIMP. According to a press release from Texas A&M University (whose high-energy physicist Rupak Mahapatra is a principal investigator in the experiment), SuperCDMS has identified a WIMP-like signal at the 3-sigma level, which indicates a 99.8 percent chance of an actual discovery, a “concrete hint,” as it’s being called.
“In high-energy physics, a discovery is only claimed at 5-sigma or better,” Mahapatra said. “So this is certainly very exciting, but not fully convincing by the standards. We just need more data to be sure. For now, we have to live with this tantalizing hint of one of the biggest puzzles of our time.”
If this is indeed a WIMP it will be the first time such a particle has been directly observed, lending more insight into what dark matter is… or isn’t.
Notoriously elusive, WIMPs rarely interact with normal matter and are therefore difficult to detect. Scientists believe they occasionally bounce off atomic nuclei, scattering like billiard balls and leaving behind a small amount of energy that can be tracked by detectors deep underground, by particle colliders such as the Large Hadron Collider at CERN, and even by instruments in space like the Alpha Magnetic Spectrometer (AMS) mounted on the International Space Station.
The CDMS experiment, located a half-mile underground in the Soudan mine in northern Minnesota and managed by the United States Department of Energy’s Fermi National Accelerator Laboratory, has been searching for dark matter since 2003. The experiment uses sophisticated detector technology and advanced analysis techniques to enable germanium and silicon targets, cryogenically cooled to nearly absolute zero (about -460 degrees Fahrenheit), to register the rare recoil of a dark matter particle.
This newly-announced detection actually comes from data acquired during an earlier phase of the experiment.
“This result is from data taken a few years ago using silicon detectors manufactured at Stanford that are now defunct,” Mahapatra said. “Increased interest in the low mass WIMP region motivated us to complete the analysis of the silicon-detector exposure, which is less sensitive than germanium for WIMP masses above 15 giga-electronvolts [one GeV is equal to a billion electron volts] but more sensitive for lower masses. The analysis resulted in three events, and the estimated background is 0.7 events.”
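As a toy illustration of why three events over an expected background of 0.7 is interesting but not conclusive, a naive Poisson counting estimate looks like this (my sketch only; the collaboration’s quoted significance comes from a full likelihood analysis of the detector data, not from simple counting):

```python
# Chance of seeing 3 or more events if only background (mean 0.7) were present.
from scipy.stats import poisson

background = 0.7   # expected background events in the silicon exposure
observed = 3       # events actually seen

p_value = poisson.sf(observed - 1, background)   # P(N >= 3 | mean = 0.7)
print(f"background-only probability: {p_value:.3f}")   # ~0.034, i.e. ~3%
```

A few-percent fluke probability is exactly the kind of result that earns the label “hint” rather than “discovery.”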
Although Mahapatra says the result is certainly encouraging and worthy of further investigation, he cautions it should not be considered a discovery just yet.
“We are only 99.8 percent sure, and we want to be 99.9999 percent sure,” Mahapatra said. “At 3-sigma, you have a hint of something. At 4-sigma, you have evidence. At 5-sigma, you have a discovery.”
“In medicine, you can say you are curing 99.8 percent of the cases, and that’s OK. When you say you’ve made a fundamental discovery in high-energy physics, you can’t be wrong.”
– Dr. Rupak Mahapatra, SuperCDMS principal investigator, Texas A&M University
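For readers wondering how “sigma” maps onto those percentages, the conversion is just a Gaussian tail probability. The one-sided convention in this minimal sketch reproduces the quoted 99.8 percent (my illustration, not the collaboration’s statistics code):

```python
# Convert significance in sigma to a one-sided Gaussian confidence level.
from scipy.stats import norm

for sigma in (3, 4, 5):
    confidence = norm.cdf(sigma)     # probability of staying below +sigma
    print(f"{sigma}-sigma: {confidence:.5%} (one-sided)")
# 3-sigma -> 99.865% ("99.8 percent sure"); 5-sigma -> 99.99997%
```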
The collaboration will continue to probe this WIMP sector using the SuperCDMS Soudan experiment’s operating germanium detectors, and is considering using larger, more advanced 6-inch silicon detectors developed at Texas A&M’s Department of Electrical Engineering in future experiments.
The team has detailed its results in a paper posted on arXiv that will eventually appear in Physical Review Letters. Mahapatra will also announce the results today at 12 p.m. CDT in a talk at the Mitchell Institute for Fundamental Physics and Astronomy.
With its 180-degree views of Earth and space, the ISS’s cupola is the perfect place for photography. But Austrian researchers want to use the unique and panoramic platform to test the limits of “spooky action at a distance” in hopes of creating a new quantum communications network.
In a new study published April 9, 2013 in the New Journal of Physics, a group of Austrian researchers proposes equipping a camera already aboard the ISS, the Nikon 400 mm NightPOD camera, with an optical receiver that would be key to performing the first-ever quantum optics experiment in space. The NightPOD camera faces the ground in the cupola and can track ground targets for up to 70 seconds, which would allow researchers to bounce a secret encryption key across longer distances than are currently possible with optical fiber networks on Earth.
“During a few months a year, the ISS passes five to six times in a row in the correct orientation for us to do our experiments. We envision setting up the experiment for a whole week and therefore having more than enough links to the ISS available,” said co-author of the study Professor Rupert Ursin from the Austrian Academy of Sciences.
Albert Einstein coined the phrase “spooky action at a distance” during his philosophical battles with Niels Bohr in the 1930s, to express his frustration with the inadequacies of the new theory called quantum mechanics. Quantum mechanics explains actions on the tiniest scales, in the domain of atoms and elementary particles, while classical physics explains motion, matter and energy on the level that we can see. But 19th-century scientists had observed phenomena in both the macro and micro worlds that could not easily be explained using classical physics.
In particular, Einstein was dissatisfied with the idea of entanglement. Entanglement occurs when two particles are so deeply connected that they share the same existence, their positions, spins, momenta and polarizations bound together in a single mathematical description. This can happen when two particles are created at the same point and instant in spacetime. Over time, as the two particles become widely separated in space, even by light-years, quantum mechanics suggests that a measurement of one would immediately affect the other. Einstein was quick to point out that this seemed to violate the universal speed limit set out by special relativity, and it was this apparent paradox he referred to as spooky action.
CERN physicist John Bell partially resolved this mystery in 1964 by putting the idea of non-local phenomena on a testable footing: he derived an inequality that any local, classical theory must satisfy, but that entangled particles can violate. Crucially, while entanglement allows a measurement on one particle to be instantaneously correlated with its distant counterpart, no usable classical information travels faster than light.
The ISS experiment proposes using a “Bell experiment” to test this contradiction between the predictions of quantum and classical physics. For the Bell experiment, a pair of entangled photons would be generated on the ground; one would be sent from the ground station to the modified camera aboard the ISS, while the other would be measured locally on the ground for later comparison. So far, researchers have only sent secret keys between receivers a few hundred kilometers apart.
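To see what such a Bell test actually checks, here’s a toy calculation of the standard CHSH quantity for polarization-entangled photons (a minimal sketch; the analyzer angles are the textbook choices that maximize the quantum prediction, not settings from the ISS proposal):

```python
# CHSH Bell test, quantum prediction: any local "classical" theory obeys
# |S| <= 2, while quantum mechanics predicts up to 2*sqrt(2) ~ 2.828.
import numpy as np

def E(a: float, b: float) -> float:
    """Quantum correlation for polarization-entangled photons at angles a, b."""
    return np.cos(2 * (a - b))

a1, a2 = np.radians(0.0), np.radians(45.0)    # Alice's two analyzer settings
b1, b2 = np.radians(22.5), np.radians(67.5)   # Bob's two analyzer settings

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(f"CHSH S = {S:.3f}")    # 2.828: violates the classical bound of 2
```

Measuring a clear violation with one photon on the ground and one at the ISS would extend this test over a distance no fiber link can match.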
“According to quantum physics, entanglement is independent of distance. Our proposed Bell-type experiment will show that particles are entangled, over large distances — around 500 km — for the very first time in an experiment,” says Ursin. “Our experiments will also enable us to test potential effects gravity may have on quantum entanglement.”
The researchers point out that making the minor alteration to a camera already aboard the ISS will save time and money needed to build a series of satellites to test researchers’ ideas.
No tears in heaven? Expedition 35 Commander Chris Hadfield shows that while you really can cry in space, tears don’t fall like they do here on Earth, and instead just end up as a big ball of water on your face. It’s physics, baby!
Checking out the spin rate of a supermassive black hole is a great way for astronomers to test Einstein’s theory of general relativity under extreme conditions, taking a close look at how intense gravity distorts the fabric of space-time. Now, imagine a monster … one that has a mass of about 2 million times that of our Sun, measures some 2 million miles in diameter, and rotates so fast that it’s nearly breaking the speed of light.
A fantasy? Not hardly. It’s a supermassive black hole located at the center of spiral galaxy NGC 1365 – and it is about to teach us a whole lot more about how black holes and galaxies mature.
What makes researchers so confident they have finally made a definitive measurement of such an incredible spin rate in a distant galaxy? Thanks to data taken by the Nuclear Spectroscopic Telescope Array (NuSTAR) and the European Space Agency’s XMM-Newton X-ray satellites, the team of scientists has peered into the heart of NGC 1365 with X-ray eyes, taking note of the location of the event horizon, the edge of the spinning hole where surrounding space begins to be dragged into the mouth of the beast.
“We can trace matter as it swirls into a black hole using X-rays emitted from regions very close to the black hole,” said the coauthor of a new study, NuSTAR principal investigator Fiona Harrison of the California Institute of Technology in Pasadena. “The radiation we see is warped and distorted by the motions of particles and the black hole’s incredibly strong gravity.”
The studies didn’t stop at the event horizon, however; they also pinned down the inner edge of the accretion disk, the “Innermost Stable Circular Orbit” (ISCO), the proverbial point of no return. The location of the ISCO is directly related to a black hole’s spin rate: because a spinning black hole drags space-time around with it, the faster the spin, the closer material can stably orbit before being pulled in. What makes the current data so compelling is that they reach across a broader range of X-rays, allowing astronomers to see through the veiling clouds of dust that confused past readings. The new findings show that it isn’t dust that distorts the X-rays, but crushing gravity.
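The spin-to-ISCO relationship the teams exploit can be sketched with the standard Kerr-metric formula (from Bardeen, Press & Teukolsky, 1972). The toy version below just evaluates that formula; the actual measurement involves fitting the full X-ray spectrum:

```python
# ISCO radius versus black hole spin, in units of GM/c^2 (prograde orbits).
import numpy as np

def isco_radius(a: float) -> float:
    """ISCO radius in GM/c^2 for dimensionless spin parameter a."""
    z1 = 1 + (1 - a**2)**(1/3) * ((1 + a)**(1/3) + (1 - a)**(1/3))
    z2 = np.sqrt(3 * a**2 + z1**2)
    return 3 + z2 - np.sqrt((3 - z1) * (3 + z1 + 2 * z2))

for spin in (0.0, 0.5, 0.9, 0.998):
    print(f"spin a = {spin:5.3f}: ISCO at {isco_radius(spin):.2f} GM/c^2")
# A non-spinning hole: 6 GM/c^2. Near-maximal spin: the disk edge closes in
# toward ~1 GM/c^2, which is what makes the inner disk a spin meter.
```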
“This is the first time anyone has accurately measured the spin of a supermassive black hole,” said lead author Guido Risaliti of the Harvard-Smithsonian Center for Astrophysics (CfA) and INAF — Arcetri Observatory.
“If I could have added one instrument to XMM-Newton, it would have been a telescope like NuSTAR,” said Norbert Schartel, XMM-Newton Project Scientist at the European Space Astronomy Center in Madrid. “The high-energy X-rays provided an essential missing puzzle piece for solving this problem.”
Even though the central black hole in NGC 1365 is a monster now, it didn’t begin as one. Like all things, including the galaxy itself, it evolved with time. Over millions of years it gained in girth as it consumed stars and gas – possibly even merging with other black holes along the way.
“The black hole’s spin is a memory, a record, of the past history of the galaxy as a whole,” explained Risaliti.
“These monsters, with masses from millions to billions of times that of the sun, are formed as small seeds in the early universe and grow by swallowing stars and gas in their host galaxies, merging with other giant black holes when galaxies collide, or both,” Risaliti added.
This new spin on black holes shows that a monster can emerge from “ordered accretion” rather than simply from multiple random events. The team will continue their studies to see how factors other than spin change over time, and will keep observing several other supermassive black holes with NuSTAR and XMM-Newton.
“This is hugely important to the field of black hole science,” said Lou Kaluzienski, NuSTAR program scientist at NASA Headquarters in Washington, D.C. “NASA and ESA telescopes tackled this problem together. In tandem with the lower-energy X-ray observations carried out with XMM-Newton, NuSTAR’s unprecedented capabilities for measuring the higher energy X-rays provided an essential, missing puzzle piece for unraveling this problem.”
Chalk another one up for citizen science. Earlier this month, researchers announced the discovery of 24 new pulsars. Thousands of pulsars have been discovered to date, but what’s truly fascinating about this month’s announcement is that it came from culling through old data using a new method.
A pulsar is a dense, highly magnetized, swiftly rotating remnant of a supernova explosion. Pulsars were first discovered by Jocelyn Bell Burnell and Antony Hewish in 1967. The discovery of a precisely timed radio beacon initially suggested to some that it was the product of an artificial intelligence; in fact, for a very brief time, pulsars were known as LGMs, for “Little Green Men.” Today, we know that pulsars are the product of the natural death of massive stars.
The data set used for the discovery comes from the Parkes 64-metre radio observatory in New South Wales, Australia. The installation was the first to receive telemetry from the Apollo 11 astronauts on the Moon and was made famous in the movie The Dish. The Parkes Multi-Beam Pulsar Survey (PMPS) was conducted in the late 1990s, making thousands of 35-minute recordings across the plane of the Milky Way galaxy. The survey turned up over 800 pulsars and generated four terabytes of data. (Just think of how large four terabytes was in the ’90s!)
The nature of these discoveries presented theoretical astrophysicists with a dilemma. Namely, the number of short period and binary pulsars was lower than expected. Clearly, there were more pulsars in the data waiting to be found.
Enter citizen science. Using a program known as Einstein@Home, researchers were able to sift through the recordings with innovative modeling techniques and tease out 24 new pulsars from the data.
“The method… is only possible with the computing resources provided by Einstein@Home,” Benjamin Knispel of the Max Planck Institute for Gravitational Physics told the MIT Technology Review in a recent interview. The study utilized over 17,000 CPU core-years to complete.
Einstein@Home is a program uniquely adapted to accomplish this feat. Begun in 2005, it is a distributed computing project that puts volunteers’ machines to work on downloaded data packets while they sit idle. Similar to SETI@Home, the original distributed computing program, which searches for extraterrestrial signals, Einstein@Home culls through data from LIGO (the Laser Interferometer Gravitational-wave Observatory) looking for gravitational waves. In 2009, the Einstein@Home survey was expanded to include radio astronomy data from the Arecibo radio telescope, and later the Parkes observatory.
Among the discoveries were some rare finds. For example, PSR J1748-3009 has the highest known dispersion measure of any millisecond pulsar (the dispersion measure gauges the column of free electrons between us and the pulsar, which delays a pulse’s arrival at lower radio frequencies). Another find, J1750-2531, is thought to belong to a class of intermediate-mass binary pulsars. Six of the 24 pulsars discovered were members of binary systems.
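To get a feel for what a dispersion measure does to the data, here’s a minimal sketch of the standard dispersion-delay formula (the DM value and observing band below are hypothetical examples, not the measured parameters of PSR J1748-3009):

```python
# Dispersive delay between two radio frequencies for a given dispersion
# measure (DM), using the standard constant ~4.149 ms GHz^2 / (pc cm^-3).
K_DM_MS = 4.149

def dispersion_delay_ms(dm: float, f_low_ghz: float, f_high_ghz: float) -> float:
    """Extra arrival delay of the lower frequency relative to the higher one."""
    return K_DM_MS * dm * (f_low_ghz**-2 - f_high_ghz**-2)

dm = 400.0   # hypothetical DM in pc cm^-3
print(f"delay across 1.2-1.5 GHz: {dispersion_delay_ms(dm, 1.2, 1.5):.0f} ms")
# ~415 ms: a smeared-out pulse unless the search corrects for many trial DMs
```

Searches like Einstein@Home must try many candidate DM values for every sky position, which is part of why the problem devours so much computing power.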
These discoveries also have implications for the ongoing hunt for gravitational waves by projects such as LIGO. Specifically, a thorough census of binary pulsars in the galaxy will give scientists a model for the predicted rate of binary pulsar mergers. Unlike radio surveys, LIGO seeks to detect these events via the copious gravitational waves such mergers should generate. Begun in 2002, LIGO consists of two gravitational wave observatories, one in Hanford, Washington and one in Livingston, Louisiana, just outside of Baton Rouge. Each LIGO detector consists of two 4-kilometre Fabry-Pérot arms in an “L” configuration, which allow ultra-precise measurement of a 200-watt laser beam shot through them. (You can see the orientation of the “L’s” on the display of the Einstein@Home screensaver.) Two geographically separated detectors are required both to help pinpoint the direction of an incoming gravitational wave on the celestial sphere and to rule out local interference; a gravitational wave from a galactic source would ripple straight through the Earth.
Such a movement would be tiny, on the order of 1/1,000th the diameter of a proton, unnoticed by everything except the LIGO detectors. To date, LIGO has yet to detect gravitational waves, although there have been some false alarms; scientists regularly inject test signals into the data to see if the system catches them. The lack of detections has also put constraints on certain events. For example, LIGO reported a non-detection of gravitational waves during the February 2007 short gamma-ray burst GRB 070201. The burst arrived from the direction of the Andromeda Galaxy, and thus was thought to have been relatively nearby. Such bursts are believed to be caused by mergers of neutron stars and/or black holes, so LIGO’s non-detection suggests a more distant event. LIGO should be able to detect a gravitational wave event out to 70 million light-years, and Advanced LIGO (AdLIGO), set to go online in 2014, will increase that sensitivity tenfold.
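That displacement translates into a dizzyingly small strain, as a back-of-envelope calculation shows (a minimal sketch assuming a 4-kilometre arm and a proton diameter of about 1.7 femtometres, both round values I’m supplying):

```python
# Strain is the fractional arm-length change a gravitational wave produces.
ARM_LENGTH_M = 4_000.0         # one LIGO Fabry-Perot arm
PROTON_DIAMETER_M = 1.7e-15    # approximate proton diameter (assumed)

delta_l = PROTON_DIAMETER_M / 1_000    # the displacement quoted above
strain = delta_l / ARM_LENGTH_M        # dimensionless strain h
print(f"delta L ~ {delta_l:.1e} m, strain h ~ {strain:.0e}")   # h ~ 4e-22
```

A strain of a few parts in 10^22 is why LIGO needs kilometre-scale arms, and why Advanced LIGO’s tenfold sensitivity jump matters so much: it expands the searchable volume of space a thousandfold.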
Knowing where potential binary pulsar mergers lie, thanks to discoveries like these from the Parkes survey, will also give LIGO researchers clues about targets to focus on. “The search for pulsars isn’t easy, especially for these ‘quiet’ ones that aren’t doing the equivalent of ‘screaming’ for our attention,” says LIGO Livingston data analysis and EPO scientist Amber Stuver. The LIGO consortium developed the data analysis technique used by Einstein@Home. A direct detection of gravitational waves by LIGO or AdLIGO would be an announcement perhaps on par with CERN’s discovery of the Higgs boson last year. It would also open up a whole new field of gravitational wave astronomy, and perhaps give new stimulus to the European Space Agency’s proposed Laser Interferometer Space Antenna (LISA), a space-based gravitational wave detector. Congrats to the team at Parkes on their discovery… perhaps we’ll have the first gravitational wave detection announcement out of LIGO in the years to come!
-Read the original paper on the discovery of 24 new pulsars here.