Disturbance in the Force – A Spatially Varying Fine Structure Constant

Illustration of the dipolar variation in the fine-structure constant, alpha, across the sky, as seen by the two telescopes used in the work: the Keck telescope in Hawaii and the ESO Very Large Telescope in Chile. IMAGE CREDIT: Copyright Dr. Julian Berengut, UNSW, 2010.


In order for astronomers to explore the outer reaches of our universe, they must assume that the physical constants we measure in labs on Earth hold everywhere in the universe. This assumption seems to hold up extremely well: if the universe’s constants were grossly different, stars would fail to shine and galaxies would fail to coalesce. Yet as far as we look in our universe, the processes that depend on these constants still seem to happen. But new research has revealed that one of these constants, known as the fine structure constant, may vary ever so slightly across the universe.

Of all physical constants, the fine structure constant seems like an odd one to be probing with astronomy. It appears in many equations involving some of the smallest scales in the universe. In particular, it is used frequently in quantum physics and is part of the quantum derivation of the structure of the hydrogen atom. This quantum model determines the allowed energy levels of electrons in the atom. Change this constant and the orbitals shift as well.

Since the allowed energy levels determine what wavelengths of light such an atom can emit, a careful analysis of the positioning of these spectral lines in distant galaxies would reveal variations in the constant that helped control them. Using the Very Large Telescope (VLT) and the Keck Observatory, a team from the University of New South Wales has analyzed the spectra of 300 galaxies and found the subtle changes that should exist if this constant were less than constant.
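As a rough illustration of the sensitivity involved (a Bohr-model sketch, not the team’s actual many-multiplet analysis), hydrogen’s energy levels scale as the square of the fine structure constant, so a change in alpha of one part in 100,000 shifts line positions by about two parts in 100,000:

```python
# Bohr-model sketch (not the authors' method): hydrogen energy levels
# E_n = -alpha^2 * m_e c^2 / (2 n^2), and how they shift if the fine
# structure constant alpha changes by one part in 100,000.

ALPHA = 7.2973525693e-3   # CODATA value of the fine structure constant
ME_C2_EV = 510998.95      # electron rest energy in eV

def level_eV(n, alpha=ALPHA):
    """Bohr-model energy of hydrogen level n, in eV."""
    return -alpha**2 * ME_C2_EV / (2 * n**2)

def transition_eV(n_lo, n_hi, alpha=ALPHA):
    """Photon energy of the n_hi -> n_lo transition, in eV."""
    return level_eV(n_hi, alpha) - level_eV(n_lo, alpha)

e_lab = transition_eV(1, 2)                       # Lyman-alpha, ~10.2 eV
e_far = transition_eV(1, 2, ALPHA * (1 + 1e-5))   # same line, alpha nudged

# Since E scales as alpha^2, the fractional line shift is ~2e-5:
print(e_lab, (e_far - e_lab) / e_lab)
```

Shifts of a few parts in 100,000 in line positions are well within the reach of modern high-resolution spectrographs, which is why quasar absorption spectra can probe alpha at all.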

Since the two telescopes point toward different hemispheres (Keck in the Northern and the VLT in the Southern), the researchers noticed that the variation seemed to have a preferred direction. As Julian King, one of the paper’s authors, explained, “Looking to the north with Keck we see, on average, a smaller alpha in distant galaxies, but when looking south with the VLT we see a larger alpha.”

However, “it varies by only a tiny amount — about one part in 100,000 — over most of the observable universe”. As such, although the result is very intriguing, it does not demolish our understanding of the universe or make hypotheses like that of a greatly variable speed of light plausible (an argument frequently tossed around by Creationists). But, “If our results are correct, clearly we shall need new physical theories to satisfactorily describe them.”

While this finding doesn’t challenge our knowledge of the observable universe, it may have implications for regions outside the portion of the universe we can observe. Since our viewing distance is ultimately limited by how far back we can look, and that lookback time is limited by when the universe became transparent, we cannot observe what the universe is like beyond that visible horizon. The team speculates that beyond it, there may be even larger changes in this constant, which would have large effects on the physics of those regions. They conclude the results may “suggest a violation of the Einstein Equivalence Principle, and could infer a very large or infinite universe, within which our ‘local’ Hubble volume represents a tiny fraction, with correspondingly small variations in the physical constants.”

This would mean that, outside of our portion of the universe, the physical laws may not be suitable for life, making our little corner of the universe a sort of oasis. This could help solve the supposed “fine-tuning” problem without relying on explanations such as multiple universes.

Want some other articles on this subject? Here’s an article about how there might be 10 dimensions.

Scientists Say They Can Now Test String Theory

Quantum entanglement visualized. Credit: Discovery News.


The idea of a “Theory of Everything” is enticing – that we could somehow explain all that is. String theory has been proposed since the 1960s as a way to reconcile quantum mechanics and general relativity into such an explanation. However, the biggest criticism of string theory is that it isn’t testable. But now, a research team led by scientists from Imperial College London has unexpectedly discovered that string theory also seems to predict the behavior of entangled quantum particles. As this prediction can be tested in the laboratory, the researchers say they can now test string theory.

“If experiments prove that our predictions about quantum entanglement are correct, this will demonstrate that string theory ‘works’ to predict the behavior of entangled quantum systems,” said Professor Mike Duff, lead author of the study.

String theory was originally developed to describe the fundamental particles and forces that make up our universe, and has been a favorite contender among physicists to allow us to reconcile what we know about the incredibly small from particle physics with our understanding of the very large from our studies of cosmology. Using the theory to predict how entangled quantum particles behave provides the first opportunity to test string theory by experiment.

But – at least for now – the scientists won’t be able to confirm that string theory is actually the way to explain all that is, only whether it actually works.

“This will not be proof that string theory is the right ‘theory of everything’ that is being sought by cosmologists and particle physicists,” said Duff. “However, it will be very important to theoreticians because it will demonstrate whether or not string theory works, even if its application is in an unexpected and unrelated area of physics.”

String theory is a theory of gravity, an extension of General Relativity, and the classical interpretation of strings and branes is that they are quantum mechanical vibrating, extended charged black holes. The theory hypothesizes that the electrons and quarks within an atom are not 0-dimensional objects, but 1-dimensional strings. These strings can move and vibrate, giving the observed particles their flavor, charge, mass and spin. The strings make closed loops unless they encounter surfaces, called D-branes, where they can open up into 1-dimensional lines. The endpoints of the string cannot break off the D-brane, but they can slide around on it.

Duff said he was sitting in a conference in Tasmania where a colleague was presenting the mathematical formulae that describe quantum entanglement when he realized something. “I suddenly recognized his formulae as similar to some I had developed a few years earlier while using string theory to describe black holes. When I returned to the UK I checked my notebooks and confirmed that the maths from these very different areas was indeed identical.”

Duff and his colleagues realized that the mathematical description of the pattern of entanglement between three qubits resembles the mathematical description, in string theory, of a particular class of black holes. Thus, by combining their knowledge of two of the strangest phenomena in the universe, black holes and quantum entanglement, they realized they could use string theory to produce a prediction that could be tested. Using the string theory mathematics that describes black holes, they predicted the pattern of entanglement that will occur when four qubits are entangled with one another. (The answer to this problem has not been calculated before.) Although it is technically difficult to do, the pattern of entanglement between four entangled qubits could be measured in the laboratory and the accuracy of this prediction tested.
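The three-qubit mathematics in question is Cayley’s hyperdeterminant: its absolute value gives the “3-tangle” entanglement measure, and the same quantity appears in the entropy formula of a class of string-theory black holes. Here is a minimal sketch of the standard textbook formula (illustrative only, not code from the paper):

```python
# Sketch (illustrative, not from the paper): Cayley's 2x2x2 hyperdeterminant.
# Four times its absolute value is the three-qubit "3-tangle" -- the same
# quantity that appears in the entropy of certain string-theory black holes.
import math

def hyperdet(a):
    """Cayley hyperdeterminant of amplitudes a[i][j][k] for sum a_ijk |ijk>."""
    d  = (a[0][0][0]*a[1][1][1])**2 + (a[0][0][1]*a[1][1][0])**2 \
       + (a[0][1][0]*a[1][0][1])**2 + (a[0][1][1]*a[1][0][0])**2
    d -= 2*(a[0][0][0]*a[0][0][1]*a[1][1][0]*a[1][1][1]
          + a[0][0][0]*a[0][1][0]*a[1][0][1]*a[1][1][1]
          + a[0][0][0]*a[0][1][1]*a[1][0][0]*a[1][1][1]
          + a[0][0][1]*a[0][1][0]*a[1][0][1]*a[1][1][0]
          + a[0][0][1]*a[0][1][1]*a[1][0][0]*a[1][1][0]
          + a[0][1][0]*a[0][1][1]*a[1][0][0]*a[1][0][1])
    d += 4*(a[0][0][0]*a[0][1][1]*a[1][0][1]*a[1][1][0]
          + a[0][0][1]*a[0][1][0]*a[1][0][0]*a[1][1][1])
    return d

def three_tangle(a):
    return 4 * abs(hyperdet(a))

def amps(**kw):
    """Build a 2x2x2 amplitude array from entries like a111=0.7."""
    a = [[[0.0, 0.0], [0.0, 0.0]], [[0.0, 0.0], [0.0, 0.0]]]
    for key, val in kw.items():
        i, j, k = (int(c) for c in key[1:])
        a[i][j][k] = val
    return a

s = 1 / math.sqrt(2)
ghz = amps(a000=s, a111=s)   # the GHZ state (|000> + |111>)/sqrt(2)
w = amps(a001=1/math.sqrt(3), a010=1/math.sqrt(3), a100=1/math.sqrt(3))

print(round(three_tangle(ghz), 10))   # GHZ carries maximal 3-tangle
print(three_tangle(w))                # the W state carries none
```

The GHZ state carries maximal 3-tangle while the W state carries none – exactly the kind of classification that the four-qubit prediction extends.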

The discovery that string theory seems to make predictions about quantum entanglement is completely unexpected, but because quantum entanglement can be measured in the lab, it does mean that there is a way – finally – for researchers to test predictions based on string theory.

But, Duff said, there is no obvious connection to explain why a theory that is being developed to describe the fundamental workings of our universe is useful for predicting the behavior of entangled quantum systems. “This may be telling us something very deep about the world we live in, or it may be no more than a quirky coincidence”, said Duff. “Either way, it’s useful.”

Source: Imperial College London

Astronomy Without A Telescope – Strange Stars

One step closer to a black hole? A hypothetical strange star results from extreme gravitational compression overcoming the strong interaction that holds neutrons and protons together. Credit: Swinburne University – astronomy.swin.edu.au


Atoms are made of protons, neutrons and electrons. If you cram them together and heat them up, you get a plasma in which the electrons are only loosely associated with individual nuclei – a dynamic, light-emitting mix of positively charged ions and negatively charged electrons. If you cram that matter together even further, you drive electrons to merge with protons and you are left with a collection of neutrons – like in a neutron star. So, what if you keep cramming that collection of neutrons together into an even higher density? Well, eventually you get a black hole – but before that (at least hypothetically) you get a strange star.

The theory has it that compressing neutrons can eventually overcome the strong interaction, breaking down a neutron into its constituent quarks, giving a roughly equal mix of up, down and strange quarks – allowing these particles to be crammed even closer together in a smaller volume. By convention, this is called strange matter. It has been suggested that very massive neutron stars may have strange matter in their compressed cores.

However, some theorists argue that strange matter is a more fundamentally stable configuration than ordinary matter. So, once a star’s core becomes strange, contact between it and baryonic (i.e. protons and neutrons) matter might drive the baryonic matter to adopt the strange (but more stable) configuration. This is the sort of thinking behind fears that the Large Hadron Collider might destroy the Earth by producing strangelets, which would then trigger a Kurt Vonnegut ice-nine scenario. However, since the LHC hasn’t done any such thing, it’s reasonable to think that strange stars probably don’t form this way either.

More likely a ‘naked’ strange star, with strange matter extending from its core to its surface, might evolve naturally under its own self gravity. Once a neutron star’s core becomes strange matter, it should contract inwards leaving behind volume for an outer layer to be pulled inwards into a smaller radius and a higher density, at which point that outer layer might also become strange… and so on. Just as it seems implausible to have a star whose core is so dense that it’s essentially a black hole, but still with a star-like crust – so it may be that when a neutron star develops a strange core it inevitably becomes strange throughout.

Anyhow, if they exist at all, strange stars should have some telltale characteristics. We know that neutron stars tend to lie in the range of 1.4 to 2 solar masses – and that any star with a neutron star’s density that’s over 10 solar masses has to become a black hole. That leaves a bit of a gap – although there is evidence of stellar black holes down to only 3 solar masses, so the gap for strange stars to form may only be in that 2 to 3 solar mass range.

By adopting a more compressed 'ground state' of matter, a strange (quark) star should be smaller, but more massive, than a neutron star. RXJ1856 is in the ballpark for size, but may not be massive enough to fit the theory. Credit: chandra.harvard.edu

The likely electrodynamic properties of strange stars are also of interest (see below). It is likely that electrons will be displaced towards the surface – leaving the body of the star with a net positive charge surrounded by an atmosphere of negatively charged electrons. Presuming a degree of differential rotation between the star and its electron atmosphere, such a structure would generate a magnetic field of the magnitude that can be observed in a number of candidate stars.

Another distinct feature should be a size that is smaller than most neutron stars. One strange star candidate is RXJ1856, which appears to be a neutron star but is only 11 km in diameter. Some astrophysicists may have muttered “hmmm… that’s strange” on hearing about it – but it remains to be confirmed that it really is.
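To see why 11 km counts as strange, here is a back-of-envelope density check (assuming a fiducial 1.4 solar masses, which is an illustrative value and not a measured mass for RXJ1856):

```python
# Rough sketch: mean density of an 11 km diameter star with an assumed
# 1.4 solar masses (a fiducial neutron-star mass, not RXJ1856's).
import math

M_SUN = 1.989e30          # kg
R = 11e3 / 2              # 11 km diameter -> 5.5 km radius, in metres
M = 1.4 * M_SUN

volume = (4 / 3) * math.pi * R**3
density = M / volume      # mean density in kg/m^3

NUCLEAR_DENSITY = 2.3e17  # kg/m^3, roughly the density of an atomic nucleus
print(density, density / NUCLEAR_DENSITY)
```

Under these assumptions the mean density comes out well above nuclear density – denser than a typical neutron star of the same mass, which is what the strange-matter picture requires.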

Further reading: Negreiros et al (2010) Properties of Bare Strange Stars Associated with Surface Electrical Fields.

Cosmologists Provide Closest Measure of Elusive Neutrino

Slices through the SDSS 3-dimensional map of the distribution of galaxies. Earth is at the center, and each point represents a galaxy, typically containing about 100 billion stars. Galaxies are colored according to the ages of their stars, with the redder, more strongly clustered points showing galaxies that are made of older stars. The outer circle is at a distance of two billion light years. The region between the wedges was not mapped by the SDSS because dust in our own Galaxy obscures the view of the distant universe in these directions. Both slices contain all galaxies within –1.25 and +1.25 degrees declination. Credit: M. Blanton and the Sloan Digital Sky Survey.


Cosmologists – and not particle physicists – could be the ones who finally measure the mass of the elusive neutrino. A group of cosmologists has made its most accurate measurement yet of the mass of these mysterious so-called “ghost particles.” They didn’t use a giant particle detector, but data from the largest survey ever of galaxies, the Sloan Digital Sky Survey. While previous experiments had shown that neutrinos have a mass, it is so small that it has proven very hard to measure. But looking at the Sloan data on galaxies, PhD student Shaun Thomas and his advisers at University College London put the mass of a neutrino at no greater than 0.28 electron volts, which is less than a billionth of the mass of a single hydrogen atom. This is one of the most accurate measurements of the mass of a neutrino to date.

Their work is based on the principle that the huge abundance of neutrinos (there are trillions passing through you right now) has a large cumulative effect on the matter of the cosmos, which naturally forms into “clumps” of groups and clusters of galaxies. As neutrinos are extremely light they move across the universe at great speeds which has the effect of smoothing this natural “clumpiness” of matter. By analysing the distribution of galaxies across the universe (i.e. the extent of this “smoothing-out” of galaxies) scientists are able to work out the upper limits of neutrino mass.

A neutrino is capable of passing through a light year –about six trillion miles — of lead without hitting a single atom.

Central to this new calculation is the existence of the largest ever 3D map of galaxies, called Mega Z, which covers over 700,000 galaxies recorded by the Sloan Digital Sky Survey and allows measurements over vast stretches of the known universe.

“Of all the hypothetical candidates for the mysterious Dark Matter, so far neutrinos provide the only example of dark matter that actually exists in nature,” said Ofer Lahav, Head of UCL’s Astrophysics Group. “It is remarkable that the distribution of galaxies on huge scales can tell us about the mass of the tiny neutrinos.”

The cosmologists at UCL were able to estimate distances to galaxies using a new method that measures the colour of each galaxy. By combining this enormous galaxy map with information from the temperature fluctuations in the after-glow of the Big Bang, called the Cosmic Microwave Background radiation, they were able to put one of the smallest upper limits on the mass of the neutrino to date.
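To get a feel for how a galaxy map can constrain a particle mass, here is a back-of-envelope sketch using two standard cosmology relations (a fiducial cosmology is assumed, the quoted 0.28 eV is treated as the summed mass of the neutrino species for illustration, and this is in no way the UCL analysis pipeline):

```python
# Back-of-envelope sketch (not the UCL pipeline): translate a neutrino-mass
# limit into a cosmic energy fraction and a clustering suppression, using
# the standard relation Omega_nu h^2 = sum(m_nu) / 93.14 eV and the
# linear-theory rule of thumb dP/P ~ -8 f_nu on small scales.
H = 0.70          # assumed reduced Hubble constant
OMEGA_M = 0.30    # assumed total matter density fraction

def omega_nu(sum_m_ev, h=H):
    """Neutrino fraction of the critical density for a given summed mass (eV)."""
    return sum_m_ev / 93.14 / h**2

def power_suppression(sum_m_ev, h=H, omega_m=OMEGA_M):
    """Approximate fractional loss of small-scale clustering power, ~8 f_nu."""
    f_nu = omega_nu(sum_m_ev, h) / omega_m
    return 8 * f_nu

# Taking 0.28 eV as the summed neutrino mass (an illustrative assumption):
print(omega_nu(0.28))            # a fraction of a percent of the cosmic budget
print(power_suppression(0.28))   # yet a ~15% effect on small-scale clustering
```

Even a sub-percent contribution to the cosmic energy budget leaves a clustering signature of order ten percent, which is why the “smoothing” of the galaxy distribution is such a sensitive scale for weighing neutrinos.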

“Although neutrinos make up less than 1% of all matter they form an important part of the cosmological model,” said Dr. Shaun Thomas. “It’s fascinating that the most elusive and tiny particles can have such an effect on the Universe.”

“This is one of the most effective techniques available for measuring the neutrino masses,” said Dr. Filipe Abdalla. “This puts great hopes to finally obtain a measurement of the mass of the neutrino in years to come.”

The authors are confident that a larger survey of the Universe, such as the one they are working on called the international Dark Energy Survey, will yield an even more accurate weight for the neutrino, potentially at an upper limit of just 0.1 electron volts.
The results are published in the journal Physical Review Letters.

Source: University College London

Magnetic Fields in Inter-cluster Space: Measured at Last


The strength of the magnetic field has been measured here on Earth, on the Sun, in inter-planetary space, on stars in our galaxy (the Milky Way; some of them, anyway), in the interstellar medium (ISM) of our galaxy, and in the ISM of other spiral galaxies (some of them, anyway). But there have been no measurements of the strength of magnetic fields in the space between galaxies and between clusters of galaxies – the intergalactic medium (IGM) and intracluster medium (ICM).

Up till now.

But who cares? What scientific importance does the strength of the IGM and ICM magnetic fields have?

The Large Area Telescope (LAT) on Fermi detects gamma rays via the matter (electrons) and antimatter (positrons) they produce after striking layers of tungsten. Credit: NASA/Goddard Space Flight Center Conceptual Image Lab

Estimates of these fields may provide “a clue that there was some fundamental process in the intergalactic medium that made magnetic fields,” says Ellen Zweibel, a theoretical astrophysicist at the University of Wisconsin, Madison. One “top-down” idea is that all of space was somehow left with a slight magnetic field soon after the Big Bang – around the end of inflation, Big Bang Nucleosynthesis, or decoupling of baryonic matter and radiation – and this field grew in strength as stars and galaxies amassed and amplified its intensity. Another, “bottom-up” possibility is that magnetic fields formed initially by the motion of plasma in small objects in the primordial universe, such as stars, and then propagated outward into space.

So how do you estimate the strength of a magnetic field, tens or hundreds of millions of light-years away, in regions of space a looong way from any galaxies (much less clusters of galaxies)? And how do you do this when you expect these fields to be much less than a nanoGauss (nG), perhaps as small as a femtoGauss (fG, which is a millionth of a nanoGauss)? What trick can you use??

A very neat one, one that relies on physics not directly tested in any laboratory, here on Earth, and unlikely to be so tested during the lifetime of anyone reading this today – the production of positron-electron pairs when a high energy gamma ray photon collides with an infrared or microwave one (this can’t be tested in any laboratory, today, because we can’t make gamma rays of sufficiently high energy, and even if we could, they’d collide so rarely with infrared light or microwaves we’d have to wait centuries to see such a pair produced). But blazars produce copious quantities of TeV gamma rays, and in intergalactic space microwave photons are plentiful (that’s what the cosmic microwave background – CMB – is!), and so too are far infrared ones.
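The kinematics behind this trick is the textbook threshold condition for photon-photon pair production: for a head-on collision, the product of the two photon energies must exceed the square of the electron rest energy. A quick sketch (illustrative numbers, not taken from the cited papers):

```python
# Sketch of the threshold for gamma-gamma pair production: for a head-on
# collision, E1 * E2 >= (m_e c^2)^2 (textbook kinematics, not a result
# from the cited papers).
ME_C2_EV = 510998.95   # electron rest energy, eV

def threshold_gamma_eV(background_eV):
    """Minimum gamma-ray energy to pair-produce on a head-on background photon."""
    return ME_C2_EV**2 / background_eV

CMB_MEAN_EV = 6.3e-4   # mean CMB photon energy, ~2.7 kT at 2.725 K
FAR_IR_EV = 0.01       # a representative far-infrared background photon

print(threshold_gamma_eV(CMB_MEAN_EV) / 1e12)  # hundreds of TeV on the CMB
print(threshold_gamma_eV(FAR_IR_EV) / 1e12)    # tens of TeV on far-IR light
```

So the TeV photons from blazars are absorbed mainly by far-infrared background light, with the CMB only taking over at still higher gamma-ray energies.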

MAGIC telescope (Credit: Robert Wagner)

Having been produced, the positron and electron will interact with the CMB, local magnetic fields, other electrons and positrons, etc. (the details are rather messy, but were basically worked out some time ago), with the net result that observations of distant, bright sources of TeV gamma rays can set lower limits on the strength of the IGM and ICM magnetic fields through which they travel. Several recent papers report results of such observations, using the Fermi Gamma-ray Space Telescope and the MAGIC telescope.

So how strong are these magnetic fields? The various papers give different numbers, from greater than a few tenths of a femtoGauss to greater than a few femtoGauss.

“The fact that they’ve put a lower bound on magnetic fields far out in intergalactic space, not associated with any galaxy or clusters, suggests that there really was some process that acted on very wide scales throughout the universe,” Zweibel says. And that process would have occurred in the early universe, not long after the Big Bang. “These magnetic fields could not have formed recently and would have to have formed in the primordial universe,” says Ruth Durrer, a theoretical physicist at the University of Geneva.

So, perhaps we have yet one more window into the physics of the early universe; hooray!

Sources: Science News, arXiv:1004.1093, arXiv:1003.3884

Andromeda’s Double Nucleus – Explained at Last?

M31's nucleus (Credit: WF/PC, Hubble Space Telescope)


In 1993, the Hubble Space Telescope snapped a close-up of the nucleus of the Andromeda galaxy, M31, and found that it is double.

In the 15+ years since, dozens of papers have been written about it, with titles like The stellar population of the decoupled nucleus in M 31, Accretion Processes in the Nucleus of M31, and The Origin of the Young Stars in the Nucleus of M31.

And now there’s a paper which seems, at last, to explain the observations; the cause is, apparently, a complex interplay of gravity, angular motion, and star formation.

It is now reasonably well-understood how supermassive black holes (SMBHs), found in the nuclei of all normal galaxies, can snack on stars, gas, and dust that come within about a third of a light-year (magnetic fields do a great job of shedding the angular momentum of this ordinary, baryonic matter).

Also, disturbances from collisions with other galaxies and the gravitational interactions of matter within the galaxy can easily bring gas to distances of about 10 to 100 parsecs (30 to 300 light years) from a SMBH.

However, how does the SMBH snare baryonic matter that’s between a tenth of a parsec and ~10 parsecs away? Why doesn’t matter just form more-or-less stable orbits at these distances? After all, the local magnetic fields are too weak to make changes (except over very long timescales), and collisions and close encounters too rare (these certainly work over timescales of ~billions of years, as evidenced by the distributions of stars in globular clusters).

That’s where new simulations by Philip Hopkins and Eliot Quataert, both of the University of California, Berkeley, come into play. Their computer models show that at these intermediate distances, gas and stars form separate, lopsided disks that are off-center with respect to the black hole. The two disks are tilted with respect to one another, allowing the stars to exert a drag on the gas that slows its swirling motion and brings it closer to the black hole.

The new work is theoretical; however, Hopkins and Quataert note that several galaxies seem to have lopsided disks of elderly stars, lopsided with respect to the SMBH. And the best-studied of these is in M31.

Hopkins and Quataert now suggest that these old, off-center disks are the fossils of the stellar disks generated by their models. In their youth, such disks helped drive gas into black holes, they say.

The new study “is interesting in that it may explain such oddball [stellar disks] by a common mechanism which has larger implications, such as fueling supermassive black holes,” says Tod Lauer of the National Optical Astronomy Observatory in Tucson. “The fun part of their work,” he adds, is that it unifies “the very large-scale black hole energetics and fueling with the small scale.” Off-center stellar disks are difficult to observe because they lie relatively close to the brilliant fireworks generated by supermassive black holes. But searching for such disks could become a new strategy for hunting supermassive black holes in galaxies not known to house them, Hopkins says.

Sources: ScienceNews, “The Nuclear Stellar Disk in Andromeda: A Fossil from the Era of Black Hole Growth”, Hopkins, Quataert, to be published in MNRAS (arXiv preprint), AGN Fueling: Movies.

LHC Sets Record for Particle Collisions, Marks “New Territory” in Physics

The Large Hadron Collider at CERN. Credit: CERN/LHC

Event display of a 7 TeV proton collision recorded by ATLAS. Credit: CERN

Physicists at the CERN research center collided sub-atomic particles in the Large Hadron Collider on Tuesday at the highest energies ever achieved. “It’s a great day to be a particle physicist,” said CERN Director General Rolf Heuer. “A lot of people have waited a long time for this moment, but their patience and dedication is starting to pay dividends.” Already, the instruments in the LHC have recorded thousands of events, and at this writing, the LHC has had more than an hour of stable and colliding beams.

This is an attempt to create mini-versions of the Big Bang that led to the birth of the universe 13.7 billion years ago, providing new insights into the nature and evolution of matter in the early Universe.

Watch History Live from the Large Hadron Collider

Particle Collider
Today, CERN announced that the LHCb experiment had revealed the existence of two new baryon subatomic particles. Credit: CERN/LHC/GridPP

CERN announced that on March 30 they will attempt to circulate beams in the Large Hadron Collider at 3.5 TeV, the highest energy yet achieved in a particle accelerator. A live webcast of the event will be shown, including live footage from the control rooms for the LHC accelerator and all four LHC experiments, as well as a press conference after the first collisions are announced.

“With two beams at 3.5 TeV, we’re on the verge of launching the LHC physics program,” said CERN’s Director for Accelerators and Technology, Steve Myers. “But we’ve still got a lot of work to do before collisions. Just lining the beams up is a challenge in itself: it’s a bit like firing needles across the Atlantic and getting them to collide half way.”
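For a sense of just how relativistic those beams are (basic kinematics with an assumed proton rest energy, not a CERN figure), a 3.5 TeV proton carries about 3,700 times its rest energy:

```python
# Sketch: how relativistic a 3.5 TeV LHC proton is, assuming the standard
# proton rest energy of 938.272 MeV (basic kinematics, not CERN data).
import math

MP_C2_GEV = 0.938272    # proton rest energy, GeV
E_GEV = 3500.0          # beam energy per proton, GeV

gamma = E_GEV / MP_C2_GEV             # Lorentz factor, ~3700
beta = math.sqrt(1 - 1 / gamma**2)    # speed as a fraction of c
print(gamma, 1 - beta)                # within a few parts in 1e8 of light speed
```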

The webcast will be available at a link to be announced, but the tentative schedule of events (subject to change) and more information can be found at this link.

Webcasts will also be available from the control rooms of the four LHC experiments: ALICE, ATLAS, CMS and LHCb. The webcasts will be primarily in English.

Between now and 30 March, the LHC team will be working with 3.5 TeV beams to commission the beam control systems and the systems that protect the particle detectors from stray particles. All these systems must be fully commissioned before collisions can begin.

“The LHC is not a turnkey machine,” said CERN Director General Rolf Heuer. “The machine is working well, but we’re still very much in a commissioning phase and we have to recognize that the first attempt to collide is precisely that. It may take hours or even days to get collisions.”

The last time CERN switched on a major new research machine – the Large Electron Positron collider (LEP), in 1989 – it took three days from the first attempt to collide to the first recorded collisions.

The current Large Hadron Collider run began on 20 November 2009, with the first circulating beam at 0.45 TeV. Milestones were quick to follow, with twin circulating beams established by 23 November and a world record beam energy of 1.18 TeV being set on 30 November. By the time the LHC switched off for 2009 on 16 December, another record had been set with collisions recorded at 2.36 TeV and significant quantities of data recorded. Over the 2009 part of the run, each of the LHC’s four major experiments, ALICE, ATLAS, CMS and LHCb recorded over a million particle collisions, which were distributed smoothly for analysis around the world on the LHC computing grid. The first physics papers were soon to follow. After a short technical stop, beams were again circulating on 28 February 2010, and the first acceleration to 3.5 TeV was on 19 March.

Once 7 TeV collisions have been established, the plan is to run continuously for a period of 18-24 months, with a short technical stop at the end of 2010. This will bring enough data across all the potential discovery areas to firmly establish the LHC as the world’s foremost facility for high-energy particle physics.

Source: CERN

This is Getting Boring: General Relativity Passes Yet another Big Test!

Princeton University scientists (from left) Reinabelle Reyes, James Gunn and Rachel Mandelbaum led a team that analyzed more than 70,000 galaxies and demonstrated that the universe - at least up to a distance of 3.5 billion light years from Earth - plays by the rules set out by Einstein in his theory of general relativity. (Photo: Brian Wilson)

Published in 1915, Einstein’s theory of general relativity (GR) passed its first big test just a few years later, when the predicted gravitational deflection of light passing near the Sun was observed during the 1919 solar eclipse.
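That 1919 prediction can be reproduced in a few lines from GR’s deflection formula, theta = 4GM/(c²b), for light grazing the solar limb (a standard textbook calculation):

```python
# Sketch: GR's predicted deflection of starlight grazing the Sun,
# theta = 4GM / (c^2 b) -- the value tested by the 1919 eclipse expedition.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
R_SUN = 6.957e8      # solar radius, m (light just grazing the limb)

theta_rad = 4 * G * M_SUN / (C**2 * R_SUN)
theta_arcsec = math.degrees(theta_rad) * 3600
print(theta_arcsec)  # ~1.75 arcseconds, twice the naive Newtonian value
```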

In 1960, GR passed its first big test in a lab here on Earth: the Pound-Rebka experiment. And over the nine decades since its publication, GR has passed test after test after test, always with flying colors (check out this review for an excellent summary).

But the tests have always been within the solar system, or otherwise indirect.

Now a team led by Princeton University scientists has tested GR to see if it holds true at cosmic scales. And, after two years of analyzing astronomical data, the scientists have concluded that Einstein’s theory works as well over vast distances as in more local regions of space.

A partial map of the distribution of galaxies in the SDSS, going out to a distance of 7 billion light years. The amount of galaxy clustering that we observe today is a signature of how gravity acted over cosmic time, and allows us to test whether general relativity holds over these scales. (M. Blanton, SDSS)

The scientists’ analysis of more than 70,000 galaxies demonstrates that the universe – at least up to a distance of 3.5 billion light years from Earth – plays by the rules set out by Einstein in his famous theory. While GR has been accepted by the scientific community for over nine decades, until now no one had tested the theory so thoroughly and robustly at distances and scales that go way beyond the solar system.

Reinabelle Reyes, a Princeton graduate student in the Department of Astrophysical Sciences, along with co-authors Rachel Mandelbaum, an associate research scholar, and James Gunn, the Eugene Higgins Professor of Astronomy, outlined their assessment in the March 11 edition of Nature.

Other scientists collaborating on the paper include Tobias Baldauf, Lucas Lombriser and Robert Smith of the University of Zurich and Uros Seljak of the University of California-Berkeley.

The results are important, they said, because they shore up current theories explaining the shape and direction of the universe, including ideas about dark energy, and dispel some hints from other recent experiments that general relativity may be wrong.

“All of our ideas in astronomy are based on this really enormous extrapolation, so anything we can do to see whether this is right or not on these scales is just enormously important,” Gunn said. “It adds another brick to the foundation that underlies what we do.”

GR is one of two core theories underlying all of contemporary astrophysics and cosmology (the other is the Standard Model of particle physics, a quantum theory); it explains everything from black holes to the Big Bang.

In recent years, several alternatives to general relativity have been proposed. These modified theories of gravity depart from general relativity on large scales to circumvent the need for dark energy, dark matter, or both. But because these theories were designed to match the predictions of general relativity about the expansion history of the universe, a factor that is central to current cosmological work, it has become crucial to know which theory is correct, or at least represents reality as best as can be approximated.

“We knew we needed to look at the large-scale structure of the universe and the growth of smaller structures composing it over time to find out,” Reyes said. The team used data from the Sloan Digital Sky Survey (SDSS), a long-term, multi-institution telescope project mapping the sky to determine the position and brightness of several hundred million galaxies and quasars.

By calculating the clustering of these galaxies, which stretch nearly one-third of the way to the edge of the observable universe, and analyzing their velocities and the distortion of their light by intervening material – weak gravitational lensing, caused primarily by dark matter – the researchers have shown that Einstein’s theory explains the nearby universe better than alternative theories of gravity.

Some of the 70,000 luminous galaxies in SDSS analyzed (Image: SDSS Collaboration)

The Princeton scientists studied the effects of gravity on the SDSS galaxies and clusters of galaxies over long periods of time. They observed how this fundamental force drives galaxies to clump into larger collections of galaxies and how it shapes the expansion of the universe.

Critically, because relativity calls for the curvature of space to be equal to the curvature of time, the researchers could calculate whether light was influenced in equal amounts by both, as it should be if general relativity holds true.
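This consistency check is commonly packaged into a single statistic, E_G, which in general relativity reduces to the ratio of the present-day matter density to the growth rate of cosmic structure. Below is a minimal sketch of the GR prediction, assuming illustrative flat-ΛCDM parameters (Ω_m ≈ 0.25) and the common approximation f(z) ≈ Ω_m(z)^0.55; this is a toy calculation, not the survey’s actual analysis pipeline:

```python
# Toy E_G calculation under GR in flat LambdaCDM (illustrative parameters).
OMEGA_M0 = 0.25            # present-day matter density (assumed value)
OMEGA_L0 = 1.0 - OMEGA_M0  # dark-energy density (flat universe)
Z = 0.32                   # rough typical redshift of the galaxy sample

def omega_m(z):
    """Matter density parameter at redshift z in flat LambdaCDM."""
    a3 = (1.0 + z) ** 3
    return OMEGA_M0 * a3 / (OMEGA_M0 * a3 + OMEGA_L0)

def growth_rate(z, gamma=0.55):
    """Approximate linear growth rate f(z) ~ Omega_m(z)^gamma (GR: gamma ~ 0.55)."""
    return omega_m(z) ** gamma

# In GR, E_G(z) = Omega_m0 / f(z); modified-gravity theories generally
# predict a different value, which is what makes E_G a useful test.
e_g = OMEGA_M0 / growth_rate(Z)
print(f"GR prediction: E_G({Z}) ~ {e_g:.2f}")
```

For comparison, the Nature paper reports a measured E_G of roughly 0.39 at these scales, consistent with this GR value and in tension with some modified-gravity alternatives.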

“This is the first time this test was carried out at all, so it’s a proof of concept,” Mandelbaum said. “There are other astronomical surveys planned for the next few years. Now that we know this test works, we will be able to use it with better data that will be available soon to more tightly constrain the theory of gravity.”

Firming up the predictive powers of GR can help scientists better understand whether current models of the universe make sense, the scientists said.

“Any test we can do in building our confidence in applying these very beautiful theoretical things but which have not been tested on these scales is very important,” Gunn said. “It certainly helps when you are trying to do complicated things to understand fundamentals. And this is a very, very, very fundamental thing.”

“The nice thing about going to the cosmological scale is that we can test any full, alternative theory of gravity, because it should predict the things we observe,” said co-author Uros Seljak, a professor of physics and of astronomy at UC Berkeley and a faculty scientist at Lawrence Berkeley National Laboratory who is currently on leave at the Institute of Theoretical Physics at the University of Zurich. “Those alternative theories that do not require dark matter fail these tests.”

Sources: “Princeton scientists say Einstein’s theory applies beyond the solar system” (Princeton University), “Study validates general relativity on cosmic scale, existence of dark matter” (University of California Berkeley), “Confirmation of general relativity on large scales from weak lensing and galaxy velocities” (Nature, arXiv preprint)

World-wide Campaign Sheds New Light on Nature’s “LHC”

Recent observations of blazar jets require researchers to look deeper into whether current theories about jet formation and motion require refinement. This simulation, courtesy of Jonathan McKinney (KIPAC), shows a black hole pulling in nearby matter (yellow) and spraying energy back out into the universe in a jet (blue and red) that is held together by magnetic field lines (green).

In a manner somewhat like the formation of an alliance to defeat Darth Vader’s Death Star, more than a decade ago astronomers formed the Whole Earth Blazar Telescope (WEBT) consortium to understand Nature’s Death Ray Gun (a.k.a. blazars). And contrary to its at-death’s-door-sounding acronym, the consortium’s GASP – the GLAST-AGILE Support Program – has proved crucial to unraveling the secrets of how Nature’s “LHC” works.

“As the universe’s biggest accelerators, blazar jets are important to understand,” said Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) Research Fellow Masaaki Hayashida, corresponding author on the recent paper presenting the new results with KIPAC Astrophysicist Greg Madejski. “But how they are produced and how they are structured is not well understood. We’re still looking to understand the basics.”

Blazars dominate the gamma-ray sky, appearing as discrete spots on the dark backdrop of the universe. As nearby matter falls into the supermassive black hole at the center of a blazar, “feeding” the black hole, some of the energy released is sprayed back out into the universe as a jet of particles.

Researchers had previously theorized that such jets are held together by strong magnetic field tendrils, while the jet’s light is created by particles spiraling around these wisp-thin magnetic field “lines”.

Yet, until now, the details have been relatively poorly understood. The recent study upsets the prevailing understanding of the jet’s structure, revealing new insight into these mysterious yet mighty beasts.

“This work is a significant step toward understanding the physics of these jets,” said KIPAC Director Roger Blandford. “It’s this type of observation that is going to make it possible for us to figure out their anatomy.”

Over a full year of observations, the researchers focused on one particular blazar, 3C 279, located in the constellation Virgo, monitoring its jet in many different wavebands: gamma-ray, X-ray, optical, infrared and radio. Blazars flicker continuously, and the researchers expected continual changes in all wavebands. Midway through the year, however, they observed a spectacular change in the jet’s optical and gamma-ray emission: a 20-day-long flare in gamma rays was accompanied by a dramatic change in the jet’s optical light.

Although most optical light is unpolarized – consisting of light with an equal mix of all polarizations – the extreme bending of energetic particles around a magnetic field line can polarize light. During the 20-day gamma-ray flare, optical light from the jet changed its polarization. This temporal connection between changes in the gamma-ray light and changes in the optical polarization suggests that light in both wavebands is created in the same part of the jet; during those 20 days, something in the local environment changed to cause both the optical and gamma-ray light to vary.
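The polarized light in question is synchrotron radiation. A standard textbook result (not taken from this study) is that electrons with a power-law energy distribution of index p, gyrating in a perfectly ordered magnetic field, emit light with a maximum linear polarization of (p + 1)/(p + 7/3). A quick illustration:

```python
# Maximum linear polarization of synchrotron emission from electrons with a
# power-law energy spectrum of index p (classic textbook formula; assumes a
# perfectly uniform magnetic field).
def max_sync_polarization(p):
    return (p + 1.0) / (p + 7.0 / 3.0)

for p in (2.0, 2.5, 3.0):
    print(f"p = {p}: max polarization ~ {max_sync_polarization(p):.0%}")
```

Observed blazar polarizations typically fall well below this ceiling, which is usually attributed to the magnetic field being partially tangled along the line of sight, so changes in polarization trace changes in the field’s ordering and orientation.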

“We have a fairly good idea of where in the jet optical light is created; now that we know the gamma rays and optical light are created in the same place, we can for the first time determine where the gamma rays come from,” said Hayashida.

This knowledge has far-reaching implications about how a supermassive black hole produces polar jets. The great majority of energy released in a jet escapes in the form of gamma rays, and researchers previously thought that all of this energy must be released near the black hole, close to where the matter flowing into the black hole gives up its energy in the first place. Yet the new results suggest that – like optical light – the gamma rays are emitted relatively far from the black hole. This, Hayashida and Madejski said, in turn suggests that the magnetic field lines must somehow help the energy travel far from the black hole before it is released in the form of gamma rays.

“What we found was very different from what we were expecting,” said Madejski. “The data suggest that gamma rays are produced not one or two light days from the black hole [as was expected] but closer to one light year. That’s surprising.”
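To put those scales in context, here is a rough comparison against the black hole’s own gravitational radius, r_g = GM/c². The mass used (5 × 10⁸ solar masses) is a hypothetical ballpark figure for a bright blazar, not a value quoted in the study:

```python
# Compare "light days" and "light years" to a blazar black hole's
# gravitational radius r_g = G*M/c^2 (illustrative mass assumption).
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
M_BH = 5e8 * M_SUN   # assumed black hole mass (hypothetical)

r_g = G * M_BH / C**2            # gravitational radius in meters
light_day = C * 86400.0          # one light day in meters
light_year = light_day * 365.25  # one light year in meters

print(f"r_g ~ {r_g:.2e} m")
print(f"1 light day  ~ {light_day / r_g:.0f} r_g")
print(f"1 light year ~ {light_year / r_g:.0f} r_g")
```

Under this assumed mass, moving the emission site from a couple of light days out to a light year shifts it from tens of gravitational radii to roughly ten thousand, which is why the result was so surprising.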

In addition to revealing where in the jet light is produced, the gradual change of the optical light’s polarization also reveals something unexpected about the overall shape of the jet: the jet appears to curve as it travels away from the black hole.

“At one point during a gamma-ray flare, the polarization rotated about 180 degrees as the intensity of the light changed,” said Hayashida. “This suggests that the whole jet curves.”

This new understanding of the inner workings and construction of a blazar jet requires a new working model of the jet’s structure, one in which the jet curves dramatically and the most energetic light originates far from the black hole. This, Madejski said, is where theorists come in. “Our study poses a very important challenge to theorists: how would you construct a jet that could potentially be carrying energy so far from the black hole? And how could we then detect that? Taking the magnetic field lines into account is not simple. Related calculations are difficult to do analytically, and must be solved with extremely complex numerical schemes.”

Theorist Jonathan McKinney, a Stanford University Einstein Fellow and expert on the formation of magnetized jets, agrees that the results pose as many questions as they answer. “There’s been a long-time controversy about these jets – about exactly where the gamma-ray emission is coming from. This work constrains the types of jet models that are possible,” said McKinney, who is unassociated with the recent study. “From a theoretician’s point of view, I’m excited because it means we need to rethink our models.”

As theorists consider how the new observations fit models of how jets work, Hayashida, Madejski and other members of the research team will continue to gather more data. “There’s a clear need to conduct such observations across all types of light to understand this better,” said Madejski. “It takes a massive amount of coordination to accomplish this type of study, which included more than 250 scientists and data from about 20 telescopes. But it’s worth it.”

With this and future multi-wavelength studies, theorists will have new insight with which to craft models of how the universe’s biggest accelerators work. Darth Vader has been denied all access to these research results.

Sources: DOE/SLAC National Accelerator Laboratory Press Release, and a paper in the 18 February 2010 issue of Nature.