Astronomy Without A Telescope – Blazar Jets

A 5000 light-year-long jet observable in optical light from the giant elliptical galaxy M87 – which is not technically a blazar, but only because its jet isn't more closely aligned with Earth. Credit: ESA/Hubble.


Polar jets are often found around objects with spinning accretion disks – anything from newly forming stars to ageing neutron stars. And some of the most powerful polar jets arise from accretion disks around black holes, be they of stellar or supermassive size. In the latter case, jets emerging from active galaxies such as quasars, with their jets roughly orientated towards Earth, are called blazars.

The physics underlying the production of polar jets at any scale is not completely understood. It is likely that twisting magnetic lines of force, generated within a spinning accretion disk, channel plasma from the compressed centre of the accretion disk into the narrow jets we observe. But exactly what energy transfer process gives the jet material the escape velocity required to be thrown clear is still subject to debate.

In the extreme cases of black hole accretion disks, jet material acquires escape velocities close to the speed of light – which is needed if the material is to escape from the vicinity of a black hole. Polar jets thrown out at such speeds are usually called relativistic jets.
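To get a feel for why relativistic speeds are unavoidable, the Newtonian escape-velocity formula already tells the story: v_esc = √(2GM/r) reaches the speed of light exactly at the Schwarzschild radius. A minimal sketch (the 10-solar-mass black hole and the radii are illustrative values only; a rigorous treatment is general-relativistic):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

def schwarzschild_radius(mass_kg):
    """Radius at which the Newtonian escape velocity reaches c."""
    return 2 * G * mass_kg / C**2

def escape_velocity(mass_kg, r_m):
    """Newtonian escape velocity sqrt(2GM/r) -- illustrative only."""
    return math.sqrt(2 * G * mass_kg / r_m)

m = 10 * M_SUN                        # an illustrative stellar-mass black hole
rs = schwarzschild_radius(m)          # roughly 30 km
for mult in (1.5, 3, 10):
    v = escape_velocity(m, mult * rs)
    print(f"at {mult} Schwarzschild radii: v_esc = {v / C:.2f} c")
```

The closer the launch point sits to the horizon, the closer the required speed gets to c – which is why material thrown clear of a black hole's vicinity must be relativistic.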

Relativistic jets from blazars broadcast energetically across the electromagnetic spectrum – ground-based radio telescopes can pick up their low-frequency radiation, while space-based telescopes, like Fermi or Chandra, can pick up high-frequency radiation. As you can see from the lead image of this story, Hubble can pick up optical light from one of M87's jets – although ground-based optical observations of a 'curious straight ray' from M87 were recorded as early as 1918.

Polar jets are thought to be shaped (collimated) by twisting magnetic lines of force. The driving force that pushes the jets out may be magnetic and/or intense radiation pressure, but no-one is really sure at this stage. Credit: NASA.

A recent review of high resolution data obtained from Very Long Baseline Interferometry (VLBI) – involving integrating data inputs from geographically distant radio telescope dishes into a giant virtual telescope array – is providing a bit more insight (although only a bit) into the structure and dynamics of jets from active galaxies.
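The reason VLBI can probe jet bases at all is the diffraction limit: angular resolution scales as observing wavelength divided by baseline, so linking dishes across the planet yields sub-milliarcsecond detail. A back-of-the-envelope sketch (the 1.3 cm wavelength and 10,000 km baseline are illustrative values, not figures from the review):

```python
import math

ARCSEC_PER_RAD = 180 / math.pi * 3600   # ~206,265 arcseconds per radian

def resolution_mas(wavelength_m, baseline_m):
    """Diffraction-limited resolution, theta ~ lambda / B, in milliarcseconds."""
    return wavelength_m / baseline_m * ARCSEC_PER_RAD * 1e3

# 1.3 cm (22 GHz) radio waves on a 10,000 km, Earth-scale baseline:
print(resolution_mas(0.013, 1.0e7))     # ~0.27 milliarcseconds

# the same wavelength on a single 100 m dish, for comparison:
print(resolution_mas(0.013, 100.0))     # ~2.7e4 mas, i.e. roughly 27 arcseconds
```

That factor of ~100,000 in resolving power is what lets the virtual telescope pick apart structure within tens of black-hole radii of the jet's origin.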

The radiation from such jets is largely non-thermal (i.e. not a direct result of the temperature of the jet material). Radio emission probably results from synchrotron effects – where electrons spiralling rapidly within a magnetic field emit radiation across the whole electromagnetic spectrum, but generally with a peak at radio wavelengths. The inverse Compton effect, where a collision with a rapidly moving particle imparts more energy, and hence a higher frequency, to a photon, may also contribute to the higher frequency radiation.
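The inverse Compton boost is dramatic because, in the simple (Thomson) regime, the scattered photon's mean energy grows as the square of the electron's Lorentz factor. A toy calculation with illustrative numbers (a CMB-like seed photon and a Lorentz factor of 100,000):

```python
def inverse_compton_energy_ev(gamma, seed_ev):
    """Mean scattered photon energy in the Thomson regime: ~(4/3) gamma^2 E_seed."""
    return (4.0 / 3.0) * gamma**2 * seed_ev

CMB_PEAK_EV = 6.6e-4     # a typical CMB photon energy, ~2.8 kT at 2.7 K
boosted = inverse_compton_energy_ev(1e5, CMB_PEAK_EV)
print(f"{boosted:.1e} eV")   # ~8.8e6 eV: a microwave photon upscattered into gamma rays
```

So a single scattering off a sufficiently relativistic electron turns a microwave photon into a gamma ray – one reason blazar jets shine across the whole spectrum.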

Anyhow, VLBI observations suggest that blazar jets form within a distance of between 10 and 100 times the radius of the supermassive black hole – and whatever forces work to accelerate them to relativistic velocities may only operate out to a distance of about 1,000 times that radius. The jets may then beam out over light-year distances as a result of that initial momentum push.

Shock fronts can be found near the base of the jets, which may represent points at which magnetically driven flow (Poynting flux) fades to kinetic mass flow – although magnetohydrodynamic forces continue operating to keep the jet collimated (i.e. contained within a narrow beam) over light year distances.

Left: An X-ray/radio/optical composite photo of Centaurus A - also not technically a blazar because its jets don't align with the Earth. Credit: X-ray: NASA/CXC/CfA/R.Kraft et al.; Submillimeter: MPIfR/ESO/APEX/A.Weiss et al.; Optical: ESO/WFI. Right: A composite image showing the radio glow from Centaurus A compared with that of the full Moon. The foreground antennas are CSIRO's Australia Telescope Compact Array, which gathered the data for this image.

That was about as much as I managed to glean from this interesting, though at times jargon-dense, paper.

Further reading: Lobanov, A. Physical properties of blazar jets from VLBI observations.

Hawking(ish) Radiation Observed

In 1974, Stephen Hawking proposed a seemingly ridiculous hypothesis: black holes, the gravitational monsters from which nothing escapes, evaporate. To justify this, he proposed that in a pair of virtual particles straying too close to the event horizon, one member could be captured while the other escaped and became a real particle. The escaping particle would carry energy, and thus mass, away from the black hole, slowly depleting it. Due to the difficulty of observing astronomical black holes, this emission has gone undetected. But recently, a team of Italian physicists, led by Francesco Belgiorno, claims to have observed Hawking radiation in the lab. Well, sort of. It depends on your definition.

The experiment worked by sending powerful laser pulses through a block of ultra-pure glass. The intensity of the laser would change the optical properties of the glass, increasing the refractive index to the point that light could not pass. In essence, this created an artificial event horizon. But instead of a black hole, into which particles can pass but never return, this created a “white hole”, into which particles can never pass in the first place. If a virtual pair were created near this barrier, one member could be trapped on one side while the other member could escape and be detected, creating a situation analogous to that predicted for Hawking radiation.

Readers with some background in quantum physics may be scratching their heads at this point. The experiment uses a barrier to impede the photons, but quantum tunneling has demonstrated that there's no such thing as a perfect barrier: some photons should tunnel through. To avoid detecting these photons, the team simply moved the detector. While some photons will undoubtedly tunnel through, they would continue along their original path, so the detector was placed 90º off that path to avoid catching them.

The change in position also helped to minimize other sources of false detections, such as scattering. At 90º, scattering only occurs for vertically polarized light, and the experiment used horizontally polarized light. As a check to make sure none of the light became mispolarized, the team verified that no photons of the emitted wavelength were observed. The team also had to guard against false detections from absorption and re-emission by the molecules in the glass (fluorescence). This was handled by first measuring how much fluorescence to expect, so its effects could be subtracted out. Additionally, the group chose a wavelength at which fluorescence was minimized.

After removing all the sources of error they could account for, the team still reported a strong signal, which they interpreted as coming from separated virtual particles and called a detection of Hawking radiation. Other scientists disagree on the definition. They do not question the interpretation, but note that Hawking radiation, by definition, occurs only at gravitational event horizons. While this detection is interesting, it does not shed light on the more interesting effects that come with gravitational event horizons, such as quantum gravity or the paradox posed by the trans-Planckian problem. In other words, while this may help establish that virtual particles like this exist, it says nothing about whether they could truly escape from near a black hole, which is a requirement for “true” Hawking radiation.

Meanwhile, other teams continue to explore similar effects with other artificial barriers and event horizons to explore the effects of these virtual particles. Similar effects have been reported in other such systems including ones with water waves to form the barrier.

New Discovery at the Large Hadron Collider?

Image of a 7 TeV proton-proton collision in CMS producing more than 100 charged particles. Credit: CERN


Scientists at the Large Hadron Collider reported today they apparently have discovered a previously unobserved phenomenon in proton-proton collisions. One of the detectors shows that the colliding particles appear to be intimately linked in a way not seen before in proton collisions. The correlations were observed between particles produced in 7 TeV collisions. “The new feature has appeared in our analysis around the middle of July,” physicist Guido Tonelli told fellow CERN scientists at a seminar to present the findings from the collider’s CMS (Compact Muon Solenoid) detector.

The scientists said the effect is subtle and they have performed several detailed crosschecks and studies to ensure that it is real. It bears some similarity to effects seen in the collisions of nuclei at the RHIC facility located at the US Brookhaven National Laboratory, which have been interpreted as being possibly due to the creation of hot dense matter formed in the collisions.

CMS studies the collisions by measuring angular correlations between the particles as they fly away from the point of impact.

The scientists stressed that there are several potential explanations to be considered, and they presented their news to the physics community at CERN today in hopes of “fostering a broader discussion on the subject.”

“Now we need more data to analyze fully what’s going on, and to take our first steps into the vast landscape of new physics we hope the LHC will open up,” said Tonelli.

Proton running at the Large Hadron Collider is scheduled to continue until the end of October, during which time CMS will accumulate much more data to analyze. After that, and for the remainder of 2010, the LHC will collide lead nuclei.

Source: CERN

Disturbance in the Force – A Spatially Varying Fine Structure Constant

Illustration of the dipolar variation in the fine-structure constant, alpha, across the sky, as seen by the two telescopes used in the work: the Keck telescope in Hawaii and the ESO Very Large Telescope in Chile. Image credit: Copyright Dr. Julian Berengut, UNSW, 2010.


In order for astronomers to explore the outer reaches of our universe, they rely upon the assumption that the physical constants we observe in the lab on Earth are constant everywhere in the universe. This assumption seems to hold up extremely well. If the universe's constants were grossly different, stars would fail to shine and galaxies would fail to coalesce. Yet as far as we look in our universe, the effects that rely on these physical constants being constant still seem to happen. But new research has revealed that one of these constants, known as the fine structure constant, may vary ever so slightly in different portions of the universe.

Of all physical constants, the fine structure constant seems like an odd one to be probing with astronomy. It appears in many equations involving some of the smallest scales in the universe. In particular, it is used frequently in quantum physics and is part of the quantum derivation of the structure of the hydrogen atom. This quantum model determines the allowed energy levels of electrons in the atoms. Change this constant and the orbitals shift as well.

Since the allowed energy levels determine what wavelengths of light such an atom can emit, a careful analysis of the positioning of these spectral lines in distant galaxies would reveal variations in the constant that helped control them. Using the Very Large Telescope (VLT) and the Keck Observatory, a team from the University of New South Wales has analyzed the spectra of 300 galaxies and found the subtle changes that should exist if this constant were less than constant.

Since the two sets of telescopes used point in different directions (Keck in the Northern Hemisphere and the VLT in the Southern), the researchers could check whether the variation has a preferred direction – and it seems to. As Julian King, one of the paper's authors, explained, “Looking to the north with Keck we see, on average, a smaller alpha in distant galaxies, but when looking south with the VLT we see a larger alpha.”

However, “it varies by only a tiny amount — about one part in 100,000 — over most of the observable universe”. As such, although the result is very intriguing, it does not demolish our understanding of the universe or make hypotheses like that of a greatly variable speed of light plausible (an argument frequently tossed around by Creationists). But, “If our results are correct, clearly we shall need new physical theories to satisfactorily describe them.”
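To see why spectral lines are sensitive to alpha at all: the gross hydrogen energy levels scale as alpha squared, so to first order a fractional change in alpha shifts transition wavelengths by about twice that fraction. A simplified sketch (the actual study relies on the far more sensitive 'many-multiplet' method applied to heavier ions; this is just the leading-order scaling for hydrogen):

```python
# E_n = -alpha^2 * m_e c^2 / (2 n^2)  =>  d(lambda)/lambda ~ -2 * d(alpha)/alpha

def wavelength_shift_nm(rest_nm, d_alpha_over_alpha):
    """First-order wavelength shift of a hydrogen line for a small change in alpha."""
    return -2.0 * d_alpha_over_alpha * rest_nm

LYMAN_ALPHA_NM = 121.567                            # hydrogen Lyman-alpha rest wavelength
shift = wavelength_shift_nm(LYMAN_ALPHA_NM, 1e-5)   # the ~1-in-100,000 variation
print(f"{shift:+.5f} nm")    # about -0.00243 nm -- tiny, hence the need for 300 galaxies
```

A shift of a few thousandths of a nanometre is far too small to pin down in any single spectrum, which is why the result only emerges statistically from a large sample.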

While this finding doesn't challenge our knowledge of the observable universe, it may have implications for regions outside of the portion of the universe we can observe. Since our viewing distance is ultimately limited by how far we can look back, and that time is limited by when the universe became transparent, we cannot observe what the universe would be like beyond that visible horizon. The team speculates that beyond it, there may be even larger changes in this constant which would have large effects on physics in such portions. They conclude the results may “suggest a violation of the Einstein Equivalence Principle, and could infer a very large or infinite universe, within which our 'local' Hubble volume represents a tiny fraction, with correspondingly small variations in the physical constants.”

This would mean that, outside of our portion of the universe, the physical laws may not be suitable for life making our little corner of the universe a sort of oasis. This could help solve the supposed “fine-tuning” problem without relying on explanations such as multiple universes.

Want some other articles on this subject? Here's an article about how there might be 10 dimensions.

Scientists Say They Can Now Test String Theory

Quantum entanglement visualized. Credit: Discovery News.


The idea of the “Theory of Everything” is enticing – that we could somehow explain all that is. String theory has been proposed since the 1960s as a way to reconcile quantum mechanics and general relativity into such an explanation. However, the biggest criticism of string theory is that it isn't testable. But now, a research team led by scientists from Imperial College London unexpectedly discovered that string theory also seems to predict the behavior of entangled quantum particles. As this prediction can be tested in the laboratory, the researchers say they can now test string theory.

“If experiments prove that our predictions about quantum entanglement are correct, this will demonstrate that string theory ‘works’ to predict the behavior of entangled quantum systems,” said Professor Mike Duff, lead author of the study.

String theory was originally developed to describe the fundamental particles and forces that make up our universe, and has been a favorite contender among physicists to allow us to reconcile what we know about the incredibly small from particle physics with our understanding of the very large from our studies of cosmology. Using the theory to predict how entangled quantum particles behave provides the first opportunity to test string theory by experiment.

But – at least for now – the scientists won't be able to confirm that string theory is actually the way to explain all that is, only whether it actually works.

“This will not be proof that string theory is the right ‘theory of everything’ that is being sought by cosmologists and particle physicists,” said Duff. “However, it will be very important to theoreticians because it will demonstrate whether or not string theory works, even if its application is in an unexpected and unrelated area of physics.”

String theory is a theory of gravity, an extension of General Relativity, and the classical interpretation of strings and branes is that they are quantum mechanical vibrating, extended charged black holes. The theory hypothesizes that the electrons and quarks within an atom are not 0-dimensional objects, but 1-dimensional strings. These strings can move and vibrate, giving the observed particles their flavor, charge, mass and spin. The strings make closed loops unless they encounter surfaces, called D-branes, where they can open up into 1-dimensional lines. The endpoints of the string cannot break off the D-brane, but they can slide around on it.

Duff said he was sitting in a conference in Tasmania where a colleague was presenting the mathematical formulae that describe quantum entanglement when he realized something. “I suddenly recognized his formulae as similar to some I had developed a few years earlier while using string theory to describe black holes. When I returned to the UK I checked my notebooks and confirmed that the maths from these very different areas was indeed identical.”

Duff and his colleagues realized that the mathematical description of the pattern of entanglement between three qubits resembles the mathematical description, in string theory, of a particular class of black holes. Thus, by combining their knowledge of two of the strangest phenomena in the universe, black holes and quantum entanglement, they realized they could use string theory to produce a prediction that could be tested. Using the string theory mathematics that describes black holes, they predicted the pattern of entanglement that will occur when four qubits are entangled with one another. (The answer to this problem has not been calculated before.) Although it is technically difficult to do, the pattern of entanglement between four entangled qubits could be measured in the laboratory and the accuracy of this prediction tested.

The discovery that string theory seems to make predictions about quantum entanglement is completely unexpected, but because quantum entanglement can be measured in the lab, it does mean that there is a way – finally – for researchers to test predictions based on string theory.

But, Duff said, there is no obvious connection to explain why a theory that is being developed to describe the fundamental workings of our universe is useful for predicting the behavior of entangled quantum systems. “This may be telling us something very deep about the world we live in, or it may be no more than a quirky coincidence”, said Duff. “Either way, it’s useful.”

Source: Imperial College London

Astronomy Without A Telescope – Strange Stars

One step closer to a black hole? A hypothetical strange star results from extreme gravitational compression overcoming the strong interaction that holds neutrons and protons together. Credit: Swinburne University - astronomy.swin.edu.au


Atoms are made of protons, neutrons and electrons. If you cram them together and heat them up, you get a plasma, in which the electrons are only loosely associated with individual nuclei – a dynamic, light-emitting mix of positively charged ions and negatively charged electrons. If you cram that matter together even further, you drive the electrons to merge with the protons, and you are left with a collection of neutrons – like in a neutron star. So, what if you keep cramming that collection of neutrons together into an even higher density? Well, eventually you get a black hole – but before that (at least hypothetically) you get a strange star.

The theory has it that compressing neutrons can eventually overcome the strong interaction, breaking down a neutron into its constituent quarks, giving a roughly equal mix of up, down and strange quarks – allowing these particles to be crammed even closer together in a smaller volume. By convention, this is called strange matter. It has been suggested that very massive neutron stars may have strange matter in their compressed cores.

However, some say that strange matter is a more fundamentally stable configuration than other matter. So, once a star's core becomes strange, contact between it and baryonic (i.e. protons and neutrons) matter might drive the baryonic matter to adopt the strange (but more stable) configuration. This is the sort of thinking behind fears that the Large Hadron Collider might destroy the Earth by producing strangelets, which would then trigger a Kurt Vonnegut ice-nine scenario. However, since the LHC hasn't done any such thing, it's reasonable to think that strange stars probably don't form this way either.

More likely, a 'naked' strange star, with strange matter extending from its core to its surface, might evolve naturally under its own self-gravity. Once a neutron star's core becomes strange matter, it should contract inwards, leaving room for the outer layer to be pulled inwards to a smaller radius and a higher density, at which point that outer layer might also become strange… and so on. Just as it seems implausible to have a star whose core is so dense that it's essentially a black hole, but still with a star-like crust – so it may be that when a neutron star develops a strange core it inevitably becomes strange throughout.

Anyhow, if they exist at all, strange stars should have some tell-tale characteristics. We know that neutron stars tend to lie in the range of 1.4 to 2 solar masses – and that any star with a neutron star's density that's over 10 solar masses has to become a black hole. That leaves a bit of a gap – although there is evidence of stellar black holes down to only 3 solar masses, so the window for strange stars to form may only be in that 2 to 3 solar mass range.

By adopting a more compressed 'ground state' of matter, a strange (quark) star should be smaller, but more massive, than a neutron star. RXJ1856 is in the ballpark for size, but may not be massive enough to fit the theory. Credit: chandra.harvard.edu

The likely electrodynamic properties of strange stars are also of interest (see below). It is likely that electrons will be displaced towards the surface – leaving the body of the star with a net positive charge surrounded by an atmosphere of negatively charged electrons. Presuming a degree of differential rotation between the star and its electron atmosphere, such a structure would generate a magnetic field of the magnitude observed in a number of candidate stars.

Another distinct feature should be a size that is smaller than most neutron stars. One strange star candidate is RXJ1856, which appears to be a neutron star, but is only 11 km in diameter. Some astrophysicists may have muttered hmmm… that’s strange on hearing about it – but it remains to be confirmed that it really is.

Further reading: Negreiros et al (2010) Properties of Bare Strange Stars Associated with Surface Electrical Fields.

Cosmologists Provide Closest Measure of Elusive Neutrino

Slices through the SDSS 3-dimensional map of the distribution of galaxies. Earth is at the center, and each point represents a galaxy, typically containing about 100 billion stars. Galaxies are colored according to the ages of their stars, with the redder, more strongly clustered points showing galaxies that are made of older stars. The outer circle is at a distance of two billion light years. The region between the wedges was not mapped by the SDSS because dust in our own Galaxy obscures the view of the distant universe in these directions. Both slices contain all galaxies within -1.25 and 1.25 degrees declination. Credit: M. Blanton and the Sloan Digital Sky Survey.


Cosmologists – and not particle physicists – could be the ones who finally measure the mass of the elusive neutrino. A group of cosmologists have made their most accurate measurement yet of the mass of these mysterious so-called “ghost particles.” They didn't use a giant particle detector, but instead data from the largest survey ever of galaxies, the Sloan Digital Sky Survey. While previous experiments had shown that neutrinos have a mass, it is thought to be so small that it is very hard to measure. But looking at the Sloan data on galaxies, PhD student Shaun Thomas and his advisers at University College London put the mass of a neutrino at no greater than 0.28 electron volts, which is less than a billionth of the mass of a single hydrogen atom. This is one of the most accurate measurements of the mass of a neutrino to date.
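The 'billionth of a hydrogen atom' comparison is easy to sanity-check, since particle masses are routinely quoted as energies in electron volts – a hydrogen atom's rest energy is about 938.8 MeV:

```python
HYDROGEN_REST_EV = 938.8e6    # hydrogen atom rest energy, ~938.8 MeV
NEUTRINO_BOUND_EV = 0.28      # the UCL upper limit on the neutrino mass

ratio = NEUTRINO_BOUND_EV / HYDROGEN_REST_EV
print(f"{ratio:.1e}")         # ~3.0e-10, comfortably under one billionth
```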

Their work is based on the principle that the huge abundance of neutrinos (there are trillions passing through you right now) has a large cumulative effect on the matter of the cosmos, which naturally forms into “clumps” of groups and clusters of galaxies. As neutrinos are extremely light they move across the universe at great speeds which has the effect of smoothing this natural “clumpiness” of matter. By analysing the distribution of galaxies across the universe (i.e. the extent of this “smoothing-out” of galaxies) scientists are able to work out the upper limits of neutrino mass.

A neutrino is capable of passing through a light-year – about six trillion miles – of lead without hitting a single atom.

Central to this new calculation is the existence of the largest ever 3D map of galaxies, called Mega Z, which covers over 700,000 galaxies recorded by the Sloan Digital Sky Survey and allows measurements over vast stretches of the known universe.

“Of all the hypothetical candidates for the mysterious Dark Matter, so far neutrinos provide the only example of dark matter that actually exists in nature,” said Ofer Lahav, Head of UCL’s Astrophysics Group. “It is remarkable that the distribution of galaxies on huge scales can tell us about the mass of the tiny neutrinos.”

The cosmologists at UCL were able to estimate distances to galaxies using a new method that measures the colour of each of the galaxies. By combining this enormous galaxy map with information from the temperature fluctuations in the after-glow of the Big Bang, called the Cosmic Microwave Background radiation, they were able to put one of the smallest upper limits on the mass of the neutrino to date.

“Although neutrinos make up less than 1% of all matter they form an important part of the cosmological model,” said Dr. Shaun Thomas. “It’s fascinating that the most elusive and tiny particles can have such an effect on the Universe.”

“This is one of the most effective techniques available for measuring the neutrino masses,” said Dr. Filipe Abdalla. “This raises great hopes of finally obtaining a measurement of the mass of the neutrino in years to come.”

The authors are confident that a larger survey of the Universe, such as the one they are working on called the international Dark Energy Survey, will yield an even more accurate weight for the neutrino, potentially at an upper limit of just 0.1 electron volts.
The results are published in the journal Physical Review Letters.

Source: University College London

Magnetic Fields in Inter-cluster Space: Measured at Last


The strength of the magnetic fields here on Earth, on the Sun, in inter-planetary space, on stars in our galaxy (the Milky Way; some of them anyway), in the interstellar medium (ISM) in our galaxy, and in the ISM of other spiral galaxies (some of them anyway) has been measured. But there have been no measurements of the strength of magnetic fields in the space between galaxies (and between clusters of galaxies; the IGM and ICM).

Up till now.

But who cares? What scientific importance does the strength of the IGM and ICM magnetic fields have?

The Large Area Telescope (LAT) on Fermi detects gamma-rays through matter (electrons) and antimatter (positrons) they produce after striking layers of tungsten. Credit: NASA/Goddard Space Flight Center Conceptual Image Lab

Estimates of these fields may provide “a clue that there was some fundamental process in the intergalactic medium that made magnetic fields,” says Ellen Zweibel, a theoretical astrophysicist at the University of Wisconsin, Madison. One “top-down” idea is that all of space was somehow left with a slight magnetic field soon after the Big Bang – around the end of inflation, Big Bang Nucleosynthesis, or decoupling of baryonic matter and radiation – and this field grew in strength as stars and galaxies amassed and amplified its intensity. Another, “bottom-up” possibility is that magnetic fields formed initially by the motion of plasma in small objects in the primordial universe, such as stars, and then propagated outward into space.

So how do you estimate the strength of a magnetic field, tens or hundreds of millions of light-years away, in regions of space a looong way from any galaxies (much less clusters of galaxies)? And how do you do this when you expect these fields to be much less than a nanoGauss (nG), perhaps as small as a femtoGauss (fG, which is a millionth of a nanoGauss)? What trick can you use??

A very neat one, one that relies on physics not directly tested in any laboratory, here on Earth, and unlikely to be so tested during the lifetime of anyone reading this today – the production of positron-electron pairs when a high energy gamma ray photon collides with an infrared or microwave one (this can’t be tested in any laboratory, today, because we can’t make gamma rays of sufficiently high energy, and even if we could, they’d collide so rarely with infrared light or microwaves we’d have to wait centuries to see such a pair produced). But blazars produce copious quantities of TeV gamma rays, and in intergalactic space microwave photons are plentiful (that’s what the cosmic microwave background – CMB – is!), and so too are far infrared ones.
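The pair-production condition itself is simple kinematics: for a head-on collision, the product of the two photon energies must exceed the square of the electron's rest energy (about 0.511 MeV). A quick sketch of why TeV blazar photons 'see' the infrared background, while only far harder gamma rays can pair-produce on the CMB (the example energies are illustrative):

```python
M_E_C2_EV = 0.511e6    # electron rest energy, eV

def threshold_seed_ev(gamma_ray_ev):
    """Minimum background-photon energy for head-on e+/e- pair production:
    E_gamma * E_seed >= (m_e c^2)^2."""
    return M_E_C2_EV**2 / gamma_ray_ev

print(threshold_seed_ev(1e12))    # ~0.26 eV: far-infrared photons can stop a 1 TeV gamma ray
print(threshold_seed_ev(4e14))    # ~6.5e-4 eV: only ~400 TeV photons reach typical CMB energies
```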

MAGIC telescope (Credit: Robert Wagner)

Having been produced, the positron and electron will interact with the CMB, local magnetic fields, other electrons and positrons, etc. (the details are rather messy, but were basically worked out some time ago), with the net result that observations of distant, bright sources of TeV gamma rays can set lower limits on the strength of the IGM and ICM magnetic fields through which they travel. Several recent papers report results of such observations, using the Fermi Gamma-Ray Space Telescope and the MAGIC telescope.

So how strong are these magnetic fields? The various papers give different numbers, from greater than a few tenths of a femtoGauss to greater than a few femtoGauss.

“The fact that they’ve put a lower bound on magnetic fields far out in intergalactic space, not associated with any galaxy or clusters, suggests that there really was some process that acted on very wide scales throughout the universe,” Zweibel says. And that process would have occurred in the early universe, not long after the Big Bang. “These magnetic fields could not have formed recently and would have to have formed in the primordial universe,” says Ruth Durrer, a theoretical physicist at the University of Geneva.

So, perhaps we have yet one more window into the physics of the early universe; hooray!

Sources: Science News, arXiv:1004.1093, arXiv:1003.3884

Andromeda’s Double Nucleus – Explained at Last?

M31's nucleus (Credit: WF/PC, Hubble Space Telescope)


In 1993, the Hubble Space Telescope snapped a close-up of the nucleus of the Andromeda galaxy, M31, and found that it is double.

In the 15+ years since, dozens of papers have been written about it, with titles like The stellar population of the decoupled nucleus in M 31, Accretion Processes in the Nucleus of M31, and The Origin of the Young Stars in the Nucleus of M31.

And now there’s a paper which seems, at last, to explain the observations; the cause is, apparently, a complex interplay of gravity, angular motion, and star formation.

It is now reasonably well-understood how supermassive black holes (SMBHs), found in the nuclei of all normal galaxies, can snack on stars, gas, and dust which come within about a third of a light-year (magnetic fields do a great job of shedding the angular momentum of this ordinary, baryonic matter).

Also, disturbances from collisions with other galaxies and the gravitational interactions of matter within the galaxy can easily bring gas to distances of about 10 to 100 parsecs (30 to 300 light years) from a SMBH.

However, how does the SMBH snare baryonic matter that’s between a tenth of a parsec and ~10 parsecs away? Why doesn’t matter just form more-or-less stable orbits at these distances? After all, the local magnetic fields are too weak to make changes (except over very long timescales), and collisions and close encounters too rare (these certainly work over timescales of ~billions of years, as evidenced by the distributions of stars in globular clusters).

That’s where new simulations by Philip Hopkins and Eliot Quataert, both of the University of California, Berkeley, come into play. Their computer models show that at these intermediate distances, gas and stars form separate, lopsided disks that are off-center with respect to the black hole. The two disks are tilted with respect to one another, allowing the stars to exert a drag on the gas that slows its swirling motion and brings it closer to the black hole.

The new work is theoretical; however, Hopkins and Quataert note that several galaxies seem to have lopsided disks of elderly stars, lopsided with respect to the SMBH. And the best-studied of these is in M31.

Hopkins and Quataert now suggest that these old, off-center disks are the fossils of the stellar disks generated by their models. In their youth, such disks helped drive gas into black holes, they say.

The new study “is interesting in that it may explain such oddball [stellar disks] by a common mechanism which has larger implications, such as fueling supermassive black holes,” says Tod Lauer of the National Optical Astronomy Observatory in Tucson. “The fun part of their work,” he adds, is that it unifies “the very large-scale black hole energetics and fueling with the small scale.” Off-center stellar disks are difficult to observe because they lie relatively close to the brilliant fireworks generated by supermassive black holes. But searching for such disks could become a new strategy for hunting supermassive black holes in galaxies not known to house them, Hopkins says.

Sources: ScienceNews, “The Nuclear Stellar Disk in Andromeda: A Fossil from the Era of Black Hole Growth”, Hopkins, Quataert, to be published in MNRAS (arXiv preprint), AGN Fueling: Movies.

LHC Sets Record for Particle Collisions, Marks “New Territory” in Physics

The Large Hadron Collider at CERN. Credit: CERN/LHC

Event display of a 7 TeV proton collision recorded by ATLAS. Credit: CERN

Physicists at the CERN research center collided sub-atomic particles in the Large Hadron Collider on Tuesday at the highest energies ever achieved. “It’s a great day to be a particle physicist,” said CERN Director General Rolf Heuer. “A lot of people have waited a long time for this moment, but their patience and dedication is starting to pay dividends.” Already, the instruments in the LHC have recorded thousands of events, and at this writing, the LHC has had more than an hour of stable and colliding beams.

This is an attempt to create mini-versions of the Big Bang that led to the birth of the universe 13.7 billion years ago, providing new insights into the nature and evolution of matter in the early Universe.