Live Discussion: How Good is the Science of “Interstellar?”

Kip Thorne’s concept for a black hole in 'Interstellar.' Image Credit: Paramount Pictures

The highly anticipated film “Interstellar” is grounded in science and theory: from wormholes, to the push and pull of gravity on a planet, to the way a black hole might readjust your concept of time. But just how much of the movie is really true to what we know about the Universe, and how much is creative license? There has also been some discussion about whether the physics used for the movie’s visual effects was good enough to produce actual science.

Today (Wednesday, November 26) at 19:00 UTC (3 pm EDT, 12:00 pm PDT), the Kavli Foundation hosts a live discussion with three astrophysicists who will answer viewers’ questions about black holes, relativity and gravity, and separate the movie’s science fact from its science fiction.

According to the Kavli Twitter feed, the Hangout will even help you understand what in the world happened at the end of the movie!

Scientists Mandeep Gill, Eric Miller and Hardip Sanghera will answer your questions in the live Google Hangout.

Submit questions ahead of and during the webcast by emailing [email protected] or by using the hashtag #KavliSciBlog on Twitter or Google+.

You can watch today’s hangout here:

Also, you can enjoy the “Interstellar” trailer:

New Simulation Offers Stunning Images of Black Hole Merger

A binary black hole system, viewed edge-on. This pair of extremely dense objects twists and warps spacetime as the two black holes spiral in toward one another. Image Credit: Bohn, Throwe, Hébert, Henriksson, Bunandar, Taylor, Scheel (see http://www.black-holes.org/lensing)

A black hole is an extraordinarily massive, improbably dense knot of spacetime that makes a living swallowing or slinging away any morsel of energy that strays too close to its dark, twisted core. Anyone fortunate (or unfortunate) enough to directly observe one of these beasts in the wild would immediately notice the way its colossal gravitational field warps all of the light from the stars and galaxies behind it, a phenomenon known as gravitational lensing.

Thanks to the power of supercomputers, a curious observer no longer has to venture into outer space to see such a sight. A team of astronomers has released their first simulated images of the lensing effects of not just one, but two black holes, trapped in orbit by each other’s gravity and ultimately doomed to merge as one.

Astronomers have been able to model the gravitational effects of a single black hole since the 1970s, but the imposing mathematics of general relativity made doing so for a double black-hole system a much larger challenge. Over the last ten years, however, scientists have improved the accuracy of computer models that deal with these types of calculations in an effort to match observations from gravitational wave detectors like LIGO and VIRGO.

The research collaboration Simulating Extreme Spacetimes (SXS) has begun using these models to mimic the lensing effects of high-gravity systems involving objects such as neutron stars and black holes. In their most recent paper, the team imagines a camera pointing at a binary black hole system against a backdrop of the stars and dust of the Milky Way. One way to figure out what the camera would see in this situation would be to use general relativity to compute the path of each photon traveling from every light source at all points within the frame. This method, however, involves a nearly impossible number of calculations.  So instead, the researchers worked backwards, mapping only those photons that would reach the camera and result in a bright spot on the final image – that is, photons that would not be swallowed by either of the black holes.
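To get a feel for the “working backwards” approach, here is a minimal, hypothetical sketch in Python – not the SXS code, which integrates full null geodesics through a numerically evolved binary-black-hole spacetime. It follows a ray backwards from each camera pixel, applies a crude weak-field deflection for two point-mass “black holes,” and leaves the pixel dark if the ray falls inside a photon-capture radius; the positions, radii and striped background are invented purely for illustration.

```python
import numpy as np

# Toy backward ray-tracer. For each camera pixel we follow the ray *backwards*
# toward the background sky instead of tracing every photon from every source.
# This is only a weak-field, point-lens caricature of the real calculation.

def render(width=400, height=300, fov=0.4,
           holes=((-0.05, 0.0, 0.010), (0.07, 0.0, 0.007))):
    """holes: (x, y, r_s) angular positions and Schwarzschild radii, in radians."""
    ys, xs = np.mgrid[0:height, 0:width]
    # Sky direction of the backward ray through each pixel.
    ax = (xs / width - 0.5) * fov
    ay = (ys / height - 0.5) * fov * height / width
    captured = np.zeros(ax.shape, dtype=bool)
    for hx, hy, rs in holes:
        b = np.hypot(ax - hx, ay - hy)          # angular impact parameter
        captured |= b < 2.6 * rs                # inside the photon-capture radius
        # Weak-field deflection alpha ~ 2*r_s/b: the backward ray bends toward
        # the hole, so the sampled background point slides closer to the lens.
        safe_b = np.maximum(b, 1e-9)
        alpha = 2.0 * rs / safe_b
        ax = ax - alpha * (ax - hx) / safe_b
        ay = ay - alpha * (ay - hy) / safe_b
    # Sample a simple striped "star field" with the deflected directions;
    # captured rays stay black because those photons never reach the camera.
    background = 0.5 + 0.5 * np.sin(60.0 * ax) * np.sin(60.0 * ay)
    return np.where(captured, 0.0, background)

if __name__ == "__main__":
    frame = render()
    print(frame.shape, frame.min(), frame.max())
```

The bookkeeping is the point: only rays that actually reach the camera are ever computed, which is what makes the backwards mapping tractable.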

The same binary black hole system, viewed from above. Image Credit: Bohn et al. (see http://arxiv.org/abs/1410.7775)

As you can see in the image above, the team’s simulations testify to the enormous effect that these black holes have on the fabric of spacetime. Ambient photons curl into a ring around the converging binaries in a process known as frame dragging. Background objects appear to multiply on opposite sides of the merger (for instance, the yellow and blue pair of stars in the “northeast” and the “southwest” areas of the ring). Light from behind the camera is even pulled into the frame by the black holes’ mammoth combined gravitational field. And each black hole distorts the appearance of the other, pinching off curved, comma-shaped regions of shadow called “eyebrows.” If you could zoom in with unlimited precision, you would find that there are, in fact, an infinite number of these eyebrows, each smaller than the last, like a cosmic set of Russian dolls.

In case you thought things couldn’t get any more amazing, SXS has also created two videos of the black hole merger: one simulated from above, and the other edge-on.

The SXS collaboration will continue to investigate gravitationally ponderous objects like black holes and neutron stars in an effort to better understand their astronomical and physical properties. Their work will also assist observational scientists as they search the skies for evidence of gravitational waves.

Check out the team’s arXiv paper describing this work and their website for even more fascinating images.

BICEP2 All Over Again? Researchers Place Higgs Boson Discovery in Doubt

The signature of one of the hundreds of trillions of particle collisions detected at the Large Hadron Collider. The combined analysis led to the discovery of the Higgs Boson; this article describes one team’s dissent from that result. (Photo Credit: CERN)

At the Large Hadron Collider (LHC) in Europe, faster is better: faster means more powerful particle collisions and a deeper look into the makeup of matter. Now, however, other researchers are saying not so fast. The LHC may not have discovered the Higgs Boson – the boson that imparts mass to everything, the “god particle” as some have called it. While the 2012 Higgs Boson discovery culminated in the awarding of the Nobel Prize to Peter Higgs and François Englert in December 2013, a team of researchers has raised doubts about the discovery in a paper published in the journal Physical Review D.

The discourse is similar to what unfolded over the last year with the claimed detection of light from the beginning of time, a signal that would have signified the Universe’s inflationary epoch. Researchers probing both the depths of the Universe and the inner workings of subatomic particles are searching for signals at the edge of detectability, just above the noise level and close to signals from other sources. For the BICEP2 telescope observations (see previous U.T. articles), it is pretty much back to the drawing board; the doubts raised about the Higgs Boson (see previous U.T. articles) are a serious challenge, but one that still needs more solid evidence. And in human affairs, if the Higgs Boson was not in fact detected by the LHC, what does one do with an already awarded Nobel Prize?

Cross-section of the Large Hadron Collider where its detectors are placed and collisions occur. LHC is as much as 175 meters (574 ft) below ground on the Franco-Swiss border near Geneva, Switzerland. The accelerator ring is 27 km (17 miles) in circumference. (Photo Credit: CERN)

The present challenge to the Higgs Boson is not new, and unlike the BICEP2 case it is not just a question of detector sensitivity. The Planck space telescope revealed that light radiated by dust interacting with the magnetic field of our Milky Way galaxy could explain the signal that BICEP2 researchers had proclaimed to be the primordial signature of inflation. The Higgs Boson, by contrast, is a prediction of theory proposed by Peter Higgs and several others beginning in the early 1960s: a particle arising from the gauge theory, developed by Higgs, Englert and others, that lies at the heart of the Standard Model.

The recent paper comes from a team of researchers from Denmark, Belgium and the United Kingdom led by Dr. Mads Toudal Frandsen. Their study, entitled “Technicolor Higgs boson in the light of LHC data,” discusses how the theory they favor predicts Technicolor quarks across a range of energies detectable at the LHC, and that one in particular lies within the uncertainty of the data point declared to be the Higgs Boson. There are several variants of Technicolor theory (TC), and the paper compares in detail the field theory behind the Standard Model Higgs and the TC Higgs (their version of the Higgs boson). Their conclusion is that Technicolor theory predicts a TC Higgs that is consistent with the expected physical properties, has a low mass, and sits at an energy level – 125 GeV – indistinguishable from the resonance now considered to be the Standard Model Higgs. Theirs, however, is a composite particle, and it does not impart mass to everything.

So you say – hold on! What is Technicolor in the jargon of particle physics? To answer this, you would want to talk to a one-time plumber from the South Bronx, New York – Dr. Leonard Susskind. Susskind first proposed Technicolor to describe the breaking of symmetry in the gauge theories that are part of the Standard Model. Susskind and other physicists of the 1970s considered it unsatisfactory that many arbitrary parameters were needed to complete the gauge theory used in the Standard Model (involving the Higgs scalar and Higgs field); those parameters in turn set the masses of the elementary particles and other properties. That the parameters were assigned rather than calculated was not acceptable to Susskind, ’t Hooft, Veltman and others. The solution involved the concept of Technicolor, which provided a “natural” means of describing the breakdown of symmetry in the gauge theories that make up the Standard Model.

Technicolor in particle physics shares one simple thing with the Technicolor process that dominated the early color film industry – both build something, colors in one case and particles in the other, out of composites.

Dr. Leonard Susskind, a leading developer of the theory of Technicolor (left), and Nobel Prize winner Dr. Peter Higgs, who proposed the existence of a particle that imparts mass to all matter – the Higgs Boson (right). (Photo Credit: Stanford University, CERN)

If the theory surrounding Technicolor is correct, then there should be many techni-quark and techni-Higgs particles to be found with the LHC or a more powerful next-generation accelerator – a veritable zoo of particles beyond just the Higgs Boson. The theory also implies that these ‘elementary’ particles are composites of smaller particles, and that another force of nature is needed to bind them. The new paper by Belyaev, Brown, Foadi and Frandsen claims that one specific techni-quark particle has a resonance (detection point) that lies within the uncertainty of the measurements for the Higgs Boson. In other words, the Higgs Boson might not be “the god particle” but rather a Technicolor quark particle composed of smaller, more fundamental particles bound by another force.

The paper by Belyaev, Brown, Foadi and Frandsen is a clear reminder that the Standard Model is unsettled and that even the discovery of the Higgs Boson is not 100% certain. Over the last year, more sensitive detectors have been integrated into CERN’s LHC; they may help rebut this challenge to the Higgs theory – the Higgs scalar and field, and the Higgs Boson – or they may reveal the signatures of Technicolor particles. Better detectors could resolve the difference between the energy level of the Technicolor quark and that of the Higgs Boson. LHC researchers have been quick to state that their work now moves beyond the discovery of the Higgs Boson – and that same work could yet disprove that what they found was the Higgs Boson.

Universe Today contacted co-investigator Dr. Alexander Belyaev and raised the question: will the recent upgrades to the CERN accelerator provide the precision needed to differentiate a techni-quark from the Higgs particle?

“There is no guarantee of course,” Dr. Belyaev responded to Universe Today, “but upgrade of LHC will definitely provide much better potential to discover other particles associated with theory of Technicolor, such as heavy Techni-mesons or Techni-baryons.”

Resolving the doubts and choosing the right additions to the Standard Model depends on better detectors, more observations and collisions at higher energies. The LHC is presently shut down in order to raise its collision energy from 8 TeV to 13 TeV. Among the observations at the LHC so far, supersymmetry has not fared well, while the observations – including the Higgs Boson discovery – have supported the Standard Model. The weakness of the Standard Model of particle physics is that it does not explain the gravitational force of nature, whereas supersymmetry can. The theory of Technicolor retains strong supporters, as this latest paper shows, and it leaves some doubt that the Higgs Boson was actually detected. Ultimately, another, more powerful next-generation particle accelerator may be needed.

In a previous Universe Today story, the question was raised – is the Standard Model a Rube Goldberg Device? Most theorists would say ‘no’ but it is unlikely to reach the status of the ‘theory of everything’ (Illustration Credit: R.Goldberg- the toothpaste dispenser, variant T.Reyes)

For Higgs and Englert, a reversal of the discovery would by no means be the ruination of a life’s work, nor would it mean the dismissal of a Nobel Prize; the theoretical work of both physicists has long been recognized by previous awards. The Standard Model, as at least a partial solution of the theory of everything, is like a jigsaw puzzle: it is being assembled piece by piece, though not without missteps. Furthermore, the pieces added to the Standard Model can behave like a house of cards, with one larger solution needing to be replaced wholesale by another. That could prove to be the case with the Higgs and Technicolor.

At times, like determined children, physicists thrust a piece into the unfolding puzzle that seems to fit but ultimately has to be pulled back out; the present discourse does not yet warrant such a retraction. Elegance and simplicity are the ultimate characteristics sought in theoretical solutions, and particle physicists use the term “naturalness” when describing these concerns about gauge theory parameters. The pieces of the puzzle contributed by Peter Higgs and François Englert have spearheaded and encouraged further work that will lead to a sounder Standard Model, but few if any claim that it will emerge as the theory of everything.

References:

Pre-print: “Technicolor Higgs boson in the light of LHC data,” Belyaev, Brown, Foadi & Frandsen

“An Introduction to Technicolor,” P. Sikivie, CERN, October 1980

“Technicolour,” Farhi & Susskind, March 1981

Two New Subatomic Particles Found

Today, CERN announced that the LHCb experiment had revealed the existence of two new baryon subatomic particles. Credit: CERN/LHC/GridPP

With its first runs of proton collisions in 2008-2013, the Large Hadron Collider has been providing a stream of experimental data that scientists rely on to test predictions arising from particle and high-energy physics. In fact, today CERN made public the first data produced by LHC experiments. And with each passing day, new information is released that helps shed light on some of the deeper mysteries of the universe.

This week, for example, CERN announced the discovery of two new subatomic particles that are part of the baryon family. The particles, known as the Xi_b’ and Xi_b*, were discovered thanks to the efforts of the LHCb experiment – an international collaboration involving roughly 750 scientists from around the world.

The existence of these particles was predicted by the quark model, but had never been seen before. What’s more, their discovery could help scientists to further confirm the Standard Model of particle physics, which is considered virtually unassailable now thanks to the discovery of the Higgs Boson.

Like the well-known protons that the LHC accelerates, the new particles are baryons made from three quarks bound together by the strong force. The types of quarks are different, though: the new Xi_b particles both contain one beauty (b), one strange (s), and one down (d) quark. Thanks to the heavyweight b quarks, they are more than six times as massive as the proton.
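As a quick sanity check on that “more than six times as massive” figure, here is a short Python snippet using the approximate masses reported for the two baryons (rounded values, used here only for illustration) alongside the proton mass:

```python
# Approximate masses in MeV/c^2; the baryon values are rounded figures from
# the LHCb measurement and are used here only for illustration.
PROTON = 938.27
XI_B_PRIME = 5935.0   # Xi_b'
XI_B_STAR = 5955.3    # Xi_b*

for name, mass in (("Xi_b'", XI_B_PRIME), ("Xi_b*", XI_B_STAR)):
    print(f"{name}: {mass / PROTON:.2f} proton masses")
# Both ratios come out around 6.3, consistent with the statement above.
```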

Cross-section of the Large Hadron Collider where its detectors are placed and collisions occur. Credit: CERN

However, their mass also depends on how they are configured. Each of the quarks has an attribute called “spin”; and in the Xi_b’ state, the spins of the two lighter quarks point in the opposite direction to the b quark, whereas in the Xi_b* state they are aligned. This difference makes the Xi_b* a little heavier.

“Nature was kind and gave us two particles for the price of one,” said Matthew Charles of the CNRS’s LPNHE laboratory at Paris VI University. “The Xi_b’ is very close in mass to the sum of its decay products: if it had been just a little lighter, we wouldn’t have seen it at all using the decay signature that we were looking for.”

“This is a very exciting result,” said Steven Blusk from Syracuse University in New York. “Thanks to LHCb’s excellent hadron identification, which is unique among the LHC experiments, we were able to separate a very clean and strong signal from the background. It demonstrates once again the sensitivity and how precise the LHCb detector is.”

Blusk and Charles jointly analyzed the data that led to this discovery. The existence of the two new baryons had been predicted in 2009 by Canadian particle physicists Randy Lewis of York University and Richard Woloshyn of TRIUMF, Canada’s national particle physics lab in Vancouver.

The bare masses of all 6 flavors of quarks, proton and electron, shown in proportional volume. Credit: Wikipedia/Incnis Mrsi

As well as the masses of these particles, the research team studied their relative production rates, their widths – a measure of how unstable they are – and other details of their decays. The results match up with predictions based on the theory of Quantum Chromodynamics (QCD).
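The link between a particle’s width and its instability comes from the uncertainty principle: the mean lifetime is roughly ħ divided by the width, so a broader resonance decays faster. A small sketch with purely illustrative widths (not the measured LHCb values):

```python
# Lifetime from resonance width via the uncertainty principle: tau = hbar / Gamma.
HBAR_MEV_S = 6.582e-22  # hbar in MeV*s

def lifetime(width_mev: float) -> float:
    """Mean lifetime in seconds for a resonance of the given width in MeV."""
    return HBAR_MEV_S / width_mev

# Illustrative numbers only, not the measured widths of the new baryons:
for gamma in (0.1, 1.0, 10.0):
    print(f"width {gamma:5.1f} MeV -> lifetime {lifetime(gamma):.1e} s")
```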

QCD is part of the Standard Model of particle physics, the theory that describes the fundamental particles of matter, how they interact, and the forces between them. Testing QCD at high precision is a key to refining our understanding of quark dynamics, models of which are tremendously difficult to calculate.

“If we want to find new physics beyond the Standard Model, we need first to have a sharp picture,” said LHCb’s physics coordinator Patrick Koppenburg from Nikhef Institute in Amsterdam. “Such high precision studies will help us to differentiate between Standard Model effects and anything new or unexpected in the future.”

The measurements were made with the data taken at the LHC during 2011-2012. The LHC is currently being prepared – after its first long shutdown – to operate at higher energies and with more intense beams. It is scheduled to restart by spring 2015.

The research was published online yesterday on the physics preprint server arXiv and has been submitted to the scientific journal Physical Review Letters.

Further Reading: CERN, LHCb

Higgs Boson Threatened The Early Universe, But Gravity Saved The Day

Image Credit: Science/AAAS

All the physical properties of our Universe – indeed, the fact that we even exist within a Universe we can contemplate and explore – trace back to events that occurred very early in its history. Cosmologists believe that our Universe looks the way it does thanks to a rapid period of inflation immediately before the Big Bang that smoothed out fluctuations in the vacuum energy of space and flattened the fabric of the cosmos itself.

According to current theories, however, interactions between the famed Higgs boson and the inflationary field should have caused the nascent Universe to collapse. Clearly, this didn’t happen. So what is going on? Scientists have worked out a new theory: It was gravity that (literally) held it all together.

The interaction between the curvature of spacetime (more commonly known as gravity) and the Higgs field has never been well understood. Resolving the apparent problem of our Universe’s stubborn existence, however, provides a good excuse to do some investigating. In a paper published this week in Physical Review Letters, researchers from the University of Copenhagen, the University of Helsinki, and Imperial College London show that even a small interaction between gravity and the Higgs would have been sufficient to stave off a collapse of the early cosmos.

The researchers modified the Higgs equations to include the effect of gravity generated by UV-scale energies. These corrections were found to stabilize the inflationary vacuum at all but a narrow range of energies, allowing expansion to continue and the Universe as we know it to exist… without the need for new physics beyond the Standard Model.

This new theory is based on the controversial evidence of inflation announced by BICEP2 earlier this summer, so its true applicability will depend on whether or not those results turn out to be real. Until then, the researchers are hoping to support their work with additional observational studies that seek out gravitational waves and more deeply examine the cosmic microwave background.

At this juncture, the Higgs-gravity interaction is not a testable hypothesis, because the graviton (the hypothetical particle thought to mediate gravity’s interactions) has yet to be detected. Based purely on the mathematics, however, the new theory presents an elegant and efficient solution to the potential conundrum of why we exist at all.

Better than Bieber, Rosetta’s Comet Sings Strange, Seductive Song

Magnetic field lines bound up in the sun’s wind pile up and drape around a comet’s nucleus to shape the blue ion tail. Notice the oppositely-directed fields on the comet’s backside. The top set points away from the comet; the bottom set toward. In strong wind gusts, the two can be squeezed together and reconnect, releasing energy that snaps off a comet’s tail. Credit: Tufts University


Tune in to the song of Comet Churyumov-Gerasimenko

Scientists can’t figure out exactly why yet, but Comet 67P/Churyumov-Gerasimenko has been singing since at least August. Listen to the video – what do you think? I hear a patter that sounds like frogs, purring and ping-pong balls. The song is being sung at a frequency of 40-50 millihertz, far below the 20 hertz – 20 kilohertz range of human hearing. Rosetta’s magnetometer experiment first clearly picked up the sounds in August, when the spacecraft drew to within 62 miles (100 km) of the comet. To make them audible, Rosetta scientists increased their pitch 10,000 times.
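The arithmetic behind making the song audible is simple enough to check for yourself: scaling a 40-50 millihertz signal up by the quoted factor of 10,000 lands it comfortably within the range of human hearing. A minimal sketch:

```python
# Shift the comet's magnetic-field oscillations into the audible range.
HUMAN_HEARING = (20.0, 20_000.0)   # Hz
COMET_BAND = (0.040, 0.050)        # 40-50 millihertz, as reported
SPEED_UP = 10_000                  # pitch increase quoted by the Rosetta team

low, high = (f * SPEED_UP for f in COMET_BAND)
print(f"shifted band: {low:.0f}-{high:.0f} Hz")                           # 400-500 Hz
print("audible:", HUMAN_HEARING[0] <= low and high <= HUMAN_HEARING[1])  # True
```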

The sounds are thought to be oscillations in the magnetic field around the comet. They were picked up by the Rosetta Plasma Consortium, a suite of five instruments on the spacecraft devoted to observing interactions between the solar plasma and the comet’s tenuous coma as well as the physical properties of the nucleus. A far cry from the stuff you donate at the local plasma center, plasma in physics is an ionized gas. Ionized means the atoms in the gas have lost or gained an electron through heating or collisions to become positively or negatively charged ions. Common forms of plasma include the electric glow of neon signs, lightning and of course the Sun itself.

Because the particles have lost their neutrality, electric and magnetic fields can now affect their motion within the plasma. Likewise, the moving electrified particles affect the very magnetic field controlling them.

Scientists think that neutral gas particles from vaporizing ice shot into the coma become ionized under the action of ultraviolet light from the Sun. While the exact mechanism that creates the curious oscillations is still unknown, it might have something to do with the electrified atoms or ions interacting with the magnetic fields bundled with the Sun’s everyday outpouring of plasma called the solar wind. It’s long been known that a comet’s electrified or ionized gases present an obstacle to the solar wind, causing it to drape around the nucleus and shape the streamlined blue-tinted ion or gas tail.

“This is exciting because it is completely new to us. We did not expect this, and we are still working to understand the physics of what is happening,” said Karl-Heinz Glassmeier, head of Space Physics and Space Sensorics at the Technical University of Braunschweig, Germany.

While 67P C-G’s song probably won’t make the Top 40, we might listen to it just as we would any other piece of music to learn what message is being communicated.

RAISE: How to Capture 1,500 Solar Images in a Five Minute Flight

RAISE in the cleanroom prior to launch. Credit: NASA/RAISE.

Quick: how do you aim an instrument at the Sun from a moving rocket on a fifteen minute suborbital flight?

The answer is very carefully, and NASA plans to do just that today, Thursday, November 6th as the Rapid Acquisition Imaging Spectrograph Experiment, also known as RAISE, takes to the skies over White Sands, New Mexico, to briefly study the Sun.

Capturing five images per second, RAISE is expected to gather over 1,500 images during five minutes of data collection near apogee.
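That figure is simply the cadence multiplied by the observing time, as the short check below shows (five frames per second over five minutes of data collection).

```python
# Frame count: cadence times observing time.
FRAMES_PER_SECOND = 5
OBSERVING_MINUTES = 5

total = FRAMES_PER_SECOND * OBSERVING_MINUTES * 60
print(total)   # -> 1500, matching the quoted figure
```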

Why use sub-orbital sounding rockets to do observations of the Sun? Don’t we already have an armada of space and ground-based instruments that stare at our nearest star around the clock? Well, it turns out that sounding rockets are still a cost-effective means of testing and demonstrating new technologies.

“Even on a five-minute flight, there are niche areas of science we can focus on well,” said solar scientist Don Hassler of the Southwest Research Institute in Boulder, Colorado in a recent press release. “There are areas of the Sun that need to be examined with the high-cadence observations that we can provide.”

Indeed, there’s a long history of studying the Sun by use of high-altitude sounding rockets, starting with the detection of solar X-rays by a detector placed in a captured V-2 rocket launched from White Sands in 1949.

Sub-orbital astronomy in 5 minutes: the flight of a sounding rocket. Credit: NASA.

RAISE will scrutinize an active region of the Sun turned Earthward during its brief flight to create what’s known as a spectrogram – an analysis of solar activity at differing wavelengths. This gives scientists a three-dimensional, layered snapshot of solar activity, as different wavelengths correspond to different velocities of solar material – think of looking at the layers of a cake. This, in turn, paints a picture of how material circulates and moves around the surface of the Sun.
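The velocity information in such a spectrogram comes from the Doppler effect: a spectral line emitted by moving plasma is shifted away from its rest wavelength, and the size of the shift maps directly to a line-of-sight speed. A minimal sketch, with hypothetical numbers chosen only for illustration:

```python
# Doppler shift -> line-of-sight velocity (non-relativistic approximation).
C_KM_S = 299_792.458  # speed of light in km/s

def line_of_sight_velocity(lambda_obs_nm: float, lambda_rest_nm: float) -> float:
    """Velocity in km/s; positive means the material is moving away from us."""
    return C_KM_S * (lambda_obs_nm - lambda_rest_nm) / lambda_rest_nm

# Hypothetical example: a 0.01 nm blueshift on a 30 nm extreme-ultraviolet line
# corresponds to plasma approaching at roughly 100 km/s.
print(f"{line_of_sight_velocity(29.99, 30.00):.0f} km/s")
```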

This will be RAISE’s second flight, and this week’s launch will sport a brand new diffraction grating coated with boron carbide to enhance wavelength analysis. RAISE will also look at the Sun in the extreme ultraviolet, which cannot penetrate the Earth’s lower atmosphere. Technology pioneered by missions such as RAISE may also make its way into space permanently on future missions, such as the planned joint European Space Agency and NASA Solar Orbiter mission, set for launch in 2017. Solar Orbiter will study the Sun up close and personal, journeying to only 26 million miles or 43 million kilometres from its surface, well inside the perihelion of the planet Mercury.

“This is the second time we have flown a RAISE payload, and we keep improving it along the way,” Hassler continued. “This is a technology that is maturing relatively quickly.”

As you can imagine, RAISE relies on clear weather for a window to launch. RAISE was scrubbed for launch on November 3rd, and the current window is set for 2:07 PM EST/19:07 Universal Time, which is 12:07 PM MST local time at White Sands. Unlike the suborbital launches from Wallops Island, the White Sands launches aren’t generally carried live, though US Highway 70 between Las Cruces and Alamogordo, which bisects White Sands, tends to be shut down just prior to launch.

Currently, the largest sunspot turned forward towards the Earth is active region 2205.

Another recent sounding rocket mission to observe the Sun, dubbed Hi-C, was highly successful during its short flight in 2013.

RAISE will fly on a Black Brant sounding rocket, which typically reaches an apogee of 180 miles or 300 kilometres.

A look at recent solar activity coming around the solar limb to be targeted by RAISE. Credit: NASA/SDO

Unfortunately, the massive sunspot region AR2192 is currently turned away from the Earth and will effectively be out of RAISE’s view. The largest in over a decade, the Jupiter-sized sunspot wowed viewers of the final solar eclipse of 2014 just last month. This large sunspot group will most likely survive its journey across the solar farside and reappear around the limb of the Sun sometime after November 9th – good news if RAISE is indeed scrubbed today due to weather.

And our current solar cycle has been a very schizophrenic one indeed. After a sputtering start, solar cycle #24 has been anemic at best, with the Sun struggling to come out of a profound minimum, the likes of which hasn’t been seen in over a century. And although October 2014 produced a Jupiter-sized sunspot that was easily seen with eclipse glasses, you wouldn’t know that we’ve passed a solar maximum from looking at the Sun now. In fact, there’s been talk among solar astronomers that solar cycle #25 may be even weaker, or absent altogether.

All this makes for fascinating times to study our sometimes strange star. RAISE observations will also be coordinated with views from the Solar Dynamics Observatory and the joint NASA-JAXA Hinode satellite in Earth orbit. We’ll also be at White Sands National Park today, hoping to get a brief view of RAISE as it briefly touches space.

It’s a great time for solar astronomy!

Unusual Distributions of Organics Found in Titan’s Atmosphere

The completed ALMA array standing on a high Chilean plateau at 5000 meters (16,400 ft) altitude. The first ALMA observations of Titan have added to the Saturn moon’s list of mysteries. (Credit: ALMA (ESO/NAOJ/NRAO) / L. Calçada (ESO))

A new mystery of Titan has been uncovered by astronomers using their latest asset in the high-altitude desert of Chile. Turning the now fully deployed Atacama Large Millimeter Array (ALMA) from comets to Titan, a single three-minute observation revealed organic molecules that are askew in Titan’s atmosphere: the molecules in question should be smoothly distributed across the atmosphere, but they are not.

The Cassini/Huygens spacecraft at the Saturn system has been revealing the oddities of Titan to us, with its lakes and rain clouds of methane, and an atmosphere thicker than Earth’s. But the new observations by ALMA of Titan underscore how much more can be learned about Titan and also how incredible the ALMA array is.

ALMA’s first observations of the atmosphere of Saturn’s moon Titan. The image shows the distribution of the organic molecule HNC, with red to white representing low to high concentrations. The offset locations of the molecules relative to the poles surprised the researchers, led by NASA/GSFC astrochemist M. Cordiner. (Credit: NRAO/AUI/NSF; M. Cordiner (NASA) et al.)

The ALMA astronomers called it a “brief three-minute snapshot of Titan.” They found zones of organic molecules offset from Titan’s polar regions. The molecules observed were hydrogen isocyanide (HNC) and cyanoacetylene (HC3N). The result came as a complete surprise to astrochemist Martin Cordiner of NASA’s Goddard Space Flight Center in Greenbelt, Maryland, the lead author of the work published in the latest release of Astrophysical Journal Letters.

The NASA Goddard press release states, “At the highest altitudes, the gas pockets appeared to be shifted away from the poles. These off-pole locations are unexpected because the fast-moving winds in Titan’s middle atmosphere move in an east–west direction, forming zones similar to Jupiter’s bands, though much less pronounced. Within each zone, the atmospheric gases should, for the most part, be thoroughly mixed.”

When one hears there is a strange, skewed combination of organic compounds somewhere, the first thing to come to mind is life. However, the astrochemists in this study are not concluding that they found a signature of life. There are, in fact, other explanations that involve simpler forces of nature. The Sun and Saturn’s magnetic field deliver light and energized particles to Titan’s atmosphere. This energy causes the formation of complex organics in the Titan atmosphere. But how these two molecules – HNC and HC3N – came to have a skewed distribution is, as the astrochemists said, “very intriguing.” Cordiner stated, “This is an unexpected and potentially groundbreaking discovery… a fascinating new problem.”

The press release from the National Radio Astronomy Observatory states that “studying this complex chemistry may provide insights into the properties of Earth’s very early atmosphere.” Additionally, the new observations add to our understanding of Titan – a second data point (after Earth) for understanding the organics of exo-planets, which may number in the hundreds of billions in our Milky Way galaxy beyond the Solar System. Astronomers need more data points in order to sift through the many exo-planets that will be observed and found to harbor organic compounds. With Titan and Earth, astronomers will have points of comparison for determining what is happening on distant exo-planets, whether it is life or not.

High in the atmosphere of Titan, large patches of two trace gases glow near the north pole, on the dusk side of the moon, and near the south pole, on the dawn side. Brighter colors indicate stronger signals from the two gases, HNC (left) and HC3N (right); red hues indicate less pronounced signals. (Image Credit: NRAO/AUI/NSF)

The report of this new and brief observation also underscores the value of the new astronomical asset in the high altitudes of Chile. ALMA represents the state of the art of millimeter and sub-millimeter astronomy, a field that holds a lot of promise. Back around 1980, at the Kitt Peak National Observatory in Arizona, alongside the great visible-light telescopes, there was an oddity: a millimeter-wavelength dish. That dish was the beginning of radio astronomy in the 1-10 millimeter wavelength range; millimeter astronomy is only about 35 years old. These wavelengths stand at the edge of the far infrared and include many emission and absorption lines from cold objects, which often include molecules and particularly organics. The ALMA array has 10 times more resolving power than the Hubble Space Telescope.
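That resolving-power claim can be sanity-checked with the diffraction limit: roughly 1.22λ/D for a single mirror and about λ/B for an interferometer with maximum baseline B. The sketch below assumes ALMA’s most extended configuration (a baseline of roughly 16 km) and one of its shorter operating wavelengths; these are assumptions chosen for illustration, so treat the result as an order-of-magnitude estimate rather than an official specification.

```python
import math

# Crude diffraction-limit comparison of Hubble and ALMA angular resolution.
RAD_TO_ARCSEC = 180.0 / math.pi * 3600.0

hubble = 1.22 * 500e-9 / 2.4 * RAD_TO_ARCSEC   # 500 nm light, 2.4 m mirror
alma = 0.3e-3 / 16_000.0 * RAD_TO_ARCSEC       # 0.3 mm waves, ~16 km baseline

print(f"Hubble ~{hubble:.3f} arcsec, ALMA ~{alma:.4f} arcsec, "
      f"ratio ~{hubble / alma:.0f}x")
```

With these assumptions the ratio comes out at roughly an order of magnitude in ALMA’s favor, in line with the claim.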

The Earth’s atmosphere stands in the way of observing the Universe at these wavelengths. It is no coincidence that our eyes evolved to see in the visible part of the spectrum – a very narrow band – and it means that there is a great, wide world of light waves to explore with detectors other than our eyes.

The diagram shows the electromagnetic spectrum, the absorption of light by the Earth’s atmosphere, and illustrates the astronomical assets that focus on specific wavelengths of light. ALMA at the Chilean site, with modern solid state electronics, is able to overcome the limitations placed by the Earth’s atmosphere. (Credit: Wikimedia, T.Reyes)

In the millimeter range of wavelengths, water, oxygen, and nitrogen are strong absorbers; some wavelengths are absorbed completely, leaving only certain windows open. ALMA is designed to look at those wavelengths that are accessible from the ground. The Chajnantor plateau in the Atacama Desert, at 5000 meters (16,400 ft), provides the driest, clearest location in the world for millimeter astronomy outside of the high-altitude regions of Antarctica.

At high altitude and over this particular desert, there is very little atmospheric water. ALMA consists of 66 dishes, 12 meters (39 ft) and 7 meters (23 ft) in diameter. However, it wasn’t just finding a good location that made ALMA; the 35-year history of millimeter-wavelength astronomy has been a catch-up game. Detecting these wavelengths requires very sensitive detectors with low-noise electronics. The steady improvement in solid-state electronics from the late 1970s to today, and the development of cryostats to maintain low temperatures, have made the new observations of Titan possible. These are observations that Cassini, at 1,000 kilometers from Titan, could not make, but that ALMA, 1.25 billion kilometers (775 million miles) away, could.

The 130 ton German Antenna Dish Transporter, nicknamed Otto. The ALMA transporter vehicle carefully carries the state-of-the-art antenna, with a diameter of 12 metres and a weight of about 100 tons, on the 28 km journey to the Array Operations Site, which is at an altitude of 5000 m. The antenna is designed to withstand the harsh conditions at the high site, where the extremely dry and rarefied air is ideal for ALMA’s observations of the universe at millimetre- and sub-millimetre-wavelengths. (Credit: ESO)

The ALMA telescope array was developed by a consortium of countries led by the United States’ National Science Foundation (NSF) and countries of the European Union through ESO (the European Organisation for Astronomical Research in the Southern Hemisphere). The first concepts were proposed in 1999, and Japan joined the consortium in 2001.

The prototype ALMA telescope was tested at the site of the VLA in New Mexico in 2003. That prototype now stands on Kitt Peak, having replaced the original millimeter-wavelength dish that started this branch of astronomy in the 1980s. The first dishes arrived in 2007, followed the next year by the huge transporters for moving each dish into place at such high altitude. The German-made transporter required a cabin with an oxygen supply so that the drivers could work in the rarefied air at 5000 meters; it was featured on an episode of the program Monster Moves. By 2011, test observations were taking place, and by 2013 the first science program was underway. This year, the full array was in place, and the second science program spawned the Titan observations. Many will follow. ALMA, which can operate 24 hours per day, will remain the most powerful instrument in its class for about 10 years, when another array in Africa is expected to come online.

References:

NASA Goddard Press Release

NRAO Press Release

ALMA Observatory Website

“ALMA Measurements of the HNC and HC3N Distributions in Titan’s Atmosphere,” M. A. Cordiner, et al., Astrophysical Journal Letters

The Physics Behind “Interstellar’s” Visual Effects Was So Good, it Led to a Scientific Discovery

Kip Thorne’s concept for a black hole in 'Interstellar.' Image Credit: Paramount Pictures

While he was working on the film Interstellar, executive producer Kip Thorne was tasked with creating the black hole that would be central to the plot. As a theoretical physicist, he also wanted to create something that was truly realistic and as close to the real thing as movie-goers would ever see.

On the other hand, Christopher Nolan – the film’s director – wanted to create something that would be a visually-mesmerizing experience. As you can see from the image above, they certainly succeeded as far as the aesthetics were concerned. But even more impressive was how the creation of this fictitious black hole led to an actual scientific discovery.


Hawking Radiation Replicated in a Laboratory?

In honor of Dr. Stephen Hawking, the COSMOS center will be creating the most detailed 3D mapping effort of the Universe to date. Credit: BBC, Illus.: T.Reyes

Dr. Stephen Hawking delivered a disturbing theory in 1974 that claimed black holes evaporate. He said black holes are not absolutely black and cold but rather radiate energy and do not last forever. So-called “Hawking radiation” became one of the physicist’s most famous theoretical predictions. Now, 40 years later, a researcher has announced the creation of a simulation of Hawking radiation in a laboratory setting.

The possibility of a black hole arises from Einstein’s theory of General Relativity. In 1916, Karl Schwarzschild was the first to realize the possibility of a gravitational singularity surrounded by a boundary from within which entering light or matter cannot escape.

This month, Jeff Steinhauer of the Technion – Israel Institute of Technology describes, in his paper “Observation of self-amplifying Hawking radiation in an analogue black-hole laser” in the journal Nature Physics, how he created an analogue event horizon using a substance cooled to near absolute zero and, using lasers, was able to detect the emission of analogue Hawking radiation. Could this be the first valid evidence of the existence of Hawking radiation, and could it consequently seal the fate of all black holes?

This is not the first attempt at creating a Hawking radiation analogue in a laboratory. In 2010, an analogue was created from a block of glass, a laser, mirrors and a chilled detector (Phys. Rev. Letters, Sept. 2010); no smoke accompanied the mirrors. An ultra-short pulse of intense laser light passing through the glass induced a refractive index perturbation (RIP), which functioned as an event horizon. Light was seen being emitted from the RIP. Nevertheless, the results by F. Belgiorno et al. remain controversial, and more experiments were still warranted.

The latest attempt at replicating Hawking radiation, by Steinhauer, takes a more high-tech approach. He created a Bose-Einstein condensate, an exotic state of matter held very near absolute zero, and boundaries created within the condensate functioned as an event horizon. However, before going into further details, let us take a step back and consider what Steinhauer and others are trying to replicate.

Artists’ illustrations of black holes are guided by descriptions given to them by theorists. There are many illustrations. A black hole has never been seen up close. However, to have Hawking radiation, all the theatrics of accretion disks and matter being funneled off a companion star are unnecessary. Just a black hole in the darkness of space will do. (Illustration: public domain)

The recipe for making Hawking radiation begins with a black hole – any size will do. Hawking’s theory states that smaller black holes radiate more rapidly than larger ones and, in the absence of matter falling into them (accretion), will “evaporate” much faster. Giant black holes can take longer than a million times the present age of the Universe to evaporate by way of Hawking radiation. Like a tire with an extremely slow leak, most black holes would easily get you to the nearest repair station.
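Hawking’s formulas make the “smaller evaporates faster” point concrete: the radiation temperature scales inversely with mass, while the evaporation time grows as the cube of the mass. The sketch below plugs standard constants into the textbook order-of-magnitude expressions; the 10^12 kg case is an arbitrary “small black hole” chosen purely for illustration.

```python
import math

# Hawking temperature and rough evaporation time for a non-accreting black hole.
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34    # reduced Planck constant, J s
C = 2.998e8         # speed of light, m/s
K_B = 1.381e-23     # Boltzmann constant, J/K
M_SUN = 1.989e30    # solar mass, kg
YEAR = 3.156e7      # seconds per year

def hawking_temperature(mass_kg: float) -> float:
    """T = hbar * c^3 / (8 * pi * G * M * k_B), in kelvin."""
    return HBAR * C**3 / (8.0 * math.pi * G * mass_kg * K_B)

def evaporation_time(mass_kg: float) -> float:
    """Order-of-magnitude estimate, t ~ 5120 * pi * G^2 * M^3 / (hbar * c^4)."""
    return 5120.0 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)

for m in (M_SUN, 1e12):   # a solar-mass hole and a small, hypothetical one
    print(f"M = {m:.2e} kg: T = {hawking_temperature(m):.2e} K, "
          f"t_evap = {evaporation_time(m) / YEAR:.2e} yr")
```

A solar-mass black hole comes out around 10^-7 kelvin and roughly 10^67 years, which is why the radiation is hopeless to spot directly and why the slow leak above is so very slow.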

So you have a black hole. It has an event horizon, also known as the Schwarzschild radius; light or matter checking into the event horizon can never check out. Or so was the accepted understanding until Dr. Hawking’s theory upended it. Outside the event horizon is ordinary space with some caveats – consider it ordinary space with some spices added. At the event horizon, the black hole’s gravity is so extreme that it induces and magnifies quantum effects.

All of space – within us and surrounding us to the ends of the Universe – includes a quantum vacuum. Everywhere in that vacuum, virtual particle pairs appear and disappear, annihilating each other on extremely short time scales. Under the extreme conditions at the event horizon, virtual particle–antiparticle pairs, such as an electron and a positron, materialize; pairs that appear close enough to the horizon can have one of the two particles swallowed by the black hole’s gravity, leaving the other free to add to the radiation emanating from around the black hole – radiation that, as a whole, astronomers can use to detect the presence of a black hole they cannot directly observe. It is this unpairing of virtual particles at the event horizon that produces Hawking radiation, which represents a net loss of mass from the black hole.

So why don’t astronomers just search space for Hawking radiation? The problem is that the radiation is very weak and is overwhelmed by the radiation produced by the many other physical processes surrounding a black hole with an accretion disk; it is drowned out by that chorus of energetic processes. So the most immediate possibility is to replicate Hawking radiation using an analogue. While Hawking radiation is weak in comparison to the mass and energy of a black hole, the radiation has essentially all the time in the Universe to chip away at its parent body.

This is where a growing understanding of black holes converged to produce Dr. Hawking’s seminal work. Theorists including Hawking realized that, despite the quantum and gravitational theory needed to describe them, black holes also behave like black bodies: they are governed by thermodynamics and are slaves to entropy. The production of Hawking radiation can be characterized as a thermodynamic process, and this is what leads us back to the experimentalists – other thermodynamic systems can be used to replicate the emission of this type of radiation.

Using a Bose-Einstein condensate in a vessel, Steinhauer directed laser beams into the delicate condensate to create an event horizon. Furthermore, his experiment creates sound waves that become trapped between two boundaries that define the event horizon. Steinhauer found that the sound waves at his analogue event horizon were amplified, as happens to light in a common laser cavity and as predicted by Dr. Hawking’s theory of black holes. Light escapes from the laser present at the analogue event horizon, and Steinhauer explains that this escaping light represents the long-sought Hawking radiation.

Publication of this work in Nature Physics required considerable peer review, but that alone does not validate the findings. Steinhauer’s work will now face even greater scrutiny, and others will attempt to duplicate it. His lab setup is an analogue, and it remains to be verified that what he observed truly represents Hawking radiation.

References:

“Observation of self-amplifying Hawking radiation in an analogue black-hole laser,” J. Steinhauer, Nature Physics, 12 October 2014

“Hawking Radiation from Ultrashort Laser Pulse Filaments,” F. Belgiorno, et al., Physical Review Letters, Sept. 2010

“Black hole explosions?,” S. W. Hawking, Nature, 01 March 1974

“The Quantum Mechanics of Black Holes,” S. W. Hawking, Scientific American, January 1977