Unusual Distributions of Organics Found in Titan’s Atmosphere

The ALMA array, now complete, standing on a Chilean high plateau at 5000 meters (16,400 ft) altitude. ALMA's first observations of Titan have added to the Saturn moon's list of mysteries. {Credit: ALMA (ESO/NAOJ/NRAO) / L. Calçada (ESO)}

A new mystery of Titan has been uncovered by astronomers using their latest asset in the high-altitude desert of Chile. Using the now fully deployed Atacama Large Millimeter Array (ALMA), astronomers turned from observing comets to Titan. A single three-minute observation revealed organic molecules that are unevenly distributed in Titan's atmosphere. The molecules in question should be smoothly mixed through the atmosphere, but they are not.

The Cassini/Huygens spacecraft at the Saturn system has been revealing the oddities of Titan to us, with its lakes and rain clouds of methane, and an atmosphere thicker than Earth’s. But the new observations by ALMA of Titan underscore how much more can be learned about Titan and also how incredible the ALMA array is.

ALMA’s first observations of the atmosphere of Saturn’s moon Titan. The image shows the distribution of the organic molecule HNC, with red to white representing low to high concentrations. The offset locations of the molecules relative to the poles surprised the researchers, led by NASA/GSFC astrochemist M. Cordiner. (Credit: NRAO/AUI/NSF; M. Cordiner (NASA) et al.)

The ALMA astronomers called it a “brief 3 minute snapshot of Titan.” They found zones of organic molecules offset from Titan's polar regions. The molecules observed were hydrogen isocyanide (HNC) and cyanoacetylene (HC3N). The finding came as a complete surprise to astrochemist Martin Cordiner of NASA's Goddard Space Flight Center in Greenbelt, Maryland, the lead author of the work, published in the latest release of Astrophysical Journal Letters.

The NASA Goddard press release states, “At the highest altitudes, the gas pockets appeared to be shifted away from the poles. These off-pole locations are unexpected because the fast-moving winds in Titan’s middle atmosphere move in an east–west direction, forming zones similar to Jupiter’s bands, though much less pronounced. Within each zone, the atmospheric gases should, for the most part, be thoroughly mixed.”

When one hears there is a strange, skewed combination of organic compounds somewhere, the first thing that comes to mind is life. However, the astrochemists in this study are not concluding that they found a signature of life. There are, in fact, other explanations that involve simpler forces of nature. The Sun and Saturn's magnetic field deliver light and energized particles to Titan's atmosphere. This energy drives the formation of complex organics in the Titan atmosphere. But how these two molecules – HNC and HC3N – came to have a skewed distribution is, as the astrochemists said, “very intriguing.” Cordiner stated, “This is an unexpected and potentially groundbreaking discovery… a fascinating new problem.”

The press release from the National Radio Astronomy Observatory states, “studying this complex chemistry may provide insights into the properties of Earth’s very early atmosphere.” The new observations also add to our understanding of Titan – a second data point (after Earth) for understanding the organic chemistry of exoplanets, which may number in the hundreds of billions in our Milky Way galaxy. Astronomers need such reference points to sift through the many exoplanets that will be observed and found to harbor organic compounds. With Titan and Earth, astronomers will have points of comparison for determining what is happening on distant exoplanets, whether it involves life or not.

High in the atmosphere of Titan, large patches of two trace gases glow near the north pole, on the dusk side of the moon, and near the south pole, on the dawn side. Brighter colors indicate stronger signals from the two gases, HNC (left) and HC3N (right); red hues indicate less pronounced signals.
(Image Credit: NRAO/AUI/NSF)

The report of this brief observation also underscores the power of the new astronomical asset in the high desert of Chile. ALMA represents the state of the art in millimeter and sub-millimeter astronomy, a field that holds a lot of promise. Back around 1980, at the Kitt Peak National Observatory in Arizona, alongside the great visible-light telescopes, there stood an oddity: a millimeter-wavelength dish. That dish was the beginning of radio astronomy in the 1 – 10 millimeter wavelength range; millimeter astronomy is only about 35 years old. These wavelengths stand at the edge of the far infrared and capture many emission and absorption lines from cold objects, which often include molecules and particularly organics. The ALMA array has 10 times the resolving power of the Hubble Space Telescope.

The Earth's atmosphere stands in the way of observing the Universe at these wavelengths. It is no coincidence that our eyes evolved to see in the visible-light spectrum. It is a very narrow band, which means there is a great, wide world of light waves to explore with detectors other than our eyes.

The diagram shows the electromagnetic spectrum, the absorption of light by the Earth’s atmosphere, and illustrates the astronomical assets that focus on specific wavelengths of light. ALMA at the Chilean site, with modern solid state electronics, is able to overcome the limitations placed by the Earth’s atmosphere. (Credit: Wikimedia, T.Reyes)

In the millimeter range of wavelengths, water, oxygen, and nitrogen are strong absorbers; some wavelengths are absorbed completely, leaving only windows in this range. ALMA is designed to look at those wavelengths that remain accessible from the ground. The Chajnantor plateau in the Atacama desert, at 5000 meters (16,400 ft), provides the driest, clearest location in the world for millimeter astronomy outside of the high-altitude regions of the Antarctic.

At high altitude and over this particular desert, there is very little atmospheric water. ALMA consists of 66 dishes, 12 meters (39 ft) and 7 meters (23 ft) in diameter. However, it wasn't just finding a good location that made ALMA. The 35-year history of millimeter-wavelength astronomy has been a catch-up game. Detecting these wavelengths required very sensitive, low-noise electronics. The steady improvement in solid-state electronics from the late 1970s to today, and the development of cryostats to maintain low temperatures, have made the new observations of Titan possible. These are observations that Cassini, at 1000 kilometers from Titan, could not make, but that ALMA, at 1.25 billion kilometers (775 million miles) away, could.

The 130 ton German Antenna Dish Transporter, nicknamed Otto. The ALMA transporter vehicle carefully carries the state-of-the-art antenna, with a diameter of 12 metres and a weight of about 100 tons, on the 28 km journey to the Array Operations Site, which is at an altitude of 5000 m. The antenna is designed to withstand the harsh conditions at the high site, where the extremely dry and rarefied air is ideal for ALMA’s observations of the universe at millimetre- and sub-millimetre-wavelengths. (Credit: ESO)

The ALMA telescope array was developed by a consortium of countries led by the United States' National Science Foundation (NSF) and countries of the European Union through ESO (European Organisation for Astronomical Research in the Southern Hemisphere). The first concepts were proposed in 1999. Japan joined the consortium in 2001.

The prototype ALMA telescope was tested at the site of the VLA in New Mexico in 2003. That prototype now stands on Kitt Peak, having replaced the original millimeter-wavelength dish that started this branch of astronomy. The first dishes arrived in 2007, followed the next year by the huge transporters for moving each dish into place at such high altitude. The German-made transporter required a cabin with an oxygen supply so that the drivers could work in the rarefied air at 5000 meters; it was featured on an episode of the program Monster Moves. By 2011, test observations were taking place, and by 2013 the first science program was underway. This year, the full array was in place, and the second science program produced the Titan observations. Many more will follow. ALMA, which can operate 24 hours per day, will remain the most powerful instrument in its class for about 10 years, until another array in Africa comes on line.

References:

NASA Goddard Press Release

NRAO Press Release

ALMA Observatory Website

“ALMA Measurements of the HNC and HC3N Distributions in Titan’s Atmosphere”, M. A. Cordiner, et al., Astrophysical Journal Letters

The Physics Behind “Interstellar’s” Visual Effects Was So Good, it Led to a Scientific Discovery

Kip Thorne’s concept for a black hole in 'Interstellar.' Image Credit: Paramount Pictures

While he was working on the film Interstellar, executive producer Kip Thorne was tasked with creating the black hole that would be central to the plot. As a theoretical physicist, he wanted to create something truly realistic – as close to the real thing as movie-goers would ever see.

On the other hand, Christopher Nolan – the film’s director – wanted to create something that would be a visually-mesmerizing experience. As you can see from the image above, they certainly succeeded as far as the aesthetics were concerned. But even more impressive was how the creation of this fictitious black hole led to an actual scientific discovery.

Continue reading “The Physics Behind “Interstellar’s” Visual Effects Was So Good, it Led to a Scientific Discovery”

Hawking Radiation Replicated in a Laboratory?

In honor of Dr. Stephen Hawking, the COSMOS center will be creating the most detailed 3D mapping effort of the Universe to date. Credit: BBC, Illus.: T.Reyes

Dr. Stephen Hawking delivered a disturbing theory in 1974 that claimed black holes evaporate. He said black holes are not absolutely black and cold but rather radiate energy and do not last forever. So-called “Hawking radiation” became one of the physicist’s most famous theoretical predictions. Now, 40 years later, a researcher has announced the creation of a simulation of Hawking radiation in a laboratory setting.

The possibility of black holes emerged from Einstein's theory of General Relativity. In 1916, Karl Schwarzschild was the first to realize the possibility of a gravitational singularity surrounded by a boundary at which light or matter entering cannot escape.

This month, Jeff Steinhauer of the Technion – Israel Institute of Technology describes in his paper “Observation of self-amplifying Hawking radiation in an analogue black-hole laser”, in the journal Nature Physics, how he created an analogue event horizon using a substance cooled to near absolute zero and, using lasers, was able to detect the emission of Hawking radiation. Could this be the first valid evidence of the existence of Hawking radiation and, consequently, seal the fate of all black holes?

This is not the first attempt at creating a Hawking radiation analogue in a laboratory. In 2010, an analogue was created from a block of glass, a laser, mirrors, and a chilled detector (Phys. Rev. Lett., Sept 2010); no smoke accompanied the mirrors. An ultra-short pulse of intense laser light passing through the glass induced a refractive index perturbation (RIP), which functioned as an event horizon. Light was seen emitting from the RIP. Nevertheless, the results by F. Belgiorno et al. remain controversial, and more experiments were warranted.

The latest attempt at replicating Hawking radiation, by Steinhauer, takes a more high-tech approach. He creates a Bose–Einstein condensate, an exotic state of matter at temperatures very near absolute zero. Boundaries created within the condensate function as an event horizon. However, before going into further details, let us take a step back and consider what Steinhauer and others are trying to replicate.

Artists’ illustrations of black holes are guided by descriptions given to them by theorists. There are many illustrations. A black hole has never been seen up close. However, to have Hawking radiation, all the theatrics of accretion disks and matter being funneled off a companion star are unnecessary. Just a black hole in the darkness of space will do. (Illustration: public domain)

The recipe for making Hawking radiation begins with a black hole. Any size black hole will do. Hawking's theory states that smaller black holes radiate more rapidly than larger ones and, in the absence of matter falling into them (accretion), will “evaporate” much faster. Giant black holes can take longer than a million times the present age of the Universe to evaporate by way of Hawking radiation. Like a tire with a slow leak, most black holes would get you to the nearest repair station.
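For illustration, the standard evaporation-time estimate that follows from Hawking's theory, t ≈ 5120πG²M³/(ħc⁴), can be evaluated directly. This is a rough sketch (SI constants, ignoring accretion and the cosmic microwave background) showing how strongly the lifetime scales with mass:

```python
import math

# Physical constants (SI units)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34   # reduced Planck constant, J s
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
YEAR = 3.156e7     # seconds per year

def evaporation_time_years(mass_kg):
    """Hawking evaporation time t = 5120*pi*G^2*M^3 / (hbar*c^4), in years."""
    t_seconds = 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)
    return t_seconds / YEAR

# A stellar-mass black hole: ~1e67 years, vastly longer than the
# ~1.4e10-year age of the Universe.
print(f"1 solar mass:     {evaporation_time_years(M_SUN):.1e} years")

# The lifetime scales with the cube of the mass, so a billion-solar-mass
# giant takes (1e9)^3 = 1e27 times longer still.
print(f"1e9 solar masses: {evaporation_time_years(1e9 * M_SUN):.1e} years")
```

The cubic scaling is the point: halving the mass cuts the lifetime by a factor of eight, which is why only very small black holes could evaporate within the age of the Universe.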

So you have a black hole. It has an event horizon, whose size is also known as the Schwarzschild radius; light or matter checking into the event horizon can never check out. Or so went the accepted understanding until Dr. Hawking's theory upended it. Outside the event horizon is ordinary space, with some caveats; consider it space with some spices added. At the event horizon, the gravity of the black hole is so extreme that it induces and magnifies quantum effects.

All of space – within us and surrounding us to the ends of the Universe – includes a quantum vacuum. Everywhere in that vacuum, virtual particle pairs appear and disappear, annihilating each other on extremely short time scales. Under the extreme conditions at the event horizon, virtual particle–antiparticle pairs, such as an electron and a positron, materialize. For pairs that appear close enough to the event horizon, one partner can be captured by the black hole's gravity, leaving the other free to join the radiation emanating from around the black hole – the radiation that, as a whole, astronomers can use to detect the presence of a black hole without directly observing it. It is this unpairing of virtual particles at the event horizon that produces Hawking radiation, which represents a net loss of mass from the black hole.

So why don't astronomers just search space for Hawking radiation? The problem is that the radiation is very weak and is overwhelmed by the radiation produced by the many other physical processes surrounding a black hole with an accretion disk; it is drowned out by a chorus of energetic processes. So the most immediate possibility is to replicate Hawking radiation with an analogue. While Hawking radiation is weak in comparison to the mass and energy of a black hole, it has essentially all the time in the Universe to chip away at its parent body.

This is where the growing understanding of black holes converged on Dr. Hawking's seminal work. Theorists including Hawking realized that despite the quantum and gravitational theory needed to describe a black hole, black holes also behave like black bodies. They are governed by thermodynamics and are slaves to entropy. The production of Hawking radiation can be characterized as a thermodynamic process, and this is what leads us back to the experimentalists: other thermodynamic processes could be used to replicate the emission of this type of radiation.

Using a Bose–Einstein condensate in a vessel, Steinhauer directed laser beams into the delicate condensate to create an event horizon. His experiment creates sound waves that become trapped between two boundaries that define the event horizon. Steinhauer found that the sound waves at his analogue event horizon were amplified, as happens to light in a common laser cavity, but also as predicted by Dr. Hawking's theory of black holes. Light escapes from the laser present at the analogue event horizon, and Steinhauer explains that this escaping light represents the long-sought Hawking radiation.

The work underwent considerable peer review to be accepted for publication, but that alone does not validate the findings. Steinhauer's work will now face even greater scrutiny, and others will attempt to duplicate it. His lab setup is an analogue, and it remains to be verified that what he observed truly represents Hawking radiation.

References:

“Observation of self-amplifying Hawking radiation in an analogue black-hole laser”, J. Steinhauer, Nature Physics, 12 October 2014

“Hawking Radiation from Ultrashort Laser Pulse Filaments”, F. Belgiorno, et al., Phys. Rev. Lett., Sept 2010

“Black hole explosions?”, S. W. Hawking, Nature, 01 March 1974

“The Quantum Mechanics of Black Holes”, S. W. Hawking, Scientific American, January 1977

Old Equations Shed New Light on Quasars

An artist’s illustration of the early Universe. Image Credit: NASA

There's nothing more out of this world than quasi-stellar objects, or, more simply, quasars. These are the most powerful and among the most distant objects in the Universe. At their center is a black hole with the mass of a million or more Suns. And these powerhouses are fairly compact – about the size of our Solar System. Understanding how they came to be and how — or if — they evolve into the galaxies that surround us today are some of the big questions driving astronomers.

Now, a new paper by Yue Shen and Luis C. Ho – “The diversity of quasars unified by accretion and orientation”, in the journal Nature – confirms the importance of a mathematical derivation by the famous astrophysicist Sir Arthur Eddington, from the first half of the 20th century, for understanding not just stars but the properties of quasars, too. Ironically, Eddington did not believe black holes existed, but now his derivation, the Eddington luminosity, can be used more reliably to determine important properties of quasars across vast stretches of space and time.

A quasar is recognized as an accreting (meaning matter falling onto it) supermassive black hole at the center of an “active galaxy”. Most known quasars exist at distances that place them very early in the Universe; the most distant known lies at a light-travel distance of about 12.9 billion light-years, a mere 770 million years after the Big Bang. Somehow, quasars and the nascent galaxies surrounding them evolved into the galaxies present in the Universe today. At their extreme distances they are point-like, indistinguishable from a star except that the spectra of their light differ greatly from a star's. Some would be as bright as our Sun if they were placed 33 light-years away, meaning that they are over a trillion times more luminous than our star.
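The "over a trillion" figure follows from the inverse-square law: to appear as bright as the Sun from 33 light-years instead of 1 AU, an object must be more luminous by the square of the distance ratio. A quick check:

```python
AU_PER_LY = 63_241  # astronomical units in one light-year

# The Sun appears as it does from 1 AU; the quasar matches that
# apparent brightness from 33 light-years away.
distance_ratio = 33 * AU_PER_LY        # 33 ly expressed in AU (~2.1e6)

# Apparent brightness falls off as 1/d^2, so equal apparent brightness
# requires luminosity proportional to d^2.
luminosity_ratio = distance_ratio ** 2

print(f"{luminosity_ratio:.1e} times the Sun's luminosity")  # ~4.4e12
```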

An artist’s illustration of the central engine of a quasar. These “Quasi-stellar Objects” (QSOs) are now recognized as the supermassive black holes at the centers of emerging galaxies in the early Universe. (Photo Credit: NASA)

The Eddington luminosity defines the maximum luminosity that a star in equilibrium – specifically, hydrostatic equilibrium – can exhibit. Extremely massive stars and accreting black holes can exceed this limit, but stars, to remain stable for long periods, must be in hydrostatic equilibrium between the inward pull of gravity and the outward push of radiation pressure. Such is the case for our star, the Sun; otherwise it would collapse or expand, and in either case it would not have provided the stable source of light that has nourished life on Earth for billions of years.
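For a concrete sense of scale, the classical Eddington luminosity for ionized hydrogen has the closed form L_Edd = 4πGM·m_p·c/σ_T, where m_p is the proton mass and σ_T the Thomson scattering cross-section. A sketch evaluating it for one solar mass:

```python
import math

# Constants in SI units
G = 6.674e-11        # gravitational constant
M_P = 1.673e-27      # proton mass, kg
C = 2.998e8          # speed of light, m/s
SIGMA_T = 6.652e-29  # Thomson cross-section, m^2
M_SUN = 1.989e30     # solar mass, kg
L_SUN = 3.828e26     # solar luminosity, W

def eddington_luminosity(mass_kg):
    """Maximum luminosity (W) at which radiation pressure on ionized
    hydrogen balances gravity: L_Edd = 4*pi*G*M*m_p*c / sigma_T."""
    return 4 * math.pi * G * mass_kg * M_P * C / SIGMA_T

l_edd = eddington_luminosity(M_SUN)
# ~1.26e31 W, roughly 33,000 solar luminosities for one solar mass.
# The limit scales linearly with mass, so a 1e8 solar-mass quasar
# engine can shine at up to ~3e12 L_sun.
print(f"{l_edd:.2e} W  (~{l_edd / L_SUN:.0f} L_sun)")
```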

Scientific models often start simple, such as Bohr's model of the hydrogen atom; later observations then reveal intricacies that require more complex theory to explain, such as quantum mechanics for the atom. The Eddington luminosity and ratio can be compared to knowing the thermal efficiency and compression ratio of an internal combustion engine: once such values are known, other properties follow.

Several other factors regarding the Eddington luminosity are now known, and these are needed to define the “modified Eddington luminosity” used today.

The new paper in Nature shows how the Eddington luminosity helps explain the driving force behind the main sequence of quasars, and Shen and Ho call their work the missing definitive proof that quantifies the correlation of quasar properties with a quasar's Eddington ratio.

They used archival observational data to uncover the relationship between the Eddington ratio and the strength of the optical iron (Fe) and oxygen ([O III]) emission lines, which are strongly tied to the physical properties of the quasar's central engine, a supermassive black hole. Their work provides the confidence and the correlations needed to move forward in our understanding of quasars and their relationship to the evolution of galaxies, from the early Universe up to our present epoch.

Astronomers have been studying quasars for a little over 50 years. Beginning in 1960, quasar discoveries began to accumulate, but only through radio telescope observations. Then a very accurate radio position for quasar 3C 273 was obtained using a lunar occultation. With this in hand, Dr. Maarten Schmidt of the California Institute of Technology was able to identify the object in visible light using the 200-inch Palomar telescope. Reviewing the strange spectral lines in its light, Schmidt reached the right conclusion: quasar spectra exhibit an extreme redshift, due to cosmological effects. The cosmological redshift of quasars meant that they lie at great distances from us in space and time. It also spelled the demise of the Steady State theory of the Universe and gave further support to an expanding Universe that emanated from a singularity – the Big Bang.

Dr. Maarten Schmidt, Caltech, with Donald Lynden-Bell, were the first recipients of the Kavli Prize in Astrophysics, “for their seminal contributions to understanding the nature of quasars”. While in high school, this author had the privilege to meet Dr. Schmidt at the Los Angeles Museum of Natural History after his presentation to a group of students. (Photo Credit: Caltech)

The researchers, Yue Shen and Luis C. Ho are from the Institute for Astronomy and Astrophysics at Peking University working with the Carnegie Observatories, Pasadena, California.

References and further reading:

“The diversity of quasars unified by accretion and orientation”, Yue Shen, Luis C. Ho, Sept 11, 2014, Nature

“What is a Quasar?”, Universe Today, Fraser Cain, August 12, 2013

“Interview with Maarten Schmidt”, Caltech Oral Histories, 1999

“Fifty Years of Quasars, a Symposium in honor of Maarten Schmidt”, Caltech, Sept 9, 2013

Comet Siding Spring: Close Call for Mars, Wake Up Call for Earth?

Five orbiters from India, the European Union and the United States will nestle behind Mars as comet Siding Spring passes at a speed of 200,000 km/h (125,000 mph). At right, Comet Shoemaker-Levy 9's impacts on Jupiter and the Chelyabinsk asteroid over Russia. (Credits: NASA, ESA, ISRO)

It was 20 years ago this past July when images of Jupiter being pummeled by a comet caught the world’s attention. Comet Shoemaker-Levy 9 had flown too close to Jupiter. It was captured by the giant planet’s gravity and torn into a string of beads. One by one the comet fragments impacted Jupiter — leaving blemishes on its atmosphere, each several times larger than Earth in size.

Until that event, no one had seen a comet impact a planet. Now Mars will see a very close passage of comet Siding Spring on October 19th. When the comet was first discovered, astronomers quickly realized that it was heading straight at Mars. In fact, it appeared to be a bull's-eye hit — except for the margin of error in calculating a comet's trajectory from 1 billion kilometers (620 million miles, 7 AU) away.

It took several months of analysis before a cataclysmic impact on Mars could be ruled out. So today, Mars faces just a cosmic close shave. But this comet packs enough energy that an impact would have globally altered Mars' surface and atmosphere.

So what should we Earthlings gather from this and other events like it? Are we next? Should we be prepared for impacts from these mile-wide objects, and why?

For one, ask any dinosaur and you will have your answer.

An illustration of the Siding Spring comet in comparison to the Comet 67P atop Los Angeles. The original image was the focus of Bob King’s article – “What Comets, Parking Lots and Charcoal Have in Common“. (Credit: ESA, anosmicovni)

One can say that Mars was spared, as were the five orbiting spacecraft from India (Mars Orbiter Mission), the European Union (Mars Express) and the United States (Mars Odyssey, MRO, MAVEN). We have Scottish-Australian astronomer Robert McNaught to thank for discovering the comet on January 3, 2013, using the half-meter (20 inch) Uppsala Southern Schmidt Telescope at Siding Spring, Australia.

Initially the margin of error in the trajectory was large, but a series of observations gradually reduced it. By late summer 2014, Mars was in the clear, and astronomers could confidently say the comet would pass close but not impact. Furthermore, as observations accumulated — including estimates of the outpouring of gases and dust — comet Siding Spring shrank in estimated size, from potentially tens of kilometers down to about 700 meters (4/10 of a mile) in diameter. Estimates of the gas and dust production are low, the tail and coma — the spherical gas cloud surrounding the solid body — are small, and only the outer edge of both will interact with Mars' atmosphere.

The mass, velocity and kinetic energy of celestial bodies can be deceiving. It is useful to compare the Siding Spring comet to common or man-made objects.

Yet this is a close call for Mars; a collision could not be ruled out for over six months. While this comet is small, it is moving relative to Mars at 200,000 kilometers per hour (125,000 mph, 56 km/sec). This small body packs a wallop. From high school science or introductory college physics, many of us know that the kinetic energy of an object increases with the square of its velocity. Double the velocity and the energy goes up by a factor of 4; triple it, and the energy increases by a factor of 9.
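To make the numbers concrete, here is a rough, hedged estimate of the comet's kinetic energy, taking the 700-meter diameter and 56 km/s speed quoted above and assuming an illustrative bulk density of 500 kg/m³ (comet densities are poorly constrained):

```python
import math

# Assumed parameters: size and speed from the article; the density is
# an illustrative guess, as comet bulk densities are poorly known.
DIAMETER_M = 700.0
DENSITY_KG_M3 = 500.0
VELOCITY_M_S = 56_000.0   # 200,000 km/h relative to Mars

radius = DIAMETER_M / 2
mass = DENSITY_KG_M3 * (4 / 3) * math.pi * radius**3   # sphere, kg

# Kinetic energy KE = 1/2 * m * v^2 -- note the v**2 term, which is
# why doubling the speed quadruples the energy.
ke_joules = 0.5 * mass * VELOCITY_M_S**2
ke_megatons = ke_joules / 4.184e15   # 1 megaton TNT = 4.184e15 J

print(f"mass ~ {mass:.1e} kg, KE ~ {ke_joules:.1e} J (~{ke_megatons:.0f} Mt TNT)")

# Doubling the velocity multiplies the energy by exactly 4:
assert abs(0.5 * mass * (2 * VELOCITY_M_S)**2 / ke_joules - 4.0) < 1e-12
```

Even with these rough assumptions the result is tens of thousands of megatons, far beyond any human-made explosion, which is why even a "small" body at interplanetary speed is a planet-altering event.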

So the close shave for Mars is yet another wake-up call for the “intelligent” space-faring beings of planet Earth — a wake-up call because the close passage of a comet could just as easily have involved Earth. Astronomers would have warned the world of a comet heading straight for us, one that could wipe out 70% of all life, as happened 65 million years ago to the dinosaurs. Replace dinosaurs with humans and you have the full picture.

Time would have been of the essence. The space-faring nations of the world — those of the EU, Russia, the USA, Japan and others — would have gathered and attempted to conceive spacecraft, likely carrying nuclear weapons, that could be built and launched within a few months. Probably several vehicles with weapons would be launched at once, leaving Earth as soon as possible. Intercepting a comet or asteroid farther out would give the impulse from the explosions more time to push the incoming body away from the Earth.

There is no way humanity could sit on its collective hands and wait months for astronomers to observe and measure before declaring it just a close call for Earth. We can imagine the panic it would cause; recall the scenes from Carl Sagan's movie Contact, with people of every persuasion expressing their hopes and fears at 120 decibels. Even a small comet or asteroid, only a half kilometer — a third of a mile — in diameter, would be a cataclysmic event for Mars or Earth.

And yet, in the time since the discovery of comet Siding Spring (January 3, 2013), the Chelyabinsk asteroid (~20 m / 65 ft) exploded in an air burst that injured 1500 people in Russia. The telescope that discovered comet Siding Spring was decommissioned in late 2013, and the Southern Near-Earth Object Survey was shut down, leaving the southern skies without a dedicated telescope for finding near-Earth asteroids. And proposals such as the Sentinel project by the B612 Foundation remain underfunded.

We know the dangers posed by small celestial bodies such as comets and asteroids. Government organizations in the United States and groups at the United Nations are discussing plans. There is still time to find the threats and protect the Earth — but not necessarily time to waste.

Previous U.T. Siding Spring stories:
“What Comets, Parking Lots and Charcoal Have in Common”, Bob King, Sept 5, 2014
“MAVEN Mars Orbiter Ideally Poised to Uniquely Map Comet Siding Spring Composition – Exclusive Interview with Principal Investigator Bruce Jakosky”, Ken Kremer, Sept 5, 2014
“NASA Preps for Nail-biting Comet Flyby of Mars”, Bob King, July 26, 2014

Time Dilation Confirmed in the Lab


It sounds like science fiction, but the time you experience between two events depends directly on the path you take through the universe. In other words, Einstein’s theory of special relativity postulates that a person traveling in a high-speed rocket would age more slowly than people back on Earth.

Although few physicists doubt Einstein was right, it’s crucial to verify time dilation to the best possible accuracy. Now an international team of researchers, including Nobel laureate Theodor Hänsch, director of the Max Planck Institute of Quantum Optics, has done just that.

Tests of special relativity date back to the Ives–Stilwell experiment of 1938. But once we started going to space regularly, we had to deal with time dilation on a daily basis. GPS satellites, for example, are basically clocks in orbit. They travel at a whopping 14,000 kilometers per hour, well above the Earth’s surface at an altitude of about 20,000 kilometers. Relative to an atomic clock on the ground, this speed makes them lose about 7 microseconds per day, a number that has to be taken into account for the system to work properly.
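That 7-microsecond figure follows directly from the time-dilation factor γ = 1/√(1 − v²/c²). Here is a minimal sketch of the arithmetic, using only the orbital speed quoted above; note that it deliberately ignores the general-relativistic blueshift from weaker gravity at altitude, which is larger still and works in the opposite direction:

```python
import math

C = 299_792_458.0            # speed of light in vacuum, m/s
v = 14_000 * 1000 / 3600     # GPS orbital speed: 14,000 km/h converted to m/s
SECONDS_PER_DAY = 86_400

# Lorentz factor; for v << c this is only barely above 1
gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# Fraction by which the moving clock runs slow, accumulated over one day
lag_per_day = (1.0 - 1.0 / gamma) * SECONDS_PER_DAY

print(f"satellite clock loses {lag_per_day * 1e6:.1f} microseconds per day")
```

Run it and the lag comes out at roughly 7 microseconds per day, matching the figure above.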

To test time dilation to a much higher precision, Benjamin Botermann of Johannes Gutenberg-University, Germany, and colleagues accelerated lithium ions to one-third the speed of light. Here the Doppler shift quickly comes into play. Any ions flying toward the observer will be blue shifted and any ions flying away from the observer will be red shifted.

The size of the Doppler shift depends on the ions’ motion relative to the observer. But that same motion also makes the ions’ internal clock run slow, which redshifts their light from the observer’s point of view, an effect that can be measured in the lab.

So the team stimulated transitions in the ions using two lasers propagating in opposite directions. Any shift in the ions’ absorption frequencies then depends on two effects: the classical Doppler shift, which is easy to calculate, and the redshift due to time dilation.
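The beauty of the two-laser arrangement is that, in the product of the two resonant laser frequencies, the first-order Doppler shift cancels out and only the time-dilation factor remains. A minimal sketch of that cancellation, using the β = 1/3 from the experiment (the rest-frame transition frequency ν₀ below is an arbitrary placeholder, not the actual lithium line):

```python
import math

beta = 1.0 / 3.0                        # ion speed as a fraction of c, as in the experiment
gamma = 1.0 / math.sqrt(1.0 - beta**2)  # relativistic time-dilation factor
nu0 = 5.0e14                            # placeholder rest-frame transition frequency, Hz

# Lab-frame laser frequencies resonant with the moving ion:
# one beam chasing the ion, one meeting it head-on.
nu_parallel = nu0 / (gamma * (1.0 + beta))
nu_antiparallel = nu0 / (gamma * (1.0 - beta))

# First-order Doppler terms cancel in the product; if the clock slows by
# exactly gamma, the product equals the squared rest frequency.
ratio = (nu_parallel * nu_antiparallel) / nu0**2
print(ratio)  # equals 1 (to floating-point precision) when special relativity holds
```

Any deviation of that ratio from 1 would signal a departure from the relativistic γ, which is exactly what the experiment constrains.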

The team verified their time dilation prediction to a few parts per billion, improving on previous limits. The findings were published on Sept. 16 in the journal Physical Review Letters.

A Fun Way of Understanding E=mc²

Einstein's Relativity, yet another momentous advancement for humanity brought forth from an ongoing mathematical dialogue. Image via Pixabay.

Many people fail to realize just how much energy is locked up in matter. The nucleus of any atom is an oven of intense radiation, and when you open the oven door, that energy spills out, oftentimes violently. However, there is something even more intrinsic to this aspect of matter that escaped scientists for years.

It wasn’t until the brilliance of Albert Einstein that we were able to fully grasp the correlation between mass and energy. Enter E=mc². This seemingly simple algebraic formula expresses the energy equivalence of any given amount of mass. Many have heard of it, but not very many understand what it implies, or just how much energy is contained within matter. So, for the next few minutes, I will attempt to convey to you the magnitude of your own personal energy equivalence.

First, we must break down this equation. What do each of the letters mean? What are their values? Let’s break it down from left to right:

Albert Einstein. Image credit: Library of Congress

E represents energy, which we measure in Joules. The Joule is the SI unit of energy, defined as kilograms times meters squared per seconds squared [kg·m²/s²]. In practical terms, one Joule is the work done by a force of one newton moving an object one meter in the direction of the force.

m represents the mass of the specified object. For this equation, we measure mass in kilograms (1 kilogram = 1000 grams).

c represents the speed of light. In a vacuum, light moves at 186,282 miles per second. However, in science we use the SI (International System of Units), with meters and kilometers rather than feet and miles. So whenever we do calculations involving light, we use 3.00 × 10⁸ m/s, or 300,000,000 meters per second.

So essentially the equation says that for a specific amount of mass (in kilograms), if you multiply it by the speed of light squared, (3.00 × 10⁸ m/s)², you get its energy equivalence (in Joules). So, what does this mean? How can I relate to it, and how much energy is in matter? Well, here comes the fun part. We are about to conduct an experiment.

This isn’t an experiment that requires fancy equipment or a large laboratory. All we need is simple math and our imagination. Before I go on, I would like to point out that I am using this equation in its most basic form; there are many more complex derivatives used for different applications. It is also worth mentioning that when two atoms fuse (such as hydrogen fusing into helium in the core of our star), only about 0.7% of the mass is converted into energy. For our purposes we needn’t worry about this, as I am simply illustrating the incredible amount of energy that constitutes your equivalence in mass, not the fusion of all of your mass into energy.

Let’s begin by collecting the data to plug into our equation. I weigh roughly 190 pounds. Since we use SI units in science, we need to convert from pounds to grams. Here is how we do this:

1 Josh = 190 lb
1 lb = 453.6 g
So 190 lb × 453.6 g/lb = 86,184 g
So 1 Josh = 86,184 g

Since our measurement for E is in Joules, and Joules are defined in terms of kilograms times meters squared per seconds squared, I need to convert my mass from grams to kilograms. We do that this way:

86,184 g × 1 kg/1000 g = 86.18 kg.

So 1 Josh = 86.18kg.
Now that I’m in the right unit of measure for mass, we can plug the values into the equation and see just what we get:
E = mc²
E = (86.18 kg)(3.00 × 10⁸ m/s)²
E = 7.76 × 10¹⁸ J

That looks like this: 7,760,000,000,000,000,000, or roughly 7.8 quintillion Joules of energy.

Artistic rendition of energy released in an explosion. Via Pixabay.

This is an incredibly large amount of energy. However, it still seems very vague. What does that number mean? How much energy is that really? Well, let’s continue this experiment and find something that we can measure this against, to help put this amount of energy into perspective for us.

First, let’s convert our energy into an equivalent measurement. Something we can relate to. How does TNT sound? First, we must identify a common unit of measurement for TNT. The kiloton. Now we find out just how many kilotons of TNT are in 1 Joule. After doing a little searching I found a conversion ratio that will let us do just this:

1 Joule = 2.39 × 10⁻¹³ kilotons of TNT, meaning that 1 Joule of energy is equal to 0.000000000000239 kilotons of TNT. That is a very small number. A better way to understand this relationship is to flip the ratio around and see how many Joules of energy are in 1 kiloton of TNT: 1 kiloton of TNT = 4.18 × 10¹² Joules, or roughly 4,184,000,000,000 Joules.

Now that we have our conversion ratio, let’s do the math.

1 Josh (E) = 7.76 × 10¹⁸ J
7.76 × 10¹⁸ J × 1 kT TNT / 4.18 × 10¹² J = 1,856,459 kilotons of TNT
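The whole back-of-the-envelope chain can be reproduced in a few lines. This is a sketch using the same rounded constants as the text (190 lb, c = 3.00 × 10⁸ m/s, 4.18 × 10¹² J per kiloton of TNT, 21 kt for the Nagasaki bomb), so the results agree with the figures above to rounding:

```python
# Mass-energy equivalence, end to end, with the article's rounded constants.
LB_TO_G = 453.6          # grams per pound
C = 3.00e8               # speed of light, m/s (rounded, as in the text)
J_PER_KILOTON = 4.18e12  # Joules per kiloton of TNT
NAGASAKI_KT = 21         # approximate yield of the Nagasaki bomb, kilotons

mass_kg = 190 * LB_TO_G / 1000        # 190 lb -> 86.184 kg
energy_joules = mass_kg * C**2        # E = m c^2
kilotons = energy_joules / J_PER_KILOTON
bomb_equivalents = kilotons / NAGASAKI_KT

print(f"{energy_joules:.2e} J  =  {kilotons:,.0f} kt TNT  =  {bomb_equivalents:,.0f} Nagasaki bombs")
```

Swap in your own weight on the first computed line and you get your own personal energy equivalence.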

Thus, concluding our little thought experiment, we find that just one human being is roughly the energy equivalent of 1.86 MILLION kilotons of TNT. Let’s put that into perspective, to illuminate just how much power this equivalence really represents.

The bomb that destroyed Nagasaki in Japan during World War II was devastating. It leveled a city in seconds and brought the War in the Pacific to a close. That bomb was approximately 21 kilotons of explosives. So that means that I, 1 human being, have 88,403 times more explosive energy in me than a bomb that destroyed an entire city… and that goes for every human being.

So when you hear someone tell you that you’ve got real potential, just reply that they have no idea….

Hydrogen Bomb Blast. Image via Pixabay.

There Are No Such Things As Black Holes

UNC-Chapel Hill physics professor Laura Mersini-Houghton has proven mathematically that black holes don't exist. (Source: unc.edu)

That’s the conclusion reached by one researcher from the University of North Carolina: black holes can’t exist in our Universe — not mathematically, anyway.

“I’m still not over the shock,” said Laura Mersini-Houghton, associate physics professor at UNC-Chapel Hill. “We’ve been studying this problem for more than 50 years and this solution gives us a lot to think about.”

In a news article spotlighted by UNC, the scenario suggested by Mersini-Houghton is briefly explained. When a massive star reaches the end of its life and collapses under its own gravity after blasting its outer layers into space, it is commonly thought to leave behind an ultra-dense point called a singularity, surrounded by a light- and energy-trapping event horizon. But during the collapse, the star undergoes a period of intense outgoing radiation (of the sort famously deduced by Stephen Hawking). This release of radiation is enough, Mersini-Houghton has calculated, to cause the collapsing star to lose too much mass for a singularity to form. No singularity means no event horizon… and no black hole.

Artist's conception of the event horizon of a black hole. Credit: Victor de Schwanberg/Science Photo Library

At least, not by her numbers.

Read more: How Do Black Holes Form?

So what does happen to massive stars when they die? Rather than falling ever inwards to create an infinitely dense point hidden behind a space-time “firewall” — something that, while fascinating to ponder and a staple of science fiction, has admittedly been notoriously tricky for scientists to reconcile with known physics — Mersini-Houghton suggests that they just “probably blow up.” (Source)

According to the UNC article Mersini-Houghton’s research “not only forces scientists to reimagine the fabric of space-time, but also rethink the origins of the universe.”

Hm.

The submitted papers on this research are publicly available on arXiv.org and can be found here and here.

Read more: What Would It Be Like To Fall Into a Black Hole?

Don’t believe it? I’m not surprised. I’m certainly no physicist but I do expect that there will be many scientists (and layfolk) who’ll have their own take on Mersini-Houghton’s findings (*ahem* Brian Koberlein*) especially considering 1. the popularity of black holes in astronomical culture, and 2. the many — scratch that; the countless — observations that have been made of quite black hole-ish objects found throughout the Universe.

So what do you think? Have black holes just been voted off the cosmic island? Or are the holes more likely in the research? Share your thoughts in the comments!

Want to hear more from Mersini-Houghton herself? Here’s a link to a video explaining her view of why event horizons and singularities might simply be a myth.

Source: UNC-Chapel Hill. HT to Marco Iozzi on the Google+ Space Community (join us!)

Of course this leads me to ask: if there really are “no black holes” then what’s causing the stars in the center of our galaxy to move like this?

*Added Sept. 25: I knew Brian wouldn’t disappoint! Read his post on why “Yes, Virginia, There Are Black Holes.”

How Watching 13 Billion Years Of Cosmic Growth Links To Storytelling

Screenshot of a simulation of how the universe's dark matter and gas grew in its first 13 billion years. Credit: Harvard-Smithsonian Center for Astrophysics / YouTube

How do you show off 13 billion years of cosmic growth? One way astronomers can do it is through visualizations — such as this one from the Harvard-Smithsonian Center for Astrophysics, called Illustris.

Billed as the most detailed computer simulation of the universe ever made (it ran on a fast supercomputer), Illustris lets you watch galaxies light up as the structure of the universe slowly grows. And while the pictures are pretty to look at, the Kavli Foundation argues the visualization is also good for science.

In a recent roundtable discussion, the foundation polled experts about the simulation (and in particular how the gas evolves), and about how watching these interactions play out before their eyes helps them come to new understandings. But like any dataset, part of the understanding comes from knowing what to focus on and why.

“I think we should look at visualization like mapmakers look at map making. A good mapmaker will be deliberate in what gets included in the map, but also in what gets left out,” said Stuart Levy, a research programmer at the National Center for Supercomputing Applications’ advanced visualization lab, in a statement.

“Visualizers think about their audience … and the specific story they want to tell. And so even with the same audience in mind, you might set up the visualization very differently to tell different stories. For example, for one story you might want to show only what it’s possible for the human eye to see, and in others you might want to show the presence of something that wouldn’t be visible in any sort of radiation at all. That can help to get a point across.”

You can read the whole discussion at this webpage.

Parallel Universes and the Many-Worlds Theory

Credit: Glenn Loos-Austin

Are you unique? In your perception of the world, the answer is simple: you are different from every other person on this planet. But is our universe unique? The concept of multiple realities — or parallel universes — complicates this answer and challenges what we know about the world and ourselves. One model of multiple universes, the Many-Worlds Theory, sounds so bizarre and unrealistic that it seems to belong in science fiction movies rather than real life. Yet no experiment can irrefutably discredit its validity.

The origin of the parallel-universe conjecture is closely connected with the introduction of quantum mechanics in the early 1900s. Quantum mechanics, the branch of physics that studies the world at the smallest scales, predicts the behavior of microscopic objects. Physicists had difficulty fitting a mathematical model to this behavior because some matter exhibited signs of both particle-like and wave-like motion. The photon, a tiny bundle of light, for example, can oscillate like a wave even as it travels forward like a particle.

Such behavior starkly contrasts with that of objects visible to the naked eye, which move like either a wave or a particle, never both. This wave-particle duality is closely tied to the Heisenberg Uncertainty Principle (HUP), which states that the act of observation disturbs quantities like momentum and position.

In quantum mechanics, this observer effect can determine the form — particle or wave — that quantum objects take during measurements. Later quantum theories, like Niels Bohr’s Copenhagen interpretation, use the HUP to state that an observed object does not retain its dual nature and can behave in only one state.

Multiverse Theory
Artist concept of the multiverse. Credit: Florida State University

In 1954, a young student at Princeton University named Hugh Everett proposed a radical supposition that differed from the popular models of quantum mechanics. Everett did not believe that observation causes quantum matter to stop behaving in multiple forms.

Instead, he argued that observation of quantum matter creates a split in the universe. In other words, the universe makes copies of itself to account for all the possibilities and these duplicates will proceed independently. Every time a photon is measured, for instance, a scientist in one universe will analyze it in wave form and the same scientist in another universe will analyze it in particle form. Each of these universes offers a unique and independent reality that coexists with other parallel universes.

If Everett’s Many-Worlds Theory (MWT) is true, it holds many ramifications that completely transform our perception of life. Any action that has more than one possible result produces a split in the universe. Thus, there are an infinite number of parallel universes and infinite copies of each person.

These copies have identical facial and body features, but do not have identical personalities (one may be aggressive and another may be passive) because each one experiences a separate outcome. The infinite number of alternate realities also suggests that nobody can achieve unique accomplishments. Every person – or some version of that person in a parallel universe – has done or will do everything.

Moreover, the MWT implies that everybody is immortal. Old age will no longer be a surefire killer, as some alternate realities could be so scientifically and technologically advanced that they have developed an anti-aging medicine. If you do die in one world, another version of you in another world will survive.

The most troubling implication of parallel universes is that your perception of the world is never real. Our “reality” at an exact moment in one parallel universe will be completely unlike that of another world; it is only a tiny figment of an infinite and absolute truth. You might believe you are reading this article at this instance, but there are many copies of you that are not reading. In fact, you are even the author of this article in some distant reality. Thus, do winning prizes and making decisions matter if we might lose those awards and make different choices? Is living important if we might actually be dead somewhere else?

Some scientists, like the Austrian-born researcher Hans Moravec, have tried to test the possibility of parallel universes. In 1987 Moravec devised a famous thought experiment called quantum suicide, which connects a person to a fatal weapon and a machine that determines the spin value, or angular momentum, of protons. Every 10 seconds, the spin value of a new proton is recorded.

Based on this measurement, the machine causes the weapon either to kill or to spare the person, with a 50 percent chance of each outcome. If the Many-Worlds Theory is not true, then the experimenter’s survival probability decreases with every measurement until it essentially becomes zero (a fraction raised to a large exponent is a very small value). On the other hand, MWT argues that the experimenter always has a 100% chance of living on in some parallel universe, achieving what is called quantum immortality.
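That “fraction raised to a large exponent” claim is easy to make concrete: in a single universe, surviving n independent 50/50 measurements has probability (1/2)ⁿ. A minimal sketch:

```python
def survival_probability(n_measurements: int, p_survive: float = 0.5) -> float:
    """Chance of surviving n independent measurements in a single universe."""
    return p_survive ** n_measurements

# With one measurement every 10 seconds, after just 5 minutes (30 trials)
# the experimenter's single-universe survival odds are about one in a billion.
for n in (1, 10, 30):
    print(n, survival_probability(n))
```

After only a few minutes at the machine, single-universe survival is already astronomically unlikely, which is exactly the asymmetry the thought experiment exploits.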

When the spin measurement is processed, there are two possibilities: the weapon either fires or does not. At this moment, MWT claims that the universe splits into two different universes to account for the two endings. The weapon will discharge in one reality but not in the other. For ethical reasons, scientists cannot use Moravec’s experiment to disprove or corroborate the existence of parallel worlds, as the test subjects may be dead only in that particular reality and still alive in another parallel universe. In any case, the peculiar Many-Worlds Theory and its startling implications challenge everything we know about the world.

Sources: Scientific American