Debris Disks Around Stars Could Point the Way to Giant Exoplanets

This artist's rendering shows a large exoplanet causing small bodies to collide in a disk of dust. Credit: NASA/JPL-Caltech

According to current estimates, there could be as many as 100 billion planets in the Milky Way Galaxy alone. Unfortunately, finding evidence of these planets is tough, time-consuming work. For the most part, astronomers are forced to rely on indirect methods that measure dips in a star's brightness (the Transit Method) or Doppler measurements of the star's own motion (the Radial Velocity Method).

Direct imaging is very difficult because the glare of a star tends to overwhelm the faint light of any planets orbiting it. Luckily, a new study led by the Infrared Processing and Analysis Center (IPAC) at Caltech has determined that there may be a shortcut to finding exoplanets using direct imaging. The solution, they claim, is to look for systems with a circumstellar debris disk, since such systems are likely to host at least one giant planet.

The study, titled “A Direct Imaging Survey of Spitzer Detected Debris Disks: Occurrence of Giant Planets in Dusty Systems”, recently appeared in The Astronomical Journal. Tiffany Meshkat, an assistant research scientist at IPAC/Caltech, was the lead author on the study, which she performed while working at NASA’s Jet Propulsion Laboratory as a postdoctoral researcher.

Artist’s impression of a circumstellar disk of debris around a distant star. Credit: NASA/JPL

For the sake of this study, Dr. Meshkat and her colleagues examined data on 130 different single-star systems with debris disks, which they then compared to 277 stars that do not appear to host disks. These stars were all observed by NASA’s Spitzer Space Telescope and were all relatively young (less than 1 billion years old). Of these 130 systems, 100 had previously been studied in the search for exoplanets.

Dr. Meshkat and her team then followed up on the remaining 30 systems using data from the W.M. Keck Observatory in Hawaii and the European Southern Observatory’s (ESO) Very Large Telescope (VLT) in Chile. While they did not detect any new planets in these systems, their examinations helped characterize the abundance of planets in systems that had disks.

What they found was that young stars with debris disks are more likely to also have giant exoplanets with wide orbits than those that do not. These planets were also likely to have five times the mass of Jupiter, thus making them “Super-Jupiters”. As Dr. Meshkat explained in a recent NASA press release, this study will be of assistance when it comes time for exoplanet-hunters to select their targets:

“Our research is important for how future missions will plan which stars to observe. Many planets that have been found through direct imaging have been in systems that had debris disks, and now we know the dust could be indicators of undiscovered worlds.”

This artist’s conception shows how collisions between planetesimals can create additional debris. Credit: NASA/JPL-Caltech

This study, which was the largest examination of stars with dusty debris disks, also provided the best evidence to date that giant planets are responsible for keeping debris disks in check. While the research did not directly resolve why the presence of a giant planet would cause debris disks to form, the authors indicate that their results are consistent with predictions that debris disks are the products of giant planets stirring up and causing dust collisions.

In other words, they believe that the gravity of a giant planet would cause planetesimals to collide, thus preventing them from forming additional planets. As study co-author Dimitri Mawet, who is also a JPL senior research scientist, explained:

“It’s possible we don’t find small planets in these systems because, early on, these massive bodies destroyed the building blocks of rocky planets, sending them smashing into each other at high speeds instead of gently combining.”

Within the Solar System, the giant planets create debris belts of sorts. For example, between Mars and Jupiter, you have the Main Asteroid Belt, while beyond Neptune lies the Kuiper Belt. Many of the systems examined in this study also have two belts, though they are significantly younger than the Solar System’s own belts – roughly 1 billion years old compared to 4.5 billion years old.

Artist’s impression of Beta Pictoris b. Credit: ESO L. Calçada/N. Risinger (skysurvey.org)

One of the systems examined in the study was Beta Pictoris, a system that has a debris disk, comets, and one confirmed exoplanet. That planet, designated Beta Pictoris b, has 7 Jupiter masses and orbits the star at a distance of 9 AU – i.e. nine times the distance between the Earth and the Sun. This system has been directly imaged by astronomers in the past using ground-based telescopes.

Interestingly enough, astronomers predicted the existence of this exoplanet well before it was confirmed, based on the presence and structure of the system’s debris disk. Another system that was studied was HR 8799, whose debris disk has two prominent dust belts. In these sorts of systems, the presence of additional giant planets is inferred from the need for those dust belts to be maintained.

This is believed to be the case for our own Solar System, where 4 billion years ago, the giant planets diverted passing comets towards the Sun. This resulted in the Late Heavy Bombardment, where the inner planets were subject to countless impacts that are still visible today. Scientists also believe that it was during this period that the migrations of Jupiter, Saturn, Uranus and Neptune deflected dust and small bodies to form the Kuiper Belt and Asteroid Belt.

Dr. Meshkat and her team also noted that the systems they examined contained much more dust than our Solar System, which could be attributable to their differences in age. In the case of systems that are around 1 billion years old, the increased presence of dust could be the result of small bodies that have not yet formed larger bodies colliding. From this, it can be inferred that our Solar System was once much dustier as well.

Artist’s concept of the multi-planet system around HR 8799, initially discovered with Gemini North adaptive optics images. Credit: Gemini Observatory/Lynette Cook

However, the authors note it is also possible that the systems they observed – which have one giant planet and a debris disk – may contain more planets that simply have not been discovered yet. In the end, they concede that more data is needed before these results can be considered conclusive. But in the meantime, this study could serve as a guide to where exoplanets might be found.

As Karl Stapelfeldt, the chief scientist of NASA’s Exoplanet Exploration Program Office and a co-author on the study, stated:

“By showing astronomers where future missions such as NASA’s James Webb Space Telescope have their best chance to find giant exoplanets, this research paves the way to future discoveries.”

In addition, this study could help inform our own understanding of how the Solar System evolved over the course of billions of years. For some time, astronomers have been debating whether or not planets like Jupiter migrated to their current positions, and how this affected the Solar System’s evolution. And there continues to be debate about how the Main Belt formed (i.e. empty or full).

Last, but not least, it could inform future surveys, letting astronomers know which star systems are developing along the same lines as our own did billions of years ago. Wherever star systems have debris disks, astronomers can infer the likely presence of a particularly massive gas giant. And where a disk has two prominent dust belts, they can infer that the system too may come to host many planets and two belts.

Further Reading: NASA, The Astronomical Journal

Tales of the King: Watch as the Moon Occults Regulus for North America This Weekend

The Moon occults Regulus on July 25th, 2017. The Moon also occulted the star shortly after the August 21st total solar eclipse. Credit and copyright: @Shahgazer (Shahrin Ahmad).

Up early Sunday morning? Or perhaps, as we often do, you’re “pulling an all-nighter,” out observing until the break of dawn. Well, the clockwork celestial mechanics of the Universe has a treat in store on the morning of October 15th, as the waning crescent Moon occults (passes in front of) the bright star Regulus (Alpha Leonis, the “Little King” or “Heart of the Lion”) for the contiguous United States, Mexico and southern Canada.

The visibility footprint for Sunday’s occultation. The white solid lines show where the occultation occurs under dark skies, blue marks twilight, and broken lines represent where the event occurs under daytime skies. Credit: Occult 4.2.

You might call this one the “Great American Occultation,” as it takes a similar track to a certain total solar eclipse and another occultation of the bright star Aldebaran earlier this year. The Moon is a 20% illuminated waning crescent during Sunday’s occultation, about the best phase for such an event, as you’ll also get a nice contrasting Earthshine or Ashen light on the dark nighttime limb of the Moon. That’s sunlight from the waxing gibbous Earth, illuminating the (cue Pink Floyd) Dark Side of the Moon.

Moon versus Regulus shortly after Sunday’s occultation as seen from Spring Hill, Florida. Stellarium.

Early morning occultations always see the target star or planet ingress (passing behind) the oncoming bright limb of the waning Moon, then egress (reappearing) from behind its dark limb. During waxing evening occultations, the reverse is true, as the dark limb of the Moon leads the way. The Moon will be 53 degrees west of the Sun during the event, and folks in the western U.S. will see the occultation lower to the eastern horizon under dark skies, while observers from Florida to the Great Lakes will see the event transpire under twilight skies and observers in the U.S. northeast will see the occultation finish up after sunrise. Shining at magnitude +1.4, you’ll be able to see the disappearance and reappearance of Regulus with the unaided eye, though events on the dark limb are always more dramatic. And you may just be able to spy Regulus in the daytime post sunrise near the Moon after the occultation, using binoculars or a telescope.

The northern limit graze line path for Sunday’s occultation. Click here for a Google interactive map. Credit: IOTA

Observers along a line running from Oregon through Lake of the Woods, above the Great Lakes and north of New Brunswick are also in for a treat, as you just might be able to catch a rare grazing occultation of Regulus (see the video below), as the star’s light shines down through lunar valleys and gets blocked by mountain peaks along the limb of the Moon. Such an event can be quite dramatic to watch, as the starlight winks in and out during the very last second of its 79-light-year journey.

A look at the occultation circumstances for selected locations tells the story. The International Occultation Timing Association has a full list for locales across North America.

Location                Ingress             Egress     Moon alt (start/end)
Boise, Idaho            9:48 UT             10:03 UT   3 deg / 6 deg
Tucson, Arizona         (before moonrise)   10:14 UT   NA / 10 deg
Mexico City, Mexico     9:19 UT             10:06 UT   6 deg / 17 deg
Tampa, Florida          9:24 UT             10:32 UT   23 deg / 39 deg
St. Louis, Missouri     9:29 UT             10:26 UT   18 deg / 29 deg
Boston, Massachusetts   9:50 UT             10:44 UT   36 deg / 45 deg
Toronto, Canada         9:47 UT             10:32 UT   29 deg / 37 deg

The Moon is in the midst of a cycle of occultations of Regulus running from December 18th, 2016 to the final one for the cycle on April 24th, 2018. This is number 12 in a series of 19 events, and the best pre-dawn occultation of Regulus for the United States in the current cycle.

US cloud cover percentages, a few hours before the occultation. Credit: NOAA

The Moon can occult four bright +1st magnitude stars during the current epoch: Regulus, Antares, Spica and Aldebaran. And though Regulus lies closest to the ecliptic plane, it actually gets occulted the least of any 1st magnitude star in the 21st century, with only 220 events. The Moon actually also occulted the bright star Pollux up until almost two millennia ago, and will resume doing so again in the future.

Occultations are easy to observe, and are one of the few times (along with eclipses) where you can see the motion of the Moon in real time. The Moon moves its own diameter (30′, or half a degree) per hour, and the reemergence of the bright star will be an abrupt “lights back on” for Regulus. Does it seem to linger a bit between the horns of the crescent Moon? This often-reported optical illusion is called the Coleridge Effect, after a line from Samuel Taylor Coleridge’s (not Iron Maiden’s) Rime of the Ancient Mariner:

Till clomb above the Eastern bar

The horned Moon, with one bright star

Almost atween the tips.

Happen to see Regulus “clomb ‘atween the tips?” We also like to refer to this as the ‘Procter and Gamble effect’ due to the company’s traditional star-filled logo.
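As a quick sanity check on the half-degree-per-hour figure quoted above, the Moon's mean motion against the stars follows directly from its sidereal period. Here is a minimal Python sketch, using the standard 27.322-day sidereal month (a textbook value, not one taken from this article):

```python
# Back-of-envelope check: how fast does the Moon move against the stars?
SIDEREAL_MONTH_DAYS = 27.322  # mean sidereal period of the Moon, in days

deg_per_hour = 360.0 / (SIDEREAL_MONTH_DAYS * 24)  # full circle per month
arcmin_per_hour = deg_per_hour * 60

# The Moon's apparent diameter is ~30 arcminutes, so it covers roughly
# its own width each hour -- which is why a star winks out behind one
# limb and reappears from the other on a timescale of about an hour.
print(f"{deg_per_hour:.3f} deg/hr = {arcmin_per_hour:.1f} arcmin/hr")
```

The result comes out to about 0.55 degrees (33 arcminutes) per hour, matching the "own diameter per hour" rule of thumb.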

Occultations also adorn the flags of many Middle Eastern countries. The star and crescent of Islam traces back to antiquity, but was said to have been adapted for the Turkish flag after Sultan Alp Arslan witnessed a close pairing shortly after the Battle of Manzikert on August 26th, 1071 AD. Though Venus is usually stated as the legendary “star,” Regulus was in fact, just a few degrees away from the Moon on the very same morning… perhaps adding some credence to a major legend vexing vexillology?

The Moon and Regulus on the morning of August 26th, 1071 AD. Stellarium.

Of course, we may never truly know just what Sultan Arslan saw. A more recent occultation tale was featured in the November 2017 issue of Sky and Telescope, positing that an occultation of Aldebaran by the Moon on March 7th, 1974 was the source of William Wilkins’ alleged “Volcano on the Moon…” the timing is certainly right, though one wonders how a skilled observer like Wilkins could be fooled by a prominent star (wishful thinking, maybe?)

Recording an occultation is as easy as aiming a video camera at the Moon through a telescope and letting it run. Start early, and make sure you’ve got the contrast between the bright limb of the Moon and the star adjusted, so both appear in the frame. We like to have WWV radio running in the background for an accurate time hack on the video.

Regulus also has a suspected (though never seen) white dwarf companion. Such a star should shine at around +12th magnitude… and just might make a very brief appearance on the dark limb of the Moon during egress. Its position angle is a total unknown and a big wild card, but you just never know… it’s worth examining that video afterwards, especially if you’re shooting at a high frame rate.

…and speaking of occultations, we’re in the midst of combing through near double occultations of bright stars and planets out to 3000 A.D… hey, it’s what we do for fun. Anyhow, we’re tweeting these out as @Astroguyz as we find ’em, one per day. As a teaser, I give you this grazing occultation of Venus and Regulus over Siberia coming right up in 2025:

The Moon occults Venus and Regulus in 2025. Stellarium.

If nothing else, the cosmic grin of a planet, star and crescent Moon does hint at the Universe’s strange sense of humor.

Scientists Find Treasure Trove of Giant Black Hole Pairs

Artist's impression of merging binary black holes. Credit: LIGO/A. Simonnet.

For decades, astronomers have known that Supermassive Black Holes (SMBHs) reside at the center of most massive galaxies. These black holes, which range from being hundreds of thousands to billions of Solar masses, exert a powerful influence on surrounding matter and are believed to be the cause of Active Galactic Nuclei (AGN). For as long as astronomers have known about them, they have sought to understand how SMBHs form and evolve.

In two recently published studies, two international teams of researchers report the discovery of five new black hole pairs at the centers of distant galaxies. This discovery could help astronomers shed new light on how SMBHs form and grow over time, not to mention how black hole mergers produce the strongest gravitational waves in the Universe.

The first four dual black hole candidates were reported in a study titled “Buried AGNs in Advanced Mergers: Mid-Infrared Color Selection as a Dual AGN Finder“, which was led by Shobita Satyapal, a professor of astrophysics at George Mason University. This study was accepted for publication in The Astrophysical Journal and recently appeared online.

Optical and x-ray data on two of the new black hole pairs discovered. Credit: NASA/CXC/Univ. of Victoria/S.Ellison et al./George Mason Univ./S.Satyapal et al./SDSS

The second study, which reported the fifth dual black hole candidate, was led by Sarah Ellison – an astrophysics professor at the University of Victoria. It was recently published in the Monthly Notices of the Royal Astronomical Society under the title “Discovery of a Dual Active Galactic Nucleus with ~8 kpc Separation”. The discovery of these five black hole pairs was very fortuitous, given that such pairs are a very rare find.

As Shobita Satyapal explained in a Chandra press statement:

“Astronomers find single supermassive black holes all over the universe. But even though we’ve predicted they grow rapidly when they are interacting, growing dual supermassive black holes have been difficult to find.”

The black hole pairs were discovered by combining data from a number of different ground-based and space-based instruments. This included optical data from the Sloan Digital Sky Survey (SDSS) and the ground-based Large Binocular Telescope (LBT) in Arizona with near-infrared data from the Wide-Field Infrared Survey Explorer (WISE) and x-ray data from NASA’s Chandra X-ray Observatory.

For the sake of their studies, Satyapal, Ellison, and their respective teams sought to detect dual AGNs, which are believed to be a consequence of galactic mergers. They began by consulting optical data from the SDSS to identify galaxies that appeared to be in the process of merging. Data from the all-sky WISE survey was then used to identify those galaxies that displayed the most powerful AGNs.

Illustration of a pair of black holes. Credit: NASA/CXC/A.Hobart

They then consulted data from Chandra’s Advanced CCD Imaging Spectrometer (ACIS) and the LBT to identify seven galaxies that appeared to be in an advanced stage of merger. The study led by Ellison also relied on optical data provided by the Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) survey to pinpoint one of the new black hole pairs.

From the combined data, they found that five out of the seven merging galaxies hosted possible dual AGNs, which were separated by less than 10 kiloparsecs (over 30,000 light years). This was evidenced by the infrared data provided by WISE, which was consistent with what is predicted of rapidly growing supermassive black holes.
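As a quick unit check on that parenthetical conversion, the kiloparsec-to-light-year figure follows from the standard conversion factor of about 3.2616 light years per parsec (a textbook value, not one from the study itself):

```python
# Unit check: is 10 kiloparsecs really "over 30,000 light years"?
LY_PER_PARSEC = 3.2616  # standard astronomical conversion factor

separation_kpc = 10
separation_ly = separation_kpc * 1_000 * LY_PER_PARSEC

print(f"{separation_ly:,.0f} light years")  # roughly 32,600
```

So a 10 kpc separation is about 32,600 light years, consistent with the "over 30,000" phrasing above.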

In addition, the Chandra data showed closely-separated pairs of x-ray sources, which is also consistent with black holes that have matter slowly being accreted onto them. This infrared and x-ray data also suggested that the supermassive black holes are buried in large amounts of dust and gas. As Ellison indicated, these findings were the result of painstaking work that consisted of sorting through multiple wavelengths of data:

“Our work shows that combining the infrared selection with X-ray follow-up is a very effective way to find these black hole pairs. X-rays and infrared radiation are able to penetrate the obscuring clouds of gas and dust surrounding these black hole pairs, and Chandra’s sharp vision is needed to separate them”.

Artist’s impression of binary black hole system in the process of merging. Credit: Bohn et al.

Before this study, less than ten pairs of growing black holes had been confirmed based on X-ray studies, and these were mostly by chance. This latest work, which detected five black hole pairs using combined data, was therefore both fortunate and significant. Aside from bolstering the hypothesis that supermassive black holes form from the merger of smaller black holes, these studies also have serious implications for gravitational wave research.

“It is important to understand how common supermassive black hole pairs are, to help in predicting the signals for gravitational wave observatories,” said Satyapal. “With experiments already in place and future ones coming online, this is an exciting time to be researching merging black holes. We are in the early stages of a new era in exploring the universe.”

Since 2016, a total of four instances of gravitational waves have been detected by instruments like the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Virgo Observatory. However, these detections were the result of mergers where the black holes were all smaller and less massive – between eight and 36 Solar masses.

Supermassive Black Holes, on the other hand, are much more massive and will likely produce a much larger gravitational wave signature as they continue to draw closer together. And in a few hundred million years, when these pairs eventually do merge, the resulting energy produced by mass being converted into gravitational waves will be incredible.
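To get a feel for just how "incredible" that energy release would be, a rough illustration via E = mc² helps. The numbers below are assumptions for illustration only (a hypothetical million-solar-mass pair radiating a few percent of its mass), not figures from the studies:

```python
# Rough illustration (assumed numbers, not from the study): energy radiated
# if a few percent of a supermassive black hole merger's mass is converted
# into gravitational waves, via E = m * c**2.
M_SUN_KG = 1.989e30       # mass of the Sun, in kilograms
C = 2.998e8               # speed of light, in m/s

merger_mass_msun = 1e6    # hypothetical million-solar-mass pair
radiated_fraction = 0.05  # assumed few-percent radiative efficiency

energy_joules = radiated_fraction * merger_mass_msun * M_SUN_KG * C**2
print(f"{energy_joules:.2e} J")  # on the order of 1e52 joules
```

For comparison, the stellar-mass mergers LIGO has detected radiated the equivalent of a few solar masses; a supermassive pair under these assumptions would release millions of times more energy.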

Artist’s conception of two merging black holes, similar to those detected by LIGO on January 4th, 2017. Credit: LIGO/Caltech

At present, detectors like LIGO and Virgo are not able to detect the gravitational waves created by Supermassive Black Hole pairs. This work is being done by arrays like the North American Nanohertz Observatory for Gravitational Waves (NANOGrav), which relies on high-precision millisecond pulsars to measure the influence of gravitational waves on space-time.

The proposed Laser Interferometer Space Antenna (LISA), which will be the first dedicated space-based gravitational wave detector, is also expected to help in the search. In the meantime, gravitational wave research has already benefited immensely from collaborative efforts like the one that exists between Advanced LIGO and Advanced Virgo.

In the future, scientists also anticipate that they will be able to study the interiors of supernovae through gravitational wave research. This is likely to reveal a great deal about the mechanisms behind black hole formation. Between all of these ongoing efforts and future developments, we can expect to “hear” a great deal more of the Universe and the most powerful forces at work within it.

Be sure to check out this animation that shows what the eventual merger of two of these black hole pairs will look like, courtesy of the Chandra X-ray Observatory:

Further Reading: Chandra (Harvard), arXiv, MNRAS

New Clues Emerge for the Existence of Planet 9

Artist's impression of Planet Nine, blocking out the Milky Way. The Sun is in the distance, with the orbit of Neptune shown as a ring. Credit: ESO/Tomruen/nagualdesign

Planet 9 cannot hide forever, and new research has narrowed the range of possible locations further! In January of 2016, astronomers Mike Brown and Konstantin Batygin published the first evidence that there might be another planet in our Solar System. Known as “Planet 9” (“Planet X” to some), this hypothetical body was believed to orbit at an extreme distance from our Sun, as evidenced by the orbits of certain extreme Kuiper Belt Objects (eKBOs).

Since that time, multiple studies have attempted to place constraints on Planet 9’s location. The latest once again comes from Brown and Batygin, who conducted an analytical assessment of all the processes that have indicated the presence of Planet 9 so far. Taken together, these indications show that the existence of this body is not only likely, but also essential to the Solar System as we know it.

The study, titled “Dynamical Evolution Induced by Planet Nine”, recently appeared online and has been accepted for publication in The Astronomical Journal. Whereas previous studies have pointed to the behavior of various populations of KBOs as proof of Planet 9, Brown and Batygin sought to provide a coherent theoretical description of the dynamical mechanisms responsible for these effects.

In the end, they concluded that it would be more difficult to imagine a Solar System without a Planet 9 than with one. As Konstantin Batygin explained in a recent NASA press statement:

“There are now five different lines of observational evidence pointing to the existence of Planet Nine. If you were to remove this explanation and imagine Planet Nine does not exist, then you generate more problems than you solve. All of a sudden, you have five different puzzles, and you must come up with five different theories to explain them.”

In 2016, Brown and Batygin described the first three lines of observational evidence for Planet 9. The first is the six extreme Kuiper Belt Objects which follow highly elliptical paths around the Sun, indicative of an unseen mechanism affecting their orbits. The second is the fact that the orbits of these bodies are all tilted the same way – about 30° “downward” relative to the plane of the Kuiper Belt.

The third hint came in the form of computer simulations that included Planet 9 as part of the Solar System. Based on these simulations, it was apparent that more objects should be tilted with respect to the Solar plane, on the order of about 90 degrees. Through their research, Brown and Batygin found five such objects that happened to fit this orbital pattern, and suspected that more existed.

Caltech professor Mike Brown and assistant professor Konstantin Batygin have been working together to investigate Planet Nine. Credit: Lance Hayashida/Caltech

Since the publication of the original paper, two more indications have emerged for the existence of Planet 9. One involves the unexplained orbits of additional Kuiper Belt Objects, which were found to be orbiting in the opposite direction from everything else in the Solar System. This was a telltale indication that a relatively close body with a powerful gravitational influence was affecting their orbits.

The other argument was presented in a second paper by the team – this one led by Elizabeth Bailey, Batygin’s graduate student. This study argued that Planet 9 is responsible for tilting the orbits of the Solar planets over the past 4.5 billion years. This not only provided additional evidence for Planet 9, but also answered a long-standing mystery in astrophysics – why the planets are tilted 6 degrees relative to the Sun’s equator.

As Batygin indicated, all of this adds up to a solid case for the existence of a yet-to-be-discovered massive planet in the outer Solar System:

“No other model can explain the weirdness of these high-inclination orbits. It turns out that Planet Nine provides a natural avenue for their generation. These things have been twisted out of the solar system plane with help from Planet Nine and then scattered inward by Neptune.”

A predicted consequence of Planet Nine is that a second set of confined objects (represented in blue) should also exist. Credit: Caltech/R. Hurt (IPAC)

Recent studies have also shed some light on how and where Planet 9 originated. Whereas some suggested that the planet moved to the edge of the Solar System after forming closer to the Sun, others have suggested that it might be an exoplanet that was captured early in the Solar System’s history. At present, the favored theory appears to be that it formed closer to the Sun and migrated outward over time.

Granted, there is not yet a scientific consensus when it comes to Planet 9 and other astronomers have offered other possible explanations for the evidence cited by Batygin and Brown. For instance, a recent analysis based on the Outer Solar System Origins Survey – which discovered more than 800 new Trans-Neptunian Objects (TNOs) – suggests that the evidence could also be consistent with a random distribution of such objects.

In the meantime, all that remains is to find direct evidence of the planet. At present, Batygin and Brown are attempting to do just that, using the Subaru Telescope at the Mauna Kea Observatory in Hawaii. The detection of this planet will not only settle the matter of whether or not it even exists, it will also help resolve a mystery that emerged in recent years thanks to the discovery of thousands of extra-solar planets.

In short, thanks to the discovery of 3,529 confirmed exoplanets in 2,633 solar systems, astronomers have noticed that, statistically, the most likely types of planets are “Super-Earths” and “mini-Neptunes” – i.e. planets more massive than Earth but not more than about 10 Earth masses. If Planet 9, which is estimated to have 10 times the mass of Earth, is confirmed to exist, it could help explain this discrepancy.

Planet 9, we know you’re out there and we will find you! Unless you’re not, in which case, disregard this message!

Further Reading: NASA

Not an Alien Megastructure, a Cloud of Dust on a 700-Day Orbit

This illustration depicts a hypothetical uneven ring of dust orbiting KIC 8462852, also known as Boyajian's Star or Tabby's Star. Credit: NASA/JPL-Caltech

The mystery of KIC 8462852 (aka. Boyajian’s Star or Tabby’s Star) continues to excite and intrigue! Ever since it was first seen to be undergoing strange and sudden dips in brightness (back in October of 2015) astronomers have been speculating as to what could be causing this. Since that time, various explanations have been offered, including large asteroids, a large planet, a debris disc or even an alien megastructure.

Many studies have been produced that have sought to assign some other natural explanation to the star’s behavior. The latest comes from an international team of scientists – which included Tabetha Boyajian, the lead author on the original 2016 paper. According to this latest study, which was recently published in The Astrophysical Journal, the star’s long-term dimming patterns are likely the result of an uneven dust cloud moving around the star.

New Study Proposes a Giant, Space-Based Solar Flare Shield for Earth

A massive prominence erupts from the surface of the sun. Credit: NASA Goddard Space Flight Center

In today’s modern, fast-paced world, human activity is very much reliant on electrical infrastructure. If the power grids go down, our climate control systems will shut off, our computers will die, and all electronic forms of commerce and communication will cease. But in addition to that, human activity in the 21st century is also becoming increasingly dependent upon the infrastructure located in Low Earth Orbit (LEO).

Aside from the many telecommunications satellites that are currently in space, there’s also the International Space Station and a fleet of GPS satellites. It is for this reason that solar flare activity is considered a serious hazard, and mitigation of it a priority. Looking to address that, a team of scientists from Harvard University recently released a study that proposes a bold solution – placing a giant magnetic shield in orbit.

The study – which was the work of Dr. Manasavi Lingam and Professor Abraham Loeb of the Harvard-Smithsonian Center for Astrophysics (CfA) – recently appeared online under the title “Impact and Mitigation Strategy for Future Solar Flares“. As they explain, solar flares pose a particularly grave risk in today’s world, and will become an even greater threat due to humanity’s growing presence in LEO.

Solar flares have been an ongoing concern for over 150 years, ever since the famous Carrington Event of 1859. Since that time, a great deal of effort has been dedicated to the study of solar flares from both a theoretical and observational standpoint. And thanks to the advances made since then in astronomy and space exploration, much has been learned about the phenomena known as “space weather”.

At the same time, humanity’s increased reliance on electricity and space-based infrastructure has also made us more vulnerable to extreme space weather events. In fact, if the Carrington Event were to take place today, it is estimated that it would cause global damage to electric power grids, satellite communications, and global supply chains.

The cumulative worldwide economic losses, according to a 2009 report by the Space Studies Board (“Severe Space Weather Events–Understanding Societal and Economic Impacts”), would be $10 trillion, and recovery would take several years. And yet, as Professor Loeb explained to Universe Today via email, this threat from space has received far less attention than other possible threats.

“In terms of risk from the sky, most of the attention in the past was dedicated to asteroids,” said Loeb. “They killed the dinosaurs and their physical impact in the past was the same as it will be in the future, unless their orbits are deflected. However, solar flares have little biological impact and their main impact is on technology. But a century ago, there was not much technological infrastructure around, and technology is growing exponentially. Therefore, the damage is highly asymmetric between the past and future.”

Artist’s concept of a large asteroid passing by the Earth-Moon system. Credit: A combination of ESO/NASA images courtesy of Jason Major/Lights in the Dark.

To address this, Lingam and Loeb developed a simple mathematical model to assess the economic losses caused by solar flare activity over time. This model considered the increasing risk of damage to technological infrastructure based on two factors. First, they considered the fact that the maximum energy of solar flares increases with time; they then coupled this with the exponential growth of technology and GDP.

What they determined was that on longer time scales, the rare types of solar flares that are very powerful become much more likely. Coupled with humanity’s growing presence and dependence on spacecraft and satellites in LEO, this will add up to a dangerous conjunction somewhere down the road. Or as Loeb explained:

“We predict that within ~150 years, there will be an event that causes damage comparable to the current US GDP of ~20 trillion dollars, and the damage will increase exponentially at later times until technological development will saturate. Such a forecast was never attempted before. We also suggest a novel idea for how to reduce the damage from energetic particles by a magnetic shield. This was my idea and was not proposed before.”
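The flavor of such a forecast can be captured with a toy calculation. The sketch below is only illustrative: the 3% growth rate and the loss-tracks-GDP assumption are placeholders of my own, not the parameters Lingam and Loeb actually fit.

```python
import math

def expected_damage(years_from_now, present_gdp=20e12, growth_rate=0.03):
    """Toy forecast: damage from a worst-case flare, assuming losses
    scale with an economy growing exponentially at `growth_rate`/year.
    (Assumed parameters for illustration, not the paper's model.)"""
    return present_gdp * math.exp(growth_rate * years_from_now)
```

Under these assumed numbers, the potential loss doubles roughly every 23 years (ln 2 / 0.03), which is the qualitative behavior behind the warning that damage "will increase exponentially at later times".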

To address this growing risk, Lingam and Loeb also considered the possibility of placing a magnetic shield between Earth and the Sun. This shield would be placed at the Earth-Sun Lagrange Point 1, where it would be able to deflect charged particles and create an artificial bow shock around Earth. In this sense, the shield would protect Earth in a way that is similar to what its magnetic field already does, but to greater effect.

Illustration of the proposed magnetic deflector placed at the Earth-Sun L1 Lagrange Point. Credit: Lingam and Loeb, 2017

Based on their assessment, Lingam and Loeb indicate that such a shield is technically feasible in terms of its basic physical parameters. They were also able to provide a rudimentary timeline for the construction of this shield, not to mention some rough cost assessments. As Loeb indicated, such a shield could be built before this century is over, and at a fraction of the cost of what would be incurred from solar flare damage.

“The engineering project associated with the magnetic shield that we propose could take a few decades to construct in space,” he said. “The cost for lifting the needed infrastructure to space (weighting 100,000 tons) will likely be of order 100 billions of dollars, much less than the expected damage over a century.”

Interestingly enough, the idea of using a magnetic shield to protect planets has been proposed before. For example, this type of shield was also the subject of a presentation at this year’s “Planetary Science Vision 2050 Workshop“, which was hosted by NASA’s Planetary Science Division (PSD). This shield was recommended as a means of enhancing Mars’ atmosphere and facilitating crewed missions to its surface in the future.

During the course of the presentation, titled “A Future Mars Environment for Science and Exploration“, Jim Green – the Director of NASA’s Planetary Science Division – discussed how a magnetic shield could protect Mars’ tenuous atmosphere from the solar wind. This would allow it to replenish over time, which would have the added benefit of warming Mars up and allowing liquid water to again flow on its surface. If this sounds similar to proposals for terraforming Mars, that’s because it is!

Artist’s impression of a flaring red dwarf star, orbited by an exoplanet. Credit: NASA, ESA, and G. Bacon (STScI)

Beyond Earth and the Solar System, the implications of this study are quite profound. In recent years, many terrestrial planets have been found orbiting within nearby M-type (aka. red dwarf) star systems. Because these planets orbit close to their respective suns, and because of the variable and unstable nature of M-type stars, scientists have expressed doubts about whether or not these planets could actually be habitable.

In short, scientists have ventured that over the course of billions of years, rocky planets that orbit close to their suns, are tidally-locked with them, and are subject to regular solar flares would lose their atmospheres. In this respect, magnetic shields could be a possible solution to creating extra-solar colonies. Place a large shield in orbit at the L1 Lagrange point, and you never have to worry again about powerful magnetic storms ravaging the planet!

On top of that, this study offers a possible resolution to the Fermi Paradox. When looking for signs of Extra-Terrestrial Intelligence (ETI), it might make sense to monitor distant stars for signs of an orbiting magnetic shield. As Prof. Loeb explained, such structures may have already been detected around distant stars, and could explain some of the unusual observations astronomers have made:

“The imprint of a shield built by another civilization could involve the changes it induces in the brightness of the host star due to occultation (similar behavior to Tabby’s star), if the structure is big enough. The situation could be similar to Dyson’s spheres, but instead of harvesting the energy of the star the purpose of the infrastructure is to protect a technological civilization on a planet from the flares of its host star.”

It is a foregone conclusion that as time and technology progress, humanity’s presence in (and reliance on) space will increase. As such, preparing for the most drastic space weather events the Solar System can throw at us just makes sense. And when it comes to the big questions like “are we alone in the Universe?”, it also makes sense to take our boldest concepts and proposals and consider how they might point the way towards extra-terrestrial intelligence.

Further Reading: arXiv

LIGO Scientists who Detected Gravitational Waves Awarded Nobel Prize in Physics

Barry C. Barish and Kip S. Thorne, two of the recipients for the 2017 Nobel Prize in physics for their work with gravitational wave research. Credit: Caltech

In February of 2016, scientists working for the Laser Interferometer Gravitational-Wave Observatory (LIGO) made history when they announced the first-ever detection of gravitational waves. Since that time, multiple detections have taken place and scientific collaborations between observatories – like Advanced LIGO and Advanced Virgo – are allowing for unprecedented levels of sensitivity and data sharing.

Not only was the first-time detection of gravitational waves an historic accomplishment, it ushered in a new era of astrophysics. It is little wonder then why the three researchers who were central to the first detection have been awarded the 2017 Nobel Prize in Physics. The prize was awarded jointly to Caltech professors emeritus Kip S. Thorne and Barry C. Barish, along with MIT professor emeritus Rainer Weiss.

To put it simply, gravitational waves are ripples in space-time that are formed by major astronomical events – such as the merger of a binary black hole pair. They were first predicted over a century ago by Einstein’s Theory of General Relativity, which indicated that massive perturbations would alter the structure of space-time. However, it was not until recent years that evidence of these waves was observed for the first time.

The first signal was detected by LIGO’s twin observatories – in Hanford, Washington, and Livingston, Louisiana, respectively – and traced to a black hole merger 1.3 billion light-years away. To date, four detections have been made, all of which were due to the mergers of black-hole pairs. These took place on December 26, 2015, January 4, 2017, and August 14, 2017, the last being detected by both LIGO and the European Virgo gravitational-wave detector.

For the role they played in this accomplishment, one half of the prize was awarded jointly to Caltech’s Barry C. Barish – the Ronald and Maxine Linde Professor of Physics, Emeritus – and Kip S. Thorne, the Richard P. Feynman Professor of Theoretical Physics, Emeritus. The other half was awarded to Rainer Weiss, Professor of Physics, Emeritus, at the Massachusetts Institute of Technology (MIT).

As Caltech president Thomas F. Rosenbaum – the Sonja and William Davidow Presidential Chair and Professor of Physics – said in a recent Caltech press statement:

“I am delighted and honored to congratulate Kip and Barry, as well as Rai Weiss of MIT, on the award this morning of the 2017 Nobel Prize in Physics. The first direct observation of gravitational waves by LIGO is an extraordinary demonstration of scientific vision and persistence. Through four decades of development of exquisitely sensitive instrumentation—pushing the capacity of our imaginations—we are now able to glimpse cosmic processes that were previously undetectable. It is truly the start of a new era in astrophysics.”

This accomplishment was all the more impressive considering that Albert Einstein, who first predicted their existence, believed gravitational waves would be too weak to study. However, by the 1960s, advances in laser technology and new insights into possible astrophysical sources led scientists to conclude that these waves might actually be detectable.

The first gravitational-wave detectors were built by Joseph Weber, an astrophysicist from the University of Maryland. His detectors, which were built in the 1960s, consisted of large aluminum cylinders that would be driven to vibrate by passing gravitational waves. Other attempts followed, but all proved unsuccessful, prompting a shift towards a new type of detector involving interferometry.

One such instrument was developed by Weiss at MIT, which relied on the technique known as laser interferometry. In this kind of instrument, gravitational waves are measured using widely spaced and separated mirrors that reflect lasers over long distances. When gravitational waves cause space to stretch and squeeze by infinitesimal amounts, it causes the reflected light inside the detector to shift minutely.
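The scale of "infinitesimal" here is worth a back-of-envelope check: the change in arm length is simply the strain times the arm length. The strain value below is a typical published order of magnitude for a black-hole merger, not data from any specific event.

```python
# Arm-length change produced by a passing gravitational wave:
# dL = h * L, where h is the dimensionless strain.
arm_length_m = 4_000   # each LIGO arm is 4 km long
strain = 1e-21         # typical peak strain from a black-hole merger (assumed)

delta_L = strain * arm_length_m
print(delta_L)  # on the order of 4e-18 m, far smaller than a proton
```

That sub-proton-scale displacement is why the mirror suspension and seismic isolation described later in the article were so critical.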

At the same time, Thorne – along with his students and postdocs at Caltech – began working to improve the theory of gravitational waves. This included new estimates of the strength and frequency of waves produced by objects like black holes, neutron stars and supernovae. This culminated in a 1972 paper that Thorne co-published with his student, Bill Press, summarizing their vision of how gravitational waves could be studied.

That same year, Weiss also published a detailed analysis of interferometers and their potential for astrophysical research. In this paper, he stated that larger-scale operations – measuring several km or more in size – might have a shot at detecting gravitational waves. He also identified the major challenges to detection (such as vibrations from the Earth) and proposed possible solutions for countering them.

Barry C. Barish and Kip S. Thorne, two of three recipients of the 2017 Nobel Prize in Physics. Credit: Caltech

In 1975, Weiss invited Thorne to speak at a NASA committee meeting in Washington, D.C., and the two spent an entire night talking about gravitational experiments. As a result of their conversation, Thorne went back to Caltech and proposed creating an experimental gravity group, which would work on interferometers in parallel with researchers at MIT, the University of Glasgow and the University of Garching (where similar experiments were being conducted).

Development on the first interferometer began shortly thereafter at Caltech, which led to the creation of a 40-meter (130-foot) prototype to test Weiss’ theories about gravitational waves. In 1984, all of the work being conducted by these respective institutions came together. Caltech and MIT, with the support of the National Science Foundation (NSF), formed the LIGO collaboration and began work on its two interferometers in Hanford and Livingston.

The construction of LIGO was a major challenge, both logistically and technically. However, things were helped immensely when Barry Barish (then a Caltech particle physicist) became the Principal Investigator (PI) of LIGO in 1994. After a decade of stalled attempts, he was also made the director of LIGO and put its construction back on track. He also expanded the research team and developed a detailed work plan for the NSF.

As Barish indicated, the work he did with LIGO was something of a dream come true:

“I always wanted to be an experimental physicist and was attracted to the idea of using continuing advances in technology to carry out fundamental science experiments that could not be done otherwise. LIGO is a prime example of what couldn’t be done before. Although it was a very large-scale project, the challenges were very different from the way we build a bridge or carry out other large engineering projects. For LIGO, the challenge was and is how to develop and design advanced instrumentation on a large scale, even as the project evolves.”

LIGO’s two facilities, located in Livingston, Louisiana, and Hanford, Washington. Credit: ligo.caltech.edu

By 1999, construction had wrapped up on the LIGO observatories, and by 2002, LIGO began to obtain data. In 2008, work began on improving its original detectors, known as the Advanced LIGO Project. Scaling up from the 40-m prototype to LIGO’s current 4-km (2.5 mi) interferometers was a massive undertaking, and therefore needed to be broken down into steps.

The first step took place between 2002 and 2010, when the team built and tested the initial interferometers. While this did not result in any detections, it did demonstrate the observatory’s basic concepts and solved many of the technical obstacles. The next phase – called Advanced LIGO, which took place between 2010 and 2015 – allowed the detectors to achieve new levels of sensitivity.

These upgrades, which also happened under Barish’s leadership, allowed for the development of several key technologies which ultimately made the first detection possible. As Barish explained:

“In the initial phase of LIGO, in order to isolate the detectors from the earth’s motion, we used a suspension system that consisted of test-mass mirrors hung by piano wire and used a multiple-stage set of passive shock absorbers, similar to those in your car. We knew this probably would not be good enough to detect gravitational waves, so we, in the LIGO Laboratory, developed an ambitious program for Advanced LIGO that incorporated a new suspension system to stabilize the mirrors and an active seismic isolation system to sense and correct for ground motions.”

Rainer Weiss, famed MIT physicist and partial winner of the 2017 Nobel Prize in Physics. Credit: MIT/Bryce Vickmark

Given how central Thorne, Weiss and Barish were to the study of gravitational waves, all three were rightly recognized as this year’s recipients of the Nobel Prize in Physics. Both Thorne and Barish were notified that they had won in the early morning hours of October 3rd, 2017. In response to the news, both scientists were sure to acknowledge the ongoing efforts of LIGO, the science teams that have contributed to it, and the efforts of Caltech and MIT in creating and maintaining the observatories.

“The prize rightfully belongs to the hundreds of LIGO scientists and engineers who built and perfected our complex gravitational-wave interferometers, and the hundreds of LIGO and Virgo scientists who found the gravitational-wave signals in LIGO’s noisy data and extracted the waves’ information,” said Thorne. “It is unfortunate that, due to the statutes of the Nobel Foundation, the prize has to go to no more than three people, when our marvelous discovery is the work of more than a thousand.”

“I am humbled and honored to receive this award,” said Barish. “The detection of gravitational waves is truly a triumph of modern large-scale experimental physics. Over several decades, our teams at Caltech and MIT developed LIGO into the incredibly sensitive device that made the discovery. When the signal reached LIGO from a collision of two stellar black holes that occurred 1.3 billion years ago, the 1,000-scientist-strong LIGO Scientific Collaboration was able to both identify the candidate event within minutes and perform the detailed analysis that convincingly demonstrated that gravitational waves exist.”

Looking ahead, it is also pretty clear that Advanced LIGO, Advanced Virgo and other gravitational wave observatories around the world are just getting started. In addition to having detected four separate events, recent studies have indicated that gravitational wave detection could also open up new frontiers for astronomical and cosmological research.

For instance, a recent study by a team of researchers from the Monash Center for Astrophysics proposed a theoretical concept known as ‘orphan memory’. According to their research, gravitational waves not only cause waves in space-time, but leave permanent ripples in its structure. By studying the “orphans” of past events, gravitational waves can be studied both as they reach Earth and long after they pass.

In addition, a study released in August by a team of astronomers from the Center for Cosmology at the University of California, Irvine indicated that black hole mergers are far more common than we thought. After conducting a survey of the cosmos intended to calculate and categorize black holes, the UCI team determined that there could be as many as 100 million black holes in the galaxy.

Another recent study indicated that the Advanced LIGO, GEO 600, and Virgo gravitational-wave detector network could also be used to detect the gravitational waves created by supernovae. By detecting the waves created by stars that explode near the end of their lifespans, astronomers would be able to see inside the hearts of collapsing stars for the first time and probe the mechanics of black hole formation.

The Nobel Prize in Physics is one of the highest honors that can be bestowed upon a scientist. But even greater than that is the knowledge that great things resulted from one’s own work. Decades after Thorne, Weiss and Barish began proposing gravitational wave studies and working towards the creation of detectors, scientists from all over the world are making profound discoveries that are revolutionizing the way we think of the Universe.

And as these scientists will surely attest, what we’ve seen so far is just the tip of the iceberg. One can imagine that somewhere, Einstein is also beaming with pride. As with other research pertaining to his theory of General Relativity, the study of gravitational waves is demonstrating that even after a century, his predictions were still bang on!

And be sure to check out this video of the Caltech press conference where Barish and Thorne were honored for their accomplishments:

Further Reading: NASA, Caltech

Determining the Mass of the Milky Way Using Hypervelocity Stars

An artist's conception of a hypervelocity star that has escaped the Milky Way. Credit: NASA

For centuries, astronomers have been looking beyond our Solar System to learn more about the Milky Way Galaxy. And yet, there are still many things about it that elude us, such as knowing its precise mass. Determining this is important to understanding the history of galaxy formation and the evolution of our Universe. As such, astronomers have attempted various techniques for measuring the true mass of the Milky Way.

So far, none of these methods have been particularly successful. However, a new study by a team of researchers from the Harvard-Smithsonian Center for Astrophysics proposed a new and interesting way to determine how much mass is in the Milky Way. By using hypervelocity stars (HVSs) that have been ejected from the center of the galaxy as a reference point, they claim that we can constrain the mass of our galaxy.

Their study, titled “Constraining Milky Way Mass with Hypervelocity Stars“, was recently published in the journal Astronomy and Astrophysics. The study was produced by Dr. Giacomo Fragione, an astrophysicist at the University of Rome, and Professor Abraham Loeb – the Frank B. Baird, Jr. Professor of Science, the Chair of the Astronomy Department, and the Director of the Institute for Theory and Computation at Harvard University.

Stars speeding through the Galaxy. Credit: ESA

To be clear, determining the mass of the Milky Way Galaxy is no simple task. On the one hand, observations are difficult because the Solar System lies deep within the disk of the galaxy itself. But at the same time, there’s also the mass of our galaxy’s dark matter halo, which is difficult to measure since it is not “luminous”, and therefore invisible to conventional methods of detection.

Current estimates of the galaxy’s total mass are based on the motions of tidal streamers of gas and globular clusters, which are both influenced by the gravitational mass of the galaxy. But so far, these measurements have produced mass estimates that range from one to several trillion solar-masses. As Professor Loeb explained to Universe Today via email, precisely measuring the mass of the Milky Way is of great importance to astronomers:

“The Milky Way provides a laboratory for testing the standard cosmological model. This model predicts that the number of satellite galaxies of the Milky Way depends sensitively on its mass. When comparing the predictions to the census of known satellite galaxies, it is essential to know the Milky Way mass. Moreover, the total mass calibrates the amount of invisible (dark) matter and sets the depth of the gravitational potential well and implies how fast should stars move for them to escape to intergalactic space.”

For the sake of their study, Prof. Loeb and Dr. Fragione therefore chose to take a novel approach, which involved modeling the motions of HVSs to determine the mass of our galaxy. More than 20 HVSs have been discovered within our galaxy so far, which travel at speeds of up to 700 km/s (435 mi/s) and are located at distances of about 100 to 50,000 light-years from the galactic center.

Artist’s conception of a hypervelocity star heading out from a spiral galaxy (similar to the Milky Way) and moving into dark matter nearby. Credit: Ben Bromley, University of Utah

These stars are thought to have been ejected from the center of our galaxy thanks to the interactions of binary stars with the supermassive black hole (SMBH) at the center of our galaxy – aka. Sagittarius A*. While their exact cause is still the subject of debate, the orbits of HVSs can be calculated since they are completely determined by the gravitational field of the galaxy.

As they explain in their study, the researchers used the asymmetry in the radial velocity distribution of stars in the galactic halo to determine the galaxy’s gravitational potential. The velocity of these halo stars is dependent on the potential escape speed of HVSs, provided that the time it takes for the HVSs to complete a single orbit is shorter than the lifetime of the halo stars.

From this, they were able to discriminate between different models for the Milky Way and the gravitational force it exerts. By adopting the nominal travel time of these observed HVSs – which they calculated to about 330 million years, about the same as the average lifetime of halo stars – they were able to derive gravitational estimates for the Milky Way which allowed for estimates on its overall mass.

“By calibrating the minimum speed of unbound stars, we find that the Milky Way mass is in the range of 1.2-1.9 trillion solar masses,” said Loeb. While still subject to a range, this latest estimate is a significant improvement over previous estimates. What’s more, these estimates are consistent with our current cosmological model that attempts to account for all visible matter in the Universe, as well as dark matter and dark energy – the Lambda-CDM model.
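To see why modeling the full Galactic potential matters, consider the textbook point-mass relation between escape speed and enclosed mass. The sketch below uses assumed round-number values and is not the study's actual method; it shows that a naive point-mass estimate falls far short of the quoted range.

```python
# Point-mass approximation: v_esc = sqrt(2*G*M/r)  =>  M = v_esc^2 * r / (2*G)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
KPC = 3.086e19     # one kiloparsec in meters

def point_mass_estimate(v_esc_m_s, r_m):
    """Mass implied by escape speed v_esc at radius r, treating the
    Galaxy as a point mass (a deliberate oversimplification)."""
    return v_esc_m_s**2 * r_m / (2 * G)

# Escape speed near the Sun is roughly 550 km/s at r ~ 8 kpc (assumed values):
m_solar = point_mass_estimate(550e3, 8 * KPC) / M_SUN
# This gives only a few hundred billion solar masses -- well below the
# 1.2-1.9 trillion quoted above, because the extended dark matter halo,
# not a central point mass, sets the true escape speed at large radii.
```

This gap between the naive estimate and the measured range is precisely the dark matter contribution the study is designed to capture.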

Distribution of dark matter when the Universe was about 3 billion years old, obtained from a numerical simulation of galaxy formation. Credit: VIRGO Consortium/Alexandre Amblard/ESA

“The inferred Milky Way mass is in the range expected within the standard cosmological model,” said Loeb, “where the amount of dark matter is about five times larger than that of ordinary (luminous) matter.”

Based on this breakdown, it can be said that normal matter in our galaxy – i.e. stars, planets, dust and gas – accounts for between 240 and 380 billion Solar Masses. So not only does this latest study provide more precise mass constraints for our galaxy, it could also help us determine exactly how many star systems are out there – current estimates say that the Milky Way has between 200 and 400 billion stars and 100 billion planets.

Beyond that, this study is also significant to the study of cosmic formation and evolution. By placing more precise estimates on our galaxy’s mass, ones which are consistent with the current breakdown of normal matter and dark matter, cosmologists will be able to construct more accurate accounts of how our Universe came to be. One step closer to understanding the Universe on the grandest of scales!

Further Reading: Harvard Smithsonian CfA, Astronomy and Astrophysics

Old Mars Odyssey Data Indicates Presence of Ice Around Martian Equator

A new paper suggests hydrogen-possibly water ice-in the Medusa Fossae area of Mars, which is in an equatorial region of the planet to the lower left in this view. Image Credit: Steve Lee (University of Colorado), Jim Bell (Cornell University), Mike Wolff (Space Science Institute), and NASA

Finding a source of Martian water – one that is not confined to Mars’ frozen polar regions – has been an ongoing challenge for space agencies and astronomers alike. Between NASA, SpaceX, and every other public and private space venture hoping to conduct crewed missions to Mars in the future, an accessible source of ice would mean the ability to manufacture rocket fuel on site and provide drinking water for an outpost.

So far, attempts to locate an equatorial source of water ice have failed. But after consulting old data from the longest-running mission to Mars in history – NASA’s Mars Odyssey spacecraft – a team of researchers from the Johns Hopkins University Applied Physics Laboratory (JHUAPL) announced that they may have found evidence of a source of water ice in the Medusae Fossae region of Mars.

This region of Mars lies along the equator, situated on the highland-lowland boundary between the Tharsis and Elysium volcanic areas. The area takes its name from the Medusae Fossae Formation, a soft deposit of easily-erodible material that extends for about 5,000 km (3,109 mi) along the equator of Mars. Until now, it was believed to be impossible for water ice to exist there.

Artist’s conception of the Mars Odyssey spacecraft. Credit: NASA/JPL

However, a team led by Jack Wilson – a post-doctoral researcher at the JHUAPL – recently reprocessed data from the Mars Odyssey spacecraft that showed unexpected signals. This data was collected between 2002 and 2009 by the mission’s neutron spectrometer instrument. After reprocessing the lower-resolution compositional data to bring it into sharper focus, the team found that it contained unexpectedly high signals of hydrogen.

To bring the information into higher resolution, Wilson and his team applied image-reconstruction techniques that are typically used to reduce blurring and remove noise from medical and spacecraft imaging data. In so doing, the team was able to improve the data’s spatial resolution from about 520 km (320 mi) to 290 km (180 mi). Ordinarily, this kind of improvement could only be achieved by flying the spacecraft much closer to the surface.

“It was as if we’d cut the spacecraft’s orbital altitude in half,” said Wilson, “and it gave us a much better view of what’s happening on the surface.” And while the neutron spectrometer did not detect water directly, the high abundance of neutrons detected by the spectrometer allowed the research team to calculate the abundance of hydrogen. At high latitudes on Mars, this is considered to be a telltale sign of water ice.

The first time the Mars Odyssey spacecraft detected abundant hydrogen was in 2002, which appeared to be coming from subsurface deposits at high latitudes around Mars. These findings were confirmed in 2008, when NASA’s Phoenix Lander confirmed that the hydrogen took the form of water ice. However, scientists have been operating under the assumption that at lower latitudes, temperatures are too high for water ice to exist.

This artist’s concept of the Mars Reconnaissance Orbiter highlights the spacecraft’s radar capability. Credit: NASA/JPL

In the past, the detection of hydrogen in the equatorial region was thought to be due to the presence of hydrated minerals (i.e. past water). In addition, the Mars Reconnaissance Orbiter (MRO) and the ESA’s Mars Express orbiter have both conducted radar-sounding scans of the area, using their Shallow Subsurface Radar (SHARAD) and Mars Advanced Radar for Subsurface and Ionospheric Sounding (MARSIS) instruments, respectively.

These scans suggested that there were either low-density volcanic deposits or water ice below the surface, though the results seemed more consistent with there being no water ice to speak of. As Wilson indicated, their results lend themselves to more than one possible explanation, but seem to indicate that water ice could be part of the subsurface’s makeup:

“[I]f the detected hydrogen were buried ice within the top meter of the surface, there would be more than would fit into pore space in soil… Perhaps the signature could be explained in terms of extensive deposits of hydrated salts, but how these hydrated salts came to be in the formation is also difficult to explain. So for now, the signature remains a mystery worthy of further study, and Mars continues to surprise us.”

Given Mars’ thin atmosphere and the temperature ranges that are common around the equator – which get as high as 308 K (35 °C; 95 °F) by midday during the summer – it is a mystery how water ice could be preserved there. The leading theory, though, is that a mixture of ice and dust was deposited from the polar regions in the past. This could have happened back when Mars’ axial tilt was greater than it is today.

The MARSIS instrument on the Mars Express is a ground penetrating radar sounder used to look for subsurface water and ice. Credit: ESA

However, those conditions have not been present on Mars for hundreds of thousands or even millions of years. As such, any subsurface ice that was deposited there should be long gone by now. There is also the possibility that subsurface ice could be shielded by layers of hardened dust, but this too is insufficient to explain how water ice could have survived on the timescales involved.

In the end, the presence of abundant hydrogen in the Medusae Fossae region is just another mystery that will require further investigation. The same is true for deposits of water ice in general around the equatorial region of Mars. Such deposits would mean that future missions could have a source of water for manufacturing rocket fuel.

This would shave billions of dollars off the cost of individual missions, since spacecraft would not need to carry enough fuel for a return trip with them. As such, interplanetary spacecraft could be made smaller, lighter and faster. The presence of equatorial water ice could also provide a steady supply of water for a future base on Mars.

Crews could be rotated in and out of this base once every two years – in a way that is similar to what we currently do with the International Space Station. Or – dare I say it? – a local source of water could be used to supply drinking, sanitation and irrigation water to eventual colonists! No matter how you slice it, finding an accessible source of Martian water is critical to the future of space exploration as we know it!

Further Reading: NASA

NASA’s Webb Space Telescope Launch Delayed to 2019

The 18-segment gold coated primary mirror of NASA’s James Webb Space Telescope is raised into vertical alignment in the largest clean room at the agency’s Goddard Space Flight Center in Greenbelt, Maryland, on Nov. 2, 2016. The secondary mirror mount booms are folded down into stowed for launch configuration. Credit: Ken Kremer/kenkremer.com

The most powerful space telescope ever built will have to wait on the ground for a few more months, into 2019, before launching to the High Frontier, where it will look back nearly to the beginning of time and unravel untold astronomical secrets about how the early Universe evolved. Engineers need a bit more time to complete the Webb telescope’s incredibly complex assembly and testing here on Earth.

Blastoff of NASA’s mammoth James Webb Space Telescope (JWST) has been postponed from late 2018 to the spring of 2019.

“NASA’s James Webb Space Telescope now is planning to launch between March and June 2019 from French Guiana, following a schedule assessment of the remaining integration and test activities,” the agency announced.

Until now the Webb telescope was scheduled to launch on a European Space Agency (ESA) Ariane V booster from the Guiana Space Center in Kourou, French Guiana in October 2018.

“The change in launch timing is not indicative of hardware or technical performance concerns,” said Thomas Zurbuchen, associate administrator for NASA’s Science Mission Directorate at Headquarters in Washington, in a statement.

“Rather, the integration of the various spacecraft elements is taking longer than expected.”

NASA says the delay will not bust the currently approved budget or reduce the science output. The budget “accommodates the change in launch date, and the change will not affect planned science observations.”

NASA’s $8.8 Billion James Webb Space Telescope is the most powerful space telescope ever built and is the scientific successor to the phenomenally successful Hubble Space Telescope (HST).

The Webb Telescope is a joint international collaborative project between NASA, the European Space Agency (ESA) and the Canadian Space Agency (CSA).

Up close side-view of newly exposed gold coated primary mirrors installed onto mirror backplane holding structure of NASA’s James Webb Space Telescope inside the massive clean room at NASA’s Goddard Space Flight Center in Greenbelt, Maryland on May 3, 2016. Aft optics subsystem stands upright at center of 18 mirror segments between stowed secondary mirror mount booms. Credit: Ken Kremer/kenkremer.com

Since Webb is not designed to be serviced by astronauts, the extremely complex telescope deployment process must occur on its own over a period of several months and must be fully successful. Webb will be positioned at the L2 Lagrange point, a gravitationally stable spot approximately 930,000 miles (1.5 million km) from Earth.

So it’s better to be safe than sorry and take the extra time needed to ensure the success of the hugely expensive project.

NASA’s James Webb Space Telescope sits in Chamber A at NASA’s Johnson Space Center in Houston awaiting the colossal door to close in July 2017 for cryogenic testing. Credits: NASA/Chris Gunn

Various completed components of the Webb telescope are undergoing final testing around the country to confirm their suitability for launch.

Critical cryogenic cooling testing of Webb’s mirrors and science instrument bus is proceeding well inside a giant chamber at NASA’s Johnson Space Center in Texas.

However, integration and testing of the complex multilayered sunshield at Northrop Grumman’s Redondo Beach, California facility is taking longer than expected and “has experienced delays.”

The tennis-court-sized sunshield will protect the delicate optics and state-of-the-art infrared science instruments on NASA’s Webb Telescope.

Webb’s four research instruments cannot function without the essential cooling provided by the sunshield deployment to maintain them at an operating temperature of minus 388 degrees F (minus 233 degrees C).

The Webb telescope’s groundbreaking sunshield subsystem consists of five layers of Kapton that will keep the optics and instruments incredibly cool, reducing the temperature between the Sun-facing side and the shaded side by more than 570 degrees Fahrenheit. Each layer is as thin as a human hair.

All 5 layers of the Webb telescope sunshield installed at Northrop Grumman’s clean room in Redondo Beach, California. The five sunshield membrane layers are each as thin as a human hair. Credits: Northrop Grumman Corp.

“Webb’s spacecraft and sunshield are larger and more complex than most spacecraft. The combination of some integration activities taking longer than initially planned, such as the installation of more than 100 sunshield membrane release devices, factoring in lessons learned from earlier testing, like longer time spans for vibration testing, has meant the integration and testing process is just taking longer,” said Eric Smith, program director for the James Webb Space Telescope at NASA Headquarters in Washington, in a statement.

“Considering the investment NASA has made, and the good performance to date, we want to proceed very systematically through these tests to be ready for a Spring 2019 launch.”

Artist’s concept of the James Webb Space Telescope (JWST) with Sunshield at bottom. Credit: NASA/ESA

Northrop Grumman designed the Webb telescope’s optics and spacecraft bus for NASA’s Goddard Space Flight Center in Greenbelt, Maryland, which manages Webb.

Watch for Ken’s onsite space mission reports direct from the Kennedy Space Center and Cape Canaveral Air Force Station, Florida.

Stay tuned here for Ken’s continuing Earth and Planetary science and human spaceflight news.

Ken Kremer

………….

Learn more about the upcoming ULA Atlas NRO NROL-52 spysat launch on Oct 5 and SpaceX Falcon 9 SES-11 launch on Oct 7, JWST, OSIRIS-REx, NASA missions and more at Ken’s upcoming outreach events at Kennedy Space Center Quality Inn, Titusville, FL:

Oct 3-6, 8: “ULA Atlas NRO NROL-52 spysat launch, SpaceX SES-11, CRS-12 resupply launches to the ISS, Intelsat35e, BulgariaSat 1 and NRO Spysat, SLS, Orion, Commercial crew capsules from Boeing and SpaceX , Heroes and Legends at KSCVC, ULA Atlas/John Glenn Cygnus launch to ISS, SBIRS GEO 3 launch, GOES-R weather satellite launch, OSIRIS-Rex, Juno at Jupiter, InSight Mars lander, SpaceX and Orbital ATK cargo missions to the ISS, ULA Delta 4 Heavy spy satellite, Curiosity and Opportunity explore Mars, Pluto and more,” Kennedy Space Center Quality Inn, Titusville, FL, evenings