New Study Proposes a Giant, Space-Based Solar Flare Shield for Earth

A massive prominence erupts from the surface of the sun. Credit: NASA Goddard Space Flight Center

In today’s fast-paced world, human activity is heavily reliant on electrical infrastructure. If the power grids go down, our climate control systems will shut off, our computers will die, and all electronic forms of commerce and communication will cease. On top of that, human activity in the 21st century is becoming increasingly dependent on infrastructure located in Low Earth Orbit (LEO).

Aside from the many telecommunications satellites that are currently in space, there’s also the International Space Station and a fleet of GPS satellites. It is for this reason that solar flare activity is considered a serious hazard, and mitigation of it a priority. Looking to address that, a team of scientists from Harvard University recently released a study that proposes a bold solution – placing a giant magnetic shield in orbit.

The study – which was the work of Dr. Manasvi Lingam and Professor Abraham Loeb from the Harvard-Smithsonian Center for Astrophysics (CfA) – recently appeared online under the title “Impact and Mitigation Strategy for Future Solar Flares”. As they explain, solar flares pose a particularly grave risk in today’s world, and will become an even greater threat due to humanity’s growing presence in LEO.

Solar flares have been an ongoing concern for over 150 years, ever since the famous Carrington Event of 1859. Since that time, a great deal of effort has been dedicated to the study of solar flares from both a theoretical and observational standpoint. And thanks to the advances made since then in astronomy and space exploration, much has been learned about the phenomena known as “space weather”.

At the same time, humanity’s increased reliance on electricity and space-based infrastructure has also made us more vulnerable to extreme space weather events. In fact, if the Carrington event were to take place today, it is estimated that it would cause global damage to electric power grids, satellite communications, and global supply chains.

The cumulative worldwide economic losses, according to a 2009 report by the Space Studies Board (“Severe Space Weather Events–Understanding Societal and Economic Impacts”), would be $10 trillion, and recovery would take several years. And yet, as Professor Loeb explained to Universe Today via email, this threat from space has received far less attention than other possible threats.

“In terms of risk from the sky, most of the attention in the past was dedicated to asteroids,” said Loeb. “They killed the dinosaurs and their physical impact in the past was the same as it will be in the future, unless their orbits are deflected. However, solar flares have little biological impact and their main impact is on technology. But a century ago, there was not much technological infrastructure around, and technology is growing exponentially. Therefore, the damage is highly asymmetric between the past and future.”

Artist’s concept of a large asteroid passing by the Earth-Moon system. Credit: A combination of ESO/NASA images courtesy of Jason Major/Lights in the Dark.

To address this, Lingam and Loeb developed a simple mathematical model to assess the economic losses caused by solar flare activity over time. This model considered the increasing risk of damage to technological infrastructure based on two factors. For one, they considered the fact that the energy of solar flares increases with time; they then coupled this with the exponential growth of technology and GDP.

What they determined was that on longer time scales, the rare types of solar flares that are very powerful become much more likely. Coupled with humanity’s growing presence and dependence on spacecraft and satellites in LEO, this will add up to a dangerous conjunction somewhere down the road. Or as Loeb explained:

“We predict that within ~150 years, there will be an event that causes damage comparable to the current US GDP of ~20 trillion dollars, and the damage will increase exponentially at later times until technological development will saturate. Such a forecast was never attempted before. We also suggest a novel idea for how to reduce the damage from energetic particles by a magnetic shield. This was my idea and was not proposed before.”
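The shape of that forecast can be illustrated with a toy calculation. The parameters below (today’s exposed value and the growth rate) are illustrative assumptions, not the values fitted in the paper; the point is only that if the value exposed to a worst-case flare grows exponentially with technology, the expected loss climbs steeply over time.

```python
import math

# Toy sketch of exponentially growing flare damage (illustrative numbers,
# not the paper's fitted model): if the value exposed to a worst-case flare
# tracks economic/technological output growing at a fixed rate, the
# expected loss grows exponentially too.
def projected_damage(years_from_now, exposed_today=2.0e12, growth_rate=0.03):
    """Hypothetical worst-case loss in USD, `years_from_now` years ahead."""
    return exposed_today * math.exp(growth_rate * years_from_now)

# Years until the loss reaches a GDP-scale ~$20 trillion under these
# assumptions: ln(20/2) / 0.03, roughly 77 years.
years_to_gdp_scale = math.log(20e12 / 2.0e12) / 0.03
```

With a smaller initial exposure or slower growth rate, the crossing point stretches out toward the ~150-year horizon the authors quote; the qualitative conclusion – that potential damage compounds alongside technological growth – is unchanged.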

To address this growing risk, Lingam and Loeb also considered the possibility of placing a magnetic shield between Earth and the Sun. This shield would be placed at the Earth-Sun Lagrange Point 1, where it would be able to deflect charged particles and create an artificial bow shock around Earth. In this sense, the shield would protect Earth in a way similar to what its magnetic field already does, but to greater effect.

Illustration of the proposed magnetic deflector placed at the Earth-Sun L1 Lagrange Point. Credit: Lingam and Loeb, 2017

Based on their assessment, Lingam and Loeb indicate that such a shield is technically feasible in terms of its basic physical parameters. They were also able to provide a rudimentary timeline for the construction of this shield, not to mention some rough cost assessments. As Loeb indicated, such a shield could be built before this century is over, and at a fraction of the cost of what would be incurred from solar flare damage.

“The engineering project associated with the magnetic shield that we propose could take a few decades to construct in space,” he said. “The cost for lifting the needed infrastructure to space (weighting 100,000 tons) will likely be of order 100 billions of dollars, much less than the expected damage over a century.”
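Loeb’s cost figure is consistent with a simple back-of-the-envelope check. The launch price per kilogram below is an assumed round number for heavy-lift costs, not a figure taken from the paper:

```python
# Rough launch-cost check (assumed price per kilogram, not from the paper).
shield_mass_kg = 100_000 * 1_000   # 100,000 metric tons, as quoted
price_usd_per_kg = 1_000           # assumed heavy-lift launch price
launch_cost_usd = shield_mass_kg * price_usd_per_kg
# On the order of $100 billion, matching the figure Loeb quotes.
```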

Interestingly enough, the idea of using a magnetic shield to protect planets has been proposed before. For example, this type of shield was also the subject of a presentation at this year’s “Planetary Science Vision 2050 Workshop“, which was hosted by NASA’s Planetary Science Division (PSD). This shield was recommended as a means of enhancing Mars’ atmosphere and facilitating crewed missions to its surface in the future.

During the course of the presentation, titled “A Future Mars Environment for Science and Exploration“, NASA Director Jim Green discussed how a magnetic shield could protect Mars’ tenuous atmosphere from solar wind. This would allow it to replenish over time, which would have the added benefit of warming Mars up and allowing liquid water to again flow on its surface. If this sounds similar to proposals for terraforming Mars, that’s because it is!

Artist’s impression of a flaring red dwarf star, orbited by an exoplanet. Credit: NASA, ESA, and G. Bacon (STScI)

Beyond Earth and the Solar System, the implications of this study are far-reaching. In recent years, many terrestrial planets have been found orbiting within nearby M-type (aka. red dwarf) star systems. Because these planets orbit closely to their respective suns, and because of the variable and unstable nature of M-type stars, scientists have expressed doubts about whether these planets could actually be habitable.

In short, scientists have ventured that over the course of billions of years, rocky planets that orbit close to their suns, are tidally-locked with them, and are subject to regular solar flares would lose their atmospheres. In this respect, magnetic shields could be a possible solution to creating extra-solar colonies. Place a large shield in orbit at the L1 Lagrange point, and you never have to worry again about powerful magnetic storms ravaging the planet!

On top of that, this study offers a possible resolution to the Fermi Paradox. When looking for signs of Extra-Terrestrial Intelligence (ETI), it might make sense to monitor distant stars for signs of an orbiting magnetic shield. As Prof. Loeb explained, such structures may have already been detected around distant stars, and could explain some of the unusual observations astronomers have made:

“The imprint of a shield built by another civilization could involve the changes it induces in the brightness of the host star due to occultation (similar behavior to Tabby’s star)  if the structure is big enough. The situation could be similar to Dyson’s spheres, but instead of harvesting the energy of the star the purpose of the infrastructure is to protect a technological civilization on a planet from the flares of its host star.”

It is a foregone conclusion that as time and technology progress, humanity’s presence in (and reliance on) space will increase. As such, preparing for the most drastic space weather events the Solar System can throw at us just makes sense. And when it comes to the big questions like “are we alone in the Universe?”, it also makes sense to take our boldest concepts and proposals and consider how they might point the way towards extra-terrestrial intelligence.

Further Reading: arXiv

LIGO Scientists who Detected Gravitational Waves Awarded Nobel Prize in Physics

Barry C. Barish and Kip S. Thorne, two of the recipients for the 2017 Nobel Prize in physics for their work with gravitational wave research. Credit: Caltech

In February of 2016, scientists working for the Laser Interferometer Gravitational-Wave Observatory (LIGO) made history when they announced the first-ever detection of gravitational waves. Since that time, multiple detections have taken place and scientific collaborations between observatories  – like Advanced LIGO and Advanced Virgo – are allowing for unprecedented levels of sensitivity and data sharing.

Not only was the first-time detection of gravitational waves an historic accomplishment, it ushered in a new era of astrophysics. It is little wonder then why the three researchers who were central to the first detection have been awarded the 2017 Nobel Prize in Physics. The prize was awarded jointly to Caltech professors emeritus Kip S. Thorne and Barry C. Barish, along with MIT professor emeritus Rainer Weiss.

To put it simply, gravitational waves are ripples in space-time that are formed by major astronomical events – such as the merger of a binary black hole pair. They were first predicted over a century ago by Einstein’s Theory of General Relativity, which indicated that massive perturbations would alter the structure of space-time. However, it was not until recent years that evidence of these waves was observed for the first time.

The first signal was detected by LIGO’s twin observatories – in Hanford, Washington, and Livingston, Louisiana, respectively – and traced to a black hole merger 1.3 billion light-years away. To date, four detections have been made, all of them due to the mergers of black-hole pairs. The subsequent events took place on December 26, 2015, January 4, 2017, and August 14, 2017, the last being detected by both LIGO and the European Virgo gravitational-wave detector.

For the role they played in this accomplishment, one half of the prize was awarded jointly to Caltech’s Barry C. Barish – the Ronald and Maxine Linde Professor of Physics, Emeritus – and Kip S. Thorne, the Richard P. Feynman Professor of Theoretical Physics, Emeritus. The other half was awarded to Rainer Weiss, Professor of Physics, Emeritus, at the Massachusetts Institute of Technology (MIT).

As Caltech president Thomas F. Rosenbaum – the Sonja and William Davidow Presidential Chair and Professor of Physics – said in a recent Caltech press statement:

“I am delighted and honored to congratulate Kip and Barry, as well as Rai Weiss of MIT, on the award this morning of the 2017 Nobel Prize in Physics. The first direct observation of gravitational waves by LIGO is an extraordinary demonstration of scientific vision and persistence. Through four decades of development of exquisitely sensitive instrumentation—pushing the capacity of our imaginations—we are now able to glimpse cosmic processes that were previously undetectable. It is truly the start of a new era in astrophysics.”

This accomplishment was all the more impressive considering that Albert Einstein, who first predicted their existence, believed gravitational waves would be too weak to study. However, by the 1960s, advances in laser technology and new insights into possible astrophysical sources led scientists to conclude that these waves might actually be detectable.

The first gravitational-wave detectors were built by Joseph Weber, an astrophysicist from the University of Maryland. His detectors, which were built in the 1960s, consisted of large aluminum cylinders that would be driven to vibrate by passing gravitational waves. Other attempts followed, but all proved unsuccessful, prompting a shift towards a new type of detector involving interferometry.

One such instrument was developed by Weiss at MIT, relying on the technique known as laser interferometry. In this kind of instrument, gravitational waves are measured using widely separated mirrors that reflect lasers over long distances. When gravitational waves cause space to stretch and squeeze by infinitesimal amounts, the reflected light inside the detector shifts minutely.
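The scale of that measurement is worth spelling out. The displacement an interferometer must resolve is the strain times the arm length; the strain value below is a representative order of magnitude for a detectable event, not the figure from any specific detection:

```python
# Displacement resolved by a laser interferometer: delta_L = h * L,
# where h is the dimensionless strain and L is the arm length.
strain_h = 1e-21          # representative strain of a detectable event
arm_length_m = 4_000.0    # LIGO's 4 km arms
delta_L_m = strain_h * arm_length_m
# ~4e-18 m: a small fraction of the diameter of a single proton.
```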

At the same time, Thorne – along with his students and postdocs at Caltech – began working to improve the theory of gravitational waves. This included new estimates on the strength and frequency of waves produced by objects like black holes, neutron stars and supernovae. This culminated in a 1972 paper which Thorne co-published with his student, Bill Press, which summarized their vision of how gravitational waves could be studied.

That same year, Weiss also published a detailed analysis of interferometers and their potential for astrophysical research. In this paper, he stated that larger-scale operations – measuring several km or more in size – might have a shot at detecting gravitational waves. He also identified the major challenges to detection (such as vibrations from the Earth) and proposed possible solutions for countering them.

Barry C. Barish and Kip S. Thorne, two of three recipients of the 2017 Nobel Prize in Physics. Credit: Caltech

In 1975, Weiss invited Thorne to speak at a NASA committee meeting in Washington, D.C., and the two spent an entire night talking about gravitational experiments. As a result of their conversation, Thorne went back to Caltech and proposed creating an experimental gravity group, which would work on interferometers in parallel with researchers at MIT, the University of Glasgow and the University of Garching (where similar experiments were being conducted).

Development on the first interferometer began shortly thereafter at Caltech, which led to the creation of a 40-meter (130-foot) prototype to test Weiss’ theories about gravitational waves. In 1984, all of the work being conducted by these respective institutions came together. Caltech and MIT, with the support of the National Science Foundation (NSF), formed the LIGO collaboration and began work on its two interferometers in Hanford and Livingston.

The construction of LIGO was a major challenge, both logistically and technically. However, things were helped immensely when Barry Barish (then a Caltech particle physicist) became the Principal Investigator (PI) of LIGO in 1994. After a decade of stalled attempts, he was also made the director of LIGO and put its construction back on track. He also expanded the research team and developed a detailed work plan for the NSF.

As Barish indicated, the work he did with LIGO was something of a dream come true:

“I always wanted to be an experimental physicist and was attracted to the idea of using continuing advances in technology to carry out fundamental science experiments that could not be done otherwise. LIGO is a prime example of what couldn’t be done before. Although it was a very large-scale project, the challenges were very different from the way we build a bridge or carry out other large engineering projects. For LIGO, the challenge was and is how to develop and design advanced instrumentation on a large scale, even as the project evolves.”

LIGO’s two facilities, located in Livingston, Louisiana, and Hanford, Washington. Credit: ligo.caltech.edu

By 1999, construction had wrapped up on the LIGO observatories and by 2002, LIGO began to obtain data. In 2008, work began on improving its original detectors, known as the Advanced LIGO Project. The process of converting the 40-m prototype to LIGO’s current 4-km (2.5 mi) interferometers was a massive undertaking, and therefore needed to be broken down into steps.

The first step took place between 2002 and 2010, when the team built and tested the initial interferometers. While this did not result in any detections, it did demonstrate the observatory’s basic concepts and solved many of the technical obstacles. The next phase – called Advanced LIGO, which took place between 2010 and 2015 – allowed the detectors to achieve new levels of sensitivity.

These upgrades, which also happened under Barish’s leadership, allowed for the development of several key technologies which ultimately made the first detection possible. As Barish explained:

“In the initial phase of LIGO, in order to isolate the detectors from the earth’s motion, we used a suspension system that consisted of test-mass mirrors hung by piano wire and used a multiple-stage set of passive shock absorbers, similar to those in your car. We knew this probably would not be good enough to detect gravitational waves, so we, in the LIGO Laboratory, developed an ambitious program for Advanced LIGO that incorporated a new suspension system to stabilize the mirrors and an active seismic isolation system to sense and correct for ground motions.”

Rainer Weiss, famed MIT physicist and partial winner of the 2017 Nobel Prize in Physics. Credit: MIT/Bryce Vickmark

Given how central Thorne, Weiss and Barish were to the study of gravitational waves, all three were rightly-recognized as this year’s recipients of the Nobel Prize in Physics. Both Thorne and Barish were notified that they had won in the early morning hours on October 3rd, 2017. In response to the news, both scientists were sure to acknowledge the ongoing efforts of LIGO, the science teams that have contributed to it, and the efforts of Caltech and MIT in creating and maintaining the observatories.

“The prize rightfully belongs to the hundreds of LIGO scientists and engineers who built and perfected our complex gravitational-wave interferometers, and the hundreds of LIGO and Virgo scientists who found the gravitational-wave signals in LIGO’s noisy data and extracted the waves’ information,” said Thorne. “It is unfortunate that, due to the statutes of the Nobel Foundation, the prize has to go to no more than three people, when our marvelous discovery is the work of more than a thousand.”

“I am humbled and honored to receive this award,” said Barish. “The detection of gravitational waves is truly a triumph of modern large-scale experimental physics. Over several decades, our teams at Caltech and MIT developed LIGO into the incredibly sensitive device that made the discovery. When the signal reached LIGO from a collision of two stellar black holes that occurred 1.3 billion years ago, the 1,000-scientist-strong LIGO Scientific Collaboration was able to both identify the candidate event within minutes and perform the detailed analysis that convincingly demonstrated that gravitational waves exist.”

Looking ahead, it is also pretty clear that Advanced LIGO, Advanced Virgo and other gravitational wave observatories around the world are just getting started. In addition to having detected four separate events, recent studies have indicated that gravitational wave detection could also open up new frontiers for astronomical and cosmological research.

For instance, a recent study by a team of researchers from the Monash Center for Astrophysics proposed a theoretical concept known as ‘orphan memory’. According to their research, gravitational waves not only cause waves in space-time, but leave permanent ripples in its structure. By studying the “orphans” of past events, gravitational waves can be studied both as they reach Earth and long after they pass.

In addition, a study was released in August by a team of astronomers from the Center of Cosmology at the University of California Irvine that indicated that black hole mergers are far more common than we thought. After conducting a survey of the cosmos intended to calculate and categorize black holes, the UCI team determined that there could be as many as 100 million black holes in the galaxy.

Another recent study indicated that the Advanced LIGO, GEO 600, and Virgo gravitational-wave detector network could also be used to detect the gravitational waves created by supernovae. By detecting the waves created by stars that explode near the end of their lifespans, astronomers would be able to see inside the hearts of collapsing stars for the first time and probe the mechanics of black hole formation.

The Nobel Prize in Physics is one of the highest honors that can be bestowed upon a scientist. But even greater than that is the knowledge that great things resulted from one’s own work. Decades after Thorne, Weiss and Barish began proposing gravitational wave studies and working towards the creation of detectors, scientists from all over the world are making profound discoveries that are revolutionizing the way we think of the Universe.

And as these scientists will surely attest, what we’ve seen so far is just the tip of the iceberg. One can imagine that somewhere, Einstein is also beaming with pride. As with other research pertaining to his theory of General Relativity, the study of gravitational waves is demonstrating that even after a century, his predictions were still bang on!

And be sure to check out this video of the Caltech Press Conference where Barish and Thorne were honored for their accomplishments:

Further Reading: NASA, Caltech

Determining the Mass of the Milky Way Using Hypervelocity Stars

An artist's conception of a hypervelocity star that has escaped the Milky Way. Credit: NASA

For centuries, astronomers have been looking beyond our Solar System to learn more about the Milky Way Galaxy. And yet, there are still many things about it that elude us, such as knowing its precise mass. Determining this is important to understanding the history of galaxy formation and the evolution of our Universe. As such, astronomers have attempted various techniques for measuring the true mass of the Milky Way.

So far, none of these methods have been particularly successful. However, a new study by a team of researchers from the Harvard-Smithsonian Center for Astrophysics proposed a new and interesting way to determine how much mass is in the Milky Way. By using hypervelocity stars (HVSs) that have been ejected from the center of the galaxy as a reference point, they claim that we can constrain the mass of our galaxy.

Their study, titled “Constraining Milky Way Mass with Hypervelocity Stars“, was recently published in the journal Astronomy and Astrophysics. The study was produced by Dr. Giacomo Fragione, an astrophysicist at the University of Rome, and Professor Abraham Loeb – the Frank B. Baird, Jr. Professor of Science, the Chair of the Astronomy Department, and the Director of the Institute for Theory and Computation at Harvard University.

Stars speeding through the Galaxy. Credit: ESA

To be clear, determining the mass of the Milky Way Galaxy is no simple task. On the one hand, observations are difficult because the Solar System lies deep within the disk of the galaxy itself. But at the same time, there’s also the mass of our galaxy’s dark matter halo, which is difficult to measure since it is not “luminous”, and therefore invisible to conventional methods of detection.

Current estimates of the galaxy’s total mass are based on the motions of tidal streamers of gas and globular clusters, which are both influenced by the gravitational mass of the galaxy. But so far, these measurements have produced mass estimates that range from one to several trillion solar-masses. As Professor Loeb explained to Universe Today via email, precisely measuring the mass of the Milky Way is of great importance to astronomers:

“The Milky Way provides a laboratory for testing the standard cosmological model. This model predicts that the number of satellite galaxies of the Milky Way depends sensitively on its mass. When comparing the predictions to the census of known satellite galaxies, it is essential to know the Milky Way mass. Moreover, the total mass calibrates the amount of invisible (dark) matter and sets the depth of the gravitational potential well and implies how fast should stars move for them to escape to intergalactic space.”

For the sake of their study, Prof. Loeb and Dr. Fragione therefore chose to take a novel approach, which involved modeling the motions of HVSs to determine the mass of our galaxy. More than 20 HVSs have been discovered within our galaxy so far, which travel at speeds of up to 700 km/s (435 mi/s) and are located at distances of about 100 to 50,000 light-years from the galactic center.

Artist’s conception of a hypervelocity star heading out from a spiral galaxy (similar to the Milky Way) and moving into dark matter nearby. Credit: Ben Bromley, University of Utah

These stars are thought to have been ejected from the center of our galaxy by the interactions of binary stars with the supermassive black hole (SMBH) that resides there – aka. Sagittarius A*. While their exact cause is still the subject of debate, the orbits of HVSs can be calculated since they are completely determined by the gravitational field of the galaxy.

As they explain in their study, the researchers used the asymmetry in the radial velocity distribution of stars in the galactic halo to determine the galaxy’s gravitational potential. The velocity of these halo stars is dependent on the potential escape speed of HVSs, provided that the time it takes for the HVSs to complete a single orbit is shorter than the lifetime of the halo stars.

From this, they were able to discriminate between different models for the Milky Way and the gravitational force it exerts. By adopting the nominal travel time of these observed HVSs – which they calculated to about 330 million years, about the same as the average lifetime of halo stars – they were able to derive gravitational estimates for the Milky Way which allowed for estimates on its overall mass.

“By calibrating the minimum speed of unbound stars, we find that the Milky Way mass is in the range of 1.2-1.9 trillion solar masses,” said Loeb. While still subject to a range, this latest estimate is a significant improvement over previous estimates. What’s more, these estimates are consistent with our current cosmological models that attempt to account for all visible matter in the Universe, as well as dark matter and dark energy – the Lambda-CDM model.
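The relation at the heart of the method can be illustrated with a point-mass approximation. This is a deliberate simplification (the paper fits multi-component Galactic potential models, not a point mass), and the 50 kpc radius below is an assumed illustrative value:

```python
import math

# Point-mass escape speed, v_esc = sqrt(2 G M / r) -- a simplification of
# the Galactic potential models the authors actually use, shown only to
# illustrate how escape speed constrains the enclosed mass.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
KPC = 3.086e19       # kiloparsec, m

def escape_speed_kms(mass_msun, radius_kpc):
    """Escape speed in km/s from a point mass at the given radius."""
    return math.sqrt(2 * G * mass_msun * M_SUN / (radius_kpc * KPC)) / 1e3

# For the mid-range mass estimate (~1.5 trillion solar masses) at an
# assumed 50 kpc, the escape speed comes out around 500 km/s.
v_esc = escape_speed_kms(1.5e12, 50)
```

Turning the logic around is the essence of the method: measuring the minimum speed of unbound stars pins down the mass, which is how a range like 1.2-1.9 trillion solar masses can be derived.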

Distribution of dark matter when the Universe was about 3 billion years old, obtained from a numerical simulation of galaxy formation. Credit: VIRGO Consortium/Alexandre Amblard/ESA

“The inferred Milky Way mass is in the range expected within the standard cosmological model,” said Loeb, “where the amount of dark matter is about five times larger than that of ordinary (luminous) matter.”

Based on this breakdown, it can be said that normal matter in our galaxy – i.e. stars, planets, dust and gas – accounts for between 240 and 380 billion Solar Masses. So not only does this latest study provide more precise mass constraints for our galaxy, it could also help us to determine exactly how many star systems are out there – current estimates say that the Milky Way has between 200 and 400 billion stars and 100 billion planets.

Beyond that, this study is also significant to the study of cosmic formation and evolution. By placing more precise estimates on our galaxy’s mass, ones which are consistent with the current breakdown of normal matter and dark matter, cosmologists will be able to construct more accurate accounts of how our Universe came to be. One step closer to understanding the Universe on the grandest of scales!

Further Reading: Harvard Smithsonian CfA, Astronomy and Astrophysics

Old Mars Odyssey Data Indicates Presence of Ice Around Martian Equator

A new paper suggests hydrogen – possibly water ice – in the Medusae Fossae area of Mars, which is in an equatorial region of the planet to the lower left in this view. Image Credit: Steve Lee (University of Colorado), Jim Bell (Cornell University), Mike Wolff (Space Science Institute), and NASA

Finding a source of Martian water – one that is not confined to Mars’ frozen polar regions – has been an ongoing challenge for space agencies and astronomers alike. Between NASA, SpaceX, and every other public and private space venture hoping to conduct crewed missions to Mars in the future, an accessible source of ice would mean the ability to manufacture rocket fuel on site and provide drinking water for an outpost.

So far, attempts to locate an equatorial source of water ice have failed. But after consulting old data from the longest-running mission to Mars in history – NASA’s Mars Odyssey spacecraft – a team of researchers from the Johns Hopkins University Applied Physics Laboratory (JHUAPL) announced that they may have found evidence of a source of water ice in the Medusae Fossae region of Mars.

This region of Mars, which is located in the equatorial region, is situated between the highland-lowland boundary near the Tharsis and Elysium volcanic areas. This area is known for its formation of the same name, which is a soft deposit of easily-erodible material that extends for about 5000 km (3,109 mi) along the equator of Mars. Until now, it was believed to be impossible for water ice to exist there.

Artist’s conception of the Mars Odyssey spacecraft. Credit: NASA/JPL

However, a team led by Jack Wilson – a post-doctoral researcher at the JHUAPL – recently reprocessed data from the Mars Odyssey spacecraft that showed unexpected signals. This data was collected between 2002 and 2009 by the mission’s neutron spectrometer instrument. After reprocessing the lower-resolution compositional data to bring it into sharper focus, the team found that it contained unexpectedly high signals of hydrogen.

To bring the information into higher-resolution, Wilson and his team applied image-reconstruction techniques that are typically used to reduce blurring and remove noise from medical and spacecraft imaging data. In so doing, the team was able to improve the data’s spatial resolution from about 520 km (320 mi) to 290 km (180 mi). Ordinarily, this kind of improvement could only be achieved by getting the spacecraft much closer to the surface.

“It was as if we’d cut the spacecraft’s orbital altitude in half,” said Wilson, “and it gave us a much better view of what’s happening on the surface.” And while the neutron spectrometer did not detect water directly, the high abundance of neutrons detected by the spectrometer allowed the research team to calculate the abundance of hydrogen. At high latitudes on Mars, this is considered to be a telltale sign of water ice.

The first time the Mars Odyssey spacecraft detected abundant hydrogen was in 2002, which appeared to be coming from subsurface deposits at high latitudes around Mars. These findings were confirmed in 2008, when NASA’s Phoenix Lander confirmed that the hydrogen took the form of water ice. However, scientists have been operating under the assumption that at lower latitudes, temperatures are too high for water ice to exist.

This artist’s concept of the Mars Reconnaissance Orbiter highlights the spacecraft’s radar capability. Credit: NASA/JPL

In the past, the detection of hydrogen in the equatorial region was thought to be due to the presence of hydrated minerals (i.e. past water). In addition, the Mars Reconnaissance Orbiter (MRO) and the ESA’s Mars Express orbiter have both conducted radar-sounding scans of the area, using their Shallow Subsurface Radar (SHARAD) and Mars Advanced Radar for Subsurface and Ionospheric Sounding (MARSIS) instruments, respectively.

These scans suggested that there were either low-density volcanic deposits or water ice below the surface, though the results seemed more consistent with there being no water ice to speak of. As Wilson indicated, their results lend themselves to more than one possible explanation, but seem to indicate that water ice could be part of the subsurface’s makeup:

“[I]f the detected hydrogen were buried ice within the top meter of the surface, there would be more than would fit into pore space in soil… Perhaps the signature could be explained in terms of extensive deposits of hydrated salts, but how these hydrated salts came to be in the formation is also difficult to explain. So for now, the signature remains a mystery worthy of further study, and Mars continues to surprise us.”

Given Mars’ thin atmosphere and the temperature ranges that are common around the equator – which get as high as 308 K (35 °C; 95 °F) by midday during the summer – it is a mystery how water ice could be preserved there. The leading theory though is that a mixture of ice and dust was deposited from the polar regions in the past. This could have happened back when Mars’ axial tilt was greater than it is today.

The MARSIS instrument on the Mars Express is a ground penetrating radar sounder used to look for subsurface water and ice. Credit: ESA

However, those conditions have not been present on Mars for hundreds of thousands or even millions of years. As such, any subsurface ice that was deposited there should be long gone by now. There is also the possibility that subsurface ice could be shielded by layers of hardened dust, but this too is insufficient to explain how water ice could have survived on the timescales involved.

In the end, the presence of abundant hydrogen in the Medusae Fossae region is just another mystery that will require further investigation. The same is true for deposits of water ice in general around the equatorial region of Mars. Such deposits mean that future missions would have a source of water for manufacturing rocket fuel.

This would shave billions of dollars off the costs of individual missions, since spacecraft would not need to carry enough fuel for a return trip with them. As such, interplanetary spacecraft could be manufactured that would be smaller, lighter and faster. The presence of equatorial water ice could also be used to provide a steady supply of water for a future base on Mars.

Crews could be rotated in and out of this base once every two years – in a way that is similar to what we currently do with the International Space Station. Or – dare I say it? – a local source of water could be used to supply drinking, sanitation and irrigation water to eventual colonists! No matter how you slice it, finding an accessible source of Martian water is critical to the future of space exploration as we know it!

Further Reading: NASA

NASA’s Webb Space Telescope Launch Delayed to 2019

The 18-segment gold coated primary mirror of NASA’s James Webb Space Telescope is raised into vertical alignment in the largest clean room at the agency’s Goddard Space Flight Center in Greenbelt, Maryland, on Nov. 2, 2016. The secondary mirror mount booms are folded down into stowed for launch configuration. Credit: Ken Kremer/kenkremer.com

The most powerful space telescope ever built will have to wait on the ground for a few more months, into 2019, before launching to the High Frontier to look back nearly to the beginning of time and unravel untold astronomical secrets about how the early Universe evolved. Engineers need a bit more time to complete the Webb telescope’s incredibly complex assembly and testing here on Earth.

Blastoff of NASA’s mammoth James Webb Space Telescope (JWST) has been postponed from late 2018 to the spring of 2019.

“NASA’s James Webb Space Telescope now is planning to launch between March and June 2019 from French Guiana, following a schedule assessment of the remaining integration and test activities,” the agency announced.

Until now the Webb telescope was scheduled to launch on a European Space Agency (ESA) Ariane V booster from the Guiana Space Center in Kourou, French Guiana in October 2018.

“The change in launch timing is not indicative of hardware or technical performance concerns,” said Thomas Zurbuchen, associate administrator for NASA’s Science Mission Directorate at Headquarters in Washington, in a statement.

“Rather, the integration of the various spacecraft elements is taking longer than expected.”

NASA says the change will not bust the currently approved budget or reduce the science output. The budget “accommodates the change in launch date, and the change will not affect planned science observations.”

NASA’s $8.8 Billion James Webb Space Telescope is the most powerful space telescope ever built and is the scientific successor to the phenomenally successful Hubble Space Telescope (HST).

The Webb Telescope is a joint international collaborative project between NASA, the European Space Agency (ESA) and the Canadian Space Agency (CSA).

Up close side-view of newly exposed gold coated primary mirrors installed onto mirror backplane holding structure of NASA’s James Webb Space Telescope inside the massive clean room at NASA’s Goddard Space Flight Center in Greenbelt, Maryland on May 3, 2016. Aft optics subsystem stands upright at center of 18 mirror segments between stowed secondary mirror mount booms. Credit: Ken Kremer/kenkremer.com

Since Webb is not designed to be serviced by astronauts, the extremely thorny telescope deployment process is designed to occur on its own over a period of several months and must be fully successful. Webb will be positioned at the L2 Lagrange point – a gravitationally stable spot approximately 930,000 miles (1.5 million km) away from Earth.

So it’s better to be safe than sorry and take the extra time needed to ensure the success of the hugely expensive project.

NASA’s James Webb Space Telescope sits in Chamber A at NASA’s Johnson Space Center in Houston awaiting the colossal door to close in July 2017 for cryogenic testing. Credits: NASA/Chris Gunn

Various completed components of the Webb telescope are undergoing final testing around the country to confirm their suitability for launch.

Critical cryogenic cooling testing of Webb’s mirrors and science instrument bus is proceeding well inside a giant chamber at NASA’s Johnson Space Center in Texas.

However, integration and testing of the complex multilayered sunshield at Northrop Grumman’s Redondo Beach, California, facility is taking longer than expected and “has experienced delays.”

The tennis court sized sunshield will protect the delicate optics and state of the art infrared science instruments on NASA’s Webb Telescope.

Webb’s four research instruments cannot function without the essential cooling provided by the sunshield deployment to maintain them at an operating temperature of minus 388 degrees F (minus 233 degrees C).
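The quoted operating temperature can be double-checked with the standard Celsius-to-Fahrenheit conversion (a quick sketch; the figures are the article’s):

```python
# Verify that -233 degrees C matches the -388 degrees F quoted above.
def c_to_f(celsius: float) -> float:
    """Standard Celsius-to-Fahrenheit conversion."""
    return celsius * 9 / 5 + 32

print(c_to_f(-233))  # -387.4, which rounds to the quoted -388 degrees F
```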

The Webb telescope’s groundbreaking sunshield subsystem consists of five layers of kapton that will keep the optics and instruments incredibly cool, reducing the temperature from the Sun-facing side by more than 570 degrees Fahrenheit. Each layer is as thin as a human hair.

All 5 layers of the Webb telescope sunshield installed at Northrop Grumman’s clean room in Redondo Beach, California. The five sunshield membrane layers are each as thin as a human hair. Credits: Northrop Grumman Corp.

“Webb’s spacecraft and sunshield are larger and more complex than most spacecraft. The combination of some integration activities taking longer than initially planned, such as the installation of more than 100 sunshield membrane release devices, factoring in lessons learned from earlier testing, like longer time spans for vibration testing, has meant the integration and testing process is just taking longer,” said Eric Smith, program director for the James Webb Space Telescope at NASA Headquarters in Washington, in a statement.

“Considering the investment NASA has made, and the good performance to date, we want to proceed very systematically through these tests to be ready for a Spring 2019 launch.”

Artist’s concept of the James Webb Space Telescope (JWST) with Sunshield at bottom. Credit: NASA/ESA

Northrop Grumman designed the Webb telescope’s optics and spacecraft bus for NASA’s Goddard Space Flight Center in Greenbelt, Maryland, which manages Webb.

Watch for Ken’s onsite space mission reports direct from the Kennedy Space Center and Cape Canaveral Air Force Station, Florida.

Stay tuned here for Ken’s continuing Earth and Planetary science and human spaceflight news.

Ken Kremer

………….

Learn more about the upcoming ULA Atlas NRO NROL-52 spysat launch on Oct 5 and SpaceX Falcon 9 SES-11 launch on Oct 7, JWST, OSIRIS-REx, NASA missions and more at Ken’s upcoming outreach events at Kennedy Space Center Quality Inn, Titusville, FL:

Oct 3-6, 8: “ULA Atlas NRO NROL-52 spysat launch, SpaceX SES-11, CRS-12 resupply launches to the ISS, Intelsat35e, BulgariaSat 1 and NRO Spysat, SLS, Orion, Commercial crew capsules from Boeing and SpaceX , Heroes and Legends at KSCVC, ULA Atlas/John Glenn Cygnus launch to ISS, SBIRS GEO 3 launch, GOES-R weather satellite launch, OSIRIS-Rex, Juno at Jupiter, InSight Mars lander, SpaceX and Orbital ATK cargo missions to the ISS, ULA Delta 4 Heavy spy satellite, Curiosity and Opportunity explore Mars, Pluto and more,” Kennedy Space Center Quality Inn, Titusville, FL, evenings

New Study Says Earth Avoided a “Carbon Overdose” During Formation

A new study from the University of Heidelberg suggests that flash-heating and carbon depletion could have been intrinsic to the emergence and evolution of life on Earth. Credit: NASA

According to the Nebular Hypothesis, the Sun and planets formed 4.6 billion years ago from a giant cloud of dust and gas. This began with the Sun forming in the center, and the remaining material forming a protoplanetary disc, from which the planets formed. Whereas the planets in the outer Solar System were largely made up of gases (i.e. the Gas Giants), those closer to the Sun formed from silicate minerals and metals (i.e. the terrestrial planets).

Despite having a pretty good idea of how this all came about, the question of exactly how the planets of the Solar System formed and evolved over the course of billions of years is still subject to debate. In a new study, two researchers from the University of Heidelberg considered the role played by carbon in both the formation of Earth and the emergence and evolution of life.

Their study, “Spatial Distribution of Carbon Dust in the Early Solar Nebula and the Carbon Content of Planetesimals“, recently appeared in the journal Astronomy and Astrophysics. The study was conducted by Hans-Peter Gail, from the Institute for Theoretical Astrophysics at the University of Heidelberg, and Mario Trieloff – from Heidelberg’s Institute of Earth Sciences and the Klaus-Tschira-Laboratory for Cosmochemistry.

A slice of the Allende meteorite with silicate globules of the size of a millimetre. Credit: Institute of Earth Science

For the sake of their study, the pair considered what role the element carbon – which is essential to life here on Earth – played in planetary formation. Essentially, scientists are of the opinion that during the earliest days of the Solar System – when it was still a giant cloud of dust and gas – carbon-rich materials were distributed to the inner Solar System from the outer Solar System.

Out beyond the “Frost Line” – where volatiles like water, ammonia and methane are able to condense into ice – bodies containing frozen carbon compounds formed. Much like how water was distributed throughout the Solar System, these bodies were supposedly kicked out of their orbits and sent towards the Sun, distributing volatile materials to the planetesimals that would eventually grow to become the terrestrial planets.

However, when one compares Earth to the kinds of meteorites that distributed primordial material to it – aka. chondrite meteorites – one notices a certain discrepancy. Basically, carbon is comparatively rare on Earth compared to these ancient rocks, and the reason for this has remained a mystery. As Prof. Trieloff, who was the co-author on the study, explained in a University of Heidelberg press release:

“On Earth, carbon is a relatively rare element. It is enriched close to the Earth’s surface, but as a fraction of the total matter on Earth it is a mere one-half of 1/1000th. In primitive comets, however, the proportion of carbon can be ten percent or more.”

Artist’s conception of a solar system in formation. Credit: NASA/FUSE/Lynette Cook

“A substantial portion of the carbon in asteroids and comets is in long-chain and branched molecules that evaporate only at very high temperatures,” added Dr. Gail, the study’s lead author. “Based on the standard models that simulate carbon reactions in the solar nebula where the sun and planets originated, the Earth and the other terrestrial planets should have up to 100 times more carbon.”

To address this, the two researchers constructed a model that assumed that short-duration flash-heating events – where the Sun heated the protoplanetary disc – were responsible for this discrepancy. They also assumed that all matter in the inner Solar System was heated to temperatures of between 1,300 and 1,800 °C (2372 to 3272 °F) before small planetesimals and terrestrial planets eventually formed.

Dr. Gail and Prof. Trieloff believe the evidence for this lies in the round grains in meteorites that form from molten droplets – known as chondrules. Unlike chondrite meteorites, which can be composed of up to a few percent carbon, chondrules are largely depleted of this element. This, they claim, was the result of the same flash-heating events that took place before the chondrules could accrete to form meteorites. As Dr. Gail indicated:

“Only the spikes in temperature derived from the chondrule formation models can explain today’s low amount of carbon on the inner planets. Previous models did not take this process into account, but we apparently have it to thank for the correct amount of carbon that allowed the evolution of the Earth’s biosphere as we know it.”

Artist impression of the Late Heavy Bombardment period. Credit: NASA

In short, the discrepancy between the amount of carbon found in chondritic-rock material and that found on Earth can be explained by intense heating in the primordial Solar System. As Earth formed from chondritic material, the extreme heat caused it to be depleted of its natural carbon. In addition to shedding light on what has been an ongoing mystery in astronomy, this study also offers new insight into how life in the Solar System began.

Basically, the researchers speculate that the flash-heating events in the inner Solar System may have been necessary for life here on Earth. Had there been too much carbon in the primordial material that coalesced into our planet, the result could have been a “carbon overdose”. This is because when carbon becomes oxidized, it forms carbon dioxide, a major greenhouse gas that can lead to a runaway heating effect.

This is what planetary scientists believe happened to Venus, where the presence of abundant CO2 – combined with its increased exposure to Solar radiation – led to the hellish environment that is there today. But on Earth, CO2 was removed from the atmosphere by the silicate-carbonate cycle, which allowed for Earth to achieve a balanced and life-sustaining environment.

“Whether 100 times more carbon would permit effective removal of the greenhouse gas is questionable at the very least,” said Dr. Trieloff. “The carbon could no longer be stored in carbonates, where most of the Earth’s CO2 is stored today. This much CO2 in the atmosphere would cause such a severe and irreversible greenhouse effect that the oceans would evaporate and disappear.”

Artist’s impression of the “Venus-like” exoplanet in a nearby star system. Credit: cfa.harvard.edu

It is a well-known fact that life here on Earth is carbon-based. However, knowing that conditions during the early Solar System prevented an overdose of carbon that could have turned Earth into a second Venus is certainly interesting. While carbon may be essential to life as we know it, too much can mean the death of it. This study could also come in handy when it comes to the search for life in extra-solar systems.

When examining distant stars, astronomers could ask, “were primordial conditions hot enough in the inner system to prevent a carbon overdose?” The answer to that question could be the difference between finding an Earth 2.0, or another Venus-like world!

Further Reading: University of Heidelberg, Astronomy and Astrophysics

Elon Musk Reveals Further Plans to Colonize Mars and Make Aerospace Transit a Reality

The founder of SpaceX said a planned interplanetary transport system would be downsized so it could carry out a range of tasks that would then pay for future Mars missions. Credit: AFP/Peter Parks

For years, Elon Musk and the company he founded to reduce the associated costs of space exploration (SpaceX) have been leading the charge in the development of private spaceflight. Beyond capturing the attention of the world with reusable rocket tests and the development of next-generation space vehicles, Musk has also garnered a lot of attention for his long-term plans.

These plans were the subject of a presentation made on Friday, September 29th, during the International Astronautical Congress (IAC) – which ran from September 25th to September 29th in Adelaide, Australia. During the course of the presentation, Musk detailed his plans to send cargo ships to Mars by 2022 and to conduct regular aerospace trips between major cities here on Earth.


New Study Sheds Light on How Earth and Mars Formed

Snapshot of a computer simulation of two (relatively small) planets colliding with each other. The colors show how the rock of the impacting body (dark grey, in center of impact area) accretes to the target body (rock; light grey), while some of the rock in the impact area is molten (yellow to white) or vaporised (red). Credit: Philip J. Carter

In accordance with the Nebular Hypothesis, the Solar System is believed to have formed through the process of accretion. Essentially, this began when a massive cloud of dust and gas (aka. the Solar Nebula) experienced a gravitational collapse at its center, giving birth to the Sun. The remaining dust and gas then formed into a protoplanetary disc around the Sun, which gradually coalesced to form the planets.

However, much about the process of how planets evolved to become distinct in their compositions has remained a mystery. Luckily, a new study by a team of researchers from the University of Bristol has approached the subject with a fresh perspective. By examining a combination of Earth samples and meteorites, they have shed new light on how planets like Earth and Mars formed and evolved.

The study, titled “Magnesium Isotope Evidence that Accretional Vapour Loss Shapes Planetary Compositions“, recently appeared in the scientific journal Nature. Led by Remco C. Hin, a senior research associate from the School of Earth Sciences at the University of Bristol, the team compared samples of rock from Earth, Mars, and the asteroid Vesta, measuring the levels of magnesium isotopes within them.

Artist’s impression of the early Solar System, where collision between particles in an accretion disc led to the formation of planetesimals and eventually planets. Credit: NASA/JPL-Caltech

Their study attempted to answer what has been a lingering question in the scientific community – i.e. did the planets form the way they are today, or did they acquire their distinctive compositions over time? As Dr. Remco Hin explained in a University of Bristol press release:

“We have provided evidence that such a sequence of events occurred in the formation of the Earth and Mars, using high precision measurements of their magnesium isotope compositions. Magnesium isotope ratios change as a result of silicate vapour loss, which preferentially contains the lighter isotopes. In this way, we estimated that more than 40 per cent of the Earth’s mass was lost during its construction. This cowboy building job, as one of my co-authors described it, was also responsible for creating the Earth’s unique composition.”

To break it down, accretion consists of clumps of material colliding with neighboring clumps to form larger objects. This process is very chaotic, and material is often lost as well as accumulated due to the extreme heat generated by these high-speed collisions. This heat is also believed to have created oceans of magma on the planets as they formed, not to mention temporary atmospheres of vaporized rock.

Until planets grow to about the size of Mars, their gravitational attraction is too weak to hold onto these atmospheres. And as more collisions took place, the composition of these atmospheres – and of the planets themselves – would have changed substantially. How exactly the terrestrial planets – Mercury, Venus, Earth and Mars – obtained their current, volatile-poor compositions over time is what scientists have hoped to address.

Artist impression of the Late Heavy Bombardment period. Credit: NASA

For example, some believe that the planets’ current compositions are the result of particular combinations of gas and dust during the earliest periods of planet formation – with terrestrial planets being silicate/metal rich but volatile poor because those elements were most abundant closest to the Sun. Others have suggested that their current composition is a consequence of their violent growth and collisions with other bodies.

To shed light on this, Dr. Hin and his associates analyzed samples of Earth, along with meteorites from Mars and the asteroid Vesta, using a new analytical approach. This technique is capable of obtaining more accurate measurements of magnesium isotope ratios than any previous method. This method also showed that all differentiated bodies – like Earth, Mars and Vesta – have isotopically heavier magnesium compositions than chondritic meteorites.

From this, they were able to draw three conclusions. For one, they found that Earth, Mars and Vesta have distinct magnesium isotope ratios that could not be explained by condensation from the Solar Nebula. Second, they noted that the study of heavy magnesium isotopes revealed that in all cases, the planets lost about 40 percent of their mass during their formation period, following repeated episodes of vaporization.

Last, they determined that the accretion process results in other chemical changes that generate the unique chemical characteristics of Earth. In short, their study showed that Earth, Mars and Vesta all experienced significant losses of material after formation, which means that their peculiar compositions were likely the result of collisions over time. As Dr. Hin added:

“Our work changes our views on how planets attain their physical and chemical characteristics. While it was previously known that building planets is a violent process and that the compositions of planets such as Earth are distinct, it was not clear that these features were linked. We now show that vapour loss during the high energy collisions of planetary accretion has a profound effect on a planet’s composition.”

Their study also indicated that this violent formation process could be characteristic of planets in general. These findings are not only significant when it comes to the formation of the Solar System, but of extra-solar planets as well. When it comes time to explore distant star systems, the distinctive compositions of their planets will tell us much about the conditions from which they formed, and how they came to be.

Further Reading: University of Bristol, Nature

LIGO and Virgo Observatories Detect Black Holes Colliding

In February 2016, LIGO detected gravity waves for the first time. As this artist's illustration depicts, the gravitational waves were created by merging black holes. The third detection just announced was also created when two black holes merged. Credit: LIGO/A. Simonnet.
Artist's impression of merging binary black holes. Credit: LIGO/A. Simonnet.

On February 11th, 2016, scientists at the Laser Interferometer Gravitational-wave Observatory (LIGO) announced the first detection of gravitational waves. This development, which confirmed a prediction made by Einstein’s Theory of General Relativity a century ago, has opened up new avenues of research for cosmologists and astrophysicists. Since that time, more detections have been made, all of which were said to be the result of black holes merging.

The latest detection took place on August 14th, 2017, when three observatories – the two Advanced LIGO detectors and the Advanced Virgo detector – simultaneously detected the gravitational waves created by merging black holes. This was the first time that gravitational waves were detected by three different facilities from around the world, thus ushering in a new era of globally-networked research into this cosmic phenomenon.

The study which detailed these observations was recently published online by the LIGO Scientific Collaboration and the Virgo Collaboration. Titled “GW170814 : A Three-Detector Observation of Gravitational Waves from a Binary Black Hole Coalescence“, this study has also been accepted for publication in the scientific journal Physical Review Letters.

Aerial view of the Virgo Observatory. Credit: The Virgo collaboration/CCO 1.0

The event, designated as GW170814, was observed at 10:30:43 UTC (06:30:43 EDT; 03:30:43 PDT) on August 14th, 2017. The event was detected by the National Science Foundation‘s two LIGO detectors (located in Livingston, Louisiana, and Hanford, Washington) and the Virgo detector located near Pisa, Italy – which is maintained by the National Center for Scientific Research (CNRS) and the National Institute for Nuclear Physics (INFN).

Though not the first instance of gravitational waves being detected, this was the first time that an event was detected by three observatories simultaneously. As France Córdova, the director of the NSF, said in a recent LIGO press release:

“Little more than a year and a half ago, NSF announced that its Laser Interferometer Gravitational Wave Observatory had made the first-ever detection of gravitational waves, which resulted from the collision of two black holes in a galaxy a billion light-years away. Today, we are delighted to announce the first discovery made in partnership between the Virgo gravitational-wave observatory and the LIGO Scientific Collaboration, the first time a gravitational wave detection was observed by these observatories, located thousands of miles apart. This is an exciting milestone in the growing international scientific effort to unlock the extraordinary mysteries of our universe.”

Based on the waves detected, the LIGO Scientific Collaboration (LSC) and Virgo collaboration were able to determine the type of event, as well as the mass of the objects involved. According to their study, the event was triggered by the merger of two black holes – which were 31 and 25 Solar Masses, respectively. The event took place about 1.8 billion light years from Earth, and resulted in the formation of a spinning black hole with about 53 Solar Masses.

LIGO’s two facilities, located in Livingston, Louisiana, and Hanford, Washington. Credit: ligo.caltech.edu

What this means is that about three Solar Masses were converted into gravitational-wave energy during the merger, which was then detected by LIGO and Virgo. While impressive on its own, this latest detection is merely a taste of what gravitational wave detectors like the LIGO and Virgo collaborations can do now that they have entered their advanced stages, and into cooperation with each other.
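The "three Solar Masses converted into gravitational-wave energy" follows directly from the mass budget given above (31 + 25 solar masses in, ~53 out) and E = mc². A minimal sketch using standard constant values (the constants here are textbook figures, not from the study):

```python
# Rough energy budget for GW170814, using the masses quoted above.
M_SUN = 1.989e30   # solar mass in kg (standard value)
C = 2.998e8        # speed of light in m/s (standard value)

m_initial = (31 + 25) * M_SUN     # two merging black holes
m_final = 53 * M_SUN              # remnant black hole
m_radiated = m_initial - m_final  # ~3 solar masses

energy_joules = m_radiated * C**2  # E = mc^2
print(f"~{m_radiated / M_SUN:.0f} solar masses -> {energy_joules:.2e} J")
```

That is roughly 5 × 10⁴⁷ joules, radiated away as ripples in spacetime in a fraction of a second.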

Both Advanced LIGO and Advanced Virgo are second-generation gravitational-wave detectors that have taken over from previous ones. The LIGO facilities, which were conceived, built, and are operated by Caltech and MIT, collected data between 2002 and 2010 without detecting gravitational waves. However, as of September of 2015, Advanced LIGO went online and began conducting its two observing runs – O1 and O2.

Meanwhile, the original Virgo detector conducted observations between 2003 and October of 2011, once again without success. By February of 2017, the integration of the Advanced Virgo detector began, and the instruments went online by the following April. In 2007, Virgo and LIGO also partnered to share and jointly analyze the data recorded by their respective detectors.

In August of 2017, the Virgo detector joined the O2 run, and the first-ever simultaneous detection took place on August 14th, with data being gathered by all three LIGO and Virgo instruments. As LSC spokesperson David Shoemaker – a researcher with the Massachusetts Institute of Technology (MIT) – indicated, this detection is just the first of many anticipated events.

Artist’s impression of two merging black holes, which has been theorized to be a source of gravitational waves. Credit: Bohn, Throwe, Hébert, Henriksson, Bunandar, Taylor, Scheel/SXS

“This is just the beginning of observations with the network enabled by Virgo and LIGO working together,” he said. “With the next observing run planned for fall 2018, we can expect such detections weekly or even more often.”

Not only will this mean that scientists have a better shot of detecting future events, but they will also be able to pinpoint them with far greater accuracy. In fact, the transition from a two- to a three-detector network is expected to increase the likelihood of pinpointing the source of GW170814 by a factor of 20. The sky region for GW170814 is just 60 square degrees – more than 10 times smaller than with data from LIGO’s interferometers alone.

In addition, the accuracy with which the distance to the source is measured has also benefited from this partnership. As Laura Cadonati, a Georgia Tech professor and the deputy spokesperson of the LSC, explained:

“This increased precision will allow the entire astrophysical community to eventually make even more exciting discoveries, including multi-messenger observations. A smaller search area enables follow-up observations with telescopes and satellites for cosmic events that produce gravitational waves and emissions of light, such as the collision of neutron stars.”

Artist’s impression of gravitational waves. Credit: NASA

In the end, bringing more detectors into the gravitational-wave network will also allow for more detailed tests of Einstein’s theory of General Relativity. Caltech’s David H. Reitze, the executive director of the LIGO Laboratory, also praised the new partnership and what it will allow for.

“With this first joint detection by the Advanced LIGO and Virgo detectors, we have taken one step further into the gravitational-wave cosmos,” he said. “Virgo brings a powerful new capability to detect and better locate gravitational-wave sources, one that will undoubtedly lead to exciting and unanticipated results in the future.”

The study of gravitational waves is a testament to the growing capability of the world’s science teams and the science of interferometry. For decades, the existence of gravitational waves was merely a theory; and by the turn of the century, all attempts to detect them had yielded nothing. But in just the past eighteen months, multiple detections have been made, and dozens more are expected in the coming years.

What’s more, thanks to the new global network and the improved instruments and methods, these events are sure to tell us volumes about our Universe and the physics that govern it.

Further Reading: NSF, LIGO-Caltech, LIGO DD

New Study Provides Explanation for Pluto’s Giant Blades of Ice

Pluto’s bladed terrain as seen from New Horizons during its July 2015 flyby. Credits: NASA/JHUAPL/SwRI

When it made its historic flyby of Pluto in July of 2015, the New Horizons spacecraft gave scientists and the general public the first clear picture of what this distant dwarf planet looks like. In addition to providing breathtaking images of Pluto’s “heart”, its frozen plains, and mountain chains, one of the more interesting features it detected was Pluto’s mysterious “bladed terrain”.

According to data obtained by New Horizons, these features are made almost entirely out of methane ice and resemble giant blades. At the time of their discovery, what caused these features remained unknown. But according to new research by members of the New Horizons team, it is possible that these features are the result of a specific kind of erosion that is related to Pluto’s complex climate and geological history.

Ever since the New Horizons probe provided a detailed look at Pluto’s geological features, the existence of these jagged ridges has been a source of mystery. They are located at the highest altitudes on Pluto’s surface, near its equator, and can reach several hundred feet in height. In that respect, they are similar to penitentes, a type of structure found in high-altitude snowfields along Earth’s equator.

Penitentes, on the southern end of the Chajnantor plain in Chile. Credits: Wikimedia Commons/ESO

These structures are formed through sublimation, a process in which rapid changes in temperature cause water to transition directly between vapor and solid (and back again) without passing through a liquid state in between. Over time, this carves snowfields into standing, blade-like ice structures. With this in mind, the research team considered various mechanisms for the formation of these ridges on Pluto.

What they determined was that Pluto’s bladed terrain is the result of atmospheric methane freezing at extreme altitudes on Pluto, which then leads to ice structures similar to the ones found on Earth. The team was led by Jeffrey Moore, a research scientist at NASA’s Ames Research Center and a member of the New Horizons team. As he explained in a NASA press statement:

“When we realized that bladed terrain consists of tall deposits of methane ice, we asked ourselves why it forms all of these ridges, as opposed to just being big blobs of ice on the ground. It turns out that Pluto undergoes climate variation and sometimes, when Pluto is a little warmer, the methane ice begins to basically ‘evaporate’ away.”

But unlike on Earth, the erosion of these features is related to changes that take place over the course of eons. This should come as no surprise, given that Pluto’s orbital period is 248 years (or roughly 90,560 Earth days), meaning it takes this long to complete a single orbit around the Sun. In addition, the eccentric nature of its orbit means that its distance from the Sun varies considerably, from 29.658 AU at perihelion to 49.305 AU at aphelion.
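Those orbital figures are easy to sanity-check with Kepler’s third law, which for a body orbiting the Sun reduces to P² = a³ when the period P is measured in years and the semi-major axis a in astronomical units (a quick consistency check, not a calculation from the study itself):

```python
# Pluto's closest and farthest distances from the Sun, as quoted above.
perihelion_au = 29.658
aphelion_au = 49.305

# The semi-major axis of an ellipse is the average of the two extremes.
a = (perihelion_au + aphelion_au) / 2

# Kepler's third law in solar units: P = a^(3/2), with P in years.
period_years = a ** 1.5
print(f"a = {a:.2f} AU, P = {period_years:.0f} years")
```

The computed 248-year period matches the figure quoted above, so the perihelion and aphelion distances are internally consistent.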

Maps based on New Horizons’ data on the topography (top) and composition (bottom) of Pluto’s surface. Both indicate the section of Pluto where the bladed terrain was observed. Credits: NASA/JHUAPL/SwRI/LPI

When the dwarf planet is farthest from the Sun, methane freezes out of the atmosphere at high altitudes. And as it gets closer to the Sun, these ice features sublimate directly back into atmospheric vapor. As a result of this discovery, we now know that Pluto’s surface and atmosphere are apparently far more dynamic than previously thought. Much in the same way that Earth has a water cycle, Pluto may have a methane cycle.

This discovery could also allow scientists to map out locations of Pluto which were not photographed in high-detail. When the New Horizons mission conducted its flyby, it took high-resolution pictures of only one side of Pluto – designated as the “encounter hemisphere”. However, it was only able to observe the other side at lower resolution, which prevented it from being mapped in detail.

But based on this new study, NASA researchers and their collaborators have concluded that these sharp ridges may be a widespread feature on Pluto’s “far side”. The study also advances our understanding of Pluto’s global geography and topography, both past and present, because it demonstrates a link between atmospheric methane and high-altitude features. As such, researchers can now infer elevations on Pluto by looking for concentrations of methane in its atmosphere.

Not long ago, Pluto was considered one of the least-understood bodies in our Solar System, owing to its immense distance from the Sun. However, thanks to ongoing studies made possible by the data collected by the New Horizons mission, scientists are becoming increasingly familiar with what its surface looks like, not to mention the types of geological and climatological forces that have shaped it over time.

And be sure to enjoy this video that details the discovery of Pluto’s bladed terrain, courtesy of NASA’s Ames Research Center:

Further Reading: NASA