Evidence for Europa’s Geysers Was Hiding in Plain Sight in Spacecraft Data From 1997

Artist’s illustration of Jupiter and Europa (in the foreground) with the Galileo spacecraft after its pass through a plume erupting from Europa’s surface. Credits: NASA/JPL-Caltech/Univ. of Michigan

Jupiter’s moon Europa continues to fascinate and amaze! In 1979, the Voyager missions provided the first indications that an interior ocean might exist beneath its icy surface. Between 1995 and 2003, the Galileo space probe provided the most detailed information on Jupiter’s moons to date. This information bolstered theories about how life could exist in a warm-water ocean sandwiched between the moon’s icy crust and its rocky mantle.

Even though the Galileo mission ended when the probe crashed into Jupiter’s atmosphere, the space probe is still providing vital information on Europa. After analyzing old data from the mission, NASA scientists have found independent evidence that Europa’s interior ocean is venting plumes of water vapor from its surface. This is good news for future missions to Europa, which will attempt to search these plumes for signs of life.

The study which describes their findings, titled “Evidence of a plume on Europa from Galileo magnetic and plasma wave signatures”, recently appeared in the journal Nature Astronomy. The study was led by Xianzhe Jia, a space physicist from the Department of Climate and Space Sciences and Engineering at the University of Michigan, and included members from UCLA and the University of Iowa.

Artist’s concept of the Galileo space probe passing through the Jupiter system. Credit: NASA

The data was collected in 1997 by Galileo during a flyby of Europa that brought it to within 200 km (124 mi) of the moon’s surface. At the time, its magnetometer (MAG) detected a brief, localized bend in Jupiter’s magnetic field, which remained unexplained until now. After running the data through new and advanced computer models, the team was able to create a simulation showing that this bend was caused by an interaction between the magnetic field and one of Europa’s plumes.

This analysis confirmed ultraviolet observations made by NASA’s Hubble Space Telescope in 2012, which suggested the presence of water plumes on the moon’s surface. However, this new analysis used data collected much closer to the source, which indicated how Europa’s plumes interact with the ambient flow of plasma contained within Jupiter’s powerful magnetic field.

In addition to being the lead author of this study, Jia is also a co-investigator for two instruments that will travel aboard the Europa Clipper mission – which may launch as soon as 2022 to explore the moon’s potential habitability. Jia and his colleagues were inspired to reexamine data from the Galileo mission by Melissa McGrath, a scientist at the SETI Institute and a member of the Europa Clipper science team.

During a presentation to her fellow team scientists, McGrath highlighted other Hubble observations of Europa. As Jia explained in a recent NASA press release:

“The data were there, but we needed sophisticated modeling to make sense of the observation. One of the locations she mentioned rang a bell. Galileo actually did a flyby of that location, and it was the closest one we ever had. We realized we had to go back. We needed to see whether there was anything in the data that could tell us whether or not there was a plume.”

Artist’s impression of a water vapor plume on Europa. Credit: NASA/ESA/K. Retherford/SWRI

When they first examined the information 21 years ago, the high-resolution data obtained by the MAG instrument showed something strange. But it was thanks to the lessons provided by the Cassini mission, which explored the plumes on Saturn’s moon Enceladus, that the team knew what to look for. This included material from the plumes which became ionized by the gas giant’s magnetosphere, leaving a characteristic blip in the magnetic field.

After reexamining the data, they found that the same characteristic bend (localized and brief) in the magnetic field was present around Europa. Jia’s team also consulted data from Galileo’s Plasma Wave Spectrometer (PWS) instrument, which measured plasma waves caused by charged particles in Europa’s atmosphere; these data also appeared to support the plume theory.

These magnetometry data and plasma wave signatures were then layered into new 3D modeling developed by the team at the University of Michigan, which simulates the interactions of plasma with Solar System bodies. Finally, they added the data obtained from Hubble in 2012 that suggested the dimensions of the potential plumes. The end result was a simulated plume that matched the magnetic field and plasma signatures they saw in the Galileo data.

As Robert Pappalardo, a Europa Clipper project scientist at NASA’s Jet Propulsion Laboratory (JPL), indicated:

“There now seem to be too many lines of evidence to dismiss plumes at Europa. This result makes the plumes seem to be much more real and, for me, is a tipping point. These are no longer uncertain blips on a faraway image.” 

Artist’s concept of the Europa Clipper mission, which will study Europa in 2022-2025 to search for signs of life. Credit: NASA/JPL

The findings are certainly good news for the Europa Clipper mission, which is expected to make the journey to Jupiter between 2022 and 2025. When this probe arrives in the Jovian system, it will establish an orbit around Jupiter and conduct rapid, low-altitude flybys of Europa. Assuming that plume activity does take place on the surface of the moon, the Europa Clipper will sample the frozen liquid and dust particles for signs of life.

“If plumes exist, and we can directly sample what’s coming from the interior of Europa, then we can more easily get at whether Europa has the ingredients for life,” Pappalardo said. “That’s what the mission is after. That’s the big picture.”

At present, the mission team is busy looking at potential orbital paths for the Europa Clipper mission. With this new research in hand, the team will choose a path that takes the space probe above the plume locations, putting it in an ideal position to search them for signs of life. If all goes as planned, the Europa Clipper could be the first of several probes to finally prove that there is life beyond Earth.

And be sure to check out this video of the Europa Clipper mission, courtesy of NASA:

Further Reading: NASA, Nature

For the First Time, Astronomers Have Found a Star That Survived its Companion Exploding as Supernova

Hubble image of the supernova SN 2001ig, which indicated the presence of a companion. Credits: NASA, ESA, S. Ryder (Australian Astronomical Observatory), and O. Fox (STScI)

A Type II supernova is a truly amazing astronomical event. As with all supernovae, a Type II consists of a star experiencing core collapse at the end of its life cycle and exploding, causing it to shed its outer layers. A subclass of this type, known as Type IIb, consists of stars that have been stripped of most of their outer hydrogen envelope before core collapse triggers the explosion.

Seventeen years ago, astronomers were fortunate enough to witness a Type IIb supernova in the galaxy NGC 7424, located 40 million light-years away in the southern constellation Grus. Now that this supernova has faded, the Hubble Space Telescope recently captured the first image of a surviving companion, thus demonstrating that supernovae do indeed happen in double-star systems.

The study, titled “Ultraviolet Detection of the Binary Companion to the Type IIb SN 2001ig”, was recently published in the Astrophysical Journal. The study was led by Stuart Ryder of the Australian Astronomical Observatory and included members from the California Institute of Technology (Caltech), the Space Telescope Science Institute (STScI), the University of Amsterdam, the University of Arizona, the University of York, and the University of California.

This discovery is the most compelling evidence to date that some supernovae originate from mass being siphoned between binary companions. As Stuart Ryder indicated in a recent NASA press release:

“We know that the majority of massive stars are in binary pairs. Many of these binary pairs will interact and transfer gas from one star to the other when their orbits bring them close together.”

The supernova, called SN 2001ig, was pinpointed by astronomers in 2002 using the European Southern Observatory’s Very Large Telescope (VLT). In 2004, these observations were followed up with the Gemini South Observatory, which first hinted at the presence of a surviving binary companion. Knowing the exact coordinates, Ryder and his team were able to focus Hubble on that location as the supernova’s glow faded.

The find was especially fortuitous because it might also shed light on an astronomical mystery: how stripped-envelope supernovae lose their outer envelopes. Originally, scientists believed they were the result of stars with very fast winds that pushed off their outer envelopes. However, when astronomers began looking for the primary stars that spawned these supernovae, they could not find them.

Artist’s impression of a pulsar siphoning material from a companion star. Credit: NASA

As Ori Fox, a member of the Space Telescope Science Institute and a co-author on the paper, explained:

“That was especially bizarre, because astronomers expected that they would be the most massive and the brightest progenitor stars. Also, the sheer number of stripped-envelope supernovas is greater than predicted.”

This led scientists to theorize that many of the stripped-envelope stars were the primaries in lower-mass binary star systems. All that remained was to find a supernova that was part of a binary system, which Ryder and his colleagues set out to do. This was no easy task, seeing as how the companion was rather faint and at the very limits of what Hubble could see.

In addition, not many supernovae are known to go off within this distance range. Last, but not least, they had to know the exact position through very precise measurements. Thanks to Hubble’s exquisite resolution and ultraviolet capability, they were able to find and photograph the surviving companion.

Prior to the supernova, the stars orbited each other with a period of about one year. When the primary star exploded, the blast buffeted the companion, but the companion remained intact. Because of this, the companion of SN 2001ig is the first surviving companion star ever to be photographed.

Artist’s rendering of SN 1993J, where a red supergiant supernova progenitor star (left) is exploding after having transferred about 10 solar masses of hydrogen gas to the blue companion star (right). Credit: ESA

Looking ahead, Ryder and his team hope to precisely determine how many supernovae with stripped envelopes have companions. At present, it is estimated that at least half of them do, while the other half lose their outer envelopes due to stellar winds. Their next goal is to examine completely stripped-envelope supernovae, as opposed to SN 2001ig and SN 1993J, which were only about 90% stripped.

Luckily, they won’t have to wait as long to examine these completely stripped-envelope supernovae, since they don’t have as much shock interaction with gas in their surrounding environment. In short, since they lost their outer envelopes long before they exploded, they fade much faster. This means that the team will only have to wait two to three years before looking for the surviving companions.

Their efforts are also likely to be helped by the deployment of the James Webb Space Telescope (JWST), which is scheduled to launch in 2020. Depending on what they find, astronomers may be ready to resolve the mystery of what causes the different types of supernovae, which could also reveal more about the life cycles of stars and the birth of black holes.

Further Reading: NASA, The Astrophysical Journal

How Many Planets is TESS Going to Find?

Artist Illustration of TESS and its 4 telescopes. Credit: NASA/MIT

The Transiting Exoplanet Survey Satellite (TESS), NASA’s latest exoplanet-hunting space telescope, was launched into space on Wednesday, April 18th, 2018. As the name suggests, this telescope will use the Transit Method to detect terrestrial-mass planets (i.e. rocky) orbiting distant stars. Alongside other next-generation telescopes like the James Webb Space Telescope (JWST), TESS will effectively pick up where telescopes like Hubble and Kepler left off.

But just how many planets is TESS expected to find? That was the subject of a new study by a team of researchers who attempted to estimate how many planets TESS is likely to discover, as well as the physical properties of these planets and of the stars they orbit. Altogether, they estimate TESS will find thousands of planets orbiting a variety of stars during its two-year primary mission.

The study, titled “A Revised Exoplanet Yield from the Transiting Exoplanet Survey Satellite (TESS)”, recently appeared online. The study was led by Thomas Barclay, an associate research scientist at the NASA Goddard Space Flight Center and the University of Maryland, and included Joshua Pepper (an astrophysicist at Lehigh University) and Elisa Quintana (a research scientist with the SETI Institute and NASA Ames Research Center).

As Thomas Barclay told Universe Today via email:

“TESS builds off the legacy of Kepler. Kepler was primarily a statistical mission and taught us that planets are everywhere. However, it wasn’t optimized for finding excellent individual planets for further study. Now that we know planets are common, we can launch something like TESS to search for the planets that we will undertake intensive studies of using ground and space-based telescopes. Planets that TESS will find will on average be 10x closer and 100x brighter.”

For the sake of their study, the team created a three-step model that took into account the stars TESS will observe, the number of planets each one is likely to have, and the likelihood of TESS spotting them. These included the kinds of planets that would be orbiting dwarf stars ranging from A-type to K-type (a range that includes Sun-like G-type stars), as well as lower-mass M-type (red dwarf) stars.

“To estimate how many planets TESS will find we took stars that will be observed by TESS and simulated a population of planets orbiting them,” said Barclay. “The exoplanet population stats all come from studies that used Kepler data. Then, using models of TESS performance, we estimated how many of those planets would be detected by TESS. This is where we get our yield numbers from.”

The first step was straightforward, thanks to the availability of the Candidate Target List (CTL) – a prioritized list of target stars that the TESS Target Selection Working Group determined were the most suitable stars for detecting small planets. They then ranked the 3.8 million stars that are included in the latest version based on their brightness and radius and determined which of these TESS is likely to observe.

Liftoff of the SpaceX Falcon 9 rocket carrying NASA’s TESS spacecraft. Image credit: NASA TV

The second step consisted of assigning planets to each star based on a Poisson distribution, a statistical technique where a given number is assigned to each star (in this case, 0 or more). Each planet was then assigned six physical properties drawn at random, including an orbital period, a radius, an eccentricity, a periastron angle, an inclination to our line of sight, and a mid-time of first transit.
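The planet-assignment step described above can be sketched in a few lines of Python. The mean occurrence rate and property ranges below are illustrative placeholders, not the Kepler-derived distributions the study actually drew from:

```python
import math
import random

random.seed(42)

def poisson(lam):
    """Draw from a Poisson distribution (Knuth's method) --
    the number of planets assigned to one star."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def simulate_star(mean_planets=0.7):
    """Assign a star 0 or more planets, each with randomly drawn properties.
    The uniform/log-uniform ranges here are placeholders; the actual study
    drew these from Kepler-based occurrence statistics."""
    planets = []
    for _ in range(poisson(mean_planets)):
        planets.append({
            "period_days": 10 ** random.uniform(0, 2),      # ~1-100 days
            "radius_earth": 10 ** random.uniform(-0.3, 1),  # ~0.5-10 Earth radii
            "eccentricity": random.uniform(0, 0.3),
            "inclination_deg": random.uniform(0, 90),
        })
    return planets

stars = [simulate_star() for _ in range(100_000)]
print(sum(len(p) for p in stars))  # total simulated planets (~70,000 expected)
```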

Last, they attempted to estimate how many of these planets would generate a detectable transit signal. As noted, TESS will rely on the Transit Method, where periodic dips in a star’s brightness are used to determine the presence of one or more orbiting planets, as well as place constraints on their sizes and orbital periods. For this, they considered the flux contamination of nearby stars, the number of transits, and the transit duration.
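The detectability of such a signal is governed mainly by the transit depth: the fractional dimming of the star, which scales as the square of the planet-to-star radius ratio. A minimal sketch (the values are illustrative, not from the study):

```python
# Transit depth: the fractional dimming is roughly (R_planet / R_star)^2.

R_SUN_KM = 696_000
R_EARTH_KM = 6_371

def transit_depth(r_planet_km, r_star_km):
    """Fractional drop in stellar flux during a central transit."""
    return (r_planet_km / r_star_km) ** 2

# An Earth-size planet crossing a Sun-like star dims it by only ~0.008%...
depth_earth = transit_depth(R_EARTH_KM, R_SUN_KM)
# ...while the same planet crossing an M dwarf half the Sun's radius produces
# a signal four times deeper -- one reason red dwarfs are attractive targets.
depth_m_dwarf = transit_depth(R_EARTH_KM, 0.5 * R_SUN_KM)

print(f"{depth_earth:.2e} {depth_m_dwarf:.2e}")
```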

Ultimately, they determined with 90% confidence that TESS is likely to detect 4430–4660 new exoplanets during its two-year mission:

“The result is that we predict TESS will find more than 4,000 planets, with hundreds smaller than twice the size of Earth. The primary goal of TESS is to find planets that are bright enough for ground-based telescopes to measure their mass. We estimate that TESS could lead to triple the number of planets smaller than 4 Earth-radii with mass measurements.”

As of April 1st, 2018, a total of 3,758 exoplanets have been confirmed in 2,808 systems, with 627 of those systems having more than one planet. In other words, Barclay and his team estimate that the TESS mission will effectively double the number of confirmed exoplanets and triple the number of Earth-sized planets and super-Earths with measured masses during its primary mission.

This will begin after a series of orbital maneuvers and engineering tests, which are expected to last about two months. With the exoplanet catalog thus expanded, we can expect that there will be many more “Earth-like” candidates available for study. And while we still will not be able to determine whether any of them host life, we may find some that show signs of a viable atmosphere and water on their surfaces.

The hunt for life beyond Earth will continue for many years to come! And in the meantime, be sure to enjoy this video about the TESS mission, courtesy of NASA:

Further Reading: Astrobites, arXiv

Facial Recognition Deep Learning Software is Surprisingly Good at Identifying Galaxies Too

Evolution diagram of a galaxy. First the galaxy is dominated by the disk component (left) but active star formation occurs in the huge dust and gas cloud at the center of the galaxy (center). Then the galaxy is dominated by the stellar bulge and becomes an elliptical or lenticular galaxy. Credit: NAOJ

A lot of attention has been dedicated to the machine learning technique known as “deep learning”, where computers are capable of discerning patterns in data without being specifically programmed to do so. In recent years, this technique has been applied to a number of applications, which include voice and facial recognition for social media platforms like Facebook.

However, astronomers are also benefiting from deep learning, which is helping them to analyze images of galaxies and understand how they form and evolve. In a new study, a team of international researchers used a deep learning algorithm to analyze images of galaxies from the Hubble Space Telescope. This method proved effective at classifying these galaxies based on what stage they were in their evolution.

The study, titled “Deep Learning Identifies High-z Galaxies in a Central Blue Nugget Phase in a Characteristic Mass Range”, recently appeared online and has been accepted for publication in the Astrophysical Journal. The study was led by Marc Huertas-Company of the University Paris Diderot and included members from the University of California Santa Cruz (UCSC), the Hebrew University, the Space Telescope Science Institute, the University of Pennsylvania, MINES ParisTech, and Shanghai Normal University.

A ‘deep learning’ algorithm trained on images from cosmological simulations is surprisingly successful at classifying real galaxies in Hubble images. Credit: HST/CANDELS

In the past, Marc Huertas-Company has already applied deep learning methods to Hubble data for the sake of galaxy classification. In collaboration with David Koo and Joel Primack, both of whom are professors emeriti at UC Santa Cruz (and with support from Google), Huertas-Company and the team spent the past two summers developing a neural network that could identify galaxies at different stages in their evolution.

“This project was just one of several ideas we had,” said Koo in a recent UCSC press release. “We wanted to pick a process that theorists can define clearly based on the simulations, and that has something to do with how a galaxy looks, then have the deep learning algorithm look for it in the observations. We’re just beginning to explore this new way of doing research. It’s a new way of melding theory and observations.”

For the sake of their study, the researchers used computer simulations to generate mock images of galaxies as they would look in observations by the Hubble Space Telescope. The mock images were used to train the deep learning neural network to recognize three key phases of galaxy evolution that had been previously identified in the simulations. The researchers then used the network to analyze a large set of actual Hubble images.

As with previous images analyzed by Huertas-Company, these images were part of Hubble’s Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey (CANDELS) project – the largest project in the history of the Hubble Space Telescope. What they found was that the neural network’s classifications of simulated and real galaxies were remarkably consistent. As Joel Primack explained:

“We were not expecting it to be all that successful. I’m amazed at how powerful this is. We know the simulations have limitations, so we don’t want to make too strong a claim. But we don’t think this is just a lucky fluke.”

A spiral galaxy ablaze in the blue light of young stars from ongoing star formation (left) and an elliptical galaxy bathed in the red light of old stars (right). Credit: SDSS

The research team was especially interested in galaxies that have a small, dense, star-forming region known as a “blue nugget”. These regions occur early in the evolution of gas-rich galaxies, when big flows of gas into the center of a galaxy cause the formation of young stars that emit blue light. To simulate these and other types of galaxies, the team relied on state-of-the-art VELA simulations developed by Primack and an international team of collaborators.

In both the simulated and observational data, the computer program found that the “blue nugget” phase occurs only in galaxies with masses within a certain range. This is followed by star formation ending in the central region, leading to the compact “red nugget” phase, in which the central region reddens as its stars age.

The consistency of the mass range was exciting because it indicated that the neural network was identifying a pattern that results from a key physical process in real galaxies – and without having to be specifically told to do so. As Koo indicated, this study is a big step forward for astronomy and AI, but a lot of research still needs to be done:

“The VELA simulations have had a lot of success in terms of helping us understand the CANDELS observations. Nobody has perfect simulations, though. As we continue this work, we will keep developing better simulations.”

Artist’s representation of an active galactic nucleus (AGN) at the center of a galaxy. Credit: NASA/CXC/M.Weiss

For instance, the team’s simulations did not include the role played by Active Galactic Nuclei (AGN). In larger galaxies, gas and dust are accreted onto a central supermassive black hole (SMBH) at the core, which causes gas and radiation to be ejected in huge jets. Some recent studies have indicated that this may have an arresting effect on star formation in galaxies.

However, observations of distant, younger galaxies have shown evidence of the phenomenon observed in the team’s simulations, where gas-rich cores lead to the blue nugget phase. According to Koo, using deep learning to study galactic evolution has the potential to reveal previously undetected aspects of observational data. Instead of observing galaxies as snapshots in time, astronomers will be able to simulate how they evolve over billions of years.

“Deep learning looks for patterns, and the machine can see patterns that are so complex that we humans don’t see them,” he said. “We want to do a lot more testing of this approach, but in this proof-of-concept study, the machine seemed to successfully find in the data the different stages of galaxy evolution identified in the simulations.”

In the future, astronomers will have more observation data to analyze thanks to the deployment of next-generation telescopes like the Large Synoptic Survey Telescope (LSST), the James Webb Space Telescope (JWST), and the Wide-Field Infrared Survey Telescope (WFIRST). These telescopes will provide even more massive datasets, which can then be analyzed by machine learning methods to determine what patterns exist.

Astronomy and artificial intelligence, working together to better our understanding of the Universe. I wonder if we should put it on the task of finding a Theory of Everything (ToE) too!

Further Reading: UCSC, Astrophysical Journal

The Most Distant Star Ever Seen, Only 4.4 Billion Years After the Big Bang

Composite image showing the discovery of the most distant known star using the NASA/ESA Hubble Space Telescope. Credit: NASA & ESA and P. Kelly (University of California, Berkeley)

In 1990, the Hubble Space Telescope was placed into Low Earth Orbit. Since then, Hubble has gone on to become the most well-known space observatory and has revealed some never-before-seen things about our Universe. Despite the subsequent deployment of several flagship telescopes – like the Kepler Space Telescope, the Chandra X-ray Observatory, and the Spitzer Space Telescope – Hubble is still accomplishing some amazing feats.

For instance, a team of astronomers recently used Hubble to locate the most distant star ever discovered. This hot blue star, which was located in a galaxy cluster, existed just 4.4 billion years after the Big Bang. The discovery of this star is expected to provide new insights into the formation and evolution of stars and galaxy clusters during the early Universe, as well as the nature of dark matter itself.

The discovery was made by an international team of scientists led by Patrick Kelly (of the University of Minnesota), Jose Diego (of the Instituto de Física de Cantabria in Spain) and Steven Rodney (of the University of South Carolina). Together, they observed the distant star in the galaxy cluster MACS J1149-2223 in April 2016 while studying the supernova explosion known as Refsdal (featured in Hubble release heic1525).

Using a technique known as gravitational microlensing, the team relied on the total mass of the galaxy cluster itself to magnify the light coming from the supernova. However, while looking for this supernova, the team found an unexpected point source of light in the same galaxy. As Patrick Kelly explained in a recent Hubble press release:

“Like the Refsdal supernova explosion the light of this distant star got magnified, making it visible for Hubble. This star is at least 100 times farther away than the next individual star we can study, except for supernova explosions.”

The light observed from this star – named Lensed Star 1 (LS1) – was emitted just 4.4 billion years after the Big Bang (when the Universe was just 30% of its current age). The light was only detectable thanks to the microlensing effect caused by the mass of the galaxy cluster and by a compact object of about three times the mass of our Sun within the cluster. Together, these allowed the light coming from the star to be magnified by a factor of 2,000.
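For a sense of scale, a flux magnification μ translates into a brightening of Δm = 2.5 log₁₀(μ) magnitudes, so a factor-2,000 magnification lifts an otherwise undetectable star by more than 8 magnitudes. A quick sketch:

```python
import math

def magnification_to_delta_mag(mu):
    """Convert a lensing flux magnification into a brightening in magnitudes."""
    return 2.5 * math.log10(mu)

# The factor-2,000 magnification reported for LS1 corresponds to a brightening
# of about 8.3 magnitudes -- the difference between an object far beyond
# Hubble's reach and one it can just detect.
print(f"{magnification_to_delta_mag(2000):.1f} mag")
```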

Interestingly enough, the team also realized that this was not the first time this star had been observed. During a previous observation of the galaxy cluster, made in October 2016, the star was also acquired in an image – but went unnoticed at the time. As Diego noted:

“We were actually surprised to not have seen this second image in earlier observations, as also the galaxy the star is located in can be seen twice. We assume that the light from the second image has been deflected by another moving massive object for a long time — basically hiding the image from us. And only when the massive object moved out of the line of sight the second image of the star became visible.”

After finding the star in their survey, the team used Hubble again to obtain spectra from LS1 and determined that it is a B-type supergiant – an extremely bright and blue class of star that has several times the mass of our Sun and is more than twice as hot. Given the star’s age and distance, the discovery of LS1 is a remarkable find on its own. At the same time, it will allow astronomers to gain new insights into the galaxy cluster itself.

As Steven Rodney indicated, “We know that the microlensing was caused by either a star, a neutron star, or a stellar-mass black hole.” As such, the discovery of LS1 will allow astronomers to study these objects (the latter of which are invisible) and estimate how many of them exist within this galaxy cluster.

Learning more about the constituents of galaxy clusters – the largest and most massive structures in the Universe – will also provide important clues about the composition of the Universe overall and how it evolved over time. This includes the important role played by dark matter in the evolution of the Universe. As Kelly explained:

“If dark matter is at least partially made up of comparatively low-mass black holes, as it was recently proposed, we should be able to see this in the light curve of LS1. Our observations do not favour the possibility that a high fraction of dark matter is made of these primordial black holes with about 30 times the mass of the Sun.”

With the deployment of next-generation telescopes – like the James Webb Space Telescope – astronomers hope to learn even more about the earliest stars in the Universe. In so doing, they will be able to learn more about how it evolved over the past 10 billion years or so, and gain vital clues as to how dark matter played a role. In the meantime, Hubble still plays an all-important role in expanding our understanding of the cosmos.

And be sure to enjoy this episode of Hubblecast that explains this impressive find, courtesy of the ESA:

Further Reading: Hubble Space Telescope

Hubble Finds a Galaxy with Almost no Dark Matter

The galaxy known as NGC 1052-DF2, an ultra-diffuse galaxy that appears to have little or no dark matter. Credit: NASA, ESA, and P. van Dokkum (Yale University)

Since the 1960s, astrophysicists have postulated that in addition to all the matter that we can see, the Universe is also filled with a mysterious, invisible mass. Known as “Dark Matter”, its existence was proposed to explain the “missing mass” of the Universe, and it is now considered a fundamental part of it. Not only is it theorized to make up about 80% of the Universe’s mass, it is also believed to have played a vital role in the formation and evolution of galaxies.

However, a recent finding may throw this entire cosmological perspective sideways. Based on observations made using the NASA/ESA Hubble Space Telescope and other observatories around the world, astronomers have found a nearby galaxy (NGC 1052-DF2) that does not appear to have any dark matter. This object is unique among galaxies studied so far, and could force a reevaluation of our predominant cosmological models.

The study which details their findings, titled “A galaxy lacking dark matter”, recently appeared in the journal Nature. Led by Pieter van Dokkum of Yale University, the study also included members from the Max Planck Institute for Astronomy, San Jose State University, the University of California Observatories, the University of Toronto, and the Harvard-Smithsonian Center for Astrophysics.

Image of the ultra-diffuse galaxy NGC 1052-DF2, created from images forming part of the Digitized Sky Survey 2. Credit: ESA/Hubble, NASA, Digitized Sky Survey 2. Acknowledgement: Davide de Martin

For the sake of their study, the team consulted data from the Dragonfly Telephoto Array (DFA), which was used to identify NGC 1052-DF2. Based on data from Hubble, the team was able to determine its distance – 65 million light-years from the Solar System – as well as its size and brightness. In addition, the team discovered that NGC 1052-DF2 is larger than the Milky Way but contains about 250 times fewer stars, which makes it an ultra-diffuse galaxy.

As van Dokkum explained, NGC 1052-DF2 is so diffuse that it’s essentially transparent. “I spent an hour just staring at this image,” he said. “This thing is astonishing: a gigantic blob so sparse that you see the galaxies behind it. It is literally a see-through galaxy.”

Using data from the Sloan Digital Sky Survey (SDSS), the Gemini Observatory, and the Keck Observatory, the team studied the galaxy in more detail. By measuring the dynamical properties of ten globular clusters orbiting the galaxy, the team was able to infer an independent value of the galaxy’s mass – which is comparable to the mass of the stars in the galaxy.
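The mass inference described above rests on a standard dynamical argument: tracers (here, globular clusters) orbiting at a characteristic radius R with line-of-sight velocity dispersion σ imply an enclosed mass of order σ²R/G. A minimal sketch in Python, with illustrative stand-in values for σ and R rather than the study's actual measurements:

```python
# Rough dynamical-mass estimate from tracer kinematics: M ~ sigma^2 * R / G.
# The sigma and radius values below are illustrative placeholders, NOT the
# measured values from the NGC 1052-DF2 study.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
PC = 3.086e16          # one parsec, m

sigma = 8.0e3          # assumed velocity dispersion of the clusters, m/s
radius = 7.6e3 * PC    # assumed characteristic orbital radius (7.6 kpc), m

m_dyn = sigma ** 2 * radius / G   # enclosed dynamical mass, kg
print(f"Dynamical mass ~ {m_dyn / M_SUN:.1e} solar masses")
```

With these placeholder numbers the estimate comes out around 10⁸ solar masses, i.e. comparable to the stellar mass of a sparse dwarf galaxy, which is the sense in which the team could say the total mass leaves little room for dark matter.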

This led the team to conclude that either NGC 1052-DF2 contains at least 400 times less dark matter than is predicted for a galaxy of its mass, or none at all. Such a finding is unprecedented in the history of modern astronomy and defies all predictions. As Allison Merritt – an astronomer from Yale University, the Max Planck Institute for Astronomy and a co-author on the paper – explained:

“Dark matter is conventionally believed to be an integral part of all galaxies — the glue that holds them together and the underlying scaffolding upon which they are built… There is no theory that predicts these types of galaxies — how you actually go about forming one of these things is completely unknown.”

“This invisible, mysterious substance is by far the most dominant aspect of any galaxy. Finding a galaxy without any is completely unexpected; it challenges standard ideas of how galaxies work,” added van Dokkum.

However, it is important to note that the discovery of a galaxy without dark matter does not disprove the theory that dark matter exists. In truth, it merely demonstrates that dark matter and galaxies are capable of being separate, which could mean that dark matter is bound to ordinary matter through no force other than gravity. As such, it could actually help scientists refine their theories of dark matter and its role in galaxy formation and evolution.

In the meantime, the researchers already have some ideas as to why dark matter is missing from NGC 1052-DF2. On the one hand, it could have been the result of a cataclysmic event, where the birth of a multitude of massive stars swept out all the gas and dark matter. On the other hand, the growth of the nearby massive elliptical galaxy (NGC 1052) billions of years ago could have played a role in this deficiency.

However, these theories do not explain how the galaxy formed. To address this, the team is analyzing images that Hubble took of 23 other ultra-diffuse galaxies in search of more dark-matter-deficient galaxies. Already, they have found three that appear to be similar to NGC 1052-DF2, which may indicate that dark-matter-deficient galaxies are a relatively common occurrence.

If these latest findings demonstrate anything, it is that the Universe is like an onion. Just when you think you have it figured out, you peel back an additional layer and find a whole new set of mysteries. They also demonstrate that after 28 years of faithful service, the Hubble Space Telescope is still capable of teaching us new things. Good thing too, seeing as the launch of its successor has been delayed until 2020!

Further Reading: Hubble Space Telescope

Try to Contain Your Surprise. James Webb is Getting Delayed to 2020

Illustration of NASA's James Webb Space Telescope. Credits: NASA

Once it deploys, the James Webb Space Telescope (JWST) will be the most powerful and technically complex space telescope ever built. Using its powerful suite of infrared-optimized instruments, this telescope will be able to study the earliest stars and galaxies in the Universe, extra-solar planets around nearby stars, and the planets, moons and asteroids of our Solar System.

Unfortunately, due to its complexity and the need for more testing, the launch of the JWST has been subject to multiple delays. And as of this morning, NASA announced that the launch of the JWST has been delayed yet again. According to a statement issued by the agency, the launch window for the JWST is now targeted for sometime around May 2020.

The decision came after an independent assessment by the project’s Standing Review Board (SRB) of the remaining tasks, all of which are part of the final stage of integration and testing before the JWST launches. These tasks consist of integrating the combined optics and science instruments onto the spacecraft element, then testing them to ensure that they will deploy properly and work once they are in space.

The Space Telescope Transporter for Air, Road and Sea (STTARS) is a custom-designed container that holds the James Webb’s Optical Telescope and Integrated Science (OTIS) instrument module. In this image it is being unloaded from a U.S. military C-5 Charlie aircraft at Los Angeles International Airport (LAX) on Feb. 2, 2018. Image: NASA/Chris Gunn

This assessment came on the heels of a report issued by the Government Accountability Office (GAO) in February that expressed concerns over further delays and cost overruns. These concerns were based on the fact that it is typically in the final phase when problems are found and schedules revised, and that only 1.5 months of schedule reserve remained (at the time) until the end of the telescope’s launch window – which was scheduled for 2019.

But as acting NASA Administrator Robert Lightfoot stressed, the JWST is still a go:

“Webb is the highest priority project for the agency’s Science Mission Directorate, and the largest international space science project in U.S. history. All the observatory’s flight hardware is now complete, however, the issues brought to light with the spacecraft element are prompting us to take the necessary steps to refocus our efforts on the completion of this ambitious and complex observatory.”

NASA also announced that it is establishing an external Independent Review Board (IRB) chaired by Thomas Young – a highly-respected NASA and industry veteran who has a long history of chairing advisory committees and analyzing organizational and technical issues. The IRB findings, along with the SRB data, will be considered by NASA to set a more specific launch date, and will be presented to Congress this summer.

In the meantime, NASA and the European Space Agency (ESA) will be setting a new launch readiness date for the Ariane 5 rocket that will bring the JWST into space. Once a launch date is set, NASA will also be providing a cost estimate that may exceed the $8 billion budget cap established by Congress in 2011. This too is in keeping with the GAO’s report, which predicted cost overruns.

The Space Telescope Transporter for Air, Road and Sea (STTARS) being opened at Northrop Grumman on March 8th, 2018, to reveal the combined optics and science instruments of NASA’s James Webb Space Telescope. Credits: NASA/Chris Gunn

For those who have been following the JWST’s development, this news should come as no surprise. Due to its complexity and the need for extensive testing, the launch of the JWST has been delayed several times in recent years. In addition, the final phase consists of some of the most challenging work, where the 6.5-meter telescope and science payload element are being joined with the spacecraft element to complete the observatory.

In addition, the science team also needs to ensure that the observatory can be folded up to fit inside the Ariane 5 rocket that will launch it into space. They also need to ensure that it will unfold again once it reaches space, deploying its sunshield and mirrors. Beyond that, there are also the technical challenges of building a complex observatory that was created here on Earth, but designed to operate in space.

Not only does all of this represent a very technically-challenging feat, it is the first time that any space telescope has had to perform it. Already, the JWST has completed an extensive range of tests to ensure that it will reach its orbit roughly 1.6 million km (1 million mi) from Earth. And while delays can be discouraging, they also increase the likelihood of mission success.

As Thomas Zurbuchen, the associate administrator for NASA’s Science Mission Directorate, stated:

“Considering the investment NASA and our international partners have made, we want to proceed systematically through these last tests, with the additional time necessary, to be ready for a May 2020 launch.”

The combined optics and science instruments of NASA’s James Webb Space Telescope being removed from the Space Telescope Transporter for Air, Road and Sea (STTARS) at the Northrop Grumman company headquarters on March 8th, 2018. Credits: NASA/Chris Gunn

The next step in testing will take several months, and will consist of the spacecraft element undergoing tests to simulate the vibrational, acoustic and thermal environments it will experience during its launch and operations. Once complete, the project engineers will integrate and test the fully assembled observatory and verify that all its components work together properly.

And then (fingers crossed!) this ambitious telescope will finally be ready to take to space and start collecting light. In so doing, scientists from all around the world hope to shed new light on some of the most fundamental questions of science – namely, how did the Universe evolve, is there life in our Solar System beyond Earth, are there habitable worlds beyond our Solar System, and are there other civilizations out there?

Bottom line, NASA remains committed to deploying the James Webb Space Telescope. So even if the answers to these questions are delayed a little, they are still coming!

Further Reading: NASA

James Webb is Enduring its Final Stage of Testing Before it Ships off for Kourou, French Guiana

The combined optics and science instruments of NASA’s James Webb Space Telescope being removed from the Space Telescope Transporter for Air, Road and Sea (STTARS) at the Northrop Grumman company headquarters on March 8th, 2018. Credits: NASA/Chris Gunn

Once deployed, the James Webb Space Telescope (JWST) will be the most powerful telescope ever built. As the spiritual and scientific successor to the Hubble, Spitzer, and Kepler space telescopes, this space observatory will use its advanced suite of infrared instruments to look back at the earliest stars and galaxies, study the Solar System in depth, and help characterize extra-solar planets (among other things).

Unfortunately, the launch of the JWST has been subject to multiple delays, with the launch date now set for some time in 2019. Luckily, on Thursday, March 8th, engineers at the Northrop Grumman company headquarters began the final step in the observatory’s integration and testing. Once complete, the JWST will be ready to ship to French Guiana, where it will be launched into space.

This final phase consisted of removing the combined optics and science instruments from their shipping container – known as the Space Telescope Transporter for Air, Road and Sea (STTARS) – which recently arrived after being tested at NASA’s Johnson Space Center in Houston. This constitutes half the observatory, and includes the telescope’s 6.5 meter (21.3 foot) golden primary mirror.

The Space Telescope Transporter for Air, Road and Sea (STTARS) being opened at Northrop Grumman on March 8th, 2018, to reveal the combined optics and science instruments of NASA’s James Webb Space Telescope. Credits: NASA/Chris Gunn

The science payload was also tested at NASA’s Goddard Space Flight Center last year to ensure it could handle the vibrations associated with space launches and the temperatures and vacuum conditions of space. The other half of the observatory consists of the integrated spacecraft and sunshield, which is in the final phase of assembly at the Northrop Grumman company headquarters.

These will soon undergo a launch environment test to prove that they are ready to be combined with the science payload. Once both halves are finished being integrated, additional testing will be performed to guarantee the fully assembled observatory can operate at the L2 Earth-Sun Lagrange Point. As Eric Smith, the program director for the JWST at NASA Headquarters, said in a recent NASA press statement:

“Extensive and rigorous testing prior to launch has proven effective in ensuring that NASA’s missions achieve their goals in space. Webb is far along into its testing phase and has seen great success with the telescope and science instruments, which will deliver the spectacular results we anticipate.”

These final tests are crucial to ensuring that the observatory deploys properly and can operate once it is in space. This is largely because of the telescope’s complicated design, which needs to be folded in order to fit inside the Ariane 5 rocket that will carry it into space. Once it reaches its destination, the telescope will have to unfold again, deploying its sunshield and mirrors.

The James Webb Space Telescope’s sunshield being deployed inside a cleanroom at Northrop Grumman’s company headquarters, in October 2017. Credits: Northrop Grumman

Not only does all of this represent a very technically-challenging feat, it is the first time that any space telescope has had to perform it. Beyond that, there are also the technical challenges of building a complex observatory that is designed to operate in space. While the JWST’s optics and science instruments were all built at room temperature here on Earth, they had to be designed to operate at cryogenic temperatures.

As such, its mirrors had to be precisely polished and formed so that they would achieve the correct shape once they cool in space. Similarly, its sunshield will be operating in a zero-gravity environment, but was built and tested here on Earth where the gravity is a hefty 9.8 m/s² (1 g). In short, the James Webb Space Telescope is the largest and most complex space telescope ever built, and is one of NASA’s highest priority science projects.

It is little wonder then why NASA has had to put the JWST through such a highly-rigorous testing process. As Smith put it:

“At NASA, we do the seemingly impossible every day, and it’s our job to do the hardest things humankind can think of for space exploration. The way we achieve success is to test, test and retest, so we understand the complex systems and verify they will work.”

The James Webb Space Telescope (which is scheduled to launch in 2019) will be the most powerful telescope ever deployed. Credit: NASA/JPL

Knowing that the JWST is now embarking on the final phase of its development – and that its engineers are confident it will perform up to task – is certainly good news. Especially in light of a recent report from the US Government Accountability Office (GAO), which stated that more delays were likely and that the project would probably exceed its original budget cap of $8 billion.

As the report indicated, it is the final phase of integration and testing where problems are most likely to be found and schedules revised. However, as NASA’s program director stated, “Considering the investment NASA has made, and the good performance to date, we want to proceed very systematically through these tests to be ready for a Spring 2019 launch.”

In other words, there is no indication whatsoever that Congress is considering cancelling the project, regardless of further delays or cost overruns. And when the JWST is deployed, its 6.5-meter (21-foot) infrared-optimized telescope will see to a distance of over 13 billion light-years, allowing astronomers to study the atmospheres of exoplanets, as well as planets and other objects within our Solar System.

So while the JWST may not make its launch window in 2019, we can still expect that it will be taking to space in the near future. And when it does, we can also expect that what it reveals about our Universe will be mind-blowing!

Further Reading: NASA

James Webb Telescope is Probably Going to be Delayed Again, and Could Exceed a Congress Spending Cap

The James Webb Telescope will be the most powerful telecope once it is deployed. However, delays and cost overruns could be a problem. Credit: NASA/Desiree Stover

When the James Webb Space Telescope takes to space, some tremendous scientific discoveries are expected to result. As the spiritual and scientific successor to the Hubble, Spitzer, and Kepler Space Telescopes, this space observatory will use its advanced suite of infrared instruments to look back at the early Universe, study the Solar System, and help characterize extra-solar planets.

Unfortunately, the launch of this mission has been delayed several times now, with the launch date now set for some time in 2019. And based on the amount of work NASA needs to do to complete the JWST before launch, the Government Accountability Office (GAO) believes that more delays are coming and that the project is likely to exceed the $8 billion cost cap set by Congress in 2011.

Part of the problem is that all the remaining schedule reserve – the extra time set aside in the event of delays or unforeseen risks – was recently used to address technical issues. These include the “anomalous readings” detected from the telescope during vibration testing back in December 2016. NASA responded to this by giving the project up to 4 months of schedule reserve by extending the launch window.

The JWST sunshield being unfolded in the clean room at Northrop Grumman Aerospace Systems in Redondo Beach, California. Credits: Northrop Grumman Corp.

However, in 2017, NASA delayed the launch window again by 5 months, from October 2018 to a window between March and June 2019. This delay was requested by the project team, who indicated that they needed to address lessons learned from the initial folding and deployment of the observatory’s sunshield. As Eric Smith, the program director for the James Webb Space Telescope at NASA Headquarters, explained to Congress at the time:

“Webb’s spacecraft and sunshield are larger and more complex than most spacecraft. The combination of some integration activities taking longer than initially planned, such as the installation of more than 100 sunshield membrane release devices, factoring in lessons learned from earlier testing, like longer time spans for vibration testing, has meant the integration and testing process is just taking longer. Considering the investment NASA has made, and the good performance to date, we want to proceed very systematically through these tests to be ready for a Spring 2019 launch.”

Given the remaining integration and test work that lies ahead, more delays are expected. According to the GAO, it is this phase where problems are most likely to be found and schedules revised. Coupled with the fact that only 1.5 months of schedule reserves remain until the end of the launch window, they anticipate that additional launch delays are likely, which will also require budget increases.

Initially, the budget estimates that were set by Congress indicated that the observatory would cost $1.6 billion and would launch by 2011, with an overall cost cap set at $8 billion. However, NASA has revised the budget multiple times since then (in conjunction with the multiple delays) and estimates that the budget for a 2019 launch window would now be $8.8 billion.

The James Webb Space Telescope being placed in the Johnson Space Center’s historic Chamber A on June 20th, 2017. Credit: NASA/JSC

Once deployed, the JWST will be the most powerful space telescope ever built and will serve thousands of astronomers worldwide. As a collaborative project between NASA, the European Space Agency (ESA), and the Canadian Space Agency (CSA), it is also representative of the new era of international cooperation. But by far, the most impressive thing about this mission is the scientific discoveries it is expected to make.

Its 6.5-meter (21-foot) infrared-optimized telescope will see to a distance of over 13 billion light-years, allowing it to study the first stars and galaxies that formed. It will also allow astronomers to study the atmospheres of exoplanets, as well as planets and other objects within our Solar System. As such, any delays and cost overruns in the project are cause for concern.

In the meantime, the project’s Standing Review Board will conduct an independent review in early 2018 to determine if the June 2019 launch window can still be met. With so many experiments and surveys planned for the telescope, it would be no exaggeration to say that a lot is riding on its successful completion and deployment. Best of luck passing review James Webb Space Telescope!

Further Reading: Government Accountability Office

Precise New Measurements From Hubble Confirm the Accelerating Expansion of the Universe. Still no Idea Why it’s Happening

These Hubble Space Telescope images showcase two of the 19 galaxies analyzed in a project to improve the precision of the universe's expansion rate, a value known as the Hubble constant. The color-composite images show NGC 3972 (left) and NGC 1015 (right), located 65 million light-years and 118 million light-years, respectively, from Earth. The yellow circles in each galaxy represent the locations of pulsating stars called Cepheid variables. Credits: NASA, ESA, A. Riess (STScI/JHU)

In the 1920s, Edwin Hubble made the groundbreaking revelation that the Universe was in a state of expansion. Originally predicted as a consequence of Einstein’s Theory of General Relativity, this expansion came to be quantified by what is known as Hubble’s Constant. In the ensuing decades, and thanks to the deployment of next-generation telescopes – like the aptly-named Hubble Space Telescope (HST) – scientists have been forced to revise this value.

In short, in the past few decades, the ability to see farther into space (and deeper into time) has allowed astronomers to make more accurate measurements about how rapidly the early Universe expanded. And thanks to a new survey performed using Hubble, an international team of astronomers has been able to conduct the most precise measurements of the expansion rate of the Universe to date.

This survey was conducted by the Supernova H0 for the Equation of State (SH0ES) team, an international group of astronomers that has been on a quest to refine the accuracy of the Hubble Constant since 2005. The group is led by Adam Riess of the Space Telescope Science Institute (STScI) and Johns Hopkins University, and includes members from the American Museum of Natural History, the Niels Bohr Institute, the National Optical Astronomy Observatory, and many prestigious universities and research institutions.

Illustration of the depth by which Hubble imaged galaxies in prior Deep Field initiatives, in units of the Age of the Universe. Credit: NASA and A. Feild (STScI)

The study which describes their findings recently appeared in The Astrophysical Journal under the title “Type Ia Supernova Distances at Redshift >1.5 from the Hubble Space Telescope Multi-cycle Treasury Programs: The Early Expansion Rate“. For the sake of their study, and consistent with their long term goals, the team sought to construct a new and more accurate “distance ladder”.

This tool is how astronomers have traditionally measured distances in the Universe, which consists of relying on distance markers like Cepheid variables – pulsating stars whose distances can be inferred by comparing their intrinsic brightness with their apparent brightness. These measurements are then compared to the way light from distant galaxies is redshifted to determine how fast the space between galaxies is expanding.

From this, the Hubble Constant is derived. To build their distance ladder, Riess and his team conducted parallax measurements using Hubble’s Wide Field Camera 3 (WFC3) of eight newly-analyzed Cepheid variable stars in the Milky Way. These stars are about 10 times farther away than any studied previously – between 6,000 and 12,000 light-years from Earth – and pulsate at longer intervals.
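Parallax distances like these come from simple geometry: a star with a parallax of p arcseconds lies at 1/p parsecs. A quick sketch (the parallax value below is a round illustrative number, not one of the eight measured Cepheids):

```python
# Distance from parallax: d [parsecs] = 1 / p [arcseconds].
# A Cepheid roughly 8,000 light-years away shows a parallax of about
# 0.4 milliarcseconds -- the value here is illustrative, not measured.
LY_PER_PC = 3.2616     # light-years per parsec

parallax_mas = 0.4                       # assumed parallax, milliarcseconds
d_pc = 1.0 / (parallax_mas / 1000.0)     # distance in parsecs
d_ly = d_pc * LY_PER_PC                  # distance in light-years

print(f"{d_pc:.0f} pc ~ {d_ly:.0f} light-years")
```

A 0.4-milliarcsecond parallax works out to roughly 2,500 parsecs, or about 8,000 light-years, squarely within the 6,000–12,000 light-year range of the stars in the survey; shifts this tiny are why Hubble had to measure each star's position thousands of times.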

To ensure accuracy that would account for the wobbles of these stars, the team also developed a new method where Hubble would measure a star’s position a thousand times a minute every six months for four years. The team then compared the brightness of these eight stars with more distant Cepheids to ensure that they could calculate the distances to other galaxies with more precision.

Illustration showing three steps astronomers used to measure the universe’s expansion rate (Hubble constant) to an unprecedented accuracy, reducing the total uncertainty to 2.3 percent. Credits: NASA/ESA/A. Feild (STScI)/and A. Riess (STScI/JHU)

Using the new technique, Hubble was able to capture the change in position of these stars relative to others, which simplified things immensely. As Riess explained in a NASA press release:

“This method allows for repeated opportunities to measure the extremely tiny displacements due to parallax. You’re measuring the separation between two stars, not just in one place on the camera, but over and over thousands of times, reducing the errors in measurement.”

Compared to previous surveys, the team was able to extend the number of stars analyzed to distances up to 10 times farther. However, their results also contradicted those obtained by the European Space Agency’s (ESA) Planck satellite, which has been measuring the Cosmic Microwave Background (CMB) – the leftover radiation created by the Big Bang – since it was deployed in 2009.

By mapping the CMB, Planck has been able to trace the expansion of the cosmos during the early Universe – circa 378,000 years after the Big Bang. Planck’s result predicted that the Hubble constant value should now be 67 kilometers per second per megaparsec (a megaparsec being about 3.3 million light-years), and could be no higher than 69 kilometers per second per megaparsec.

The Big Bang timeline of the Universe. Cosmic neutrinos affect the CMB at the time it was emitted, and physics takes care of the rest of their evolution until today. Credit: NASA/JPL-Caltech/A. Kashlinsky (GSFC).

Based on their survey, Riess’s team obtained a value of 73 kilometers per second per megaparsec, a discrepancy of 9%. Essentially, their results indicate that galaxies are moving at a faster rate than that implied by observations of the early Universe. Because the Hubble data was so precise, astronomers cannot dismiss the gap between the two results as errors in any single measurement or method. As Riess explained:

“The community is really grappling with understanding the meaning of this discrepancy… Both results have been tested multiple ways, so barring a series of unrelated mistakes, it is increasingly likely that this is not a bug but a feature of the universe.”
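The size of that disagreement is easy to check from the two values quoted above:

```python
# Compare the two Hubble-constant measurements quoted in the article.
h0_planck = 67.0   # km/s/Mpc, inferred from Planck's CMB observations
h0_sh0es = 73.0    # km/s/Mpc, from the Cepheid/supernova distance ladder

discrepancy = (h0_sh0es - h0_planck) / h0_planck * 100
print(f"Discrepancy: {discrepancy:.0f}%")   # ~9%

# At a distance of 100 megaparsecs, the two constants imply recession
# velocities that differ by (73 - 67) * 100 = 600 km/s.
delta_v = (h0_sh0es - h0_planck) * 100
print(f"Velocity difference at 100 Mpc: {delta_v:.0f} km/s")
```

That 600 km/s gap at 100 megaparsecs is far larger than the stated measurement uncertainties, which is why the two results cannot simply be averaged away.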

These latest results therefore suggest that some previously unknown force or some new physics might be at work in the Universe. In terms of explanations, Riess and his team have offered three possibilities, all of which have to do with the 95% of the Universe that we cannot see (i.e. dark matter and dark energy). In 2011, Riess and two other scientists were awarded the Nobel Prize in Physics for their 1998 discovery that the Universe was expanding at an accelerating rate.

Consistent with that, they suggest that Dark Energy could be pushing galaxies apart with increasing strength. Another possibility is that there is an undiscovered subatomic particle out there that is similar to a neutrino, but interacts with normal matter by gravity instead of subatomic forces. These “sterile neutrinos” would travel at close to the speed of light and could collectively be known as “dark radiation”.

This illustration shows the evolution of the Universe, from the Big Bang on the left, to modern times on the right. Credit: NASA

Any of these possibilities would mean that the contents of the early Universe were different, thus forcing a rethink of our cosmological models. At present, Riess and colleagues don’t have any answers, but plan to continue fine-tuning their measurements. So far, the SH0ES team has decreased the uncertainty of the Hubble Constant to 2.3%.

This is in keeping with one of the central goals of the Hubble Space Telescope, which was to help reduce the uncertainty value in Hubble’s Constant, for which estimates once varied by a factor of 2.

So while this discrepancy opens the door to new and challenging questions, it also reduces our uncertainty substantially when it comes to measuring the Universe. Ultimately, this will improve our understanding of how the Universe evolved after it was created in a fiery cataclysm 13.8 billion years ago.

Further Reading: NASA, The Astrophysical Journal