Astrophotographer Captures Musk’s Tesla Roadster Moving Through Space

Astrophotographer Rogelio Bernal Andreo with his gear all set up. His rig is a complex setup, including dual Takahashi telescopes photographing the same part of the sky simultaneously. Image: Rogelio Bernal Andreo (DeepSkyColors.com) (CC BY-NC-ND 3.0)

An astrophotographer in California has captured images of Elon Musk’s Tesla Roadster on its journey around our Sun. In the early morning of February 9th, Rogelio Bernal Andreo captured images of the Roadster as it appeared just above the horizon. To get the images, Andreo made use of an impressive arsenal of technological tools.

Andreo knew that photographing the Roadster would be a challenge, since it was over a million miles away at the time. But he has the experience and equipment to pull it off. The first task was to determine where the Tesla would be in the sky. Luckily, NASA’s JPL creates lists of coordinates for objects in the sky, called ephemerides. Andreo found the ephemeris for Starman and the Roadster, and it showed that the pair would be in the Hydra constellation, and that they would be only about 20 degrees above the horizon. That’s a challenge, because at that altitude the light has to pass through far more atmosphere than it would higher in the sky.
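To put a rough number on that, the simple plane-parallel approximation says the relative amount of air the light crosses (the “airmass”) scales as 1/sin(altitude), so a target 20 degrees up is seen through roughly three times as much atmosphere as one at the zenith. A minimal back-of-the-envelope sketch, assuming that approximation and ignoring refraction:

```python
import math

def airmass(altitude_deg):
    """Plane-parallel (sec z) approximation of relative atmospheric path length."""
    return 1.0 / math.sin(math.radians(altitude_deg))

print(f"Airmass at 90 deg (zenith): {airmass(90):.2f}")   # 1.00
print(f"Airmass at 20 deg:          {airmass(20):.2f}")   # ~2.92
```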

The Tesla Roadster and its pilot “Starman” leaving Earth behind. Image: SpaceX

However, the Roadster and its driver would be bright enough to do it. As Andreo says in his blog, “The ephemeris from the JPL also indicated that the Roadster’s brightness would be at magnitude 17.5, and I knew that’s perfectly achievable.” So he gathered his gear, hopped in his vehicle, and went for it.

Andreo’s destination was the Monte Bello Open Space Preserve, a controlled-access area for which he has a night-time use permit. This area is kind of close to the San Francisco Bay Area, so the sky is a little bright for astrophotography, but since the Roadster has a magnitude of 17.5, he thought it was doable. Plus, it’s a short drive from his home.

Once he arrived there, he set up his impressive array of gear: dual telescopes and cameras, along with a tracking telescope and computers running specialized software. Andreo explains it best:

“Let me give you a brief description of my gear – also the one I use for most of my deep-sky images. I have a dual telescope system: two identical telescopes and cameras in parallel, shooting simultaneously at the very same area of the sky – same FOV, save a few pixels. The telescopes are Takahashi FSQ106EDX. Their aperture is 106mm (about 4″) and they give you a native 530mm focal length at f/5. The cameras are SBIG STL11k monochrome CCD cameras, one of the most legendary full-frame CCD cameras for astronomy (not the best one today, mind you, but still pretty decent). All this gear sits on a Takahashi EM-400 mount, the beast that will move it at hair-thin precision during the long exposures. I brought the temperature of the CCD sensors to -20C degrees (-4F) using the CCD’s internal cooling system.”

CCD’s with internal cooling systems. Very impressive!

The Takahashi FSQ106. Two of these beasts are at the heart of Andreo’s astrophotography system. Image: Takahashi Telescopes

Andreo uses a specialized focusing system to get his images. He uses focusers from Robofocus and precision focusing software called FocusMax. He also uses a third, smaller telescope called an autoguider. It focuses on a single star in the Field of View and follows it religiously. When that star moves, the whole rig moves. As Andreo says on his blog, “Autoguiding provides a much better mount movement than tracking, which is leaving up to the mount to blindly “follow” the sky. By actually “following” a star, we can make sure there’ll be no trails whether our exposures are 2 or 30 minutes long.”
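The blog doesn’t describe the internals of Andreo’s guiding software, but the core idea behind any autoguider can be sketched in a few lines: measure how far the guide star’s centroid has drifted from its reference position in each exposure, then push a proportional correction to the mount. A toy sketch of that loop, where the pixel scale and gain values are illustrative assumptions rather than anything from his rig:

```python
import numpy as np

def centroid(image):
    """Intensity-weighted centroid (x, y) of a guide-star image array."""
    total = image.sum()
    ys, xs = np.indices(image.shape)
    return (xs * image).sum() / total, (ys * image).sum() / total

def guide_step(image, ref_xy, gain=0.6, arcsec_per_pixel=1.2):
    """Return the RA/Dec correction (arcsec) to send to the mount."""
    x, y = centroid(image)
    dx, dy = x - ref_xy[0], y - ref_xy[1]          # drift in pixels since calibration
    return -gain * dx * arcsec_per_pixel, -gain * dy * arcsec_per_pixel

# Toy usage: a guide star that has drifted ~2 pixels in x from its reference position.
frame = np.zeros((50, 50))
frame[25, 27] = 1000.0                             # star now at x=27, y=25
ra_corr, dec_corr = guide_step(frame, ref_xy=(25.0, 25.0))
print(f"Correction: RA {ra_corr:+.2f} arcsec, Dec {dec_corr:+.2f} arcsec")
```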

Once he was all set up, there was time pressure. The Roadster would only be above the horizon for a short time and the Moon was coming up and threatening to wash out the sky. Andreo got going, but his first shots showed nothing.

Where the Roadster should be, Andreo’s photos showed nothing. But he wasn’t deterred. Image: Rogelio Bernal Andreo (DeepSkyColors.com) (CC BY-NC-ND 3.0)

Andreo felt that once he got home and could process the images properly, the Tesla Roadster and its driver would be somewhere in his images. He kept taking pictures until about 5 AM. Cold and tired, he finally packed up his gear and went home.

“…no matter what I did, I could not find the Roadster.” Astrophotographer Rogelio Bernal Andreo

After some sleep, he began working with his images. “After a few hours of sleep, I started playing with the data and no matter what I did, I could not find the Roadster. I kept checking the coordinates, nothing made sense. So I decided to try again. The only difference would be that this time the Moon would rise around 3:30am, so I could start imaging at 2:30am and get one hour of Moon-free skies, maybe that would help.”

Rogelio Bernal Andreo is a very accomplished astrophotographer. His images have been chosen as NASA’s Astronomy Picture of the Day over 50 times. This image of Orion was chosen as APOD on June 4, 2017. The three bright stars are Orion’s Belt. Image: Rogelio Bernal Andreo (DeepSkyColors.com) (CC BY-NC-ND 3.0)

So Andreo set out to capture the Roadster again. The next night, at the same location, he set up his gear again. But this time, some clouds rolled in, and Andreo got discouraged. He stayed to wait for the sky to improve, but it didn’t. By about 4 AM he packed up and headed home.

After a nap, he went over his photos, but still couldn’t find the Roadster. It was a puzzle, because he knew the Roadster’s coordinates. Andreo is no rookie; his photos have been published many times in Astronomy Magazine, Sky and Telescope, National Geographic, and other places. His work has also been chosen as NASA’s APOD (Astronomy Picture of the Day) more than 50 times. So when he can’t find something in his images that should be there, it’s puzzling.

Then he had an A-HA! moment:

“Then it hit me!! When I created the ephemeris from the JPL’s website, I did not enter my coordinates!! I went with the default, whatever that might be! Since the Roadster is still fairly close to us, parallax is significant, meaning, different locations on Earth will see Starman at slightly different coordinates. I quickly recalculate, get the new coordinates, go to my images and thanks to the wide field captured by my telescopes… boom!! There it was!! Impossible to miss!! It had been right there all along, I just never noticed!”
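The parallax Andreo ran into is easy to put numbers on: for a nearby object, the apparent shift between two observing sites is roughly the baseline between them divided by the distance to the object. A rough sketch, where the ~1.6 million km distance echoes the article and the few-thousand-km baseline is purely an illustrative assumption:

```python
import math

def parallax_shift_deg(baseline_km, distance_km):
    """Small-angle parallax: baseline / distance, converted to degrees."""
    return math.degrees(baseline_km / distance_km)

distance_km = 1.6e6   # Roadster roughly a million miles (~1.6 million km) away
baseline_km = 5000.0  # assumed separation between the default JPL site and California

shift = parallax_shift_deg(baseline_km, distance_km)
print(f"Apparent shift: {shift:.2f} deg ({shift * 60:.0f} arcmin)")
# ~0.18 deg, i.e. ~11 arcmin -- enough to dodge a narrow crop, but still well
# inside the several-degree field of a short-focal-length astrograph.
```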

Andreo is clearly a dedicated astrophotographer, and this is a neat victory for him. He deserves a tip of the hat from space fans. Why not check out his website—his gallery is amazing!—and share a comment with him.

Rogelio Bernal Andreo’s website: DeepSkyColors.com
His gallery: http://www.deepskycolors.com/rba_collections.html
Also, check out his Flickr page: https://www.flickr.com/photos/deepskycolors/

Andreo explained how he got the Roadster images in this post on his blog: Capturing Starman from 1 Million Miles

Witness The Power Of A Fully Operational ESPRESSO Instrument. Four Telescopes Acting As One

The ESPRESSO (Echelle SPectrograph for Rocky Exoplanet and Stable Spectroscopic Observations) instrument collects the light from all four of the 8.2-metre telescopes of the ESO's Very Large Telescope in Chile. The combined light-collecting area makes it the largest optical telescope in existence. Image: ESO/L. Calcada

It’s been 20 years since the first of the four Unit Telescopes that comprise the ESO’s Very Large Telescope (VLT) saw first light. Since the year 2000 all four of them have been in operation. One of the original goals of the VLT was to have all four of the ‘scopes work in combination, and that has now been achieved.

The instrument that combines the light from all four of the VLT ‘scopes is called ESPRESSO, which stands for Echelle SPectrograph for Rocky Exoplanet and Stable Spectroscopic Observations. ESPRESSO captures the light from each of the 8.2 meter mirrors in the four Unit Telescopes of the VLT. That combination makes ESPRESSO, in effect, the largest optical telescope in the world.

The huge diffraction grating is at the heart of the ultra-precise ESPRESSO spectrograph. In this image, the diffraction grating is undergoing testing in the cleanroom at ESO Headquarters in Garching bei München, Germany. Image: ESO/M. Zamani

Combining the power of the four Unit Telescopes of the VLT is a huge milestone for the ESO. As ESPRESSO instrument scientist at ESO, Gaspare Lo Curto, says, “ESO has realised a dream that dates back to the time when the VLT was conceived in the 1980s: bringing the light from all four Unit Telescopes on Cerro Paranal together at an incoherent focus to feed a single instrument!” The excitement is real, because along with its other science goals, ESPRESSO will be an extremely powerful planet-hunting telescope.

“ESO has realised a dream that dates back to the time when the VLT was conceived in the 1980s.” – Gaspare Lo Curto, ESPRESSO instrument scientist.

ESPRESSO uses a system of mirrors, lenses, and prisms to transmit the light from each of the four VLT ‘scopes to the spectrograph. This is accomplished with a network of tunnels that was incorporated into the VLT when it was built. ESPRESSO has the flexibility to combine the light from all four, or from any one of the telescopes. This observational flexibility was also an original design goal for ESPRESSO.

The four Unit Telescopes often operate together as the VLT Interferometer, but that’s much different than ESPRESSO. The VLT Interferometer allows astronomers to study extreme detail in bright objects, but it doesn’t combine the light from the four Unit Telescopes into one instrument. ESPRESSO collects the light from all four ‘scopes and splits it into its component colors. This allows detailed analysis of the composition of distant objects.

ESPRESSO team members gather in the control room during ESPRESSO’s first light. Image: ESO/D. Megevand

ESPRESSO is a very complex instrument, which explains why it’s taken until now to be implemented. It works on a principle called “incoherent focus.” In this sense, “incoherent” means that the light from all four telescopes is added together, but the phase information isn’t preserved the way it is with the VLT Interferometer. What this boils down to is that while the VLT Interferometer and ESPRESSO both use the light of all four VLT telescopes, ESPRESSO only has the spatial resolution of a single 8.2-metre mirror. ESPRESSO, as its name implies, is all about detailed spectrographic analysis. And in that, it will excel.
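The gain from an incoherent combination is in light-gathering power, not resolution, and the arithmetic is simple: four 8.2-metre mirrors collect as much light as a single mirror roughly 16.4 metres across. A quick check of that claim:

```python
import math

def mirror_area(diameter_m):
    """Collecting area of a circular mirror, ignoring the central obstruction."""
    return math.pi * (diameter_m / 2) ** 2

combined_area = 4 * mirror_area(8.2)                 # four Unit Telescopes feeding ESPRESSO
equivalent_diameter = 2 * math.sqrt(combined_area / math.pi)

print(f"Combined collecting area: {combined_area:.0f} m^2")      # ~211 m^2
print(f"Equivalent single mirror: {equivalent_diameter:.1f} m")  # 16.4 m
# The angular resolution, however, stays that of a single 8.2 m aperture,
# because the phase information is not preserved.
```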

“ESPRESSO working with all four Unit Telescopes gives us an enticing foretaste of what the next generation of telescopes, such as ESO’s Extremely Large Telescope, will offer in a few years.” – ESO’s Director General, Xavier Barcons

ESPRESSO is the successor to HARPS, the High Accuracy Radial velocity Planet Searcher, which up until now has been our best exoplanet hunter. HARPS is an echelle spectrograph mounted on the ESO’s 3.6 meter telescope at La Silla. But the power of ESPRESSO will dwarf that of HARPS.

There are three main science goals for ESPRESSO:

  • Planet Hunting
  • Measuring the Variation of the Fundamental Physical Constants
  • Analyzing the Chemical Composition of Stars in Nearby Galaxies

Planet Hunting

ESPRESSO will take highly precise measurements of the radial velocities of solar type stars in other solar systems. As an exoplanet orbits its star, it takes part in a dance or tug-of-war with the star, the same way planets in our Solar System do with our Sun. ESPRESSO will be able to measure very small “dances”, which means it will be able to detect very small planets. Right now, our planet-hunting instruments aren’t as sensitive as ESPRESSO, which means our exoplanet search results are biased to larger planets. ESPRESSO should detect more smaller, Earth-size planets.
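The size of the “dance” ESPRESSO has to measure can be estimated with the standard radial-velocity semi-amplitude formula; for an Earth-mass planet on a one-year orbit around a Sun-like star it comes out to roughly 10 cm/s, which is the regime ESPRESSO is aiming for. A sketch of that estimate (a standard textbook formula, not anything specific to ESPRESSO’s pipeline):

```python
import math

G       = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M_SUN   = 1.989e30    # kg
M_EARTH = 5.972e24    # kg
YEAR    = 3.156e7     # seconds

def rv_semi_amplitude(m_planet, m_star, period_s, inclination_deg=90.0, ecc=0.0):
    """Stellar radial-velocity semi-amplitude K (m/s) induced by an orbiting planet."""
    sin_i = math.sin(math.radians(inclination_deg))
    return ((2 * math.pi * G / period_s) ** (1 / 3)
            * m_planet * sin_i
            / ((m_star + m_planet) ** (2 / 3) * math.sqrt(1 - ecc ** 2)))

k = rv_semi_amplitude(M_EARTH, M_SUN, YEAR)
print(f"Earth-like planet, Sun-like star: K = {k * 100:.1f} cm/s")   # ~9 cm/s
```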

The four Unit Telescopes that make up the ESO’s Very Large Telescope, at the Paranal Observatory. Image: By ESO/H.H.Heyer [CC BY 4.0 (http://creativecommons.org/licenses/by/4.0)], via Wikimedia Commons

Measuring the Variation of the Fundamental Physical Constants

This is where the light-combining power of ESPRESSO will be most useful. ESPRESSO will be used to observe extremely distant and faint quasars, to try and measure the variation of the fundamental physical constants in our Universe. (If there are any variations, that is.) It’s not only the instrument’s light-combining capability that allows this, but also the instrument’s extreme stability.

Specifically, the ESPRESSO will try to take our most accurate measurements yet of the fine structure constant, and the proton to electron mass ratio. Astronomers want to know if these have changed over time. They will use ESPRESSO to examine the ancient light from these distant quasars to measure any change.

Analyzing the Chemical Composition of Stars in Nearby Galaxies

ESPRESSO will open up new possibilities in the measurement of stars in nearby galaxies. Its high efficiency and high resolution will allow astronomers to study stars outside of the Milky Way in unprecedented detail. A better understanding of stars in other galaxies is always a priority item in astronomy.

We’ll let Project Scientist Paolo Molaro have the last word, for now. “This impressive milestone is the culmination of work by a large team of scientists and engineers over many years. It is wonderful to see ESPRESSO working with all four Unit Telescopes and I look forward to the exciting science results to come.”

James Webb Makes The Journey From Houston To Los Angeles; Last Stop Before It Heads To The Launch Facility In 2019

A look inside the cavernous cargo hold of the C5 aircraft that carried the James Webb to California. Image: NASA/Chris Gunn

The two halves of the James Webb Space Telescope are now in the same location and ready to take the next step on JWST’s journey. On February 2nd, Webb’s Optical Telescope and Integrated Science instrument module (OTIS) arrived at Northrop Grumman Aerospace Systems in Redondo Beach, California. The integrated spacecraft element, consisting of the spacecraft bus and sunshield, was already there, waiting for OTIS so the two halves could be joined into a complete spacecraft.

“The team will begin the final stages of integration of the world’s largest space telescope.” – Scott Willoughby, Northrop Grumman’s Program Manager for the JWST.

“It’s exciting to have both halves of the Webb observatory – OTIS and the integrated spacecraft element – here at our campus,” said Scott Willoughby, vice president and program manager for Webb at Northrop Grumman. “The team will begin the final stages of integration of the world’s largest space telescope.”

The Space Telescope Transporter for Air, Road and Sea (STTARS) is a custom-designed container that holds the James Webb’s Optical Telescope and Integrated Science (OTIS) instrument module. In this image it’s being unloaded from a U.S. military C-5 Charlie aircraft at Los Angeles International Airport (LAX) on Feb. 2, 2018. Image: NASA/Chris Gunn

OTIS arrived from the Johnson Space Center in Houston, where it had successfully completed its cryogenic testing. To prepare for that journey, OTIS was placed inside a custom shipping container designed to protect the delicate and expensive Webb Telescope from any damage. That specially designed container is called the Space Telescope Transporter for Air, Road and Sea (STTARS).

STTARS is a massive container, measuring 4.6 meters (15 feet) wide, 5.2 meters (17 feet) tall, and 33.5 meters (110 feet) long, and weighing approximately 75,000 kilograms (almost 165,000 pounds). It’s much larger than the James Webb itself, but even then, the primary mirror wings and the secondary mirror tripod must be folded into flight configuration in order to fit.

The Space Telescope Transporter for Air, Road and Sea (STTARS) at NASA’s Johnson Space Center in Houston. Image: NASA/Chris Gunn

The next step for the JWST is to join the spacecraft element with OTIS. Once that happens, JWST will be complete and fully integrated. Then there’ll be more tests, called observatory-level testing. After that comes another journey inside STTARS, to Kourou, French Guiana, where the JWST will be launched in 2019.

“This is a major milestone.” – Eric Smith, director of the James Webb Space Telescope Program at NASA.

“This is a major milestone,” said Eric Smith, director of the James Webb Space Telescope Program at NASA. “The Webb observatory, which is the work of thousands of scientists and engineers across the globe, will be carefully tested to ensure it is ready to launch and enable scientists to seek the first luminous objects in the universe and search for signs of habitable planets.”

You can’t fault people, either NASA personnel or the rest of us, for getting excited about each development in the James Webb Space Telescope story. Every time the thing twitches or moves, our excitement re-spawns. It seems like everything that happens with the JWST is now a milestone in its long, uncertain journey. It’s easy to see why.

The Space Telescope That Almost Wasn’t

The James Webb ran into a lot of problems during its development. As can be expected for a ground-breaking, technology-pushing project like the Webb, it’s expensive. In 2011, when the project was well underway, it was revealed that the Webb would cost $8.8 billion, much more than the initial budget of $1.6 billion. The House of Representatives cancelled the project, then restored it, though funding was capped at $8 billion.

That was the main hurdle facing the development of the JWST, but there were others, including timeline delays. The most recent timeline change moved the launch date from 2017 to Spring 2019. As of now, the James Webb is on schedule, and on target to meet its revised budget.

The First “Super Telescope”

The JWST is the first of the “Super Telescopes” to be in operation. Once it’s in place at Lagrange Point 2 (L2), about 1.5 million km (930,000 miles) from Earth, it will begin observing, primarily in the infrared. It will surpass both the Hubble Space Telescope and the Spitzer Space Telescope, and will “look back in time” to some of the oldest stars and galaxies in the universe. It will also examine exoplanets and contribute to the search for life.

Good News For The Search For Life, The Trappist System Might Be Rich In Water

This artist’s impression shows several of the planets orbiting the ultra-cool red dwarf star TRAPPIST-1. New observations and analysis have yielded good estimates of the densities of all seven of the Earth-sized planets and suggest that they are rich in volatile materials, probably water. Image Credit: ESO

When we finally find life somewhere out there beyond Earth, it’ll be at the end of a long search. Life probably won’t announce its presence to us; we’ll have to follow a long chain of clues to find it. As scientists keep telling us, at the start of that chain of clues is water.

The discovery of the TRAPPIST-1 system last year generated a lot of excitement: seven planets orbiting the star TRAPPIST-1, only 40 light-years from Earth. At the time, astronomers thought at least some of them were Earth-like. But now a new study shows that some of the planets could hold more water than Earth. About 250 times more.


SpaceX Performs an Experimental High Retrothrust and Survives a Water Landing

This SpaceX rocket was performing a very high retro-thrust landing in water. It wasn't expected to survive, but did. Image: SpaceX

SpaceX’s most recent rocket launch saw the Falcon 9 perform a high retro-thrust over water, with no drone ship in sight. SpaceX never intended to reuse this rocket, and they haven’t said exactly why.

This launch was conducted on January 31st, and the payload was a communications satellite called GovSat-1. It’s a public-private partnership, and GovSat-1 is a heavy satellite which was placed into a particularly high orbit. It will be used by the government of Luxembourg, and by a private European company called SES. It’ll provide secure communications and surveillance for the military, and it has anti-jamming features to help it resist attack.

A high orbit and a heavy payload means that the Falcon 9 that launched it might not have had enough fuel for its customary drone landing. But other Falcon 9s have launched payloads this high and landed on droneships for reuse. So what gives?

According to SpaceX, they never planned to land and reuse this one. They didn’t exactly say why they did it this way, but it’s been speculated that this one was an older iteration of the Falcon 9 known as Block 3. This is the second time SpaceX flew a Block 3 iteration without trying to reuse it. The first time they launched one without reusing it, it carried 10 Iridium satellites into low-Earth orbit.

The Falcon 9 is flying in Block 4 configuration now, with Block 5 coming in the near future. SpaceX says that the Falcon 9 Block 5 will improve the performance and the reusability of the rocket in the future. They’ve also stated that the Block 5 will be the final configuration. Maybe they let this one land in the ocean because it’s just not needed anymore.

Reusable rocketry is SpaceX’s signature technology. The first-stage booster of the Falcon 9 can be reconditioned and used again and again, keeping costs down. After lift-off and stage separation, the first-stage booster lands on a SpaceX drone ship, where it is secured and delivered to shore to be reused.

In this case, SpaceX wanted to test a high retro-thrust landing. The test consisted of three separate burns performed over water, rather than on a drone ship, to avoid damaging the ship. The rocket itself wasn’t expected to survive, but it did. Or it partly survived, anyway, as Elon Musk confirmed on Twitter.

The retro-thrust burns performed by SpaceX rockets like the Falcon 9 allow the booster to land softly. The engines fire against the direction of descent, slowing the rocket and cushioning the Falcon 9’s touchdown on the drone ship.
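As a rough illustration of what such a burn has to accomplish, a constant-deceleration estimate gives the altitude at which the burn must start: h = v² / (2a), where a is the net deceleration (engine thrust per unit mass minus gravity). The numbers below are purely illustrative assumptions, not SpaceX figures:

```python
# Rough constant-deceleration estimate of a landing burn.
# All numbers are illustrative assumptions, not actual Falcon 9 values.
g = 9.81                 # m/s^2
descent_speed = 250.0    # m/s at the start of the burn (assumed)
thrust_accel = 30.0      # m/s^2 provided by the engines (assumed)

net_decel = thrust_accel - g                       # deceleration left after fighting gravity
burn_altitude = descent_speed ** 2 / (2 * net_decel)
burn_time = descent_speed / net_decel

print(f"Burn must start ~{burn_altitude:.0f} m up and lasts ~{burn_time:.1f} s")
# ~1550 m and ~12 s with these assumed numbers
```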

With the successful static test of SpaceX’s Falcon Heavy last week, a first launch for the Heavy is in sight. Testing high retro-thrust landings could be related to the upcoming first launch, even though, as Elon Musk said, merely getting the Falcon Heavy off the pad and back would constitute a successful first flight. But that’s just a guess.

The Falcon Heavy is designed to be reusable, just like its little brother, the Falcon 9. Reusability is key to SpaceX and is the whole reason Musk started the company: to make spaceflight more affordable, and to help humanity travel beyond the Moon.

SpaceX plans to tow this Falcon 9 back to shore and see if it can be salvaged. But after being dunked in salt water, any meaningful salvage seems unlikely. Who knows. Maybe Elon Musk will use it for flame-thrower target practice.

But the fate of this single rocket isn’t really that important in the grand scheme of things. What’s important is that SpaceX is still testing designs, and still pushing the boundaries of lower-cost spaceflight.

With that in mind, here’s hoping the whiz kids at SpaceX can destroy a few more rockets. After all, it’s all in the name of science.

The First Results From The IllustrisTNG Simulation Of The Universe Have Been Completed, Showing How Our Cosmos Evolved From The Big Bang

IllustrisTNG is a new simulation model for the Universe. It used over 24,000 processors over the course of more than two months to produce the largest hydrodynamic simulation project to date for the emergence of cosmic structures. Image: IllustrisTNG

The first results of the IllustrisTNG Project have been published in three separate studies, and they’re shedding new light on how black holes shape the cosmos, and how galaxies form and grow. The IllustrisTNG Project bills itself as “The next generation of cosmological hydrodynamical simulations.” The Project is an ongoing series of massive hydrodynamic simulations of our Universe. Its goal is to understand the physical processes that drive the formation of galaxies.

At the heart of IllustrisTNG is a state-of-the-art numerical model of the Universe, running on one of the most powerful supercomputers in the world: the Hazel Hen machine at the High-Performance Computing Center in Stuttgart, Germany. Hazel Hen is Germany’s fastest computer, and the 19th fastest in the world.

The Hazel Hen Supercomputer is based on Intel processors and Cray network technologies. Image: IllustrisTNG

Our current cosmological model suggests that the mass-energy density of the Universe is dominated by dark matter and dark energy. Since we can’t observe either of those things, the only way to test this model is to be able to make precise predictions about the structure of the things we can see, such as stars, diffuse gas, and accreting black holes. These visible things are organized into a cosmic web of sheets, filaments, and voids. Inside these are galaxies, which are the basic units of cosmic structure. To test our ideas about galactic structure, we have to make detailed and realistic simulated galaxies, then compare them to what’s real.

Astrophysicists in the USA and Germany used IllustrisTNG to create their own universe, which could then be studied in detail. IllustrisTNG correlates very strongly with observations of the real Universe, but allows scientists to look at things that are obscured in our own Universe. This has led to some very interesting results so far, and is helping to answer some big questions in cosmology and astrophysics.

How Do Black Holes Affect Galaxies?

Ever since we learned that galaxies host supermassive black holes (SMBHs) at their centers, it’s been widely believed that they have a profound influence on the evolution of galaxies, and possibly on their formation. That’s led to the obvious question: How do these SMBHs influence the galaxies that host them? IllustrisTNG set out to answer this, and the paper by Dr. Dylan Nelson at the Max Planck Institute for Astrophysics shows that “the primary driver of galaxy color transition is supermassive black hole feedback in its low-accretion state.”

“The only physical entity capable of extinguishing the star formation in our large elliptical galaxies are the supermassive black holes at their centers.” – Dr. Dylan Nelson, Max Planck Institute for Astrophysics

Galaxies that are still in their star-forming phase shine brightly in the blue light of their young stars. Then something changes and the star formation ends. After that, the galaxy is dominated by older, red stars, and the galaxy joins a graveyard full of “red and dead” galaxies. As Nelson explains, “The only physical entity capable of extinguishing the star formation in our large elliptical galaxies are the supermassive black holes at their centers.” But how do they do that?

Nelson and his colleagues attribute it to supermassive black hole feedback in its low-accretion state. What that means is that as a black hole feeds, it creates a wind, or shock wave, that blows star-forming gas and dust out of the galaxy. This limits the future formation of stars. The existing stars age and turn red, and few new blue stars form.

This is a rendering of gas velocity in a massive galaxy cluster in IllustrisTNG. Black areas are hardly moving, and white areas are moving at greater than 1000km/second. The black areas are calm cosmic filaments, the white areas are near super-massive black holes (SMBHs). The SMBHs are blowing away the gas and preventing star formation. Image: IllustrisTNG

How Do Galaxies Form and How Does Their Structure Develop?

It’s long been thought that large galaxies form when smaller galaxies join up. As the galaxy grows larger, its gravity draws more smaller galaxies into it. During these collisions, galaxies are torn apart. Some stars will be scattered, and will take up residence in a halo around the new, larger galaxy. This should give the newly-created galaxy a faint background glow of stellar light. But this is a prediction, and these pale glows are very hard to observe.

“Our predictions can now be systematically checked by observers.” – Dr. Annalisa Pillepich (Max Planck Institute for Astrophysics)

IllustrisTNG was able to predict more accurately what this glow should look like. This gives astronomers a better idea of what to look for when they try to observe this pale stellar glow in the real Universe. “Our predictions can now be systematically checked by observers,” points out Dr. Annalisa Pillepich (MPIA), who led a further IllustrisTNG study. “This yields a critical test for the theoretical model of hierarchical galaxy formation.”

A composite image from IllustrisTNG. Panels on the left show galaxy-galaxy interactions and the fine-grained structure of extended stellar halos. Panels on the right show stellar light projections from two massive central galaxies at the present day. It’s easy to see how the light from massive central galaxies overwhelms the light from stellar halos. Image: IllustrisTNG

IllustrisTNG is an ongoing series of simulations. So far, there have been three IllustrisTNG runs, each one creating a larger simulation than the previous one: TNG50, TNG100, and TNG300. TNG300 is much larger than TNG50 and allows a larger volume to be studied, which reveals clues about large-scale structure. Though TNG50 is much smaller, it has much finer detail, giving us a closer look at the structural properties of galaxies and the detailed structure of the gas around them. TNG100 is somewhere in the middle.

TNG 50, TNG 100, and TNG 300. Image: IllustrisTNG

IllustrisTNG is not the first cosmological hydrodynamical simulation. Others include Eagle, Horizon-AGN, and IllustrisTNG’s predecessor, Illustris. They have shown how powerful these predictive theoretical models can be. As our computers grow more powerful and our understanding of physics and cosmology grow along with them, these types of simulations will yield greater and more detailed results.

Now That NASA’s Missing IMAGE Satellite Has Been Found, Talking To It Is Going To Be Difficult

This picture shows NASA's IMAGE spacecraft undergoing launch preparations in early 2000. Credit: NASA

It’s easy to imagine the excitement NASA personnel must have felt when an amateur astronomer contacted NASA to tell them that he might have found their missing IMAGE satellite. After all, the satellite had been missing for 10 years.

IMAGE, which stands for Imager for Magnetopause-to-Aurora Global Exploration, was launched on March 25th, 2000. In Dec. 2005 the satellite failed to make routine contact, and in 2007 it failed to reboot. After that, the mission was declared over.

NASA’s IMAGE satellite. Credit: NASA

It’s astonishing that after 10 years, the satellite has been found. It’s even more astonishing that it was an amateur who found it. As if the story couldn’t get any more interesting, the amateur astronomer who found it—Scott Tilley of British Columbia, Canada—was actually looking for a different missing satellite: the secret ZUMA spy satellite launched by the US government on January 7, 2018. (If you’re prone to wearing a tin foil hat, now might be a good time to reach for one.)

NASA’s half-ton IMAGE satellite being launched from Vandenberg Air Force Base on March 25th, 2000. IMAGE was the first satellite designed to actually “see” most of the major charged particle systems in the space surrounding Earth. Image: NASA

After Tilley contacted NASA, they hurried to confirm that it was indeed IMAGE that had been found. To do that, NASA employed five separate antennas to seek out any radio signals from the satellite. As of Monday, Jan. 29, signals received from all five sites were consistent with the radio frequency characteristics expected of IMAGE.

In a press release, NASA said, “Specifically, the radio frequency showed a spike at the expected center frequency, as well as side bands where they should be for IMAGE. Oscillation of the signal was also consistent with the last known spin rate for IMAGE.”

“…the radio frequency showed a spike at the expected center frequency…” – NASA Press Release confirming the discovery of IMAGE

Then, on January 30, the Johns Hopkins Applied Physics Lab (JHUAPL) reported that they had successfully collected telemetry data from the satellite. In that signal was the ID code 166, the code for IMAGE. There were probably some pretty happy people at NASA.
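NASA’s clue that the oscillation of the signal matched IMAGE’s last known spin rate is something anyone with a power-versus-time record could check: a spinning spacecraft modulates the received power periodically, so the dominant peak in a Fourier transform of that record gives the spin frequency. A minimal sketch with synthetic data (the sample rate and spin period below are assumptions for illustration, not IMAGE’s real values):

```python
import numpy as np

# Synthetic received-power record: a slow spin modulation buried in noise.
sample_rate = 10.0                       # samples per second (assumed)
spin_period = 120.0                      # seconds per rotation (assumed)
t = np.arange(0, 3600, 1 / sample_rate)  # one hour of data
power = 1.0 + 0.2 * np.sin(2 * np.pi * t / spin_period) + 0.05 * np.random.randn(t.size)

# Spin frequency = strongest peak in the spectrum (mean removed to suppress the DC term).
spectrum = np.abs(np.fft.rfft(power - power.mean()))
freqs = np.fft.rfftfreq(power.size, d=1 / sample_rate)
spin_freq = freqs[np.argmax(spectrum)]

print(f"Recovered spin period: {1 / spin_freq:.1f} s")   # ~120 s
```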

So, now what?

A diagram of NASA’s IMAGE satellite. Image: NASA

NASA’s next step is to confirm without a doubt that this is indeed IMAGE. That means capturing and analyzing the data in the signal. That will be a technical challenge, because the types of hardware and operating systems used in the IMAGE Mission Operations Center no longer exist. According to NASA, “other systems have been updated several versions beyond what they were at the time, requiring significant reverse-engineering.” But that should be no problem for NASA. After all, they got Apollo 13 home safely, didn’t they?

If NASA is successful at decoding the data in the signal, the next step is to attempt to turn on IMAGE’s science payload. NASA has yet to decide how to proceed if they’re successful.

IMAGE was the first spacecraft designed to “see the invisible,” as they put it back then. Prior to IMAGE, spacecraft examined Earth’s magnetosphere by detecting particles and fields they encountered as they passed through them. But this method had limited success. The magnetosphere is enormous, and simply sampling a small path—while better than nothing—did not give us an accurate understanding of it.

During its mission, IMAGE did a lot of great science. In July 2000, a spectacular solar storm caused auroras as far south as Mexico. IMAGE captured these images of those powerful auroras. Credit: NASA

IMAGE was going to do things differently. It used 3-dimensional imaging techniques to measure simultaneously the densities, energies and masses of charged particles throughout the inner magnetosphere. To do this, IMAGE carried a payload of 7 instruments:

  • High Energy Neutral Atom (HENA) imager
  • Medium Energy Neutral Atom (MENA) imager
  • Low Energy Neutral Atom (LENA) imager
  • Extreme Ultraviolet (EUV) imager
  • Far Ultraviolet (FUV) imager
  • Radio Plasma Imager (RPI)
  • Central Instrument Data Processor (CIDP)

These instruments allowed IMAGE not only to do great science and capture great images, but also to create some stunning, never-before-seen movies of auroral activity.

This is a fascinating story, and it’ll be interesting to see if NASA can establish meaningful contact with IMAGE. Will it have a treasure trove of unexplored data on-board? Can it be re-booted and brought back into service? We’ll have to wait and see.

This story is also interesting culturally. IMAGE was in service at a time when the internet wasn’t as refined as it is currently. NASA has mastered the internet and public communications now, but back then? Not so much. For example, to build up interest around the mission, NASA gave IMAGE its own theme song, titled “To See The Invisible.” Yes, seriously.

But that’s just a side-note. IMAGE was all about great science, and it accomplished a lot. You can read all about IMAGE’s science achievements here.

Microbes May Help Astronauts Turn Human Waste Into Food

Researchers at Penn State University are developing a way to use microbes to turn human waste into food on long space voyages. Image: Yuri Gorby, Rensselaer Polytechnic Institute

Geoscience researchers at Penn State University are finally figuring out what organic farmers have always known: digestive waste can help produce food. But whereas farmers here on Earth can let microbes in the soil turn waste into fertilizer, which can then be used to grow food crops, the Penn State researchers have to take a different route. They are trying to figure out how to let microbes turn waste directly into food.

There are many difficulties with long-duration space missions, or with lengthy missions to other worlds like Mars. One of the biggest is how to take enough food. Food for a crew of astronauts on a 6-month voyage to Mars, plus enough for a return trip, weighs a lot. And all that weight has to be lifted into space by expensive rockets.

SpaceX’s reusable rockets are bringing down the cost of launching things into space, but the cost is still prohibitive. Any weight savings contribute to a mission’s feasibility, including a reduction in food supplies for long space journeys. In this image, a SpaceX Falcon 9 recycled rocket lifts off at sunset at 6:53 PM EDT on 11 Oct 2017. Credit: Ken Kremer/Kenkremer.com

Carrying enough food for a long voyage in space is problematic. Up until now, the solution for providing that food has been focused on growing it in hydroponic chambers and greenhouses. But that also takes lots of space, water, and energy. And time. It’s not really a solution.

“It’s faster than growing tomatoes or potatoes.” – Christopher House, Penn State Professor of Geosciences

What the researchers at Penn State, led by Professor of Geosciences Christopher House, are trying to develop is a method of turning waste directly into an edible, nutritious substance. Their aim is to cut out the middle man, as it were. And in this case, the middle men are the plants themselves: tomatoes, potatoes, and other fruits and vegetables.

We’ve always assumed that astronauts working on Mars would feed themselves by growing Earthly crops in simulated Earth conditions. But that requires a lot of energy, space, and materials. It may not be necessary. An artist’s illustration of a greenhouse on Mars. Image Credit: SAIC

“We envisioned and tested the concept of simultaneously treating astronauts’ waste with microbes while producing a biomass that is edible either directly or indirectly depending on safety concerns,” said Christopher House, professor of geosciences, Penn State. “It’s a little strange, but the concept would be a little bit like Marmite or Vegemite where you’re eating a smear of ‘microbial goo.'”

The Penn State team propose to use specific microorganisms to turn waste directly into edible biomass. And they’re making progress.

At the heart of their work are things called microbial reactors. Microbial reactors are basically vessels designed to maximize surface area for microbes to populate. These types of reactors are used to treat sewage here on Earth, but not to produce an edible biomass.

“It’s a little strange, but the concept would be a little bit like Marmite or Vegemite where you’re eating a smear of ‘microbial goo.'” – Christopher House, Penn State Professor of Geosciences

To test their ideas, the researchers constructed a cylindrical vessel four feet long by four inches in diameter. Inside it, they allowed select microorganisms to come into contact with human waste in controlled conditions. The process was anaerobic, and similar to what happens inside the human digestive tract. What they found was promising.

“Anaerobic digestion is something we use frequently on Earth for treating waste,” said House. “It’s an efficient way of getting mass treated and recycled. What was novel about our work was taking the nutrients out of that stream and intentionally putting them into a microbial reactor to grow food.”

One thing the team discovered is that the process readily produces methane. Methane is highly flammable, so very dangerous on a space mission, but it has other desirable properties when used in food production. It turns out that methane can be used to grow another microbe, called Methylococcus capsulatus. Methylococcus capsulatus is used as an animal food. Their conclusion is that the process could produce a nutritious food for astronauts that is 52 percent protein and 36 percent fats.

“We used materials from the commercial aquarium industry but adapted them for methane production.” – Christopher House, Penn State Professor of Geosciences

The process isn’t simple. The anaerobic process involved can produce pathogens very dangerous to people. To prevent that, the team studied ways to grow microbes in either an alkaline environment or a high-heat environment. After raising the system pH to 11, they found a strain of the bacteria Halomonas desiderata that thrived. Halomonas desiderata is 15 percent protein and 7 percent fats. They also cranked the system up to a pathogen-killing 158 degrees Fahrenheit, and found that the edible Thermus aquaticus grew, which is 61 percent protein and 16 percent fats.

Conventional waste treatment plants, like this one in England, take several days to treat waste. The anaerobic system tested by the Penn State team treated waste in as little as 13 hours. Image: Nick Allen, CC BY-SA 4.0

Their system is based on modern aquarium systems, where microbes live on the surface of a filter film. The microbes take solid waste from the stream and convert it to fatty acids. Then, those fatty acids are converted to methane by other microbes on the same surface.

Speed is a factor in this system. Existing waste management treatment typically takes several days. The team’s system removed 49 to 59 percent of solids in 13 hours.
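One way to put the 13-hour figure in perspective is to treat solids removal as a simple first-order process and back out a rate constant from the reported 49 to 59 percent removal. That is a simplifying assumption on my part, not something the paper necessarily states, but it makes the comparison with multi-day conventional treatment concrete:

```python
import math

def first_order_rate(fraction_removed, hours):
    """Rate constant k (per hour), assuming simple first-order decay of solids."""
    return -math.log(1 - fraction_removed) / hours

for f in (0.49, 0.59):
    k = first_order_rate(f, 13.0)
    print(f"{f:.0%} removed in 13 h -> k ~ {k:.3f} /h "
          f"(~{math.log(2) / k:.0f} h half-life)")
```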

This system won’t be in space any time soon. The tests were conducted on individual components, as proof of feasibility. A complete system that functions together still has to be built. “Each component is quite robust and fast and breaks down waste quickly,” said House. “That’s why this might have potential for future space flight. It’s faster than growing tomatoes or potatoes.”

The team’s paper was published in the journal Life Sciences in Space Research.

The Most Detailed Map Ever Made of the Milky Way in Radio Waves

The FUGIN project used the 45 meter Nobeyama radio telescope in Japan to produce the most detailed radio wave map yet of the Milky Way. Top: Three color (false color) radio map of the Milky Way (l=10-50 deg) obtained by the FUGIN Project. Red, green, and blue represent the radio intensities of 12CO, 13CO, and C18O, respectively. Second Line: Infrared image of the same region obtained by the Spitzer Space Telescope. Red, green, and blue represent the intensities of 24µm, 8µm, and 5.8µm infrared emission, respectively. Top Zoom-In: Three color radio map of the Milky Way (l=12-22 deg) obtained by the FUGIN Project. The colors are the same as the top image. Lower-Left Zoom-In: Enlarged view of the W51 region. The colors are the same as the top image. Lower-Right Zoom-In: Enlarged view of the M17 region. The colors are the same as the top image. Image: NAOJ/NASA/JPL-Caltech

A Japanese telescope has produced our most detailed radio wave image yet of the Milky Way galaxy. Over a 3-year time period, the Nobeyama 45 meter telescope observed the Milky Way for 1100 hours to produce the map. The image is part of a project called FUGIN (FOREST Unbiased Galactic plane Imaging survey with the Nobeyama 45-m telescope.) The multi-institutional research group behind FUGIN explained the project in the Publications of the Astronomical Society of Japan and at arXiv.

The Nobeyama 45 meter telescope is located at the Nobeyama Radio Observatory, near Minamimaki, Japan. The telescope has been in operation there since 1982, and has made many contributions to millimeter-wave radio astronomy in its life. This map was made using the new FOREST receiver installed on the telescope.

When we look up at the Milky Way, an abundance of stars and gas and dust is visible. But there are also dark spots, which look like voids. But they’re not voids; they’re cold clouds of molecular gas that don’t emit visible light. To see what’s happening in these dark clouds requires radio telescopes like the Nobeyama.

The Nobeyama 45m radio telescope at the Nobeyama Radio Observatory in Japan. Image: NAOJ

The Nobeyama was the largest millimeter-wave radio telescope in the world when it began operation, and it has always had great resolution. But the new FOREST receiver has improved the telescope’s spatial resolution ten-fold. The increased power of the new receiver allowed astronomers to create this new map.

The new map covers an area of the night sky as wide as 520 full Moons. The detail of this new map will allow astronomers to study both large-scale and small-scale structures in new detail. FUGIN will provide new data on large structures like the spiral arms—and even the entire Milky Way itself—down to smaller structures like individual molecular cloud cores.

FUGIN is one of the legacy projects for the Nobeyama. These projects are designed to collect fundamental data for next-generation studies. To collect this data, FUGIN observed an area covering 130 square degrees, which is over 80% of the area between galactic latitudes -1 and +1 degrees and galactic longitudes from 10 to 50 degrees and from 198 to 236 degrees. Basically, the map tried to cover the 1st and 3rd quadrants of the galaxy, to capture the spiral arms, bar structure, and the molecular gas ring.

Starscape photograph taken at Nobeyama Radio Observatory by Norikazu Okabe. The FUGIN observation region (l=10-50 deg) is marked. Credit: National Astronomical Observatory of Japan

The aim of FUGIN is to investigate the physical properties of diffuse and dense molecular gas in the galaxy. It does this by simultaneously gathering data on three carbon monoxide isotopologues: 12CO, 13CO, and C18O. Researchers were able to study the distribution and the motion of the gas, and also physical characteristics like temperature and density. And the studying has already paid off.

FUGIN has already revealed things previously hidden. They include entangled filaments that weren’t obvious in previous surveys, as well as both wide-field and detailed structures of molecular clouds. Large scale kinematics of molecular gas such as spiral arms were also observed.
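Those large-scale kinematics come straight out of the Doppler shifts of the CO lines: each isotopologue has a precisely known rest frequency (roughly 115.27 GHz for 12CO, 110.20 GHz for 13CO, and 109.78 GHz for C18O in the J=1-0 transition), so the offset of the observed line from its rest frequency gives the gas velocity along the line of sight. A minimal sketch of that conversion, where the observed frequency is just an example value:

```python
C_KM_S = 299_792.458                      # speed of light, km/s

CO_REST_GHZ = {                           # approximate J=1-0 rest frequencies
    "12CO": 115.271,
    "13CO": 110.201,
    "C18O": 109.782,
}

def radial_velocity_km_s(f_obs_ghz, f_rest_ghz):
    """Radio-convention Doppler velocity: positive means the gas is receding."""
    return C_KM_S * (f_rest_ghz - f_obs_ghz) / f_rest_ghz

# Example: a 12CO line observed 20 MHz below its rest frequency.
v = radial_velocity_km_s(115.251, CO_REST_GHZ["12CO"])
print(f"Line-of-sight velocity: {v:+.1f} km/s")   # about +52 km/s (receding)
```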

An artist’s image showing the major features of the Milky Way galaxy. Credit: NASA/JPL-Caltech, ESO, J. Hurt

But the main purpose is to provide a rich data-set for future work by other telescopes. These include other radio telescopes like ALMA, but also telescopes operating in the infrared and other wavelengths. This will begin once the FUGIN data is released in June, 2018.

Millimeter wave radio astronomy is powerful because it can “see” things in space that other telescopes can’t. It’s especially useful for studying the large, cold gas clouds where stars form. These clouds are as cold as -262C (-440F). At temperatures that low, optical scopes can’t see them, unless a bright star is shining behind them.

Even at these extremely low temperatures, there are chemical reactions occurring. This produces molecules like carbon monoxide, which was a focus of the FUGIN project, but also others like formaldehyde, ethyl alcohol, and methyl alcohol. These molecules emit radio waves in the millimeter range, which radio telescopes like the Nobeyama can detect.

The top-level purpose of the FUGIN project, according to the team behind the project, is to “provide crucial information about the transition from atomic gas to molecular gas, formation of molecular clouds and dense gas, interaction between star-forming regions and interstellar gas, and so on. We will also investigate the variation of physical properties and internal structures of molecular clouds in various environments, such as arm/interarm and bar, and evolutionary stage, for example, measured by star-forming activity.”

This new map from the Nobeyama holds a lot of promise. A rich data-set like this will be an important piece of the galactic puzzle for years to come. The details revealed in the map will help astronomers tease out more detail on the structures of gas clouds, how they interact with other structures, and how stars form from these clouds.

NASA’s InSight Lander Spreads Its Solar Wings. It’ll Fly To Mars In May, 2018

The InSight lander responds to commands to spread its solar arrays during a January 23, 2018 test at the Lockheed Martin clean room in Littleton, Colorado. Image: Lockheed Martin Space

May 2018 is the launch window for NASA’s next mission to Mars, the InSight lander. InSight is the next member of what could be called humanity’s fleet of vehicles destined for Mars. But rather than working on the question of Martian habitability or suitability for life, InSight will try to understand the deeper structure of Mars.

InSight stands for Interior Exploration using Seismic Investigations, Geodesy and Heat Transport. InSight will be the first robotic explorer to visit Mars and study the red planet’s deep interior. The work InSight does should answer questions about the formation of Mars, and those answers may apply to the history of the other rocky planets in the Solar System. The lander, (InSight is not a rover) will also measure meteorite impacts and tectonic activity happening on Mars currently.

This video helps explain why Mars is a good candidate to answer questions about how all our rocky planets formed, not just Mars itself.

InSight was conceived as part of NASA’s Discovery Program, a series of missions focused on important questions related to the “content, origin, and evolution of the solar system and the potential for life elsewhere”, according to NASA. Understanding how our Solar System and its planets formed is a key part of the Discovery Program, and it is the question InSight was built to answer.

This artist’s illustration of InSight on a photo background of Mars shows the lander fully deployed. The solar arrays are open, and in the foreground two of its instruments are shown. On the left is the SEIS instrument, and on the right is the HP3 probe. Image: NASA/Lockheed Martin

To do its work, InSight will deploy three instruments: SEIS, HP³, and RISE.

SEIS

This is InSight’s seismic instrument, designed to take the Martian pulse. It stands for Seismic Experiment for Internal Structure.

In this image, InSight’s Instrument Deployment Arm is practicing placing SEIS on the surface. Image: NASA/Lockheed Martin

SEIS sits patiently under its dome, which protects it from Martian wind and thermal effects, and waits for something to happen. What’s it waiting for? For seismic waves caused by Marsquakes, meteorite impacts, or by the churning of magma deep in the Martian interior. These waves will help scientists understand the nature of the material that first formed Mars and the other rocky planets.

HP³

HP³ is InSight’s heat probe. It stands for Heat Flow and Physical Properties Probe. Upon deployment on the Martian surface, HP³ will burrow 5 meters (16 ft.) into Mars. No other instrument has ever pierced Mars this deeply. Once there, it will measure the heat flowing deeply within Mars.

In this image, the Heat Flow and Physical Properties Probe is shown inserted into Mars. Image: NASA

Scientists hope that the heat measured by HP³ will help them understand whether or not Mars formed from the same material that Earth and the Moon formed from. It should also help them understand how Mars evolved after it was formed.

RISE

RISE stands for Rotation and Interior Structure Experiment. RISE will measure the Martian wobble as it orbits the Sun, by precisely tracking InSight’s position on the surface. This will tell scientists a lot about the deep inner core of Mars. The idea is to determine the depth at which the Martian core is solid. It will also tell us which elements are present in the core. Basically, RISE will tell us how Mars responds to the Sun’s gravity as it orbits the Sun. RISE consists of two antennae on top of InSight.
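RISE doesn’t image anything; it measures tiny changes in the frequency of the radio link between InSight and Earth, which translate directly into the lander’s line-of-sight velocity and, tracked over months, into the wobble of Mars’ spin axis. The conversion itself is just the Doppler relation; a minimal sketch, where the X-band carrier frequency and the example shift are illustrative assumptions:

```python
C_M_S = 299_792_458.0     # speed of light, m/s
X_BAND_HZ = 8.4e9         # approximate X-band downlink frequency (assumed)

def line_of_sight_velocity(doppler_shift_hz, carrier_hz=X_BAND_HZ):
    """Velocity (m/s) corresponding to a one-way Doppler shift of the carrier."""
    return doppler_shift_hz / carrier_hz * C_M_S

# Example: a 1 Hz shift of the 8.4 GHz carrier.
print(f"1 Hz of Doppler ~ {line_of_sight_velocity(1.0) * 1000:.0f} mm/s")
# ~36 mm/s -- so tracking the wobble means measuring frequency to a small
# fraction of a hertz, averaged over many tracking passes.
```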

The two RISE antennae are shown in this image. RISE will reveal information about the Martian core by tracking InSight’s position while Mars orbits the Sun. Image: NASA/Lockheed Martin

InSight will land at Elysium Planitia, a flat and smooth plain just north of the Martian equator. This is considered a perfect location for InSight to study the Martian interior. The landing site is not far from where Curiosity landed at Gale Crater in 2012.

InSight will land at Elysium Planitia, just north of the Martian equator. Image: NASA/JPL-CalTech

InSight will be launched to Mars from Vandenberg Air Force Base in California by an Atlas V-401 rocket. The trip to Mars will take about 6 months. Once on the Martian surface, InSight’s mission will have a duration of about 728 Earth days, or just over 1 Martian year.

InSight won’t be launching alone. The Atlas that launches the lander will also launch another NASA technology experiment. MarCO, or Mars Cube One, is two suitcase-size CubeSats that will travel to Mars behind InSight. Once in orbit around Mars, their job is to relay InSight data as the lander enters the Martian atmosphere and lands. This will be the first time that miniaturized CubeSat technology will be tested at another planet.

One of the MarCO Cubesats that will be launched with InSight. This will be the first time that CubeSat technology will be tested at another planet. Image: NASA/JPL-CalTech

If the MarCO experiment is successful, it could be a new way of relaying mission data to Earth. MarCO will relay news of a successful landing, or of any problems, much sooner. However, the success of the InSight lander is not dependent on a successful MarCO experiment.