Since the Universe is big and old, and life on Earth evolved relatively quickly, life should be everywhere in the Universe. And yet, no matter how hard we look, we don’t see any evidence of it out there: not on Mars, not sending us radio messages, and not taking over entire galaxies and using up all their energy.
This, of course, is the Fermi Paradox, and it’s an absolutely fascinating concept to think about. There are many possible resolutions to the Fermi Paradox, but most of them are unsatisfying. Sure, we could be living in a cosmic zoo, or we fundamentally misunderstand how difficult it’ll be to travel to another star.
And maybe we’re just the first lifeforms in the observable Universe that have reached the level of technology that can conceive of exploring the Universe. But then, what are the chances of that? That really seems unlikely.
But then there’s the idea of the Great Filter: some kind of event that affects every single intelligent civilization, stopping it from reaching out into the galaxy, sending out signals, and exploring other worlds. Something wipes them out every time.
And considering the fact that we’re on the verge of becoming a multi-planet species ourselves, this concept of the Great Filter becomes even more unsettling.
It could be right around the corner from us.
Our friends at Kurzgesagt just released a video all about the Great Filter, and honestly, I think it’s the best video they’ve ever done. The animation, as always, is excellent, but the way they approach the Great Filter is really innovative, showing how evidence of life in the Universe is actually a bad sign, since it means we’re probably not the first life forms out there.
Which means the Great Filter is even more likely.
If you want to support what Kurzgesagt is doing, join their Patreon program and help them make even more videos.
On Wednesday, January 31st (i.e. today!), a spectacular celestial event occurred. For those who live in the western part of North America, Alaska, and the Hawaiian islands, it was visible in the wee hours of the morning – and some people were disciplined enough to roll out of bed to see it! This was none other than the highly-anticipated “Super Blue Moon”, a rare type of full moon that on this occasion was special for a number of reasons.
For one, it was the third in a series of “supermoons”, where a full moon coincides with the Moon being at its closest point to Earth in its orbit (aka perigee) and thus appears larger. It was also the second full moon of the month, otherwise known as a “Blue Moon”. Lastly, for those in the right locations, the Moon also passed through the Earth’s shadow, giving it a reddish tint (known as a “Red Moon” or “Blood Moon”).
In short, you could say that what occurred this morning was a “super blue blood moon.” And as you can see, some truly awesome pictures were taken of this celestial event from all over the world. Here is a collection of pictures that a number of skilled photographers and stargazers have chosen to share with us. Enjoy!
Thanks to everyone who used the #universetoday hashtag on Instagram to let us know about your pictures. There are many, many more in there, so check it out.
It’s easy to imagine the excitement NASA personnel must have felt when an amateur astronomer contacted NASA to tell them that he might have found their missing IMAGE satellite. After all, the satellite had been missing for 10 years.
IMAGE, which stands for Imager for Magnetopause-to-Aurora Global Exploration, was launched on March 25th, 2000. In Dec. 2005 the satellite failed to make routine contact, and in 2007 it failed to reboot. After that, the mission was declared over.
It’s astonishing that after 10 years, the satellite has been found. It’s even more astonishing that it was an amateur who found it. As if the story couldn’t get any more interesting, the amateur astronomer who found it—Scott Tilley of British Columbia, Canada—was actually looking for a different missing satellite: the secret ZUMA spy satellite launched by the US government on January 7, 2018. (If you’re prone to wearing a tin foil hat, now might be a good time to reach for one.)
After Tilly contacted NASA, they hurried to confirm that it was indeed IMAGE that had been found. To do that, NASA employed 5 separate antennae to seek out any radio signals from the satellite. As of Monday, Jan. 29, signals received from all five sites were consistent with the radio frequency characteristics expected of IMAGE.
In a press release, NASA said, “Specifically, the radio frequency showed a spike at the expected center frequency, as well as side bands where they should be for IMAGE. Oscillation of the signal was also consistent with the last known spin rate for IMAGE.”
“…the radio frequency showed a spike at the expected center frequency…” – NASA Press Release confirming the discovery of IMAGE
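NASA hasn’t published the exact numbers behind that check, but the idea is simple enough to sketch. Here’s a toy version in Python, with entirely hypothetical frequency and power values (IMAGE’s actual downlink frequency is not assumed here):

```python
# Toy spike detection: scan a (made-up) power spectrum for a peak
# near the frequency where the satellite should be transmitting.
frequencies_mhz = [2270.3, 2270.4, 2270.5, 2270.6, 2270.7]
power_db        = [-120.0, -118.5, -95.2, -118.9, -121.1]  # hypothetical readings

EXPECTED_CENTER_MHZ = 2270.5  # illustrative value only, not IMAGE's real downlink

# The strongest bin is the candidate spike; check it sits where expected.
peak_mhz = frequencies_mhz[power_db.index(max(power_db))]
is_consistent = abs(peak_mhz - EXPECTED_CENTER_MHZ) < 0.05
print(peak_mhz, is_consistent)  # 2270.5 True
```

A real confirmation, as the press release notes, also checks the sidebands and the signal’s oscillation against the satellite’s known spin rate.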
Then, on January 30, the Johns Hopkins Applied Physics Lab (JHUAPL) reported that they had successfully collected telemetry data from the satellite. In that signal was the ID code 166, the code for IMAGE. There were probably some pretty happy people at NASA.
So, now what?
NASA’s next step is to confirm without a doubt that this is indeed IMAGE. That means capturing and analyzing the data in the signal. That will be a technical challenge, because the types of hardware and operating systems used in the IMAGE Mission Operations Center no longer exist. According to NASA, “other systems have been updated several versions beyond what they were at the time, requiring significant reverse-engineering.” But that should be no problem for NASA. After all, they got Apollo 13 home safely, didn’t they?
If NASA is successful at decoding the data in the signal, the next step is to attempt to turn on IMAGE’s science payload. NASA has yet to decide how to proceed if they’re successful.
IMAGE was the first spacecraft designed to “see the invisible,” as they put it back then. Prior to IMAGE, spacecraft examined Earth’s magnetosphere by detecting particles and fields they encountered as they passed through them. But this method had limited success. The magnetosphere is enormous, and simply sampling a small path—while better than nothing—did not give us an accurate understanding of it.
IMAGE was going to do things differently. It used 3-dimensional imaging techniques to simultaneously measure the densities, energies, and masses of charged particles throughout the inner magnetosphere. To do this, IMAGE carried a payload of 7 instruments:
High Energy Neutral Atom (HENA) imager
Medium Energy Neutral Atom (MENA) imager
Low Energy Neutral Atom (LENA) imager
Extreme Ultraviolet (EUV) imager
Far Ultraviolet (FUV) imager
Radio Plasma Imager (RPI)
Central Instrument Data Processor (CIDP)
These instruments allowed IMAGE not only to do great science and capture great images, but also to create some stunning, never-before-seen movies of auroral activity.
This is a fascinating story, and it’ll be interesting to see if NASA can establish meaningful contact with IMAGE. Will it have a treasure trove of unexplored data on-board? Can it be re-booted and brought back into service? We’ll have to wait and see.
This story is also interesting culturally. IMAGE was in service at a time when the internet wasn’t as refined as it is currently. NASA has mastered the internet and public communications now, but back then? Not so much. For example, to build up interest around the mission, NASA gave IMAGE its own theme song, titled “To See The Invisible.” Yes, seriously.
But that’s just a side-note. IMAGE was all about great science, and it accomplished a lot. You can read all about IMAGE’s science achievements here.
Special Guest:
Andrzej Stewart currently works in Mission Control at the Johnson Space Center in Houston, Texas. However, from 2015-2016, Andrzej served as the Chief Engineering Officer during the year-long Hawaii Space Exploration Analog and Simulation (HI-SEAS) IV Mars simulation mission on Mauna Loa. Prior to that, he participated in NASA’s Human Exploration Research Analog (HERA) simulation, where he acted as the flight engineer.
Aside from his mission-simulation participation, Andrzej has extensive design and engineering experience within the space program having worked on projects such as Spitzer, NASA’s Deep Space Network, and the Orion spacecraft.
You can read about Andrzej’s time “on Mars” and learn more about him by visiting his blog, Surfing with the Aliens.
Announcements:
If you would like to join the Weekly Space Hangout Crew, visit their site here and sign up. They’re a great team who can help you join our online discussions!
We record the Weekly Space Hangout every Wednesday at 5:00 pm Pacific / 8:00 pm Eastern. You can watch us live on Universe Today, or the Weekly Space Hangout YouTube page – Please subscribe!
Geoscience researchers at Penn State University are finally figuring out what organic farmers have always known: digestive waste can help produce food. But whereas farmers here on Earth can let microbes in the soil turn waste into fertilizer, which can then be used to grow food crops, the Penn State researchers have to take a different route. They are trying to figure out how to let microbes turn waste directly into food.
There are many difficulties with long-duration space missions, or with lengthy missions to other worlds like Mars. One of the biggest is how to take enough food. Food for a crew of astronauts on a 6-month voyage to Mars, plus enough for a return trip, weighs a lot. And all that weight has to be lifted into space by expensive rockets.
Carrying enough food for a long voyage in space is problematic. Up until now, the solution for providing that food has been focused on growing it in hydroponic chambers and greenhouses. But that also takes lots of space, water, and energy. And time. It’s not really a solution.
“It’s faster than growing tomatoes or potatoes.” – Christopher House, Penn State Professor of Geosciences
What the researchers at Penn State, led by Professor of Geosciences Christopher House, are trying to develop is a method of turning waste directly into an edible, nutritious substance. Their aim is to cut out the middle man, as it were. And in this case, the middle men are the plants themselves: tomatoes, potatoes, and other fruits and vegetables.
“We envisioned and tested the concept of simultaneously treating astronauts’ waste with microbes while producing a biomass that is edible either directly or indirectly depending on safety concerns,” said Christopher House, professor of geosciences, Penn State. “It’s a little strange, but the concept would be a little bit like Marmite or Vegemite where you’re eating a smear of ‘microbial goo.'”
The Penn State team propose to use specific microorganisms to turn waste directly into edible biomass. And they’re making progress.
At the heart of their work are things called microbial reactors. Microbial reactors are basically vessels designed to maximize surface area for microbes to populate. These types of reactors are used to treat sewage here on Earth, but not to produce an edible biomass.
“It’s a little strange, but the concept would be a little bit like Marmite or Vegemite where you’re eating a smear of ‘microbial goo.'” – Christopher House, Penn State Professor of Geosciences
To test their ideas, the researchers constructed a cylindrical vessel four feet long by four inches in diameter. Inside it, they allowed select microorganisms to come into contact with human waste in controlled conditions. The process was anaerobic, and similar to what happens inside the human digestive tract. What they found was promising.
“Anaerobic digestion is something we use frequently on Earth for treating waste,” said House. “It’s an efficient way of getting mass treated and recycled. What was novel about our work was taking the nutrients out of that stream and intentionally putting them into a microbial reactor to grow food.”
One thing the team discovered is that the process readily produces methane. Methane is highly flammable, and therefore dangerous on a space mission, but it also has desirable properties for food production. It turns out that methane can be used to grow another microbe, Methylococcus capsulatus, which is used as an animal feed. The team’s conclusion is that the process could produce a nutritious food for astronauts that is 52 percent protein and 36 percent fats.
“We used materials from the commercial aquarium industry but adapted them for methane production.” – Christopher House, Penn State Professor of Geosciences
The process isn’t simple. The anaerobic process involved can produce pathogens very dangerous to people. To prevent that, the team studied ways to grow microbes in either an alkaline environment or a high-heat environment. After raising the system pH to 11, they found a strain of the bacteria Halomonas desiderata that thrived. Halomonas desiderata is 15 percent protein and 7 percent fats. They also cranked the system up to a pathogen-killing 158 degrees Fahrenheit, and found that the edible Thermus aquaticus grew, which is 61 percent protein and 16 percent fats.
Their system is based on modern aquarium systems, where microbes live on the surface of a filter film. The microbes take solid waste from the stream and convert it to fatty acids. Then, those fatty acids are converted to methane by other microbes on the same surface.
Speed is a factor in this system. Existing waste management treatment typically takes several days. The team’s system removed 49 to 59 percent of solids in 13 hours.
This system won’t be in space any time soon. The tests were conducted on individual components as a proof of feasibility; a complete, integrated system still has to be built. “Each component is quite robust and fast and breaks down waste quickly,” said House. “That’s why this might have potential for future space flight. It’s faster than growing tomatoes or potatoes.”
For almost two decades, NASA’s Earth Observatory has provided a constant stream of information about the Earth’s climate, water cycle, and meteorological patterns. This information has allowed scientists to track weather systems, map urban development and agriculture, and monitor for changes in the atmosphere. This has been especially important given the impact of Anthropogenic Climate Change.
Consider the animation recently released by the Earth Observatory, which shows how the city of Cape Town, South Africa, has been steadily depleting its supply of fresh water over the past few years. Based on multiple sources of data, this illustration and the images it is based on show how urbanization, over-consumption, and changes in weather patterns around Cape Town are leading to a water crisis.
The images that make up this animation are partly based on satellite data of Cape Town’s six major reservoirs, acquired between January 3rd, 2014, and January 14th, 2018. Of these six reservoirs, the largest is the Theewaterskloof Dam, which has a capacity of 480 billion liters (126.8 billion gallons) and accounts for about 41% of the water storage capacity available to Cape Town.
All told, these dams collectively store up to 898,000 megaliters (230 billion gallons) of water for Cape Town’s four million people. But according to data provided by NASA Earth Observatory, Landsat data from the U.S. Geological Survey, and water level data from South Africa’s Department of Water and Sanitation, these reservoirs have been seriously depleted thanks to an ongoing drought in the region.
As you can see from the images (and from the animation above), the reservoirs have been slowly shrinking over the past few years. The extent of the reservoirs is shown in blue while dry areas are represented in grey to show how much their water levels have changed. While the decrease is certainly concerning, what is especially surprising is how rapidly it has taken place.
In 2014, Theewaterskloof was near full capacity, and during the previous year, the weather station at Cape Town airport indicated that the region experienced more rainfall than it had seen in decades. Over 682 millimeters (27 inches) of rain was reported in total that year, whereas 515 mm (20.3 in) is considered to be a normal annual rainfall for the region.
However, the region began to experience a drought in 2015 as rainfall faltered to just 325 mm (12.8 in). The next year was even worse with 221 mm (8.7 in); and in 2017, the station recorded just 157 mm (6.2 in) of rain. As of January 29th, 2018, the six reservoirs were at just 26% of their total capacity and Theewaterskloof Dam was in the worst shape, with just 13% of its capacity.
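To put those figures in perspective, here’s a quick sketch expressing each year’s rainfall (from the numbers quoted above) as a percentage of the normal annual total:

```python
# Annual rainfall at the Cape Town airport weather station (mm),
# using the figures quoted above; 515 mm is considered normal.
NORMAL_MM = 515
rainfall_mm = {2013: 682, 2015: 325, 2016: 221, 2017: 157}

def pct_of_normal(mm):
    """Rainfall expressed as a rounded percentage of the normal annual total."""
    return round(100 * mm / NORMAL_MM)

for year, mm in sorted(rainfall_mm.items()):
    print(f"{year}: {mm} mm ({pct_of_normal(mm)}% of normal)")
# 2013 came in at roughly 132% of normal; by 2017 the region
# received only about 30% of its normal rainfall.
```

That slide from well above normal to less than a third of normal in four years is the core of the crisis.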
Naturally, this is rather dire news for Cape Town’s 4 million residents, and has led to some rather stark predictions. According to a recent statement made by the mayor of Cape Town, if current consumption patterns continue then the city’s disaster plan will have to be enacted. Known as Day Zero, this plan will go into effect when the city’s reservoirs reach 13.5% of capacity, and will result in water being turned off for all but hospitals and communal taps.
At this point, most people in the city will be left without tap water for drinking, bathing, or other uses and will be forced to procure water from some 200 collection points throughout the city. At present, Day Zero is expected to happen on April 12th, depending on weather patterns and consumption in the coming months.
Ordinarily, the rainy season lasts from May to September, and the implementation of Day Zero will depend on the level of rainfall. By the end of January, farmers will also stop drawing from the system for irrigation, meaning that water supplies prior to the rainy season could be stretched a little longer.
This is not the first time that Cape Town has been faced with the prospect of a Day Zero. Back in May of 2017, the city was declared a disaster area as the annual rainfall proved to be less than hoped for. This led to the province instituting the Disaster Management Act, which gives the provincial government the power to re-prioritize funding and enact conservation measures to preserve water in preparation for the dry season.
By the following September, Cape Town authorities released a series of guidelines for water usage that banned the use of all drinking water for non-essential purposes and urged people to use less than 87 liters (23 gallons) of water per person, per day. At the same time, authorities indicated that they were pursuing efforts to increase the supply of water by recycling, establishing new desalination facilities, and drilling for new sources of groundwater.
But with the drought going into its fourth year, there is once again fear that the water crisis is not going to end anytime soon. According to an analysis performed by Piotr Wolski, a hydrologist at the Climate Systems Analysis Group at the University of Cape Town, this sort of pattern is something that happens every 1000 years or so. This conclusion was based on rainfall patterns dating back to 1923.
However, population growth and a lack of new infrastructure in the region have made the current water crisis what it is. Between 1995 and 2018, the population of Cape Town grew by roughly 80% while the capacity of the region’s dams grew by just 15%. The current predicament has, however, accelerated plans to increase the water supply by creating new infrastructure and diverting water from the Berg River to the Voëlvlei Dam (now scheduled for completion by 2019).
For people living in many other parts of the world this story is a very familiar one. This includes California, which has been experiencing annual droughts since 2012; and southern India, which was hit by the worst drought in decades in 2016. All over the planet, growing populations and over-consumption are combining with shifting weather patterns and environmental impact to create a growing water crisis.
But as the saying goes, “necessity is the mother of invention”. And there’s nothing like an impending crisis to make people take stock of a problem and look for solutions!
The supertelescopes are coming, enormous ground and space-based observatories that’ll let us directly observe the atmospheres of distant worlds. We know there’s life on Earth, and our atmosphere tells the tale, so can we do the same thing with extrasolar planets? It turns out, coming up with a single biosignature, a chemical in the atmosphere that tells you that yes, absolutely, there’s life on that world, is really tough.
I’ve got to admit, I’ve been pretty bad for this in the past. In old episodes of Astronomy Cast and the Weekly Space Hangout, even here in the Guide to Space, I’ve said that if we could just sample the atmosphere of a distant world, we could say with conviction if there’s life there.
Just detect ozone in the atmosphere, or methane, or even pollution and you could say, “there’s life there.” Well, future Fraser is here to correct past Fraser. While I admire his naive enthusiasm for the search for aliens, it turns out, as always, things are going to be more difficult than we previously thought.
Astrobiologists are actually struggling to figure out a single smoking gun biosignature that could be used to say there’s life out there. And that’s because natural processes seem to have clever ways of fooling us.
What are some potential biosignatures, why are they problematic, and what will it take to get that confirmation?
Let’s start with a world close to home: Mars.
For almost two decades, astronomers have detected large clouds of methane in the atmosphere of Mars. Here on Earth, methane comes from living creatures, like bacteria and farting cows. Furthermore, methane is easily broken down by sunlight, which means this isn’t ancient methane left over from billions of years ago. Some process on Mars is constantly replenishing it.
But what?
Well, in addition to life, methane can form naturally through volcanism, when rocks interact with heated water.
NASA tried to get to the bottom of this question with the Spirit and Opportunity rovers, and it was expected that Curiosity should have the tools on board to find the source of the methane.
Over the course of several months, Curiosity did detect a boost of methane down there on the surface, but even that has led to a controversy. It turns out the rover itself was carrying methane, and could have contaminated the area around itself. Perhaps the methane it detected came from itself. It’s also possible that a rocky meteorite fell nearby and released some gas that contaminated the results.
The European Space Agency’s ExoMars mission arrived at Mars in October 2016. Although the Schiaparelli lander was destroyed, the Trace Gas Orbiter survived the journey and began mapping the atmosphere of Mars in great detail, searching for places that could be venting methane. So far, we don’t have conclusive results.
In other words, we’ve got a fleet of orbiters and landers at Mars, equipped with instruments designed to sniff out the faintest whiff of methane on Mars.
There are some really intriguing hints that methane levels on Mars rise and fall with the seasons, possibly indicating life, but astrobiologists still don’t agree.
Extraordinary claims require extraordinary evidence and all that.
Some telescopes can already measure the atmospheres of planets orbiting other stars. For the last decade, NASA’s Spitzer Space Telescope has been mapping out the atmospheres of various worlds. For example, here’s a map of the hot Jupiter HD 189733b. The place sucks, but wow, to measure an atmosphere, of another planet, that’s pretty spectacular.
They perform this feat by measuring the chemicals of the star while the planet is passing in front of it, and then measure it when there’s no planet. That tells you what chemicals the planet is bringing to the party.
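As a toy illustration of that subtraction (with entirely made-up flux values, not real measurements of any star), compare the in-transit and out-of-transit spectra; wavelengths where the transit looks deeper are wavelengths the planet’s atmosphere is absorbing:

```python
# Hypothetical example: stellar flux at four near-infrared wavelengths
# (arbitrary units), measured out of transit and during transit.
wavelengths_um = [1.1, 1.4, 1.6, 2.0]
out_of_transit = [1.0000, 1.0000, 1.0000, 1.0000]
in_transit     = [0.9902, 0.9897, 0.9903, 0.9901]

# Transit depth: the fraction of starlight the planet (plus its
# atmosphere) blocks at each wavelength.
depth = [1.0 - i / o for i, o in zip(in_transit, out_of_transit)]

# In this toy example the planet looks slightly "bigger" at 1.4 microns,
# a band where water vapor absorbs -- that excess depth is the signal.
deepest_um = wavelengths_um[depth.index(max(depth))]
print(deepest_um)  # 1.4
```

Real analyses work with noisy photometry and careful error bars, but the principle is exactly this wavelength-by-wavelength comparison.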
They also were able to measure the atmosphere of HAT-P-26b, which is a relatively small Neptune-sized world orbiting a nearby star, and were surprised to find water vapor in the atmosphere of the planet.
Does that mean there’s life? Wherever we find water on Earth we find life. Nope, you can totally get water without having life.
When it launches in 2019, NASA’s James Webb Space Telescope is going to take this atmospheric sensing to the next level, allowing astronomers to study the atmospheres of many more worlds with a much higher resolution.
One of the first targets for Webb will be the TRAPPIST-1 system with its half-dozen planets orbiting in the habitable zone of a red dwarf star. Webb should be able to detect ozone, methane, and other potential biosignatures for life.
So what will it take to be able to view a distant world and know for sure there’s life there?
Astrobiologist John Lee Grenfell from the German Aerospace Centre recently compiled a report going through all the exoplanetary biosignatures that could be out there, reviewing how likely each is to be an indication of life on another world.
The first target will be molecular oxygen, or O2. You’re breathing it right now. Well, 21% of every breath, anyway. Oxygen will last in the atmosphere of another world for thousands of years without a source.
It’s produced here on Earth by photosynthesis, but if a world is being battered by its star, and losing atmosphere, then the hydrogen is blown off into space, and molecular oxygen can remain. In other words, you can’t be certain either way.
How about ozone, aka O3? O2 is converted into O3 through a chemical process in the atmosphere. It sounds like a good candidate, but the problem is that there are natural processes that can produce ozone too. There’s an ozone layer on Venus, one on Mars, and they’ve even been detected around icy moons in the Solar System.
There’s nitrous oxide, also known as laughing gas. It’s produced by bacteria in the soil and helps contribute to the Earth’s nitrogen cycle. And there’s good news: Earth seems to be the only world in the Solar System that has nitrous oxide in its atmosphere.
But scientists have also developed models for how this chemical could have been generated in the Earth’s early history when its sulfur-rich ocean interacted with nitrogen on the planet. In fact, both Venus and Mars could have gone through a similar cycle.
In other words, you might be seeing life, or you might be seeing a young planet.
Then there’s methane, the chemical we spent so much time talking about. And as I mentioned, there’s methane produced by life here on Earth, but it’s also on Mars, and there are liquid oceans of methane on Titan.
Astrobiologists have suggested other hydrocarbons, like ethane and isoprene, but these have their own problems too.
What about the pollutants emitted by advanced civilizations? Astrobiologists call these “technosignatures”, and they could include things like chlorofluorocarbons, or nuclear fallout. But again, these chemicals would be hard to detect light years away.
Astronomers have suggested that we should search for dead earths, just to set a baseline. These would be worlds located in the habitable zone, but clearly life never got going. Just rock, water and a non-biologically created atmosphere.
The problem is that we probably can’t figure out a way to confirm that a world is dead, either. The kinds of chemicals you’d expect to see in the atmosphere, like carbon dioxide, could be absorbed by oceans, so you can’t even make a negative confirmation.
One method might not even involve scanning atmospheres at all. The vegetation here on Earth reflects back a very specific wavelength of light in the 700-750 nanometer region. Astrobiologists call this the “red edge”, because you’ll see a 5X increase in reflectivity compared to other surfaces.
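A rough sketch of that contrast, using illustrative reflectance values (not real measurements) chosen to match the roughly 5x jump described above:

```python
# Illustrative reflectance fractions for vegetation, not real data:
# chlorophyll absorbs strongly in visible red, but leaves reflect
# strongly just past ~700 nm -- the "red edge".
red_reflectance = 0.08   # ~670 nm (visible red)
nir_reflectance = 0.40   # ~750 nm (just past the red edge)

# The jump across the edge: roughly the 5x increase mentioned above.
edge_ratio = nir_reflectance / red_reflectance

# Earth remote sensing often captures the same contrast with NDVI,
# the normalized difference vegetation index.
ndvi = (nir_reflectance - red_reflectance) / (nir_reflectance + red_reflectance)
print(round(edge_ratio, 2), round(ndvi, 2))  # 5.0 0.67
```

A bare rock or ocean surface shows no such jump, which is why a red-edge-like feature in a planet’s reflected light would be such a tantalizing hint.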
Although we don’t have the telescopes to do this today, there are some really clever ideas, like looking at how the light from a planet reflects onto a nearby moon, and analyzing that: searching for exoplanet earthshine.
In fact, back in the Earth’s early history, it would have looked more purple because of Archaean bacteria.
There’s a whole fleet of spacecraft and ground observatories coming online that’ll help us push further into this question.
ESA’s Gaia mission is going to map and characterize 1% of the stars in the Milky Way, telling us what kinds of stars are out there, as well as detect thousands of planets for further observation.
The Transiting Exoplanet Survey Satellite, or TESS, launches in 2018, and will find all the transiting Earth-sized and larger exoplanets in our neighborhood.
The PLATO 2 mission will find rocky worlds in the habitable zone, and James Webb will be able to study their atmospheres. We also talked about the massive LUVOIR telescope that could come online in the 2030s, and take these observations to the next level.
And there are many more space and ground-based observatories in the works.
As this next round of telescopes comes online, the ones capable of directly measuring the atmosphere of an Earth-sized world orbiting another star, astrobiologists are going to struggle to find a biosignature that provides a clear sign there’s life there.
Instead of certainty, it looks like we’re going to have the same struggle to make sense of what we’re seeing. Astronomers will be disagreeing with each other, developing new techniques and new instruments to answer unsolved questions.
It’s going to take a while, and the uncertainty is going to be tough to handle. But remember, this is probably the most important scientific question that anyone can ask: are we alone in the Universe?
In December of 2013, the European Space Agency’s Gaia mission took to space. Since that time, this space observatory has been studying a billion astronomical objects – including stars, planets, comets, asteroids and galaxies – for the sake of creating the most precise 3D space catalog ever made. By the time the mission wraps up (later this year, barring extensions), it is expected to reveal some truly amazing things about our Universe.
In fact, with the first release of its data, the Gaia probe revealed something that had gone completely unnoticed until now. While viewing Sirius, the brightest star in the night sky, Gaia revealed a stellar cluster that had previously been obscured by Sirius’ bright light. This cluster – now known as the Gaia 1 Cluster – can be seen by the public thanks to a picture taken by an amateur astronomer from Germany.
Given its brightness and the fact that it is visible from just about anywhere on the planet, Sirius has been known since antiquity, and was featured prominently in the astrological and astronomical traditions of many cultures. To the ancient Egyptians, the star was used to keep track of time and agriculture, since its return to the sky was linked to the annual flooding of the Nile.
In Ancient Greek mythology, Sirius represented the eye of Canis Major, the Great Dog that, along with Canis Minor, diligently followed Orion, the Hunter. In Chinese astronomy, the star is known as the star of the “celestial wolf” and lies in the Mansion of Jing. And when Ptolemy created his influential astronomical tract in the 2nd century CE (the Almagest), he used Sirius as the location for the globe’s central meridian.
By the mid-19th century, astronomers had determined that Sirius is actually a binary star system. Essentially, the system consists of a main sequence star of roughly two Solar masses and a white dwarf that is slightly more massive than our Sun. Sirius’ bright appearance means that astronomers have had plenty of light to study the star’s properties, but it also causes it to outshine other celestial objects in its vicinity.
However, in the course of counting the stars around Sirius, Gaia’s sophisticated instruments managed to detect the Gaia 1 Cluster for the first time. News of both this cluster and another newly-discovered one (the Gaia 2 Cluster) became public after the first release of Gaia data, which took place in September 2016. News of this discovery sent ripples through the astronomical community and has led to much research into this cluster and its companion.
News of the discovery also prompted attempts to visually capture the cluster. Roughly a year ago, Harald Kaiser – an amateur astronomer from Karlsruhe, Germany – attended a public talk about the Gaia mission, where he learned about the Gaia 1 Cluster being spotted near Sirius. Kaiser then eagerly waited for the next clear night so he could find the cluster himself using his 30 cm telescope.
After snapping a picture of Sirius and correcting for its bright glare, he was able to capture some of the brightest stars in the cluster. As you can see from the image he took (at top), the cluster lies slightly to the left of Sirius and shows a smattering of its largest and brightest stars. In addition to revealing the location of this cluster, Kaiser’s work is part of a broader push to capitalize on the Gaia mission’s progress.
According to a study released in February of last year – led by Sergey Koposov of Carnegie Mellon University – Gaia 1 is a particularly massive cluster. It weighs in at an impressive 22,000 Solar masses, is about 29 light-years (9 parsecs) in diameter, and is located 15,000 light-years (4.6 kiloparsecs) from Earth. In addition to its size and the fact that it was previously undiscovered, its relative proximity also makes it an opportune target for future research.
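As a quick sanity check on those paired figures, here is a minimal sketch of the parsec-to-light-year conversions; the only assumption is the standard factor of roughly 3.26 light-years per parsec:

```python
# Unit check for the Gaia 1 figures quoted above.
# Assumes the standard conversion 1 parsec ~= 3.262 light-years.
LY_PER_PARSEC = 3.262

def parsecs_to_light_years(pc: float) -> float:
    """Convert a distance in parsecs to light-years."""
    return pc * LY_PER_PARSEC

diameter_ly = parsecs_to_light_years(9)       # cluster diameter: ~29 ly
distance_ly = parsecs_to_light_years(4600)    # distance: ~15,000 ly

print(f"Diameter: {diameter_ly:.0f} light-years")
print(f"Distance: {distance_ly:.0f} light-years")
```

Both results line up with the values quoted in the study.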
The announcement of this cluster has also caused a fair degree of excitement in the scientific community since it validates the capabilities of Gaia and serves as an example of the kinds of things it is expected to reveal. Astronomers are now looking forward to Gaia’s second data release (planned for April 25th) which is expected to provide even more possibilities for new and exciting discoveries.
And be sure to check out this video about the Gaia mission, courtesy of the ESA:
A Japanese telescope has produced our most detailed radio wave image yet of the Milky Way galaxy. Over a three-year period, the Nobeyama 45-meter telescope observed the Milky Way for 1,100 hours to produce the map. The image is part of a project called FUGIN (FOREST Unbiased Galactic plane Imaging survey with the Nobeyama 45-m telescope). The multi-institutional research group behind FUGIN explained the project in the Publications of the Astronomical Society of Japan and at arXiv.
The Nobeyama 45 meter telescope is located at the Nobeyama Radio Observatory, near Minamimaki, Japan. The telescope has been in operation there since 1982, and has made many contributions to millimeter-wave radio astronomy in its life. This map was made using the new FOREST receiver installed on the telescope.
When we look up at the Milky Way, we see an abundance of stars, gas, and dust. But there are also dark spots, which look like voids. They’re not voids, though; they’re cold clouds of molecular gas that don’t emit visible light. Seeing what’s happening inside these dark clouds requires radio telescopes like the Nobeyama.
The Nobeyama was the largest millimeter-wave radio telescope in the world when it began operation, and it has always had great resolution. But the new FOREST receiver has improved the telescope’s spatial resolution ten-fold. The increased power of the new receiver allowed astronomers to create this new map.
The new map covers an area of the night sky as wide as 520 full Moons. The detail of this new map will allow astronomers to study both large-scale and small-scale structures in new detail. FUGIN will provide new data on large structures like the spiral arms—and even the entire Milky Way itself—down to smaller structures like individual molecular cloud cores.
FUGIN is one of the legacy projects for the Nobeyama. These projects are designed to collect fundamental data for next-generation studies. To collect this data, FUGIN observed an area covering 130 square degrees, which is over 80% of the area between galactic latitudes -1 and +1 degrees and galactic longitudes from 10 to 50 degrees and from 198 to 236 degrees. Basically, the map tried to cover the 1st and 3rd quadrants of the galaxy, to capture the spiral arms, bar structure, and the molecular gas ring.
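The “over 80%” figure above can be checked with some quick arithmetic: the two longitude strips together span 78 degrees, each 2 degrees tall in latitude, for about 156 square degrees in total:

```python
# Rough check of the FUGIN coverage figure quoted above.
strip_1 = 50 - 10      # degrees of galactic longitude, 1st quadrant strip
strip_2 = 236 - 198    # degrees of galactic longitude, 3rd quadrant strip
lat_height = 1 - (-1)  # degrees of galactic latitude (-1 to +1)

total_area = (strip_1 + strip_2) * lat_height  # 156 square degrees
observed = 130                                 # square degrees actually mapped

print(f"Coverage: {observed / total_area:.0%}")  # ~83%, i.e. "over 80%"
```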
The aim of FUGIN is to investigate the physical properties of diffuse and dense molecular gas in the galaxy. It does this by simultaneously gathering data on three carbon monoxide isotopologues: 12CO, 13CO, and C18O. Researchers were able to study the distribution and motion of the gas, as well as physical characteristics like temperature and density. And the survey has already paid off.
FUGIN has already revealed things previously hidden. They include entangled filaments that weren’t obvious in previous surveys, as well as both wide-field and detailed structures of molecular clouds. Large scale kinematics of molecular gas such as spiral arms were also observed.
But the main purpose is to provide a rich data-set for future work by other telescopes. These include other radio telescopes like ALMA, but also telescopes operating in the infrared and other wavelengths. This will begin once the FUGIN data is released in June 2018.
Millimeter-wave radio astronomy is powerful because it can “see” things in space that other telescopes can’t. It’s especially useful for studying the large, cold gas clouds where stars form. These clouds are as cold as -262 °C (-440 °F). At temperatures that low, optical scopes can’t see them unless a bright star is shining behind them.
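For reference, the Celsius-to-Fahrenheit conversion behind those two figures is straightforward:

```python
def celsius_to_fahrenheit(c: float) -> float:
    """Standard Celsius-to-Fahrenheit conversion: F = C * 9/5 + 32."""
    return c * 9 / 5 + 32

# The cloud temperature quoted above: -262 C works out to about -440 F.
print(f"{celsius_to_fahrenheit(-262):.1f} F")  # -439.6 F
```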
Even at these extremely low temperatures, there are chemical reactions occurring. This produces molecules like carbon monoxide, which was a focus of the FUGIN project, but also others like formaldehyde, ethyl alcohol, and methyl alcohol. These molecules emit radio waves in the millimeter range, which radio telescopes like the Nobeyama can detect.
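To see why these emissions land in the millimeter band, one can convert the J=1–0 rest frequencies of the CO isotopologues surveyed by FUGIN (standard spectroscopic values, around 110–115 GHz) into wavelengths:

```python
# Wavelengths of the CO J=1-0 lines targeted by FUGIN.
# Rest frequencies are the standard spectroscopic values in GHz.
C = 299_792_458  # speed of light, m/s

rest_freq_ghz = {
    "12CO": 115.271,
    "13CO": 110.201,
    "C18O": 109.782,
}

for line, ghz in rest_freq_ghz.items():
    wavelength_mm = C / (ghz * 1e9) * 1e3  # wavelength = c / frequency
    print(f"{line}: {wavelength_mm:.2f} mm")
```

All three lines come out near 2.6–2.7 mm, squarely in the range that instruments like the Nobeyama’s FOREST receiver are built to detect.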
The top-level purpose of the FUGIN project, according to the team behind the project, is to “provide crucial information about the transition from atomic gas to molecular gas, formation of molecular clouds and dense gas, interaction between star-forming regions and interstellar gas, and so on. We will also investigate the variation of physical properties and internal structures of molecular clouds in various environments, such as arm/interarm and bar, and evolutionary stage, for example, measured by star-forming activity.”
This new map from the Nobeyama holds a lot of promise. A rich data-set like this will be an important piece of the galactic puzzle for years to come. The details revealed in the map will help astronomers tease out more detail on the structures of gas clouds, how they interact with other structures, and how stars form from these clouds.