Exoplanet-Hunting TESS Satellite to be Launched by SpaceX

A conceptual image of the Transiting Exoplanet Survey Satellite. Image Credit: MIT

The search for exoplanets is heating up, thanks to the deployment of space telescopes like Kepler and the development of new observation methods. In fact, over 1800 exoplanets have been discovered since the 1980s, with 850 discovered just last year. That’s quite the rate of progress, and Earth’s scientists have no intention of slowing down!

Hot on the heels of the Kepler mission and the ESA’s deployment of the Gaia space observatory last year, NASA is getting ready to launch TESS (the Transiting Exoplanet Survey Satellite). And to provide the launch services, NASA has turned to one of its favorite commercial space service providers – SpaceX.

The launch will take place in August 2017 from the Cape Canaveral Air Force Station in Florida, where the satellite will be placed aboard a Falcon 9 v1.1 – the heavier version of the rocket introduced in 2013. Although NASA has contracted SpaceX to perform multiple cargo deliveries to the International Space Station, this will be only the second time that SpaceX has assisted the agency with the launch of a science satellite.

This past September, NASA also signed a lucrative contract with SpaceX worth $2.6 billion to fly astronauts and cargo to the International Space Station. As part of the Commercial Crew Program, SpaceX’s Falcon 9 and Dragon spacecraft were selected by NASA to help restore indigenous launch capability to the US.

Artist’s impression of the James Webb Space Telescope, the space observatory scheduled for launch in 2018. Image Credit: NASA/JPL

The total cost for TESS is estimated at approximately $87 million, which will include launch services, payload integration, and tracking and maintenance of the spacecraft throughout the course of its three-year mission.

As for the mission itself, that has been the focus of attention for many years. Since it was deployed in 2009, the Kepler spacecraft has yielded more and more data on distant planets, many of which are Earth-like and potentially habitable. But in 2013, two of Kepler’s four reaction wheels failed, and the telescope lost its ability to point precisely toward stars. Even though it is now conducting a modified exoplanet-hunting mission, NASA and exoplanet enthusiasts have been excited by the prospect of sending up another exoplanet hunter, one that is even more ideally suited to the task.

Once deployed, TESS will spend the next three years scanning the nearest and brightest stars in our galaxy, looking for possible signs of transiting exoplanets. This will involve scanning nearby stars for what is known as a “light curve”, a phenomenon where the visual brightness of a star drops slightly due to the passage of a planet between the star and its observer.

By measuring the degree to which a star dims, scientists are able to estimate the size of the planet passing in front of it. Combined with measurements of the star’s radial velocity, they are also able to determine the density and physical structure of the planet. Though the method has some drawbacks – most notably that planets only rarely pass directly between their host stars and the observer – it remains the most effective means of detecting exoplanets to date.
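
To make the arithmetic concrete, here is a minimal sketch in Python of the relations just described. The Sun-and-Earth numbers are illustrative assumptions, not values from any particular TESS target.

```python
import math

R_SUN = 6.957e8     # stellar radius, m (Sun)
R_EARTH = 6.371e6   # planetary radius, m (Earth)
M_EARTH = 5.972e24  # planetary mass, kg (Earth)

def transit_depth(r_planet, r_star):
    """Fractional dip in brightness: the planet blocks a disk of area
    pi*Rp^2 out of the star's disk of area pi*Rs^2."""
    return (r_planet / r_star) ** 2

def planet_radius(depth, r_star):
    """Invert the relation: recover the planet's radius from the dip."""
    return r_star * math.sqrt(depth)

def bulk_density(mass, radius):
    """With a mass from radial-velocity follow-up, size gives density."""
    return mass / ((4.0 / 3.0) * math.pi * radius ** 3)

depth = transit_depth(R_EARTH, R_SUN)
print(f"Earth transiting the Sun dims it by {depth:.1e} (~84 ppm)")
print(f"Recovered radius: {planet_radius(depth, R_SUN) / R_EARTH:.2f} Earth radii")
print(f"Density, given an Earth mass: {bulk_density(M_EARTH, R_EARTH):.0f} kg/m^3")
```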

Number of extrasolar planet discoveries per year through September 2014, with colors indicating method of detection. Blue: radial velocity; green: transit; yellow: timing; red: direct imaging; orange: microlensing. Image Credit: Alderon/Wikimedia Commons

In fact, as of 2014, the transit method had become the most widely used means of detecting exoplanets beyond our Solar System. Compared to other methods – such as measuring a star’s radial velocity, direct imaging, the timing method, and microlensing – more planets have been detected using the transit method than all the others combined.

In addition to being able to spot planets by the comparatively simple method of measuring their light curve, the transit method also makes it possible to study the atmosphere of a transiting planet. Combined with the technique of measuring the parent star’s radial velocity, scientists are also able to measure a planet’s mass, density, and physical characteristics.

With TESS, it will be possible to study the mass, size, density and orbit of exoplanets. In the course of its three-year mission, TESS will be looking specifically for Earth-like and super-Earth candidates that exist within their parent star’s habitable zone.

This information will then be passed on to Earth-based telescopes and the James Webb Space Telescope – which will be launched in 2018 by NASA with assistance from the European and Canadian Space Agencies – for detailed characterization.

The TESS Mission is led by the Massachusetts Institute of Technology – which developed it with seed funding from Google – and is overseen by the Explorers Program at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.

Further Reading: NASA, SpaceX


Rogue Star HIP 85605 on Collision Course with our Solar System, but Earthlings Need Not Worry

Collisions of neutron stars produce powerful gamma-ray bursts – and heavy elements like gold (Credit: Dana Berry, SkyWorks Digital, Inc.)

It’s known as HIP 85605, one of two stars that make up a binary in the Hercules constellation roughly 16 light years away. And if a recent research paper produced by Dr. Coryn Bailer-Jones of the Max Planck Institute for Astronomy in Heidelberg, Germany is correct, it is on a collision course with our Solar System.

Now for the good news: according to Bailer-Jones’ calculations, the star will pass by our Solar System at a distance of 0.04 parsecs, which is equivalent to roughly 8,000 times the distance between the Earth and the Sun (8,000 AU). In addition, this passage will not affect the orbit of Earth or any other planet around the Sun. And perhaps most importantly of all, none of it will happen for another 240,000 to 470,000 years.

“Even though the galaxy contains very many stars,” Bailer-Jones told Universe Today via email, “the spaces between them are huge. So even over the (long) life of our galaxy so far, the probability of any two stars having actually collided — as opposed to just coming close — is extremely small.”

However, in astronomical terms, that still counts as a near-miss. In a universe whose observable portion extends some 46 billion light years in every direction, an event expected to take place just 50 light days away is considered pretty close. And in the context of space and time, a quarter of a million to half a million years is the very near future.
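
A quick check of the distances quoted above, using the standard conversion factors (1 parsec = 206,265 AU = 3.26 light years). The 0.04 pc figure is from Bailer-Jones’ paper; everything else is arithmetic:

```python
PC_TO_AU = 206265.0  # astronomical units per parsec
PC_TO_LY = 3.2616    # light years per parsec

closest_approach_pc = 0.04
print(f"{closest_approach_pc} pc = {closest_approach_pc * PC_TO_AU:,.0f} AU")
print(f"{closest_approach_pc} pc = "
      f"{closest_approach_pc * PC_TO_LY * 365.25:.0f} light days")
```

This reproduces both the “roughly 8,000 AU” and the “50 light days” figures.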

The real concern is the effect that the passage of HIP 85605 could have on the Oort Cloud – the massive cloud of icy planetesimals that surrounds the Solar System. Given that the Oort Cloud extends between 20,000 and 50,000 AU from our Sun, HIP 85605 would pass well within it and could cause serious disruption.

The layout of the Solar System, including the Oort Cloud, which lies 50,000 AU from our Sun. Credit: NASA

Many of these planetesimals could be blown off into space, but others could be sent hurtling towards Earth. Assuming humanity is still around at this point in time, this could present a bit of an inconvenience, even if it is spread over the course of a million years.

As it stands, such “close encounters” between stars are quite rare, and outright stellar collisions – typically involving white dwarfs or neutron stars – are rarer still. “The exception to this is physically bound binary stars in a tight orbit,” said Bailer-Jones. “It can and does happen that one star expands during its evolution and will then interfere with the evolution of the other star. Neutron-neutron star pairs can even merge.”

But of course, on an astronomical timescale, stars passing each other by as they perform their cosmic dance is actually a pretty common occurrence. As part of Bailer-Jones’ larger study of over 50,000 stars within our galaxy, this “close encounter” is one of several predicted to take place in the ages to come.

Of all of them, only HIP 85605 is expected to come within a single parsec, between 240 and 470 thousand years from now. He also indicates (with 90% confidence) that the last time such an encounter took place was 3.8 million years ago, when gamma Microscopii – a G7 giant with two and a half times the mass of our Sun – came within 0.35-1.34 pc of our system, which may have caused a large perturbation in the Oort Cloud.

Tightly bound binary stars, like the ones illustrated here, sometimes result in stellar collisions. Credit: Chandra

On his MPIA webpage, in the study’s FAQ section, Bailer-Jones claims that his research into stellar close encounters was motivated by a desire to study the potential impacts of astronomical phenomena on Earth, and is part of a larger program named “astroimpacts”.

“I am interested in the history of the Earth,” he says, “and astronomical phenomena have clearly played a role in this. But what role precisely, how significant, and what can we expect to happen in the future?” Whereas several such studies have been conducted in the past, he feels that their methods – which include assuming a linear relative motion of stars – produce inaccurate results.

In contrast, Bailer-Jones’ study relies on “more recent data or re-analyses of data to produce hopefully more accurate results, and then compensate more rigorously for the uncertainties in the data, so that I can attach probabilities to my statements.”
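
For context, the linear-motion approximation that those earlier studies relied on is easy to sketch: treat the star as moving in a straight line relative to the Sun and solve for its closest approach. A minimal version in Python follows; the example star is purely hypothetical.

```python
import numpy as np

def linear_closest_approach(r0_pc, v_kms):
    """Closest approach of a star assumed to move on a straight line
    relative to the Sun.
    r0_pc : current position vector of the star, in parsecs.
    v_kms : its velocity vector, in km/s.
    Returns (minimum distance in pc, time of closest approach in years)."""
    PC_IN_KM = 3.0857e13       # kilometres per parsec
    SEC_PER_YEAR = 3.156e7
    r0 = np.asarray(r0_pc, dtype=float)
    v = np.asarray(v_kms, dtype=float) / PC_IN_KM  # now in pc per second
    t_min = -np.dot(r0, v) / np.dot(v, v)          # seconds to perihelion
    d_min = np.linalg.norm(r0 + v * t_min)         # pc
    return d_min, t_min / SEC_PER_YEAR

# A hypothetical star 5 pc away, closing at ~30 km/s with a small
# sideways drift:
d, t = linear_closest_approach([5.0, 0.0, 0.0], [-29.0, 3.0, 0.0])
print(f"closest approach: {d:.2f} pc, in {t / 1e6:.2f} million years")
```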

As a result, he predicts that HIP 85605 has a 90% chance of passing within a single parsec of our Sun in the next 240 to 470 thousand years. However, he also admits that if the astrometry for this star is incorrect, the next closest encounter won’t be happening for another 1.3 million years, when a K7 dwarf known as GL 710 is predicted to pass within 0.10 – 0.44 parsecs.

Bailer-Jones also believes that the European Space Agency’s Gaia spacecraft will help make more accurate predictions in the future. By understanding and mapping the environment of the Milky Way Galaxy, measuring the gravitational potential and determining the velocity of stars, scientists will be able to see how their various orbits around the galaxy’s center could cause them to intersect.

Passing stars likely host systems of exoplanets (like Kepler-186f, pictured here), which close encounters would bring within a few parsecs of Earth. Image Credit: NASA Ames/SETI Institute/JPL-CalTech

But perhaps the most interesting question explored on his webpage is the possibility of using stellar close encounters as a shortcut for exploring exoplanets. According to current cosmological models, the majority of stars within our galaxy are believed to host exoplanets.

So if a star is passing us at just a few parsecs (or even within a single parsec), why not hop on over and investigate its planets? Well, as Bailer-Jones indicates, that’s not really a practical idea: “Traveling to a star passing our solar system at a distance of around 1 pc with a relative speed of 30 km/s is no easier than traveling to the nearby stars (the nearest of which is just over 1 pc away). And we would have to wait 10s of thousands of years for the next encounter. If we can ever achieve interstellar travel, I don’t suppose it would take that long to achieve, so why wait?”

Darn. Still, if there’s one thing this phenomenon and Bailer-Jones’ study remind us of, it is that in the course of dancing around the center of the Milky Way, stars are not fixed at a single point in space. Not only do they periodically move within reach of each other, they can also have an effect on any life in the systems they encounter.

Alas, the timescale on which such things happen, not to mention the consequences they entail, are so large that people here on Earth need not worry. By the time HIP 85605 or GL 710 come within a parsec or two of us, we’ll either be long-since dead or too highly evolved to care!

*Update: According to a new study posted to arXiv by Erick E. Mamajek and associates, the passage of the recently-discovered low-mass star W0720 (aka. “Scholz’s Star”) – roughly 70,000 years ago, at a distance of 0.25 parsecs from our Sun – was the closest known encounter our Solar System has had with another star. They calculate the probability that it penetrated the Solar System’s outer Oort Cloud at 98%. However, they also estimate that the impact it would have had on the flux of long-period comets was negligible, while noting that the passage highlights how “dynamically important Oort Cloud perturbers may be lurking among nearby stars”.

Having read the study, Bailer-Jones notes on the updated FAQ section of his MPIA webpage that their analysis appears to be correct. Based on the assumption that the star was moving at a constant velocity relative to the Sun prior to the encounter, he agrees that the calculations of the distance and timing of the passage are valid. While his own study identified a possibly closer encounter (HIP 85605), he reiterates that the data on that star are of poor quality. Another close encounter in his study involved HIP 89825, but there the approach distance is estimated to have been 0.02 parsecs larger. Hence, W0720 can be said, with some degree of certainty, to have been the closest encounter found so far.

The study appeared on Feb. 16th at arXiv Astrophysics.

Further Reading: arXiv Astrophysics, Max Planck Institute of Astronomy

2015 Expected to be a Record-Breaking Year for Soyuz-2 Workhorse

A Soyuz-2 rocket lifts off from Kourou on April 3, 2014, with Sentinel-1A satellite. Credit: ESA

2014 was a banner year for the Russian Space Agency, with a record-setting fourteen launches of the next-generation unmanned Soyuz-2 rocket. A number of other firsts took place over the course of the year as well, cementing the Soyuz family as the most-flown and most reliable group of rockets ever.

But already it seems the new year will be even better, with a full 20 missions scheduled to take place, a number of them holdovers from 2014.

The Soyuz-2 launcher currently operates alongside the Soyuz-U (mainly used for launching the unmanned Progress resupply spacecraft to the International Space Station) and the Soyuz-FG (primarily used for crewed flights of the Soyuz spacecraft to the ISS), but according to Spaceflight 101, the Soyuz-2 will eventually replace both vehicles once they are phased out.

In fact, in October of 2014, the Soyuz-2 had its first launch of a Progress cargo spacecraft. Another first: the last two launches of the year were conducted without the aid of DM blocks – a derivative of the Blok D upper stage developed during the 1960s.

As Leonid Shalimov, the CEO of NPO Avtomatiki (the Russian electronic engineering and research organization), told the government-owned Russian news agency TASS: “Fourteen launches of Soyuz-2 were carried out in 2014 – a record number in the company history. Meanwhile, a total of 19 launches were planned in the outgoing year; five have been postponed till 2015.”

Soyuz-2 rocket preparing to launch from the Plesetsk Cosmodrome in June, 2013. Image Credit: Russian Space News

As a leader in the development of radio-electronic equipment and rocket space systems, the company is behind a number of automated and integrated control systems that are used in space, at sea, in heavy industry, and by oil and natural gas companies.

However, it is arguably the company’s work on the Soyuz-2 that has earned it the most attention. The general designation for the newest version of the rocket, the Soyuz-2 is essentially a three-stage carrier rocket that will be used to transport crews and supplies into Low Earth Orbit (LEO).

Compared to previous generations of the rocket, the Soyuz-2 features updated engines with improved injection systems on the first-stage boosters, as well as the two core engine stages.

Unlike previous incarnations, the Soyuz-2 can also be launched from a fixed launch platform, since it is capable of performing a roll in flight to change its heading. The old analog control systems have also been upgraded with new digital flight control and telemetry systems that can adapt to changing conditions in mid-flight.

The Advanced Crew Transportation System, a next-generation reusable craft intended for a Russian lunar mission in 2028. Credit: TASS

In total, some 42 launches of this rocket have taken place over the past decade, the first on November 8th, 2004, from the Plesetsk Cosmodrome – located about 200 km outside of Arkhangelsk.

The majority of launches were for the sake of deploying weather, observation and communication satellites.

You can see a full list of Soyuz launches and missions scheduled for 2015 at RussianSpaceWeb.

Long-term, the Soyuz-2 is also expected to play a key role in Russia’s plan for a manned lunar mission, which is tentatively scheduled to take place in 2028.

Further Reading: TASS

Making the Trip to Mars Cheaper and Easier: The Case for Ballistic Capture

A new proposal for sending craft to Mars could save money and offer more flexible launch windows. Credit: NASA

When sending spacecraft to Mars, the current preferred method involves shooting the spacecraft towards Mars at full speed, then performing a braking maneuver once the ship is close enough to slow it down and bring it into orbit.

Known as the “Hohmann Transfer” method, this type of maneuver is effective, but it is also quite expensive in fuel and relies very heavily on timing. Hence the new idea being proposed, which would involve sending the spacecraft out ahead of Mars’ orbital path and then waiting for Mars to come on by and scoop it up.

This is what is known as “Ballistic Capture”, a new technique proposed by Professor Francesco Topputo of the Polytechnic Institute of Milan and Edward Belbruno, a visiting associate researcher at Princeton University and former member of NASA’s Jet Propulsion Laboratory.

In their research paper, which was published in arXiv Astrophysics in late October, they outlined the benefits of this method versus traditional ones. In addition to cutting fuel costs, ballistic capture would also provide some flexibility when it comes to launch windows.

MAVEN was launched into a Hohmann Transfer Orbit with periapsis at Earth’s orbit and apoapsis at the distance of the orbit of Mars. Credit: NASA

Currently, launches between Earth and Mars are limited to windows when the alignment between the two planets is just right. Miss such a window, and you have to wait another 26 months for a new one to come along.
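
That 26-month cadence falls straight out of the two planets’ orbital periods. Here is a minimal sketch of the synodic-period arithmetic in Python:

```python
T_EARTH = 365.25  # Earth's orbital period, days
T_MARS = 687.0    # Mars' orbital period, days

# Launch windows recur on the synodic period: 1/T_syn = 1/T_earth - 1/T_mars
t_synodic = 1.0 / (1.0 / T_EARTH - 1.0 / T_MARS)
print(f"Earth-Mars synodic period: {t_synodic:.0f} days "
      f"(~{t_synodic / 30.44:.0f} months)")
```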

At the same time, sending a rocket through the vast gulf that separates the orbits of Earth and Mars, and then firing thrusters in the opposite direction to slow down, requires a great deal of fuel. This in turn means that the spacecraft responsible for transporting satellites, rovers, and (one day) astronauts must be larger, more complicated, and hence more expensive.
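
To see where the fuel goes, consider the textbook Hohmann-transfer budget, sketched below with the vis-viva equation under the usual simplifying assumptions (circular, coplanar orbits; escaping Earth and capturing at Mars cost extra):

```python
import math

MU_SUN = 1.327e20   # Sun's gravitational parameter, m^3/s^2
R_EARTH = 1.496e11  # radius of Earth's orbit, m
R_MARS = 2.279e11   # radius of Mars' orbit, m

def vis_viva(r, a):
    """Orbital speed at radius r on an orbit with semi-major axis a."""
    return math.sqrt(MU_SUN * (2.0 / r - 1.0 / a))

a_transfer = (R_EARTH + R_MARS) / 2.0  # transfer ellipse touches both orbits
dv_depart = vis_viva(R_EARTH, a_transfer) - vis_viva(R_EARTH, R_EARTH)
dv_arrive = vis_viva(R_MARS, R_MARS) - vis_viva(R_MARS, a_transfer)
print(f"departure burn: {dv_depart / 1000:.2f} km/s")
print(f"arrival (braking) burn: {dv_arrive / 1000:.2f} km/s")
```

The roughly 2.6 km/s braking burn at arrival is precisely what ballistic capture aims to shrink.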

As Belbruno told Universe Today via email:  “This new class of transfers is very promising for giving a new approach to future Mars missions that should lower cost and risk.  This new class of transfers should be applicable to all the planets. This should give all sorts of new possibilities for missions.”

The idea was first proposed by Belbruno while he was working for JPL, where he was trying to come up with numerical models for low-energy trajectories. “I first came up with the idea of ballistic capture in early 1986 when working on a JPL study called LGAS (Lunar Get Away Special),” he said. “This study involved putting a tiny 100 kg solar electric spacecraft in orbit around the Moon that was first ejected from a Get Away Special Canister on the Space Shuttle.”

The Hiten spacecraft, part of the MUSES Program, was built by the Institute of Space and Astronautical Science of Japan and launched on January 24, 1990. It was Japan’s first lunar probe. Credit: JAXA

The LGAS design was not a resounding success, as the spacecraft would have needed two years to reach the Moon. But in 1990, when Japan was looking to rescue its failed lunar mission, Hiten, Belbruno submitted proposals for a ballistic capture attempt that were quickly incorporated into the mission.

“The time of flight for this one was 5 months,” he said. “It was successfully used in 1991 to get Hiten to the Moon.” And since that time, the LGAS design has been used for other lunar missions, including the ESA’s SMART-1 mission in 2004 and NASA’s GRAIL mission in 2011.

But it is future missions, involving much greater distances and expenditures of fuel, that Belbruno felt would benefit most from this method. Unfortunately, the idea initially met with some resistance, as no missions appeared well-suited to the technique.

“Ever since 1991 when Japan’s Hiten used the new ballistic capture transfer to the Moon, it was felt that finding a useful one for Mars was not possible due to Mars’ much longer distance and its high orbital velocity about the Sun. However, I was able to find one in early 2014 with my colleague Francesco Topputo.”

India’s Mars Orbiter Mission (MOM) was one of the most successful examples of the Hohmann Transfer method. Credit: ISRO

Granted, there are some drawbacks to the new method. For one, a spacecraft sent out ahead of Mars’ orbital path would take longer to enter orbit than one that brakes directly into it.

In addition, the Hohmann Transfer method is time-tested and reliable. One of its most successful applications took place back in September, when India’s Mars Orbiter Mission (MOM) entered its historic orbit around the Red Planet. This not only constituted the first time an Asian nation reached Mars; it was also the first time that any space agency had achieved a Mars orbit on the first try.

Nevertheless, the possibility of improving on the current method of sending craft to Mars has people at NASA excited. As James Green, director of NASA’s Planetary Science Division, said in an interview with Scientific American: “It’s an eye-opener. This [ballistic capture technique] could not only apply here to the robotic end of it but also the human exploration end.”

Don’t be surprised then if upcoming missions to Mars or the outer Solar System are performed with greater flexibility, and on a tighter budget.

Further Reading: arXiv Astrophysics

Student Team Wants to Terraform Mars Using Cyanobacteria

Artist’s concept of a “Living” Mars. Credit: Kevin Gill

While scientists believe that at one time, billions of years ago, Mars had an atmosphere similar to Earth’s and was covered with flowing water, the reality today is quite different. In fact, the surface of Mars is so hostile that a vacation in Antarctica would seem pleasant by comparison.

In addition to the extreme cold, there is little atmosphere to speak of and virtually no oxygen. However, a team of students from Germany wants to change that. Their plan is to introduce cyanobacteria into the atmosphere, which would convert the ample supplies of CO2 into oxygen gas, thus paving the way for possible settlement someday.

The team, which is composed of students and volunteer scientists from the University of Applied Science and the Technical University in Darmstadt, Germany, call their project “Cyano Knights”. Basically, they plan to seed Mars’ atmosphere with cyanobacteria so it can convert Mars’ most abundant gas (CO2, which accounts for 96% of the Martian atmosphere) into something breathable by humans.

Promotional image for the Mars One University Competition. Credit: Mars One

Along with teams from other universities and technical colleges taking part in the Mars One University Competition, the Cyano Knights hope that their project will be the one sent to the Red Planet in advance of the company’s proposed settlers.

This competition officially began this past summer, as part of Mars One’s drive to enlist the support and participation of universities from all around the world. All those participating will have a chance to send their project aboard the company’s first unmanned lander, which will be sent to Mars in 2018.

Working out of the Cell Culture Technology laboratory of the University of Applied Science, the Cyano Knights selected cyanobacteria for their extreme ruggedness. Here on Earth, these bacteria live in conditions that are hostile to other life forms, which is why they seemed like the perfect candidate.

As team leader Robert P. Schröder said to astrowatch.net: “Cyanobacteria do live in conditions on Earth where no life would be expected. You find them everywhere on our planet! It is the first step on Mars to test microorganisms.”

Cyanobacteria Spirulina. Credit: cyanoknights.bio

The other reason for sending cyanobacteria to Mars, in advance of humans, is the biological function they perform. As an organism that produces oxygen gas through photosynthesis to obtain nutrients, cyanobacteria are thought to have played a central role in the evolution of Earth’s atmosphere.

It is estimated that, beginning around 2.7 billion years ago, they were pivotal in converting it from a toxic mix into the nitrogen- and oxygen-rich atmosphere that we all know and love. This, in turn, led to the formation of the ozone layer, which blocks out harmful UV rays and allowed for the proliferation of life.

According to their project description, the cyanobacteria, once introduced, will “deliver oxygen made of their photosynthesis, reducing carbon dioxide and produce an environment for living organisms like us. Furthermore, they can supply food and important vitamins for a healthy nutrition.”

Of course, the team is not sure how much of the bacteria will be needed to make a dent in Mars’ carbon-rich atmosphere, nor how much of the oxygen could be retained. But much like the other teams taking part in this competition, the goal here is to find out how terrestrial organisms will fare in the Martian environment.

Artist’s concept of a Martian astronaut standing outside the Mars One habitat. Credit: Bryan Versteeg/Mars One

The Cyano Knights hope that one day, manned missions will be able to take advantage of the oxygen created by these bacteria, either by combining it with nitrogen to create breathable air or by recycling it for consumption over and over again.

Not only does their project call for the use of existing technology, it also takes advantage of studies being conducted by NASA and other space agencies. As it says on their team page: “On the international space station they do experiments with cyanobacteria too. So let us take it to the next level and investigate our toughest life form on Mars finding the best survival species for mankind! We are paving the way for future Mars missions, not only to have breathable air!”

Other concepts include germinating seeds on Mars to prove that it is possible to grow plants there, building a miniature greenhouse, measuring the impact of cosmic and solar radiation on the surface, and processing urine into water.

All of these projects are aimed at obtaining data that will contribute to our understanding of the Martian landscape and be vital to any human settlements or manned missions there in the future.

For more information on the teams taking part in the competition, and to vote for the one you would like to win, visit the Mars One University Competition page. Voting submissions will be accepted until Dec. 31, 2014, and the winning university payload will be announced on Jan. 5, 2015.

Further Reading: CyanoKnights, MarsOne University Competition

Elon Musk’s Hyperloop Might Become A Reality After All

Concept art for the Hyperloop high-speed train. Credit: Reuters

Fans of Elon Musk and high-speed transit are sure to remember the Hyperloop. Back in 2013, Musk dropped the idea into the public mind with a paper claiming that, using the right technology, a high-speed train could make the trip from San Francisco to Los Angeles in just 35 minutes.

However, Musk also indicated that he was too busy to build such a system, but that others were free to take a crack at it. And it seems that a small startup from El Segundo, California is prepared to do just that.

That company is JumpStartFund, a startup that combines elements of crowdfunding and crowdsourcing to make innovation happen. Dirk Ahlborn, the CEO of JumpStartFund, believes they can build Musk’s vision of a solar-powered transit system that would transport people at speeds of up to 1,280 km/h (800 mph).
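
As a sanity check on those numbers, here is a back-of-envelope trip-time estimate. The route length and acceleration limit are assumptions on our part (roughly the LA-SF corridor and a comfortable half a g), not figures from the alpha paper:

```python
route_km = 560.0      # assumed LA-San Francisco tube length, km
v_max = 1280.0 / 3.6  # top speed in m/s (1280 km/h)
a = 0.5 * 9.81        # assumed comfort limit on acceleration, m/s^2

t_ramp = v_max / a                       # time to reach top speed, s
d_ramp = 0.5 * a * t_ramp ** 2           # distance covered while ramping, m
d_cruise = route_km * 1000 - 2 * d_ramp  # distance covered at top speed, m
t_total = 2 * t_ramp + d_cruise / v_max
print(f"estimated trip time: {t_total / 60:.0f} minutes")
```

This cruise-limited estimate of roughly 27 minutes lands in the same ballpark as Musk’s 35-minute figure, which allows for slower travel near the cities.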

Together with SpaceX, JumpStartFund has created a subsidiary called Hyperloop Transportation Technologies (HTT), Inc. to oversee all the necessary components to creating the system. This included bringing together 100 engineers from all over the country who work for such giants of industry as Boeing, NASA, Yahoo!, Airbus, SpaceX, and Salesforce.

Concept art of what a completed Hyperloop would look like amidst the countryside. Credit: HTT/JumpStartFund

Last week, these engineers came together for the first time to get the ball rolling, and what they came up with was a 76-page report (entitled “Crowdstorm”) that spelled out exactly how they planned to proceed. By their own estimates, they believe they can complete the Hyperloop in just 10 years, at a cost of $16 billion.

A price tag like that would be sure to scare most developers away. However, Ahlborn is undeterred and believes that all obstacles, financial or otherwise, can be overcome. As he professed in an interview with Wired this week: “I have almost no doubt that once we are finished, once we know how we are going to build and it makes economical sense, that we will get the funds.”

The HTT report also covered the basic design and engineering principles that would go into building the train, as Musk originally proposed it. Basically, this consists of pod cars that provide their own electricity through solar power, and which are accelerated through a combination of linear induction motors and low air pressure.

Much has been made of this latter aspect of the idea, which has often been compared to the pneumatic tubes that used to send messages around office buildings in the mid-20th century. But of course, what is called for with the Hyperloop is a bit more sophisticated.

Concept art showing different “classes” for travel, which would include business class for those who can afford it. Credit: HTT/JumpStartFund

Basically, the Hyperloop will operate by providing each capsule with a soft air cushion to float on, avoiding direct contact with rails or the tube, while electromagnetic induction is used to speed up or slow the capsules down, depending on where they are in the transit system.

However, the HTT engineers indicated that such a system need not be limited to California. As it says in the report: “While it would of course be fantastic to have a Hyperloop between LA and SF as originally proposed, those aren’t the only two cities in the US and all over the world that would seriously benefit from the Hyperloop. Beyond the dramatic increase in speed and decrease in pollution, one of the key advantages the Hyperloop offers over existing designs for high-speed rail is the cost of construction and operations.”

The report also indicated the kind of price bracket they would be hoping to achieve. As it stands, HTT’s goal is “to keep the ticket price between LA and SF in the $20-$30 range,” with double that amount for return tickets. But with an overall price tag of $16 billion, the report also makes allowances for going higher: “[Our] current projected cost is closer to $16 billion,” they claim, “implying a need for a higher ticket price, unless the loop transports significantly more than 7.4 million annually, or the timeline for repayment is extended.”
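
It’s worth running the report’s own numbers to see why that caveat matters. The toy calculation below ignores operating costs and financing entirely, which is precisely the report’s point about needing higher fares, more riders, or a longer repayment timeline:

```python
capital_cost = 16e9      # projected build-out cost, dollars
fare = 25.0              # midpoint of the $20-$30 one-way range
riders_per_year = 7.4e6  # ridership figure cited in the report

annual_revenue = fare * riders_per_year
print(f"annual fare revenue: ${annual_revenue / 1e6:.0f} million")
print(f"years to recoup capital: {capital_cost / annual_revenue:.0f}")
```

At roughly $185 million a year against $16 billion, fares alone would take on the order of 86 years to repay the build-out.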

In addition, the report indicates that they are still relying heavily on Musk’s alpha document for much of their cost assessment. As a result, they can’t be specific on pricing or on what kind of revenue the Hyperloop can be expected to generate once it’s up and running.

The Hyperloop, as originally conceived within Musk’s alpha document. Credit: Tesla Motors

Also, there are still plenty of logistical issues that need to be worked out, not to mention the hurdles of zoning, local politics and environmental assessments. Basically, HTT can look forward to countless challenges before they even break ground. And since they are depending on crowdfunding to raise the necessary funds, it is not even certain whether they will be able to meet the burden of paying for it.

However, both Ahlborn and the HTT engineering team remain optimistic. Ahlborn believes the financial hurdles will be overcome, and if there was one thing that came through in the team’s report, it was the belief that something like the Hyperloop needs to happen in the near future. As the team wrote in the opening section of “Crowdstorm”:

“It quickly becomes apparent just how dramatically the Hyperloop could change transportation, road congestion and minimize the carbon footprint globally. Even without naming any specific cities, it’s apparent that the Hyperloop would greatly increase the range of options available to those who want to continue working where they do, but don’t wish to live in the same city, or who want to live further away without an unrealistic commute time; solving some of the major housing issues some metropolitan areas are struggling with.”

Only time will tell whether the Hyperloop becomes the “fifth mode of transportation” (as Musk initially called it) or remains a pipe dream. But when it was first proposed, it was clear that what the Hyperloop really needed was someone who believed in it and enough money to get it off the ground. As of now, it has the former. One can only hope the rest works itself out with time.

Further Reading: JumpStartFund, SpaceX/Hyperloop, Crowdstorm

What is the Average Surface Temperature on Venus?

False color radar topographical map of Venus provided by Magellan. Credit: Magellan Team/JPL/NASA

Venus is often referred to as our “sister planet,” due to the many geophysical similarities that exist between it and Earth. For starters, our two planets are close in mass, with Venus weighing in at 4.868 × 10^24 kg compared to Earth’s 5.9736 × 10^24 kg. In terms of size, the planets are almost identical, with Venus measuring 12,100 km in diameter and Earth 12,742 km.

In terms of density and gravity, the two are neck and neck – with Venus boasting 86.6% of Earth’s density and 90.7% of its surface gravity. Venus also has a thick atmosphere, much like our own, and it is believed that both planets share a common origin, forming at the same time out of a condensing cloud of dust particles around 4.5 billion years ago.

However, for all the characteristics these two planets have in common, average temperature is not one of them. Whereas the Earth has an average surface temperature of 14 degrees Celsius, the average temperature of Venus is 460 degrees Celsius. That is roughly 410 degrees hotter than the hottest deserts on our planet.

In fact, at a searing 750 K (477 °C), the surface of Venus is the hottest in the solar system. Venus orbits the Sun at 108 million km (about 30% closer than the Earth), but its heat is mainly due to the planet’s thick atmosphere. Unlike Earth’s, which is composed primarily of nitrogen, oxygen and ozone, Venus’ atmosphere is an incredibly dense cloud of carbon dioxide and sulfur dioxide gas.

The combination of these gases in high concentrations causes a catastrophic greenhouse effect that traps incident sunlight and prevents it from radiating back into space. This results in an estimated surface temperature boost of some 475 K (475 °C, since temperature differences are the same on both scales), leaving the surface a molten, charred mess that nothing (that we know of) can live on. Atmospheric pressure also plays a role, being 91 times that of Earth at the surface; and clouds of toxic vapor constantly rain sulfuric acid onto the surface.
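
The size of that boost can be sanity-checked with a simple energy-balance sketch: compare Venus’ measured surface temperature with the blackbody equilibrium temperature it would have from sunlight alone. Textbook values are assumed below; with slightly different albedo figures, the boost lands in the ~475-500 K range.

```python
S_VENUS = 2601.0   # solar constant at Venus, W/m^2
ALBEDO = 0.75      # fraction of sunlight Venus reflects back to space
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/m^2/K^4
T_SURFACE = 737.0  # measured mean surface temperature, K

# Energy balance for a rotating sphere: S(1-A)/4 absorbed = sigma*T^4 emitted
t_eq = (S_VENUS * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25
print(f"equilibrium temperature without greenhouse: {t_eq:.0f} K")
print(f"greenhouse boost: {T_SURFACE - t_eq:.0f} K")
```

Despite being closer to the Sun, Venus reflects so much sunlight that its no-atmosphere temperature would be a frigid ~230 K; the other ~500 K is all greenhouse.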

In addition, the surface temperature on Venus does not vary like it does here on Earth. On our planet, temperatures vary wildly with the time of year and even more so with location. The hottest surface temperature ever recorded on Earth was 70.7 °C, in the Lut Desert of Iran in 2005. At the other end of the spectrum, the coldest temperature ever recorded was -89.2 °C, at Vostok Station in Antarctica.

But on Venus, the surface temperature is 460 degrees Celsius, day or night, at the poles or at the equator. Beyond its thick atmosphere, Venus’ axial tilt (aka. obliquity) plays a role in this temperature consistency. Earth’s axis is tilted 23.4° relative to its orbital plane, whereas Venus’ is tilted by only 3°.

The only respite from the heat on Venus is to be found around 50 km up in the atmosphere. It is at that altitude that temperature and atmospheric pressure are similar to Earth’s. It is for this reason that some scientists believe floating habitats could be constructed there, using Venus’ thick clouds to buoy them high above the surface. Additionally, in 2014, a group of mission planners from NASA Langley proposed a mission to explore Venus’ atmosphere using airships.

These habitats could play an important role in the terraforming of Venus as well, acting as scientific research stations that could either fire the excess atmosphere off into space or introduce bacteria or chemicals that could convert the CO2 and SO2 into a hospitable, breathable atmosphere.

Beyond the fact that it is a hot and hellish landscape, very little is known about Venus’ surface environment, as the thick atmosphere has made direct visual observation impossible. The sulfuric acid is also problematic, since clouds composed of it are highly reflective of visible light, which frustrates optical observation. Probes have been sent to the surface in the past, but the volatile and corrosive environment means that anything that lands there can only survive for a few hours.

3-D perspective of the Venusian volcano, Maat Mons, generated from radar data from NASA’s Magellan mission. Credit: Magellan Team/NASA/JPL

What little we know about the planet’s surface has come from years worth of radar imaging, the most recent of which was conducted by NASA’s Magellan spacecraft (aka. the Venus Radar Mapper). Using synthetic aperture radar, the robotic space probe spent four years (1990-1994) mapping the surface of Venus and measuring its gravitational field before its orbit decayed and it was “disposed of” in the planet’s atmosphere.

The images provided by this and other missions revealed a surface dominated by volcanoes: there are at least 1,000 volcanoes or volcanic centers larger than 20 km in diameter on Venus’ harsh landscape. Many scientists believe Venus was resurfaced by volcanic activity 300 to 500 million years ago. Lava flows are a testament to this, appearing to have produced channels of hardened magma that extend for hundreds of km in all directions. The mixture of volcanic ash and the sulfuric acid clouds is also known to produce intense lightning and thunderstorms.

The temperature of Venus is not its only extreme. The atmosphere is constantly churned by hurricane-force winds reaching 360 km/h. Add to that the crushing air pressure and rainstorms of sulfuric acid, and it becomes easy to see why Venus is such a barren, lifeless rock, and one that has been so hard to explore.

We have written many articles about Venus for Universe Today. Here are some interesting facts about Venus, and here’s an article about the Venus greenhouse effect. And here is an article about the many interesting pictures taken of Venus over the past few decades.

If you’d like more information on Venus, check out Hubblesite’s News Releases about Venus, and here’s a link to NASA’s Solar System Exploration Guide on Venus.

We’ve also recorded an entire episode of Astronomy Cast all about Venus. Listen here, Episode 50: Venus.

Reference:
NASA

The Milky Way’s New Neighbor May Tell Us Things About the Universe

This dwarf spheroidal galaxy in the constellation Fornax is a satellite of our Milky Way and is one of 10 used in Fermi's dark matter search. The motions of the galaxy's stars indicate that it is embedded in a massive halo of matter that cannot be seen. Credit: ESO/Digital Sky Survey 2

As part of the Local Group, a collection of 54 galaxies and dwarf galaxies that measures 10 million light years in diameter, the Milky Way has no shortage of neighbors. However, refinements made in the field of astronomy in recent years are leading to the observation of neighbors that were previously unseen. This, in turn, is changing our view of the local universe to one where things are a lot more crowded.

For instance, scientists working out of the Special Astrophysical Observatory in Karachai-Cherkessia, Russia, recently found a previously undetected dwarf galaxy 7 million light years away. The discovery of this galaxy, named KKs3, and of others like it is an exciting prospect for scientists, since such objects can tell us much about how stars are born in our universe.

The Russian team, led by Prof. Igor Karachentsev of the Special Astrophysical Observatory (SAO), used the Hubble Space Telescope’s Advanced Camera for Surveys (ACS) to locate KKs3 in the southern sky, near the constellation of Hydrus. The discovery occurred back in August 2014, when they finalized their observations of a galaxy whose stars have, in total, only one ten-thousandth the mass of the Milky Way.

Such dwarf galaxies are far more difficult to detect than others due to a number of distinct characteristics. KKs3 is what is known as a dwarf spheroidal (or dSph) galaxy, a type that, unlike the Milky Way, has no spiral arms and also suffers from an absence of raw materials (like dust and gas). Since they lack the materials to form new stars, these galaxies are generally composed of older, fainter stars.

Image of the KKR 25 dwarf spheroidal galaxy obtained by the Special Astrophysical Observatory using the HST. Credit: SAO RAS/Hubble

In addition, these galaxies are typically found in close proximity to much larger galaxies, like Andromeda, which appear to have gobbled up their gas and dust long ago. Being faint in nature, and so close to far more luminous objects, is what makes them so tough to spot by direct observation.

Team member Prof. Dimitry Makarov, also of the Special Astrophysical Observatory, described the process: “Finding objects like KKs3 is painstaking work, even with observatories like the Hubble Space Telescope. But with persistence, we’re slowly building up a map of our local neighborhood, which turns out to be less empty than we thought. It may be that there are a huge number of dwarf spheroidal galaxies out there, something that would have profound consequences for our ideas about the evolution of the cosmos.”

Painstaking is no exaggeration. Since such galaxies are devoid of features like gas clouds and dust fields, scientists are forced to spot them by identifying individual stars. Because of this, only one other isolated dwarf spheroidal has ever been found in the Local Group: a dSph known as KKR 25, which was discovered by the same Russian research team back in 1999.

But despite the challenges of spotting them, astronomers are eager to find more examples of dSph galaxies. As it stands, it is believed that these isolated spheroids must have been born out of a period of rapid star formation, before the galaxies were stripped of their dust and gas or used them all up.

Studying more of these galaxies can therefore tell us much about the process of star formation in our universe. The Russian team expects that the task will become easier in the coming years, as the James Webb Space Telescope and the European Extremely Large Telescope begin service.

Much like the Spitzer Space Telescope, these next-generation telescopes are optimized for infrared detection and will therefore prove very useful in picking out faint stars. This, in turn, will also give us a more complete understanding of our universe and all that it holds.

Further Reading: Royal Astronomical Society

Meteoric Evidence Suggests Mars May Have a Subsurface Reservoir

Scientists were able to gauge the rate of water loss on Mars by measuring the ratio of water and HDO from today and 4.3 billion years ago. Credit: Kevin Gill

It is a scientific fact that water exists on Mars. Though most of it today takes the form of water ice in the polar regions or in subsurface deposits near the temperate zones, the presence of H2O has been confirmed many times over. It is evidenced by the sculpted channels and outflows that still mark the surface, as well as by clay and mineral deposits that could only have been formed by water. Recent geological surveys provide further evidence that Mars’ surface was once home to warm, flowing water billions of years ago.

But where did the water go? And how and when did it disappear exactly? As it turns out, the answers may lie here on Earth, in Martian meteorites that suggest the planet may harbor a global reservoir of ice beneath its surface.

Together, researchers from the Tokyo Institute of Technology, the Lunar and Planetary Institute in Houston, the Carnegie Institution for Science in Washington, and NASA’s Astromaterials Research and Exploration Science Division examined three Martian meteorites. What they found were samples of water whose hydrogen isotope ratio is distinct from that found in water in Mars’ mantle and atmosphere.

Mudstone formations in the Gale Crater show the flat bedding of sediments deposited at the bottom of a lakebed. Credit: NASA/JPL-Caltech/MSSS

This new study examined meteorites from different periods in Mars’ past. What the researchers found seemed to indicate that water ice may have existed intact beneath the crust over long periods of time.

As Professor Tomohiro Usui told Universe Today via email, the significance of this find is that “the new hydrogen reservoir (ground ice and/or hydrated crust) potentially accounts for the ‘missing’ surface water on Mars.”

Basically, there is a gap between what is thought to have existed in the past, and what is observed today in the form of water ice. The findings made by Tomohiro and the international research team help to account for this.

“The total inventory of ‘observable’ current surface water (which mostly occurs as polar ice, ~10^6 km^3) is more than one order of magnitude smaller than the estimated volume of ancient surface water (~10^7 to 10^8 km^3) that is thought to have covered the northern lowlands,” said Tomohiro. “The lack of water at the surface today was problematic for advocates of such large paleo-ocean and -lake volumes.”
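
In round numbers, the gap Tomohiro describes looks like this (the figures below are just the order-of-magnitude values from the quote above):

```python
observable_now_km3 = 1e6  # today's observable inventory, mostly polar ice
ancient_low_km3 = 1e7     # low end of the ancient surface-water estimates
ancient_high_km3 = 1e8    # high end of the estimates

print(f"shortfall factor: {ancient_low_km3 / observable_now_km3:.0f}x to "
      f"{ancient_high_km3 / observable_now_km3:.0f}x")
print(f"missing water: {ancient_low_km3 - observable_now_km3:.1e} to "
      f"{ancient_high_km3 - observable_now_km3:.1e} km^3")
```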

Meteorites from Mars, like NWA 7034 (shown here), contain evidence of Mars’ watery past. Credit: NASA

In their investigation, the researchers compared the water, hydrogen isotopes and other volatile elements within the meteorites. The results forced them to consider two possibilities: in one, the newly identified hydrogen reservoir is evidence of near-surface ice interbedded with sediment; in the second, which seemed far more likely, the samples came from hydrated rock near the top of the Martian crust.

“The evidence is the ‘non-atmospheric’ hydrogen isotope composition of this reservoir,” Tomohiro said. “If this reservoir occurs near the surface, it should easily interact with the atmosphere, resulting in “isotopic equilibrium”.  The non-atmospheric signature indicates that this reservoir must be sequestered elsewhere of this red planet, i.e. ground-ice.”

While the issue of the “missing Martian water” remains controversial, this study may help to bridge the gap between Mars’ supposed warm, wet past and its cold, icy present. It, along with other studies performed here on Earth – as well as the massive amounts of data being transmitted by the many rovers and orbiters operating on and around the planet – is helping to pave the way towards a manned mission, which NASA plans to mount by 2030.

The team’s findings are reported in the journal Earth and Planetary Science Letters.

Further Reading: NASA

Compromises Lead to Climate Change Deal

Secretary-General Addresses Lima Climate Action High-level Meeting. Credit: UN Photo/Mark Garten

Earlier this month, delegates from the various states that make up the UN met in Lima, Peru, to agree on a framework for the Climate Change Conference that is scheduled to take place in Paris next year. For over two weeks, representatives debated and discussed the issue, which at times became hotly contested and divisive.

In the end, a compromise was reached between rich and developing nations, which found themselves on opposite sides for much of the proceedings.

And while few member states walked away feeling they had received all they wanted, many expressed that the meeting was an important step on the road to the 2015 Climate Change Conference. It is hoped that this conference will, after 20 years of negotiations, create the first binding and universal agreement on climate change.

The 2015 Paris Conference will be the 21st session of the Conference of the Parties who signed the 1992 United Nations Framework Convention on Climate Change (UNFCCC) and the 11th session of the Meeting of the Parties who drafted the 1997 Kyoto Protocol.

The objective of the conference is to achieve a legally binding and universal agreement on Climate Change specifically aimed at curbing greenhouse gas emissions to limit global temperature increases to an average of 2 degrees Celsius above pre-industrial levels.

This map represents global temperature anomalies averaged from 2008 through 2012. Credit: NASA Goddard Institute for Space Studies/NASA Goddard’s Scientific Visualization Studio.

This temperature increase is being driven by carbon emissions that have been building steadily since the late 18th century, and rapidly in the 20th. According to NASA, CO2 concentrations had not exceeded 300 ppm in the upper atmosphere for over 400,000 years, a span that encompasses the whole of human history.

However, in May of last year, the National Oceanic and Atmospheric Administration (NOAA) announced that these concentrations had reached 400 ppm, based on ongoing observations from the Mauna Loa Observatory in Hawaii.

Meanwhile, research conducted by the U.S. Global Change Research Program indicates that by the year 2100, carbon dioxide concentrations could either level off at about 550 ppm or rise to as high as 800 ppm. This could mean the difference between a temperature increase of 2.5 °C (4.5 °F), which is considered sustainable, and an increase of 4.5 °C (8.1 °F), which would make life untenable in many regions of the planet.
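
For readers keeping track of units: temperature differences convert between scales without the usual +32 offset, which is where the Fahrenheit figures above come from.

```python
# Temperature *differences* convert as dF = dC * 9/5 (no +32 offset).
for d_celsius in (2.5, 4.5):
    print(f"a rise of {d_celsius} degC = {d_celsius * 9 / 5:.1f} degF")
```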

Hence the importance of reaching, for the first time in over 20 years of UN negotiations, a binding and universal agreement on the climate that will involve all the nations of the world. And with the conclusion of the Lima Conference, the delegates have what they believe will be a sufficient framework for achieving that next year.

While many environmental groups see the framework as an ineffectual compromise, it was hailed by members of the EU as a step towards the long-awaited global climate deal that began in 1992.

“The decisions adopted in Lima pave the way for the adoption of a universal and meaningful agreement in 2015,” said UN Secretary-General Ban Ki-moon in a statement issued at the conclusion of the two-week meeting. In addition, Peru’s environment minister – Manuel Pulgar-Vidal, who chaired the summit – was quoted by the BBC as saying: “As a text it’s not perfect, but it includes the positions of the parties.”

Al Gore and UNEP Executive Director Achim Steiner at the China Pavilion at the Lima Conference. Credit: UNEP

Amongst the criticisms leveled by environmental groups is the fact that many important decisions were postponed, and that the draft agreement contained watered-down language.

For instance, on national pledges, it says that countries “may” include quantifiable information showing how they intend to meet their emissions targets, rather than “shall”. By making this optional, environmentalists believe that signatories will be entering into an agreement that is not binding and therefore has no teeth.

However, on the plus side, the agreement kept the 194 members together and on track for next year. Concerns over responsibilities between developed and developing nations were alleviated by changing the language in the agreement, stating that countries have “common but differentiated responsibilities”.

Other meaningful agreements were reached as well, which included boosted commitments to a Green Climate Fund (GCF), financial aid for “vulnerable nations”, new targets to be set for carbon emission reductions, a new process of Multilateral Assessment to achieve new levels of transparency for carbon-cutting initiatives, and new calls to raise awareness by putting climate change into school curricula.

In addition, the Lima Conference also led to the creation of The 1 Gigaton Coalition, a UN-coordinated group dedicated to promoting renewable energy. As stated by the UNEP, this group was created “to boost efforts to save billions of dollars and billions of tonnes of CO2 emissions each year by measuring and reporting reductions of greenhouse gas emissions resulting from projects and programs that promote renewable energy and energy efficiency in developing countries.”

A massive, over 7-metre-high balloon, representing one tonne of carbon dioxide (CO2). Credit: UN Photo/Mark Garten

Coordinated by the United Nations Environment Programme (UNEP) with the support of the Government of Norway, the coalition will be responsible for measuring CO2 reductions achieved through renewable energy projects. It was formed in light of the fact that while many nations have such initiatives in place, they are not measuring or reporting the resulting drop in greenhouse gases.

They believe that, if accurately measured, these emission reductions would total 1 gigaton by the year 2020. This would not only benefit the environment, but would also reduce the financial burden on governments all across the world.

As UNEP Executive Director Achim Steiner stated in a press release: “Our global economy could be $18 trillion better off by 2035 if we adopted energy efficiency as a first choice, while various estimates put the potential from energy efficient improvements anywhere between 2.5 and 6.8 gigatons of carbon per year by 2030.”

Ultimately, the 1 Gigaton Coalition hopes to provide information demonstrating unequivocally that energy efficiency and renewables are helping to close the gap between current emission levels and where they will need to be if we hope to limit the temperature increase to just 2 °C. This, as already stated, could mean the difference between life and death for many people, and ultimately for the environment as a whole.

The location of UNFCCC talks are rotated by regions throughout United Nations countries. The 2015 conference will be held at Le Bourget from 30 November to 11 December 2015.

Further Reading: UN, UNEP, UNFCCC