This Is What It Looks Like to Freefall From Space

Felix Baumgartner about to step out of his pressurized capsule on October 14, 2012 (Credit: Red Bull)


Remember BASE jumper Felix Baumgartner’s incredible freefall from the “edge of space” in October 2012? The highly anticipated (and highly publicized) Red Bull-sponsored stunt was watched live by viewers around the world (including me — it was very cool!) and set new records for highest jump, fastest freefall, and highest balloon-powered human flight. That day Baumgartner even broke the long-standing record held by his mentor Col. Joe Kittinger, who jumped from 102,800 feet in August 1960… and with seven GoPro Hero2 cameras mounted to Felix’s high-tech suit and helmet, you can see what he saw during every one of the 127,852 feet that he fell down to Earth.

(That’s, ah, over 24 miles/39 km. *Gulp.*)

The video above was released today by GoPro, and is a more polished and edited version than the one released by Red Bull this past October. Check it out above, or for full vertigo-inducing* freefall effect watch it in fullscreen HD on YouTube. *Consider yourself warned!

HT to Robert Gonzalez at io9

Young Planets Migrated In Double-Star Systems, Model Shows

Artist's conception of Kepler 34b, which orbits two stars. Credit: David A. Aguilar (CfA)

Binary star systems are downright dangerous due to their complex gravitational interactions that can easily grind a planet to pieces. So how is it that we have found a few planets in these Tatooine-like environments?

Research led by the University of Bristol shows that planets in these systems, including Kepler-34b and other exoplanets, formed far away from their central stars and then migrated inward at some point in their history.

The scientists did “computer simulations of the early stages of planet formation around the binary stars using a sophisticated model that calculates the effect of gravity and physical collisions on and between one million planetary building blocks,” stated the university.

“They found that the majority of these planets must have formed much further away from the central binary stars and then migrated to their current location.”

You can read more about the research in Astrophysical Journal Letters. It was led by Bristol graduate student Stefan Lines with participation from advanced research fellow and computational astrophysicist Zoe Leinhardt, among other collaborators.

Foom! Flaming Rocket Sled Tests Parachute For Mars Spacecraft

The "rocket sled" that is a part of the Low-Density Supersonic Decelerator Project testing methods to slow spacecraft before they land. Credit: NASA

Watch the video above to the two-minute mark (and beyond) and we guarantee a brilliant start to your Friday. “Enter Sandman” indeed, Metallica. Look past the flames and thrust, however, and you will see a parachute test in action that could help spacecraft land safely on Mars one day.

This is an undated “rocket sled” test of the Low-Density Supersonic Decelerator, a technology intended to improve on the 1970s-era Viking parachute designs that were still in use as recently as the Curiosity landing.

And supersonic flight tests of this technology will take place this year and next, according to NASA. The technology could be used on spacecraft as early as 2018, the agency added.

“NASA seeks to use atmospheric drag as a solution, saving rocket engines and fuel for final maneuvers and landing procedures,” the agency states on the project’s web page. “The heavier planetary landers of tomorrow, however, will require much larger drag devices than any now in use to slow them down — and those next-generation drag devices will need to be deployed at higher supersonic speeds to safely land vehicle, crew and cargo.”

“One of the tests on my LDSD project, which combines the Navy version of a Blackhawk helicopter, a giant 110 foot parachute, 3000 pounds of rope, a very big pulley, four rockets, and a railroad track in the desert. The test successfully uncovered a design flaw in the parachute before we flew one like it on a much more expensive test — which is exactly what this test was for,” wrote collaborator Mark Adler (a fellow at the Jet Propulsion Laboratory who was a mission manager for the Spirit rover) on Google Plus.

As part of this project, NASA is testing three devices. The first is a huge parachute (30.5 meters, or 100 feet) that will deploy when the spacecraft is at about 1.5 to 2 times the speed of sound to slow it down.

NASA’s Curiosity rover heads for a successful landing Aug. 6 under its parachute. Picture snapped by NASA’s Mars Reconnaissance Orbiter’s High-Resolution Imaging Science Experiment (HiRISE). Credit: NASA/JPL-Caltech/Univ. of Arizona

At faster speeds, NASA also plans to use inflatable aerodynamic decelerators, which it describes as “very large, durable, balloon-like pressure vessels.” These devices are being tested in two versions: six-meter and eight-meter (19.7 feet and 26.2 feet). They are designed to balloon around the spacecraft and slow it from 3.5 times the speed of sound to at least twice the speed of sound, if not lower.

“All three devices will be the largest of their kind ever flown at speeds several times greater than the speed of sound,” NASA stated.

The project is a NASA technology demonstration mission led by the Jet Propulsion Laboratory. This test and similar ones were conducted at the U.S. Naval Air Weapons Station at China Lake, Calif. More videos and information are available at LDSD’s webpage.

Huge hat-tip to @marsroverdriver for highlighting this on his Twitter account yesterday (Thursday).

From Webcam to Planetcam: Planetary Imaging on the Cheap

Photo by Author

It’s a question we get often.

“What sort of gear did you use to capture that?” folks ask, imagining that I’m using a setup that required a second mortgage to pay for.

People are often surprised at the fact that I’m simply using a converted off-the-shelf webcam modified to fit into the eyepiece-holder of a telescope, along with freeware programs to control the camera, stack, and clean up images. And while there are multi-thousand-dollar rigs available commercially that yield images that would have been the envy of professional observatories even a decade ago, you may just find that you have the gear lying around to start doing planetary and lunar photography tonight.

OK, I’ll admit: you do need a laptop and telescope (things that we typically have “lying around” the house!), but these are the two priciest items on the list. Living the vagabond life of a veteran, a teacher, and a freelance science writer ensures that our preferred cameras for conversion are always in the double-digit dollar range.

Our first converted “Planetcam” installed on the ‘scope.

But converted webcam imaging is not new. We first read about the underground movement over a decade ago, back when amateur astrophotographers were hacking their Philips Vesta and ToUcam Pro webcams with stunning results. Celestron, Meade and Orion later caught up and released commercial versions of their own for planetary imaging.

A few freeware installations and the modification of a Logitech 3000 that I bought on rebate for $50 later, and I was imaging planets that same night.

Photo by author
Modified webcams, old (right) and new (left).

Just about any webcam will yield decent results, though the discontinued Philips ToUcam Pro webcams are still the heavily sought-after Holy Grail of webcam astrophotography. The modification simply consists of removing the camera lens (don’t do this with any camera that you don’t want to gut and void the warranty) and attaching a standard 1 ¼” eyepiece barrel in its place using cement glue.

For camera control, I use a program called K3CCDTools. This was freeware once upon a time; the program now costs $50 to install. I still find it well worth using, though I’ve since been turned on to some equally useful programs out there that are still free. (More on those in a bit.)

K3CCDTools will process your images from start to finish, but I find that Registax is great for post-processing. Plus, you don’t want to waste valuable scope time processing images: I do the maximum number of video captures in the field, and then tinker with them later on cloudy nights.

A screen capture of K3CCDTools during a daytime alignment test. Note the focusing dialog (FFT) box to the right.

Stacking video captures enables you to “grab” those brief moments of fine atmospheric seeing. Many astrophotographers will manually select the best frames from thousands one by one, but I’ll have to admit we’re often impatient and find the selection algorithm on Registax does an acceptable job of selecting the top 10% of images in a flash.
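For the curious, the “keep the sharpest 10%” idea can be sketched in a few lines of Python. This is our own illustration of lucky-imaging frame selection in general, not Registax’s actual algorithm; the sharpness measure (gradient energy) and the tiny list-of-lists frames are simplifications for clarity.

```python
# Sketch of lucky-imaging frame selection: score each video frame by a
# simple sharpness measure (gradient energy), keep the sharpest fraction,
# then average the survivors to beat down noise.

def sharpness(frame):
    """Sum of squared differences between horizontal neighbors.

    Sharper frames have stronger local contrast, so they score higher.
    """
    return sum(
        (row[i + 1] - row[i]) ** 2
        for row in frame
        for i in range(len(row) - 1)
    )

def select_best(frames, keep_fraction=0.10):
    """Return the top fraction of frames, sharpest first."""
    ranked = sorted(frames, key=sharpness, reverse=True)
    keep = max(1, int(len(ranked) * keep_fraction))
    return ranked[:keep]

def stack(frames):
    """Average the selected frames pixel-by-pixel."""
    n = len(frames)
    rows, cols = len(frames[0]), len(frames[0][0])
    return [
        [sum(f[r][c] for f in frames) / n for c in range(cols)]
        for r in range(rows)
    ]
```

Real stacking software also aligns the frames before averaging, which is where most of the work (and the magic) lives.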

And like Photoshop, a college course could be taught around Registax. Don’t be intimidated, but do feel free to experiment! After stacking and optimizing, we find the true power in making the images “pop” often lies in the final step, known as wavelet processing.  A round of sharpening and  contrast boosting in Photoshop can also go a long way, just remember that the goal is to apply the minimum to get the job done, rather than looking unnatural and over-processed.

Photos by author
A photo mosaic of the historic Mars opposition of 2003.

At the eyepiece, the first hurdle is object acquisition. A standard webcam can go after bright targets such as the Moon, the Sun (with the proper filter), planets, and bright double stars. We’ve even nabbed the International Space Station with our rig using a low-tech but effective tracking method. Your field of view, however, will typically be very narrow; my webcam coupled to a Celestron C8 Schmidt-Cassegrain typically yields a field of view about 10’ on a side. You’ll want to center the object in the eyepiece at the highest power possible, then plop the camera in place.
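That narrow field follows directly from the sensor size and the telescope’s focal length, so you can estimate it for your own rig before you ever bolt the camera on. The sketch below assumes a typical 1/4-inch webcam chip (about 3.6 × 2.7 mm) and the C8’s 2,032 mm focal length; those assumed numbers land in the same ballpark as the ~10’ figure quoted above.

```python
import math

def fov_arcmin(sensor_mm, focal_length_mm):
    """Angular field of view (arcminutes) of a sensor at prime focus."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_length_mm))) * 60

# Assumed numbers: a Celestron C8 has a 2,032 mm focal length; a typical
# 1/4-inch webcam chip is roughly 3.6 x 2.7 mm.
width = fov_arcmin(3.6, 2032.0)   # ~6 arcminutes wide
height = fov_arcmin(2.7, 2032.0)  # ~4.6 arcminutes tall
print(f"Field of view: {width:.1f}' x {height:.1f}'")
```

A bigger chip or a focal reducer widens the field; a Barlow lens shrinks it further still.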

The next battle is centering and focusing the object on the screen. An out-of-focus planet scatters light: tweaking the focus back and forth sometimes reveals the silvery “doughnut” of the planet lurking just out of view.

From there, you’ll want the object in as razor-sharp a focus as possible. K3CCDTools has a great feature for this known as a Fine Focusing Tool (FFT). Some observers also use focusing masks, which can be easily built out of cardboard (remember, we’re being cheapskates!). Be sure those reflector mirrors are properly collimated as well.

Photos by author
Objects shot over the years (clockwise from the upper left): the close double star Porrima, Saturn, the International Space Station, and Venus.

Don’t be surprised if the planet initially looks over-saturated. You’ll want to access the manual controls via the camera software to take the brightness, contrast and color saturation down to acceptable levels. I typically shoot at about 15 frames a second. Fun fact: the “shutter speed” of the dark-adapted “Mark 1 human eyeball” is generally quoted at around 1/20th of a second, slower than you’d think!

Note: all those thousands of frames of video go somewhere… be sure to occasionally clean them off your hard-drive, as it will swiftly fill up!

When you image makes a big difference as well. The best time to shoot an object is when it transits the local north-south meridian and is at its highest point above the horizon. The reason for this is that you’re looking through the thinnest possible cross-section of the often turbulent atmosphere.
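The “thinnest cross-section” argument can be put in numbers with the standard plane-parallel airmass approximation (a simplification that ignores atmospheric curvature, good enough well above the horizon):

```python
import math

def airmass(altitude_deg):
    """Relative thickness of atmosphere along the line of sight.

    Plane-parallel approximation: sec(zenith angle), where the zenith
    angle is 90 degrees minus the target's altitude.
    """
    return 1.0 / math.cos(math.radians(90.0 - altitude_deg))

# A planet at 30 degrees altitude is seen through twice as much air as
# one straight overhead; at 60 degrees the penalty is only about 15%.
for alt in (90, 60, 30):
    print(f"altitude {alt:2d} deg -> airmass {airmass(alt):.2f}")
```

That doubling of air at low altitude means double the turbulence, dimming and color dispersion, which is why imaging at meridian transit pays off.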

Universe Today reader Scott Chapman of Montpelier, Virginia also recently shared with us his exploits in planetary webcam imaging and his technique:

A webcam image of the Mare Crisium region on the Moon. Credit: Scott Chapman

“Recently, while looking for an affordable basic telescope, to see if I really had any interest in astronomy, searches and reviews led me to purchase a 70mm refractor. The last thing on my mind was that I could expect to take any pictures of what I might see.

Previously, I had assumed that the only way to take even basic pictures of sky objects was with equipment that was way out of my price range. Imagine my surprise to learn that I could use a simple webcam that I already had sitting around!”

Like many of us mere mortal budget astrophotographers, Scott’s goal was great images at low cost. He also shared with us the programs he uses:

SharpCap2: For capturing .avi video files from the webcam connected to the telescope.

VirtualDub: For shortening the .avi video.

PIPP: For pre-processing and optimizing the video frames prior to stacking.

AutoStakkert2: Selects and stacks the best frames into a single .tiff file using a simple 3-step process. Scott notes that it’s “MUCH easier for a beginner to use than Registax!”

Registax6: The latest version of the software mentioned above.

JPEGView: For final cropping and file conversion. (I sometimes also use ye ole Paint for this).

Even after a decade of planetary imaging, some of these were new to us as well, a testament to just how far the technique has continued to evolve. Astrophotography and astronomy are lifelong pursuits, and we continue to learn new things every day.

The current camera I’m shooting with is a Logitech c270 that I call my “Wal-Mart $20 Blue Light Special.” (Yes, I know that’s Kmart!) Lots of discussion forums exist out there as well, including the QuickCam and Unconventional Imaging Astronomy Group (QCUIAG) on Yahoo!

Some observers have even taken to gutting and modifying their webcams entirely, adding in cooling fans, more sensitive chips, longer exposure times and more.

All great topics for a future post. Let us know of your trials and triumphs in webcam planetary photography!

-Watch Dave Dickinson pit his $20 webcam against multi-thousand-dollar rigs weekly in the Virtual Star Party.

-Be sure to send those webcam pics in to Universe Today!

 

Search for Planetary Nurseries in the Latest Citizen Science Project

Image Credit: diskdetectives.org

Growing up, my sister played video games and I read books. Now that she has a one-year-old daughter we constantly argue over how her little girl should spend her time. Should she read books in order to increase her vocabulary and stretch her imagination? Or should she play video games in order to strengthen her hand-eye coordination and train her mind to find patterns?

I like to believe that I did so well in school because of my initial unadorned love for books. But I might be about to lose that argument as gamers prove their value in science and more specifically astronomy.

Take a quick look through Zooniverse and you’ll be amazed by the number of Citizen Science projects. You can explore the surface of the moon in Moon Zoo, determine how galaxies form in Galaxy Zoo and search for Earth-like planets in Planet Hunters.

In 2011 two citizen scientists made big news when they discovered two exoplanet candidates — demonstrating that human pattern recognition can easily complement the powerful computer algorithms created by the Kepler team.

But now we’re introducing yet another Citizen Science project: Disk Detective.

Planets form and grow within dusty, swirling disks of gas that surround young stars. However, there are many outstanding questions and details within this process that still elude us. The best way to better understand how planets form is to directly image nearby planetary nurseries. But first we have to find them.


“Through Disk Detective, volunteers will help the astronomical community discover new planetary nurseries that will become future targets for NASA’s Hubble Space Telescope and its successor, the James Webb Space Telescope,” said the chief scientist for NASA Goddard’s Sciences and Exploration Directorate, James Garvin, in a press release.

NASA’s Wide-field Infrared Survey Explorer (WISE) scanned the entire sky at infrared wavelengths for a year. It took detailed measurements of more than 745 million objects.

Astronomers have used complex computer algorithms to search this vast amount of data for objects that glow bright in the infrared. But now they’re calling on your help. Not only do planetary nurseries glow in the infrared but so do galaxies, interstellar dust clouds and asteroids.

While there are likely thousands of planetary nurseries glowing bright in the data, we have to separate them from everything else. And the only way to do this is to inspect every single image by eye — a monumental challenge for any astronomer — hence the invention of Disk Detective.

Brief animations allow the user to help classify the object based on relatively simple criteria, such as whether or not the object is round or if there are multiple objects.

“Disk Detective’s simple and engaging interface allows volunteers from all over the world to participate in cutting-edge astronomy research that wouldn’t even be possible without their efforts,” said Laura Whyte, director of Citizen Science at the Adler Planetarium in Chicago, Ill.

The project is hoping to find two types of developing planetary environments, distinguished by their age. The first, known as a young stellar object disk, is, well, young. It’s less than 5 million years old and contains large quantities of gas. The second, known as a debris disk, is older than 5 million years. It contains no gas but instead belts of rocky or icy debris similar to our very own asteroid and Kuiper belts.

So what are you waiting for? Head to Disk Detective and help astronomers understand how complex worlds form in dusty disks of gas. The book will be there when you get back.

The original press release may be found here.

Close Encounters of the Lunar Kind – LRO images LADEE

This dissolve animation compares the LRO image (geometrically corrected) of LADEE captured on Jan. 14, 2014 with a computer-generated and labeled image of LADEE. LRO and LADEE are both NASA science spacecraft currently in orbit around the Moon. Credit: NASA/Goddard/Arizona State University

Story updated

A pair of NASA spacecraft orbiting Earth’s nearest celestial neighbor just experienced a brief ‘Close Encounter of the Lunar Kind’.

Proof of the rare orbital tryst has now been revealed by NASA in the form of spectacular imagery (see above and below) just released showing NASA’s recently arrived Lunar Atmosphere and Dust Environment Explorer (LADEE) lunar orbiter being photographed by a powerful camera aboard NASA’s five year old Lunar Reconnaissance Orbiter (LRO) – as the two orbiters met for a fleeting moment just two weeks ago.

See above a dissolve animation that compares the LRO image (geometrically corrected) of LADEE captured on Jan. 14, 2014 with a computer-generated and labeled LADEE image.

All this was only made possible by a lot of very precise orbital calculations and a spacecraft ballet of sorts that had to be nearly perfectly choreographed and timed – and spot on to accomplish.

This subsection of the LRO image, expanded four times, shows the smeared view of LADEE against the lunar background. LADEE is about 2 meters in the long direction; the lunar scene is about 81 meters wide. Credit: NASA/Goddard/Arizona State University

Both sister orbiters were speeding along at over 3600 MPH (1,600 meters per second) while traveling perpendicularly to one another!

So the glimpse was short but sweet.

LADEE flies in an equatorial orbit (east-to-west) while LRO travels in a polar orbit (south-to-north). LADEE achieved lunar orbit on Oct. 6, 2013 amidst the federal government shutdown.

Thus their orbits align only infrequently.

The LRO orbiter did a pirouette to precisely point its high resolution narrow angle camera (NAC) while hurtling along in lunar orbit, barely 5.6 miles (9 km) above LADEE.

And it was all over in less than the wink of an eye!

LADEE entered LRO’s Narrow Angle Camera (NAC) field of view for 1.35 milliseconds and a smeared image of LADEE was snapped. LADEE appears in four lines of the LROC image, and is distorted right-to-left.

Both spacecraft are tiny – barely two meters in length.

“Since LROC is a pushbroom imager, it builds up an image one line at a time, thus catching a target as small and fast as LADEE is tricky!” wrote Mark Robinson, LROC principal investigator of Arizona State University.
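The numbers quoted in this story hang together nicely, and a quick back-of-envelope check (our arithmetic, using only the figures given above) shows why the image is smeared:

```python
import math

# Figures from the article: each orbiter moves at ~1,600 m/s on
# perpendicular tracks; LADEE crossed the NAC's field of view in
# 1.35 milliseconds and shows up on four image lines.
v = 1600.0                    # m/s, each spacecraft
v_rel = math.hypot(v, v)      # ~2,263 m/s relative speed (perpendicular tracks)
crossing = 1.35e-3            # seconds spent in the field of view
smear = v_rel * crossing      # ~3 m of relative motion during the exposure
line_time = crossing / 4      # ~0.34 ms spent on each of the four image lines

print(f"relative speed : {v_rel:7.0f} m/s")
print(f"motion in view : {smear:7.2f} m (vs. a ~2 m spacecraft)")
print(f"time per line  : {line_time * 1e3:7.2f} ms")
```

Three meters of relative motion across a two-meter spacecraft is exactly the kind of blur visible in the released frame.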

So the fabulous picture was only possible as a result of close collaboration and extraordinary teamwork between NASA’s LADEE, LRO and LROC camera mission operations teams.

NASA’s LRO imaged NASA’s LADEE, about 5.6 miles (9 km) beneath it, at 8:11 p.m. EST on Jan. 14, 2014 (LROC NAC image M1144387511LR). Image width is 821 meters, or about 898 yards. Credit: NASA/Goddard/Arizona State University

LADEE passed directly beneath the LRO orbit plane a few seconds before LRO crossed the LADEE orbit plane, meaning a straight down LROC image would have just missed LADEE, said NASA.

LRO spacecraft (top) protected by gray colored blankets is equipped with 7 science instruments located at upper right side of spacecraft. LRO cameras are pointing to right. LRO is piggybacked atop NASA’s LCROSS spacecraft. Payload fairing in background protects the spacecraft during launch and ascent. Credit: Ken Kremer

Therefore, LRO was rolled 34 degrees to the west so the LROC detector (one line) would be precisely oriented to catch LADEE as it passed beneath.

“Despite the blur it is possible to find details of the spacecraft. You can see the engine nozzle, bright solar panel, and perhaps a star tracker camera (especially if you have a correctly oriented schematic diagram of LADEE for comparison),” wrote Robinson in a description.

See the LADEE schematic in the lead image herein.

LADEE was launched Sept. 6, 2013 from NASA Wallops in Virginia on a science mission to investigate the composition and properties of the Moon’s pristine and extremely tenuous atmosphere, or exosphere, and untangle the mysteries of its lofted lunar dust.

Since LADEE is now more than halfway through its roughly 100 day long mission, timing was of the essence before the craft takes a death dive into the moon’s surface.

You can see a full scale model of LADEE at the NASA Wallops visitor center, which offers free admission.

Full scale model of NASA’s LADEE lunar orbiter on display at the free visitor center at NASA’s Wallops Flight Facility in Virginia. Credit: Ken Kremer/kenkremer.com

LRO launched June 18, 2009 from Cape Canaveral, Florida to conduct comprehensive investigations of the Moon with seven science instruments and search for potential landing sites for a return by human explorers. It has collected astounding views of the lunar surface, including the manned Apollo landing sites, as well as a treasure trove of lunar data.

In addition to NASA’s pair of lunar orbiters, China recently soft landed two probes on the Moon.

So be sure to read my new story detailing how LRO took some stupendous Christmas 2013 images of China’s maiden lunar lander and rover, Chang’e-3 and Yutu, from high above, here.

Stay tuned here for Ken’s continuing LADEE, Chang’e-3, Orion, Orbital Sciences, SpaceX, commercial space, Mars rover and more news.

Ken Kremer

Launch of NASA’s LADEE lunar orbiter on Friday night Sept. 6, at 11:27 p.m. EDT on the maiden flight of the Minotaur V rocket from NASA Wallops, Virginia, viewing site 2 miles away. Antares rocket launch pad at left. Credit: Ken Kremer/kenkremer.com

A Secret Solar Eclipse from Outer Space

The sun seen in six different colors of wavelengths of light as the moon passed across from the perspective of NASA's Solar Dynamics Observatory this morning between about 7:30 and 10 a.m. CST. Credit: NASA

Call it the eclipse nobody saw. NASA’s Solar Dynamics Observatory (SDO) got its own private solar eclipse showing from its geosynchronous orbital perch today. Twice a year during new phase, the moon glides in front of the sun from the observatory’s perspective. Although we can’t be there in person to see it, the remote view isn’t too shabby. The events are called lunar transits rather than eclipses since they’re seen from outer space. Transits typically last about a half hour, but at 2.5 hours, today’s was one of the longest ever recorded. The next one occurs on July 26, 2014.


Today’s lunar transit of the sun followed by a strong solar flare

When an eclipse ends, the fun is usually over, but not this time. Just as the moon slid off the sun’s fiery disk, a strong M6.6 solar flare exploded from within a new, very active sunspot group rounding the eastern limb and blasted a CME (coronal mass ejection) into space. What a show!

Approximate view of the moon transiting the sun from SDO’s viewpoint. To make sure SDO didn’t run down its batteries when the sun was blocked, mission control juiced them up beforehand. Credit: NASA

SDO circles Earth in a geosynchronous orbit about 22,000 miles high and photographs the sun continuously, day and night, from a vantage point high above Mexico and the Pacific Ocean. About 1.5 terabytes of solar data, the equivalent of half a million songs from iTunes, are downloaded to antennas in White Sands, New Mexico every day.
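Those download figures are easy to sanity-check. In the sketch below, the three-megabytes-per-song size is our assumption; everything else comes from the 1.5-terabytes-per-day figure quoted above.

```python
# Sanity check on SDO's quoted downlink volume: 1.5 terabytes per day.
bytes_per_day = 1.5e12
seconds_per_day = 86400

# Sustained downlink rate implied by that daily volume.
bits_per_second = bytes_per_day * 8 / seconds_per_day   # ~139 Mbit/s

# Assuming a typical compressed song is ~3 MB, 1.5 TB is indeed
# about half a million songs.
songs = bytes_per_day / 3e6

print(f"sustained rate: {bits_per_second / 1e6:.0f} Mbit/s")
print(f"~{songs:,.0f} three-megabyte songs per day")
```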

For comparison, the space station, which orbits much closer to Earth, would make a poor solar observatory, since Earth blocks the sun for half of every 90 minute orbit.

When you look at the still pictures and video, notice how distinct the edge of the moon appears. With virtually no atmosphere, the moon takes a “sharp” bite out of the sun.

SDO orbits about 22,000 miles above Earth, tracing out a figure-8 (called an analemma) above the Pacific and Mexico every 24 hours. Credit: NASA

SDO amazes with its spectacular pictures of the sun taken in 10 different wavelengths of light every 10 seconds; additional instruments study vibrations on the sun’s surface, magnetic fields and how much UV radiation the sun pours into space.

Compared to all the hard science, the twice a year transits are a sweet side benefit much like the cherries topping a sundae.

You can make your own movie of today’s partial eclipse by visiting the SDO website and following these easy steps:

* Click on the Data tab and select AIA/HMI Browse Data
* Click on the Enter Start Date window, select a start date and time and click Done
* Click on Enter End Date and click Done
* Under Telescopes, pick the color (wavelength) sun you want
* Select View in the display box
* Click Submit at the bottom and watch a video of your selected pictures

Earth’s Water Story Gets A Plot Twist From Space Rock Search

Artist's conception of asteroids and a gas giant planet. Credit: Harvard-Smithsonian Center for Astrophysics

We at Universe Today have snow on our minds these days with all this Polar Vortex talk. From out the window, the snowflakes all look the same, but peer at flakes under a microscope and you can see all these different designs pop up. Turns out that our asteroid belt between Mars and Jupiter is also much more diverse than previously believed, all because astronomers took the time to do a detailed survey.

Here’s the interesting thing: the diversity, the team says, implies that Earth-like planets would be hard to find, which could be a blow for astronomers seeking an Earth 2.0 somewhere out in the universe if other research agrees.

To jump back a couple of steps, there’s a debate about how water arose on Earth. One theory is that back billions of years ago when the solar system was settling into its current state — a time when planetesimals were crashing into each other constantly and the larger planets possibly migrated between different orbits — comets and asteroids bearing water crashed into a proto-Earth.

Artist's conception of asteroids or comets bearing water to a proto-Earth. Credit: Harvard-Smithsonian Center for Astrophysics
Artist’s conception of asteroids or comets bearing water to a proto-Earth. Credit: Harvard-Smithsonian Center for Astrophysics

“If true, the stirring provided by migrating planets may have been essential to bringing those asteroids,” the astronomers stated in a press release. “This raises the question of whether an Earth-like exoplanet would also require a rain of asteroids to bring water and make it habitable. If so, then Earth-like worlds might be rarer than we thought.”

To take this example further, the researchers found that the asteroid belt’s members come from a mix of locations around the solar system. A model the astronomers cite shows that Jupiter once migrated much closer to the sun, to roughly the same distance where Mars orbits now.

When Jupiter migrated, it disturbed everything in its wake and possibly removed as much as 99.9 per cent of the original asteroid population. Other planet migrations, in turn, threw rocks from all over the solar system into the asteroid belt. This means the origin of water in the belt could be more complicated than previously believed.

You can read more details of the survey in the journal Nature. Data was gathered from the Sloan Digital Sky Survey and the research was led by Francesca DeMeo, a Hubble postdoctoral fellow at the Harvard-Smithsonian Center for Astrophysics.

Source: Harvard-Smithsonian Center for Astrophysics

‘Stupid Astronaut Tricks’ Spread The Joy of Space To New Astronaut Class

NASA astronaut candidate Christina Hammock starts a fire successfully during wilderness survival training near Rangeley, Maine in August 2013. Credit: NASA/Lauren Harnett

You sure couldn’t hide those grins on television from the Astronaut Candidate Class of 2013 when the call came from the International Space Station.

NASA’s latest recruits were at the Smithsonian National Air and Space Museum in Washington, D.C. at an event today (Thursday) for students. Amid the many youngster questions to Expedition 38 astronauts Mike Hopkins and Rick Mastracchio, astronaut candidate Jessica Meir managed one of her own: was the wait worth it?

Hovering in front of the camera, four-time flyer Mastracchio vigorously shook his hand “no” to laughter from the audience. Hopkins answered her more seriously: “It is definitely worth it. It is the most amazing experience I think you can ever have. Floating is just truly incredible; it just never gets old.”

Minutes later, Hopkins demonstrated a “stupid astronaut trick”: doing Road Runner-style sprinting in place in mid-air. The laughing crew signed off — “So they’re floating off now?” asked event moderator and veteran astronaut Leland Melvin — and the new class had the chance to answer questions of their own.

While the class expressed effusive delight at being astronauts — they were hired last year, so the feeling is quite new to them — Meir said that there was some sadness at leaving the careers they had before. As a recent article in Air & Space Smithsonian pointed out, this class will have several years to wait for a ride into space, because robust shuttle crews of seven people are no longer going up several times a year. The Soyuz carries only three people at a time, and the missions are fewer and longer.

There is also some ambiguity about where the astronauts will go. The International Space Station has been extended until at least 2024, but astronaut candidate Anne McClain added today that an asteroid or Mars are other destinations being considered for their class. “This class is such an exciting time to be at NASA,” she said.

Other questions asked of the class at the event included who will fly in space first and, from a wee future astronaut, which planet they’d prefer to visit. You can watch the whole broadcast at the link above.

Behind the Scenes: The “Making Of” the First Brown Dwarf Surface Map

Two views of the first brown dwarf map
Two views of the brown dwarf map for Luhman 16B. Image credit: ESO/I. Crossfield

By now, you will probably have heard that astronomers have produced the first global weather map for a brown dwarf. (If you haven’t, you can find the story here.) Maybe you’ve even built the cube model or the origami balloon model of the surface of the brown dwarf Luhman 16B the researchers provided (here).

Since one of my hats is that of public information officer at the Max Planck Institute for Astronomy, where most of the map-making took place, I was involved in writing a press release about the result. But one aspect that I found particularly interesting didn’t get much coverage there. It’s that this particular bit of research is a good example of how fast-paced astronomy can be these days, and, more generally, it shows how astronomical research works. So here’s a behind-the-scenes look – a making-of, if you will – for the first brown dwarf surface map (see image on the right).

As in other sciences, if you want to be a successful astronomer, you need to do something new, and go beyond what’s been done before. That, after all, is what publishable new results are all about. Sometimes, such progress is driven by larger telescopes and more sensitive instruments becoming available. Sometimes, it’s about effort and patience, such as surveying a large number of objects and drawing conclusions from the data you’ve gathered.

Ingenuity plays a significant role. Think of the telescopes, instruments and analytical methods developed by astronomers as the tools in a constantly growing tool box. One way of obtaining new results is to combine these tools in new ways, or to apply them to new objects.

That’s why our opening scene is nothing special in astronomy: It shows Ian Crossfield, a post-doctoral researcher at the Max Planck Institute for Astronomy, and a number of colleagues (including institute director Thomas Henning) in early March 2013, discussing the possibility of applying one particular method of mapping stellar surfaces to a class of objects that had never been mapped in this way before.

The method is called Doppler imaging. It makes use of the fact that light from a rotating star is slightly shifted in frequency as the star rotates. As different parts of the stellar surface go by, whisked around by the star’s rotation, the frequency shifts vary slightly depending on where the light-emitting region is located on the star. From these systematic variations, an approximate map of the stellar surface can be reconstructed, showing darker and brighter areas. Stars are much too distant for even the largest current telescopes to discern surface details directly, but in this way, a surface map can be reconstructed indirectly.
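To get a feel for the geometry, here is a minimal sketch in Python. The numbers are illustrative, not the actual Luhman 16B parameters, and the function name is my own: the point is simply that as rotation carries a surface element from the approaching limb to the receding limb, its contribution to a spectral line slides from the blue wing to the red wing.

```python
import math

C = 299_792.458    # speed of light, km/s
V_EQ = 26.0        # assumed equatorial rotation speed, km/s (illustrative)
LAMBDA0 = 2300.0   # assumed rest wavelength of an infrared line, in nm

def doppler_shift(longitude_deg):
    """Wavelength shift of light from an equatorial surface element.

    longitude_deg = 0 puts the element at the center of the visible disk;
    -90 is the approaching limb, +90 the receding limb.
    """
    v_los = V_EQ * math.sin(math.radians(longitude_deg))  # line-of-sight velocity
    return LAMBDA0 * v_los / C                            # non-relativistic Doppler shift

# As rotation carries a spot across the disk, its absorption signature
# migrates from the blue side of the line to the red side:
for lon in (-90, -45, 0, 45, 90):
    print(f"longitude {lon:+4d} deg -> shift {doppler_shift(lon):+.4f} nm")
```

A dark spot weakens the light at whatever shift corresponds to its current position, so tracking those moving distortions over a full rotation is what lets the map be reconstructed.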

The method itself isn’t new. The basic concept was invented in the late 1950s, and the 1980s saw several applications to bright, slowly rotating stars, with astronomers using Doppler imaging to map those stars’ spots (dark patches on a stellar surface; the stellar analogue to Sun spots).

Crossfield and his colleagues were wondering: Could this method be applied to a brown dwarf – an intermediary between planet and star, more massive than a planet, but with insufficient mass for nuclear fusion to ignite in the object’s core, turning it into a star? Sadly, some quick calculations, taking into account what current telescopes and instruments can and cannot do as well as the properties of known brown dwarfs, showed that it wouldn’t work.

The available targets were too faint, and Doppler imaging needs lots of light: for one thing because you need to split the available light into the myriad colors of a spectrum, and for another because you need to take many rather short measurements in succession – after all, you need to monitor how the subtle frequency shifts caused by the Doppler effect change over time.

So far, so ordinary. Most discussions of how to make observations of a completely new type probably come to the conclusion that it cannot be done – or cannot be done yet. But in this case, another driver of astronomical progress made an appearance: The discovery of new objects.

Artist's impression of the WISE satellite
Kevin Luhman discovered the brown dwarf pair in data from NASA’s Wide-field Infrared Survey Explorer (WISE; artist’s impression). Image: NASA/JPL-Caltech

On March 11, Kevin Luhman, an astronomer at Penn State University, announced a momentous discovery: Using data from NASA’s Wide-field Infrared Survey Explorer (WISE), he had identified a system of two brown dwarfs orbiting each other. Remarkably, this system was at a distance of a mere 6.5 light-years from Earth. Only the Alpha Centauri star system and Barnard’s star are closer to Earth than that. In fact, the last discovery of an object that close to our Solar System was that of Barnard’s star – back in 1916.

Modern astronomers are not known for coming up with snappy names, and the new object, which was designated WISE J104915.57-531906.1, was no exception. To be fair, this is not meant to be a real name; it’s a combination of the discovery instrument WISE with the system’s coordinates in the sky. Later, the alternative designation “Luhman 16AB” for the system was proposed, as this was the 16th binary system discovered by Kevin Luhman, with A and B denoting the binary system’s two components.

These days, the Internet gives the astronomical community immediate access to new discoveries as soon as they are announced. Many, probably most astronomers begin their working day by browsing recent submissions to astro-ph, the astrophysical section of the arXiv, an international repository of scientific papers. With a few exceptions – some journals insist on exclusive publication rights for at least a while –, this is where, in most cases, astronomers will get their first glimpse of their colleagues’ latest research papers.

Luhman posted his paper “Discovery of a Binary Brown Dwarf at 2 Parsecs from the Sun” on astro-ph on March 11. For Crossfield and his colleagues at MPIA, this was a game-changer. Suddenly, here was a brown dwarf for which Doppler imaging could conceivably work, and yield the first ever surface map of a brown dwarf.

However, it would still take the light-gathering power of one of the largest telescopes in the world to make this happen, and observation time on such telescopes is in high demand. Crossfield and his colleagues decided they needed one more test before applying. Any object suitable for Doppler imaging will flicker ever so slightly, growing slightly brighter and darker in turn as brighter or darker surface areas rotate into view. Did Luhman 16A or 16B flicker – in astronomer-speak: did one of them, or perhaps both, show high variability?
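As a rough illustration of that flickering, here is a toy light-curve model in Python. The spot size, contrast, and starting longitude are invented for the example (only the roughly five-hour rotation period is in the right ballpark for Luhman 16B): a single dark spot dims the total light whenever rotation carries it across the visible disk.

```python
import math

PERIOD_H = 4.9      # rotation period in hours (roughly Luhman 16B's)
SPOT_FRAC = 0.05    # assumed fraction of the disk covered by the spot
CONTRAST = 0.3      # assumed brightness deficit of the spot (30% darker)

def relative_flux(t_hours):
    """Relative brightness of a star with one dark equatorial spot.

    The spot dims the total light in proportion to its projected
    (foreshortened) area; it contributes nothing while on the far side.
    """
    phase = 2 * math.pi * t_hours / PERIOD_H
    projection = max(0.0, math.cos(phase))   # foreshortening; 0 when spot is hidden
    return 1.0 - SPOT_FRAC * CONTRAST * projection

# Sample one rotation: the flux dips once per period, when the spot faces us.
curve = [relative_flux(i * PERIOD_H / 8) for i in range(8)]
print(min(curve), max(curve))
```

Real brown dwarf variability is messier – many spots, evolving clouds – but the principle is the same: a measurable, periodic brightness variation is the tell-tale sign that the surface is patchy enough for Doppler imaging to have something to map.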

Astronomy comes with its own time scales. Communication via the Internet is fast. But if you have a new idea, then ordinarily, you can’t just wait for night to fall and point your telescope accordingly. You need to get an observation proposal accepted, and this process takes time – typically half a year to a year passes between the proposal and the actual observations. Also, applying is anything but a formality. Large facilities, like the European Southern Observatory’s Very Large Telescope, or space telescopes like the Hubble, typically receive applications for more than five times the observing time that is actually available.

But there’s a short-cut – a way for particularly promising or time-critical observing projects to be completed much faster. It’s known as “Director’s Discretionary Time”, as the observatory director – or a deputy – is entitled to distribute this chunk of observing time at their discretion.

Image of the MPG/ESO 2.2 m telescope at ESO's La Silla observatory
To monitor Luhman 16A and B’s brightness fluctuations, Beth Biller used the MPG/ESO 2.2-meter telescope at ESO’s La Silla Observatory. Image credit: ESO/José Francisco Salgado (josefrancisco.org)

On April 2, Beth Biller, another MPIA post-doc (she is now at the University of Edinburgh), applied for Director’s Discretionary Time on the MPG/ESO 2.2 m telescope at ESO’s La Silla observatory in Chile. The proposal was approved the same day.

Biller’s proposal was to study Luhman 16A and 16B with an instrument called GROND. The instrument had been developed to study the afterglows of powerful, distant explosions known as gamma ray bursts. With ordinary astronomical objects, astronomers can take their time. These objects will not change much over the few hours an astronomer makes observations, first using one filter to capture one range of wavelengths (think “light of one color”), then another filter for another wavelength range. (Astronomical images usually capture one range of wavelengths – one color – at a time. If you look at a color image, it’s usually the result of a series of observations, one color filter at a time.)

Gamma ray bursts and other transient phenomena are different. Their properties can change on a time scale of minutes, leaving no time for consecutive observations. That is why GROND allows for simultaneous observations of seven different colors.

Biller had proposed to use GROND’s unique capability to record brightness variations for Luhman 16A and 16B in seven different colors simultaneously – a kind of measurement that had never been done before at this scale. The most simultaneous information researchers had gotten from a brown dwarf had been at two different wavelengths (work by Esther Buenzli, then at the University of Arizona’s Steward Observatory, and colleagues). Biller was going for seven. As slightly different wavelength regimes contain information about gas at slightly different temperatures, such measurements promised insight into the layer structure of these brown dwarfs – with different temperatures corresponding to different atmospheric layers at different heights.

For Crossfield and his colleagues – Biller among them –, such a measurement of brightness variations should also show whether or not one of the brown dwarfs was a good candidate for Doppler imaging.

Image of the TRAPPIST telescope in its dome.
The robotic telescope TRAPPIST, also at ESO’s La Silla observatory, was the first to find brightness fluctuations of the brown dwarf Luhman 16B.

As it turned out, they didn’t even have to wait that long. A group of astronomers around Michaël Gillon had pointed the small robotic telescope TRAPPIST – designed for detecting exoplanets by the brightness variations they cause when passing between their host star and an observer on Earth – at Luhman 16AB. The same day that Biller had applied for observing time and her application had been approved, the TRAPPIST group published a paper, “Fast-evolving weather for the coolest of our two new substellar neighbours”, charting brightness variations for Luhman 16B.

This news caught Crossfield thousands of miles from home. Some astronomical observations do not require astronomers to leave their cozy offices – the proposal is sent to staff astronomers at one of the large telescopes, who make the observations once the conditions are right and send the data back via the Internet. But other types of observations do require astronomers to travel to whatever telescope is being used – to Chile, say, or to Hawaii.

When the brightness variations for Luhman 16B were announced, Crossfield was observing in Hawaii. He and his colleagues realized right away that, given the new results, Luhman 16B had moved from being a possible candidate for the Doppler imaging technique to being a promising one. On the flight from Hawaii back to Frankfurt, Crossfield quickly wrote an urgent observing proposal for Director’s Discretionary Time on CRIRES, a spectrograph installed on one of the 8 meter Very Large Telescopes (VLT) at ESO’s Paranal observatory in Chile, submitting his application on April 5. Five days later, the proposal was accepted.

View of the 8 meter telescope Antu
Antu, the first of the four 8 meter Unit Telescopes (UTs) of the Very Large Telescope (VLT) shortly after installation in 2000. Image: ESO

On May 5, the giant 8 meter mirror of Antu, one of the four Unit Telescopes of the Very Large Telescope, turned towards the Southern constellation Vela (the “Sail of the Ship”). The light it collected was funneled into CRIRES, a high-resolution infrared spectrograph that is cooled down to about -200 degrees Celsius (-330 Fahrenheit) for better sensitivity.

Three and two weeks earlier, respectively, Biller’s two observing runs had yielded rich data about the variability of both brown dwarfs in the intended seven different wavelength bands.

At this point, no more than two months had passed between the original idea and the observations. But paraphrasing Edison’s famous quip, observational astronomy is 1% observation and 99% evaluation, as the raw data are analyzed, corrected, compared with models and inferences made about the properties of the observed objects.

For Beth Biller’s multi-wavelength monitoring of brightness variations, this took about five months. In early September, Biller and 17 coauthors, Crossfield and numerous other MPIA colleagues among them, submitted their article to the Astrophysical Journal Letters (ApJL); after some revisions, it was accepted on October 17. From October 18 onward, the results were accessible online at astro-ph, and a month later they were published on the ApJL website.

In late September, Crossfield and his colleagues had finished their Doppler imaging analysis of the CRIRES data. Results of such an analysis are never 100% certain, but the astronomers had found the most probable structure of the surface of Luhman 16B: a pattern of brighter and darker spots; clouds made of iron and other minerals drifting on hydrogen gas.

As is usual in the field, the text they submitted to the journal Nature was sent out to a referee – a scientist, who remains anonymous, and who gives recommendations to the journal’s editors on whether or not a particular article should be published. Most of the time, even for an article the referee thinks should be published, he or she has some recommendations for improvement. After some revisions, Nature accepted the Crossfield et al. article in late December 2013.

With Nature, you are only allowed to post the final, revised version on astro-ph or similar servers six months or more after publication in the journal. So while a number of colleagues will have heard about the brown dwarf map on January 9 at a session of the 223rd Meeting of the American Astronomical Society in Washington, D.C., for the wider astronomical community the online publication, on January 29, 2014, will have been the first glimpse of this new result. And you can bet that, seeing the brown dwarf map, a number of them will have started thinking about what else one could do. Stay tuned for the next generation of results.

And there you have it: 10 months of astronomical research, from idea to publication, resulting in the first surface map of a brown dwarf (Crossfield et al.) and the first seven-wavelength-band study of brightness variations of two brown dwarfs (Biller et al.). Taken together, the studies provide a fascinating image of complex weather patterns on an object somewhere between a planet and a star – the beginning of a new era for brown dwarf studies, and an important step towards another goal: detailed surface maps of giant gas planets around other stars.

On a more personal note, this was my first ever press release to be picked up by the Weather Channel.