DARPA’s Experimental Spaceplane XS-1 Starts Development

Concept images for DARPA’s Experimental Spaceplane (XS-1) program. Credit: DARPA.

The Defense Advanced Research Projects Agency (DARPA) is looking to develop a fully reusable unmanned spaceplane, and it is now ready to start work on its proposed Experimental Spaceplane (XS-1). The agency has assembled a “special forces” team of sorts from the space industry, awarding prime contracts for the first phase of development to six companies. Mixing “old” and “new” space, the three teams are:

  • The Boeing Company (working with Blue Origin, LLC)
  • Masten Space Systems (working with XCOR Aerospace)
  • Northrop Grumman Corporation (working with Virgin Galactic)

“We chose performers who could prudently integrate existing and up-and-coming technologies and operations, while making XS-1 as reliable, easy-to-use and cost-effective as possible,” said Jess Sponable, DARPA program manager. “We’re eager to see how their initial designs envision making spaceflight commonplace—with all the potential military, civilian and commercial benefits that capability would provide.”

Each commercial entity will be able to outline its own vision of the XS-1, but DARPA wants the spaceplane to provide aircraft-like access to space for deploying small satellites to orbit. Through its development, the agency also hopes to create technology for next-generation hypersonic vehicles, and to do it more affordably.

They envision that a reusable first stage would fly to hypersonic speeds at a suborbital altitude. Then, one or more expendable upper stages would separate and deploy a satellite into low Earth orbit (LEO). The reusable first stage would then return to Earth, land, and be prepared for the next flight.

Key to the development, DARPA says, are modular components, durable thermal protection systems and automatic launch, flight and recovery systems that should significantly reduce logistical needs, enabling rapid turnaround between flights.

DARPA’s key technical goals for the XS-1 include flying 10 times in 10 days, flying to Mach 10+ at least once and launching a representative small payload to orbit. The program also seeks to reduce the cost of access to space for 3,000- to 5,000-pound payloads to less than $5 million per flight.
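
To put that last goal in perspective, here is the simple arithmetic behind it, using only the payload range and price ceiling quoted above (a rough sketch, not a DARPA figure):

```python
# Rough cost-per-pound implied by DARPA's stated goal:
# under $5 million per flight for a 3,000-5,000 lb payload.
cost_per_flight = 5_000_000  # dollars (upper bound of the goal)

for payload_lb in (3_000, 5_000):
    print(f"{payload_lb:,} lb payload -> under ${cost_per_flight / payload_lb:,.0f} per lb")
# i.e. roughly $1,000-$1,700 per pound of payload to orbit.
```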

Source: DARPA

Big Thinking: You Create An X-Prize Contest To Solve A Pressing Problem

The X-Prize-winning SpaceShipOne (credit: Scaled Composites)

Remember the Ansari X-Prize, the race about a decade ago to be the first private spaceship to reach space and return? The result not only saw Burt Rutan’s SpaceShipOne make it to suborbital space, but also launched Virgin Galactic — one of the most talked-about space companies today.

Imagine if you had a burning problem that you wanted to solve. It could be related to space exploration, or astronomy, or climate change, or something else altogether.

In recognition of this, the X-Prize foundation has spun off a new company called HeroX to crowdsource ideas and funding for a prize competition. And Universe Today’s Fraser Cain, who has just joined the organization, wants readers to help him out with the ImagineX challenge! More details below.

“Imagine if a large enough group of people could come together, pool their resources, and issue a challenge that would inspire competitors to solve it,” Fraser wrote on his Google+ page, pointing out the X-Prize organization itself has broadened its scope to contests related to oil cleanup and low-carbon emission vehicles, among others.

“The goal with HeroX is that anyone can come and create a challenge,” he added. “And then anyone can pledge to help fund the prize. And then anyone can compete to solve the challenge and win the prize.”

According to HeroX’s ImagineX website, these are the broad guidelines:

  • Addresses a problem that people want solved
  • Is important to a large number of people
  • Is solvable
  • Engages people in discussing, competing and solving the challenge
  • Provides all the required information for a challenge to run on HeroX.com

Submissions will be judged on quality, crowd engagement and influence, and crowd appeal, and you can read more detailed guidelines here. Think carefully about your idea and, when you’re ready, be sure to contribute before the deadline of Sept. 1, 2014. Winners will receive a cool $10,000.

For more information, read up on ImagineX here.

NASA Mars Lander InSight ‘Go’ For Construction

Artist's conception of the NASA InSight Mars lander. Credit: NASA/JPL-Caltech

It’s time to get ready for Mars, again! NASA has given the approval to begin construction on its 2016 mission, the Interior Exploration Using Seismic Investigations, Geodesy and Heat Transport (InSight) mission.

As the name implies, the lander (which, unlike a rover, is stationary) will focus on learning more about the inside of Mars. The idea is to figure out how terrestrial planets are “differentiated” inside into core, mantle and crust. Watchers of the Mars program may also recognize some parts of the lander, as it borrows its design from the successful Phoenix mission, which landed in 2008.

“We will incorporate many features from our Phoenix spacecraft into InSight, but the differences between the missions require some differences in the InSight spacecraft,” stated Stu Spath, InSight program manager at Lockheed Martin.

“For example, the InSight mission duration is 630 days longer than Phoenix, which means the lander will have to endure a wider range of environmental conditions on the surface.”

View of Mars’ surface near the north pole from the Phoenix lander. Credit: NASA/JPL-Caltech/University of Arizona

NASA mission planners are still determining where InSight will go, but they expect it will be a site near the equator of Mars and that it will last at least two years on the surface.

The Mars lander will include a robotic arm with “surface and burrowing” instruments whose projects are led by the French and German space agencies, CNES (National Center of Space Studies) and DLR (German Center for Aerospace), respectively. CNES will contribute a seismic experiment to listen for “Marsquakes” and meteorite impacts on the surface, while DLR’s science experiment will look at interior planetary heat.

Mars on March 8, 2014 shows not only clouds over Hellas but evening limb clouds. Credit: W.L. Chin

The seismometer will sit on the surface, covered up to protect it from the cold and wind, while the heat-flow probe will be hammered in to a depth of about three to five meters (10 to 16 feet). Investigators also plan an experiment that will communicate with NASA’s Deep Space Network antennas to measure how much the rotation of Mars wobbles, which could hint at whether the core of the Red Planet is solid or liquid. The mission will also include wind, temperature and pressure sensors, as well as a magnetometer.

“Mars actually offers an advantage over Earth itself for understanding how habitable planetary surfaces can form,” stated Bruce Banerdt, InSight principal investigator at NASA’s Jet Propulsion Laboratory. “Both planets underwent the same early processes. But Mars, being smaller, cooled faster and became less active while Earth kept churning. So Mars better preserves the evidence about the early stages of rocky planets’ development.”

Construction will be led by Lockheed Martin. You can check out more information about InSight at this website. NASA has several missions working at Mars right now, such as the Mars Curiosity rover, the Opportunity rover and the orbiting Mars Reconnaissance Orbiter and Mars Odyssey spacecraft.

Source: Jet Propulsion Laboratory

What to Wear? The History and Future of Spacesuits


The issue of “what to wear?” takes on an extra dimension of life and death when it comes to space travel. Upon exiting a spacecraft on a spacewalk, an astronaut becomes their very own personal satellite in orbit around the Earth and must rely on the flimsy layers of their suit for a small degree of protection from radiation and extreme fluctuations of heat and cold.

We recently had a chance to see the past, present and future of space suit technology in the Smithsonian Institution’s touring Suited for Space exhibit, currently on display at the Tampa Bay History Center in Tampa, Florida.

Tampa Bay History Center Director of Marketing Manny Leto recently gave Universe Today an exclusive look at the traveling display. If you think you know space suits, Suited for Space will show you otherwise, as well as give you a unique perspective on a familiar but often overlooked and essential piece of space hardware. And heck, it’s just plain fascinating to see the design and development of some of these earlier suits as well as videos and stills of astronauts at work – and yes, sometimes even at play – in them.

One of the highlights of the exhibit is a set of unique X-ray images of iconic suits from space travel history. Familiar suits become new again in these images by Smithsonian photographer Mark Avino, which include a penetrating view of the spacesuit Neil Armstrong wore on Apollo 11.

X-ray images of Neil Armstrong’s historic suit on display in Suited for Space. (Photo by author).

Space suits evolved from pressure suits developed for high-altitude flights in the 1950s, and Suited for Space traces that progression. It was particularly interesting to see the depiction of Wiley Post’s 1934 suit, complete with steel cylindrical helmet and glass portal! Such early suits resembled the diving bell suits of yore — think Captain Nemo in a chemsuit. Still, this antiquated contraption was the first practical full pressure suit that functioned successfully at over 13,000 metres altitude.

Wiley Post’s 1934 “rubber bladder suit.” (Photo by author).

No suit that has actually flown in space is allowed to tour due to the fragility of many historic originals, which are now kept at the Smithsonian, though several authentic suits used in training during the U.S. space program are on display. We thought it was interesting to note how the evolution of the spacesuit closely followed the development of composites and materials through the mid-20th century. You can see the progression from canvas, glass and steel in the early suits right up through the advent of the age of plastics and modern fabrics. Designers flirted with rigid and semi-rigid suits before settling on the familiar modern-day white astronaut suit.

An X-ray photo of an EX-1A spacesuit. (Photo by author).

Spacesuit technology has also always faced the ultimate challenge of protecting an astronaut from the rigors of space during Extra-Vehicular Activity, or EVA.

Cosmonaut Alexey Leonov performed the first spacewalk, lasting 12 minutes, during Voskhod 2 back in 1965, and NASA astronaut Ed White became the first American to walk in space on Gemini 4 just months later. Both spacewalkers had issues with overheating, and White nearly didn’t make it back into his Gemini capsule.

Early evolution of space suits on display at the Suited for Space exhibit. (Photo by author).

Designing a proper spacesuit was a major challenge that had to be overcome. In 1962, Playtex (yes THAT Playtex) was awarded a contract to develop the suits that astronauts would wear on the Moon. Said suits had 13 distinct layers and weighed 35 kilograms here on Earth. The Playtex industrial division eventually became known as the International Latex Corporation or ILC Dover, which still makes spacesuits for ISS crewmembers today. It’s also fascinating to see some of the alternate suits proposed, including one “bubble suit” with arms and legs (!) that was actually tested but, thankfully, was never used.

These suits were used by astronauts to walk on the Moon, repair Hubble, build the International Space Station and much more. Al Worden recounts performing the “most distant EVA ever” on the return from the Moon in his book Falling to Earth. That record will stand at least until the proposed asteroid retrieval mission in the coming decade, which would see astronauts performing the first EVA ever conducted in orbit around Earth’s Moon.

An A5-L Spacesuit. Credit: Smithsonian/Suited for Space.

And working in a modern spacesuit during an EVA is anything but routine. CSA Astronaut Chris Hadfield said in his recent book An Astronaut’s Guide to Life on Earth that “Spacewalking is like rock climbing, weightlifting, repairing a small engine and performing an intricate pas de deux – simultaneously, while encased in a bulky suit that’s scraping your knuckle, fingertips and collarbone raw.”

And one only has to look at the recent drama that cut ESA astronaut Luca Parmitano’s EVA short last year to realize that your spacesuit is the only thin barrier between you and the perils of space.

“We’re delighted to host our first Smithsonian Institution Travelling Exhibition Service (SITES) and we think that Florida’s close ties to NASA and the space program make it a great fit for us,” said Rodney Kite-Powell, the Tampa Bay History Center’s Saunders Foundation Curator of History.

Be sure to catch this fascinating exhibit coming to a city near you!

-And you can see these suits in action on the upcoming EVAs for 2014.

-Here’s the schedule for the Suited for Space exhibit tour.

-Astronaut Nicole Stott (veteran of STS-128, -129, -133, and ISS Expeditions 20 and 21) will also be on hand at the Tampa Bay History Center in March 2014 (date to be announced) to present Suited for Space: An Astronaut’s View.

-Follow the Tampa Bay History Center on Twitter as @TampaBayHistory.

 

Space Station to Get a ‘Laser Cannon’

CATS in the laboratory. Credit: NASA/GSFC.

What’s a space station without a laser cannon?

The International Space Station will be getting its very own laser at the end of 2014. And unlike the planet-smashing capabilities of the Death Star of Star Wars fame, this laser will be enlisted for the purpose of science.

It’s called CATS, and no, it isn’t the latest attempt to put feline astronauts in space. CATS stands for the Cloud Aerosol Transport System. The goal of CATS is to study the distribution of tiny particles of dust and air contaminants known as aerosols.

Developed by research scientist Matt McGill at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, CATS is slated to head to the International Space Station later this year, on September 12th, aboard SpaceX’s CRS-5 flight of the Dragon spacecraft. CATS will be installed on the Japanese Experiment Module-Exposed Facility (JEM-EF) and will demonstrate the utility of state-of-the-art multi-wavelength laser technology for studying aerosol distribution and transport in the atmosphere.

Such knowledge is critical to understanding the path and circulation of aerosols and pollutants worldwide. When the Eyjafjallajökull volcano erupted in Iceland back in 2010, many trans-Atlantic flights were grounded as a precaution. Such measures are necessary because several flights have suffered engine failures in the past due to encounters with volcanic ash clouds, such as the four-engine failure of KLM Flight 867 in 1989 and the British Airways Flight 9 incident over Southeast Asia in 1982. Knowing where these dangerous ash clouds are is crucial to the safety of air travel.

The expanding ash cloud spewing from Iceland’s Eyjafjallajökull volcano as seen from space in 2010. Credit: NASA.

To accomplish this, CATS will emit 5,000 one-millijoule laser pulses per second at the 1064, 532 and 355 nanometer wavelengths. This represents a vast improvement in power requirements and thermal capabilities over a similar instrument currently in service aboard the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) Earth remote sensing spacecraft.

And it’s that third 355 nanometer wavelength that will make CATS stand out from CALIPSO. It will allow researchers to differentiate between particle sizes and to measure the horizontal and vertical distribution of aerosol particles in the atmosphere. CATS will also be capable of counting the individual photons reflected back at it, which will provide much better resolution and a clearer picture of current atmospheric activity.

“You get better data quality because you make fewer assumptions, and you get, presumably, a more accurate determination of what kind of particles you’re seeing in the atmosphere,” McGill said in a recent press release.

The International Space Station also provides a unique vantage point for CATS. In a highly inclined 51.6 degree orbit, the station passes over a good swath of the planet on 16 orbits daily, on a westward-drifting ground track that repeats roughly every three days. This assures CATS coverage over a large percentage of the planet, including known pollutant transport routes across the northern Pacific and down from Canada over the U.S. Great Lakes region.
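
As a rough back-of-the-envelope check on that coverage claim (our own estimate, not a NASA figure): for a circular orbit inclined at angle i, the ground track sweeps between latitudes -i and +i, and the fraction of a sphere's surface area lying between those latitudes is sin(i).

```python
# Rough estimate of how much of Earth's surface the station overflies.
# For an orbit inclined at i degrees, the ground track spans latitudes
# -i..+i, and the fraction of a sphere's area between them is sin(i).
import math

inclination_deg = 51.6  # ISS orbital inclination
fraction = math.sin(math.radians(inclination_deg))
print(f"~{fraction:.0%} of Earth's surface lies under the ISS ground track")
# ~78%, which is why the station covers most major pollutant transport routes.
```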

While the first two wavelengths fall in the infrared and the visible, that third wavelength lies in the ultraviolet. And while this gives CATS an enhanced capability, engineers worry that it may also be susceptible to contamination. “If you get contamination on any of your outgoing optics, they can self-destruct, and then your system is dead. You end up with a very limited instrument lifetime,” McGill said.

Still, if CATS is successful, it may pave the way for larger, free-flying versions that will monitor long-range atmospheric patterns and shifts in climate due to natural and man-made activity. And the ISS makes a good platform to test pathfinder missions like CATS at low cost. “In our current budget-constrained environment, we need to use what we already have, such as the [station], to do more with less,” McGill said.

CALIPSO’s LiDAR imaged from the ground by Gregg Hendry in 2008. Used with permission.

The advent of a LiDAR system aboard the ISS has also generated a spirited discussion in the satellite tracking community concerning prospects for spotting CATS in operation from the ground. The CALIPSO LiDAR has been captured by ground spotters in the past. However, CALIPSO fires a much more powerful 110 millijoule pulse at a rate of 20 times a second. Still, the lower-energy CATS system will be firing at a much faster rate, delivering a cumulative 5,000 millijoules a second. CATS won’t be bright enough to show up on an illuminated pass of the ISS, but it just might be visible during darkened passes of the ISS through the Earth’s shadow. And, unlike CALIPSO — which is part of the difficult-to-observe A-Train of Earth-observing satellites — the ISS passes in view of a majority of humanity. At the very least, activity from CATS will be worth watching for, and may well be seen either visually or photographically.
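
For readers who like to check the arithmetic, here is a quick comparison of the average optical output of the two instruments, using only the pulse energies and repetition rates quoted above (a back-of-the-envelope sketch, not an official specification):

```python
# Average optical output implied by the quoted pulse figures (mJ/s == mW).
def average_output_mj_per_s(pulse_energy_mj, pulses_per_second):
    return pulse_energy_mj * pulses_per_second

cats = average_output_mj_per_s(pulse_energy_mj=1, pulses_per_second=5000)
calipso = average_output_mj_per_s(pulse_energy_mj=110, pulses_per_second=20)

print(f"CATS:    {cats:,.0f} mJ per second")     # 5,000 mJ/s
print(f"CALIPSO: {calipso:,.0f} mJ per second")  # 2,200 mJ/s
# Each CATS pulse is far weaker, but the high repetition rate means CATS
# delivers more cumulative energy per second than CALIPSO.
```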

We’ll soon be adding CATS to the long list of outstanding science experiments being conducted aboard the International Space Station, and the sight of this “fully armed and operational battle station” may soon be coming to a dark sky site near you!

From Webcam to Planetcam: Planetary Imaging on the Cheap

Photo by Author

It’s a question we get often.

“What sort of gear did you use to capture that?” folks ask, imagining that I’m using a setup that required a second mortgage to pay for.

People are often surprised to learn that I’m simply using a converted off-the-shelf webcam, modified to fit into the eyepiece-holder of a telescope, along with freeware programs to control the camera, stack, and clean up images. And while there are multi-thousand-dollar rigs available commercially that yield images that would have been the envy of professional observatories even a decade ago, you may just find that you have the gear lying around to start doing planetary and lunar photography tonight.

OK, I’ll admit: you do need a laptop and telescope (things that we typically have “lying around” our house!), but these are the two priciest items on the list to get started. Living the vagabond life of a veteran, a teacher, and a freelance science writer ensures that our preferred cameras for conversion are always in the double-digit dollar range.

Converted "Planetcam" installed on the 'scope.
Our first converted “Planetcam” installed on the ‘scope.

But converted webcam imaging is not new. We first read about the underground movement over a decade ago, back when amateur astrophotographers were hacking their Philips Vesta and ToUcam Pro webcams with stunning results. Celestron, Meade and Orion eventually caught up to the times and released their own commercial versions for planetary imaging.

A few freeware installations and the modification of a Logitech 3000 (bought on rebate for $50) later, I was imaging planets that same night.

Modified webcams, old (right) and new (left). (Photo by author).

Just about any webcam will yield decent results, though the discontinued Philips ToUcam Pro webcams are still the heavily sought-after Holy Grail of webcam astrophotography. The modification simply consists of removing the camera lens (don’t do this with any camera that you aren’t willing to gut and void the warranty on) and attaching a standard 1 ¼” eyepiece barrel in its place using cement glue.

For camera control, I use a program called K3CCDTools. This was freeware once upon a time; the program now costs $50 to install. I still find it well worth using, though I’ve since been turned on to some equally useful programs that are still free (more on those in a bit).

K3CCDTools will process your images from start to finish, but I find that Registax is great for post-image processing. Plus, you don’t want to waste valuable scope time processing images: I do the maximum number of video captures in the field, and then tinker with them later on cloudy nights.

A screen capture of K3CCDTools during a daytime alignment test. Note the focusing (FFT) dialog box to the right.

Stacking video captures enables you to “grab” those brief moments of fine atmospheric seeing. Many astrophotographers will manually select the best frames from thousands one by one, but I’ll have to admit we’re often impatient and find the selection algorithm on Registax does an acceptable job of selecting the top 10% of images in a flash.
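
For the curious, here is a minimal sketch of that “keep the best frames” idea in Python with OpenCV. It is not Registax’s actual algorithm (real stackers also align each frame before averaging), and the sharpness metric, the 10% cutoff and the file names are illustrative assumptions only.

```python
# A toy "lucky imaging" stacker: score each video frame for sharpness
# (variance of the Laplacian), keep the best 10%, and average them.
# This sketches the idea only; real tools also align (register) the
# frames before stacking.
import cv2
import numpy as np

def stack_best_frames(avi_path, keep_fraction=0.10):
    cap = cv2.VideoCapture(avi_path)
    frames, scores = [], []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        scores.append(cv2.Laplacian(gray, cv2.CV_64F).var())  # sharpness proxy
        frames.append(frame.astype(np.float64))
    cap.release()

    keep = max(1, int(len(frames) * keep_fraction))
    best = np.argsort(scores)[-keep:]          # indices of the sharpest frames
    stacked = np.mean([frames[i] for i in best], axis=0)
    return np.clip(stacked, 0, 255).astype(np.uint8)

# "jupiter.avi" is a placeholder capture file name.
cv2.imwrite("stacked.png", stack_best_frames("jupiter.avi"))
```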

And, like Photoshop, a college course could be taught around Registax. Don’t be intimidated, but do feel free to experiment! After stacking and optimizing, we find the true power in making the images “pop” often lies in the final step, known as wavelet processing. A round of sharpening and contrast boosting in Photoshop can also go a long way; just remember that the goal is to apply the minimum needed to get the job done, rather than ending up with an unnatural, over-processed look.
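
Registax’s wavelet sliders are more sophisticated, but if you want a quick taste of that “make it pop” step in code, a simple unsharp mask gives the flavor. This is a stand-in technique, not wavelet processing, and the file names are placeholders:

```python
# A crude stand-in for wavelet sharpening: an unsharp mask with OpenCV.
# result = image + 0.5 * (image - blurred), i.e. boost the fine detail.
import cv2

img = cv2.imread("stacked.png")                        # placeholder input
blur = cv2.GaussianBlur(img, (0, 0), sigmaX=3)         # smooth copy
sharpened = cv2.addWeighted(img, 1.5, blur, -0.5, 0)   # 1.5*img - 0.5*blur
cv2.imwrite("sharpened.png", sharpened)
```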

A photo mosaic of the historic Mars opposition of 2003. (Photos by author).

At the eyepiece, the first hurdle is object acquisition. A standard webcam can go after bright targets such as the Moon, the Sun (with the proper filter), planets, and bright double stars. We’ve even nabbed the International Space Station with our rig using a low-tech but effective tracking method. Your field of view, however, will typically be very narrow; my webcam coupled to a Celestron C8 Schmidt-Cassegrain typically yields a field of view about 10’ on a side. You’ll want to center the object in the eyepiece at the highest power possible, then plop the camera in place.
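
As a rough sanity check on that field-of-view figure: the angular field scales as sensor size divided by focal length. The C8’s roughly 2032 mm focal length is a published figure, but the 4.8 mm sensor width below is an assumed “1/3-inch class” webcam chip, so treat the result as a ballpark only.

```python
# Ballpark field of view: FOV = 2 * atan(sensor_width / (2 * focal_length)).
import math

focal_length_mm = 2032.0   # Celestron C8 Schmidt-Cassegrain (8-inch, f/10)
sensor_width_mm = 4.8      # assumed 1/3-inch-class webcam chip

fov_deg = math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))
print(f"~{fov_deg * 60:.0f} arcminutes across")
# ~8 arcminutes, the same order as the ~10' quoted above (chip sizes vary).
```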

The next battle is centering and focusing the object on the screen. An out-of-focus planet scatters light: tweaking the focus back and forth sometimes reveals the silvery “doughnut” of the planet lurking just out of view.

From there, you’ll want the object in as razor-sharp a focus as possible. K3CCDTools has a great feature for this known as a Fine Focusing Tool (FFT). Some observers also use focusing masks, which can be easily built — remember, we’re being cheapskates! — out of cardboard. Be sure those reflector mirrors are properly collimated as well.

Objects shot over the years (clockwise from the upper left): the close double star Porrima, Saturn, the International Space Station, and Venus. (Photos by author).

Don’t be surprised if the planet initially looks over-saturated. You’ll want to access the manual controls via the camera software to take the brightness, contrast and color saturation down to acceptable levels. I typically shoot at about 15 frames a second. Fun fact: the “shutter speed” of the dark-adapted “Mark 1 human eyeball” is generally quoted at around 1/20th of a second, slower than you’d think!

Note: all those thousands of frames of video go somewhere… be sure to occasionally clean them off your hard drive, as it will swiftly fill up!

When you image makes a big difference as well. The best time to shoot an object is when it transits the local north-south meridian and is at its highest point above the horizon, because that is when you’re looking through the thinnest possible cross-section of the often turbulent atmosphere.
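
To put numbers on that, the standard plane-parallel approximation for atmospheric path length is airmass ≈ sec(z), where z is the angle from the zenith (the approximation breaks down near the horizon, so take the low-altitude values as rough):

```python
# Relative atmospheric path length ("airmass") versus altitude above the
# horizon, using the plane-parallel approximation airmass = sec(z).
import math

for altitude_deg in (20, 40, 60, 90):
    z = math.radians(90 - altitude_deg)
    print(f"altitude {altitude_deg:2d} deg -> ~{1 / math.cos(z):.1f} airmasses")
# A planet only 20 degrees up is seen through roughly three times as much
# air as one near the zenith, which is why seeing suffers at low altitude.
```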

Universe Today reader Scott Chapman of Montpelier, Virginia also recently shared with us his exploits in planetary webcam imaging and his technique:

A webcam image of the Mare Crisium region on the Moon. Credit: Scott Chapman

“Recently, while looking for an affordable basic telescope, to see if I really had any interest in astronomy, searches and reviews led me to purchase a 70mm refractor. The last thing on my mind was that I could expect to take any pictures of what I might see.

Previously, I had assumed that the only way to take even basic pictures of sky objects was with equipment that was way out of my price range. Imagine my surprise to learn that I could use a simple webcam that I already had sitting around!”

Like many of us mere mortal budget astrophotographers, Scott’s goal was great images at low cost. He also shared with us the programs he uses:

SharpCap2: For capturing .avi video files from the webcam connected to the telescope.

VirtualDub: For shortening the .avi video.

PIPP (Planetary Imaging PreProcessor): For pre-processing and optimizing the video frames ahead of stacking.

AutoStakkert2: Selects and stacks the best frames into a single .tiff file using a simple 3-step process. Scott notes that it’s “MUCH easier for a beginner to use than Registax!”

Registax6: The latest version of the software mentioned above.

JPEGView: For final cropping and file conversion. (I sometimes also use ye ole Paint for this).

Even after a decade of planetary imaging, some of these were new to us as well, a testament to just how far the technique has continued to evolve. Astrophotography and astronomy are lifelong pursuits, and we continue to learn new things every day.

The current camera I’m shooting with is a Logitech C270 that I call my “Wal-Mart $20 Blue Light Special.” (Yes, I know that’s Kmart!) Lots of discussion forums exist out there as well, including the QuickCam and Unconventional Imaging Astronomy Group (QCUIAG) on Yahoo!

Some observers have even taken to gutting and modifying their webcams entirely, adding in cooling fans, more sensitive chips, longer exposure times and more.

All great topics for a future post. Let us know of your trials and triumphs in webcam planetary photography!

-Watch Dave Dickinson pit his $20 webcam against multi-thousand-dollar rigs weekly in the Virtual Star Party.

-Be sure to send those webcam pics in to Universe Today!

 

Compare the Space Station’s Internet Speed with Yours

The International Space Station as seen from the crew of STS-119. Credit: NASA

We now take it for granted that astronauts on the International Space Station can tweet and post things on Facebook and G+ live from space, but it wasn’t always so. Before January of 2010, any emails, news, or Twitter messages were sent to and from the ISS in uplink and downlink packages; for example, Twitter messages from the astronauts were downlinked to mission control in Houston, and someone there posted them to the astronauts’ Twitter accounts. But now they have “live” internet. As you can imagine, though, there are no fiber-optic cables hooked up to the ISS, so the internet speeds aren’t blazing fast. Find out just how fast in the latest episode of NASA’s Space to Ground, a weekly video update on what’s happening aboard the ISS.

Internet Search Yields No Evidence of Time Travelers

Comet ISON was used in a search for time travelers. NASA’s Hubble Space Telescope provides a close-up look of Comet ISON (C/2012 S1), as photographed on April 10. Credit: NASA, ESA, J.-Y. Li (Planetary Science Institute), and the Hubble Comet ISON Imaging Science Team.

You can find anything on the internet, right? A new study reveals, however, that you can’t find evidence of time travelers on the internet. Credible time travelers, that is.

The study was conducted by astrophysicist Robert Nemiroff who is part of the Astronomy Picture of the Day (APOD) team, along with some of his students from Michigan Technological University.

They did three separate types of searches, and developed a search strategy based on what they call “prescient knowledge.” They looked for discussions on social media and various websites where there might be evidence of a mention of something or someone before people should have known about it. If they were able to find evidence of that, it could indicate that whoever wrote it had traveled from the future.

They selected search terms relating to two recent phenomena, Pope Francis and Comet ISON, and began looking for references to them before they were known to exist.

First, they looked for specific terms on Twitter; second, for “prescient” inquiries submitted to a search engine; and third, they requested a direct Internet communication, either by email or tweet, pre-dating the time of the inquiry.

The team used a variety of search engines, such as Google and Bing, and combed through Facebook and Twitter.
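
The core idea is simple enough to sketch in a few lines of code: flag any post that mentions a term before the term should have existed. The posts and dates below are made up for illustration; only the discovery date of Comet ISON (September 21, 2012) is real.

```python
# A toy version of the "prescient knowledge" search: return any post that
# mentions the search term before the cutoff (discovery/announcement) date.
from datetime import date

posts = [
    (date(2012, 6, 1), "Nice night for stargazing"),               # hypothetical
    (date(2012, 12, 2), "Comet ISON could be the comet of 2013"),  # hypothetical
]

def prescient_mentions(posts, term, cutoff):
    return [(d, text) for d, text in posts
            if d < cutoff and term.lower() in text.lower()]

# Comet ISON was discovered on September 21, 2012.
print(prescient_mentions(posts, "Comet ISON", date(2012, 9, 21)))  # -> []
```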

Their results? “No time travelers were discovered,” says the abstract of their paper.

“In our limited search we turned up nothing,” Nemiroff said in a press release. “I didn’t really think we would. But I’m still not aware of anyone undertaking a search like this. The Internet is essentially a vast database, and I thought that if time travelers were here, their existence would have already come out in some other way, maybe by posting winning lottery numbers before they were selected.”

So far, no lottery winners have confessed to using time travel to make their winnings.

In the case of Comet ISON, there were no mentions before it was discovered in September 2012. The team found only one blog post referencing a “Pope Francis” before Jorge Mario Bergoglio was elected head of the Catholic Church on March 13, 2013, but it seemed more accidental than prescient.

In the third part of their search, the researchers created a post in September 2013 asking readers to email or tweet one of two messages on or before August 2013: “#ICanChangeThePast2” or “#ICannotChangeThePast2.”

No replies have been given … yet.

And just in case you’re wondering, credible time travelers do not include the two “chrononauts” who said they time-traveled with a young Barack Obama.

Nemiroff and physics graduate student Teresa Wilson will present their findings today, Monday, Jan. 6, at the American Astronomical Society meeting in Washington, DC.

Turn on Your Heart Light and Meet NASA’s “Superhero” Robot

A concept drawing of what eventually became Valkyrie, Johnson Space Center's entry in the DARPA Robotics Challenge. Credit: NASA, via DARPA.

Here’s a new DARPA-inspired, NASA-built robot, complete with a glowing NASA Meatball in its chest, reminiscent of ET’s heart light. The robot’s name is Valkyrie, and she was created by a team at the Johnson Space Center as part of the DARPA Robotics Challenge, a contest designed to find the life-saving robot of the future. While NASA’s current robot, Robonaut 2, is just now getting a pair of legs, “Val” (officially named “R5” by NASA) is a 1.9-meter-tall, 125-kilogram (6-foot 2-inch, 275-pound) rescue robot that can walk over multiple kinds of terrain, climb a ladder, use tools, and even drive.

According to an extensive article about the new robot in IEEE Spectrum, “This means that Valkyrie has to be capable of operating in the same spaces that a person would operate in, under the control of humans who have only minimal training with robots, which is why the robot’s design is based on a human form.”

Why is NASA building more robots? The thinking is that NASA could send human-like robots to Mars before they send humans. Right now, Valkyrie is not space-rated, but the team at JSC is just getting started.

She’s loaded with cameras, LIDAR, SONAR, is strong and powerful, and is just a great-looking robot.

“We really wanted to design the appearance of this robot to be one that was, when you saw it you’d say, wow, that’s awesome,” said Nicolaus Radford, Project and Group Lead at the Dexterous Robotics Lab at JSC.

NASA Halts Work on its New Nuclear Generator for Deep Space Exploration

MSL's MMRTG in the laboratory. (Credit: NASA).

Another blow was dealt to deep space exploration this past weekend. The announcement comes from Jim Green, NASA’s Planetary Science Division Director. The statement outlines some key changes in NASA’s radioisotope program, and will have implications for the future exploration of the outer solar system.

An Advanced Stirling Converter prototype in the laboratory. (Credit: NASA).

We’ve written about the impending plutonium shortage and what it means for the future of spaceflight, as well as the recent restart of plutonium production. NASA is the only space agency that has conducted missions to the outer planets — even the European Space Agency’s Huygens lander had to hitch a ride with Cassini to get to Titan — and plutonium made this exploration possible.