Astronomers have been assessing a new machine-learning algorithm to determine how reliably it can find gravitational lenses hidden in images from all-sky surveys. The AI was used to flag about 5,000 potential gravitational lenses, which then needed to be confirmed. Using spectroscopy for confirmation, the international team has now determined the technique has a whopping 88% success rate, which means this new tool could be used to find thousands more of these magical quirks of physics.
Continue reading “A Computer Algorithm is 88% Accurate in Finding Gravitational Lenses”

A Machine-Learning Algorithm Just Found 301 Additional Planets in Kepler Data
Looking to the future, astronomers are excited to see how machine learning – aka deep learning and artificial intelligence (AI) – will enhance surveys. One field that is already benefiting is the search for extrasolar planets, where researchers rely on machine-learning algorithms to distinguish between faint signals and background noise. As this field continues to transition from discovery to characterization, the role of machine intelligence is likely to become even more critical.
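As a toy illustration of that faint-signal-versus-noise problem (a hand-rolled sketch for this article, not ExoMiner's actual pipeline; every number below is made up), phase-folding a noisy light curve makes a repeating transit dip stand out from the random scatter:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate a light curve: flat stellar brightness with Gaussian noise
# and a small periodic dip (the "transit").
n_points = 2000
period = 100          # transit repeats every 100 samples
transit_width = 5
transit_depth = 0.01  # 1% dip, only twice the per-sample noise level

flux = 1.0 + rng.normal(0.0, 0.005, n_points)
time = np.arange(n_points)
in_transit = (time % period) < transit_width
flux[in_transit] -= transit_depth

# Phase-fold: average all samples at the same phase, so random noise
# cancels while the repeating dip adds up coherently.
phase = time % period
folded = np.array([flux[phase == p].mean() for p in range(period)])

# Signal-to-noise of the dip relative to the out-of-transit scatter.
dip = folded[:transit_width].mean()
baseline = folded[transit_width:]
snr = (baseline.mean() - dip) / baseline.std()
```

With 20 transits stacked on top of each other, the per-bin noise shrinks by roughly a factor of the square root of 20, which is why a dip barely twice the per-sample noise becomes a clear detection after folding.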
Take the Kepler Space Telescope, which accounted for 2,879 confirmed discoveries (out of the 4,575 exoplanets discovered to date) during its nearly ten years of service. After examining the data collected by Kepler using a new deep-learning neural network called ExoMiner, a research team at NASA’s Ames Research Center was able to detect 301 more planetary signals and add them to the growing census of exoplanets.
Continue reading “A Machine-Learning Algorithm Just Found 301 Additional Planets in Kepler Data”

NASA’s Perseverance Rover: The Most Ambitious Space Mission Ever?
When it comes to Mars exploration, NASA has had more success than any other agency. This week, they’ll attempt to land another sophisticated rover on the Martian surface to continue the search for evidence of ancient life. The Mars Perseverance rover will land on Mars on Thursday, February 18th, and it’s bringing some very ambitious technologies with it.
Continue reading “NASA’s Perseverance Rover: The Most Ambitious Space Mission Ever?”

Machine Learning Software is Now Doing the Exhausting Task of Counting Craters On Mars
Does the life of an astronomer or planetary scientist seem exciting?
Sitting in an observatory, sipping warm cocoa, with high-tech tools at your disposal as you work diligently, surfing along on the wavefront of human knowledge, surrounded by fine, bright people. Then one day—Eureka!—all your hard work and the work of your colleagues pays off, and you deliver to humanity a critical piece of knowledge. A chunk of knowledge that settles a scientific debate, or that ties a nice bow on a burgeoning theory, bringing it all together. Conferences…tenure…Nobel Prize?
Well, maybe in your first year of university you might imagine something like that. But science is work. And as we all know, not every minute of one’s working life is super-exciting and gratifying.
Sometimes it can be dull and repetitious.
Continue reading “Machine Learning Software is Now Doing the Exhausting Task of Counting Craters On Mars”

AI Upscales Apollo Lunar Footage to 60 FPS
As exciting and thrilling as it is to watch all the historic footage from the Apollo Moon landings, you have to admit, the quality is sometimes not all that great. Even though NASA has worked on restoring and enhancing some of the most popular Apollo footage, some of it is still grainy or blurry — which is indicative of the video technology available in the 1960s.
But now, new developments in artificial intelligence have come to the rescue, providing viewers a nearly brand new experience in watching historic Apollo video.
A photo and film restoration specialist, who goes by the name of DutchSteamMachine, has worked some AI magic to enhance original Apollo film, creating strikingly clear and vivid video clips and images.
Continue reading “AI Upscales Apollo Lunar Footage to 60 FPS”

Rovers Will be Starting to Make Their Own Decisions About Where to Search for Life
We all know how exploration by rover works. The rover is directed to a location and told to take a sample. Then it subjects that sample to analysis and sends home the results. It’s been remarkably effective.
But it’s expensive and time-consuming to send all this data home. Will this way of doing things still work? Or can it be automated?
Continue reading “Rovers Will be Starting to Make Their Own Decisions About Where to Search for Life”

NASA Tests Water Powered Spacecraft in Orbit
Picture two tissue box-sized spacecraft orbiting Earth.
Then picture them communicating, and using a water-powered thruster to approach each other. If you can do that, then you’re up to speed on one of the activities of NASA’s Small Spacecraft Technology Program (SSTP). It’s all part of NASA’s effort to develop small spacecraft to serve its space exploration, science, space operations, and aeronautics endeavors.
Continue reading “NASA Tests Water Powered Spacecraft in Orbit”

Scientists are Using Artificial Intelligence to See Inside Stars Using Sound Waves
How in the world could you possibly look inside a star? You could break out the scalpels and other tools of the surgical trade, but good luck getting within a few million kilometers of the surface before your skin melts off. The stars of our universe hide their secrets very well, but astronomers can outmatch their cleverness and have found ways to peer into their hearts using, of all things, sound waves.

Continue reading “Scientists are Using Artificial Intelligence to See Inside Stars Using Sound Waves”
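As a toy sketch of the underlying idea (illustrative only; the mode frequencies and amplitudes below are invented, not taken from any real star), interior sound waves make a star's surface pulsate, and a Fourier transform of its light curve recovers those oscillation frequencies:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
dt = 60.0                 # one brightness sample per minute, in seconds
t = np.arange(n) * dt

# Two oscillation modes buried in photometric noise.
f1, f2 = 3.0e-3, 4.1e-3   # Hz, roughly the solar-like p-mode range
flux = (1.0
        + 1e-4 * np.sin(2 * np.pi * f1 * t)
        + 5e-5 * np.sin(2 * np.pi * f2 * t)
        + rng.normal(0.0, 1e-4, n))

# Power spectrum of the mean-subtracted light curve: coherent
# oscillations pile up in single frequency bins, noise does not.
power = np.abs(np.fft.rfft(flux - flux.mean())) ** 2
freqs = np.fft.rfftfreq(n, dt)

# The strongest peak sits at the dominant mode frequency.
peak_freq = float(freqs[np.argmax(power)])
```

Real asteroseismology goes much further, reading the spacing and widths of many such peaks to infer density, age, and interior structure, but the first step is exactly this kind of spectrum.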
Astronaut Scott Tingle Was Able To Control A Ground-Based Robot… From Space.
If something called “Project METERON” sounds to you like a sinister project involving astronauts, robots, the International Space Station, and artificial intelligence, I don’t blame you. Because that’s what it is (except for the sinister part). In fact, the Meteron Project (Multi-Purpose End-to-End Robotic Operation Network) is not sinister at all, but a friendly collaboration between the European Space Agency (ESA) and the German Aerospace Center (DLR).
The idea behind the project is to place an artificially intelligent robot here on Earth under the direct control of an astronaut 400 km above the Earth, and to get the two to work together.
“Artificial intelligence allows the robot to perform many tasks independently, making us less susceptible to communication delays that would make continuous control more difficult at such a great distance.” – Neil Lii, DLR Project Manager.
On March 2nd, engineers at the DLR Institute of Robotics and Mechatronics set up the robot, called Justin, in a simulated Martian environment. Justin was given a simulated task to carry out, with as few instructions as possible. The maintenance of solar panels was the chosen task, since they’re common on landers and rovers, and since Mars can get kind of dusty.
The first test of the METERON Project was done in August. But this latest test was more demanding for both the robot and the astronaut issuing the commands. The pair had worked together before, but since then, Justin was programmed with more abstract commands that the operator could choose from.
American astronaut Scott Tingle issued commands to Justin from a tablet aboard the ISS, and the same tablet also displayed what Justin was seeing. The human-robot team had practiced together before, but this test was designed to push the pair into more challenging tasks. Tingle had no advance knowledge of the tasks in the test, and he also had no advance knowledge of Justin’s new capabilities. On board the ISS, Tingle quickly realized that the panels in the simulation down on Earth were dusty. They were also not pointed in the optimal direction.
This was a new situation for Tingle and for Justin, and Tingle had to choose from a range of commands on the tablet. The team on the ground monitored his choices. The level of complexity meant that Justin couldn’t just perform the task and report it completed; Tingle and the robot also had to estimate how clean the panels were after being cleaned.
“Our team closely observed how the astronaut accomplished these tasks, without being aware of these problems in advance and without any knowledge of the robot’s new capabilities,” says DLR engineer Daniel Leidner.
The next test will take place in Summer 2018 and will push the system even further. Justin will have an even more complex task before him, in this case selecting a component on behalf of the astronaut and installing it on the solar panels. The German ESA astronaut Alexander Gerst will be the operator.
If the whole point of this is not immediately clear to you, think Mars exploration. We have rovers and landers working on the surface of Mars to study the planet in increasing detail. And one day, humans will visit the planet. But right now, we’re restricted to surface craft being controlled from Earth.
What METERON and other endeavours like it are doing is developing robots that can do our work for us. But they’ll be smart robots that don’t need to be told every little thing. They can simply be given a task and will go about doing it. And the humans issuing the commands could be in orbit around Mars, rather than being exposed to all the risks on the surface.
“Artificial intelligence allows the robot to perform many tasks independently, making us less susceptible to communication delays that would make continuous control more difficult at such a great distance,” explained Neil Lii, DLR Project Manager. “And we also reduce the workload of the astronaut, who can transfer tasks to the robot.” To do this, however, astronauts and robots must cooperate seamlessly and also complement one another.
That’s why these tests are important. Getting the astronaut and the robot to perform well together is critical.
“This is a significant step closer to a manned planetary mission with robotic support,” says Alin Albu-Schäffer, head of the DLR Institute of Robotics and Mechatronics. It’s expensive and risky to maintain a human presence on the surface of Mars. Why risk human life to perform tasks like cleaning solar panels?
“The astronaut would therefore not be exposed to the risk of landing, and we could use more robotic assistants to build and maintain infrastructure, for example, with limited human resources.” In this scenario, the robot would no longer simply be the extended arm of the astronaut: “It would be more like a partner on the ground.”
An Artificial Intelligence Just Found 56 New Gravitational Lenses
Gravitational lenses are an important tool for astronomers seeking to study the most distant objects in the Universe. The technique relies on a massive object (usually a galaxy or galaxy cluster) sitting between a distant light source and an observer, which bends and magnifies the light coming from that source. In an effect predicted by Einstein’s Theory of General Relativity, this allows astronomers to see objects that would otherwise be too faint to observe.
Recently, a group of European astronomers developed a method for finding gravitational lenses in enormous piles of data. Using the same kind of artificial intelligence algorithms that Google, Facebook and Tesla have used for image recognition and self-driving cars, they were able to find 56 new gravitational lensing candidates in a massive astronomical survey. This method could eventually eliminate the need for astronomers to conduct visual inspections of astronomical images.
The study describing their research, titled “Finding strong gravitational lenses in the Kilo Degree Survey with Convolutional Neural Networks”, recently appeared in the Monthly Notices of the Royal Astronomical Society. Led by Carlo Enrico Petrillo of the Kapteyn Astronomical Institute, the team also included members of the National Institute for Astrophysics (INAF), the Argelander-Institute for Astronomy (AIfA) and the University of Naples.
While useful to astronomers, gravitational lenses are a pain to find. Ordinarily, finding them consists of astronomers sorting through thousands of images snapped by telescopes and observatories. While academic institutions are able to rely on amateur astronomers and citizen scientists like never before, there is simply no way to keep up with the millions of images that are being regularly captured by instruments around the world.
To address this, Dr. Petrillo and his colleagues turned to what are known as “Convolutional Neural Networks” (CNNs), a type of machine-learning algorithm that mines data for specific patterns. Google used these same neural networks to win a match of Go against the world champion, Facebook uses them to recognize things in images posted on its site, and Tesla has been using them to develop self-driving cars.
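As a toy sketch of the pattern-matching at the heart of a CNN (hand-built for illustration, not the team's trained network), a convolution slides a small filter across an image and responds most strongly wherever the filter's pattern appears; here a crude ring shape stands in for an Einstein ring:

```python
import numpy as np

def convolve2d(image, kernel):
    """Valid-mode 2D cross-correlation: slide the kernel over the
    image and record how strongly each patch matches it."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A tiny "image" with a bright ring (a crude stand-in for a lensed
# Einstein-ring pattern) on a dark background.
image = np.zeros((7, 7))
image[2:5, 2:5] = 1.0
image[3, 3] = 0.0   # hollow centre makes it ring-shaped

# A filter that responds to bright surroundings with a dark centre.
# In a real CNN, filters like this are learned from labeled examples
# rather than written by hand.
ring_filter = np.array([[1.0,  1.0, 1.0],
                        [1.0, -8.0, 1.0],
                        [1.0,  1.0, 1.0]])

response = convolve2d(image, ring_filter)
# The strongest response sits exactly where the ring is centred.
peak = tuple(int(i) for i in
             np.unravel_index(np.argmax(response), response.shape))
```

A trained CNN stacks many layers of such learned filters, which is what lets it pick ring- and arc-like lens candidates out of survey images without a human looking at each one.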
As Petrillo explained in a recent press article from the Netherlands Research School for Astronomy:
“This is the first time a convolutional neural network has been used to find peculiar objects in an astronomical survey. I think it will become the norm since future astronomical surveys will produce an enormous quantity of data which will be necessary to inspect. We don’t have enough astronomers to cope with this.”
The team then applied these neural networks to data derived from the Kilo-Degree Survey (KiDS). This project relies on the VLT Survey Telescope (VST) at the ESO’s Paranal Observatory in Chile to map 1500 square degrees of the southern night sky. The data set consisted of 21,789 color images collected by the VST’s OmegaCAM, a multiband instrument developed by a consortium of European scientists in conjunction with the ESO.
These images all contained examples of Luminous Red Galaxies (LRGs), three of which were known to be gravitational lenses. Initially, the neural network found 761 gravitational lens candidates within this sample. After inspecting these candidates visually, the team was able to narrow the list down to 56 lenses. These still need to be confirmed by space telescopes in the future, but the results were quite positive.
As they indicate in their study, such a neural network, when applied to larger data sets, could reveal hundreds or even thousands of new lenses:
“A conservative estimate based on our results shows that with our proposed method it should be possible to find ∼100 massive LRG-galaxy lenses at z ≳ 0.4 in KiDS when completed. In the most optimistic scenario this number can grow considerably (to maximally ∼2400 lenses), when widening the colour-magnitude selection and training the CNN to recognize smaller image-separation lens systems.”
In addition, the neural network rediscovered two of the known lenses in the data set, but missed the third one. This was because that lens was particularly small and the network had not been trained to detect lenses of that size. In the future, the researchers hope to correct for this by training their neural network to notice smaller lenses and reject false positives.
But of course, the ultimate goal here is to remove the need for visual inspection entirely. In so doing, astronomers would be freed up from having to do grunt work, and could dedicate more time towards the process of discovery. In much the same way, machine learning algorithms could be used to search through astronomical data for signals of gravitational waves and exoplanets.
Much like how other industries are seeking to make sense out of terabytes of consumer or other types of “big data”, the fields of astrophysics and cosmology could come to rely on artificial intelligence to find patterns in a Universe of raw data. And the payoff is likely to be nothing less than an accelerated process of discovery.
Further Reading: Netherlands Research School for Astronomy, MNRAS