NASA Satellite Observes Iceland’s Eyjafjallajökull Volcano in Infrared
On Saturday, April 17, 2010, the Advanced Land Imager (ALI) instrument onboard NASA's Earth Observing-1 (EO-1) spacecraft obtained this false-color infrared image of Iceland's Eyjafjallajökull volcano from an altitude of 705 kilometers (438 miles). A strong thermal source (denoted in red) is visible at the base of the Eyjafjallajökull plume. Above and to the right, strong thermal emission is also seen from the lava flows emplaced at Fimmvorduhals between March 20 and April 13, 2010. This is where lava first reached the surface, generating impressive lava fountains and lava flows. Because the Fimmvorduhals episode occurred in a location with no ice cap, there was little of the violent interaction between lava and water that took place at Eyjafjallajökull and that generated the massive volcanic plume. To the east of Fimmvorduhals is the Myrdalsjökull ice cap, beneath which slumbers the mighty Katla volcano. Katla has erupted 20 times in recorded history, with the last eruption occurring in 1918. This ALI image is 38 kilometers (24 miles) wide, and has a resolution of 30 meters (98 feet) per pixel. Up is north-northeast.
The EO-1 spacecraft is managed by NASA’s Goddard Space Flight Center, Greenbelt, Md. EO-1 is the satellite remote-sensing asset used by the Volcano Sensor Web developed by NASA’s Jet Propulsion Laboratory, Pasadena, Calif., which is being used to monitor this, and other, volcanic eruptions around the world.
Image Credit: NASA/JPL/EO-1 Mission/GSFC/Ashley Davies
A.I. Will Prepare Robots for the Unknown
How do you get a robot to recognize a surprise?
That’s a question artificial intelligence researchers are mulling, especially as A.I. begins to change space research.
A new article in the journal Science Robotics offers an overview of how A.I. has been used to make discoveries on space missions. The article, co-authored by Steve Chien and Kiri Wagstaff of NASA's Jet Propulsion Laboratory, Pasadena, California, suggests that autonomy will be a key technology for the future exploration of our solar system, where robotic spacecraft will often be out of communication with their human controllers.
In a sense, space scientists are doing field research virtually, with the help of robotic spacecraft.
“The goal is for A.I. to be more like a smart assistant collaborating with the scientist and less like programming assembly code,” said Chien, a senior research scientist on autonomous space systems. “It allows scientists to focus on the ‘thinking’ things — analyzing and interpreting data — while robotic explorers search out features of interest.”
Science is driven by noticing the unexpected, which is easier for a trained human who knows when something is surprising. For robots, this means having a sense of what’s “normal” and using machine learning techniques to detect statistical anomalies.
“We don’t want to miss something just because we didn’t know to look for it,” said Wagstaff, a principal data scientist with JPL’s machine learning group. “We want the spacecraft to know what we expect to see and recognize when it observes something different.”
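The idea Wagstaff describes — encoding what the team expects to see and flagging deviations — can be sketched as a simple statistical anomaly check. The numbers and threshold below are illustrative assumptions, not mission parameters: a baseline of "normal" measurements defines a mean and spread, and new readings that fall many standard deviations away are flagged as surprising.

```python
import numpy as np

def flag_anomalies(baseline, new_obs, threshold=3.0):
    """Flag observations that deviate strongly from the expected 'normal'.

    baseline: 1-D array of past measurements defining what is expected.
    new_obs:  1-D array of fresh measurements to screen.
    Returns a boolean mask: True where an observation looks surprising.
    """
    mean = baseline.mean()
    std = baseline.std()
    # Standard score of each new reading relative to the baseline.
    z = np.abs(new_obs - mean) / std
    return z > threshold

# Illustrative example: a mostly flat brightness signal with one outlier.
baseline = np.array([10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.3, 10.1])
new_obs = np.array([10.0, 10.2, 25.0, 9.9])
print(flag_anomalies(baseline, new_obs))  # only the 25.0 reading is flagged
```

Real onboard systems use far richer features and learned models, but the structure is the same: a model of "normal" plus a rule for deciding when an observation is different enough to matter.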
Spotting unusual features is one use of A.I. But there’s an even more complex use that will be essential for studying ocean worlds, like Jupiter’s moon Europa.
“If you know a lot in advance, you can build a model of normality — of what the robot should expect to see,” Wagstaff said. “But for new environments, we want to let the spacecraft build a model of normality based on its own observations. That way, it can recognize surprises we haven’t anticipated.”
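For the new-environment case Wagstaff describes, the model of normality has to be learned from the spacecraft's own stream of observations rather than fixed in advance. One standard way to do that incrementally is Welford's online mean/variance algorithm; the sketch below is a hypothetical illustration of the concept, not flight software.

```python
class RunningNormalModel:
    """Incrementally learn what 'normal' looks like from a stream of
    observations (Welford's online algorithm), then flag readings that
    fall far outside the learned range. Illustrative sketch only.
    """
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x):
        """Fold one new observation into the running statistics."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def is_surprise(self, x, threshold=3.0):
        """True if x lies more than `threshold` std devs from the mean."""
        if self.n < 2:
            return False  # too little data to judge anything yet
        std = (self.m2 / (self.n - 1)) ** 0.5
        return std > 0 and abs(x - self.mean) > threshold * std

model = RunningNormalModel()
for reading in [4.9, 5.1, 5.0, 5.2, 4.8, 5.0]:
    model.update(reading)
print(model.is_surprise(5.1))  # within the learned normal range
print(model.is_surprise(9.0))  # far outside it: a surprise
```

Because the statistics are updated one reading at a time, the model needs no stored history and no prior knowledge of the environment — exactly the situation of a spacecraft arriving somewhere new.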
Imagine, for example, A.I. spotting plumes erupting on ocean worlds. These eruptions can be spontaneous and could vary greatly in how long they last. A.I. could enable a passing spacecraft to reprioritize its operations and study these phenomena “on the fly,” Chien said.
JPL has led the development of several key examples for space A.I. Dust devils swirling across the Martian surface were imaged by NASA’s Opportunity rover using a program called WATCH. That program later evolved into AEGIS, which helps the Curiosity rover’s ChemCam instrument pick new laser targets that meet its science team’s parameters without needing to wait for interaction with scientists on Earth. AEGIS can also fine-tune the pointing of the ChemCam laser.
Closer to home, A.I. software called the Autonomous Sciencecraft Experiment studied volcanoes, floods and fires while on board Earth Observing-1, a satellite managed by NASA's Goddard Space Flight Center, Greenbelt, Maryland. EO-1's Hyperion instrument also used A.I. to identify sulfur deposits on the surface of glaciers — a task that could be important for places like Europa, where sulfur deposits would be of interest as potential biosignatures.
A.I. allows a spacecraft to prioritize the data it collects, balancing other needs like power supply or limited data storage. Autonomous management of systems like these is being prototyped for NASA's Mars 2020 rover (which will also use AEGIS for picking laser targets).
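The prioritization described above can be pictured as a budgeted selection problem: given more recorded data than the downlink can carry, keep the highest-value observations that fit. The greedy sketch below is purely illustrative — the observation names, priority scores, and sizes are invented, not any real mission's interface.

```python
def select_downlink(observations, capacity_mb):
    """Greedy sketch of onboard data prioritization: keep the
    highest-priority observations that fit in the downlink budget.

    observations: list of (name, priority, size_mb) tuples.
    Returns the names of the observations selected for downlink.
    """
    chosen, used = [], 0.0
    # Consider observations from highest science priority downward.
    for name, priority, size in sorted(observations, key=lambda o: -o[1]):
        if used + size <= capacity_mb:
            chosen.append(name)
            used += size
    return chosen

# Hypothetical recorded products competing for a 90 MB downlink pass.
obs = [
    ("plume_image", 9, 40.0),       # possible eruption: top priority
    ("routine_scan", 2, 60.0),
    ("dust_devil_clip", 7, 30.0),
    ("calibration_frame", 4, 20.0),
]
print(select_downlink(obs, capacity_mb=90.0))
```

A flight system would weigh many more factors (power state, timing, instrument health), but the core trade — science value against scarce resources — is the same.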
While autonomy offers exciting new advantages to science teams, both Chien and Wagstaff stressed that A.I. has a long way to go.
“For the foreseeable future, there’s a strong role for high-level human direction,” Wagstaff said. “But A.I. is an observational tool that allows us to study science that we couldn’t get otherwise.”
source: NASA – Jet Propulsion Laboratory – California Institute of Technology