Yuri Gagarin, the first human to travel to space, is less well known as the holder of a second record from that same flight: the first human to travel at hypersonic speeds through the Earth's atmosphere, which he accomplished on the descent. Hypersonics, the broad term for anything moving through air at Mach 5 (five times the speed of sound) or faster, has long been a field about established trajectories: about enduring the flight, returning, and landing where intended from launch.
This test, conducted in 2002 at NASA's Langley Research Center, was part of decades of research into creating viable hypersonic weapons and vehicles. A new project from Sandia National Laboratories hopes to add autonomy to the mix. (DARPA/ONR/NASA Langley Research Center)
On April 25, Sandia National Laboratories announced a proposal to add autonomous navigation to hypersonic vehicles.
Hypersonic vehicles themselves are hardly new technology. At White Sands in 1949, the United States pushed a modified V-2 rocket to a speed of 5,150 miles per hour, making it likely the first human-produced object to reach hypersonic speed, though the rocket was destroyed in the test. By 1981, Albuquerque's Sandia Labs had conducted the "Sandia Winged Energetic Reentry Vehicle Experiment," which yielded information about hypersonic vehicles if not useful prototypes. Sandia also worked on the Strategic Target System program from 1985 into the 1990s, which explored guidance systems at hypersonic speeds, and has worked on other hypersonic projects in the years since.
The latest initiative, then, is less about the physics of hypersonic flight, and more about the software guiding flight decisions at hypersonic speeds.
“At extreme speeds, the flight is incredibly challenging to plan for and program,” said Alex Roesler, a senior manager at Sandia who leads the coalition. Sandia Labs is looking to AI as a way around the difficulty of planning hypersonic flight in advance of launch.
The announcement comes nestled in a release about a new research coalition that the Lab is spearheading, called Autonomy New Mexico. Broadly focused on AI for aerospace systems, the coalition includes Georgia Institute of Technology, Purdue University, the University of Illinois Urbana-Champaign, the University of New Mexico, Stanford University, Texas A&M University, The University of Texas at Austin, and Utah State University.
As envisioned, the artificial intelligence aboard a hypersonic vehicle would tackle the complex problems of superfast navigation and offer options to human controllers, who remain in the loop.
“In theory, artificial intelligence could generate a hypersonic flight plan in minutes for human review and approval, and in milliseconds a semi-autonomous vehicle could self-correct in flight to compensate for unexpected flight conditions or a change in the target’s location,” a Sandia Labs release read. “A human monitoring the flight could regain control by turning off the course-correcting function at any time.”
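To make that idea concrete, here is a minimal sketch of what a course-correcting guidance step with a human-operated off switch could look like in software. Everything in it, from the names to the toy proportional controller, is an illustrative assumption rather than a description of Sandia's actual design.

```python
# Hypothetical sketch of a semi-autonomous guidance step with a human override.
# All names, data structures, and the toy controller are illustrative assumptions;
# they do not describe Sandia's or Autonomy New Mexico's actual systems.

from dataclasses import dataclass


@dataclass
class VehicleState:
    cross_track_error_m: float              # deviation from the approved flight path, in meters
    course_correction_enabled: bool = True  # flag a human controller can clear at any time


def guidance_step(state: VehicleState, gain: float = 0.05) -> float:
    """Return a steering command (degrees) for one guidance cycle.

    If the human controller has disabled course correction, the vehicle
    simply holds the pre-approved plan (zero correction).
    """
    if not state.course_correction_enabled:
        return 0.0
    # Proportional correction steering back toward the planned trajectory.
    return -gain * state.cross_track_error_m


if __name__ == "__main__":
    state = VehicleState(cross_track_error_m=120.0)
    print("autonomy on :", guidance_step(state))   # small corrective command
    state.course_correction_enabled = False        # human "turns off" the correcting function
    print("autonomy off:", guidance_step(state))   # 0.0: hold the plan as approved
```

The point of the sketch is the branch, not the math: as long as the human-controlled flag is set, the vehicle corrects itself every cycle; the moment it is cleared, the vehicle falls back to the plan a human already approved.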
As with all autonomous systems, hypersonic autonomy will require a robust combination of sensors and onboard processing. Complicating matters is the fact that a hypersonic vehicle is designed to operate at speeds usually reserved for astronauts on atmospheric reentry. Incorporating meaningful communication between a human controller and the hypersonic vehicle is not impossible, but it is a task that risks being subsumed by autonomy. If the machine can already suggest the courses of action a human operator would want it to take, it isn't hard to imagine humans coming to trust the autonomous machine, and then surrendering the human role in decision-making after launch.
Is it possible for in-the-loop control to endure at hypersonic speeds?
Further implementation questions abound for Sandia Labs and Autonomy New Mexico once they've tackled whether autonomous sensing and navigation is even possible in a vehicle hurtling along at more than 3,836 mph, or Mach 5 at sea level. That's to say nothing of the difficulty of identifying new targets and adjusting trajectories to hit them. There's a long road ahead for hypersonic autonomy, and it's unclear if the entire premise is sound.