EUROSATORY 2022: As the US military plans for the future battlefield, data will constantly crisscross it, creating a next-generation vulnerability: bandwidth bottlenecks.
Intelligence, surveillance and reconnaissance (ISR) data will be sent to targeting and fire control systems. Sensor data will augment soldiers' view of the battlefield through goggles strapped to their heads. Soldiers, of course, need to communicate with each other on the ground, and connect to air assets or back to the brigade operations center. And medical evacuation flights will send wound information across a network to field hospitals several minutes before arrival.
All that eats up a lot of bandwidth, a scarce resource even before an enemy starts trying to jam or otherwise interfere with it. But the Army sees one potential way to free up that valuable space: increasing autonomy on the battlefield.
“The more autonomy you have, the less communications you need,” said Ted Maciuba, deputy director of robotics requirements at Army Futures Command during a presentation at Eurosatory in Paris last week. “I think that is going to help us be able to work our way forward so that we can get to the point where we need less and less of the very valuable spectrum that is out there — not only to control robots, but to do the mission command things to include voice communications and data in the future.”
Related: Lawmakers want clarity on JADC2 efforts: Who’s getting what, when?
That’s because the robotic platforms will have more onboard processing instead of needing to send all information over a network back to a human, and won’t require a human to micro-manage robots’ movements remotely. Under current bandwidth constraints, soldiers would be “hard pressed” to control multiple robots at once, Maciuba said, but “as we get autonomy onto those robots, we will be reducing the amount of bandwidth required.”
Right now, the Army’s robots are generally teleoperated, meaning the platforms are controlled by a user remotely. That’s the case for the Army’s Ground Vehicle System Center’s Project Origin, a technology demonstrator vehicle informing requirements for the service’s future Robotic Combat Vehicle modernization program. Maciuba said over the next “four to five years” there will be a transition from teleoperated to “more and more autonomy.”
Spectrum availability will play a significant role in enabling the Pentagon’s future warfighting concept, Joint All-Domain Command and Control, in which the best sensor connects with the best shooter to engage an enemy target. Last year’s Project Convergence, the Army’s annual sensor-to-shooter experiment, highlighted the choices the service has to make about what types of data it can realistically send over a battlefield network — be it still photos or full-motion video.
Related: At Project Convergence, Army ‘struggling’ to see joint battlefield as it heeds ‘hard’ lessons
Greater autonomy will be added to smaller Army formations through a program called “AI for Small Unit Maneuver” (AISUM), overseen by Maciuba’s office. That program is developing software that will sit between platoon-level soldiers and robots, and will autonomously decide what information needs to be sent to the soldiers on the ground. AISUM can ingest information both from a platoon’s deployed robots and from echelons above the platoon.
The ultimate goal, as Breaking Defense previously reported, is to create an “AI cloud” running on mini-servers carried by robots that manages the robots’ movements and frees soldiers to do other tasks. For the things humans need to see, the robots don’t send back full-motion video; rather, they send “militarily interesting” findings back as a “burst,” Maciuba said. The system then builds a battlefield visualization, marked with geo-referenced “thumbnails” of specific things the robots found.
“What we’re trying to do is send as little information as possible that will allow you to build that battlefield visualization for the platoon,” Maciuba said in an interview with Breaking Defense after his presentation. “You can think of it as a synthetic environment. … You are going to be given it in that format, as a synthetic view of what is actually going on, because we cannot afford the bandwidth to build a real-world view.”