Military Witnesses: Addressing Trust, Ethical Issues Key to UAS Integration


A General Atomics MQ-1B Predator of the 214th Reconnaissance Squadron. Wikimedia Commons

While technological challenges are slowing the Pentagon’s integration of autonomous systems, the number one hurdle remains acceptance by warfighters, defense officials said last week.

It’s “important to understand that realizing the vision of a fully integrated unmanned and manned naval force will depend as much on significant military cultural evolution as on technology innovation,” Frank Kelley, the new Deputy Assistant Secretary of the Navy for Unmanned Systems, told members of the House Armed Services’ Subcommittee on Emerging Threats and Capabilities. “We have to change the way we think to evolve the way we fight.”

Exactly what that evolution might look like, and what the hurdles might be, was the subject of a study commissioned a year ago by Frank Kendall, the Pentagon’s top acquisition official. He asked the Defense Science Board to work this summer “to identify the science, engineering, and policy problems that must be solved to permit greater operational use of autonomy across all warfighting domains.”

Among the questions for the study, which is to look out over the next 20 years, are “what limits the use of autonomy?” and what potential threats might emerge from “the use of autonomy by adversaries.”

While the Board was asked to look at opportunities for greater mission efficiency, lower costs, and reduced risk to warfighters, the emphasis, Kendall said, was on the “exploration of the bounds—both technological and social—that limit the use of autonomy across a wide range of military operations.”

Among the issues already identified is trust that the systems will perform as expected, witnesses told the committee, but the hesitation goes beyond questions about the technology.

“I think what I’m finding today that is remarkable is that our young people are really concerned about the ethical and moral implications of how these unmanned systems are going to be used,” said Kelley. “…The trust issue is sort of an implied task. We do have DOD directives that talk about certainly weaponizing platforms. But I think the biggest issue is sort of an intangible, and it’s this ethical and moral element of what it means to put unmanned systems in combat.”

In the commercial sector there is concern about liability, a concern that has its parallel among military personnel, said Jonathan Bornstein, chief of the Army Research Laboratory’s Autonomous Systems Division within the Vehicle Technology Directorate.

“If there is an accident in the national airspace or an accident on the road, who is liable for the action? As was just mentioned by my colleague Mr. Kelley, he talked about the ethical responsibility that many people see in the use of unmanned systems. The liability for their use, the responsibility for their use—who will that fall upon?”

In his personal opinion, Bornstein said, “that will be a major issue in the future going forward both for the commercial sector and for the military sector.”

“Much of it has to do with trust and proficiencies,” said Dr. Greg Zacharias, chief scientist of the Air Force. “One of the things is trying to design in trust in today’s systems, including ensuring the system performs well within its scope of operations, knowing when it’s exceeding the scope of operations—or the human operator knowing that.”

It might even be better to trade capability for transparency, Zacharias suggested: to have a less-than-optimal system that is transparent, one able to explain what it is doing.