Approaching the Limits of Speed and Automation in the Cyber Age


Earlier this spring in Annapolis, the Office of the Assistant Secretary of Defense for Research and Engineering sponsored the final of four war games that examined the future of technology, conflict, and war.[i]  I had the privilege of being one of the few active duty officers to attend all four games.  Based on the war games, it became clear that three transformative technologies will challenge DOD to its core: autonomous killing machines, a next generation of ever more ubiquitous communication networks, and human-machine integration.

But I took away something else from the games: the sense that the momentum of the DOD R&D/acquisition system is propelling the U.S. military toward a strategic conundrum of historic proportions. The problem: all three transformative technologies replace progressively more human decision making and skill with intelligent electronic devices (IEDs), and are thus vulnerable to cyber attack.[ii]  In a pre-cyber age dominated by the U.S., such a replacement of the human with the machine might have been all positive: reduced risk to Americans, faster flow of information, higher-performing battlefield units. But in the face of rising "cyber powers," this proliferation of intelligent electronic devices (the other IEDs) in communication networks, robotics, and human decision aids and enhancements may place our defense and security at risk.[iii]

The emerging reality of cyber war argues for what may seem counterintuitive but has historical precedent. During this period of uncertainty, DOD should slow the deployment of programs that displace human skill and decision making, and slow the retirement of mature, stand-alone technologies.  In parallel, DOD should reestablish selective training programs to preserve or regain human-centric navigation, language, and warfighting skills that may have atrophied.  Increased reliance on the human will create a more resilient human-machine system, reduce the surface area for cyber attacks, reduce the possibility of cyber 'silent failures', and provide the ultimate unhackable code (i.e., natural human cognitive processes).

The Problem

The four futuristic war games examined the evolution toward unmanned systems, ever larger information networks, and electronic human-machine integration. It was often argued that such electronically based systems could get inside an enemy's decision cycle and give us an advantage in John Boyd's classic Observe, Orient, Decide, Act (OODA) loop. I was persuaded: unmanned systems with A.I. processors can in many cases compute faster than a human; computer-enabled tactical communications systems can transmit more data faster than human voice or non-computerized communications; and soldiers aided by ever more electronic and web-enabled devices could allow fewer, less extensively trained humans to do more tasks faster than personnel without these devices (e.g., think Google Glass with facial recognition apps, which reduces a soldier's need to memorize; the handheld GPS linked to an iPhone, which reduces reliance on human navigation skills on land or sea; or computerized translation programs, which reduce the incentive for soldiers to maintain natural language proficiency).
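
To make the decision-cycle argument concrete, consider a toy rendering of Boyd's loop as a timed cycle. The sketch below is purely illustrative: the per-stage latencies are invented numbers, not measurements of any real system, and the only point is that whichever side completes more loops per unit time can act inside the other's decision cycle.

    # Toy illustration of Boyd's OODA loop as a timed cycle.
    # All latencies are invented for illustration, not measured values.

    OODA_STAGES = ["observe", "orient", "decide", "act"]

    # Hypothetical per-stage latencies (seconds) for a human crew
    # versus an automated system.
    HUMAN_LATENCY = {"observe": 5.0, "orient": 20.0, "decide": 30.0, "act": 5.0}
    MACHINE_LATENCY = {"observe": 0.1, "orient": 0.5, "decide": 0.2, "act": 0.2}

    def cycle_time(latencies):
        """Total time for one pass through the loop."""
        return sum(latencies[stage] for stage in OODA_STAGES)

    human = cycle_time(HUMAN_LATENCY)
    machine = cycle_time(MACHINE_LATENCY)
    print(f"Human cycle:   {human:.1f} s")
    print(f"Machine cycle: {machine:.1f} s")
    print(f"Machine completes roughly {human / machine:.0f} loops per human loop")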

But where human action and decision making are displaced by intelligent electronic devices, new questions of security arise. This security is now popularly known as "cyber security."  Its close relative, "cyber power," is the power by which an actor can use computer code to take control of, influence, or degrade another actor's electronically enabled devices or communications systems.[iv]   As our country continues to proliferate hackable IEDs across an increasing number of systems, we do so on an implicit assumption: that the U.S. will maintain information dominance and thus a favorable cyber balance of power.  That assumption underlies DOD's race to build fleets of unmanned vehicles, construct ever more complex and networked electronic information systems, and deploy ever more electronic decision aids to our Sailors and Soldiers.

But is it reasonable to assume that our increasingly automated and computerized systems are, and will remain, cyber secure, trustworthy, and resilient?[v]  I think several of these suppositions are, or will very soon be, in doubt, for a simple reason: unlike more traditional forms of physical power, cyber power relationships can shift unpredictably, leaving our nation in a condition of relative uncertainty. Our ability to predict, observe, and react may therefore be inadequate to maintain information dominance and cyber superiority. Why is this so?
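
The 'silent failure' problem raised in footnote [v] is worth making concrete. The following is a minimal sketch, with hypothetical class names and invented waypoint values: a compromised platform keeps reporting nominal status even though its executed tasking has been silently rewritten, while an independent human-in-the-loop check, which compares assigned tasking against observed behavior rather than trusting the platform's self-report, turns the silent failure into a detected one.

    # Toy illustration of a 'silent failure': the platform's own status
    # reporting says all is well even though its tasking was tampered with.
    # All classes, names, and coordinate values here are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Tasking:
        waypoint: tuple  # (lat, lon) assigned by the commander

    class Platform:
        def __init__(self, tasking: Tasking):
            self.assigned = tasking
            self.executing = tasking  # what the autopilot actually flies

        def inject_attack(self, spoofed: Tasking):
            # A successful intrusion silently rewrites the executed tasking
            # while leaving the assigned record (and self-report) untouched.
            self.executing = spoofed

        def self_report(self) -> str:
            # The compromised platform still reports nominal status.
            return "NOMINAL"

    def human_in_loop_check(platform: Platform) -> bool:
        # An on-scene observer compares assigned vs. observed behavior,
        # rather than trusting the platform's own report.
        return platform.executing == platform.assigned

    uav = Platform(Tasking(waypoint=(36.0, -76.0)))
    uav.inject_attack(Tasking(waypoint=(35.0, -75.0)))

    print(uav.self_report())         # "NOMINAL" -- the silent failure
    print(human_in_loop_check(uav))  # False -- divergence detected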

The Uncertainty of Cyber Power

Cyber power calculations are increasingly opaque, and as a result, determining which country is in the cyber lead, or who will remain in the lead, is uncertain.[vi]  Unlike the security calculations and arms races of the past, where counting tanks, battleships, or ICBMs provided a rough measure of relative technological power, such calculations are difficult if not impossible today. Each new IED added to the millions of such devices already in the DOD inventory opens one more conduit for cyber attack and contributes to rising complexity.[vii]  Due to the proliferation of IEDs, we are on a trajectory toward a time when nearly all critical systems and weapons may be accessible, and 'hackable,' by computer code.  In this new electronic web of machines, if one of our stronger cyber rivals gains a strategic computing advantage (say, a breakthrough in supercomputing or cryptography), the consequences could range from the tactical to the strategic, across our networked systems and automated platforms, and to the detriment of soldiers who have become dependent on electronic devices. Additionally, there is a dawning recognition of the vulnerability of automated and remotely piloted vehicles.  DARPA has recently embarked on a $60 million program to provide better protections to the American drone fleet, known as High-Assurance Cyber Military Systems, or HACMS.  But far more must be done.
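
The complexity point lends itself to back-of-the-envelope arithmetic. In the worst case, where every networked device can reach every other, potential device-to-device paths grow with the square of the number of devices, so each added device enlarges the attack surface by far more than one 'unit.' The sketch below is illustrative only, using round numbers chosen for the example; it is not a model of any actual DOD network.

    # Back-of-the-envelope: pairwise paths in a fully meshed network
    # of n devices grow as n*(n-1)/2. Illustrative only; not a model
    # of any actual DOD network.

    def pairwise_paths(n: int) -> int:
        """Potential device-to-device links in a full mesh of n devices."""
        return n * (n - 1) // 2

    for n in (10, 100, 1_000, 10_000):
        print(f"{n:>6} devices -> {pairwise_paths(n):>12,} potential links")

    # Adding one device to a mesh of n devices adds n new links to defend.
    n = 10_000
    print(f"Device number {n + 1} adds {pairwise_paths(n + 1) - pairwise_paths(n):,} links")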

In conclusion, doctrine, system acquisition, and workforce development across the force must be reconsidered.  We must think anew and consider the stark realities of a cyber multipolar world and the near certainty of a contested cyber-physical battle space.  To regain a more resilient military capability, doctrine must be revised to include contingencies for a degraded electronic-communications environment. Training and education must create a human capability both to operate advanced active cyber defense systems and to operate existing systems in a degraded condition. Lastly, the employment and future acquisition plans for autonomous systems (e.g., the UAV, USV, UUV, TLAM, and ICBM fleets) should be reconsidered pending further research. In particular, the doctrinal and even ethical implications of employing networked but unmanned systems in a cyber-contested battle space, when faced with near-peer cyber competitors, should be examined.


Footnotes: 

[i] The series of war games was funded by the Office of the Secretary of Defense, Office of Rapid Fielding, under SES Ben Riley. Mr. Riley is the Principal Deputy, Deputy Assistant Secretary of Defense for Rapid Fielding, in the office of the Assistant Secretary of Defense for Research and Engineering.  Project leadership was shared between Dr. Peter Singer of Brookings and the NOETIC Corporation.  Game Four was co-hosted with the U.S. Naval Academy, March 2013.  For more information, see the NOETIC Corporation and the article by Dr. Patrick Lin in The Atlantic, April 18, 2013.

[ii] The expanding realm of cyber insecurity penetrates an increasing number of activities, from email servers, government databases, banks, and critical infrastructure to, now, frontline weapons.  See "High-Tech U.S. Weapons Face Rising Espionage Threat," 22 June 2013, at INSIDEDEFENSE.COM.

[iii] The advocates for accelerating the acquisition of unmanned systems are many, but their acknowledgement of the potentially high costs of ensuring cyber security could be more candid.  If the 'lifetime costs' of these systems included never-ending cyber security contracts, the argument to automate might be less compelling.  In the first war game of the OSD series, for example, one senior robotics company executive exemplified this problem. When pressed about the cyber security of his company's unmanned systems during an off-the-record meeting, the executive deferred the problem to software companies.  He appeared to take little ownership of the potentially massive problem, offering that the 'banks would be the first to solve the problem.'

[iv] There is no consensus on exactly what cyber power is.  The definition used in this essay is a simplification of many longer attempts at explanation.  A particularly thoughtful essay on the subject is Joseph S. Nye, Jr., "Cyber Power," at https://projects.csail.mit.edu/ecir/wiki/images/d/da/Nye_Cyber_Powe1.pdf.  An interesting research effort is also underway at the University of Tulsa, an NSA-recognized center of cyber excellence, led by Dr. Sujeet Shenoi.

[v] "Silent failures" are considered by some experts in the field to be the worst and most potentially damaging kind, since they are failures that you do not know have occurred.  With a human in the loop, especially on physical-kinetic platforms, a human is 'on scene' and can more quickly identify whether the platform or system is failing to follow its assigned tasks.  For more on silent failures, see Dan Geer, interview with NEWSLE.com, 26 May 2013: "The most serious attackers will probably get in no matter what you do. At this point, the design principal, if you're a security person working inside a firm, is not failures, but no silent failures."  Accessed 31 July 2013 at http://newsle.com/article/0/77585703/.

[vi] Martin Libicki of the RAND Corporation has written extensively on the problem of cyber attribution and on how cyber differs from traditional theories of deterrence, which relied on more certain knowledge of a potential adversary's military capabilities than may be possible in the cyber case.

[vii] Dan Geer, interview with NEWSLE.com, 26 May 2013.  Geer, a leading cyber security executive, has noted that the complexity of our networked electronic systems may be the biggest challenge going forward, even before accounting for the determined attacks of a cyber rival.

Mark Hagerott is an Afghanistan war veteran and a former naval chief engineer of a dual nuclear reactor plant, combat information center officer, and combat systems officer, in which roles he managed semi-autonomous and automated systems in the U.S. Navy's AEGIS fleet.  He currently serves as Deputy Director and Distinguished Professor in the Center for Cyber Studies at the U.S. Naval Academy.  His most recent work, a chapter offering a framework for robots and cyber, will be published later this month by the Combat Studies Institute Press, Ft. Leavenworth, Kansas.