There’s an old German saying, “If enough is good, more’s better,” that seems to hold true for many aspects of life. But when it comes to trying to create meaning out of data, more data is not necessarily better. The most prominent intelligence failure of late involves the Christmas Day would-be bomber and the failure to “connect the dots.” A very different intelligence dilemma stems quite simply from “too many dots,” that is, too much data, this time in the form of video. In 2009, Air Force drones in Afghanistan and Pakistan collected three times the amount of video they recorded in 2007, so much data, in fact, that it would take 24 years to watch the entire 2009 collection from beginning to end. Predator and Reaper drones in Afghanistan fly 39 round-the-clock missions per day, and each unmanned plane records a full 24 hours of video per mission, which is streamed back to teams of analysts at five sites in the US. Nineteen hundred analysts work 12-hour shifts to review these videos. Amazingly they are managing to keep up—but barely.
The analysts handle over 700 gigabytes of data a day, roughly equivalent to more than 700 sets of the Encyclopedia Britannica. They rapidly relay information about insurgent movements, weapons caches, and roadside bombs back to troops in Afghanistan. So we have the spectacle of unmanned, remote-controlled planes, equipped with video cameras, collecting information about enemy activities and relaying it halfway around the world for analysis, with the pertinent findings then relayed back to the troops in Afghanistan. In some ways this is a tremendous display of 21st-century technology providing real protection and vital information for combatants.
But now the Air Force plans to increase the number of daily missions by one-third, so that unmanned drones will complete fifty 24-hour videos per day (or about one thousand sets of the Britannica). On top of that, it plans to increase the number of cameras in the Reapers to 10 this year and to 30 by 2011, with plans to eventually push the number to 65. Even the Air Force’s head of intelligence, Lt. Gen. David A. Deptula, admits the Air Force could soon be “swimming in sensors and drowning in data.” It’s an increase from 39 video feeds a day to roughly 3,000: fifty missions with as many as 65 cameras apiece. That’s a lot of data!
Military officials are incorporating techniques developed by television broadcasters to transform raw video data into actionable information more efficiently. New software will allow analysts to provide context and meaning for the video more easily: they’ll be able to highlight and otherwise mark up screens to better convey what’s important, create clips, and insert text and graphics.
Of course, what military officials really want—and desperately need—is automation: software that could automatically scan for vehicles, men, and key changes to the terrain. But the software is not reliable enough yet. Television broadcasters, who have been working with video software for years, are still reduced to some pretty simple manual tagging. One Naval officer working to set standards for video intelligence observed a television crew recording a New England Patriots football game so he could learn how networks tag video for easy retrieval. “There are these three guys who sit in the back of a van, and every time Tom Brady comes on the screen, they tap a button so that Tom Brady is marked.” To call up film later, he said, “They just type in: ‘Tom Brady, touchdown pass.’” This is, at best, semi-automatic software. The dream of true AI software for scanning data has so far proven as illusory for the military as other visions of AI have for other professions.
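To make that broadcast workflow concrete, here is a bare-bones sketch in Python of what this kind of manual tagging and retrieval amounts to: an operator marks a timestamp with a label whenever something of interest appears, and “search” is just a keyword lookup over those marks. The class and field names here are hypothetical, invented purely for illustration; they are not drawn from any actual broadcast or military system.

```python
from dataclasses import dataclass, field


@dataclass
class Tag:
    """One manual annotation: what was seen and when it appears in the footage."""
    timestamp_s: float  # seconds from the start of the video
    label: str          # e.g. "Tom Brady"
    note: str = ""      # optional detail, e.g. "touchdown pass"


@dataclass
class TaggedVideo:
    """A video file plus the tags an operator added while watching it."""
    source: str
    tags: list[Tag] = field(default_factory=list)

    def mark(self, timestamp_s: float, label: str, note: str = "") -> None:
        """Called each time the operator 'taps the button' on something of interest."""
        self.tags.append(Tag(timestamp_s, label, note))

    def find(self, query: str) -> list[Tag]:
        """Keyword retrieval: return tags whose label or note contains the query."""
        q = query.lower()
        return [t for t in self.tags if q in t.label.lower() or q in t.note.lower()]


# Tag moments during review, then pull them back up later by keyword.
game = TaggedVideo("patriots_game.mp4")
game.mark(1834.0, "Tom Brady", "touchdown pass")
game.mark(2411.5, "Tom Brady", "sacked")
for tag in game.find("touchdown"):
    print(f"{tag.label} at {tag.timestamp_s:.0f}s: {tag.note}")
```

The point of the sketch is how little intelligence lives in the software: a human supplies every label, and the computer merely stores and matches strings, which is why “semi-automatic” is the right description.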
Consider a telling example: McDonnell Douglas tried to build a system that could capture an airport worker’s ability to recognize whether a plane coming in for a landing is positioned correctly as it approaches the ground. The plan was to use the expert system when visibility was low. After two years of focused effort, however, the technical team had managed to replicate only about 85% of what a worker could see in a two-second visual evaluation. Compared to the task of surveying terrain for armed men and IEDs, that visual evaluation seems trivial. Yet even it has proven impossible to fully replicate in software.
The Air Force has other plans for handling the growing mountain of data the drones are collecting. Lt. Gen. Deptula intends to hire 2,500 more analysts over the next three years to add to the current staff of 1,900. But finding and training those new analysts has its own problems. As one intelligence officer put it, “It takes five years to get someone with five years experience.” Experienced contractors, mostly former soldiers and Marines, are being hired, but when a decision involves deploying weapons or other critical actions, an airman subject to the military justice system must make the call. The Army is also streaming more video directly to the troops on the ground, but this tactic isn’t perfect either. As intelligence commander Captain Mike Hollingsworth explains, interpreting video data takes training that the average soldier doesn’t have: “There’s a common misconception by people who don’t do this for a living that they could do this for a living.”
Still, overall it’s easier for military officials to get more money for technology than it is to get more manpower. Says Deptula: “Developing the technology and tools to make this easier, faster, and less manpower-intensive is going to be vital as warfare in the information age matures.” The Air Force is also working hard to provide analysts with better tools for their analyses. Analysts use Google Earth to cross-reference other imagery, and they use special enhancements to highlight imagery and overlay historical images to spot changes.
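That last step, overlaying historical images to spot changes, is essentially change detection: compare today’s view of a patch of ground with an earlier one and flag what differs. The actual tools are classified and surely far more sophisticated, but as a rough, purely illustrative sketch of the idea, here is per-pixel differencing in Python; the file names and threshold are made up.

```python
import numpy as np
from PIL import Image, ImageChops


def change_mask(before_path: str, after_path: str, threshold: int = 40) -> Image.Image:
    """Return a black-and-white mask of pixels that differ noticeably between two images.

    This is the crudest possible change detection: per-pixel differencing with a
    fixed threshold. Real imagery analysis has to cope with image registration,
    lighting, shadows, and sensor noise, all of which this sketch ignores.
    """
    before = Image.open(before_path).convert("L")                    # grayscale
    after = Image.open(after_path).convert("L").resize(before.size)  # align sizes crudely

    diff = ImageChops.difference(before, after)    # absolute per-pixel difference
    changed = np.asarray(diff) > threshold         # True where the scene changed a lot
    return Image.fromarray((changed * 255).astype(np.uint8))


# Hypothetical usage: highlight what changed along a stretch of road between
# yesterday's pass and today's.
# change_mask("road_monday.png", "road_tuesday.png").save("changes.png")
```

Even this toy version hints at why the real problem is hard: a shadow that moved, a truck that parked, and freshly disturbed earth all show up as “changed pixels,” and deciding which matters is exactly the judgment the analysts supply.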
But issues involving the human factor persist. Human error played a large part in the failure to detect the Christmas Day bomber, as it has in other incidents, and the intelligence community intends to pin its hopes on future technologies, not manpower, to handle the mountains of data it will have at its disposal. Connecting the dots and finding patterns of behavior may well be things that computers can do. For now, however, and one expects for a long time to come, it’s human perception, analysis, and, above all, judgment that the intelligence community has to rely on. “You need somebody who’s trained and is accountable in recognizing that that is a woman, that is a child, and that is someone who’s carrying a weapon,” the leader of one intelligence squadron observed. “And the best tools for that are still the eyeball and the human brain.”
Military officials are aware of the limitations of AI or machine intelligence in wartime. Says one Air Force general, “The history of human conflicts is littered with examples of how military forces achieved results that no algorithm would have predicted.” P.W. Singer, author of the recent Wired for War: The Robotics Revolution and Conflict in the 21st Century (New York: Penguin, 2009), agrees: “It may seem just like a game of chess to some, but war doesn’t have a finite set of possible actions and a quantifiable logic of zeros and ones. Instead, as one writer put it, ‘In war, as in life, spontaneity still prevails over programming.’” Human judgment, intuition, perception, experience, common sense, creativity, emotions: many of the things that make up what it means to be human are hard to write into the lines of a digital program.
See also
P.W. Singer’s very well-written book Wired for War provides a thorough compendium and analysis of the state of robotics in the US military. Buy it on Amazon.
“Drone Flights Leave Military Awash in Data,” Christopher Drew, New York Times, January 11, 2010.
“Military ‘Swimming in Sensors and Drowning in Data,’” Stew Magnuson, National Defense Magazine, January 2010.
“Get-well intel plan,” Jim Hodges, C4ISR Journal, January 1, 2010.