In a memorandum dated 26 April 2017, the US Deputy Secretary of Defense set up an Algorithmic Warfare Cross-Functional Team (AWCFT) under the codename Project Maven. The stated goal of Project Maven is
“to field technology to augment or automate Processing, Exploitation, and Dissemination (PED) for tactical Unmanned Aerial System (UAS) and Mid-Altitude Full-Motion Video (FMV) in support of the Defeat-ISIS campaign. This will help to reduce the human factors burden of FMV analysis, increase actionable intelligence, and enhance military decision-making.”
In short, the Department of Defense (DoD) wants AI to help it identify where to look. It will do this partly by identifying and combining existing AI technology, but also by developing its own. This development will focus on providing
“computer vision algorithms for object detection, classification, and alerts for FMV PED. Further sprints will incorporate more advanced computer vision technology. After successful sprints in support of Intelligence, Surveillance, Reconnaissance (ISR) PED, the AWCFT will prioritize the integration of similar technologies into other defense intelligence mission areas.”
This is the part most films skip so they can get to the interesting, action-packed parts with the explosions. But knowing what to shoot is often more important than the shooting itself, and from an efficiency point of view it makes sense for the DoD to want to automate that process as far as possible. You can have all the data in the world, but without someone who knows what to make of it, it is useless.
As its partner, the DoD has chosen a party that is both logical and puzzling: Google. On the one hand, Google is one of the biggest tech companies in the world, with a clear interest in expanding into AI. On the other hand, it is not the most military-minded of companies, with its fair share of SJWs. According to Gizmodo, which broke the news of Google's involvement on 6 March, the cooperation has “set off a firestorm”:
“Google’s pilot project with the Defense Department’s Project Maven, an effort to identify objects in drone footage, has not been previously reported, but it was discussed widely within the company last week when information about the project was shared on an internal mailing list, according to sources who asked not to be named because they were not authorized to speak publicly about the project. (…) Some Google employees were outraged that the company would offer resources to the military for surveillance technology involved in drone operations, sources said, while others argued that the project raised important ethical questions about the development and use of machine learning.”
When asked by Gizmodo about its work for the DoD, a spokesperson said Google is providing TensorFlow APIs: machine learning tools that help military analysts detect objects in images. The company is currently working to “develop policies and safeguards” around the use of these tools, in an apparent acknowledgment of the controversial nature of their application:
“We have long worked with government agencies to provide technology solutions. This specific project is a pilot with the Department of Defense, to provide open source TensorFlow APIs that can assist in object recognition on unclassified data. The technology flags images for human review, and is for non-offensive uses only. Military use of machine learning naturally raises valid concerns. We’re actively discussing this important topic internally and with others as we continue to develop policies and safeguards around the development and use of our machine learning technologies.”
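Google's statement is vague on mechanics, but the core claim, that “the technology flags images for human review”, describes a simple triage pattern: a model scores objects in each frame, and only frames whose detections clear a confidence threshold are queued for an analyst. As a rough illustration only (every name, label, and number below is invented; none of it comes from Project Maven or the TensorFlow APIs in question), such a pipeline might look like this:

```python
# Hypothetical sketch of a "flag for human review" triage step.
# The detector itself is a stand-in: in practice a trained model
# would produce the Detection objects from video frames.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # e.g. "vehicle", "building" (illustrative labels)
    confidence: float # model's score in [0, 1]

def flag_for_review(frames, threshold=0.7):
    """Return indices of frames with at least one detection above threshold.

    `frames` is a list of per-frame detection lists. Nothing is acted on
    automatically: flagged frames simply go into a human analyst's queue.
    """
    flagged = []
    for i, detections in enumerate(frames):
        if any(d.confidence >= threshold for d in detections):
            flagged.append(i)
    return flagged

frames = [
    [Detection("vehicle", 0.91)],                               # confident hit
    [Detection("vehicle", 0.42)],                               # too uncertain
    [],                                                         # nothing detected
    [Detection("building", 0.65), Detection("vehicle", 0.88)],  # one clears the bar
]
print(flag_for_review(frames))  # → [0, 3]
```

The point of the pattern is that the machine only narrows the haystack; the judgment call on what the flagged frames actually show, and what to do about them, stays with a human.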
The DoD meanwhile declined to comment on Google’s exact role, or whether it was the only private industry partner involved in Project Maven:
“Similar to other DOD programs, Project Maven does not comment on the specifics of contract details, including the names and identities of program contractors and subcontractors.”
But it sure is a good thing they gave James Damore the boot. They wouldn’t want to end up in ethically questionable waters, now would they?