Of late we’ve seen both leaked and open evidence of many employees at Internet tech firms in the U.S. rebelling against their firms’ participation in military contracts for battlefield systems, mostly related to cloud services and AI.
Some reactions I’ve seen to this include statements like “those employees are unpatriotic and aren’t true Americans!” and “if they don’t like the projects they should just quit the firms!” (the latter as if everybody with a family were independently wealthy).
Many years ago I faced similar questions. My work at UCLA on the early ARPANET (a Department of Defense project) was funded by the military, but was research, not a battlefield system. A lot of very important positive research serving the world has come from military funding over the years and centuries.
When I was doing similar work at RAND, the calculus was a bit more complex, since RAND’s primary funding back then was also DOD, but RAND provided analytical reports to decision makers, not actual weapons systems. And RAND had a well-earned reputation for speaking truth to power, even when that truth was not what the power wanted to hear. I liked that.
But what’s happening now is different. The U.S. military is attempting to expand its traditional “military-industrial” complex (so named during a cautionary speech by President Eisenhower in 1961) beyond the traditional defense contractors like Boeing, Lockheed, and Raytheon.
The new procurement targets for battlefield systems are companies like Google, Amazon, and Microsoft.
And therein lies the root of the problem.
Projects like Maven and JEDI are not simply research. They are active battlefield systems. JEDI has been specifically described by one of its top officials as a program aimed at “increasing the lethality of our department.”
When you sign on for a job at any of the traditional defense contractors, you know full well that battlefield operational systems are a major part of the firms’ work.
But when you sign on at Google, or Microsoft, or Amazon, that’s a different story.
Whether you’re a young person just beginning your career, or an old-timer long engaged in Internet work, you might quite reasonably expect to be working on search, or ads, or networking, or a thousand other areas related to the Net — but you probably did not anticipate being asked or required to work on systems that will actually be used to kill people.
The arguments in favor of these new kinds of lethal systems are well known. For example, they’re claimed to replace some soldiers with AI and to make the remaining soldiers more effective. In theory, fewer of our brave and dedicated volunteer military would be injured or killed. That would be great — if it were truly accurate and the end of the story.
But it’s not. History teaches us that with virtually every advance in operational battlefield technology come new calls for even more military operations, more “interventions,” more use of military power. And somehow the promised technological advantages always seem to be largely cancelled out in the end.
So one shouldn’t wonder why Google declined to renew its participation in Maven and has now announced that it will not participate in JEDI — or why many Microsoft employees are protesting their own firm’s JEDI participation.
And I predict that we’re now seeing only the beginning of employees’ unwillingness to just “go along” with working on lethal systems.
The U.S. military has made no secret of the fact that it sees cloud environments, AI, robotics, and an array of allied high-technology fields as the future of lethal systems.
It’s obvious that we need advanced military systems, at least for defensive purposes, in today’s world. But simply assuming that employees at firms that are not traditional defense contractors will just “go along” with work on lethal systems would be an enormous mistake. Many of these employees are making much the same sort of personal decision that I made long ago and have held to throughout my life: that I would not work on such systems.
The sooner that DOD actually understands these realities and recalibrates accordingly, the better.
–Lauren–