Washington Post | Alyssa Rosenberg. When Alex Gibney released “Zero Days,” his movie about the Stuxnet program that slowed Iran’s nuclear development, earlier this year, his portrait of the unsettled state of cyberwarfare was already unnervingly relevant. But in the months since, the Democratic National Committee suffered a severe hack, a massive distributed denial-of-service attack shut down many Internet-based services for East Coast users and WikiLeaks began publishing John Podesta’s private emails. So it’s no surprise that Showtime moved up the air date for “Zero Days” to Saturday at 9 p.m. Gibney and I spoke in early October; our conversation has been edited for clarity and length.
What has it been like to release the movie in a context where some of its scariest predictions seem to be coming true?
Well, it’s unfortunately gratifying. I don’t know how better to put it. When we were making the movie, many people sort of rolled their eyes as if it was somehow abstract. And yet since the movie’s come out, you can see it rolling through the headlines every day, whether it be the DNC hack or the Russian hacks on the election machines. And back just before the Berlin Festival, there was the presumed Russian assault on the Ukrainian power grid. So, yeah, it’s happening all around us.
I think obviously people were very shocked by Donald Trump’s flip invitation for Russian security services to hack Hillary Clinton’s email, but do you feel like the conversation [about cyberwarfare] has gotten sort of more to a place of awareness or acceptance? And how do you see the conversation going forward?
I think that the conversation is still in the hysterical realm. Which is to say people look like their hair is on fire when one of these things happens, as if it’s the first time. And part of the reason for that, and part of what I deal with in “Zero Days,” is that nobody’s really reckoning with the fact that so much of the idea of cyberwar, of cyberweapons, is still classified in terms of what we’re doing with it. And therefore, presumably, what other nations are doing with it.
Until that comes out in the open, it’s hard to have a really sophisticated conversation about this stuff. And also to understand the differences between different kinds of attacks, and what kinds of attacks are more malicious than others, and also, what kind of implants are already likely installed in this country and how many we have installed in various countries all over the world. . . . It’s kind of like we’re sitting on digital quicksand, and nobody’s willing to say that we are.
One of the things I thought was really interesting about the movie was the question of what a cyberwarfare doctrine might look like. It seems, both from the movie and what we’re discussing now, that without transparency it will become extremely difficult to develop a doctrine, because if we don’t have a sense of what the lay of the land actually is, we can’t develop a sense of what should be normal or what is acceptable.
I think that’s true. And it’s particularly difficult in cyber because cyberweapons are a natural outgrowth of espionage. And espionage is a peculiar kind of activity, which is accepted, even though whenever a spy or a spying machine is discovered, everyone reacts with outrage. . . . It’s also problematic with cyber in the sense that attribution, figuring out who launched the weapon, is extremely difficult. But all the more reason why this kind of thing needs to start being discussed out in the open so that you can at least begin to reckon with some of these problems. Otherwise, the assumption, as [former National Security Agency director] Michael Hayden says in the film, is just keep it secret.
Well, you’ve made an important distinction, which is we have doctrines for war fighting, but it doesn’t seem to me that we have established doctrines about what’s completely unacceptable in espionage.
I think the limit comes, and the reason we focused on Stuxnet, is that it is really the first significant time that we know of when things crossed from the cyber realm to the physical realm. . . . It’s the attack part that I think we can focus on more aggressively, or more openly. That’s the part that we need to take stock of. Because an attack on critical infrastructure can be considered to be an act of war. . . .
It’s a little bit laughable when we get upset about the idea that the Russians are trying to interfere in American elections. Of course we should be upset about it. But we can’t act like we’ve never tried to influence elections all over the world. That’s what the CIA does on a regular basis. . . . I do think, you know, while it pays to have some perspective on the DNC hack, given the United States’ past efforts to influence elections all over the world, I wouldn’t regard that as an excuse to just throw up your hands and say “Oh, what, me worry?” I don’t think it’s appropriate for the U.S. or other countries to interfere in the elections of sovereign states, and that might be a good doctrine that might come out of this.
It seems like pop culture has sort of given the impression that we can do anything with computers, and it’s true that this stuff is precise. But it’s also not, I think, as precise and as safe as we’d like to imagine.
We’re having this problem with drones, too. As a technology, a Predator drone is a lot more precise than a B-52 bomber. But the law isn’t very precise. And the intelligence that leads to the attack isn’t very precise. So we have to decide what that precision means. Sometimes the technical means can feel surgical, but the broader implications can be much scarier. Because there is no thermonuclear cloud and no massive radiation when you flip a switch and the grid goes down in New York City, but if the grid stays down for a few weeks, a lot of people will die.
We were talking about precision a minute ago, and it seems like the ability to be more precise doesn’t answer the question of whether, say, killing someone with a drone is something we should do.
It may be that slowing down Iran’s march to nuclear enrichment [via Stuxnet] was a good idea at the time. It may have prevented Israel from bombing Iran, and thereby drawing us into a war with Iran. And at the same time, it slowed them down in ways that they didn’t really understand, that they blamed on themselves. All of which you could consider good things.
And yet we established a norm of behavior that, in the aftermath of Stuxnet, now that it’s been discovered, is not a very good norm. Because it means, as one lawyer says in the film, you can do whatever you can get away with. Well, that’s not a very positive development. So the Iranians think they can do whatever they can get away with. The Russians the same, the same with the Chinese, or for that matter, the North Koreans.
If you think about the dawn of the nuclear age, we developed doctrine by using nuclear bombs and then worked back from that, and that’s obviously very consequential. And yet the norm of not detonating nuclear devices has held for, in some ways, a shockingly long time. Do you think we’re at a point where we could develop a doctrine back from this first-use position and end up okay?
If you think about it, in some ways we’re at the very same point that we were in the years just after Hiroshima and Nagasaki. Because we dropped those bombs, but after dropping the bombs, we didn’t run around to all the countries and say “You know, this is terrible what we’ve done, we want to establish a treaty [to] now govern their use.” We just waited until it was clear that the Soviet Union had a weapon, and then the Chinese had a weapon, and suddenly, everybody had the capacity to create enormous destruction.
Well, that’s where we’re at now with cyberweapons. You can see that other powers, including the United States . . . have this destructive power, so it would be a good time to hunker down and talk about those weapons, about what rules should govern . . . those weapons, before we get into a kind of cataclysm.
Do you feel like [overclassification is a] subject where the tide is turning? The U.S. government obviously still cracks down on leakers and whistleblowers really intensely. But it was just interesting to have [Michael Hayden’s] perspective on how much trouble classification caused for the government itself.
Even as you ask that question, there’s this enormous crackdown on leakers and huge sentences proposed for leakers. And the law used to prosecute those leakers tends to be the Espionage Act, as if leaking is akin to spying. . . . And yet there’s an enormous amount of hypocrisy in the government about what can and can’t be secret according to who is doling out the information, and whether government officials can lie. And of course I’m referring to [Director of National Intelligence] James Clapper, when he lied to the Senate about what was going on inside the NSA, right around the time of the [Edward] Snowden revelations.
So I think the big problem is that we really haven’t reckoned with how to be more open. I think that was always a strength for us. When I was doing my film on WikiLeaks, I talked to the classification czar from the Bush administration. And he echoed Hayden’s sentiments, that everything is hideously over-classified, and that, in terms of the way it’s supposed to work, there are supposed to be draconian penalties for over-classifying material. But those have just sort of fallen by the wayside. So everything gets classified, and . . . when so much is classified and so many people have authorization to access classified materials, it’s not surprising that you’re getting leaks, because how could you not? It’s like the dam is bursting with secrets.
I wanted to ask you a technique-based question, which was about the decision both to use a composite [to represent the movie’s National Security Agency sources] and not to acknowledge that it was a composite until the end of the film. Because fairly early in my notes, I wrote down “Is this an actual person?” Obviously there are questions about folks’ identities and we were just talking about the penalties for leakers and whistleblowers, but I was curious about how that choice sort of came together.
It was not something we even considered or thought of at the beginning of the process. And it happened for a number of different reasons that all brought us to the same place. One was the obvious one, we were having difficulty getting people to talk. And when we did start talking to people inside, or who had been inside, they were very, very nervous about possible polygraphs and also about how anything that they might say might come back to them.
When we started to explore this idea, our sources became increasingly interested in that possibility, because it did give them actual cover. In other words, if the film presents, even by the end, that there are a number of people, then that’s a good thing. It helps to hide, it helps to protect the source. . . .
We wanted to create a very hacked look, and part of that hacked look was to be able to play with the idea of how much you’re showing or not showing the face, and whether the audience along the way is going to be uncomfortable and go “No, no, you’re showing too much of the face! That person’s going to be exposed.” Because that in itself causes an emotional reaction over this secrecy and the consequences of violating that secrecy.
So why not reveal from the beginning that it’s a composite?
Then you don’t become involved in the character. And I think that was important. I think also it wouldn’t have ended up being as interesting a comment on secrecy if it had just been revealed from the beginning. And I think also the viewer would have felt that too much attention was being paid to the device, rather than the story we were telling.