Tackling disinformation is a national security issue, says former NSA general counsel
In this episode of Intelligence Matters, Michael Morell speaks with former NSA general counsel Glenn Gerstell about national security issues stemming from disinformation on social media. Gerstell outlines legal, technological, and policy approaches the new administration can take to combat the spread of disinformation on social media.
Highlights:
- Disinformation as national security threat: "Disinformation, whether it's foreign or domestic, is a national security threat very simply because it does one of two things. It either sows discord in our society or it undermines confidence in our democratic institutions, whether that's governmental institutions, the press, or other important societal structures. That's why it's a national security threat. What makes it effective is that disinformation falls on receptive eyes and ears. In order to believe some falsehood, you have to be predisposed to it . . . Disinformation doesn't create these societal divisions and prejudices and predispositions. They already exist. But it deepens them and hardens them."
- Legal ramifications of regulating speech: "We don't really want to hurt these tech companies or their platforms. We like Facebook and Instagram and YouTube, etc. We don't want to get rid of them or really curtail them so that they become uninteresting and not fun and boring. There are lots of legal problems with regulating speech. It sounds a little authoritarian to rule out certain kinds of communications. We don't want to look like Russia or China."
- Solutions to countering spread of disinformation: "We could undertake more cyber offensive actions against Russia, China, North Korea, and Iran, who are spreading disinformation from abroad. We're already doing that in large respects. We could perhaps do more. We could have an integrated disinformation center within the federal government. There's no one in the federal government who's responsible for countering disinformation. The Justice Department, the FBI, the NSA, the State Department and its Global Engagement Center, the Federal Communications Commission, they all have a role in dealing with digital communications, but no one is responsible. By contrast, we do have an integrated counterterrorism center in the intelligence community, and maybe we should do the same and have a coherent approach to disinformation."
Download, rate and subscribe here: iTunes, Spotify and Stitcher.
"Intelligence Matters": Glenn Gerstell transcript
Producer: Paulina Smolinski
MICHAEL MORELL: Glenn, the first time we had you on the show, we focused on your career and what it really means to be the most senior attorney at the National Security Agency. It was one of our most-listened-to shows. I would encourage any listeners who haven't listened to it yet to go to our archives and find it. I think it is a very interesting look at what it's like to be the general counsel of one of our nation's leading intelligence agencies. I also want to remind our listeners that this episode is part of a series that we're doing between the election and the inauguration on key national security issues that are facing our country.
Today we're going to talk about disinformation as a national security threat. Our conversation is going to center around a really terrific article that you just published in The New Yorker that was titled "The National Security Case for Fixing Social Media."
I want to ask you this: you were one of the few senior officials in the intelligence community to serve in both the Obama and the Trump administrations. I'd love for you to compare and contrast what it was like to serve in one versus the other.
GLENN GERSTELL: No surprise there was, of course, a big difference between the Obama and Trump administrations. At least from my perspective, there were three different areas that struck me. To start off, let me describe a piece of equipment to illustrate the first one. When I walked into my office the very first day at the National Security Agency, one of the assistants pointed out a fancy computer with fancy speakers and microphones. I said, 'what's that?' The assistant said, 'that's the special secure classified video conference equipment for conversations with the White House Situation Room and the Pentagon and the State Department and elsewhere within the intelligence community.' And the assistant said, 'you'll probably be using that fairly frequently, a couple of times a week perhaps'. And sure enough, I did.
We had video conferences with lawyers and policy officials throughout the last year and a half of the Obama administration that I served under, debating and considering virtually every national security issue, both from a legal point of view and a policy point of view. All these issues were thoroughly discussed and vetted. The surprise came when President Trump took office. I'm not making a political statement, just a factual statement. The number of times I used that video conference was literally a handful over the course of three and a half years. It's not because I was cut out of things. It's because the discussions weren't even occurring at that level. The tone and style of an organization start at the top. President Obama was himself a lawyer, a constitutional law professor. He was someone who was very engaged and an avid consumer of intelligence. He was deeply engaged in receiving the presidential daily brief and asking follow-up questions about it.
That tone starts from the top and flows down. No surprise that during the Obama administration, there were extensive discussions of all public policy issues associated with the national security sector. You could argue it was too much, that maybe things got nibbled to death and overlawyered. I certainly saw evidence of that. But things were very considered.
In the Trump administration, that simply wasn't the case. I think a lot of decisions were made by a very small group of people very close to the president, probably a little more on instinct than on the basis of detailed analysis through the interagency process. It just reflects a different approach from the top.
The second big change is a recognition that some of the turmoil and chaos over the last four years under President Trump is attributable to major personnel changes.
We had during that period four acting or confirmed directors of the Office of the Director of National Intelligence. I think we had an equal or greater number of secretaries of defense during that period. Unprecedented. The National Security Agency, where I worked, reports to both the secretary of defense and the director of National Intelligence. Add to that mix an impeachment in which the intelligence community was caught between the executive branch on one hand and the demands of Congress on the other. Claims by the president of illegal surveillance. An inspector general's report criticizing the FBI for certain surveillance procedures. An inspector general from the intelligence community who sparked the impeachment inquiry and who was himself fired by the president. A contentious renewal of the most important national security statute during that period, Section 702 of FISA. All of this, as I said, occurred in an unprecedented way in just a handful of years. It may not have had an effect on day-to-day operations, but it certainly impeded our ability to do the long-range policy planning and strategy development that require sustained leadership from the top. And that just wasn't present with all that turmoil.
Finally, the last difference in some ways negates the first two that I told you about, the difference in style starting at the top and the turmoil of the last four years. That is, substantively, the work of the intelligence community continued throughout this period professionally, without a break in operations. Our adversaries didn't change during that period of time, so there was no doubt about what the challenges were. Our allies didn't change during that period of time. They continued to work with us. This is really a testament to the professional men and women of the intelligence community that, notwithstanding the turmoil that I described and notwithstanding the differences in the tone and style at the top, the intelligence community continued to do what it needed to do to keep our country safe during that period.
MICHAEL MORELL: You don't think the quality changed in any way in terms of the execution of the mission of the intelligence community?
GLENN GERSTELL: Not in a day to day sense. I do feel we lost ground or failed to make up ground on longer term strategic and policy issues to prepare for the future. But to answer your question, no, I don't think so.
MICHAEL MORELL: What advice would you give to President-Elect Biden about what the intelligence community needs at this point in its history?
GLENN GERSTELL: President-Elect Biden and his team are certainly very experienced and probably don't need advice from me. But they certainly fully appreciate what needs to be done with the intelligence community. There are a lot of things that are obvious to all. The intelligence community has in some respects been hollowed out with a loss of some senior talent, particularly at ODNI, the Office of the Director of National Intelligence, and some other agencies where both political and career people have changed. It's been a little battered. It took a bit of a beating from the press, from the president, and from some other quarters. There are some morale issues. The President-Elect knows he needs to address those right away. He will appoint good people. He will appoint experienced people. I imagine he would probably make some gesture early on in his administration to restore the confidence of the American public in the intelligence community. I'm not sure what he'll do in that regard. But I know he'll recognize that he needs a confidence-building gesture.
What I would say to the President-Elect beyond that is that there are some deeper structural issues that have probably been papered over for the last decade while we were diverted, rather successfully, but diverted nonetheless, into the counterterrorism mission.
The result is, to use an overworked analogy, that the intelligence community is a vehicle. It's driving 70 miles an hour. It's really fast. It's really doing a terrific job. But there's a car in front of us that's doing 80 or 90, and the gap is increasing. By the car in front, I'm not referring to any one country, but to the environment. The environment is developing faster than the rate at which our intelligence community is innovating and adapting. We're at a point of divergence. The capabilities of the intelligence community are very good and increasing. They are very substantial. I don't want to minimize them at all.
But they're set against a world that's growing at a faster rate in a more dangerous way, a more complex way. Technological change and political changes are outpacing our ability to adapt. I think we need a broader definition of national security. We need to focus more on open source. We need to deal with a multipolar world. Basically, I would tell the President-Elect that we need to confront the reality that we have a 70-year-old intelligence architecture that has changed little from the post-war years. We've built up capabilities to deal with technology, cybersecurity, counterterrorism, etc., and very successfully. But everybody agrees that if we started with a blank sheet of paper today and said, 'what are the threats to our national well-being today, and how should we structure ourselves? How should we best position ourselves to respond?' we wouldn't come up with the structure we have. So the question, Mr. President-Elect, is not whether to overturn this, because we're not talking about a wholesale change; that's unrealistic. The question is, 'how do we effect change given limited resources and the time pressure?' That's the issue.
MICHAEL MORELL: I think disinformation and how you think about that as a national security issue fits into exactly what you just said. When you think about disinformation, how do you define it? Are you looking at it just from the perspective of disinformation from our adversaries, or are you looking at it from domestic sources of disinformation as well?
GLENN GERSTELL: Disinformation is one of those things that's a little hard to define, but you know it when you see it. It's a type of incorrect or wrong information or falsehood that exists on a continuum or spectrum. At one end, disinformation is something that is both false and has an element of deliberate, intentional desire to spread a falsehood. Or if it isn't knowing and intentional, it's certainly a reckless disregard for the truth. In some ways, disinformation is a little bit easier to define by what it's not. It's not just misinformation. It's not just a plain error or something wrong. It's also not exaggeration, disingenuous statements, puffery, or misleading claims like you might see in a political campaign, where one candidate mischaracterizes his or her opponent's voting record in a misleading way, or something like that.
It has to have an intention to create some injury, some desire to confuse or denigrate. It's not a trivial prank. It has to be something more substantive. It certainly has an element of intent to do some wrongdoing. As to your question of foreign or domestic, it's both. Both are dangerous. They're different and yet similar. With domestic disinformation, I think we have a sense that it's more likely to be on target or more likely to be believed, because domestic actors have a sense of what our hot buttons are. Foreign actors may be one step removed in that regard. But foreign disinformation could also be more concerted. A foreign nation state could have a serious, integrated campaign that might in its own way be every bit as devastating as domestic disinformation.
MICHAEL MORELL: You break your article into two sections. One is an argument for why disinformation is a national security threat. The second section is what we can actually do about it. What's the argument for disinformation being a national security threat?
GLENN GERSTELL: Disinformation, whether it's foreign or domestic, is a national security threat very simply because it does one of two things. It either sows discord in our society or it undermines confidence in our democratic institutions, whether that's governmental institutions, the press, or other important societal structures. That's why it's a national security threat. What makes it effective is that disinformation falls on receptive eyes and ears. In order to believe some falsehood, you have to be predisposed to it. If you or I hear something that doesn't square with our knowledge, our first instinct, our natural human instinct, is to say, 'that's probably not true. That doesn't make sense. That doesn't sound like him or her. I never heard him or her do that.' On the other hand, if it's something that fits in with our preconceived notions, we're all too ready to believe it. We go ahead and forward it on as a tweet or a Facebook post because it sounds right to us.
Disinformation doesn't create these societal divisions and these prejudices and predispositions. They already exist. But it deepens them and hardens them. That's why it's a particular national security threat. It really does affect our national well-being. When we think of our national security, we think of missiles and submarines, attacks, et cetera. Yes, those are things we need to concern ourselves with. We spend a lot of money on them.
Right now, disinformation is presenting real public health ramifications, whether it's people not believing in the efficacy of a vaccine or not trusting scientists on wearing face masks, etc. It's having a real effect on our economy and our public health. There's no question it affects our national well-being.
MICHAEL MORELL: What progress have we made so far on this and why haven't we made more?
GLENN GERSTELL: I don't think we've made that much progress, sadly. It's sadly fascinating that we're aware of the problem. We're all cognizant of disinformation on the left and right. We're frightened by it, anxious about it. But we're not doing very much about it. We're talking about it. That's the first step. That's a good sign. But we're not acting on it.
There are several reasons we're not taking action, mostly because there's no one obvious solution. There's no magic cure. To use the analogy, it's like treating a disease where the doctor says, 'we could do some surgery, but there's a lot of risks to that or there are several drugs you could take. But each of them have really serious side effects.' Your first reaction is, 'I don't want to do any of those.' Obviously doing nothing is not an acceptable alternative either. We're at that stage right now, which is where we're evaluating what we can do, partly because this is new, it's complex. We like the anonymity that comes with being online. That's a feature that we like in some ways. We don't really want to hurt these tech companies or their platforms. We like Facebook and Instagram and YouTube, etc. We don't want to get rid of them or really curtail them so that they become uninteresting and not fun and boring.
There are lots of legal problems with regulating speech. It sounds a little authoritarian to rule out certain kinds of communications. We don't want to look like Russia or China. It's a little hard to see the actual injury. There's no one person being hurt.
Finally, the entity that you'd expect to do something about this, namely the government through regulation or laws, has its hands somewhat tied. On the domestic side, we don't want to tinker with free speech. And on the foreign side, we're not too sure who's creating the disinformation, given the difficulties of attribution. So the government can't do much, and then it ends up in the hands of the private sector, where we have trouble getting things done.
MICHAEL MORELL: This really started to grab everybody's attention in 2016. How much of it is that the issue itself has become politicized?
GLENN GERSTELL: Very significant. I don't think we can overlook the fact that disinformation, and the way to treat disinformation, has become politicized. Obviously, the substance of disinformation is political, whether it comes from the right or the left. There's very significant disinformation coming from both sides. The more virulent kind seems to come from the right, which tends to dwell in the more lurid conspiracy theories and more outlandish claims. There are some academic studies that indicate why the right might be more prone to this kind of disinformation, because of a feeling of aggrievement or disenfranchisement. Suffice to say, it has become political when the president himself is one of the sources of disinformation. That has a great effect. There are a couple of academic studies that say it isn't really fair to blame social media for disinformation. They certainly help carry it, transmit it, and amplify it. But the most effective disinformation comes from authority figures, elites who are believed, whether it's the president or figures leading an industry or academia or elsewhere. If they start with something that's incorrect, it's amplified by social media, and it becomes a political issue.
MICHAEL MORELL: Technology is a big piece of this too, right? You write in your piece that we've entered a world in which our national well-being depends not just on the government, but also on Facebook and the other companies through which we lead our digital lives. This is a unique situation. When we talk about counterterrorism, the government can take care of the whole thing, but this is fundamentally different.
GLENN GERSTELL: Absolutely. That's a terrific and important point. Under the Constitution, we expect our government to provide for the common defense. That's not a duty of the private sector. Yet simply due to technology, we have now changed that equation. As technology makes us adopt more and more cyber-enabled platforms, software, and utilities, it leads to vulnerability, and that in turn leads to responsibility. What do I mean by that? In its heyday, if there had been a mishap at General Motors, it probably would not have affected our national well-being. But today, imagine the chaos and the effect on our national well-being and our national security if Google went down for two or three days. Or if Visa or MasterCard or Facebook stopped. This would have a major effect on our country. All of a sudden, because the private sector has masses of data about our private lives, we're now at a point where the private sector is vulnerable. Therefore, it has some responsibility for our national well-being, too. That's a very significant change that we haven't had before in our country's history.
MICHAEL MORELL: To what extent do you think the private sector understands what you and I are talking about here?
GLENN GERSTELL: They clearly understand it, but they are slow to develop solutions to address it. I think intellectually they understand it. In the area of data, the private sector clearly is acquiring lots more data. For example, a few years ago there was a big hack against Marriott where passport and personal details of several hundred million people were stolen out of Marriott's database. They're certainly cognizant of the fact that they have some duty of stewardship and trust to safeguard this data. They took some actions. In this case, it wasn't sufficient. Someone was able to hack into it. I think most responsible private sector companies understand that with this adoption comes vulnerability and that they have a duty to protect it. But at what level? How much money are they going to spend on it? What are the rules? What are the standards by which we're going to apply that level of responsibility? That's where our society hasn't come together. We're still very unclear as to exactly how much duty, what liability, and what responsibility we want to impose on the private sector. That's what we're going to have to worry about over the next decade.
MICHAEL MORELL: One of the things I absolutely loved about your article is you talked about the problem, but then you actually provided some very specific recommendations for how we can deal with it. I want to talk about those. Before we get to that, one of the things you say in your article is that there's more we can do than we currently think. Walk us through what you mean by that.
GLENN GERSTELL: That point is why I wrote the article, because there certainly are lots of articles about disinformation, many of which are better than mine. But what I wanted to do was not just describe the problem, but say that we actually have some tools to deal with it. The key to this is the fact that disinformation has many components.
It isn't sourced by one factor or one reason, and it doesn't come from one particular source. The fact that it has so many components that enable it, foment it, amplify it, and facilitate it means that there is no one magic bullet to cure it.
So that's the bad news. But on the other hand, it equally means that we potentially have lots of tools to attack the problem. Maybe no one of them is going to solve it. But we've got lots of individual ways of getting at the problem. It's like terrorism, or now the pandemic.
There's no one single solution, no one magic drug. But we do have some maneuvering room in the law, in technology and in the political areas. We don't need to be paralyzed by some of the concerns I talked about earlier. We can take some actions, and we can explore whether they will have an effect. Some will work, some won't. We can calibrate this. We have to have some trial and error. We'll make some improvements. The key for us over the next several years as we go down this road is to not be dogmatic about it, to be flexible. In effect, to use this analogy, to try some different drugs and treatments and see which ones work.
MICHAEL MORELL: You put your recommendations into three buckets. Your first bucket is what you call legal.
GLENN GERSTELL: Maybe I'll start with that just because I'm a lawyer, so that's where I gravitate. Obviously, that's a big part of this. We can't just outlaw disinformation, right? Under our Constitution, the First Amendment, you have a right to receive false speech, even from outside the United States.
MICHAEL MORELL: You have a right to receive, not just say?
GLENN GERSTELL: To both say and receive. Yes, you're correct. We can't just outlaw it flat out, because of the First Amendment and the way it's been interpreted over the years by the Supreme Court. Having said that, I also want to be very quick to say that the First Amendment is not absolute. People think of it as, 'you just can't have any law restricting free speech.' Well, we've got tons of exceptions where we do impinge on free speech rights. It goes beyond the classic one that you learn in grade school, that you can't yell fire in a crowded theater. We've got laws prohibiting the filing of false police reports. We've got laws prohibiting making false drug claims, advertising cigarettes in certain contexts, et cetera. There are lots of places where our society and the Supreme Court have decided that it is appropriate to curb some kinds of speech. But this is a very, very fraught area. We don't want to look like China. We don't want to take authoritarian steps.
I think there are some narrow things that we could probably flat out make illegal. Some constitutional law professors will have to debate this, but maybe we could make the intentional dissemination of fraudulent news about, say, election processes illegal. For example, you couldn't send out a tweet saying that polling places had changed from location A to location B if you knew that was wrong and you intended to deceive. There are some things we could do; it's not going to cure the problem, but it's a step.
The First Amendment doesn't apply to the private sector, so we've got some more freedom there in terms of dealing with social media. For example, maybe make it illegal to micro-target ads and content on the basis of race or other inappropriate categories, so that your feed in Facebook or YouTube doesn't reflect something improper. We should also spend a quick second on Section 230 of the Communications Decency Act.
Section 230 of the Communications Decency Act, I'll summarize by saying, is a statute that was enacted at the dawn of the Internet age in 1996. It says that if you're an Internet operator or provider, you're not liable for the content on your platform, and you're also not liable for taking down content.
For example, YouTube isn't responsible if someone puts up a bad video. You can't sue YouTube. If they decide to take it down, you can't sue YouTube for taking it down. That made sense in 1996, when we wanted to encourage the growth of the Internet and we didn't want these companies hobbled by litigation and lawsuits. Now it certainly doesn't make sense to have social media platforms that hold tremendous sway over our public lives and public opinion be utterly unaccountable. It's an anachronism that we need to fix. There are ways that we could go about amending Section 230. I am not in favor of repealing it, but there are ways we could tinker with it to balance the competing interests.
MICHAEL MORELL: Is social media handled differently than traditional media in that regard?
GLENN GERSTELL: It is. It grows out of the fact that the Internet grew up after we already had well-established laws for broadcast journalism and print. It occupies a separate category. In a way, social media platforms are not subject to a lot of the rules that print and broadcast media are subject to.
For example, if you run a political advertisement on the radio or TV, it has to conform to certain rules. The candidate at the end has to say, 'I'm so-and-so and I approve this ad.' Run that exact same ad on Facebook or YouTube and that is not required. How does that make sense when the lines between these media are so blurred right now?
MICHAEL MORELL: The second bucket that you had in your article, you titled Technology. Talk about that.
GLENN GERSTELL: The technology that creates this problem can also help solve the problem, in three areas. One, we can do something structurally. We can make posts and tweets that contain disinformation less viral. We can label them. We can remove inauthentic accounts. We can disclose the algorithms that potentially feed users misleading information. Those are all structural things. To be fair, social media in the last year has greatly expanded its use of technology in this regard. We could argue that it should have done so four or five or six years ago. But they're now moving along to use technology to help address the problem, not solve it, but address it.
Two, content policing. We can use artificial intelligence to catch the bad guys, the people spewing disinformation, and to label it. Three, we can have better communication. The intelligence community could share more information about foreign disinformation. Social media companies themselves could share more information with each other, consistent with antitrust rules. There's a lot we can do with technology, and we're starting that process, but there is still more to do.
MICHAEL MORELL: Then you have the Other bucket.
GLENN GERSTELL: There are many steps we can take that don't fall under regulation or technology. They are a whole series of policy choices. They range from stronger international treaties, in which each country agrees not to export disinformation, or at least certain kinds of disinformation, and not to interfere with another country's elections, to international agreements that would help bring to justice the perpetrators, hackers, and other people who spread deliberate disinformation.
We could undertake more cyber offensive actions against Russia, China, North Korea, and Iran, who are spreading disinformation from abroad. We're already doing that in large respects. We could perhaps do more.
We could have an integrated disinformation center within the federal government. There's no one in the federal government who's responsible for countering disinformation. The Justice Department, the FBI, the NSA, the State Department and its Global Engagement Center, the Federal Communications Commission, they all have a role in dealing with digital communications, but no one is responsible. By contrast, we do have an integrated counterterrorism center in the intelligence community, and maybe we should do the same and have a coherent approach to disinformation.
MICHAEL MORELL: Since we're talking about both foreign and domestic disinformation, maybe the intelligence community isn't the right place. Where would you put it?
GLENN GERSTELL: As a starting place, with the Department of Homeland Security. I think you probably don't want it with the Department of Defense or the intelligence community, for obvious reasons. You could arguably also see it in the Justice Department, but that, to me, seems a very legalistic approach. I'd probably vote either for an independent office in the White House, with the effort coordinated out of the White House by some senior-level appointee, or for Homeland Security.
The final piece is civic education. I'm a big proponent of civic education to counter disinformation. Twenty percent of our states have no requirement that in elementary school or high school you learn anything about how government works. Only nine states and the District of Columbia have even a one-year requirement. When you don't know how government works and you don't know what the purpose of checks and balances is, then it's easier to think that 'all the politicians are crooks,' or 'it's OK to ignore judicial rulings,' or 'it doesn't make a difference what the three branches of government are.'
MICHAEL MORELL: Let's just assume for a second that President-Elect Biden read your New Yorker article. What would you want to say to him about what we need to do?
GLENN GERSTELL: I would tell the President-Elect that I'm honored and flattered to be in a position to talk to him about what is really an extraordinary problem. I would say that you're facing two extraordinary problems that are arguably the greatest national crises we've faced since World War II, the pandemic and its economic effects. To exaggerate just a little to make the point, disinformation and its consequences are just as big a problem as those first two. Why? Because unless we start addressing disinformation, we're not going to be able to fix the first two problems. People won't take vaccines if they believe disinformation. They won't wear masks. They won't practice social distancing. Half the country thinks the incoming president wasn't even properly elected. We're not going to make progress if our citizens don't have some basic level of trust in our government to solve those problems.
Disinformation is difficult, and there's no one thing that has to be fixed, but that doesn't mean it shouldn't be addressed as a matter of national urgency. I would say, Mr. President-Elect, taking some concerted steps to counter disinformation will powerfully help our national well-being.
MICHAEL MORELL: Glenn, thank you so much for joining us. People can find your article in The New Yorker, and it's a great read. I agree with you that this is an extraordinarily important issue.