Russia’s Campaign to Help Trump Win Was Just the Start

And the next attack on US elections, warns former FBI agent Clint Watts, could come from within.

Former FBI special agent Clint Watts was tracking ISIS terrorists and their propaganda on Twitter in 2014 when he first encountered a different kind of troll. These accounts weren’t trying to recruit fighters for jihad. They were promoting an “Alaska Back to Russia” petition on WhiteHouse.gov, pushing pro-Kremlin foreign policy views, and drumming up support for Syrian President Bashar al-Assad.

Watching this troll army inundate social media into 2015 and 2016—including rising attacks on Hillary Clinton and promotion of Donald Trump for president—Watts realized a new information war was underway. As he tracked false news stories from Russian state media that were repeated by the Trump campaign, he was surprised to see that Kremlin-linked disinformation was sometimes even driving the campaign’s own narrative. Two days before Election Day, Watts and his fellow cybersecurity analysts JM Berger and Andrew Weisburd warned that the Kremlin wasn’t just backing Trump but was seeking “to produce a divided electorate and a president with no clear mandate to govern. The ultimate objective is to diminish and tarnish American democracy.” 

In the aftermath, as lawmakers struggled to contend with Russia’s role, the Senate Intelligence Committee relied on Watts’ expertise to help it understand the attack across social media networks. In his new book, Messing With the Enemy: Surviving in a Social Media World of Hackers, Terrorists, Russians, and Fake News, Watts details how Americans found themselves in a presidential election that was swirling with fake accounts and Kremlin propaganda. His work with Berger and Weisburd has also aided the Alliance for Securing Democracy’s Hamilton 68 Dashboard, which tracked a network of hundreds of Twitter accounts it identified as pushing Kremlin propaganda. [Editor’s note: ASD discontinued the dashboard in 2018. The group has since been criticized for refusing to disclose specifics, including which accounts were included and which, if any, were directly Kremlin-linked.]

Watts spoke to Mother Jones recently about Putin’s backing of Trump in 2016, Twitter’s bot problems, and other US tech giants’ role in the morass. And he offered a roadmap for navigating the even more sophisticated influence campaigns that may be looming for future elections—and that could originate within the US political system itself.  

(This interview has been lightly edited for length and clarity.)

Mother Jones: You were among the first to start raising the flag about Kremlin influence operations on social media. Why didn’t anyone take your analysis seriously back then?

Clint Watts: ISIS. In the context of the Russia stuff, we forget that everyone was talking about the Islamic State. Every time you talked about Russian influence, people immediately said, “Who cares? How do we know this is making any difference? Can you just get back to ISIS?” No one believed that the [Russian] accounts were having an impact on how people think.

MJ: How worried should we be about Russian interference in this year’s midterm elections?

CW: I’m not actually worried much about the Russians because I think the question ultimately comes down to, what do they want? I think the Russians had something they clearly wanted from Brexit to Germany. What do they want right now? There’s a lot of hand-wringing, and I think that’s good for our country to get prepared for this kind of thing. But I don’t know what the Russians want in the midterms, and I don’t know that they know, either. The one thing I’ve learned is when you see the Russians begin a hacking campaign, there’s something that they want. For example, the Olympic doping scandal. They wanted to reset the agenda about that. Whenever they start hacking, as they did in Sweden recently, they have a deliberate objective to go after: “See, the Swedes are doping, too.”

MJ: Are there ways to inoculate the public against disinformation campaigns?

CW: It’s increasingly hard because our government officials lie so much. We used to be able to count on government information being a baseline of truth, or at least being pretty darn accurate. I don’t know that we can do that now. You’ve got to have data that everyone trusts. Instead, we’re seeing political sides argue about the veracity of the government’s data. That makes the problem way, way worse.

MJ: How should we be preparing for future influence operations?

CW: We know a lot from Robert Mueller’s indictment and from the Obama administration’s Russia sanctions at the end of 2016. But we are incredibly stuck on 2016. Russia is not going to be the biggest player in this space. Russia kicked off the tidal wave, but now they just ride it. There have been a lot of authoritarians who’ve adopted their approach, with more devastating effect on their domestic populations. Cambodia, the Philippines, and Myanmar are three great examples.

And then there’s what I call “trolling as a service.” If our politics take on this information annihilation approach, we are in real trouble. The people who will win in that space will be those who can acquire the best technology for data aggregation and data harvesting—like what we’ve seen from the Cambridge Analytica whistleblower, but combined with strong machine learning to rapidly test themes and narratives on an audience to identify what to deliver to a political candidate. That’s the real fear everyone should look at: not Russian active measures, but American active measures through the hiring of cutouts, contractors, and tech companies.

MJ: You question Twitter’s accounting of 3,814 trolls and 50,000 bots linked to the Internet Research Agency, saying that’s likely a fraction of the Kremlin accounts still active on the platform. Why do you think there are so many more?

CW: I imagine that there’s some massive number still out there, and that we’ll probably never know the full scale of it because Twitter has a very limited ability to pick up on this. It’s really hard for them to tell. It was an open platform from the beginning, and they had very little to go on in terms of identity markers, whereas Facebook and Google have a lot more that they know about you. They can police it much better. And that’s evident in Facebook saying they shut down hundreds of millions of fake accounts this year. They have better systems for figuring it out, because they know more about you.

MJ: Should the tech companies be sharing more information with the government, researchers, and users to counter disinformation campaigns?  

CW: The government and social media companies need to decide what they can share to help each other out. For example, the government could maybe have helped the social media companies by saying, “These are accounts we know are foreign adversaries that are related to espionage, and these are terrorists that we know.” And then the companies could go through their systems and build out a signature for what those accounts look like, and correlate that with bad activity they’re already seeing. Just knowing what one positive looks like helps you find a lot more positives. And then that allows them to use their machine learning and AI to detect it quicker. I think the social media companies are at a deficit, because they don’t know who is a Russian agent on their platforms or what a troll farm is when those entities first start their accounts. If you’re reacting like that—I call these the zero-days of manipulation—you’ve got to figure out what they look like. Once you figure it out, you can go search for more like them.

That’s kind of how we got onto it. We [Watts, Berger, and Weisburd] had worked on projects for years where we were looking at extremists. The Russians had a totally different signature. Once you knew, you could look for more. The social media companies do this, but they’re not oriented around manipulators. They’re oriented around hackers or criminals, so they don’t have a good signature for manipulators.
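[Editor’s note: To make the “signature” idea concrete, here is a minimal, hypothetical sketch of the find-more-like-the-known-positives step Watts describes. The behavioral features, data shapes, and similarity measure are illustrative assumptions, not a description of any platform’s actual detection system.]

```python
# Hypothetical sketch: rank unknown accounts by similarity to a
# "signature" built from a few confirmed positives. Feature names are
# invented; assumes all features are pre-scaled to comparable ranges.
import math

FEATURES = ["posts_per_day", "retweet_ratio",
            "state_media_link_ratio", "burstiness"]

def vec(account: dict) -> list[float]:
    return [float(account[f]) for f in FEATURES]

def signature(seeds: list[dict]) -> list[float]:
    """Centroid of the known-positive accounts' feature vectors."""
    vectors = [vec(a) for a in seeds]
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def cosine(u: list[float], v: list[float]) -> float:
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def rank(seeds: list[dict], unknowns: list[dict]) -> list[tuple[float, str]]:
    """Most-similar-first: candidates for human review, not verdicts."""
    sig = signature(seeds)
    return sorted(((cosine(vec(a), sig), a["handle"]) for a in unknowns),
                  reverse=True)
```

Each confirmed positive sharpens the signature, which is the dynamic Watts describes: knowing what one positive looks like helps you find many more.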

MJ: The Hamilton 68 dashboard built on some of the work you, Berger, and Weisburd did to track pro-Kremlin accounts on Twitter. How has that dashboard been used since it launched in August 2017?

CW: We were trying to create a dashboard because just watching endless Twitter streams made it very hard to do anything with the information. The interesting thing about how the public has reacted to that dashboard is they’ve gotten hyper-focused on accounts. “Is this account a Russian troll or not?”—they love this sort of debate. I’m more interested in what the Russian influence effort is trying to communicate to the audience. The dashboard is a summary view. People are always asking, “Show me the accounts.” Number one: They’re trying to not look like Russian trolls. Number two: Until I get a Russian troll farm operator who comes out and discloses everything, I don’t know 100 percent for sure. What I really want to know is: Who is pushing overt, state-sponsored propaganda—RT, Sputnik news, some of the fringe outlets—and what audience in America are they trying to connect with? That helps me understand what the motivations are in terms of Putin’s regime and who they’re trying to gain ground with, and I can start to assess that in the US context.

We’ve had trouble with journalists jumping to conclusions on a hashtag spiking for 20 minutes—that means nothing. An influence account wants to look and talk like you. It’s like a salesman who goes into a pitch: You start off with rapport-building by mirroring who you’re trying to sell something to. You don’t enter into an audience and go, “Vladimir Putin is right!” What you do is you push hashtags or topics, mostly by retweeting or commenting on the audience’s tweets. And then the URLs are the influence. So, why would you see a Trump-supporting audience hashtag at the top, but the lead URL is RT? How do you explain that? The dashboard is the question, not the answer.
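[Editor’s note: As an illustration of the summary view Watts describes, here is a toy aggregation over a tracked set of tweets. The tweet shape and the state-media domain list are assumptions for the sketch, not a description of how Hamilton 68 was built.]

```python
# Toy dashboard summary: top hashtags vs. top link domains across a
# tracked account set. The tweet structure is an invented assumption.
from collections import Counter
from urllib.parse import urlparse

STATE_MEDIA = {"rt.com", "sputniknews.com"}  # outlets named in the interview

def summarize(tweets: list[dict], top_n: int = 10):
    """tweets: [{'hashtags': [...], 'urls': [...]}, ...] (assumed shape)."""
    hashtags, domains = Counter(), Counter()
    for t in tweets:
        hashtags.update(h.lower() for h in t.get("hashtags", []))
        domains.update(urlparse(u).netloc.lower().removeprefix("www.")
                       for u in t.get("urls", []))
    return hashtags.most_common(top_n), domains.most_common(top_n)

def lead_url_is_state_media(top_domains: list[tuple[str, int]]) -> bool:
    """The pattern Watts flags: audience-flavored hashtags on top,
    but the lead URL domain is overt state media."""
    return bool(top_domains) and top_domains[0][0] in STATE_MEDIA
```

A flag raised this way is the start of an inquiry, not a verdict; as Watts puts it, the dashboard is the question, not the answer.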

MJ: You write that government resources need to be allocated to effectively track and counter ongoing influence operations. What should that look like?

CW: For the entire government, it should be one effort. It should probably sit with the Director of National Intelligence in its Open Source Center. The dashboard was a minimum effort. This would be looking at what the full Kremlin package of disinfo is in all languages around the world, and what they’re pushing. You can’t do anything until you understand the battlefield. We would say this in real warfare: Do a terrain analysis. What does the info terrain look like? That should be the primary effort.

From there, it should be looking at who the disinformation influence agents are. This is different from how people think of agents in spy work. Who are the unwitting supporters, who maybe don’t realize it? And who are the witting supporters who think ideologically like the Kremlin? This is like Marine Le Pen in France, who will meet with Putin and has media and info channels that support that agenda and re-amplify what the Kremlin puts out. And then, who do we think are potential directed agents, who are being deployed by the Kremlin around the world to do information and influence operations? A great example is Montenegro. The closer you get to Moscow, the more you see how blended this is with real, physical provocations—real reporters who are overtly Russian and say, “I’m just a proud Russian who runs an independent news outlet in the Balkans.” Those are easier to identify, and I would build it out from there. You could come up with a framework like they had with the US Information Agency in the early 1990s. This is what we could provide to social media companies so they know and understand what they’re encountering.

The third stage would be: Can you understand the audiences they’re connecting with? And can you communicate to those audiences and make sure they know that this is coming out of the Kremlin? A lot of times that did not happen in the US going into election 2016. I would see audiences just repeating things, and I would know that this did not start here. I think Americans are a little smarter about it now.

MJ: What did you think about Trump’s recent elimination of the cybersecurity coordinator position on the National Security Council?

CW: It shows that our president does not get it. If a company’s email system takes a cyber hit, his reaction has been, “Then I guess you need to have runners”—like a pony express or something. Do you not understand how our world works? I watched the [May 17] NATO meeting, and he was obsessed with defense spending. Is all that defense spending going toward cyber? If not, who cares? All of us got beat on the cyber front, whether it was hacking or influence, over the last three or four years. So, who is in charge of that, where are the resources, and what’s the plan?

Some of the people at the very top in the national security staff don’t really get the era that we’re in. And we’re tangling with two adversaries right now—North Korea and Iran—that have shown outsized weight in cyberattacks on the US.

MJ: What about the tech companies? Facebook and Twitter have both said they will make ads more transparent. Facebook is partnering with the Atlantic Council to track disinformation. Are the companies doing enough to fight disinformation on their platforms?

CW: I’m actually more positive about Facebook than the public is. Look, all of them missed and they all did badly. But Facebook has mobilized a lot of resources. They were the first to start taking down accounts in advance of the French election. I think they get it, and they’re worried about the effect on their platform. Google is a weird animal, but YouTube is a concern. It is the hosting layer of disinfo campaigns, foreign or domestic. I don’t have a solid understanding of what they’re doing.

With Twitter, I thought their most positive step to date was when they started saying, “We found a way to police communities so we can downgrade bad behavior.” But by and large, I’m frustrated with Twitter’s slow reaction time, dragging their feet and refusing to deal with the anonymity vs. authenticity problem on their platform. Instead of extending the blue checkmark, they made it harder. Why not verify as many people as you can? Now that I have Russian trolls all over me all the time, I don’t pay attention to people who aren’t verified because I can’t tell if they’re real, and I don’t want to waste time paying attention to conversations from fake people. If people want to be verified, verify them. This will clear up a lot of the problem if everybody goes to verified accounts and ignores all the anonymous ones. The bots will die on their own. It becomes like spam.

MJ: Are there lessons from the fight against terrorism on social media that could be applied to the battle against disinformation?

CW: Shutdowns do have an effect. We saw that with ISIS. If you keep shutting them down, they will go away. This is where people confuse hacking and influence as being the same. Hackers will come back under another persona or another form. This is not the case with influence. Once you win an audience, you want to keep it. If you keep turning off their channel, it’s hard to build audience back up again. This is what we saw going back to al-Shabaab [militants in Somalia]. If you turn Shabaab off, and they violate terms of service and you shut them off again, and you shut them off again, eventually they stop coming back because they don’t know what channel to come to. This is important on YouTube, Facebook, or Twitter. This works with large disinfo accounts like [Kremlin-linked] TEN_GOP, or even with bad behavior, like Roger Stone. People have heard a lot less about Roger Stone since he’s been off Twitter.

Another thing I’ve always wanted to try against disinfo is an information ratings agency approach. Do sweeps with a ratings period—like they do in television—for information outlets. Don’t tell them when. They get a score, and then for the year, until they’re assessed again, they have that icon next to their feed on social media or in a search engine. If everybody did that, unrated outlets aren’t restricted from writing their disinformation, but fewer people are going to click on it. And those that perform well are more likely to get subscriptions. Paid services and ad revenue would go up. So, it rewards performance.
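[Editor’s note: A minimal sketch of how the unannounced-sweep rating Watts proposes might be represented. The fields, score definition, and validity period are invented for illustration.]

```python
# Hypothetical "ratings sweep" record: an outlet is scored during an
# unannounced sweep window, and the score stays displayed until the
# next assessment. All fields and inputs are invented.
from dataclasses import dataclass
from datetime import date

@dataclass
class OutletRating:
    outlet: str
    score: float        # e.g., share of sampled stories passing fact checks
    assessed_on: date
    valid_until: date   # icon shown next to the feed until this date

def badge(rating: OutletRating, today: date) -> str:
    if today > rating.valid_until:
        return f"{rating.outlet}: unrated (assessment lapsed)"
    return f"{rating.outlet}: {rating.score:.0%} as of {rating.assessed_on}"
```

The design point Watts stresses is the unannounced sweep: outlets can’t game a rating window they can’t predict.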

MJ: Do you think the public gets it—how information has been manipulated on social media?

CW: Certain segments are definitely smarter to it. It’s interesting that the extreme left and the extreme right have so much agreement that Russia did not influence the election and it’s all made up. One of them is from the extreme-transparency audience, and the other is from the extreme Trump-supporting audience. It’s interesting that those are the audiences Russia has tried to engage.

You’re seeing some withdrawal from information sources altogether because people don’t know what to believe. I worry about that. They’ve suffered information annihilation, and they’re tired. This is what happens in Russia. It leads to political apathy.
