
How New Knowledge Is Fighting Against Midterm Election Disinformation

Yonder | Oct 20, 2018

[Image: the New Knowledge Midterms Disinformation Dashboard]

Elections are weeks away. Just two years ago, Russia meddled in our 2016 presidential election, and we feel strongly that we must do something to prevent that from happening again.

New Knowledge is leveraging its disinformation defense technology to proactively track information warfare in the 2018 midterm elections, monitoring accounts that could jeopardize and delegitimize the American electoral process, promote domestic discord, and advance the foreign policy interests of Russia and its allies. We are not only detecting this activity, however; we are reporting and sharing it publicly, in real time, on our midterms disinformation dashboard, Disinfo 2018.

What is Disinfo 2018?

Disinfo2018.com will serve as a public-facing hub for all discoveries and reports of information warfare interfering in the upcoming midterm elections. Dashboards and blog posts will keep readers up to date in real time on the accounts we are tracking and their activity.

Our dashboard contains up-to-the-hour summary statistics from these accounts, so that readers can be aware of the foreign propaganda efforts aimed at American voters as the clock ticks toward November 6.

How to read the dashboard

The interactive dashboards are actively tracking disinformation networks targeting US audiences. This includes a breakdown of posts by day, top hashtags, and top keywords based on what these networks have amplified.
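
To make that breakdown concrete, the sketch below shows one simple way per-day counts and top hashtags/keywords could be computed from a collection of tracked posts. It is a minimal illustration only: the field names, sample data, and filtering rules are assumptions for the example, not a description of New Knowledge's actual pipeline.

```python
# Hypothetical aggregation sketch; data fields and rules are illustrative assumptions.
from collections import Counter
from datetime import datetime

# Example posts collected from monitored accounts (made-up data for illustration).
posts = [
    {"created_at": "2018-10-19T14:02:00", "text": "Get out and vote! #mondaymotivation"},
    {"created_at": "2018-10-19T18:45:00", "text": "The system is rigged #election2018"},
    {"created_at": "2018-10-20T09:10:00", "text": "Both sides are lying to you #election2018"},
]

def summarize(posts):
    """Break posts down by day and count the most-amplified hashtags and keywords."""
    posts_by_day = Counter()
    hashtags = Counter()
    keywords = Counter()
    for post in posts:
        day = datetime.fromisoformat(post["created_at"]).date().isoformat()
        posts_by_day[day] += 1
        for token in post["text"].lower().split():
            if token.startswith("#"):
                hashtags[token] += 1
            elif len(token) > 3:  # crude keyword filter, for illustration only
                keywords[token.strip(".,!?")] += 1
    return posts_by_day, hashtags.most_common(5), keywords.most_common(5)

by_day, top_hashtags, top_keywords = summarize(posts)
print(dict(by_day))      # posts per day
print(top_hashtags)      # most-amplified hashtags
print(top_keywords)      # most-amplified keywords
```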

It is important to remember that not every message communicated by the monitored accounts originates with those accounts. In fact, it is rare that a narrative or topic originates with these accounts at all. It is far more likely that the emergent trends in the dashboard are amplifications of messages that suit Russian interests.

While this dashboard is intended to be used as a go-to resource for monitoring Russian activity as related to the election, there are a few things worth noting:

  • Not every message communicated by the monitored accounts originates with those accounts. It is far more likely that the emergent trends in the dashboard reflect messages that suit Russian interests, for example by amplifying existing divisive issues in American social and political discourse. This also means it is almost always incorrect to simply identify a trending topic on this dashboard and attribute that topic to Russian origins.
  • Not all content being pushed out by these accounts will be obviously malicious. This includes content like “#mondaymotivation”, spam, and apolitical conversations. Some of this content is meant to grant the account an additional level of legitimacy, some of it is meant to rope other users into engaging with or following the account, and some of it is meant to co-opt a trending topic or hashtag in order to enhance the account's visibility and potentially influence the discussion around that topic. Rather than filter out such content, we include it in the dashboard so that users can be aware of the seemingly harmless places online where Russian operations take place.
  • Finally, it is critical to remember that just because a website is linked by accounts in this dashboard, that does not mean the website is a Russian propaganda outlet. Russian operations frequently make use of mainstream and extremist content, both to mask their identity and to selectively amplify existing messages that suit their interests. In particular, we have seen far-right and far-left partisan sites in the US heavily amplified as part of Russian operations. Just because a website, a message, or even a re-posted/retweeted account suits Russian interests at a particular moment does not mean it is a participant in the Russian operation.

Dashboard graphs

Because no analysis is 100% certain, we exercise, and encourage, care when making broad claims from this dashboard. It provides useful insight into activity online and is best used as a launching point for more detailed investigations, or as a cautionary filter through which to read social-media trends, not as a definitive one-stop shop for monitoring Russian operations.

Why are we doing this?

It is our company mission to protect public discourse: for brands, companies, organizations, and our democratic processes. It is important to us to leverage the technology we have created to protect our information ecosystem and give the public visibility into political influence operations. These operations, and the social-media accounts that participate in them, represent threats to public discourse and democracy in the United States.

You can access the dashboard at disinfo2018.com.
