This briefing document synthesizes the joint report from the Ukrainian Centre for Strategic Communications and the NATO Strategic Communications Centre of Excellence. It outlines a structured, multi-source framework designed to attribute responsibility for Russian Information Influence Operations (IIOs) targeting Ukraine and its neighbors.

Executive Summary

The attribution of Information Influence Operations is a critical component of democratic resilience, enabling decision-makers to hold malign actors accountable and to justify proportional responses ranging from public exposure to legal action. Unlike cyber attribution, which relies heavily on technical signatures, IIO attribution requires the convergence of technical, behavioral, and contextual evidence.

Critical Takeaways:

  • The IIAF Framework: Attribution is built upon three pillars: Technical (digital traces), Behavioral (tactics and procedures), and Contextual (narratives and timing), supported by a legal and ethical assessment.
  • Convergence is Mandatory: No single line of evidence is sufficient. High-confidence attribution (≥80%) requires independent indicators from at least two categories.
  • The Spectrum of Responsibility: Attribution must distinguish between different levels of state involvement, from “state-encouraged” to “state-integrated,” to calibrate diplomatic and legal responses.
  • Narrative Laundering: Russian operations utilize a sophisticated three-stage process (Placement, Layering, and Integration) to obscure the origins of fabricated stories and grant them synthetic legitimacy.
  • Policy Context: Increasing regulatory standards, such as the EU’s Digital Services Act (DSA) and Foreign Information Manipulation and Interference (FIMI) framework, are raising the evidential threshold required to withstand “lawfare” and litigation from Russian-linked entities.

The Information Influence Attribution Framework (IIAF)

The IIAF provides a systematic approach to identifying the sources of manipulation. Evidence is categorized by its nature and the type of source (Open, Proprietary, or Classified).

Evidence Categories

| Evidence Type | Primary Focus | Examples of Data Points |
| --- | --- | --- |
| Technical | Digital traces and infrastructure metadata | IP addresses, WHOIS records, SSL certificates, platform engagement rates (ERR). |
| Behavioral | Tactics, Techniques, and Procedures (TTPs) | Coordinated inauthentic behavior (CIB), cross-posting intervals, source laundering. |
| Contextual | Content, timing, and geopolitical environment | Narrative alignment with state goals, temporal spikes around political events, linguistic markers. |

Technical Evidence: The Digital Foundation

Technical analysis provides objective, machine-readable data that reveals how operations are built and sustained.

  • Digital Infrastructure: Analysts trace domain names, hosting services, and DNS records. For example, the domain fondfbr.ru (linked to Yevgeny Prigozhin) was identified using WHOIS data showing registration via REG.RU (a registrar favored for avoiding Western takedowns) and the use of identity-anonymizing SSL certificates from Let’s Encrypt.
  • Platforms and Networks: Metadata from tools like TGStat can uncover artificial view inflation. The pro-Kremlin channel @yurasumy was found to have a 55% Engagement Rate by Reach (ERR), an anomaly for a channel with 3 million subscribers, suggesting bot involvement.
  • Circumvention Tactics: Following EU sanctions on RT and Sputnik, technical analysis identified “workaround” domains (e.g., actualidad-rt.com) through shared Google Analytics tracking IDs (UA codes) and identical nameservers (ns1.rttv.ru).
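The ERR heuristic described above can be sketched as a simple calculation and anomaly check. The view count, channel-size cutoff, and 30% threshold below are illustrative assumptions, not values from the report; only the 3-million-subscriber, 55% ERR figures mirror the @yurasumy example.

```python
def err_percent(avg_post_views: float, subscribers: int) -> float:
    """Engagement Rate by Reach: average post views as a share of subscribers."""
    if subscribers <= 0:
        raise ValueError("subscriber count must be positive")
    return 100.0 * avg_post_views / subscribers

def is_err_anomalous(err: float, subscribers: int,
                     large_channel: int = 1_000_000,
                     threshold: float = 30.0) -> bool:
    """Very large channels rarely sustain high ERR organically; a high value on a
    multi-million-subscriber channel suggests artificial view inflation.
    Both cutoffs are illustrative assumptions."""
    return subscribers >= large_channel and err >= threshold

# Hypothetical view count chosen to reproduce the reported 55% ERR at 3M subscribers.
err = err_percent(avg_post_views=1_650_000, subscribers=3_000_000)
print(err)                                    # 55.0
print(is_err_anomalous(err, 3_000_000))       # True
```

On a small channel the same ERR would be unremarkable, which is why the check conditions on channel size rather than ERR alone.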

Behavioral Evidence: Identifying Operational Logic

Behavioral analysis examines how messages are disseminated rather than just what is said.

  • Coordinated Inauthentic Behavior (CIB): Analysts look for near-simultaneous posting. In a case involving a fabricated clash between Georgian and Ukrainian soldiers, 17 Kremlin-linked outlets published the story almost at once, and sequencing anomalies showed Tass.ru publishing before its alleged Telegram source, indicating central coordination.
  • The DISARM Framework: This system catalogs approximately 391 specific behaviors (TTPs). In the “Polish annexation” narrative, DISARM mapping identified:
    - T0086.003: Deceptively Editing Images (creating “cheap fakes” of billboards).
    - T0097.202: News Outlet Persona (impersonating the BBC logo).
    - T0101: Creating Localized Content (distributing the narrative in Russian, French, Italian, and Turkish).
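The two CIB signals above, burst publication across many outlets and an aggregator publishing before its claimed source, can be sketched from timestamped post data. The outlet names, timestamps, window, and minimum-outlet count are all hypothetical parameters for illustration.

```python
from datetime import datetime, timedelta

def coordinated_clusters(posts, window_seconds=120, min_outlets=5):
    """posts: list of (outlet, datetime) for one story. Return groups of distinct
    outlets that published within `window_seconds` of the cluster's first post.
    Thresholds are illustrative assumptions."""
    posts = sorted(posts, key=lambda p: p[1])
    clusters, i = [], 0
    while i < len(posts):
        j = i
        while (j + 1 < len(posts) and
               (posts[j + 1][1] - posts[i][1]).total_seconds() <= window_seconds):
            j += 1
        outlets = {outlet for outlet, _ in posts[i:j + 1]}
        if len(outlets) >= min_outlets:
            clusters.append(sorted(outlets))
        i = j + 1
    return clusters

def sequencing_anomaly(publish_times, alleged_source):
    """Flag outlets that published *before* the account they cite as the source."""
    src_time = publish_times[alleged_source]
    return [o for o, t in publish_times.items()
            if o != alleged_source and t < src_time]

# Hypothetical data: six outlets posting within 75 seconds of each other,
# and an aggregator timestamped before its claimed Telegram source.
base = datetime(2024, 1, 10, 12, 0, 0)
posts = [(f"outlet{k}", base + timedelta(seconds=15 * k)) for k in range(6)]
times = {"tass.ru": base, "telegram_source": base + timedelta(minutes=5)}
print(coordinated_clusters(posts))
print(sequencing_anomaly(times, "telegram_source"))   # ['tass.ru']
```

Real pipelines would first match posts to the same story (e.g., by text similarity); this sketch assumes that matching has already been done.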

Contextual Evidence: Narratives and Strategic Alignment

Contextual analysis interprets the “why” behind an operation.

  • Narrative Laundering: This Soviet-era strategy obscures origins in three steps:
    - Placement: Seeding a fabricated story (e.g., the Olena Zelenska Cartier purchase video) in a private account.
    - Layering: Repetition via mixed outlets and inactive YouTube accounts.
    - Integration: Amplification by state media to reach mainstream audiences.
  • Temporal Analysis: Influence operations often spike around geopolitical milestones. Anti-mobilization campaigns on TikTok were precisely timed to the expiration of President Zelenskyy’s constitutional mandate in May 2024 to exploit domestic political tensions.
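The temporal-analysis idea, that operational volume clusters around geopolitical milestones, can be approximated by comparing posting volume inside a window around a known event with the volume outside it. The dates, counts, and window size below are invented for illustration and do not come from the report.

```python
from datetime import date

def event_window_ratio(daily_counts, event_day, window_days=3):
    """daily_counts: dict mapping date -> post count for one narrative.
    Compare average daily volume within `window_days` of `event_day` against
    the average outside it; ratios well above 1 suggest event-timed spikes."""
    inside, outside = [], []
    for day, n in daily_counts.items():
        (inside if abs((day - event_day).days) <= window_days else outside).append(n)
    if not inside or not outside:
        raise ValueError("need observations both inside and outside the window")
    return (sum(inside) / len(inside)) / (sum(outside) / len(outside))

# Hypothetical month: a flat baseline of 10 posts/day, rising to 60/day
# in a five-day window around an assumed event date.
counts = {date(2024, 5, d): 10 for d in range(1, 31)}
for d in range(18, 23):
    counts[date(2024, 5, d)] = 60
print(event_window_ratio(counts, date(2024, 5, 20), window_days=2))   # 6.0
```

A ratio near 1 would indicate the theme is ambient background noise; a large ratio supports the inference that messaging was timed to the event.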

Assessing Confidence and State Responsibility

Because attribution is rarely 100% certain, the IIAF utilizes probability scales and a spectrum of state involvement.

Confidence Intervals

Analysts use standardized language to communicate uncertainty consistently and reduce misinterpretation.

| Numeric Range | Qualitative Scale | Analytical Meaning |
| --- | --- | --- |
| 80–100% | High Confidence | Almost certain; completely reliable; confirmed. |
| 60–79% | Medium/High | Likely; probable; reliable. |
| 40–59% | Moderate | Even chance; roughly even; possibly true. |
| 20–39% | Low Confidence | Unlikely; improbable; doubtful. |
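Mapping a numeric estimate onto the standardized vocabulary is mechanical, which is part of the point: it removes analyst-to-analyst variation. A minimal sketch of the table above (the label for values below 20%, which the table does not define, is my assumption):

```python
def qualitative_confidence(p: float) -> str:
    """Map a numeric probability (in percent) onto the IIAF qualitative scale."""
    if not 0 <= p <= 100:
        raise ValueError("probability must be between 0 and 100")
    if p >= 80:
        return "High Confidence"
    if p >= 60:
        return "Medium/High"
    if p >= 40:
        return "Moderate"
    if p >= 20:
        return "Low Confidence"
    return "Remote"  # below the table's lowest band; label is an assumption

print(qualitative_confidence(85))   # High Confidence
print(qualitative_confidence(45))   # Moderate
```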

The Spectrum of State Responsibility

Determining the level of state involvement is essential for calibrating policy responses.

  1. State-Ignored: The state is aware but takes no official action.
  2. State-Encouraged: Controlled by third parties but encouraged as a matter of policy.
  3. State-Shaped: Third-party control with informal coordination or support (e.g., attending the same events).
  4. State-Coordinated: The state coordinates third-party actors, offering technical or tactical assistance covertly.
  5. State-Ordered: Use of third-party proxies under direct command and control.
  6. State-Executed/Integrated: Operations conducted directly by government staff using state infrastructure.

Case Study: Corruption Narrative Campaign (2023)

A deep-dive analysis into an operation targeting Ukrainian corruption illustrates the integration of the IIAF pillars:

  • Technical Evidence: Monitoring flagged 462 Russian-affiliated sources and 223 bots. Multiple channels reposted identical content within a 1-to-3-minute window, indicating automated scheduling.
  • Behavioral Evidence: The “Digital Army of Russia” Telegram channel provided multilingual comment templates for accounts to flood the pages of legitimate media outlets such as TSN and Hromadske.
  • Contextual Evidence: Spikes in messaging followed President Zelenskyy’s visit to the U.S. in December 2023. The narratives mirrored long-standing Kremlin tropes designed to fuel fatigue in Western donor countries.
  • Final Assessment: Based on the convergence of these indicators, analysts concluded with High Confidence (>80%) that the operation was State-Shaped to State-Coordinated by the Russian Federation.
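The convergence rule the case study applies, that high confidence requires independent indicators from at least two evidence categories, can be expressed as a small gating function. The function and its output strings are my own sketch of that rule, not an artifact of the framework itself.

```python
def attribution_confidence(indicators):
    """indicators: dict mapping an evidence category ('technical', 'behavioral',
    'contextual') to a count of independent indicators found.  High confidence
    (>=80%) requires supporting evidence from at least two categories."""
    supported = {cat for cat, n in indicators.items() if n > 0}
    if len(supported) >= 2:
        return "high-confidence attribution possible"
    if len(supported) == 1:
        return "single-category evidence: cap below high confidence"
    return "insufficient evidence"

# Counts loosely echoing the 2023 case study (462 sources, 223 bots on the
# technical side); the behavioral and contextual counts are placeholders.
print(attribution_confidence({"technical": 685, "behavioral": 1, "contextual": 2}))
```

Note the gate only bounds confidence from above; the analyst still assigns the numeric estimate within the permitted band.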

Conclusions and Recommendations

Attribution of Information Influence Operations is distinct from cyber attribution; it is built on open-source data of variable quality and relies heavily on the interpretation of patterns rather than static signatures.

Improving Attribution Practice

  • Standardized Reporting: Organizations should adopt shared confidence scales and explicitly document evidential gaps to remain resilient against legal challenges (“lawfare”).
  • Enhanced Data Access: There is a critical need for secure, vetted mechanisms to share proprietary platform data (telemetry) and ad-tech records with researchers.
  • Refined Language: Analysts must use precise terms (e.g., “state-shaped” vs. “state-ordered”) to improve accountability and support targeted sanctions.
  • Anticipatory Analysis: Instead of purely retrospective reports, institutions should establish baseline tracking of sensitive themes to detect anomalous spikes and narrative laundering in real time.
  • Capacity Building: Training and legal guidance should be provided to civil society and journalists, who often supply the bulk of open-source evidence but lack standardized toolsets.
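The anticipatory-analysis recommendation, baseline tracking of sensitive themes with real-time anomaly detection, can be sketched as a rolling-baseline check. The window length, z-score threshold, and sample data are all illustrative assumptions.

```python
import statistics

def anomalous_days(daily_counts, baseline_days=14, z_threshold=3.0):
    """daily_counts: chronologically ordered list of (day_label, count) for one
    tracked theme. Flag days whose count exceeds the rolling mean of the
    preceding `baseline_days` by more than `z_threshold` standard deviations."""
    flagged = []
    for i in range(baseline_days, len(daily_counts)):
        window = [n for _, n in daily_counts[i - baseline_days:i]]
        mean = statistics.fmean(window)
        stdev = statistics.pstdev(window) or 1.0  # avoid div-by-zero on flat baselines
        label, n = daily_counts[i]
        if (n - mean) / stdev > z_threshold:
            flagged.append(label)
    return flagged

# Hypothetical theme volume: two flat weeks of 10 posts/day, then a burst.
counts = [(f"day{i}", 10) for i in range(14)] + [("spike", 100)]
print(anomalous_days(counts))   # ['spike']
```

Running such a check continuously per theme turns retrospective attribution reports into alerts that can surface layering and integration activity while it is still under way.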