🤖 AI Summary
This study identifies systemic inconsistencies between Instagram’s official reporting under the EU’s Digital Services Act (DSA) and its actual content moderation practices and platform governance operations. Method: We propose a multi-layered consistency-analysis framework that evaluates platforms dynamically within interconnected digital ecosystems, integrating data cross-verification with qualitative comparative analysis to overcome the limitations of traditional, isolated procedural audits. Contribution/Results: The study provides the first empirical evidence of Instagram’s compliance gaps across risk disclosure, algorithmic transparency, and moderation efficacy, demonstrating the framework’s effectiveness in early-stage identification of compliance risks. The framework yields an actionable audit tool for DSA enforcement, advancing platform governance assessment from formal compliance toward substantive consistency.
📝 Abstract
The Digital Services Act (DSA) introduces harmonized rules for content moderation and platform governance in the European Union, mandating robust compliance mechanisms, particularly for very large online platforms and search engines. This study examines compliance with DSA requirements, using Instagram as a case study. We develop and apply a multi-level consistency framework to evaluate DSA compliance. Our findings contribute to the broader discussion on empirically based regulation, providing insight into how researchers, regulators, auditors, and platforms can better use DSA mechanisms to improve the quality and accountability of reporting and enforcement. This work underscores that consistency analysis can help detect potential compliance failures. It also demonstrates that platforms should be evaluated as part of an interconnected ecosystem rather than through isolated processes, which is crucial for effective compliance evaluation under the DSA.