🤖 AI Summary
Existing research lacks a systematic framework covering the full lifecycle of technology-facilitated abuse (TFA), resulting in uneven institutional coverage and limited accessibility of support resources. Method: this study develops the first literature-derived, unified TFA taxonomy and conducts a large-scale zero-shot thematic audit of 52,605 webpages across 306 Australian institutional websites, integrating sentiment analysis and Flesch readability scoring. Contribution/results: by combining zero-shot classification with web-content auditing, we provide the first cross-type, cross-pathway quantification of the breadth of institutional responses. Findings reveal that only 30% of TFA types receive broad coverage; roughly 70% of content addresses harassment and sexual abuse, while critical forms such as covert surveillance and economic control appear in under 1% of resources. Moreover, most materials are written at reading levels beyond what many high-risk users can comfortably process, impeding comprehension and help-seeking. The proposed framework enables cross-national benchmarking and evidence-informed policy optimisation.
📝 Abstract
Technology-Facilitated Abuse (TFA) encompasses a broad and rapidly evolving set of behaviours in which digital systems are used to harass, monitor, threaten, or control individuals. Although prior research has documented many forms of TFA, there is no consolidated framework for understanding how abuse types, prevention measures, detection mechanisms, and support pathways relate across the abuse life cycle. This paper contributes a unified, literature-derived taxonomy of TFA grounded in a structured review of peer-reviewed studies, and the first large-scale, taxonomy-aligned audit of institutional web resources in Australia. We crawl 306 government, non-government, and service-provider domains, obtaining 52,605 pages, and apply zero-shot topic classification to map web content onto our taxonomy. Emotion and readability analyses reveal how institutions frame TFA and how accessible their guidance is to the public. Our findings show that institutional websites cover only a narrow subset of the harms emphasised in the literature: approximately 70% of all abuse-labelled pages focus on harassment, comment abuse, or sexual abuse, while less than 1% address covert surveillance, economic abuse, or long-term controlling behaviours. Support pathways are similarly limited, with most resources centred on digital information hubs rather than counselling or community-based services. Readability analysis further shows that much of this content is written at late secondary or early tertiary reading levels, which may be inaccessible to a substantial portion of at-risk users. By highlighting strengths and gaps in Australia's support for TFA, our taxonomy and audit method provide a scalable basis for evaluating institutional communication, improving survivor resources, and guiding safer digital ecosystems. The taxonomy also provides a foundation for comparable analyses in other national contexts, fostering broader TFA awareness.
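The readability analysis above is based on Flesch scoring. As an illustration only (the paper does not specify its implementation, and production tools such as `textstat` use dictionary-backed syllable counts), a minimal sketch of the Flesch Reading Ease formula with a crude vowel-group syllable heuristic might look like this:

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count vowel groups, discount a trailing silent "e".
    # Real readability tools use pronunciation dictionaries instead.
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_reading_ease(text: str) -> float:
    # Flesch (1948): 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    # Higher scores mean easier text; ~60-70 is "plain English",
    # while scores below ~50 correspond to the late secondary or
    # tertiary reading levels flagged in the audit.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Short, common-word sentences score well above 90, whereas dense institutional prose drops sharply, which is the pattern the audit reports for much of the crawled guidance.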