Temporal fine-tuning for early risk detection

📅 2025-05-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses response latency and insufficient accuracy in early risk detection (ERD) for online mental health monitoring. The authors propose "temporal fine-tuning", a paradigm that explicitly models a user's posting chronology during Transformer training, enabling dynamic decision adjustment. Unlike conventional classification frameworks, the approach jointly optimizes precision and latency using time-sensitive evaluation metrics, particularly ERDE(θ). Experiments on Spanish-language depression and eating-disorder datasets, benchmarked against MentalRiskES 2023, show significant performance gains. To the authors' knowledge, this is the first work to coordinate the optimization of accuracy and latency within a single objective. The framework establishes a scalable, temporally aware modeling paradigm for multilingual mental health early-warning systems.

📝 Abstract
Early Risk Detection (ERD) on the Web aims to promptly identify users facing social and health issues. Users are analyzed post by post, and it is necessary to guarantee correct and quick answers, which is particularly challenging in critical scenarios. ERD involves optimizing classification precision while minimizing detection delay. Standard classification metrics may not suffice, so the field resorts to specific metrics such as ERDE(θ) that explicitly consider precision and delay. Current research focuses on applying a multi-objective approach, prioritizing classification performance and establishing a separate criterion for decision time. In this work, we propose a completely different strategy, temporal fine-tuning, which allows tuning transformer-based models by explicitly incorporating time within the learning process. Our method allows us to analyze complete user post histories, tune models considering different contexts, and evaluate training performance using temporal metrics. We evaluated our proposal on the depression and eating disorders tasks for the Spanish language, achieving competitive results compared to the best models of MentalRiskES 2023. We found that temporal fine-tuning optimized decisions considering context and time progress. In this way, by properly taking advantage of the power of transformers, it is possible to address ERD by combining precision and speed as a single objective.
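The abstract's central metric, ERDE(θ), penalizes correct positive decisions more heavily the later they are made, in addition to the usual false-positive and false-negative costs. A minimal sketch of the standard per-user formulation (cost constants `c_fp`, `c_fn`, `c_tp` and the sigmoid latency cost are the commonly used definitions, not values taken from this paper):

```python
import math

def erde(decision, truth, k, theta, c_fp=0.1, c_fn=1.0, c_tp=1.0):
    """Early Risk Detection Error for one user.

    decision, truth: 1 = at-risk, 0 = not at-risk.
    k: number of posts observed before deciding.
    theta: the delay threshold after which a true positive starts
           to cost nearly as much as missing the user entirely.
    """
    if decision == 1 and truth == 0:
        return c_fp                       # false alarm: fixed cost
    if decision == 0 and truth == 1:
        return c_fn                       # missed at-risk user: maximal cost
    if decision == 1 and truth == 1:
        # latency cost grows smoothly from ~0 to ~1 as k passes theta
        lc = 1.0 - 1.0 / (1.0 + math.exp(k - theta))
        return lc * c_tp
    return 0.0                            # correct rejection: no cost
```

System-level ERDE(θ) is then the mean of this cost over all users, which is why a model must be both accurate and early to score well.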
Problem

Research questions and friction points this paper is trying to address.

Optimizing precision and minimizing delay in early risk detection
Incorporating time explicitly in transformer-based model learning
Addressing depression and eating disorders detection in Spanish language
Innovation

Methods, ideas, or system contributions that make the work stand out.

Temporal fine-tuning for transformer-based models
Incorporates time within learning process explicitly
Combines precision and speed as single objective
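The listing does not spell out how time enters the learning process. One plausible sketch, consistent with "analyze complete user post histories" and "tune models considering different contexts", is to expand each user into incremental training examples that carry an explicit time marker (the function name and the `[TIME=k]` marker are illustrative assumptions, not the paper's actual notation):

```python
def temporal_examples(posts, label):
    """Expand one user's post history into incremental training examples.

    Each example contains the first k posts plus an explicit marker of
    how many posts have been seen, so a fine-tuned transformer can learn
    decisions that depend on both content and elapsed time.
    Illustrative sketch only; not the paper's exact procedure.
    """
    examples = []
    for k in range(1, len(posts) + 1):
        context = " ".join(posts[:k])
        examples.append((f"[TIME={k}] {context}", label))
    return examples
```

Training on such examples lets a single classification objective reward decisions that are correct given only the evidence available at each point in time, which is how precision and speed can be folded into one objective.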
Horacio Thompson
Universidad Nacional de San Luis, San Luis, Argentina; Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET)
Esaú Villatoro-Tello
Idiap Research Institute, Martigny, Switzerland
Manuel Montes-y-Gómez
National Institute of Astrophysics, Optics and Electronics
text mining, information retrieval, natural language processing
Marcelo Errecalde
Universidad Nacional de San Luis, San Luis, Argentina