XAI-Driven Machine Learning System for Driving Style Recognition and Personalized Recommendations

📅 2025-08-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the interpretability and trustworthiness bottlenecks in driving style recognition for intelligent transportation systems, proposing an explainable AI framework that balances high accuracy with lightweight deployment. Methodologically, it constructs a high-fidelity CARLA-Drive simulation dataset; integrates tree-based models—including Random Forest and XGBoost—with SHAP for global feature importance analysis and instance-level attribution explanations; and applies model pruning and feature selection to achieve deployment-friendly efficiency. Its key contribution is the first deep integration of SHAP into a multi-model, three-class driving style classification task—achieving 0.92 accuracy while delivering fine-grained, human-understandable decision rationales. Experimental results demonstrate an optimal trade-off among predictive performance, interpretability, and engineering practicality, significantly enhancing driver trust in and willingness to adopt AI-generated recommendations.

📝 Abstract
Artificial intelligence (AI) is increasingly used in the automotive industry for applications such as driving style classification, which aims to improve road safety and efficiency and to personalize user experiences. While deep learning (DL) models, such as Long Short-Term Memory (LSTM) networks, excel at this task, their black-box nature limits interpretability and trust. This paper proposes a machine learning (ML)-based method that balances high accuracy with interpretability. We introduce a high-quality dataset, CARLA-Drive, and leverage ML techniques such as Random Forest (RF), Gradient Boosting (XGBoost), and Support Vector Machine (SVM), which are efficient, lightweight, and interpretable. In addition, we apply the SHAP (SHapley Additive exPlanations) explainability technique to provide personalized recommendations for safer driving. Achieving an accuracy of 0.92 on a three-class classification task with both the RF and XGBoost classifiers, our approach matches DL models in performance while offering the transparency and practicality required for real-world deployment in intelligent transportation systems.
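The pipeline the abstract describes (a tree-based classifier on driving telemetry, three style classes, feature-importance-based explanations) can be sketched as below. This is a minimal illustration, not the paper's implementation: the feature names and the synthetic data are assumptions standing in for CARLA-Drive, and scikit-learn's built-in `feature_importances_` stands in for the SHAP global importance analysis the authors use.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 1500
# Hypothetical telemetry features (illustrative names, not from the paper)
feature_names = ["mean_speed", "speed_std", "harsh_brake_rate",
                 "accel_std", "lane_change_rate"]
X = rng.normal(size=(n, len(feature_names)))

# Synthetic 3-class labels (calm / normal / aggressive), driven mostly by
# harsh braking and acceleration variance so the importances are interpretable
score = 1.5 * X[:, 2] + 1.0 * X[:, 3] + 0.3 * rng.normal(size=n)
y = np.digitize(score, np.quantile(score, [1 / 3, 2 / 3]))  # labels 0, 1, 2

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

acc = accuracy_score(y_te, clf.predict(X_te))
# Global importance proxy; the paper uses SHAP for per-instance attributions
top = feature_names[int(np.argmax(clf.feature_importances_))]
print(f"accuracy={acc:.2f}, most influential feature: {top}")
```

Replacing the last step with `shap.TreeExplainer(clf)` would yield the per-instance attributions the paper turns into personalized recommendations (e.g. flagging a high harsh-braking contribution for a given driver).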
Problem

Research questions and friction points this paper is trying to address.

- Develops an interpretable ML system for driving style recognition
- Balances high accuracy with model transparency using XAI
- Provides personalized safety recommendations for intelligent transportation
Innovation

Methods, ideas, or system contributions that make the work stand out.

- Uses interpretable ML models such as RF and XGBoost instead of black-box DL
- SHAP technique provides personalized driving recommendations
- Lightweight, transparent system matches DL performance
Feriel Amel Sellal
L3i - La Rochelle University, La Rochelle, France
Ahmed Ayoub Bellachia
Research Engineer
AI Security, Federated Learning, Blockchain, Applied Cryptography
Meryem Malak Dif
L3i - La Rochelle University, La Rochelle, France
Enguerrand De Rautlin De La Roy
L3i - La Rochelle University, La Rochelle, France
Mouhamed Amine Bouchiha
PostDoc, Institut Mines-Télécom, SudParis
Trust, Privacy, Blockchains, Federated Learning, LLMs
Yacine Ghamri-Doudane
L3i - La Rochelle University, La Rochelle, France