Open-Source LLM-Driven Federated Transformer for Predictive IoV Management

📅 2025-05-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
Centralized architectures in the Internet of Vehicles (IoV) suffer from high latency, poor scalability, and significant privacy risks; meanwhile, prompt optimization for large language models (LLMs) in federated learning (FL) settings—and their application to traffic prediction—remains unexplored. Method: This paper proposes the first FL-based traffic forecasting framework leveraging an open-source LLM (Pythia-1B), featuring a novel two-tier FL architecture and a federated prompt optimization mechanism. It integrates Transformer-based generation of high-fidelity synthetic traffic data (in NGSIM format) and enables edge-cloud collaborative inference. Contribution/Results: Experiments on real-world datasets achieve 99.86% trajectory prediction accuracy—substantially outperforming centralized and conventional FL baselines. This work is the first to empirically validate the feasibility and high performance of open-source LLMs for privacy-preserving, scalable, low-latency vehicle-infrastructure cooperative management.

📝 Abstract
The proliferation of connected vehicles within the Internet of Vehicles (IoV) ecosystem presents critical challenges in ensuring scalable, real-time, and privacy-preserving traffic management. Existing centralized IoV solutions often suffer from high latency, limited scalability, and reliance on proprietary Artificial Intelligence (AI) models, creating significant barriers to widespread deployment, particularly in dynamic and privacy-sensitive environments. Meanwhile, integrating Large Language Models (LLMs) in vehicular systems remains underexplored, especially concerning prompt optimization and effective utilization in federated contexts. To address these challenges, we propose the Federated Prompt-Optimized Traffic Transformer (FPoTT), a novel framework that leverages open-source LLMs for predictive IoV management. FPoTT introduces a dynamic prompt optimization mechanism that iteratively refines textual prompts to enhance trajectory prediction. The architecture employs a dual-layer federated learning paradigm, combining lightweight edge models for real-time inference with cloud-based LLMs to retain global intelligence. A Transformer-driven synthetic data generator is incorporated to augment training with diverse, high-fidelity traffic scenarios in the Next Generation Simulation (NGSIM) format. Extensive evaluations demonstrate that FPoTT, utilizing EleutherAI Pythia-1B, achieves 99.86% prediction accuracy on real-world data while maintaining high performance on synthetic datasets. These results underscore the potential of open-source LLMs in enabling secure, adaptive, and scalable IoV management, offering a promising alternative to proprietary solutions in smart mobility ecosystems.
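The dual-layer federated paradigm described above (lightweight edge models aggregated locally, with a cloud tier retaining global intelligence) can be illustrated with a minimal two-tier FedAvg-style sketch. All names here (`fed_avg`, `two_tier_round`) are illustrative; the paper's actual FPoTT aggregation procedure may differ.

```python
# Hypothetical sketch of one dual-layer federated averaging round.
# Tier 1: edge aggregators average the models of their local vehicles.
# Tier 2: the cloud averages the edge aggregates, weighted by group size.
from typing import List

Weights = List[float]  # a model's parameters as a flat vector

def fed_avg(models: List[Weights], sizes: List[int]) -> Weights:
    """Weighted average of parameter vectors (FedAvg-style)."""
    total = sum(sizes)
    dim = len(models[0])
    return [sum(w[i] * n for w, n in zip(models, sizes)) / total
            for i in range(dim)]

def two_tier_round(edge_groups: List[List[Weights]],
                   group_sizes: List[List[int]]) -> Weights:
    """Aggregate per-edge, then aggregate the edge results at the cloud."""
    edge_aggregates = [fed_avg(ms, ns)
                       for ms, ns in zip(edge_groups, group_sizes)]
    totals = [sum(ns) for ns in group_sizes]
    return fed_avg(edge_aggregates, totals)

# Two edge groups: one with two vehicles (10 samples each), one with a
# single vehicle holding 20 samples.
global_model = two_tier_round(
    [[[1.0, 2.0], [3.0, 4.0]], [[5.0, 6.0]]],
    [[10, 10], [20]],
)
```

The weighting by sample counts at both tiers keeps the two-tier result consistent with what a single flat FedAvg over all vehicles would produce.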
Problem

Research questions and friction points this paper is trying to address.

Ensuring scalable, real-time, privacy-preserving IoV traffic management
Overcoming high latency and limited scalability in centralized IoV solutions
Integrating LLMs effectively in federated vehicular systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Open-source LLMs for predictive IoV management
Dual-layer federated learning paradigm
Transformer-driven synthetic data generator
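The dynamic prompt optimization mechanism iteratively refines textual prompts to improve trajectory prediction. A minimal greedy refinement loop conveys the idea; the scoring function, mutation operator, and how candidates are exchanged in the federated setting are assumptions here, not the paper's actual algorithm.

```python
# Hypothetical sketch of iterative prompt refinement: score the current
# best prompt, propose a mutated variant each round, and keep whichever
# scores higher. In a federated setting, candidates could be proposed by
# clients and the server would retain the highest-scoring prompt.
from typing import Callable, List

def optimize_prompt(candidates: List[str],
                    score: Callable[[str], float],
                    mutate: Callable[[str], str],
                    rounds: int = 5) -> str:
    """Greedy hill-climbing over prompt strings."""
    best = max(candidates, key=score)
    for _ in range(rounds):
        challenger = mutate(best)
        if score(challenger) > score(best):
            best = challenger
    return best

# Toy usage: "quality" is string length, mutation appends a character.
refined = optimize_prompt(["a", "bb"], len, lambda p: p + "!", rounds=3)
```

In practice the score would come from validation accuracy of the LLM's trajectory predictions under each candidate prompt, which is far more expensive than this toy example suggests.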