AI Summary
Large language models (LLMs) suffer from spatial hallucinations, weak geographic reasoning, and difficulty satisfying walkability constraints in urban route recommendation. To address these issues, this paper proposes WalkRAG, the first walkability-aware, spatially enhanced retrieval-augmented generation (RAG) framework. WalkRAG integrates geographic information systems (GIS), spatial indexing and querying techniques, multi-granularity geographic knowledge retrieval, and conversational LLMs to enable spatially constrained path generation and interactive exploration. It explicitly models pedestrian accessibility, point-of-interest (POI) semantics, and user preferences to mitigate factual errors in location-based services. Experimental results demonstrate that WalkRAG improves factual accuracy by 32.7% on dynamic walking route and POI recommendation tasks, while increasing user satisfaction by 41.5%.
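To make the retrieve-then-generate pattern described above concrete, here is a minimal sketch of the spatial retrieval step that grounds an LLM's answer in nearby POIs. All POI names, coordinates, and the 1 km radius are illustrative assumptions, not data from the paper; a production system would replace the linear scan with a proper spatial index (e.g., an R-tree) as the summary suggests.

```python
import math

# Hypothetical mini POI catalog; names and coordinates are illustrative only.
POIS = [
    {"name": "Fountain Plaza", "lat": 40.7580, "lon": -73.9855, "tags": ["landmark"]},
    {"name": "Riverside Cafe", "lat": 40.7614, "lon": -73.9776, "tags": ["cafe"]},
    {"name": "Far Away Museum", "lat": 40.6892, "lon": -74.0445, "tags": ["museum"]},
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def retrieve_pois(lat, lon, radius_m=1000):
    """Spatial retrieval step: keep only POIs within radius_m of a waypoint."""
    return [p for p in POIS if haversine_m(lat, lon, p["lat"], p["lon"]) <= radius_m]

def build_context(lat, lon):
    """Format retrieved POIs as grounding text to prepend to the LLM prompt."""
    pois = retrieve_pois(lat, lon)
    return "\n".join(f"- {p['name']} ({', '.join(p['tags'])})" for p in pois)
```

The generation step would then pass `build_context(...)` alongside the user's request, so the model answers from retrieved geographic facts rather than parametric memory.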
Abstract
Large Language Models (LLMs) have become foundational tools in artificial intelligence, supporting a wide range of applications beyond traditional natural language processing, including urban systems and tourist recommendations. However, their tendency to hallucinate and their limitations in spatial retrieval and reasoning are well known, pointing to the need for novel solutions. Retrieval-augmented generation (RAG) has recently emerged as a promising way to enhance LLMs with accurate, domain-specific, and timely information. Spatial RAG extends this approach to tasks involving geographic understanding. In this work, we introduce WalkRAG, a spatial RAG-based framework with a conversational interface for recommending walkable urban itineraries. Users can request routes that meet specific spatial constraints and preferences while interactively retrieving information about the path and points of interest (POIs) along the way. Preliminary results show the effectiveness of combining information retrieval, spatial reasoning, and LLMs to support urban discovery.