🤖 AI Summary
This work addresses the offline policy evaluation (OPE) challenge of predicting human decisions in language-based persuasion games. We propose a cross-agent-space simulation training paradigm that integrates simulation over the full agent space with virtual decision-maker modeling to improve generalization to unseen expert agents. Leveraging 87K real human decision records, we construct a high-quality benchmark dataset that unifies simulated interaction generation, OPE methodology, and sequential decision modeling. Experiments show a 7.1% improvement in prediction accuracy on the most challenging top-15% of instances, substantially boosting the robustness and practicality of offline policy evaluation. The code and dataset are publicly released.
📝 Abstract
Recent advances in Large Language Models (LLMs) have spurred interest in designing LLM-based agents for tasks that involve interaction with human and artificial agents. This paper addresses a key aspect in the design of such agents: predicting human decisions in off-policy evaluation (OPE). We focus on language-based persuasion games, where an expert aims to influence the decision-maker through verbal messages. In our OPE framework, the prediction model is trained on human interaction data collected from encounters with one set of expert agents, and its performance is evaluated on interactions with a different set of experts. Using a dedicated application, we collected a dataset of 87K decisions from humans playing a repeated decision-making game with artificial agents. To enhance off-policy performance, we propose a simulation technique involving interactions across the entire agent space and simulated decision-makers. Our learning strategy yields significant OPE gains, e.g., improving prediction accuracy in the top 15% challenging cases by 7.1%. Our code and the large dataset we collected and generated are submitted as supplementary material and publicly available in our GitHub repository: https://github.com/eilamshapira/HumanChoicePrediction
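The OPE protocol described above trains a prediction model on human interactions with one set of expert agents and tests it on interactions with a disjoint set of experts. A minimal sketch of such a cross-expert split (illustrative only; the `Record` structure and `ope_split` helper are hypothetical, not the authors' code):

```python
# Sketch of a cross-expert OPE split: train on decisions collected against
# one set of expert agents, hold out interactions with unseen experts.
from collections import namedtuple

# Hypothetical record of one human decision in an interaction with an expert.
Record = namedtuple("Record", ["expert_id", "features", "human_decision"])

def ope_split(records, train_experts):
    """Partition records by whether the producing expert is in the train set."""
    train = [r for r in records if r.expert_id in train_experts]
    test = [r for r in records if r.expert_id not in train_experts]
    return train, test

# Toy data: decisions collected against three different expert agents.
records = [
    Record("expert_A", [0.2], 1),
    Record("expert_B", [0.7], 0),
    Record("expert_C", [0.5], 1),
]
train, test = ope_split(records, train_experts={"expert_A", "expert_B"})
```

The key property is that the test experts never appear in training, so accuracy on `test` measures off-policy generalization rather than memorization of particular experts' messaging styles.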