AI Summary
This work proposes a novel paradigm for interactive architecture as an embodied performative agent that co-creates with human artists during rehearsal processes. We present a prototype system integrating spatial and vocal sensing, real-time interpretation via large language models, and explainable AI reasoning to dynamically adapt the physical environment while supporting iterative human–AI design in a virtual blueprint. By merging dramaturgical principles with explainable AI, the architecture assumes an active performative role, enhancing transparency and fostering richer artistic dialogue in collaborative creation. The resulting framework offers an experimental, interpretable, and collaborative AI-mediated approach to designing immersive performance spaces.
Abstract
As AI systems increasingly become embedded in interactive and immersive artistic environments, artists and technologists are discovering new opportunities to engage with their interpretive and autonomous capacities as creative collaborators in live performance. This work-in-progress outlines conceptual and technical foundations under which performance-makers and interactive architecture can collaborate within rehearsal settings. It introduces a rehearsal-oriented prototype system for shaping and testing AI-mediated environments within creative practice. This approach treats interactive architecture as a performative agent that senses spatial behaviour and speech, interprets these signals through a large language model, and generates real-time environmental adaptations. Designed for deployment in physical performance spaces, the system employs virtual blueprints to support iterative experimentation and creative dialogue between artists and AI agents, using reasoning traces to inform architectural interaction design grounded in dramaturgical principles.
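The sense–interpret–adapt loop described above can be sketched as follows. This is a minimal illustrative Python outline, not the authors' implementation: all class and field names (`PerformativeSpace`, `Adaptation`, `step`) are hypothetical, and a simple rule-based stub stands in for the large language model that performs interpretation in the actual system. The sketch's main point is that every environmental adaptation carries an explicit reasoning trace, mirroring the explainable-AI role the abstract describes.

```python
from dataclasses import dataclass


@dataclass
class Adaptation:
    """An environmental change paired with its explainable rationale."""
    action: str
    reasoning: str


class PerformativeSpace:
    """Hypothetical sense -> interpret -> adapt loop for an
    architecture-as-performer prototype. In the described system the
    interpret step would call an LLM on sensed spatial behaviour and
    speech; here a rule-based placeholder is used for illustration."""

    def __init__(self) -> None:
        # Accumulated reasoning traces, available for dramaturgical review.
        self.trace: list[str] = []

    def interpret(self, spatial: dict, speech: str) -> Adaptation:
        # Placeholder for LLM-based interpretation of multimodal signals.
        if "closer" in speech or spatial.get("proximity", 0.0) > 0.7:
            return Adaptation(
                action="contract walls",
                reasoning="performers clustered or requested closeness; "
                          "intimate scene inferred",
            )
        return Adaptation(
            action="open layout",
            reasoning="dispersed movement and neutral speech; "
                      "open staging inferred",
        )

    def step(self, spatial: dict, speech: str) -> str:
        """One rehearsal tick: sense inputs, interpret, adapt, log why."""
        adaptation = self.interpret(spatial, speech)
        self.trace.append(adaptation.reasoning)
        return adaptation.action
```

In use, each call to `step` both adapts the (virtual or physical) environment and appends a human-readable justification, so artists can interrogate why the space responded as it did, for example `space.step({"proximity": 0.9}, "")` returning `"contract walls"` with a matching entry in `space.trace`.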