ReMi: A Random Recurrent Neural Network Approach to Music Production

📅 2025-04-02
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Generative AI music synthesis faces challenges including high energy consumption, copyright infringement risk, and constrained creative expressivity. To address these, this work proposes a training-free, data-agnostic stochastic recurrent neural network (RNN) architecture, built from fully randomly initialized LSTM or GRU units, combined with temporal parameter controls and real-time audio synthesis interfaces. The method enables low-latency (<10 ms), milliwatt-level generation of controllable musical signals, bypassing end-to-end learning paradigms entirely. It supports interactive, improvisational generation of configurable musical elements, including arpeggios and low-frequency oscillators (LFOs), while drastically reducing computational and energy overhead and eliminating dataset-dependent copyright liabilities. The implementation is lightweight and has been integrated into an open-source music production workflow platform, establishing a human-AI co-creation paradigm that requires no model training and imposes no copyright burden on musicians.
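The core idea, an untrained recurrent network whose frozen random weights already produce rich bounded dynamics, can be sketched as below. This is a minimal illustration of the general technique, not the paper's implementation: the class name, the `scale` parameter, and the random linear readout are assumptions introduced for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RandomGRU:
    """A GRU cell with frozen, randomly initialized weights (no training)."""

    def __init__(self, hidden_size=16, input_size=1, scale=1.0, seed=0):
        rng = np.random.default_rng(seed)
        n, m = hidden_size, input_size
        # Update gate, reset gate, and candidate weights, all drawn at random
        # and never updated afterwards.
        self.Wz, self.Uz, self.bz = rng.normal(0, scale, (n, m)), rng.normal(0, scale, (n, n)), rng.normal(0, scale, n)
        self.Wr, self.Ur, self.br = rng.normal(0, scale, (n, m)), rng.normal(0, scale, (n, n)), rng.normal(0, scale, n)
        self.Wh, self.Uh, self.bh = rng.normal(0, scale, (n, m)), rng.normal(0, scale, (n, n)), rng.normal(0, scale, n)
        # Random linear readout collapses the hidden state to one control value.
        self.w_out = rng.normal(0, 1.0 / np.sqrt(n), n)
        self.h = np.zeros(n)

    def step(self, x):
        # Standard GRU update equations.
        z = sigmoid(self.Wz @ x + self.Uz @ self.h + self.bz)
        r = sigmoid(self.Wr @ x + self.Ur @ self.h + self.br)
        h_cand = np.tanh(self.Wh @ x + self.Uh @ (r * self.h) + self.bh)
        self.h = (1 - z) * self.h + z * h_cand
        # tanh keeps the output in (-1, 1), so it can serve as an LFO signal.
        return np.tanh(self.w_out @ self.h)

gru = RandomGRU(seed=42)
lfo = np.array([gru.step(np.array([1.0])) for _ in range(256)])
```

Because the weights are fixed at random, each step is a handful of small matrix-vector products, which is why latency and power can stay tiny; the `scale` parameter (a knob assumed here, analogous to the paper's temporal controls) shifts the dynamics between slowly drifting and rapidly oscillating signals.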

📝 Abstract
Generative artificial intelligence raises concerns related to energy consumption, copyright infringement and creative atrophy. We show that randomly initialized recurrent neural networks can produce arpeggios and low-frequency oscillations that are rich and configurable. In contrast to end-to-end music generation that aims to replace musicians, our approach expands their creativity while requiring no data and much less computational power. More information can be found at: https://allendia.com/
Problem

Research questions and friction points this paper is trying to address.

Addresses high energy consumption in generative AI music production
Reduces copyright risks by avoiding data-dependent training
Enhances musician creativity with minimal computational resources
Innovation

Methods, ideas, or system contributions that make the work stand out.

Randomly initialized recurrent neural networks
No data required for operation
Low computational power consumption
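One way a bounded network output could be turned into a playable arpeggio is by quantizing it onto a musical scale. This mapping is a hypothetical sketch (the scale choice, root note, and function are not from the paper), shown only to illustrate how a continuous data-free signal can drive discrete notes.

```python
import numpy as np

# Semitone offsets of a minor pentatonic scale (assumed for illustration).
SCALE = np.array([0, 3, 5, 7, 10])

def signal_to_arpeggio(signal, root=57, octaves=2):
    """Map each sample of a signal in (-1, 1) to the nearest MIDI pitch
    on the chosen scale, spread over `octaves` octaves above `root`."""
    degrees = np.concatenate([SCALE + 12 * o for o in range(octaves)])
    # Rescale (-1, 1) to an index into the available scale degrees.
    idx = np.clip(((signal + 1) / 2 * len(degrees)).astype(int),
                  0, len(degrees) - 1)
    return root + degrees[idx]

# Example: a sine sweep in place of the random RNN's control signal.
notes = signal_to_arpeggio(np.sin(np.linspace(0, 4 * np.pi, 16)))
```

Constraining notes to a scale keeps the output musically usable no matter what the untrained network emits, which matches the paper's framing of configurable, musician-controlled generation rather than end-to-end replacement.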
Hugo Chateau-Laurent
CNRS
Computational neuroscience
Tara Vanhatalo
Allendia, Inria centre of Bordeaux University