Less is More: Some Computational Principles Based on Parsimony, and Limitations of Natural Intelligence

📅 2025-06-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
Natural intelligence (NI) achieves efficient learning, generalization, and creativity under severe constraints on neural resources, energy, and data, contrasting sharply with mainstream AI's reliance on massive computation and labeled datasets. Method: This paper proposes a "less-is-more" computational paradigm centered on sparsity, chaotic itinerancy, reservoir computing, and embodied active learning. It systematically analyzes how constraints foster the emergence of intelligence; demonstrates that bandwidth limitations spontaneously induce symbolic-like representations and hierarchical spiking structures; and integrates chaotic dynamics and intrinsic motivation into learning mechanisms. The framework leverages spiking neural networks, developmental cognitive modeling, and embodied interaction. Contribution/Results: The authors implement a low-energy, few-shot, highly generalizable AI prototype that achieves infant-level learning efficiency in language acquisition and sensorimotor tasks while remaining interpretable and neurobiologically plausible.

📝 Abstract
Natural intelligence (NI) consistently achieves more with less. Infants learn language, develop abstract concepts, and acquire sensorimotor skills from sparse data, all within tight neural and energy limits. In contrast, today's AI relies on virtually unlimited computational power, energy, and data to reach high performance. This paper argues that constraints in NI are paradoxically catalysts for efficiency, adaptability, and creativity. We first show how limited neural bandwidth promotes concise codes that still capture complex patterns. Spiking neurons, hierarchical structures, and symbolic-like representations emerge naturally from bandwidth constraints, enabling robust generalization. Next, we discuss chaotic itinerancy, illustrating how the brain transits among transient attractors to flexibly retrieve memories and manage uncertainty. We then highlight reservoir computing, where random projections facilitate rapid generalization from small datasets. Drawing on developmental perspectives, we emphasize how intrinsic motivation, along with responsive social environments, drives infant language learning and discovery of meaning. Such active, embodied processes are largely absent in current AI. Finally, we suggest that adopting 'less is more' principles -- energy constraints, parsimonious architectures, and real-world interaction -- can foster the emergence of more efficient, interpretable, and biologically grounded artificial systems.
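The reservoir computing idea mentioned in the abstract — fixed random projections plus a trained linear readout generalizing from little data — can be sketched in a minimal echo state network. This is a generic illustration, not the paper's implementation; the toy task (predicting a phase-shifted sine from a sine input), the reservoir size, and the sparsity and spectral-radius values are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy task: map a sine input to a phase-shifted copy.
T, n_res = 300, 100
t_grid = np.linspace(0, 8 * np.pi, T)
u = np.sin(t_grid)                                  # input signal
y_target = np.sin(t_grid + 0.5)                     # phase-shifted target

# Fixed random projections: input weights and sparse recurrent weights.
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.normal(0, 1, size=(n_res, n_res))
W *= rng.random((n_res, n_res)) < 0.1               # ~10% connectivity (sparse)
W *= 0.9 / max(abs(np.linalg.eigvals(W)))           # spectral radius < 1 (echo state)

# Drive the reservoir; only the linear readout is ever trained.
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

washout = 50                                        # discard the initial transient
W_out = np.linalg.lstsq(states[washout:], y_target[washout:], rcond=None)[0]
y_pred = states @ W_out
mse = np.mean((y_pred[washout:] - y_target[washout:]) ** 2)
print(f"readout MSE: {mse:.4f}")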
Problem

Research questions and friction points this paper is trying to address.

Explores how natural intelligence achieves efficiency with limited resources
Analyzes constraints in neural systems as catalysts for creativity and adaptability
Proposes bio-inspired principles to improve AI efficiency and interpretability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Limited neural bandwidth promotes concise codes
Chaotic itinerancy enables flexible memory retrieval
Reservoir computing facilitates rapid generalization
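The first bullet — limited bandwidth promoting concise codes — can be illustrated with a tiny sparse-coding sketch. This is a generic demonstration under assumed conditions (a signal that is genuinely k-sparse in an orthonormal dictionary), not a model from the paper: a "narrow channel" of only k active units suffices to carry the whole signal.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: a 64-d signal that is 4-sparse in an orthonormal
# basis, transmitted through a "narrow channel" of only k active units.
n, k = 64, 4
D, _ = np.linalg.qr(rng.normal(size=(n, n)))      # orthonormal dictionary

s_true = np.zeros(n)
s_true[rng.choice(n, k, replace=False)] = 3.0
x = D @ s_true                                    # dense-looking signal

# Concise code: keep only the k strongest responses (k-winners-take-all).
resp = D.T @ x
active = np.argsort(np.abs(resp))[-k:]
code = np.zeros(n)
code[active] = resp[active]                       # k numbers carry everything

x_hat = D @ code
err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
print(f"{k}/{n} active units, relative reconstruction error {err:.2e}")
```

When the signal truly is sparse in the code's basis, the k-winners-take-all bottleneck loses nothing; the bandwidth limit forces the representation to be concise rather than degrading it.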
Laura Cohen
ETIS laboratory, CNRS UMR8051, CY Cergy-Paris University, ENSEA
Xavier Hinaut
Inria, Bordeaux, France
Reservoir Computing, Recurrent Neural Networks, Language Processing, Birdsong, Sensorimotor model
Lilyana Petrova
ETIS laboratory, CNRS UMR8051, CY Cergy-Paris University, ENSEA
Alexandre Pitti
Full Professor, CY Cergy-Paris Université, ETIS Laboratory, CNRS UMR 8051, ENSEA
Artificial Intelligence, Neural Networks, Bio-Inspired Robotics, Embodiment, Developmental Robotics
Syd Reynal
ETIS laboratory, CNRS UMR8051, CY Cergy-Paris University, ENSEA
Ichiro Tsuda
AIT Center, Sapporo City University, Sapporo, Japan