The dynamic interplay between in-context and in-weight learning in humans and neural networks

📅 2024-02-13
📈 Citations: 1
Influential: 0
🤖 AI Summary
The dual nature of human learning—characterized by rapid logical inference and slow trial-and-error adaptation—lacks a unified explanation within standard neural network weight-update paradigms. Method: We propose a dynamic computational framework integrating in-context learning (ICL) and in-weight learning (IWL), modeling ICL as an emergent capability coexisting with IWL in large language models. Using meta-learning–driven behavioral modeling and computational simulation, we characterize their competitive and cooperative dynamics across key cognitive phenomena: curriculum structure effects, compositional generalization, and the flexibility–stability trade-off. Results: Our framework successfully reproduces canonical human behavioral patterns in category learning and compositional reasoning tasks. It constitutes the first computationally explicit, empirically testable deep learning instantiation of dual-process cognitive theory, thereby bridging a fundamental gap between cognitive psychology and modern neural network theory.

📝 Abstract
Human learning embodies a striking duality: sometimes, we appear capable of following logical, compositional rules and benefit from structured curricula (e.g., in formal education), while other times, we rely on an incremental approach or trial-and-error, learning better from curricula that are randomly interleaved. Influential psychological theories explain this seemingly disparate behavioral evidence by positing two qualitatively different learning systems -- one for rapid, rule-based inferences and another for slow, incremental adaptation. It remains unclear how to reconcile such theories with neural networks, which learn via incremental weight updates and are thus a natural model for the latter type of learning, but are not obviously compatible with the former. However, recent evidence suggests that metalearning neural networks and large language models are capable of "in-context learning" (ICL) -- the ability to flexibly grasp the structure of a new task from a few examples. Here, we show that the dynamic interplay between ICL and default in-weight learning (IWL) naturally captures a broad range of learning phenomena observed in humans, reproducing curriculum effects on category-learning and compositional tasks, and recapitulating a tradeoff between flexibility and retention. Our work shows how emergent ICL can equip neural networks with fundamentally different learning properties that can coexist with their native IWL, thus offering a novel perspective on dual-process theories and human cognitive flexibility.
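
The setup the abstract describes can be made concrete with a small sketch. Below is a minimal, hypothetical PyTorch model (not the authors' architecture; EpisodeTransformer, its dimensions, and the token encoding are illustrative assumptions): a tiny transformer reads an episode of (stimulus, label) pairs followed by a query whose label is masked, so the query can be answered either from the surrounding context (ICL) or from regularities stored in the weights (IWL).

```python
# Minimal sketch (not the authors' code) of a model in which ICL and IWL
# can coexist: a small transformer processes an episode of labeled examples
# followed by an unlabeled query and predicts the query's label.
import torch
import torch.nn as nn

class EpisodeTransformer(nn.Module):
    """Sequence model over (stimulus, label) tokens; the query label is masked."""
    def __init__(self, n_features=8, n_classes=4, d_model=64):
        super().__init__()
        self.embed = nn.Linear(n_features + n_classes, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.readout = nn.Linear(d_model, n_classes)

    def forward(self, xs, ys_onehot):
        # xs: (batch, seq, n_features); ys_onehot is all zeros at the final
        # (query) position, so its label must come from the context examples
        # (ICL) or from what training has stored in the weights (IWL).
        tokens = self.embed(torch.cat([xs, ys_onehot], dim=-1))
        h = self.encoder(tokens)
        return self.readout(h[:, -1])  # class logits for the query item
```

Whether such a model answers the query from context or from its weights is determined by the training distribution, which is the lever the paper manipulates to reproduce curriculum effects.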
Problem

Research questions and friction points this paper is trying to address.

Understanding the interplay between in-context and in-weight learning
Reconciling dual-process theories with neural network learning
Modeling human cognitive flexibility via emergent ICL
Innovation

Methods, ideas, or system contributions that make the work stand out.

Metalearning neural networks enable in-context learning
Dynamic interplay between in-context and in-weight learning (see the episode sketch below)
Neural networks mimic human dual-learning systems
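
As a companion to the model sketch above, here is a hypothetical pair of episode generators (again an illustrative assumption, not the paper's task code) showing how the training distribution controls which system dominates: fresh random label assignments every episode make context the only route to the answer (ICL pressure), while a single fixed rule repeated across episodes can be absorbed into the weights (IWL pressure).

```python
# Illustrative episode generators; names, shapes, and the linear rule
# are assumptions for the sketch, not the paper's actual tasks.
import torch
import torch.nn.functional as F

def icl_episode(batch=32, n_ctx=8, n_features=8, n_classes=4):
    # Labels are reassigned at random every episode, and the query repeats
    # one context item, so its label is recoverable only from context.
    items = torch.randn(batch, n_ctx, n_features)
    labels = torch.randint(n_classes, (batch, n_ctx))
    q = torch.randint(n_ctx, (batch,))
    query = items[torch.arange(batch), q].unsqueeze(1)
    xs = torch.cat([items, query], dim=1)
    ys = F.one_hot(labels, n_classes).float()
    ys = torch.cat([ys, torch.zeros(batch, 1, n_classes)], dim=1)  # mask query label
    return xs, ys, labels[torch.arange(batch), q]

def iwl_episode(fixed_w, batch=32, n_ctx=8, n_classes=4):
    # The same linear rule generates labels in every episode, so the
    # mapping can migrate into the network's weights over training.
    xs = torch.randn(batch, n_ctx + 1, fixed_w.shape[0])
    labels = (xs @ fixed_w).argmax(-1)
    ys = F.one_hot(labels, n_classes).float()
    ys[:, -1] = 0.0  # mask query label
    return xs, ys, labels[:, -1]
```

Training the sketch model with cross-entropy on the query logits under icl_episode batches amounts to a meta-training regime (a new task per episode), whereas repeated iwl_episode batches amount to ordinary single-task learning; interleaving the two is one way to probe the competitive and cooperative dynamics the paper studies.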
Jacob Russin
Department of Computer Science, Department of Cognitive, Linguistic, and Psychological Sciences, Brown University
Ellie Pavlick
Brown University
Natural Language Processing
Michael J. Frank
Department of Cognitive, Linguistic, and Psychological Sciences, Carney Institute for Brain Science, Brown University