Relation-Aware Bayesian Optimization of DBMS Configurations Guided by Affinity Scores

πŸ“… 2025-10-30
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
Existing DBMS configuration tuning methods suffer from three key limitations: (1) they ignore inter-parameter dependencies, (2) they optimize only a subset of parameters to sidestep the curse of dimensionality, and (3) Bayesian optimization (BO) explores inefficiently because its surrogate models are unstable. To address these, we propose RelTune, a novel framework that constructs the first parameter dependency graph, leverages graph neural networks (GNNs) to jointly learn semantic representations and structural dependencies among configuration parameters, and introduces a hybrid BO mechanism that integrates GNN embeddings with affinity scoring to overcome the independence assumption and local-search bottlenecks. Extensive experiments across PostgreSQL, MySQL, and diverse workloads (TPC-C, JOB) show that RelTune converges 1.8–3.2× faster and outperforms state-of-the-art methods by 12%–27% in final performance, significantly improving both efficiency and robustness in high-dimensional configuration spaces.

πŸ“ Abstract
Database Management Systems (DBMSs) are fundamental for managing large-scale and heterogeneous data, and their performance is critically influenced by configuration parameters. Effective tuning of these parameters is essential for adapting to diverse workloads and maximizing throughput while minimizing latency. Recent research has focused on automated configuration optimization using machine learning; however, existing approaches still exhibit several key limitations. Most tuning frameworks disregard the dependencies among parameters, assuming that each operates independently. This simplification prevents optimizers from leveraging relational effects across parameters, limiting their capacity to capture performance-sensitive interactions. Moreover, to reduce the complexity of the high-dimensional search space, prior work often selects only the top few parameters for optimization, overlooking others that contribute meaningfully to performance. Bayesian Optimization (BO), the most common method for automatic tuning, is also constrained by its reliance on surrogate models, which can lead to unstable predictions and inefficient exploration. To overcome these limitations, we propose RelTune, a novel framework that represents parameter dependencies as a Relational Graph and learns GNN-based latent embeddings that encode performance-relevant semantics. RelTune further introduces Hybrid-Score-Guided Bayesian Optimization (HBO), which combines surrogate predictions with an Affinity Score measuring proximity to previously high-performing configurations. Experimental results on multiple DBMSs and workloads demonstrate that RelTune achieves faster convergence and higher optimization efficiency than conventional BO-based methods, achieving state-of-the-art performance across all evaluated scenarios.
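The abstract's Hybrid-Score-Guided idea can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the distance-based affinity form, the function names, and the mixing weight `alpha` are all assumptions; the paper's actual Affinity Score is defined over GNN latent embeddings.

```python
import numpy as np

def affinity_score(candidate, top_embeddings):
    """Hypothetical affinity: closeness of a candidate's embedding to
    previously high-performing configurations (higher = closer)."""
    dists = np.linalg.norm(top_embeddings - candidate, axis=1)
    return 1.0 / (1.0 + dists.min())

def hybrid_score(candidate, surrogate_pred, top_embeddings, alpha=0.7):
    """Blend the surrogate's predicted performance with the affinity
    term; alpha is an assumed mixing weight, not from the paper."""
    return alpha * surrogate_pred + (1 - alpha) * affinity_score(
        candidate, top_embeddings)

# Toy usage: three stored high-performing embeddings, one candidate
# that coincides with one of them (affinity = 1.0).
top = np.array([[0.0, 0.0], [1.0, 1.0], [0.5, 0.5]])
cand = np.array([0.5, 0.5])
print(hybrid_score(cand, surrogate_pred=0.8, top_embeddings=top))
```

Under this form, a candidate that the surrogate rates highly *and* that lies near known-good configurations scores best, which matches the stated goal of steering exploration away from unstable surrogate regions.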
Problem

Research questions and friction points this paper is trying to address.

Optimizing DBMS configurations by modeling parameter dependencies
Overcoming limitations of independent parameter assumptions in tuning
Improving Bayesian Optimization with relational graphs and affinity scores
Innovation

Methods, ideas, or system contributions that make the work stand out.

Models parameter dependencies with relational graph embeddings
Combines surrogate predictions with affinity score guidance
Uses GNN-based latent embeddings for performance semantics
Sein Kwon
Computer Science, Yonsei University, Seoul, Korea
Seulgi Baek
Computer Science, Yonsei University, Seoul, Korea
Hyunseo Yang
Computer Science, Yonsei University, Seoul, Korea
Youngwan Jo
Yonsei University
Computer Vision · Zero-shot Medical Image Segmentation · Video Anomaly Detection
Sanghyun Park
Computer Science, Yonsei University, Seoul, Korea