AI Summary
Existing DBMS configuration tuning methods suffer from three key limitations: (1) ignoring inter-parameter dependencies, (2) optimizing only a subset of parameters due to curse-of-dimensionality constraints, and (3) low exploration efficiency in Bayesian optimization (BO) caused by unstable surrogate models. To address these, we propose RelTune, a novel framework that constructs the first parameter dependency graph, leverages graph neural networks (GNNs) to jointly learn semantic representations and structural dependencies among configuration parameters, and introduces a hybrid BO mechanism integrating GNN embeddings with affinity scoring to overcome the independence assumption and local-search bottlenecks. Extensive experiments across PostgreSQL, MySQL, and diverse workloads (TPC-C, JOB) demonstrate that RelTune achieves 1.8–3.2× faster convergence and outperforms state-of-the-art methods by 12%–27% in final performance, significantly enhancing both efficiency and robustness in high-dimensional configuration spaces.
Abstract
Database Management Systems (DBMSs) are fundamental for managing large-scale and heterogeneous data, and their performance is critically influenced by configuration parameters. Effective tuning of these parameters is essential for adapting to diverse workloads and maximizing throughput while minimizing latency. Recent research has focused on automated configuration optimization using machine learning; however, existing approaches still exhibit several key limitations. Most tuning frameworks disregard the dependencies among parameters, assuming that each operates independently. This simplification prevents optimizers from leveraging relational effects across parameters, limiting their capacity to capture performance-sensitive interactions. Moreover, to reduce the complexity of the high-dimensional search space, prior work often selects only the top few parameters for optimization, overlooking others that contribute meaningfully to performance. Bayesian Optimization (BO), the most common method for automatic tuning, is also constrained by its reliance on surrogate models, which can lead to unstable predictions and inefficient exploration. To overcome these limitations, we propose RelTune, a novel framework that represents parameter dependencies as a Relational Graph and learns GNN-based latent embeddings that encode performance-relevant semantics. RelTune further introduces Hybrid-Score-Guided Bayesian Optimization (HBO), which combines surrogate predictions with an Affinity Score measuring proximity to previously high-performing configurations. Experimental results on multiple DBMSs and workloads demonstrate that RelTune achieves faster convergence and higher optimization efficiency than conventional BO-based methods, achieving state-of-the-art performance across all evaluated scenarios.
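To make the hybrid-scoring idea concrete, the sketch below shows one plausible form of such a score: a weighted blend of a surrogate model's predicted performance and an affinity term measuring embedding-space proximity to previously high-performing configurations. The weighting `alpha`, the use of cosine similarity, and all function names here are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def affinity_score(candidate, top_embeddings):
    """Cosine similarity to the nearest previously high-performing
    configuration embedding (higher = closer to past winners).
    NOTE: cosine similarity is an assumed proximity measure here."""
    c = candidate / np.linalg.norm(candidate)
    t = top_embeddings / np.linalg.norm(top_embeddings, axis=1, keepdims=True)
    return float(np.max(t @ c))

def hybrid_score(surrogate_pred, candidate, top_embeddings, alpha=0.7):
    """Blend the surrogate's predicted performance with the affinity term;
    alpha (assumed hyperparameter) trades model trust vs. proximity."""
    return alpha * surrogate_pred + (1 - alpha) * affinity_score(
        candidate, top_embeddings
    )

# Toy example: two candidates with identical surrogate predictions; the one
# whose embedding lies near past winners receives the higher hybrid score.
top = np.array([[1.0, 0.0], [0.9, 0.1]])  # embeddings of best configs so far
near = np.array([0.95, 0.05])             # candidate close to past winners
far = np.array([0.0, 1.0])                # candidate far from them
print(hybrid_score(0.5, near, top) > hybrid_score(0.5, far, top))  # True
```

A score of this shape biases BO's candidate selection toward regions that already produced good configurations, which is one way a proximity term can counteract unstable surrogate predictions in a high-dimensional space.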