TableLoRA: Low-rank Adaptation on Table Structure Understanding for Large Language Models

๐Ÿ“… 2025-03-06
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Large language models (LLMs) struggle to capture the two-dimensional structure of tabular data under parameter-efficient fine-tuning (PEFT), because conventional serialization flattens tables into one-dimensional sequences and discards the positional relationships among cells. To address this, the paper proposes TableLoRA, a table-specific LoRA module. TableLoRA serializes tables with special tokens processed by a dedicated special token encoder, and adds a 2D LoRA component that encodes low-rank information about each cell's row and column position. Evaluated on four table-related datasets, TableLoRA consistently outperforms vanilla LoRA and the table encoding methods tested in control experiments, improving structured-data comprehension with minimal parameter overhead, especially in low-parameter settings.
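The structure-aware serialization described above can be sketched as follows. The token names (`<row>`, `<cell>`) and the exact scheme are illustrative assumptions, not the paper's actual token set; the key idea is that each token in the flattened 1D sequence keeps its (row, column) coordinate for later position-aware adaptation.

```python
# Hypothetical special tokens; the paper's actual token set is not given here.
ROW, CELL = "<row>", "<cell>"

def serialize(table):
    """Flatten a 2D table into a 1D token list, retaining (row, col) positions."""
    tokens, positions = [], []
    for i, row in enumerate(table):
        tokens.append(ROW)
        positions.append((i, -1))          # row delimiter carries no column
        for j, cell in enumerate(row):
            tokens.append(CELL)
            positions.append((i, j))
            tokens.append(str(cell))       # cell content shares the cell's position
            positions.append((i, j))
    return tokens, positions

table = [["Name", "Age"], ["Ada", 36]]
tokens, positions = serialize(table)
```

The parallel `positions` list is what lets a downstream module recover the 2D layout that plain text serialization would lose.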

๐Ÿ“ Abstract
Tabular data are crucial in many fields and their understanding by large language models (LLMs) under high parameter efficiency paradigm is important. However, directly applying parameter-efficient fine-tuning (PEFT) techniques to tabular tasks presents significant challenges, particularly in terms of better table serialization and the representation of two-dimensional structured information within a one-dimensional sequence. To address this, we propose TableLoRA, a module designed to improve LLMs' understanding of table structure during PEFT. It incorporates special tokens for serializing tables with special token encoder and uses 2D LoRA to encode low-rank information on cell positions. Experiments on four tabular-related datasets demonstrate that TableLoRA consistently outperforms vanilla LoRA and surpasses various table encoding methods tested in control experiments. These findings reveal that TableLoRA, as a table-specific LoRA, enhances the ability of LLMs to process tabular data effectively, especially in low-parameter settings, demonstrating its potential as a robust solution for handling table-related tasks.
Problem

Research questions and friction points this paper is trying to address.

LLMs struggle to understand table structure when adapted via PEFT.
Serializing a table into a one-dimensional sequence loses its two-dimensional positional information.
Existing PEFT techniques such as vanilla LoRA are not tailored to tabular tasks.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Special tokens for table serialization
2D LoRA for cell position encoding
Enhanced LLM table structure understanding
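The 2D LoRA idea from the list above can be illustrated with a minimal sketch. This is not the paper's implementation: the factor shapes, gating by row/column index, and zero-initialization mirror vanilla LoRA conventions and are assumptions here. The point is that, on top of the frozen weight and the standard low-rank update, extra low-rank terms selected by a token's row and column coordinates inject cell-position information.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 16, 4                    # hidden size and LoRA rank (illustrative)
n_rows, n_cols = 8, 8           # maximum table extent (illustrative)

# Frozen pretrained weight and standard LoRA factors.
W = rng.normal(size=(d, d))
A = rng.normal(scale=0.01, size=(r, d))    # down-projection
B = np.zeros((d, r))                        # up-projection, zero-init

# Hypothetical 2D extension: low-rank factors indexed by row / column.
A_row = rng.normal(scale=0.01, size=(n_rows, r, d))
B_row = np.zeros((n_rows, d, r))
A_col = rng.normal(scale=0.01, size=(n_cols, r, d))
B_col = np.zeros((n_cols, d, r))

def forward(x, row, col):
    """Frozen weight + vanilla LoRA + row/column low-rank terms (a sketch)."""
    h = W @ x
    h += B @ (A @ x)                        # vanilla LoRA path
    h += B_row[row] @ (A_row[row] @ x)      # row-position low-rank term
    h += B_col[col] @ (A_col[col] @ x)      # column-position low-rank term
    return h

x = rng.normal(size=d)
y = forward(x, row=2, col=5)
# With every B factor zero-initialized, the adapter starts as a no-op:
assert np.allclose(y, W @ x)
```

Zero-initializing the up-projections keeps the adapted model identical to the frozen one at the start of training, the same trick vanilla LoRA uses, so only the low-rank position-dependent terms are learned.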
๐Ÿ”Ž Similar Papers