🤖 AI Summary
This work addresses information-theoretically secure multi-party matrix multiplication under local storage constraints and in the presence of semi-honest colluding adversaries. By encoding submatrices using sparse masking polynomials and integrating coefficient alignment with Beaver triples, the proposed protocol achieves perfect security. It further introduces, for the first time, a learning-augmented low-rank approximation mechanism grounded in tensor decomposition, which significantly enhances computational efficiency for large-scale matrix operations while preserving both optimal recovery thresholds and perfect privacy. Theoretical analysis confirms that any coalition of adversaries below the prescribed threshold gains only uniformly random shares, revealing no information about the inputs. Experimental results demonstrate up to an 80% improvement in computation speed for high-dimensional matrices compared to existing approaches.
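The paper's exact encoding is not reproduced here, but the secrecy principle the summary describes, masking data with a random polynomial so that any coalition below the threshold sees only uniformly random shares, is the classic Shamir construction. The sketch below is a minimal, hedged illustration over a prime field; the field size `P`, party count `n`, and threshold `t` are illustrative parameters, not values from the paper.

```python
# Illustrative sketch: threshold secret sharing via a random masking polynomial.
# Any t shares are uniformly random; t+1 shares reconstruct the secret exactly.
import random

P = 2_147_483_647  # a Mersenne prime; arithmetic is over GF(P)

def share(secret, n, t, rng):
    """Split `secret` into n shares using f(x) = secret + c1*x + ... + ct*x^t."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(t)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the masked secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P-2, P) is the modular inverse of den in GF(P)
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

rng = random.Random(0)
shares = share(1234, n=5, t=2, rng=rng)
assert reconstruct(shares[:3]) == 1234  # any t+1 = 3 shares suffice
```

Applied entrywise to submatrices, this is the masking step; the protocol's coefficient alignment and Beaver-style triples then let parties multiply shared matrices without revealing them.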
📝 Abstract
This paper presents a perfectly secure matrix multiplication (PSMM) protocol for multiparty computation (MPC) of $\mathrm{A}^{\top}\mathrm{B}$ over finite fields. The proposed scheme guarantees correctness and information-theoretic privacy against threshold-bounded, semi-honest colluding parties, under explicit local storage constraints. Our scheme encodes submatrices as evaluations of sparse masking polynomials and combines coefficient alignment with Beaver-style randomness to ensure perfect secrecy. We demonstrate that any colluding set of parties below the security threshold observes only uniformly random shares, and that the recovery threshold is optimal, matching existing information-theoretic limits. Building on this framework, we introduce a learning-augmented extension that integrates tensor-decomposition-based local block multiplication, capturing both classical and learned low-rank methods. We show that the proposed learning-based PSMM preserves the privacy and recovery guarantees of MPC while delivering scalable computational efficiency gains (up to $80\%$) as the matrix dimensions grow.
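The efficiency gain in the learning-augmented extension comes from performing local block multiplication through a low-rank factorization. The paper's specific tensor-decomposition step is not detailed here; as a stand-in, the hedged sketch below uses a truncated SVD to show the underlying arithmetic: if a block $A$ admits a rank-$r$ factorization, computing $A^{\top}B$ through the thin factors costs $O(r(m+n)p)$ instead of $O(mnp)$. All dimensions and the rank are illustrative.

```python
# Hedged sketch of low-rank block multiplication (truncated SVD stands in
# for the paper's tensor-decomposition step; sizes/rank are illustrative).
import numpy as np

rng = np.random.default_rng(0)
m, n, p, r = 400, 400, 400, 8

# Construct an exactly rank-r block so the low-rank product is lossless here.
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
B = rng.standard_normal((m, p))

# Rank-r factorization via truncated SVD: A ≈ U_r @ diag(s_r) @ Vt_r.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
U_r, s_r, Vt_r = U[:, :r], s[:r], Vt[:r, :]

# Multiply through the thin factors:
# A.T @ B = Vt_r.T @ (diag(s_r) @ (U_r.T @ B)), never forming A.T @ B densely.
fast = Vt_r.T @ (s_r[:, None] * (U_r.T @ B))
assert np.allclose(fast, A.T @ B)
```

For blocks that are only approximately low-rank, truncating at rank $r$ trades a controlled approximation error for the same asymptotic speedup, which is the regime where the reported gains for high-dimensional matrices arise.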