🤖 AI Summary
Existing sketching-based methods for tensor network contraction are limited to acyclic structures and cannot handle general topologies containing cycles, while exact contraction suffers from prohibitive computational complexity. This work proposes two novel sketching-based approximate contraction approaches. The first is a sketching algorithm capable of handling arbitrary tensor network topologies, including those with cycles, making it the first such method in the literature. The second is a new technique tailored to acyclic networks whose time and space complexity scale only polynomially with the number of contraction steps. By overcoming the structural limitations of conventional sketching techniques, this study significantly reduces computational and memory costs for acyclic networks and provides an efficient, general-purpose framework for approximate contraction of large-scale tensor networks.
📝 Abstract
Tensor network contraction is a fundamental mathematical operation that generalizes the dot product and matrix multiplication. It finds applications in numerous domains, such as database systems, graph theory, machine learning, probability theory, and quantum mechanics. Tensor network contractions are computationally expensive, in general requiring exponential time and space. Sketching methods are a family of dimensionality reduction techniques widely used in the design of approximation algorithms. The existing sketching methods for tensor network contraction, however, only support acyclic tensor networks. We present the first method capable of approximating arbitrary tensor network contractions, including those of cyclic tensor networks. Additionally, we show that the existing sketching methods incur time and space costs that grow exponentially with the number of contractions. We present a second method, for acyclic tensor networks, whose space and time complexity depends only polynomially on the number of contractions.
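To make the two central notions of the abstract concrete, the sketch below illustrates (a) an exact contraction of a small *cyclic* tensor network, and (b) a generic Gaussian sketch that compresses a contracted index while approximately preserving the result. This is a minimal illustration using `numpy.einsum` and a standard Johnson-Lindenstrauss-style random projection; it is not the paper's algorithm, and all names and dimensions here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- (a) Exact contraction of a small cyclic tensor network ----------
# Three matrices connected in a ring: index pairs i-j, j-k, k-i are all
# summed over. This cyclic contraction equals trace(A @ B @ C).
n = 8
A, B, C = (rng.standard_normal((n, n)) for _ in range(3))
cycle = np.einsum('ij,jk,ki->', A, B, C)
assert np.isclose(cycle, np.trace(A @ B @ C))

# --- (b) A generic sketch (illustration only, not the paper's method) -
# A Gaussian sketch S compresses a length-k index down to s << k while
# approximately preserving inner products: (Sx) . (Sx) ~= x . x.
k, s = 5000, 500
x = rng.standard_normal(k)
S = rng.standard_normal((s, k)) / np.sqrt(s)
exact_val = x @ x                  # contraction of x with itself
sketch_val = (S @ x) @ (S @ x)     # the same contraction in s dimensions
rel_err = abs(sketch_val - exact_val) / exact_val
print(rel_err)  # typically on the order of sqrt(2/s)
```

The relative error of the sketched contraction concentrates around `sqrt(2/s)`, so increasing the sketch dimension `s` trades memory and time for accuracy; the paper's contribution is organizing such compressions across an entire network, including cyclic ones.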