Simple Graph Contrastive Learning via Fractional-order Neural Diffusion Networks

📅 2025-04-23
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Unsupervised graph representation learning commonly relies on data augmentation and negative sampling, which limits applicability across homophilic and heterophilic graphs.
Method: The paper proposes an augmentation-free, negative-sample-free graph contrastive learning framework. Its core innovation is to introduce fractional-order differential equations into graph neural diffusion models: learnable fractional orders dynamically regulate the scope of information aggregation, producing complementary local and global multi-view node representations. Training uses a negative-sample-free contrastive loss.
Contribution/Results: The work presents the first deep integration of fractional-order modeling with graph contrastive learning. Empirical evaluation shows state-of-the-art performance on both homophilic and heterophilic graph benchmarks, improving unsupervised representation quality and generalization.

📝 Abstract
Graph Contrastive Learning (GCL) has recently made progress as an unsupervised graph representation learning paradigm. GCL approaches can be categorized into augmentation-based and augmentation-free methods. The former relies on complex data augmentations, while the latter depends on encoders that can generate distinct views of the same input. Both approaches may require negative samples for training. In this paper, we introduce a novel augmentation-free GCL framework based on graph neural diffusion models. Specifically, we utilize learnable encoders governed by Fractional Differential Equations (FDE). Each FDE is characterized by an order parameter of the differential operator. We demonstrate that varying these parameters allows us to produce learnable encoders that generate diverse views, capturing either local or global information, for contrastive learning. Our model does not require negative samples for training and is applicable to both homophilic and heterophilic datasets. We demonstrate its effectiveness across various datasets, achieving state-of-the-art performance.
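The abstract's central mechanism — encoders governed by fractional differential equations, where the order parameter controls how strongly past diffusion states (and hence wider neighborhoods) influence each node — can be illustrated numerically. The paper does not reproduce its discretization here, so the Grünwald–Letnikov scheme below, the toy path graph, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gl_weights(alpha, n):
    # Grünwald–Letnikov weights w_j = (-1)^j * C(alpha, j); for alpha = 1
    # only w_1 = -1 survives, recovering the ordinary forward Euler step.
    w = np.empty(n + 1)
    w[0] = 1.0
    for j in range(1, n + 1):
        w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
    return w

def fractional_diffusion(X, L, alpha, steps=10, h=0.5):
    # Explicit GL solver for the fractional ODE D^alpha x(t) = -L x(t).
    # Smaller alpha keeps a longer memory of earlier states, so the
    # encoder mixes information over a wider effective neighborhood.
    w = gl_weights(alpha, steps)
    hist = [X]
    for n in range(1, steps + 1):
        memory = sum(w[j] * hist[n - j] for j in range(1, n + 1))
        hist.append((h ** alpha) * (-(L @ hist[n - 1])) - memory)
    return hist[-1]

def normalized_laplacian(A):
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    return np.eye(A.shape[0]) - D_inv_sqrt @ A @ D_inv_sqrt

# Toy 4-node path graph with random node features (illustrative only).
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = normalized_laplacian(A)
X = rng.standard_normal((4, 8))

Z_local = fractional_diffusion(X, L, alpha=1.0)   # standard diffusion view
Z_global = fractional_diffusion(X, L, alpha=0.6)  # long-memory view
```

In the paper the orders are learnable parameters of the encoder; here they are fixed constants purely to show how varying the order of the same diffusion dynamics yields two distinct views of one input.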
Problem

Research questions and friction points this paper is trying to address.

Augmentation-based GCL depends on complex, hand-crafted data augmentations
Augmentation-free GCL needs encoders that can produce distinct views of the same input
Most methods require negative samples and handle homophilic and heterophilic graphs unevenly
Innovation

Methods, ideas, or system contributions that make the work stand out.

Augmentation-free GCL with neural diffusion
Fractional Differential Equations for diverse views
Negative-sample-free training, effective on both homophilic and heterophilic graphs
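The negative-sample-free objective named above can be sketched: each node's two views are pulled together directly (a cosine alignment term, in the style of bootstrapped methods such as BGRL), with no other nodes serving as negatives. The paper's exact loss is not given in this summary, so the function below is an assumed stand-in.

```python
import numpy as np

def alignment_loss(Z1, Z2, eps=1e-8):
    # Negative-sample-free objective: maximize cosine similarity between
    # the two views of each node; no contrast against other nodes.
    Z1 = Z1 / (np.linalg.norm(Z1, axis=1, keepdims=True) + eps)
    Z2 = Z2 / (np.linalg.norm(Z2, axis=1, keepdims=True) + eps)
    return float(np.mean(1.0 - np.sum(Z1 * Z2, axis=1)))

# Identical views give (near-)zero loss; unrelated views give a positive one.
rng = np.random.default_rng(1)
Z = rng.standard_normal((5, 16))
aligned = alignment_loss(Z, Z)                           # close to 0.0
mismatch = alignment_loss(Z, rng.standard_normal((5, 16)))
```

Because the objective never compares a node against other nodes, it avoids the sampling bias and quadratic cost of negative pairs, which is what makes the framework negative-sample-free.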