Research Highlights

(For a list of recent publications see below)

Graph Machine Learning (Graph Neural Networks; Graph Kernels; Graph Matching) · Graph Algorithms · Knowledge Tracing

Hierarchical Cut Labelling - Scaling Up Distance Queries on Road Networks

Labelling-based solutions are the current state of the art for answering distance queries on road networks with fast response times. However, they still incur large search spaces over distance labels when road networks are large. We propose a novel solution based on a simple but efficient data structure, called a balanced tree hierarchy, which performs 1.5-4 times faster in query processing and achieves up to 60% smaller labelling sizes than state-of-the-art approaches.

M. Farhan, H. Koehler, R. Ohms, and Q. Wang

ACM Special Interest Group on Management of Data (SIGMOD), 2024

(Source code available here)
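
For readers unfamiliar with labelling-based distance querying, the sketch below shows how a generic 2-hop (hub) labelling answers a query by scanning two sorted labels for common hubs. It illustrates the general framework this work builds on, not the balanced tree hierarchy itself, and the label contents are made up for illustration.

```python
# A minimal sketch of a generic 2-hop (hub) labelling query: each vertex v
# stores a label L(v) = {(hub, dist(v, hub))}, and a query scans the two
# sorted labels for common hubs. Illustrative only; not the paper's method.

def query(label_u, label_v):
    """label_u, label_v: lists of (hub_id, dist) pairs sorted by hub_id."""
    i, j, best = 0, 0, float("inf")
    while i < len(label_u) and j < len(label_v):
        hu, du = label_u[i]
        hv, dv = label_v[j]
        if hu == hv:                      # common hub found
            best = min(best, du + dv)
            i += 1
            j += 1
        elif hu < hv:
            i += 1
        else:
            j += 1
    return best

# Example with made-up labels over hubs 0, 1 and 2
print(query([(0, 3), (2, 1)], [(0, 4), (1, 2), (2, 5)]))  # -> 6 (via hub 2)
```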

Contrastive Learning for Supervised Graph Matching

Deep graph matching techniques have shown promising results in recent years. We cast deep graph matching as a contrastive learning task and explore the power of a contrastive loss to incorporate a one-to-one mapping constraint into the learning objective through the choice of the most informative negative samples. This leads to StableGM, a new deep graph matching architecture based on stable marriage algorithms, with theoretical guarantees on the dual-optimality of matching solutions.

G. Ratnayaka, Q. Wang, and Y. Li

The 39th Conference on Uncertainty in Artificial Intelligence (UAI), 2023 (paper link)

(Source code available here)
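
As a point of reference, the sketch below is the classic Gale-Shapley stable marriage procedure that StableGM builds on to obtain one-to-one matchings. It is the textbook algorithm with made-up preference lists, not the paper's architecture or loss.

```python
# A minimal sketch of the classic Gale-Shapley stable marriage procedure,
# shown here only as the textbook building block behind stable-marriage-based
# matching; the paper's StableGM model is not reproduced here.

def gale_shapley(prefs_a, prefs_b):
    """prefs_a[i]: candidates in side B ordered by preference of proposer i.
       prefs_b[j]: dict mapping proposer i -> rank (lower is better)."""
    n = len(prefs_a)
    next_choice = [0] * n          # next candidate each free proposer will try
    partner_of_b = {}              # current partner of each candidate j
    free = list(range(n))
    while free:
        i = free.pop()
        j = prefs_a[i][next_choice[i]]
        next_choice[i] += 1
        if j not in partner_of_b:
            partner_of_b[j] = i
        elif prefs_b[j][i] < prefs_b[j][partner_of_b[j]]:
            free.append(partner_of_b[j])   # j prefers i; old partner is free again
            partner_of_b[j] = i
        else:
            free.append(i)                 # j rejects i
    return {i: j for j, i in partner_of_b.items()}

# Example with 2 proposers and 2 candidates (made-up preferences)
prefs_a = [[0, 1], [0, 1]]
prefs_b = [{0: 1, 1: 0}, {0: 0, 1: 1}]
print(gale_shapley(prefs_a, prefs_b))  # proposer 0 -> candidate 1, proposer 1 -> candidate 0
```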

BatchHL+: Batch Dynamic Labelling for Distance Queries on Large-Scale Networks

BatchHL+ is a new batch-dynamic algorithm that efficiently dynamizes labellings for distance queries. Theoretically, the correctness and minimality of BatchHL+ are proven through a delicate analysis of interaction patterns. Empirically, BatchHL+ significantly outperforms state-of-the-art methods on 15 real-world complex networks, reflecting updates on rapidly changing graphs up to 3 orders of magnitude faster for distance queries.

M. Farhan, H. Koehler, and Q. Wang

The VLDB Journal, 2023 (paper link)

(Source code available here)

Local Vertex Colouring Graph Neural Networks

Graph search, a fundamental element of graph theory, plays an important role in solving many problems that cannot be tackled by GNNs. In this work, we investigate the expressivity of graph neural networks (GNNs) from the perspective of graph search. Two classical graph search methods, breadth-first search (BFS) and depth-first search (DFS), are studied in the context of GNNs. We show that either method can be used to design a vertex colouring algorithm, called local vertex colouring (LVC), which goes beyond 1-WL.

S. Li, D. Kim, and Q. Wang

International Conference on Machine Learning (ICML), 2023 (paper link)

(Source code available here)
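
For context, the sketch below implements the standard 1-WL colour refinement that LVC is measured against. It is the classical baseline on a toy graph, not the BFS/DFS-based colouring proposed in the paper.

```python
# A minimal sketch of standard 1-WL colour refinement: each vertex's colour
# is repeatedly rehashed together with the multiset of its neighbours'
# colours until the colouring stabilises. Classical baseline only.

def wl_refine(adj, max_iters=10):
    """adj: dict vertex -> list of neighbours. Returns a stable colouring."""
    colours = {v: 0 for v in adj}                 # start with a uniform colour
    for _ in range(max_iters):
        signatures = {
            v: (colours[v], tuple(sorted(colours[u] for u in adj[v])))
            for v in adj
        }
        # relabel distinct signatures with small integers
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        new_colours = {v: palette[signatures[v]] for v in adj}
        if new_colours == colours:
            break
        colours = new_colours
    return colours

# Example: on a path 0-1-2, the endpoints share a colour and the centre differs
print(wl_refine({0: [1], 1: [0, 2], 2: [1]}))  # -> {0: 0, 1: 1, 2: 0}
```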

Efficient Maintenance of Highway Cover Labelling for Distance Queries on Large Dynamic Graphs

To reflect rapid changes on graphs when answering shortest-path distance queries, we devise a parallel fully dynamic labelling method by leveraging two sources of efficiency gains - landmark parallelism and anchor parallelism. This fully dynamic method handles both incremental and decremental updates efficiently through a unified search approach and a bounded repairing inference mechanism. We empirically verify its efficiency and scalability on 10 real-world large networks.

M. Farhan and Q. Wang

World Wide Web, 2023 (paper link)

(Source code available here)

N-WL: A New Hierarchy of Expressivity for Graph Neural Networks

Is the Weisfeiler-Lehman (WL) hierarchy a good yardstick for the expressivity of GNNs? In search of an answer to this question, we show that, contrary to the widely accepted view, the WL hierarchy is not well-suited for measuring the expressivity of GNNs. We therefore explore a hierarchy of expressivity grounded in a new class of graph isomorphism algorithms, namely Neighbourhood WL algorithms, which exhibit a natural and strict hierarchy of expressivity for measuring GNNs.

Q. Wang, D. Chen, A. Wijesinghe, S. Li, and M. Farhan

International Conference on Learning Representations (ICLR), 2023 (paper link)

(Source code available here)

BatchHL: Answering Distance Queries on Batch-Dynamic Networks at Scale

Very recently, several batch-dynamic algorithms have been reported, mostly focusing on traditional graph problems such as graph connectivity, dynamic trees, and k-clique counting. As yet, batch-dynamic algorithms for shortest-path distances have been left unexplored, despite distance querying being a fundamental problem in many real-world applications. In this work, we propose a batch-dynamic method, BatchHL, to dynamize distance labellings efficiently in order to reflect large batches of updates on a graph.

M. Farhan, Q. Wang, and H. Koehler

ACM Special Interest Group on Management of Data (SIGMOD), 2022 (paper link)

(Source code available here)

Restructuring Graphs for Higher Homophily via Adaptive Spectral Clustering

While a growing body of literature has been studying new Graph Neural Networks (GNNs) that work on both homophilic and heterophilic graphs, little has been done on adapting classical GNNs to less-homophilic graphs. In this work, we propose a novel graph restructuring method that can be integrated into any type of GNNs, including classical GNNs, to leverage their nice properties such as efficiency, simplicity, and explainability while alleviating their limitations.

S. Li, D. Kim, and Q. Wang

The 37th AAAI Conference on Artificial Intelligence (AAAI - Oral), 2023 (paper link)

(Source code available here)
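
As background, the sketch below computes the edge homophily ratio, a common way to quantify how homophilic a graph is. It only illustrates the quantity at stake, not the adaptive spectral clustering used for restructuring.

```python
# A minimal sketch of the edge homophily ratio: the fraction of edges that
# join nodes with the same class label. Illustrative background only; the
# paper's restructuring method is not reproduced here.

def edge_homophily(edges, labels):
    """edges: iterable of (u, v) pairs; labels: dict vertex -> class label."""
    edges = list(edges)
    if not edges:
        return 0.0
    same = sum(1 for u, v in edges if labels[u] == labels[v])
    return same / len(edges)

# Example: a triangle where only one of the three edges joins same-class nodes
edges = [(0, 1), (1, 2), (0, 2)]
labels = {0: "a", 1: "a", 2: "b"}
print(edge_homophily(edges, labels))  # -> 0.333...
```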

Knowledge Tracing: A Survey

Knowledge Tracing (KT) concerns how to effectively track a student's learning progress through their online interactions with learning materials. We present a detailed survey of the KT literature, covering a broad range of methods from early attempts to recent state-of-the-art deep learning approaches, and highlighting the theoretical aspects of models and the characteristics of benchmark datasets. We also shed light on key modelling differences between closely related methods, discuss current research gaps in the KT literature, and outline possible future research directions.

G. Abdelrahman, Q. Wang, and B. Pereira Nunes

ACM Computing Surveys, 2022 (paper link)
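
To make the KT task format concrete, the toy sketch below predicts the probability of a correct answer from a student's sequence of (skill, correct) interactions using a simple per-skill moving average. This stand-in predictor is not one of the surveyed models; the skill names and parameters are made up.

```python
# A toy sketch of the knowledge tracing task format: given a sequence of
# (skill, correct) interactions, predict the probability of answering the
# next attempt at a skill correctly. The per-skill exponential moving
# average is only a stand-in to make the inputs and outputs concrete.

def trace(interactions, alpha=0.3, prior=0.5):
    """interactions: list of (skill_id, correct) with correct in {0, 1}."""
    knowledge = {}                       # estimated mastery per skill
    predictions = []
    for skill, correct in interactions:
        p = knowledge.get(skill, prior)  # predict before observing the answer
        predictions.append((skill, p))
        knowledge[skill] = (1 - alpha) * p + alpha * correct
    return predictions, knowledge

preds, state = trace([("algebra", 1), ("algebra", 1), ("geometry", 0)])
print(preds)   # predictions made before each observed answer
print(state)   # final per-skill mastery estimates
```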

A New Perspective on How Graph Neural Networks Go Beyond Weisfeiler-Lehman?

How can we design expressive yet simple GNNs that go beyond the WL test with a theoretically provable guarantee? We introduce a new hierarchy of local isomorphism to characterise different classes of local structures, and discuss its connections with the WL test. We also develop a general solution to inject structural properties into a message-passing aggregation scheme, and theoretically characterise how GNNs can be designed to go beyond the WL test in expressivity.

A. Wijesinghe and Q. Wang

International Conference on Learning Representations (ICLR - Oral, 54 out of 3391 submissions), 2022 (paper link)

(Source code available here)

Deep Graph Memory Networks for Forgetting-Robust Knowledge Tracing

Several recent attempts have enhanced deep knowledge tracing (KT) models with an external memory structure to represent knowledge states. However, challenges remain, e.g., how to represent forgetting behaviours over time, and how to identify relationships among latent concepts underlying questions. We tackle these challenges by introducing a novel forgetting-robust KT model, Deep Graph Memory Network (DGMN), to dynamically trace a student's knowledge states over latent concepts.

G. Abdelrahman and Q. Wang

IEEE Transactions on Knowledge and Data Engineering (TKDE), 2022 (paper link)

A Regularized Wasserstein Framework for Graph Kernels

We propose a learning framework for graph kernels which is theoretically grounded in regularized optimal transport. The regularized Wasserstein (RW) discrepancy can preserve both the features and the structure of graphs via Wasserstein distances on features and their local variations, local barycenters, and global connectivity. Two strongly convex regularization terms are introduced to improve the learning ability. The framework is robust and guarantees numerical stability in optimization.

A. Wijesinghe, Q. Wang and S. Gould

IEEE International Conference on Data Mining (ICDM), 2021 (paper link)

(Source code available here)
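
For background, the sketch below computes an entropy-regularized optimal transport plan with Sinkhorn iterations, a standard instance of a regularized Wasserstein discrepancy. The paper's specific regularization terms and graph-kernel construction are not reproduced here; the cost matrix and weights are made up.

```python
# A minimal sketch of entropy-regularized optimal transport solved with
# Sinkhorn iterations, as a standard example of a regularized Wasserstein
# discrepancy between two discrete distributions. Illustrative only.
import numpy as np

def sinkhorn(cost, a, b, eps=0.1, n_iters=200):
    """cost: (n, m) cost matrix; a, b: source/target weights summing to 1."""
    K = np.exp(-cost / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)                # alternate scaling updates
        u = a / (K @ v)
    plan = u[:, None] * K * v[None, :]   # transport plan
    return np.sum(plan * cost), plan     # cost of the regularized plan

cost = np.array([[0.0, 1.0], [1.0, 0.0]])
a = np.array([0.5, 0.5])
b = np.array([0.5, 0.5])
dist, plan = sinkhorn(cost, a, b)
print(round(dist, 3))                    # close to 0: mass stays in place
```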

Fast Fully Dynamic Labelling For Distance Queries

Due to the dynamic nature of real-world networks, such as social networks or web graphs in which a link between two entities may disappear or appear at any time, there is a pressing need to address the shortest-path distance problem on dynamic networks. In this article, we propose a fully dynamic labelling method to efficiently update distance labellings. At its core, our method incorporates two building blocks for handling incremental and decremental update operations, respectively.

M. Farhan, Q. Wang, Y. Lin, and B. McKay

The VLDB Journal, 2021 (paper link)

(Source code available here)

Beyond Low-Pass Filters: Adaptive Feature Propagation on Graphs

Many graph neural networks (GNNs) assume local homophily, i.e., strong similarities in local neighbourhoods. This assumption, however, limits the generalization power of GNNs. To address this limitation, we propose a flexible GNN model which is capable of handling any graph without being restricted by its underlying homophily.

S. Li, D. Kim, and Q. Wang

The European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD), 2021 (paper link)

(Source code available here)
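
For contrast, the sketch below shows the symmetric-normalized feature propagation used by classical GNNs, whose low-pass (smoothing) behaviour is what the proposed model moves beyond. It is the standard baseline on a toy graph, not the adaptive propagation scheme.

```python
# A minimal sketch of symmetric-normalized propagation, D^-1/2 (A + I) D^-1/2,
# which acts as a low-pass filter by smoothing features across neighbours.
# Standard baseline only; not the paper's adaptive feature propagation.
import numpy as np

def low_pass_propagate(adj, features, steps=2):
    """adj: (n, n) symmetric adjacency matrix; features: (n, d) node features."""
    a_hat = adj + np.eye(adj.shape[0])                          # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]    # normalise
    h = features
    for _ in range(steps):
        h = norm @ h                                            # repeated smoothing
    return h

adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
x = np.array([[1.0], [0.0], [1.0]])
print(low_pass_propagate(adj, x))   # features become smoother across neighbours
```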

Query-by-Sketch: Scaling Shortest Path Graph Queries on Very Large Networks

How can we efficiently compute a shortest path graph, containing exactly all shortest paths between an arbitrary pair of vertices, on complex networks? The shortest path graph serves as a basis for tackling various shortest-path-related problems, e.g., finding critical edges and vertices whose removal destroys all shortest paths between two vertices.

Y. Wang, Q. Wang, H. Koehler, and Y. Lin

ACM Special Interest Group on Management of Data (SIGMOD), 2021 (paper link)

(Source code available here)
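
To illustrate the object being computed, the sketch below builds the shortest path graph of an unweighted toy graph by brute force with two BFS passes, using the fact that an edge (u, v) lies on some shortest s-t path iff dist(s, u) + 1 + dist(v, t) = dist(s, t). Query-by-Sketch itself computes such graphs far more efficiently on very large networks.

```python
# A brute-force sketch of a shortest path graph for an unweighted graph:
# keep the directed edges (u, v) with dist(s, u) + 1 + dist(v, t) = dist(s, t).
# Illustration of the concept only; not the paper's Query-by-Sketch method.
from collections import deque

def bfs(adj, source):
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def shortest_path_graph(adj, s, t):
    ds, dt = bfs(adj, s), bfs(adj, t)
    if t not in ds:
        return []                        # s and t are disconnected
    return [(u, v) for u in adj for v in adj[u]
            if ds.get(u, float("inf")) + 1 + dt.get(v, float("inf")) == ds[t]]

# Example: a 4-cycle has two shortest paths between opposite corners
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(shortest_path_graph(adj, 0, 2))    # -> [(0, 1), (0, 3), (1, 2), (3, 2)]
```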

ErGAN: Generative Adversarial Networks for Entity Resolution

Entity resolution is an important component of real-world applications in various fields. Inspired by recent advances in generative adversarial networks, we propose a novel deep learning model for entity resolution called ErGAN. This model consists of two key components, a label generator and a discriminator, which are optimized alternately through adversarial learning.

J. Shao, Q. Wang, A. Wijesinghe, and E. Rahm

IEEE International Conference on Data Mining (ICDM), 2020 (paper link)

A Highly Scalable Labelling Approach for Exact Distance Queries in Complex Networks

Answering exact shortest path distance queries is a fundamental task in graph theory. Despite a tremendous amount of research on the subject, there is still no satisfactory solution that can scale to billion-scale complex networks. We develop a highly scalable solution for answering exact distance queries over massive complex networks.

M. Farhan, Q. Wang, Y. Lin, and B. McKay

The 22nd International Conference on Extending Database Technology (EDBT), 2019 (paper link)

(Source code available here)

DFNets: Spectral CNNs for Graphs with Feedback-Looped Filters

We propose a novel spectral convolutional neural network model on graph-structured data, namely Distributed Feedback-Looped Networks (DFNets). This model incorporates a robust class of spectral graph filters, called feedback-looped filters, to provide better localization on vertices, while still attaining fast convergence and linear memory requirements.

A. Wijesinghe and Q. Wang

The 33rd Conference on Neural Information Processing Systems (NeurIPS), 2019 (paper link)

(Source code available here)

Knowledge Tracing with Sequential Key-Value Memory Networks

Can machines trace human knowledge like humans? Knowledge tracing (KT) is a fundamental task in a wide range of applications in education. In this work, we propose a novel deep learning KT model, namely Sequential Key-Value Memory Networks (SKVMN), which unifies the strengths of recurrent modelling capacity and memory capacity for modelling student learning.

G. Abdelrahman and Q. Wang

ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR), 2019 (paper link)

Learning to Sample: an Active Learning Framework

Meta-learning algorithms for active learning are emerging as a promising paradigm for learning the "best" active learning strategy. However, current learning-based active learning approaches still require sufficient training data to generalize meta-learning models for active learning. This is contrary to the nature of active learning, which typically starts with a small number of labeled samples.

J. Shao, Q. Wang and F. Liu

IEEE International Conference on Data Mining (ICDM), 2019 (paper link)

FACH: Fast Algorithm for Detecting Cohesive Hierarchies of Communities in Large Networks

We study the hierarchical community detection problem in large networks and show that this problem is NP-hard. We leverage a fast network sparsification technique to improve the efficiency and scalability of a cut-based algorithm for detecting a cohesive hierarchy of communities in a large-scale network.

M. Rezvani, Q. Wang, and W. Liang

The 11th ACM International Conference on Web Search and Data Mining (WSDM), 2018 (paper link)

 

List of Recent Publications

Hierarchical Cut Labelling - Scaling Up Distance Queries on Road Networks
M. Farhan, H. Koehler, R. Ohms, and Q. Wang
ACM Special Interest Group on Management of Data (SIGMOD), 2024

Knowledge Tracing Using Deep Learning Methods
G. Abdelrahman
PhD thesis, Australian National University, 2023 (thesis link)

Contrastive Learning for Supervised Graph Matching
G. Ratnayaka, Q. Wang, and Y. Li
The 39th Conference on Uncertainty in Artificial Intelligence (UAI), 2023 (paper link)

BatchHL+: Batch Dynamic Labelling for Distance Queries on Large-Scale Networks
M. Farhan, H. Koehler, and Q. Wang
The VLDB Journal, 2023 (paper link)

Local Vertex Colouring Graph Neural Networks
S. Li, D. Kim, and Q. Wang
International Conference on Machine Learning (ICML), 2023 (paper link)

Learning Data Teaching Strategies via Knowledge Tracing
G. Abdelrahman and Q. Wang
Knowledge-Based Systems, 2023 (paper link)

Efficient Maintenance of Highway Cover Labelling for Distance Queries on Large Dynamic Graphs
M. Farhan and Q. Wang
World Wide Web, 2023 (paper link)

N-WL: A New Hierarchy of Expressivity for Graph Neural Networks
Q. Wang, D. Chen, A. Wijesinghe, S. Li, and M. Farhan
International Conference on Learning Representations (ICLR), 2023 (paper link)

Geometric Learning on Graph Structured Data
A. Wijesinghe
PhD thesis, Australian National University, 2023 (thesis link)

BatchHL: Answering Distance Queries on Batch-Dynamic Networks at Scale
M. Farhan, Q. Wang, and H. Koehler
ACM Special Interest Group on Management of Data (SIGMOD), 2022 (paper link)

Restructuring Graphs for Higher Homophily via Adaptive Spectral Clustering
S. Li, D. Kim, and Q. Wang
The 37th AAAI Conference on Artificial Intelligence (AAAI - Oral), 2023 (paper link)

Knowledge Tracing: A Survey
G. Abdelrahman, Q. Wang, and B. Pereira Nunes
ACM Computing Surveys, 2022 (paper link)

A New Perspective on How Graph Neural Networks Go Beyond Weisfeiler-Lehman?
A. Wijesinghe and Q. Wang
International Conference on Learning Representations (ICLR - Oral, 54 out of 3391 submissions), 2022 (paper link)

Privacy-Preserving Data Publishing
M. Iftikhar
PhD thesis, Australian National University, 2022 (thesis link)

Answering Shortest Path Distance Queries in Large Complex Networks
M. Farhan
PhD thesis, Australian National University, 2022 (thesis link)

dK-Personalization: Publishing Network Statistics with Personalized Differential Privacy
M. Iftikhar, Q. Wang, and Y. Li
The 26th Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD), 2022 (paper link)

Deep Graph Memory Networks for Forgetting-Robust Knowledge Tracing
G. Abdelrahman and Q. Wang
IEEE Transactions on Knowledge and Data Engineering (TKDE), 2022 (paper link)

A Regularized Wasserstein Framework for Graph Kernels
A. Wijesinghe, Q. Wang and S. Gould
IEEE International Conference on Data Mining (ICDM), 2021 (paper link)

Fast Fully Dynamic Labelling For Distance Queries
M. Farhan, Q. Wang, Y. Lin, and B. McKay
The VLDB Journal, 2021 (paper link)

Beyond Low-Pass Filters: Adaptive Feature Propagation on Graphs
S. Li, D. Kim, and Q. Wang
The European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD), 2021 (paper link)

Entity Resolution with Active Learning
J. Shao
PhD thesis, Australian National University, 2021 (thesis link)

Query-by-Sketch: Scaling Shortest Path Graph Queries on Very Large Networks
Y. Wang, Q. Wang, H. Koehler, and Y. Lin
ACM Special Interest Group on Management of Data (SIGMOD), 2021 (paper link)

Efficient Maintenance of Distance Labelling for Incremental Updates in Large Dynamic Graphs
M. Farhan and Q. Wang
The 24th International Conference on Extending Database Technology (EDBT), 2021 (paper link)

dK-Projection: Publishing Graph Joint Degree Distribution with Node Differential Privacy
M. Iftikhar and Q. Wang
The 25th Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD), 2021 (paper link)

Exploring Shortest Paths on Large-Scale Networks
Y. Wang
MPhil thesis, Australian National University, 2021 (thesis link)

Episode-Adaptive Embedding Networks for Few-Shot Learning
F. Liu and Q. Wang
The 25th Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD), 2021 (paper link)

ErGAN: Generative Adversarial Networks for Entity Resolution
J. Shao, Q. Wang, A. Wijesinghe, and E. Rahm
IEEE International Conference on Data Mining (ICDM), 2020 (paper link)

dK-Microaggregation: Anonymizing Graphs with Differential Privacy Guarantees
M. Iftikhar, Q. Wang, and Y. Lin
The 24th Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD), 2020 (paper link)

Dynamic Chunkwise CNN for Distantly Supervised Relation Extraction
F. Liu and Q. Wang
IEEE International Conference on Big Data (IEEE BigData), 2020 (paper link)

A Highly Scalable Labelling Approach for Exact Distance Queries in Complex Networks
M. Farhan, Q. Wang, Y. Lin, and B. McKay
The 22nd International Conference on Extending Database Technology (EDBT), 2019 (paper link)

DFNets: Spectral CNNs for Graphs with Feedback-Looped Filters
A. Wijesinghe and Q. Wang
The 33rd Conference on Neural Information Processing Systems (NeurIPS), 2019 (paper link)

Knowledge Tracing with Sequential Key-Value Memory Networks
G. Abdelrahman and Q. Wang
ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR), 2019 (paper link)

Publishing Differentially Private Datasets via Stable Microaggregation
M. Iftikhar, Q. Wang, and Y. Lin
The 22nd International Conference on Extending Database Technology (EDBT), 2019 (paper link)

Learning to Sample: an Active Learning Framework
J. Shao, Q. Wang and F. Liu
IEEE International Conference on Data Mining (ICDM), 2019 (paper link)

Skyblocking for Entity Resolution
J. Shao, Q. Wang and Y. Lin
Information Systems, Volume 85, November 2019 (paper link)

Repairing of Record Linkage: Turning Errors into Insight
Q. Bui-Nguyen, Q. Wang, J. Shao, and D. Vatsalan
The 22nd International Conference on Extending Database Technology (EDBT), 2019 (paper link)

Community Structure in Large-Scale Complex Networks
M. Rezvani
PhD thesis, Australian National University, 2019 (thesis link)

Active Blocking Scheme Learning for Entity Resolution
J. Shao and Q. Wang
The 22nd Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD), 2018 (paper link)

FACH: Fast Algorithm for Detecting Cohesive Hierarchies of Communities in Large Networks
M. Rezvani, Q. Wang, and W. Liang
The 11th ACM International Conference on Web Search and Data Mining (WSDM), 2018 (paper link)