2023 article

Fast Parallel Tensor Times Same Vector for Hypergraphs

2023 IEEE 30th International Conference on High Performance Computing, Data, and Analytics (HiPC 2023), pp. 324–334.

author keywords: hypergraphs; sparse symmetric tensor times same vector; tensor eigenvector; generating function
Source: Web of Science
Added: July 8, 2024

Hypergraphs are a popular paradigm to represent complex real-world networks exhibiting multi-way relationships of varying sizes. Mining centrality in hypergraphs via symmetric adjacency tensors has only recently become computationally feasible for large and complex datasets. To enable scalable computation of these and related hypergraph analytics, here we focus on the Sparse Symmetric Tensor Times Same Vector (S³TTVC) operation. We introduce the Compound Compressed Sparse Symmetric (CCSS) format, an extension of the compact CSS format to hypergraphs of varying hyperedge sizes, and present a shared-memory parallel algorithm to compute S³TTVC. We experimentally show that S³TTVC computation using the CCSS format achieves better performance than the naive baseline, and is subsequently more performant for hypergraph H-eigenvector centrality.
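For context, a minimal sketch of a naive tensor-times-same-vector contraction over a hypergraph follows: for each vertex of each hyperedge, accumulate the product of the vector entries at the edge's other vertices. The function name, the toy data, the unweighted-hypergraph convention (scaling constants of the symmetric adjacency tensor omitted), and the NQZ-style power-iteration usage are illustrative assumptions; this is not the paper's CCSS-based parallel algorithm.

import numpy as np

def naive_ttsv(hyperedges, x):
    # Naive baseline-style contraction: for each vertex i in each hyperedge e,
    # accumulate the product of x over the other vertices of e. The symmetric
    # adjacency tensor's normalization constants are omitted (an assumption).
    y = np.zeros_like(x, dtype=float)
    for e in hyperedges:
        for i in e:
            p = 1.0
            for j in e:
                if j != i:
                    p *= x[j]
            y[i] += p
    return y

# Illustrative usage: the contraction as the inner kernel of an NQZ-style
# H-eigenvector power iteration on a toy 3-uniform hypergraph (assumed data).
edges = [(0, 1, 2), (1, 2, 3), (0, 1, 3)]
x = np.ones(4)
for _ in range(20):
    y = naive_ttsv(edges, x) ** 0.5      # exponent 1/(m-1) with m = 3
    x = y / np.linalg.norm(y, 1)
print(x)

This per-hyperedge loop is the kind of baseline the CCSS format is designed to outperform, since it recomputes the vertex products from scratch for every vertex of every hyperedge.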