Cascade2vec: Learning Dynamic Cascade Representation by Recurrent Graph Neural Networks

An information dissemination network (i.e., a cascade) with a dynamic graph structure is formed when a novel idea or message spreads from person to person. Predicting the growth of cascades is one of the fundamental problems in social network analysis. Existing deep learning models for cascade prediction are primarily based on recurrent neural networks and represent cascades through random walks or propagation paths. However, these models are not sufficient for learning the deep spatial and temporal features of an entire cascade. Therefore, a new model, called Cascade2vec, is proposed to learn the dynamic graph representation of cascades based on graph recurrent neural networks. To learn a more effective graph-level representation of cascades, current graph neural networks are improved by designing a graph residual block, which shares attention weights between nodes, and by transforming features through perception layers. Furthermore, the proposed graph neural network is integrated into a recurrent neural network to learn the temporal features between graphs. With this method, both the spatial and temporal characteristics of cascades are learned in Cascade2vec. Experimental results show that, compared with strong baselines, the method significantly reduces the mean squared logarithmic error and the median squared logarithmic error by 16.1% and 12%, respectively, for cascade prediction at one hour on the Microblog network dataset.
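The abstract describes the model only at a high level: a cascade is treated as a sequence of graph snapshots, each snapshot is encoded by an improved graph neural network, and a recurrent network over the snapshot embeddings predicts future growth, evaluated with squared logarithmic error. The sketch below is a rough, hypothetical illustration of that pipeline, not the authors' implementation: it assumes PyTorch, replaces the paper's graph residual block (shared attention weights, perception layers) with a plain mean-aggregation layer plus a residual connection, and uses made-up class names (SnapshotEncoder, CascadeRegressor).

# Hypothetical sketch (not the authors' code): a cascade is modeled as a sequence of
# graph snapshots; each snapshot is encoded by a small graph neural network and the
# snapshot embeddings are fed to a GRU, whose final state predicts future growth.
import torch
import torch.nn as nn


class SnapshotEncoder(nn.Module):
    """Mean-aggregation GNN layer with a residual connection (a stand-in for the
    paper's graph residual block; attention and perception-layer details omitted)."""

    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, x, adj):
        # x: (num_nodes, dim) node features, adj: (num_nodes, num_nodes) adjacency
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        neigh = adj @ x / deg                      # mean of neighbour features
        h = torch.relu(self.linear(neigh)) + x     # residual connection
        return h.mean(dim=0)                       # graph-level readout


class CascadeRegressor(nn.Module):
    def __init__(self, dim=32):
        super().__init__()
        self.encoder = SnapshotEncoder(dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, 1)

    def forward(self, snapshots):
        # snapshots: list of (node_features, adjacency) pairs, one per time step
        embs = torch.stack([self.encoder(x, a) for x, a in snapshots]).unsqueeze(0)
        _, h = self.rnn(embs)                      # temporal features across snapshots
        return self.head(h[-1]).squeeze()          # predicted log(1 + future growth)


def squared_log_error(pred_log, true_growth):
    # squared logarithmic error for one cascade; averaging over cascades gives the
    # MSLE metric reported in the abstract
    return (pred_log - torch.log1p(true_growth)) ** 2

In this sketch, a cascade observed for one hour would be passed as a list of (node features, adjacency) snapshots, and training would minimize the squared-log-error term averaged over cascades.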


Bibliographic Details
Main Authors: Zhenhua Huang (ORCID: 0000-0002-0389-9061), Zhenyu Wang, Rui Zhang (ORCID: 0000-0002-2264-8735)
Author Affiliation: School of Software Engineering, South China University of Technology, Guangzhou, China
Format: Article
Language: English
Published: IEEE, 2019-01-01
Series: IEEE Access, Vol. 7, pp. 144800-144812
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2942853
Subjects: Social network; information dissemination network; cascade prediction; graph neural networks
Online Access: https://ieeexplore.ieee.org/document/8846015/
work_keys_str_mv AT zhenhuahuang cascade2veclearningdynamiccascaderepresentationbyrecurrentgraphneuralnetworks
AT zhenyuwang cascade2veclearningdynamiccascaderepresentationbyrecurrentgraphneuralnetworks
AT ruizhang cascade2veclearningdynamiccascaderepresentationbyrecurrentgraphneuralnetworks
_version_ 1721539702806282240