Time-aware neural ordinary differential equations for incomplete time series modeling

The Internet of Things realizes the ubiquitous connection of all things, generating countless time-tagged data called time series. However, real-world time series are often plagued with missing values owing to noise or malfunctioning sensors. Existing methods for modeling such incomplete time serie...


Bibliographic Details
Main Authors: Cai, Z. (Author), Chang, Z. (Author), Liu, S. (Author), Qiu, R. (Author), Song, S. (Author), Tu, G. (Author)
Format: Article
Language: English
Published: Springer 2023
Online Access:View Fulltext in Publisher
View in Scopus
LEADER 03221nam a2200445Ia 4500
001 10.1007-s11227-023-05327-8
008 230529s2023 CNT 000 0 und d
020 |a 0920-8542 (ISSN) 
245 1 0 |a Time-aware neural ordinary differential equations for incomplete time series modeling 
260 0 |b Springer  |c 2023 
856 |z View Fulltext in Publisher  |u https://doi.org/10.1007/s11227-023-05327-8 
856 |z View in Scopus  |u https://www.scopus.com/inward/record.uri?eid=2-s2.0-85159711067&doi=10.1007%2fs11227-023-05327-8&partnerID=40&md5=d0a0a208176bf9dd77e1ce9cecdf0547 
520 3 |a The Internet of Things realizes the ubiquitous connection of all things, generating countless time-tagged data called time series. However, real-world time series are often plagued with missing values owing to noise or malfunctioning sensors. Existing methods for modeling such incomplete time series typically involve preprocessing steps, such as deletion or missing-data imputation using statistical or machine learning methods. Unfortunately, these methods unavoidably destroy time information and introduce accumulated error into the subsequent model. To this end, this paper introduces a novel continuous neural network architecture, named Time-aware Neural Ordinary Differential Equations (TN-ODE), for incomplete time-series modeling. The proposed method not only supports imputing missing values at arbitrary time points, but also enables multi-step prediction at desired time points. Specifically, TN-ODE employs a time-aware Long Short-Term Memory network as an encoder, which effectively learns the posterior distribution from partially observed data. Additionally, the derivative of the latent states is parameterized with a fully connected network, thereby enabling continuous-time latent dynamics generation. The proposed TN-ODE model is evaluated on both real-world and synthetic incomplete time-series datasets by conducting data interpolation and extrapolation tasks as well as a classification task. Extensive experiments show that the TN-ODE model outperforms baseline methods in terms of mean squared error on imputation and prediction tasks, as well as accuracy on the downstream classification task. © 2023, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature. 
650 0 4 |a Classification (of information) 
650 0 4 |a Classification tasks 
650 0 4 |a Continuous time systems 
650 0 4 |a Incomplete time series 
650 0 4 |a Learning systems 
650 0 4 |a Mean square error 
650 0 4 |a Missing values 
650 0 4 |a Network architecture 
650 0 4 |a Neural networks 
650 0 4 |a Neural ODE 
650 0 4 |a Neural ODEs 
650 0 4 |a Ordinary differential equation models 
650 0 4 |a Ordinary differential equations 
650 0 4 |a Signal encoding 
650 0 4 |a Tagged data 
650 0 4 |a Time points 
650 0 4 |a Time series 
650 0 4 |a Time-aware encoder 
650 0 4 |a Time series models 
700 1 0 |a Cai, Z.  |e author 
700 1 0 |a Chang, Z.  |e author 
700 1 0 |a Liu, S.  |e author 
700 1 0 |a Qiu, R.  |e author 
700 1 0 |a Song, S.  |e author 
700 1 0 |a Tu, G.  |e author 
773 |t Journal of Supercomputing
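The mechanism the abstract describes, a fully connected network parameterizing the derivative of a latent state so the trajectory can be evaluated at arbitrary (possibly irregular) time points, can be sketched in a few lines. Everything below is illustrative, not taken from the paper: the weights, the latent size, and the plain Euler integrator (the authors' model would use a learned network and a proper ODE solver).

```python
import math

def fc_dynamics(h, W, b):
    """dh/dt = tanh(W h + b): a one-layer fully connected net
    standing in for the learned dynamics function."""
    return [math.tanh(sum(W[i][j] * h[j] for j in range(len(h))) + b[i])
            for i in range(len(h))]

def odeint_euler(h0, t_points, W, b, n_steps=50):
    """Integrate the latent state from t = 0 and record it at each
    requested time point (fixed-step Euler, for simplicity)."""
    h, t_prev, out = list(h0), 0.0, []
    for t_target in sorted(t_points):
        step = (t_target - t_prev) / n_steps
        for _ in range(n_steps):
            dh = fc_dynamics(h, W, b)
            h = [h[i] + step * dh[i] for i in range(len(h))]
        t_prev = t_target
        out.append(list(h))
    return out

# Toy 2-D latent state with fixed, rotation-like weights.
W = [[0.0, -1.0], [1.0, 0.0]]
b = [0.0, 0.0]
h0 = [1.0, 0.0]

# An irregular time grid: the same continuous trajectory can be
# sampled anywhere, which is what suits incomplete time series.
states = odeint_euler(h0, [0.1, 0.35, 1.0], W, b)
```

Because the latent dynamics are continuous in time, imputation and multi-step prediction reduce to choosing the query times in `t_points`; no resampling or deletion of the observed data is needed.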