Self-Supervised Feature Specific Neural Matrix Completion


Bibliographic Details
Main Authors: Mehmet Aktukmak, Samuel M. Mercier, Ismail Uysal
Format: Article
Language: English
Published: IEEE 2020-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9245478/
Description
Summary: Unsupervised matrix completion algorithms mostly model the data generation process with linear latent variable models. Recently proposed algorithms introduce non-linearity via multi-layer perceptrons (MLPs), and self-supervision by setting up a separate linear regression framework for each feature to estimate its missing values. In this article, we introduce an MLP-based algorithm called feature-specific neural matrix completion (FSNMC), which combines self-supervised and non-linear methods. The model parameters are estimated by a rotational scheme that alternates between parameter updates and missing-value updates, with additional heuristic steps to prevent over-fitting and speed up convergence. The proposed algorithm specifically targets small-to-medium-sized datasets. Experimental results on real-world and synthetic datasets of varying size and a range of missing-value percentages demonstrate superior accuracy for FSNMC compared to popular methods in the literature, especially at low sparsities. The proposed method has particular potential for estimating missing data collected via real experimentation in the fundamental life sciences.
ISSN: 2169-3536
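The alternating scheme described in the summary can be illustrated with a minimal sketch. This is not the authors' FSNMC implementation: the network size, learning rate, round count, and mean-based initialization are assumptions chosen for illustration, and the heuristic anti-overfitting steps from the paper are omitted. One tiny MLP per feature predicts that column from the remaining columns; the loop alternates between fitting the networks on observed entries (parameter update) and refreshing the missing entries with their predictions (missing-value update).

```python
import numpy as np

def train_mlp(X, y, hidden=8, lr=0.05, epochs=300, rng=None):
    """Tiny one-hidden-layer tanh MLP trained with full-batch gradient
    descent on squared error. Returns a predict() closure."""
    rng = rng or np.random.default_rng(0)
    d = X.shape[1]
    W1 = rng.normal(scale=0.1, size=(d, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.1, size=hidden)
    b2 = 0.0
    n = len(y)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)            # hidden activations
        pred = H @ W2 + b2
        g = 2.0 * (pred - y) / n            # d(MSE)/d(pred)
        gW2 = H.T @ g
        gb2 = g.sum()
        gH = np.outer(g, W2) * (1 - H**2)   # backprop through tanh
        gW1 = X.T @ gH
        gb1 = gH.sum(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Z: np.tanh(Z @ W1 + b1) @ W2 + b2

def fsnmc_sketch(X, mask, rounds=3, seed=0):
    """Illustrative alternating ("rotational") completion: mask[i, j] is
    True where X[i, j] is observed. Missing entries start at the column
    means of the observed values, then each round fits one MLP per
    feature on observed rows and overwrites missing entries with its
    predictions. Observed entries are never modified."""
    rng = np.random.default_rng(seed)
    X_hat = X.copy()
    col_means = np.nanmean(np.where(mask, X, np.nan), axis=0)
    r, c = np.where(~mask)
    X_hat[r, c] = col_means[c]
    d = X.shape[1]
    for _ in range(rounds):
        for j in range(d):
            others = [k for k in range(d) if k != j]
            obs = mask[:, j]                # rows where feature j is known
            predict = train_mlp(X_hat[obs][:, others], X_hat[obs, j], rng=rng)
            miss = ~obs
            if miss.any():
                X_hat[miss, j] = predict(X_hat[miss][:, others])
    return X_hat
```

As in the abstract, the self-supervision comes from treating each feature in turn as the regression target for the others, and the non-linearity from the per-feature MLPs; a practical implementation would add the paper's heuristic steps and a convergence check rather than a fixed round count.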