Transformer-CNN: Swiss knife for QSAR modeling and interpretation

Abstract: We present SMILES embeddings derived from the internal encoder state of a Transformer [1] model trained to canonicalize SMILES as a Seq2Seq problem. Using a CharNN [2] architecture on top of the embeddings yields higher-quality, interpretable QSAR/QSPR models on diverse benchmark datasets, including...


Bibliographic Details
Main Authors: Pavel Karpov, Guillaume Godin, Igor V. Tetko
Format: Article
Language: English
Published: BMC 2020-03-01
Series: Journal of Cheminformatics
Online Access: http://link.springer.com/article/10.1186/s13321-020-00423-w