Molding CNNs for text: Non-linear, non-consecutive convolutions
The success of deep learning often derives from well-chosen operational building blocks. In this work, we revise the temporal convolution operation in CNNs to better adapt it to text processing. Instead of concatenating word representations, we appeal to tensor algebra and use low-rank n-gram tensor...
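The non-consecutive, tensor-based convolution the abstract alludes to can be sketched with a simple recurrence. This is a hedged reconstruction based on the published paper (Lei et al., 2015): low-rank factor matrices `W1`, `W2`, `W3` replace a full 3-gram tensor, and running feature maps `f1`, `f2`, `f3` aggregate all (possibly non-consecutive) 3-grams, with a decay `lam` penalizing gaps between selected words. Function and variable names here are illustrative, not the authors' code.

```python
import numpy as np

def nonconsecutive_conv(X, W1, W2, W3, lam):
    """Low-rank non-consecutive 3-gram feature map (sketch).

    X:  (T, d) matrix of word vectors for a sentence of length T.
    W1, W2, W3: (d, h) factor matrices of the low-rank 3-gram tensor.
    lam: decay in [0, 1] that down-weights 3-grams with larger gaps.

    Returns H: (T, h), where H[t] aggregates all 3-grams ending at
    or before position t, weighted by lam**(span - 3).
    """
    T = X.shape[0]
    h = W1.shape[1]
    f1 = np.zeros(h)          # aggregated 1-gram features
    f2 = np.zeros(h)          # aggregated 2-gram features
    f3 = np.zeros(h)          # aggregated 3-gram features
    H = np.zeros((T, h))
    for t in range(T):
        p1, p2, p3 = X[t] @ W1, X[t] @ W2, X[t] @ W3
        # Update higher-order maps first so they see the *previous*
        # lower-order values (word t extends earlier partial n-grams).
        f3 = lam * f3 + f2 * p3
        f2 = lam * f2 + f1 * p2
        f1 = lam * f1 + p1
        H[t] = f3
    return H
```

With `lam = 1` and a sentence of exactly three words, `H[2]` reduces to the single elementwise product `(W1.T x0) * (W2.T x1) * (W3.T x2)`, which is a quick sanity check that the recurrence enumerates 3-grams correctly; a nonlinearity and pooling over `H` would follow in a full model.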
Main Authors: Lei, Tao (Contributor); Barzilay, Regina (Contributor); Jaakkola, Tommi S. (Contributor)
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science (Contributor)
Format: Article
Language: English
Published: Association for Computational Linguistics, 2017-07-18T14:54:24Z
Online Access: Get fulltext
Similar Items
- Style transfer from non-parallel text by cross-alignment
  by: Jaakkola, Tommi, et al.
  Published: (2021)
- Semi-supervised question retrieval with gated convolutions
  by: Lei, Tao, et al.
  Published: (2017)
- Learning to refine text based recommendations
  by: Gu, Youyang, et al.
  Published: (2021)
- Rationalizing Neural Predictions
  by: Lei, Tao, et al.
  Published: (2020)