Molding CNNs for text: Non-linear, non-consecutive convolutions
The success of deep learning often derives from well-chosen operational building blocks. In this work, we revise the temporal convolution operation in CNNs to better adapt it to text processing. Instead of concatenating word representations, we appeal to tensor algebra and use low-rank n-gram tensor...
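The abstract's key idea, replacing concatenation-based n-gram convolution with a low-rank tensor contraction over possibly non-consecutive words evaluated by dynamic programming, can be sketched in a few lines. The NumPy code below is an illustrative reconstruction under stated assumptions: `P`, `Q`, `R` stand in for hypothetical low-rank factors of a trigram tensor, `lam` is an assumed decay that down-weights patterns with intervening words, and the recurrence is one plausible form of such a dynamic program rather than the paper's verbatim formulation.

```python
# Hypothetical sketch: low-rank trigram "tensor" convolution over
# non-consecutive words, evaluated left to right by dynamic programming.
# P, Q, R play the role of low-rank factors of the 3-way n-gram tensor;
# lam is an assumed decay penalizing gaps between selected words.
import numpy as np

def lowrank_ngram_conv(X, P, Q, R, lam=0.5):
    """X: (seq_len, d) word vectors; P, Q, R: (h, d) factor matrices.
    Returns (seq_len, h) feature maps aggregating all (possibly
    non-consecutive) trigrams ending at or before each position."""
    n = X.shape[0]
    h = P.shape[0]
    s1 = np.zeros(h)  # decayed sum over unigram features seen so far
    s2 = np.zeros(h)  # ... over (possibly gapped) bigram features
    s3 = np.zeros(h)  # ... over (possibly gapped) trigram features
    out = np.zeros((n, h))
    for t in range(n):
        p, q, r = P @ X[t], Q @ X[t], R @ X[t]
        # Element-wise products realize the low-rank tensor contraction;
        # update s3 before s2 before s1 so each uses the previous step.
        s3 = lam * s3 + s2 * r
        s2 = lam * s2 + s1 * q
        s1 = lam * s1 + p
        out[t] = np.tanh(s3)  # non-linear activation on the feature map
    return out

# Usage: a random 10-word sentence, 5-dim embeddings, 8 feature maps.
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 5))
P, Q, R = (rng.standard_normal((8, 5)) * 0.1 for _ in range(3))
H = lowrank_ngram_conv(X, P, Q, R)
print(H.shape)  # (10, 8)
```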
| Main Authors: | Tao Lei, Regina Barzilay, Tommi Jaakkola |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Association for Computational Linguistics, 2017-07-18 |