Compact ConvNets with Ternary Weights and Binary Activations
Compact architectures, ternary weights, and binary activations are methods for making neural networks more efficient. We introduce a) a dithering binary activation, which improves the accuracy of ternary-weight networks with binary activations by randomizing the quantization error, and b) a metho...
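The dithering idea from the abstract can be illustrated with a minimal sketch: zero-mean uniform noise is added to the pre-activation before the sign quantizer, so the quantization error is decorrelated from the input. The function names, the `dither_amplitude` parameter, and the uniform noise distribution are assumptions for illustration, not details taken from the thesis.

```python
import numpy as np

def binary_activation(x):
    # Deterministic sign binarization: +1 for x >= 0, -1 otherwise.
    return np.where(x >= 0, 1.0, -1.0)

def dithered_binary_activation(x, dither_amplitude=1.0, rng=None):
    # Hypothetical sketch of a dithering binary activation: add zero-mean
    # uniform noise before the sign quantizer. `dither_amplitude` is an
    # assumed tuning parameter, not a value from the thesis.
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.uniform(-dither_amplitude / 2.0, dither_amplitude / 2.0,
                        size=np.shape(x))
    return np.where(x + noise >= 0, 1.0, -1.0)
```

Averaged over many noise draws, the expected output of the dithered quantizer approximates a clipped linear function of the input rather than a hard step, which is the usual motivation for dithering before coarse quantization.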
Main Author:
Format: Others
Language: English
Published: KTH, Robotics, Perception and Learning (RPL), 2017
Subjects:
Online Access: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-216389