Probabilistic Siamese Networks for Learning Representations

We explore the training of deep neural networks to produce vector representations using weakly labelled information in the form of binary similarity labels for pairs of training images. Previous methods, such as siamese networks and IMAX, have used fixed cost functions such as $L_1$ and $L_2$ norms and mutual information to drive the representations of similar images together and those of different images apart. In this work, we formulate learning as maximizing the likelihood of binary similarity labels for pairs of input images under a parameterized probabilistic similarity model. We describe and evaluate several forms of the similarity model that account for false positives and false negatives differently. We extract representations of MNIST, AT&T ORL and COIL-100 images and use them to obtain classification results, which we compare with state-of-the-art techniques such as deep neural networks and convolutional neural networks. We also study our method from a dimensionality reduction perspective.
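As a rough illustration of the idea described above, here is a minimal PyTorch sketch of one plausible probabilistic similarity model: a shared encoder maps each image to a vector, a sigmoid of a scaled and shifted Euclidean distance gives the probability that the pair is labelled similar, and training minimizes binary cross-entropy, i.e. maximizes the likelihood of the similarity labels. The encoder architecture and the distance-to-probability mapping are illustrative assumptions, not the parameterizations evaluated in the thesis.

    import torch
    import torch.nn as nn

    class ProbabilisticSiamese(nn.Module):
        """Twin encoder with a Bernoulli similarity head (illustrative sketch)."""
        def __init__(self, in_dim, embed_dim):
            super().__init__()
            # Shared encoder applied to both images of a pair (the siamese "twin").
            self.encoder = nn.Sequential(
                nn.Linear(in_dim, 256), nn.ReLU(),
                nn.Linear(256, embed_dim),
            )
            # Learnable scale and offset mapping distance to log-odds of similarity
            # (one simple assumed parameterization of the similarity model).
            self.a = nn.Parameter(torch.tensor(1.0))
            self.b = nn.Parameter(torch.tensor(0.0))

        def forward(self, x1, x2):
            z1, z2 = self.encoder(x1), self.encoder(x2)
            d = torch.norm(z1 - z2, dim=1)               # distance in representation space
            return torch.sigmoid(-self.a * d + self.b)   # P(pair is labelled similar)

    # Maximizing the likelihood of binary similarity labels is equivalent to
    # minimizing binary cross-entropy between P(similar) and the labels.
    model = ProbabilisticSiamese(in_dim=784, embed_dim=32)   # e.g. flattened 28x28 MNIST
    loss_fn = nn.BCELoss()
    x1, x2 = torch.randn(8, 784), torch.randn(8, 784)        # a batch of image pairs
    y = torch.randint(0, 2, (8,)).float()                    # 1 = similar, 0 = dissimilar
    loss = loss_fn(model(x1, x2), y)
    loss.backward()

Variants of the similarity model that treat false positives and false negatives asymmetrically would replace the symmetric sigmoid head above with a likelihood that weights the two error types differently.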


Bibliographic Details
Main Author: Liu, Chen
Other Authors: Frey, Brendan J.
Format: Thesis
Language: English (Canada)
Published: 2013
Subjects: Machine Learning; Neural Networks
Online Access: http://hdl.handle.net/1807/43097