Preparing Deep Belief Networks for Practical Tasks
Master's thesis === National Chung Cheng University (國立中正大學) === Graduate Institute of Electrical Engineering (電機工程研究所) === academic year 100 === Deep Belief Networks (DBNs) are probabilistic generative models composed of multiple layers of stochastic latent variables. The network can learn many layers of features on various types of data, such as binary i...
Main Authors: | Lu, LiWei 盧立偉 |
---|---|
Other Authors: | |
Format: | Others |
Language: | en_US |
Published: | 2012 |
Online Access: | http://ndltd.ncl.edu.tw/handle/68415284452640040030 |
id |
ndltd-TW-100CCU00442070 |
---|---|
record_format |
oai_dc |
spelling |
ndltd-TW-100CCU004420702015-10-13T21:07:19Z http://ndltd.ncl.edu.tw/handle/68415284452640040030 Preparing Deep Belief Networks for Practical Tasks 深度學習網路之研究及其應用 (Research on Deep Learning Networks and Their Applications) Lu, LiWei 盧立偉 Master's thesis === National Chung Cheng University === Graduate Institute of Electrical Engineering === academic year 100 === Deep Belief Networks (DBNs) are probabilistic generative models composed of multiple layers of stochastic latent variables. The network can learn many layers of features on various types of data, such as binary images, gray-scale images, color images, and acoustic data. This paper further examines the ability of DBNs to interpret binary representations of data. The performance is validated by learning given distributions, such as the normal distribution, the Poisson distribution, and the output of a random number generator. We have shown that Deep Belief Networks can successfully learn a probability distribution from a binary-encoded dataset. With this property, we can further extend DBNs to state- or property-prediction applications; we provide an example showing that DBNs can take multiple binary-encoded parameters as an input vector and predict the category to which these inputs belong. Generally, the sensory input of a DBN contains information belonging to a single timestep; that is, the prediction depends only on the current input. However, in some practical tasks, predictions depend not only on the current state but also on the history of states. We propose a method that combines DBNs with Echo State Networks (ESNs), using the reservoir of an ESN, a type of recurrent neural network, to encode the history of previous states, which leads us to an idea of artificial dreaming. Dr. N. Michael, Mayer 許宏銘 2012 Thesis 38 en_US |
collection |
NDLTD |
language |
en_US |
format |
Others |
sources |
NDLTD |
description |
Master's thesis === National Chung Cheng University (國立中正大學) === Graduate Institute of Electrical Engineering (電機工程研究所) === academic year 100 === Deep Belief Networks (DBNs) are probabilistic generative models composed of multiple layers of stochastic latent variables. The network can learn many layers of features on various types of data, such as binary images, gray-scale images, color images, and acoustic data. This paper further examines the ability of DBNs to interpret binary representations of data. The performance is validated by learning given distributions, such as the normal distribution, the Poisson distribution, and the output of a random number generator. We have shown that Deep Belief Networks can successfully learn a probability distribution from a binary-encoded dataset. With this property, we can further extend DBNs to state- or property-prediction applications; we provide an example showing that DBNs can take multiple binary-encoded parameters as an input vector and predict the category to which these inputs belong. Generally, the sensory input of a DBN contains information belonging to a single timestep; that is, the prediction depends only on the current input. However, in some practical tasks, predictions depend not only on the current state but also on the history of states. We propose a method that combines DBNs with Echo State Networks (ESNs), using the reservoir of an ESN, a type of recurrent neural network, to encode the history of previous states, which leads us to an idea of artificial dreaming.
|
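The abstract describes DBNs learning probability distributions from binary-encoded data. A minimal sketch of the core building block, a single restricted Boltzmann machine trained with one-step contrastive divergence on 8-bit binary encodings of normally distributed samples (the encoding scheme, layer sizes, learning rate, and epoch count are all illustrative assumptions, not details taken from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)

# Binary-encode real-valued samples into 8-bit vectors (LSB first).
# This quantization scheme is an assumption for illustration.
def encode8(x, lo=-4.0, hi=4.0):
    q = np.clip(((x - lo) / (hi - lo) * 255).astype(int), 0, 255)
    return ((q[:, None] >> np.arange(8)) & 1).astype(float)

data = encode8(rng.normal(size=2000))          # (2000, 8) binary matrix

n_vis, n_hid = 8, 16
W = 0.01 * rng.standard_normal((n_vis, n_hid))
a = np.zeros(n_vis)                            # visible biases
b = np.zeros(n_hid)                            # hidden biases
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.05
for epoch in range(30):
    for i in range(0, len(data), 50):          # mini-batches of 50
        v0 = data[i:i + 50]
        ph0 = sigmoid(v0 @ W + b)              # positive phase
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        pv1 = sigmoid(h0 @ W.T + a)            # one-step reconstruction
        v1 = (rng.random(pv1.shape) < pv1).astype(float)
        ph1 = sigmoid(v1 @ W + b)              # negative phase
        W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
        a += lr * (v0 - v1).mean(axis=0)
        b += lr * (ph0 - ph1).mean(axis=0)

# After training, reconstructions should track the data distribution.
ph = sigmoid(data @ W + b)
recon = sigmoid(ph @ W.T + a)
err = np.abs(data - recon).mean()
```

A full DBN stacks several such RBMs, feeding each layer's hidden activations to the next as training data; this sketch shows only the first layer.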
author2 |
Dr. N. Michael, Mayer |
author_facet |
Dr. N. Michael, Mayer Lu, LiWei 盧立偉 |
author |
Lu, LiWei 盧立偉 |
spellingShingle |
Lu, LiWei 盧立偉 Preparing Deep Belief Networks for Practical Tasks |
author_sort |
Lu, LiWei |
title |
Preparing Deep Belief Networks for Practical Tasks |
title_short |
Preparing Deep Belief Networks for Practical Tasks |
title_full |
Preparing Deep Belief Networks for Practical Tasks |
title_fullStr |
Preparing Deep Belief Networks for Practical Tasks |
title_full_unstemmed |
Preparing Deep Belief Networks for Practical Tasks |
title_sort |
preparing deep belief networks for practical tasks |
publishDate |
2012 |
url |
http://ndltd.ncl.edu.tw/handle/68415284452640040030 |
work_keys_str_mv |
AT luliwei preparingdeepbeliefnetworksforpracticaltasks AT lúlìwěi preparingdeepbeliefnetworksforpracticaltasks AT luliwei shēndùxuéxíwǎnglùzhīyánjiūjíqíyīngyòng AT lúlìwěi shēndùxuéxíwǎnglùzhīyánjiūjíqíyīngyòng |
_version_ |
1718055931446558720 |
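The abstract also proposes feeding DBNs with the state of an Echo State Network reservoir so that predictions can depend on the history of inputs, not just the current one. A minimal sketch of that idea, assuming a standard tanh reservoir rescaled to spectral radius below 1 (the reservoir size, input scaling, and spectral radius are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed random reservoir: a recurrent network whose state is a
# fading memory of the entire input history.
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # spectral radius 0.9 < 1

def run_reservoir(u_seq):
    """Drive the reservoir with a scalar input sequence; return all states."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Two sequences that end with the same inputs but differ earlier:
# the final reservoir states differ, so the state encodes history.
s1 = run_reservoir([1.0, 0.0, 0.0, 1.0])
s2 = run_reservoir([0.0, 1.0, 0.0, 1.0])
```

Because the reservoir state summarizes past inputs, binarizing it and presenting it to a DBN alongside the current input would give the network access to the history of states, which is the combination the abstract proposes.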