Burst Packet Loss Concealment and Noise Robust for Distributed Speech Recognition

碩士 (Master's degree) === 國立臺北科技大學 (National Taipei University of Technology) === 電腦與通訊研究所 (Institute of Computer and Communication) === 94 (academic year) === This thesis addresses burst packet loss concealment and noise-environment mismatch compensation for distributed speech recognition. First, for burst packet loss, we use a sub-frame interleaver to disperse burst losses at the front-end, …

Full description

Bibliographic Details
Main Authors: Cheng-Chang Lee, 李政璋
Other Authors: 廖元甫
Format: Others
Language: zh-TW
Published: 2006
Online Access: http://ndltd.ncl.edu.tw/handle/3dk6sm
id ndltd-TW-094TIT05652010
record_format oai_dc
title (zh-TW) 分散式語音辨認架構下之叢集性封包遺失隱藏及雜訊強健式語音辨認
note 學位論文 (degree thesis) ; 80
collection NDLTD
language zh-TW
format Others
sources NDLTD
description 碩士 (Master's degree) === 國立臺北科技大學 (National Taipei University of Technology) === 電腦與通訊研究所 (Institute of Computer and Communication) === 94 (academic year) === This thesis addresses burst packet loss concealment and noise-environment mismatch compensation for distributed speech recognition. First, for burst packet loss, we use a sub-frame interleaver to disperse burst losses at the front-end, reconstruct the lost feature vectors after a space transformation at the back-end, and finally apply an ARMA filter to smooth the discontinuities introduced by the reconstruction. Second, for noise-environment mismatch, an a priori knowledge interpolation (AKI) method is proposed to alleviate the problem of unseen environments. We evaluate the methodology on the Aurora2 noisy-digits database. In the packet loss concealment experiment, the average recognition rate over nine simulated channel conditions improves from 76.16% (ETSI baseline) to 93.06%. In the noise-environment mismatch compensation experiment, all test sets improve from the MVA baseline to the AKI method (set A: 92.09% to 92.91%; set B: 92.08% to 92.11%; set C: 91.71% to 92.37%).
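As a reading aid, the following is a minimal Python sketch, not the thesis's own implementation, of the two processing steps named in the description: a plain frame-level block interleaver standing in for the sub-frame interleaver, and an ARMA smoother in the form commonly used for MVA-style feature post-processing. The interleaver depth, the filter order M, and the 14-dimensional feature size are illustrative assumptions.

import numpy as np


def interleave(frames: np.ndarray, depth: int = 4) -> np.ndarray:
    """Block interleaver over frame indices: write row-wise into a
    (depth x width) block and read column-wise, so a burst of consecutive
    losses on the channel de-interleaves into isolated, widely spaced
    losses in the original frame order."""
    n = len(frames)
    pad = (-n) % depth                                    # pad to a full block
    idx = np.concatenate([np.arange(n), np.full(pad, -1)])  # -1 marks padding
    order = idx.reshape(depth, -1).T.ravel()              # column-wise read-out
    return frames[order[order >= 0]]


def arma_smooth(feats: np.ndarray, M: int = 2) -> np.ndarray:
    """ARMA filter (assumed MVA-style form) to soften discontinuities left
    after lost feature vectors have been reconstructed:
        y[t] = (y[t-M] + ... + y[t-1] + x[t] + ... + x[t+M]) / (2M + 1)"""
    y = feats.astype(float)
    T = len(feats)
    for t in range(M, T - M):
        y[t] = (y[t - M:t].sum(axis=0) + feats[t:t + M + 1].sum(axis=0)) / (2 * M + 1)
    return y


# Example flow: interleave at the client before packetisation; after
# de-interleaving and lost-frame reconstruction at the server, smooth
# the reconstructed feature stream.
feats = np.random.randn(100, 14)     # 100 frames of 14-dim features (assumed)
tx = interleave(feats, depth=4)      # client side, before packetisation
smoothed = arma_smooth(feats, M=2)   # server side, after reconstruction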
author2 廖元甫
author Cheng-Chang Lee
李政璋
title Burst Packet Loss Concealment and Noise Robust for Distributed Speech Recognition
publishDate 2006
url http://ndltd.ncl.edu.tw/handle/3dk6sm