Effective Flow Control on Self-similar Traffic in ATM Networks: An FIR Neural Network Approach
Master's thesis === National Chung Hsing University === Department of Electrical Engineering === 87

Author: WeiShi Lian (連偉錫)
Other Authors (advisor): Yen Chieh Ouyang (歐陽彥杰)
Format: Others (thesis)
Language: zh-TW
Published: 1999
Online Access: http://ndltd.ncl.edu.tw/handle/27616512516225319115
id: ndltd-TW-087NCHU0442020
record_format: oai_dc
Original title (zh-TW): ATM網路中自相似性交通流量之有效控制-以FIR類神經網路實現
Type: 學位論文 (thesis), 64 pages
collection: NDLTD
language: zh-TW
format: Others
sources: NDLTD
description:
Master's === National Chung Hsing University === Department of Electrical Engineering === 87 === Abstract
Asynchronous Transfer Mode (ATM) has been widely adopted to carry all conceivable media, including data, voice, and video, in Broadband ISDNs. To transport such a diverse mixture of traffic sources with differing Quality of Service (QoS) requirements, ATM networks must offer QoS guarantees for each class of traffic. Recent traffic measurement studies have demonstrated that variable bit rate (VBR) video over ATM networks exhibits self-similarity. Neglecting this characteristic leads to overly optimistic performance predictions and inadequate allocation of network resources. In other words, self-similarity has practical implications for the analysis, design, and control of ATM networks.
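A standard way to check a trace for the self-similarity described above is to estimate its Hurst parameter H; long-range-dependent traffic has H between 0.5 and 1, while uncorrelated traffic has H near 0.5. The following is a minimal sketch (not from the thesis) of the aggregated-variance estimator, assuming NumPy and using synthetic white noise as a stand-in trace, which should yield H close to 0.5:

```python
import numpy as np

def hurst_aggregated_variance(x, block_sizes):
    """Estimate the Hurst parameter H by the aggregated-variance method.

    For each block size m, average the series over non-overlapping blocks
    and take the sample variance of the aggregated series.  For a
    self-similar process Var(X^(m)) ~ m^(2H - 2), so a log-log fit of
    variance against m has slope 2H - 2.
    """
    x = np.asarray(x, dtype=float)
    log_m, log_var = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        agg = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_var.append(np.log(agg.var()))
    slope, _ = np.polyfit(log_m, log_var, 1)
    return 1.0 + slope / 2.0

rng = np.random.default_rng(0)
white = rng.normal(size=100_000)   # uncorrelated stand-in trace
h = hurst_aggregated_variance(white, [10, 20, 50, 100, 200, 500])
print(round(h, 2))
```

Applied to a genuinely self-similar trace (e.g. fractional Gaussian noise or an MPEG VBR trace), the same estimator would report H well above 0.5.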
A neural network can learn from experience: given examples of input-output pairs, it infers the relation between inputs and outputs without that relation being specified explicitly, and it generalizes the learned experience to produce correct outputs when new situations are encountered. With an appropriate training process, a neural network can therefore learn such relations and produce accurate outputs even when confronted with new input data. Because a portion of self-similar traffic exhibits the same statistical characteristics as the original trace, and similar traffic patterns recur in the future, a network trained on that portion can accurately predict the entire self-similar traffic stream.
The finite-duration impulse response (FIR) multilayer network, a member of the time-delay neural network (TDNN) family, is employed to predict the number of incoming cells in the next time slot. Based on the prediction provided by the FIR multilayer network, the proposed feedback rate regulator decreases the cell loss rate and significantly improves network resource utilization. In this thesis, the feedback rate regulator for self-similar VBR traffic in ATM networks is built on a multiple leaky buckets (MLB) mechanism. A multiplexing gain is assumed to exist for aggregated self-similar VBR traffic; therefore, in contrast to the conventional leaky bucket (LB), the leaky rate and buffer capacity of all LBs on the same virtual path are shared so that network resources are used more effectively. In the MLB mechanism, the leaky rate and buffer capacity of each LB are dynamically adjusted according to buffer occupancy.
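The defining feature of an FIR/TDNN predictor is its tapped delay line: each prediction is computed from a sliding window of past cell counts. The sketch below (an illustration, not the thesis implementation) shows that windowing step with NumPy; a linear FIR predictor fitted by least squares stands in for the trained multilayer network, and the noisy periodic trace is a hypothetical placeholder for a real cell-count series:

```python
import numpy as np

def make_windows(series, order):
    """Tapped delay line: each row holds `order` past samples;
    the target is the next sample (one-step-ahead prediction)."""
    X = np.stack([series[i : i + order] for i in range(len(series) - order)])
    y = series[order:]
    return X, y

# Hypothetical stand-in trace: a noisy periodic cell-count series.
rng = np.random.default_rng(1)
t = np.arange(2000)
trace = 50 + 20 * np.sin(2 * np.pi * t / 25) + rng.normal(0, 2, size=t.size)

order = 25                       # number of taps in the delay line
X, y = make_windows(trace, order)
split = 1500                     # train on the first portion of the trace
# Linear FIR predictor fitted by least squares; in the thesis the
# mapping from window to prediction is a trained multilayer network.
w, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)
pred = X[split:] @ w
rmse = float(np.sqrt(np.mean((pred - y[split:]) ** 2)))
print(round(rmse, 2))
```

Training on an early portion and predicting a later one mirrors the thesis's premise: a segment of self-similar traffic is statistically representative of the whole, so a predictor fitted on it remains accurate later.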
To validate the performance of the proposed mechanism (MLB with FIR prediction), ten real-world MPEG1 traffic traces and synthesized self-similar data series are used in the experiments. Simulation results show that the cell loss rate improves on the conventional leaky bucket method by a factor of more than ten thousand.
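The conventional leaky bucket used as the comparison baseline can be simulated per time slot in a few lines. This is a minimal sketch, not the thesis's simulator; the bursty on/off arrival process, leak rate, and buffer capacity are all illustrative assumptions, and the cell loss rate is simply dropped cells over offered cells:

```python
import random

def leaky_bucket_loss(arrivals, leak_rate, capacity):
    """Simulate a conventional leaky bucket, slot by slot.

    arrivals[t] cells arrive in slot t; up to `leak_rate` cells drain
    each slot; cells beyond `capacity` in the buffer are dropped.
    Returns the cell loss rate (dropped / offered).
    """
    buf = dropped = offered = 0
    for a in arrivals:
        offered += a
        buf += a
        if buf > capacity:          # overflow: excess cells are lost
            dropped += buf - capacity
            buf = capacity
        buf = max(0, buf - leak_rate)
    return dropped / offered if offered else 0.0

# Bursty synthetic arrivals (a placeholder for a self-similar trace):
# bursts of 40 cells in roughly one slot out of four, mean ~10 cells/slot.
random.seed(0)
arrivals = [random.choice([0, 0, 0, 40]) for _ in range(10_000)]
clr = leaky_bucket_loss(arrivals, leak_rate=12, capacity=100)
print(f"cell loss rate: {clr:.4f}")
```

The MLB mechanism of the thesis differs from this baseline in that several such buckets share leak rate and buffer capacity on a virtual path, with each bucket's share adjusted dynamically from buffer occupancy and the FIR network's traffic prediction.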
Keywords: ATM, self-similar, QoS, neural network, multiple leaky buckets, congestion control.
author2: Yen Chieh Ouyang (歐陽彥杰)
author: WeiShi Lian (連偉錫)
title: Effective Flow Control on Self-similar Traffic in ATM Networks: An FIR Neural Network Approach
publishDate: 1999
url: http://ndltd.ncl.edu.tw/handle/27616512516225319115