Performance Optimization of Semi-supervised Distributed Learning from Non-IID Data
Main Authors: Chieh-Sheng Wang (王捷生)
Other Authors: 逄愛君
Format: Others
Language: en_US
Published: 2019
Online Access: http://ndltd.ncl.edu.tw/handle/at73fd
id: ndltd-TW-107NTU05392037
record_format: oai_dc
spelling: ndltd-TW-107NTU053920372019-11-16T05:27:54Z http://ndltd.ncl.edu.tw/handle/at73fd Performance Optimization of Semi-supervised Distributed Learning from Non-IID Data 非獨立同分布之影像資料使用於半監督分散式學習的效能優化 Chieh-Sheng Wang 王捷生 Master's === National Taiwan University === Graduate Institute of Computer Science and Information Engineering === 107 === Modern mobile end devices, such as smartphones and dash cams, collect large volumes of data that can be useful for various IoT and AI applications. However, as privacy concerns rise, users are unwilling to provide their data to a server for further analysis, which involves deep learning model training. Google therefore proposed Federated Learning, offering distributed end devices a way to train a deep learning model without passing their private data to a centralized server. In real-world circumstances, however, the data on end devices are non-independent and identically distributed (non-IID), which may cause weight divergence during training and eventually result in a considerable decrease in model performance. In this thesis, we propose an innovative Federated Learning scheme in which we design a new operation, Federated Swapping (FedSwap), to replace some Federated Averaging (FedAvg) operations based on a small amount of shared data during federated training, in order to alleviate the adverse impact of weight divergence. We implement our method on both image classification, using CIFAR-10 benchmark data, and object detection, using real-world video data. Experimental results show that the accuracy of image classification increases by 3.8% and that the object detection task improves by 1.1%. 逄愛君 2019 thesis 13 en_US
collection: NDLTD
language: en_US
format: Others
sources: NDLTD
description: Master's === National Taiwan University === Graduate Institute of Computer Science and Information Engineering === 107 === Modern mobile end devices, such as smartphones and dash cams, collect large volumes of data that can be useful for various IoT and AI applications. However, as privacy concerns rise, users are unwilling to provide their data to a server for further analysis, which involves deep learning model training. Google therefore proposed Federated Learning, offering distributed end devices a way to train a deep learning model without passing their private data to a centralized server. In real-world circumstances, however, the data on end devices are non-independent and identically distributed (non-IID), which may cause weight divergence during training and eventually result in a considerable decrease in model performance. In this thesis, we propose an innovative Federated Learning scheme in which we design a new operation, Federated Swapping (FedSwap), to replace some Federated Averaging (FedAvg) operations based on a small amount of shared data during federated training, in order to alleviate the adverse impact of weight divergence. We implement our method on both image classification, using CIFAR-10 benchmark data, and object detection, using real-world video data. Experimental results show that the accuracy of image classification increases by 3.8% and that the object detection task improves by 1.1%.
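The description above contrasts standard FedAvg aggregation with the proposed FedSwap operation. As a rough illustration only — the function names are invented here, and the cyclic-swap interpretation of FedSwap is an assumption, not the thesis's actual implementation — the two operations might be sketched as:

```python
import random

def fedavg(client_weights, client_sizes):
    """FedAvg-style aggregation: average clients' model parameters,
    weighted by each client's number of local training samples.

    client_weights: list of dicts mapping parameter name -> list of floats.
    client_sizes:   number of local samples per client.
    """
    total = sum(client_sizes)
    averaged = {}
    for name in client_weights[0]:
        averaged[name] = [
            sum(w[name][i] * n for w, n in zip(client_weights, client_sizes)) / total
            for i in range(len(client_weights[0][name]))
        ]
    return averaged

def fedswap(client_weights, rng=random):
    """Hypothetical swapping step: instead of averaging, pass each client's
    model to another client (here, a random cyclic rotation), so models are
    exposed to differently distributed local data."""
    k = rng.randrange(1, len(client_weights))
    return client_weights[k:] + client_weights[:k]
```

In this reading, replacing some averaging rounds with swap rounds lets each model see other clients' non-IID data distributions directly, rather than collapsing divergent weights into a single average every round.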
author2: 逄愛君
author_facet: 逄愛君; Chieh-Sheng Wang 王捷生
author: Chieh-Sheng Wang 王捷生
spellingShingle: Chieh-Sheng Wang 王捷生; Performance Optimization of Semi-supervised Distributed Learning from Non-IID Data
author_sort: Chieh-Sheng Wang
title: Performance Optimization of Semi-supervised Distributed Learning from Non-IID Data
title_short: Performance Optimization of Semi-supervised Distributed Learning from Non-IID Data
title_full: Performance Optimization of Semi-supervised Distributed Learning from Non-IID Data
title_fullStr: Performance Optimization of Semi-supervised Distributed Learning from Non-IID Data
title_full_unstemmed: Performance Optimization of Semi-supervised Distributed Learning from Non-IID Data
title_sort: performance optimization of semi-supervised distributed learning from non-iid data
publishDate: 2019
url: http://ndltd.ncl.edu.tw/handle/at73fd
work_keys_str_mv: AT chiehshengwang performanceoptimizationofsemisuperviseddistributedlearningfromnoniiddata; AT wángjiéshēng performanceoptimizationofsemisuperviseddistributedlearningfromnoniiddata; AT chiehshengwang fēidúlìtóngfēnbùzhīyǐngxiàngzīliàoshǐyòngyúbànjiāndūfēnsànshìxuéxídexiàonéngyōuhuà; AT wángjiéshēng fēidúlìtóngfēnbùzhīyǐngxiàngzīliàoshǐyòngyúbànjiāndūfēnsànshìxuéxídexiàonéngyōuhuà
_version_: 1719291612329148416