Using Deep Learning to Develop An Automatic EEG-Based Sleep Staging Method for Mice

Bibliographic Details
Main Author: Cheng-Wei Chen (陳政緯)
Other Authors: Chen-Wen Yen
Format: Others
Language: zh-TW
Published: 2019
Online Access: http://ndltd.ncl.edu.tw/handle/eu48ue
Description
Summary: Master's thesis === National Sun Yat-sen University === Department of Mechanical and Electro-Mechanical Engineering === Academic year 107 === By employing deep learning models, this work develops an automatic EEG-based sleep staging method for mice. Sleep is divided into four categories: rapid eye movement (REM), non-rapid eye movement (NREM), wake, and seizure periods. The experimental mice comprise two groups, SD and Wistar. In addition, EEG signals were measured under different conditions, including before and after electric shocks and with the lights on or off. The deep learning architecture used in this work combines a convolutional neural network (CNN), an inception network, and long short-term memory (LSTM): the inception network extracts the characteristics of different frequency bands, whereas the LSTM network models the time-series properties of the EEG signals. Based on the outputs of the deep learning models, this study introduces a neighboring rule and a compensation rule to further enhance the sleep staging results. For the REM stage, the results of the deep learning model are summarized as follows: the training accuracy and Kappa index are 0.935 and 0.688, respectively, and the testing accuracy and Kappa index are 0.925 and 0.652. After applying the neighboring and compensation rules, the training accuracy and Kappa index improve to 0.961 and 0.825, respectively, and the testing accuracy and Kappa index improve to 0.950 and 0.782.
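The abstract reports Cohen's kappa alongside accuracy and post-processes the network's epoch labels with a "neighboring rule", but the record does not define that rule. The sketch below is only an illustration: `cohen_kappa` is the standard chance-corrected agreement statistic the abstract refers to, while `smooth_neighbors` is a hypothetical single-epoch smoothing pass in the spirit of a neighboring rule, not the author's actual rule (which is defined in the full thesis).

```python
from collections import Counter

def cohen_kappa(y_true, y_pred):
    """Cohen's kappa: agreement between two label sequences, corrected
    for the agreement expected by chance from the class frequencies."""
    assert len(y_true) == len(y_pred) and y_true
    n = len(y_true)
    observed = sum(t == p for t, p in zip(y_true, y_pred)) / n
    true_counts = Counter(y_true)
    pred_counts = Counter(y_pred)
    expected = sum(true_counts[c] * pred_counts.get(c, 0)
                   for c in true_counts) / (n * n)
    return (observed - expected) / (1 - expected)

def smooth_neighbors(stages):
    """Hypothetical neighboring rule (assumption, not from the thesis):
    an epoch whose two neighbors agree with each other but not with it
    is relabeled to match them, removing isolated one-epoch glitches."""
    out = list(stages)
    for i in range(1, len(out) - 1):
        if out[i - 1] == out[i + 1] != out[i]:
            out[i] = out[i - 1]
    return out
```

For example, `smooth_neighbors(["NREM", "REM", "NREM", "NREM"])` relabels the isolated REM epoch to NREM; sleep stages persist over many consecutive epochs, so such single-epoch disagreements are usually classifier noise rather than real transitions.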