Summary: | Master's === National Kaohsiung University of Applied Sciences === Master's and Doctoral Program, Department of Electrical Engineering === 104 === Recently, resource services delivered from distant data centers have become unacceptable for real-time applications on mobile devices because of long network latency. Mobile edge computing therefore proposes pushing data centers toward the edge of the network in order to reduce the latency of delivering cloud services to mobile devices. However, because of user mobility, edge servers must track and predict the moving paths of mobile devices and prepare for service handoff as early as possible in order to achieve seamless service handoff. On the other hand, one service instance must serve multiple users to save the resources and cost of deploying the same service repeatedly. Carefully allocating service instances to proper edge servers for execution is therefore essential, because the position of a service instance is an important factor in determining its response time to its clients. To address these issues, this study proposes a location-aware service handoff and deployment (LASHD) framework for mobile edge computing. In this framework, edge servers cooperatively apply K-means to divide users into a number of clusters based on the geographical positions of their mobile devices, and then deploy K service instances onto the edge servers closest to the center positions of the user clusters. In addition, data centers apply a Hidden Markov Model (HMM) to learn the historical moving paths of users, and provide the learned model to edge servers so that they can predict the moving direction of their clients and which edge server is most likely to take over those clients; the predicted edge servers are then asked to prepare the resources necessary for service handoff in advance. This study has evaluated the effectiveness and efficiency of the proposed framework.
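The clustering-based deployment step can be sketched as follows. This is only an illustrative toy, not the thesis implementation: the device positions, server names, and coordinates are all hypothetical, and a plain K-means is written out so the example is self-contained.

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain K-means on 2-D (x, y) device positions.

    Returns (centroids, labels); a toy sketch of the clustering step,
    not the implementation used in the thesis.
    """
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: attach each device to its nearest centroid.
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda c: math.dist(p, centroids[c]))
        # Update step: move each centroid to the mean of its cluster.
        for c in range(k):
            members = [p for i, p in enumerate(points) if labels[i] == c]
            if members:
                centroids[c] = (sum(x for x, _ in members) / len(members),
                                sum(y for _, y in members) / len(members))
    return centroids, labels

def nearest_edge_server(centroid, servers):
    """Pick the edge server geographically closest to a cluster center."""
    return min(servers, key=lambda s: math.dist(centroid, servers[s]))

# Hypothetical data: device positions and candidate edge-server locations.
devices = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.25), (5.0, 5.1), (5.2, 4.9)]
servers = {"edge-A": (0.0, 0.0), "edge-B": (5.0, 5.0), "edge-C": (10.0, 0.0)}

centroids, labels = kmeans(devices, k=2)
placement = sorted(nearest_edge_server(c, servers) for c in centroids)
print(placement)  # ['edge-A', 'edge-B']
```

With K=2, the two service instances land on the two edge servers nearest the cluster centers, leaving the distant `edge-C` unused; this is the sense in which the deployment is location-aware.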
The evaluation results show that the proposed framework can efficiently and properly deploy service instances onto edge servers, reducing service response time by 5~22%, and can effectively reduce service handoff latency by 8~26% through the prediction of moving paths and the early preparation of the necessary service resources.
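The handoff-prediction step can likewise be sketched with a tiny HMM. All numbers and names below are made up for illustration: the hidden states are edge-server coverage regions, the observations are coarse position readings, and the transition/emission matrices stand in for the model the data centers would learn from historical moving paths.

```python
# Hidden states: which edge server's region the user is in.
STATES = ["edge-A", "edge-B"]

# P(next state | current state): illustrative drift from A's region to B's.
TRANS = {"edge-A": {"edge-A": 0.7, "edge-B": 0.3},
         "edge-B": {"edge-A": 0.1, "edge-B": 0.9}}
# P(observation | state): coarse, noisy position readings.
EMIT = {"edge-A": {"near-A": 0.7, "border": 0.2, "near-B": 0.1},
        "edge-B": {"near-A": 0.1, "border": 0.2, "near-B": 0.7}}

def forward_filter(observations, prior):
    """HMM forward algorithm: belief over the user's current region."""
    belief = dict(prior)
    for obs in observations:
        # Predict one step ahead, then weight by the emission likelihood.
        predicted = {s: sum(belief[p] * TRANS[p][s] for p in STATES)
                     for s in STATES}
        unnorm = {s: predicted[s] * EMIT[s][obs] for s in STATES}
        z = sum(unnorm.values())
        belief = {s: unnorm[s] / z for s in STATES}
    return belief

def predict_handoff_target(observations):
    """Most probable region one step ahead, i.e. the server to pre-warm."""
    belief = forward_filter(observations, {"edge-A": 0.5, "edge-B": 0.5})
    next_belief = {s: sum(belief[p] * TRANS[p][s] for p in STATES)
                   for s in STATES}
    return max(next_belief, key=next_belief.get)

# A user drifting from A's region toward B's: edge-B should pre-warm.
print(predict_handoff_target(["near-A", "border", "near-B"]))  # edge-B
```

The predicted server can then be asked to prepare service resources before the user actually crosses over, which is what shrinks the handoff latency in the evaluation above.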
|