Gaze estimation in unconstrained environments
Gaze estimation in unconstrained environments, where subjects are free to move without wearing any device, faces great challenges due to variations in eye appearance, occlusion by the eyelids, large head movements, different viewing angles, and varying illumination conditions. The main contribution o...
Main Author: | Cai, Haibin |
---|---|
Other Authors: | Liu, Honghai ; Ju, Zhaojie ; Tan, Jiacheng |
Published: | University of Portsmouth, 2018 |
Subjects: | 004 |
Online Access: | https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.765704 |
id |
ndltd-bl.uk-oai-ethos.bl.uk-765704 |
record_format |
oai_dc |
spelling |
ndltd-bl.uk-oai-ethos.bl.uk-765704 2019-03-05T15:59:12Z Gaze estimation in unconstrained environments Cai, Haibin Liu, Honghai ; Ju, Zhaojie ; Tan, Jiacheng 2018 Gaze estimation in unconstrained environments, where subjects are free to move without wearing any device, faces great challenges due to variations in eye appearance, occlusion by the eyelids, large head movements, different viewing angles, and varying illumination conditions. The main contribution of this thesis lies in the development of several algorithms for eye center localization and gaze estimation. Firstly, a novel convolution-based integro-differential operator (CIDO) is proposed to detect the eye center quickly by designing different kinds of kernels to convolve the eye images. Its low computational cost and accurate localization performance enable CIDO to be easily integrated into real-time gaze-related applications. Based on the theory of CIDO, a radial integro-differential method (RIDM) is proposed to further improve eye center localization accuracy. Experimental results on three publicly available datasets demonstrate that RIDM outperforms state-of-the-art methods. Secondly, a normalized iris-center-eye-corner vector (NICEC) based gaze estimation method is proposed, which improves on traditional pupil center corneal reflection (PCCR) based methods by removing the requirement for additional IR light sources. To overcome the influence of head movements, the thesis further proposes a simplified eye-model-based gaze estimation method which outperforms many state-of-the-art methods and achieves an average estimation error of 1.99° under free head movements. Thirdly, based on the proposed eye center localization and gaze estimation methods, a real-time multi-sensory fusion framework is proposed to estimate gaze in an unconstrained environment. The proposed system facilitates efficient and effective multi-sensory fusion and addresses significant challenges in multi-modal data acquisition, fusion, and interpretation. Experimental results show that not only can the system handle large head movements, but it can also be applied to analyzing the gaze behavior of children with autism spectrum disorder (ASD). 004 University of Portsmouth https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.765704 https://researchportal.port.ac.uk/portal/en/theses/gaze-estimation-in-unconstrained-environments(5c391e0b-4026-4415-a1e1-8995b622d246).html Electronic Thesis or Dissertation |
collection |
NDLTD |
sources |
NDLTD |
topic |
004 |
spellingShingle |
004 Cai, Haibin Gaze estimation in unconstrained environments |
description |
Gaze estimation in unconstrained environments, where subjects are free to move without wearing any device, faces great challenges due to variations in eye appearance, occlusion by the eyelids, large head movements, different viewing angles, and varying illumination conditions. The main contribution of this thesis lies in the development of several algorithms for eye center localization and gaze estimation. Firstly, a novel convolution-based integro-differential operator (CIDO) is proposed to detect the eye center quickly by designing different kinds of kernels to convolve the eye images. Its low computational cost and accurate localization performance enable CIDO to be easily integrated into real-time gaze-related applications. Based on the theory of CIDO, a radial integro-differential method (RIDM) is proposed to further improve eye center localization accuracy. Experimental results on three publicly available datasets demonstrate that RIDM outperforms state-of-the-art methods. Secondly, a normalized iris-center-eye-corner vector (NICEC) based gaze estimation method is proposed, which improves on traditional pupil center corneal reflection (PCCR) based methods by removing the requirement for additional IR light sources. To overcome the influence of head movements, the thesis further proposes a simplified eye-model-based gaze estimation method which outperforms many state-of-the-art methods and achieves an average estimation error of 1.99° under free head movements. Thirdly, based on the proposed eye center localization and gaze estimation methods, a real-time multi-sensory fusion framework is proposed to estimate gaze in an unconstrained environment. The proposed system facilitates efficient and effective multi-sensory fusion and addresses significant challenges in multi-modal data acquisition, fusion, and interpretation. Experimental results show that not only can the system handle large head movements, but it can also be applied to analyzing the gaze behavior of children with autism spectrum disorder (ASD). |
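The abstract only sketches the convolution-based integro-differential idea at a high level. The Python snippet below is a minimal illustration of that general idea, assuming a dark iris on a brighter sclera: the eye patch is convolved with ring-shaped kernels at several radii, and the location with the strongest dark-to-bright radial intensity jump is taken as the eye center. The function names (`ring_kernel`, `locate_eye_center`), the kernel design, and the radius range are assumptions of this sketch, not the CIDO or RIDM implementation from the thesis.

```python
# Illustrative sketch only: a convolution-based approximation of an
# integro-differential eye center locator (not the thesis's CIDO/RIDM code).
import numpy as np
from scipy.ndimage import convolve


def ring_kernel(radius, thickness=1.5):
    """Normalized ring-shaped kernel approximating a circular contour integral."""
    half = int(np.ceil(radius + thickness))
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    dist = np.sqrt(xs ** 2 + ys ** 2)
    ring = (np.abs(dist - radius) <= thickness).astype(float)
    return ring / ring.sum()


def locate_eye_center(gray, radii=range(6, 16)):
    """Return (row, col) of the strongest dark-circle response over a range of radii.

    The response at each pixel is the mean intensity on an outer ring minus the
    mean intensity on an inner ring; a dark iris surrounded by brighter sclera
    gives a large positive value, mimicking the radial derivative of the
    classical integro-differential operator with plain convolutions.
    """
    gray = gray.astype(float)
    best_score, best_center = -np.inf, (0, 0)
    for r in radii:
        inner = convolve(gray, ring_kernel(r), mode="nearest")
        outer = convolve(gray, ring_kernel(r + 2), mode="nearest")
        response = outer - inner                     # radial intensity jump at radius r
        idx = np.unravel_index(np.argmax(response), response.shape)
        if response[idx] > best_score:
            best_score, best_center = response[idx], idx
    return best_center


# Usage on a synthetic eye patch: a bright background with a dark disc ("iris").
if __name__ == "__main__":
    img = np.full((60, 80), 200.0)
    yy, xx = np.mgrid[:60, :80]
    img[(yy - 30) ** 2 + (xx - 45) ** 2 <= 10 ** 2] = 40.0   # dark disc centered at (30, 45)
    print(locate_eye_center(img))                             # expected near (30, 45)
```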
author2 |
Liu, Honghai ; Ju, Zhaojie ; Tan, Jiacheng |
author_facet |
Liu, Honghai ; Ju, Zhaojie ; Tan, Jiacheng Cai, Haibin |
author |
Cai, Haibin |
author_sort |
Cai, Haibin |
title |
Gaze estimation in unconstrained environments |
title_short |
Gaze estimation in unconstrained environments |
title_full |
Gaze estimation in unconstrained environments |
title_fullStr |
Gaze estimation in unconstrained environments |
title_full_unstemmed |
Gaze estimation in unconstrained environments |
title_sort |
gaze estimation in unconstrained environments |
publisher |
University of Portsmouth |
publishDate |
2018 |
url |
https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.765704 |
work_keys_str_mv |
AT caihaibin gazeestimationinunconstrainedenvironments |
_version_ |
1718999439953100800 |