Show simple item record

dc.contributor.author: Kaymak, Sertan
dc.date.accessioned: 2015-09-29T12:00:47Z
dc.date.available: 2015-09-29T12:00:47Z
dc.date.issued: 2015-01
dc.identifier.citation: Kaymak, S. 2015. Real-time appearance-based gaze tracking. Queen Mary University of London. [en_US]
dc.identifier.uri: http://qmro.qmul.ac.uk/xmlui/handle/123456789/8949
dc.description: PhD [en_US]
dc.description.abstract: Gaze tracking technology is widely used in Human-Computer Interaction applications, such as interfaces for assisting people with disabilities and driver attention monitoring. However, commercially available gaze trackers are expensive, and their performance deteriorates if the user is not positioned in front of the camera and facing it; head motion or being far from the device also degrades their accuracy. This thesis focuses on the development of real-time appearance-based gaze tracking algorithms using low-cost devices, such as a webcam or Kinect. The proposed algorithms are developed by considering accuracy, robustness to head pose variation, and the ability to generalise to different persons. In order to deal with head pose variation, we propose to estimate the head pose and then compensate for the appearance change and the bias it introduces to a gaze estimator. Head pose is estimated by a novel method that utilizes tensor-based regressors at the leaf nodes of a random forest. As a baseline gaze estimator we use an SVM-based appearance regressor. To compensate for the appearance variation introduced by the head pose we use a geometric model, and to compensate for the bias we use a regression function learned from training data. Our methods are evaluated on publicly available datasets. [en_US]
dc.language.iso: en [en_US]
dc.publisher: Queen Mary University of London [en_US]
dc.subject: Electronic Engineering [en_US]
dc.subject: Human-computer interaction [en_US]
dc.subject: Gaze tracking [en_US]
dc.subject: Webcam [en_US]
dc.subject: Vision [en_US]
dc.title: Real-time appearance-based gaze tracking. [en_US]
dc.type: Thesis [en_US]
dc.rights.holder: The copyright of this thesis rests with the author, and no quotation from it or information derived from it may be published without the prior written consent of the author.
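The abstract above describes a pose-compensated gaze pipeline: estimate head pose, warp the eye appearance toward a frontal view, apply a baseline appearance-based gaze regressor, then subtract the pose-induced bias. A minimal sketch of how those stages might compose is below; every function here is an illustrative stand-in (fixed values and a linear bias model), not the thesis's actual tensor-forest, geometric model, or SVM implementation.

```python
import numpy as np

# Illustrative stand-ins only -- the thesis uses a random forest with
# tensor-based regressors at the leaves, a geometric warping model, and
# an SVM-based appearance regressor in place of these toy functions.

def estimate_head_pose(face_patch):
    # Stand-in for the tensor-regressor random forest:
    # returns a fixed (yaw, pitch) in degrees for illustration.
    return np.array([10.0, -5.0])

def compensate_appearance(eye_patch, pose):
    # The geometric model would warp the eye patch to a frontal pose;
    # identity transform here for illustration.
    return eye_patch

def baseline_gaze(eye_patch):
    # Stand-in for the SVM-based appearance regressor:
    # returns fixed gaze angles (degrees) for illustration.
    return np.array([2.0, 1.0])

def bias_correction(pose):
    # Learned regression mapping head pose to the gaze bias it induces;
    # a hypothetical linear model here.
    W = np.array([[0.1, 0.0],
                  [0.0, 0.1]])
    return W @ pose

def estimate_gaze(face_patch, eye_patch):
    pose = estimate_head_pose(face_patch)
    frontal_eye = compensate_appearance(eye_patch, pose)
    gaze = baseline_gaze(frontal_eye)
    return gaze - bias_correction(pose)

print(estimate_gaze(np.zeros((64, 64)), np.zeros((32, 64))))
```

The key design point the abstract makes is that head pose is handled twice: once in appearance space (warping before regression) and once in output space (subtracting a learned bias from the regressor's prediction).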


This item appears in the following Collection(s)

  • Theses [2761]
    Theses Awarded by Queen Mary University of London
