Head-Movement Compensation for Visible-Spectrum Remote Eye Tracker

  • IMABUCHI Takashi
    Graduate School of Software and Information Science, Iwate Prefectural University
  • PRIMA Oky Dicky Ardiansyah
    Graduate School of Software and Information Science, Iwate Prefectural University
  • ITO Hisayoshi
    Graduate School of Software and Information Science, Iwate Prefectural University
  • KAMEDA Masashi
    Graduate School of Software and Information Science, Iwate Prefectural University

Bibliographic Information

Other Title
  • 頭部姿勢の変動を考慮した可視光線非接触型の視線計測システムの開発 (Development of a visible-light, non-contact gaze measurement system that accounts for head-pose variation)

Description

<p>Eye trackers are used in various fields such as analysis of eye movement-related brain activity, visual attention analysis, and gaze input interfaces. Recently, these trackers have improved in terms of both accuracy and portability. In this study, we propose a new eye tracking method that estimates gaze direction from faces in images or videos without a conventional eye tracker, and we evaluate it by measuring the accuracy of points of regard (POR) to assess its suitability as a gaze input interface. The proposed method comprises four steps: fitting a 3D model to the target facial image, extracting and geometrically correcting the eye regions using head-pose information, fitting a circle to each iris to localize its center, and compensating the POR for head-pose changes. Our experiment with 11 subjects shows that the average error between each fixation target and the estimated POR is about 2 degrees for head movements within ±5 degrees and about 3.5 degrees for head movements of ±5 to ±20 degrees.</p>
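The abstract does not specify how the circle is fitted to each iris (step 3). A minimal sketch of one standard approach, an algebraic (Kåsa) least-squares circle fit to candidate iris-edge pixels, is shown below; the function name and the assumption that edge points are already extracted are illustrative, not from the paper:

```python
import numpy as np

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit.

    points: (N, 2) array of candidate iris-edge pixel coordinates.
    Solves x^2 + y^2 + D*x + E*y + F = 0 for (D, E, F) in the
    least-squares sense, then recovers center and radius.
    Returns (cx, cy, r).
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])  # design matrix for D, E, F
    b = -(x ** 2 + y ** 2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -D / 2.0, -E / 2.0
    r = np.sqrt(cx ** 2 + cy ** 2 - F)
    return cx, cy, r

# Usage: recover a synthetic iris boundary (center (3, -2), radius 5).
theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
edge = np.column_stack([3 + 5 * np.cos(theta), -2 + 5 * np.sin(theta)])
cx, cy, r = fit_circle(edge)
```

The recovered center (cx, cy) would serve as the iris-center estimate that the head-pose compensation of step 4 then operates on; a robust variant (e.g. RANSAC over edge points) would typically be needed on real, partially occluded irises.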
