Multimodal Human-Computer Interaction
(for more information, see my publications)


The EyeTalk project seeks to build an intelligent multimodal user interface for the car environment so that useful human interaction with the car navigation system (CNS) can occur in a natural manner. I am the project coordinator; my responsibilities are to manage development and to build the eye-gaze tracking module. EyeTalk includes the following modules:
[eye-gaze tracking] [lip-reading] [speech recognition]  
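The sketch below is a minimal, hypothetical illustration (not the actual EyeTalk implementation) of how the outputs of the three modules could be fused into a single CNS command: the verbal channels (speech recognition and lip reading) supply the spoken command, while eye gaze supplies the on-screen target. All class names, fields, and thresholds here are illustrative assumptions.

```python
# Hypothetical fusion of the three EyeTalk modalities into one CNS command.
# Every name below is an assumption for illustration only.

from dataclasses import dataclass


@dataclass
class GazeEstimate:
    screen_x: float      # gaze point on the CNS display, normalized to 0..1
    screen_y: float
    confidence: float


@dataclass
class SpeechHypothesis:
    text: str            # recognized utterance, e.g. "zoom in here"
    confidence: float


def fuse_command(gaze: GazeEstimate,
                 speech: SpeechHypothesis,
                 lip_speech: SpeechHypothesis) -> dict:
    """Resolve a CNS command by combining modalities.

    The two verbal channels vote on the utterance; gaze supplies the
    deictic target ("here") on the map display.
    """
    # Prefer whichever verbal channel is more confident; lip reading can
    # back up speech recognition when road noise degrades the audio.
    utterance = speech if speech.confidence >= lip_speech.confidence else lip_speech

    command = {"action": utterance.text, "confidence": utterance.confidence}

    # Attach the gaze point as the target when the utterance is deictic
    # and the gaze estimate is reliable enough (threshold is arbitrary).
    if "here" in utterance.text and gaze.confidence > 0.5:
        command["target"] = (gaze.screen_x, gaze.screen_y)
    return command


if __name__ == "__main__":
    # Example: the driver looks at a map location and says "zoom in here".
    print(fuse_command(GazeEstimate(0.42, 0.37, 0.9),
                       SpeechHypothesis("zoom in here", 0.6),
                       SpeechHypothesis("zoom in here", 0.8)))
```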

PowerPoint slides of the EyeTalk project

