A few days ago, my colleague Min-Gyu asked me if I knew of any eye-tracking system. For his project, he needs a way to know where people are looking, in order to make his experiments effective (I am sorry, but this is all I can say). I did not have one in my pockets, so I took a look around the Internet to see what the state of the art was. I discovered that some companies are applying eye-tracking to advertisements: they want to know where people look, to find out whether the money they spend on publicity makes any sense.
In another part of the world, there are people working on eye-tracking techniques to help people with disabilities (letting them use a computer just by looking at it), and others doing it just for fun. Searching among these people, I found the work of Tristan Hume who, based on the scientific work in this paper, is implementing an algorithm for robust and feasible eye-tracking with a webcam.
I tried the code and it worked smoothly, as Tristan promised. Since I had a free hour and a bored Nao robot, I put some code on top of his and implemented the same eye-tracking system for the Nao robot. Now, when you look at the robot, it can know where you are looking, so be careful!
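For the curious: the core idea in that line of work is gradient-based eye-centre localisation, where the pupil centre is taken to be the point that best agrees with the image gradients around it (at the pupil boundary, gradients point radially outward from the centre). Here is a toy sketch of that idea on a synthetic image; this is my own illustrative Python/numpy code under those assumptions, not the actual code of Tristan's project or mine:

```python
import numpy as np

def eye_centre(img):
    """Brute-force gradient-based eye-centre localisation (toy version).

    For each candidate centre c, score how well the unit displacement
    vectors from c to strong-gradient pixels align with the unit image
    gradients at those pixels; return the best-scoring candidate.
    """
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    mask = mag > 0.1 * mag.max()          # keep only strong gradients
    ys, xs = np.nonzero(mask)
    gxn = gx[mask] / mag[mask]            # unit gradient components
    gyn = gy[mask] / mag[mask]

    h, w = img.shape
    best, best_score = (0, 0), -1.0
    for cy in range(h):
        for cx in range(w):
            dx, dy = xs - cx, ys - cy
            d = np.hypot(dx, dy)
            d[d == 0] = 1.0               # avoid division by zero at c itself
            dots = (dx * gxn + dy * gyn) / d   # cos of angle between d and g
            score = np.mean(np.maximum(dots, 0.0) ** 2)
            if score > best_score:
                best_score, best = score, (cx, cy)
    return best

# Synthetic "pupil": a dark disc on a bright background, centred at (24, 15)
h = w = 40
yy, xx = np.mgrid[0:h, 0:w]
img = np.where((xx - 24) ** 2 + (yy - 15) ** 2 < 36, 0.0, 1.0)

cx, cy = eye_centre(img)
print(cx, cy)  # should land at or very near (24, 15)
```

The real implementations add the refinements from the paper (weighting dark pixels, working on a cropped eye region found by a face detector, etc.), but the scoring loop above is the heart of the approach.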
I have uploaded the code to GitHub; maybe someone will find it interesting.