
Maybe you could place 3-4 30-60FPS cheapish CMOS cams around the edge of the screen and stagger their frame captures? You'd get different angles (better for detecting eye vectors) and increase the sampling rate.
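The staggering idea amounts to phase-shifting each camera's trigger within one frame period. A minimal sketch (hypothetical numbers; real cameras would need hardware trigger support to hold these phases):

```python
def stagger_offsets(n_cams: int, fps: float) -> list[float]:
    """Trigger offsets (in seconds) that interleave n_cams cameras evenly.

    Each camera runs at `fps`, but with triggers phase-shifted by
    1/(n_cams * fps), the combined stream samples eye position at
    n_cams * fps.
    """
    frame_period = 1.0 / fps
    return [i * frame_period / n_cams for i in range(n_cams)]

# Example: 4 cameras at 30 FPS -> one frame every 1/120 s overall.
offsets = stagger_offsets(4, 30.0)
```

So four 30 FPS cams give an effective 120 Hz sampling rate, though each individual gaze estimate still comes from a single viewpoint unless frames are fused across cameras.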


You’d likely want IR cameras plus IR emitters to flood-illuminate the area outside the human visual spectrum.


(Can't reply to your other comment for some reason)

Why is the IR part of the spectrum better for the cameras? Is it that an IR image of my eye makes it easier to see the features that determine where it's looking?


IR is better because you can provide illumination via IR flood lighting that isn’t visible to the human eye.

This is effectively how the Face ID system on your iPhone works regardless of lighting conditions.

That means that even in dark conditions you can get much higher-quality imaging, albeit at a limited range.


Ah, interesting! So you can ensure you get illumination on the eyes w/out shining annoying, visible light at them. Very cool.





