I am experimenting with the Pupil eye tracker and could set it up (almost) smoothly on macOS. The documentation is excellent, and my first goal was simply to record raw data and extract the eye position.
from IPython.display import HTML
HTML('<center><video controls autoplay loop src="http://blog.invibe.net/files/2017-12-13_pupil%20test_480.mp4" width=61.8%/></center>')
This video shows the world view (cranio-centric, from a head-mounted camera fixed on the frame) with the position of the (right) eye overlaid while I am configuring a text box. You can see the eye fixating on the screen, then jumping elsewhere on the screen (saccades) or to the keyboard / hands. Note that the screen itself shows the world view, so this generates a self-recurrent pattern.
For this, I used the Pupil Capture application, and I will demonstrate here how to extract the raw data in a few lines of Python code.