
Access gaze coordinates in real time

Jul 23, 2012 at 9:35 PM


Thank you for all your work on Ogama!  I am an undergrad student doing a summer research project on gaze tracking and user interaction.  My group is using Gaze Tracker ITU within Ogama.

I was wondering if it is possible to access the gaze data in real time.  Right now we are simply extracting the data from Ogama and analyzing it after running an experiment.  But we are attempting to use the gaze data as an interactive tool (i.e., if the user looks at a certain position the program will respond one way; if they look elsewhere it will respond differently).  If anyone has ideas on how to extract or stream the data in real time, it would be greatly appreciated!


Thank you so much!


Aug 27, 2012 at 5:22 PM

Hi Chelsea,

The standalone ITU GazeTracker has a built-in data server that streams the collected gaze data via UDP; you can subscribe to it with any network client you write. Ogama does this as well.

Regards, Adrian
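For anyone who wants to try this outside of Ogama, here is a minimal sketch of such a UDP subscriber in Python. The port number (6666) and the assumption that each datagram carries one ASCII-encoded, zero-padded gaze sample are guesses based on this thread; check the GazeTracker's data-server settings for the actual values.

```python
import socket

def decode_datagram(data):
    """Strip trailing NUL padding and decode the payload as text."""
    return data.decode("ascii", errors="replace").rstrip("\x00")

def stream_gaze(port=6666, count=10):
    """Collect `count` gaze datagrams from the tracker's UDP stream.

    Port 6666 is an assumption -- use whatever port the GazeTracker
    data server is configured to send to.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    samples = []
    try:
        while len(samples) < count:
            data, _addr = sock.recvfrom(1024)  # one datagram per sample
            samples.append(decode_datagram(data))
    finally:
        sock.close()
    return samples
```

Each decoded string can then be split on whitespace to pick out the timestamp and x/y coordinates, in whatever order the tracker actually sends them.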

Feb 25, 2014 at 3:35 PM
Edited Feb 25, 2014 at 3:36 PM
Does anyone know which format the ITU tracker streams the data in? I tried to find out from the ITU forum, but it seems inactive and nothing about this is mentioned.

I tried to set up a UDP receive module in Simulink. I do receive something, but it is always the same value and not nearly the amount of data I should get (and what does arrive makes no sense at all).
I read that the ITU tracker updates at 170 Hz, so I sample at 0.005 seconds. The ports work, the buffer size seems sufficient, and I think I have tried nearly every possible option for the message's data type. Does anybody know where the fundamental flaw is?
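As a sanity check on that sample time: if the tracker really updates at 170 Hz, the interval between samples works out as follows, so a 0.005 s sample time polls slightly faster than new data arrives. A rough match like this is fine, but the small mismatch means some Simulink steps will see no fresh datagram.

```python
rate_hz = 170                 # reported tracker update rate
interval_s = 1 / rate_hz      # time between consecutive samples
print(round(interval_s, 4))   # 0.0059
```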
Mar 12, 2014 at 5:04 PM
For all the other coding illiterates like me who want to use the UDP stream of the ITU tracker with Matlab Simulink, here are my insights so far:

Simple example:
  • use the UDP receive block
  • set it to int8 for the data format
  • connect the block to a signal to workspace block
  • start the ITU tracker (data server on)
  • let the Simulink model run
You get a vector of 42 int8 values.

Convert this vector (trimming all the zeros at the end) with native2unicode and you get the gaze position as text.
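The same conversion in Python, for comparison with the Matlab native2unicode step. The sample payload below is made up for illustration; the real 42 int8 values come from the UDP receive block, and the actual field order in the message may differ.

```python
# Hypothetical payload: ASCII text padded with zeros at the end,
# as the int8 vector from the UDP receive block appears to be.
payload = b"1234 0.42 0.37" + b"\x00" * 28   # 42 bytes total

# Trim the trailing zeros, then decode the remaining bytes as text
# (the equivalent of Matlab's native2unicode).
text = payload.rstrip(b"\x00").decode("ascii")
print(text)   # 1234 0.42 0.37
```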

Sorry, most of you will find this primitive and/or obvious but for me it was a great breakthrough :-)

Mar 12, 2014 at 5:10 PM
Edited Mar 12, 2014 at 5:14 PM
Next steps that need to be solved, and where I would be grateful for help:
  • Find out how pupil dilation can be extracted
  • How to obtain more than one data row / UDP package
I would like to create an application with Simulink that shows cognitive load in real time. I will combine this with priming, and therefore I need OGAMA.

Mar 24, 2014 at 10:55 AM
Hi Ewald,
in the GTNetworkClient.GazeData struct there are two properties, PupilDiameterLeft and PupilDiameterRight.
To get streaming data you would have to subscribe to the GazeData.OnGazeData event, but I do not know how this is done in Simulink. Maybe you just need a loop that reads the UDP block multiple times.
Kind regards,