MouseGaze simulates an eye tracker using the computer mouse.
Platforms:
    Windows 7 / 10
    Linux
    macOS
Required Python Version:
    Python 3.6+
Supported Models:
    Any mouse. ;)
To start ioHub with the MouseGaze simulated eye tracker, add the full ioHub device name as a kwarg passed to launchHubServer:
eyetracker.hw.mouse.EyeTracker
Examples
Start ioHub with the MouseGaze simulated eye tracker:
from psychopy.iohub import launchHubServer
from psychopy.core import getTime, wait
iohub_config = {'eyetracker.hw.mouse.EyeTracker': {}}
io = launchHubServer(**iohub_config)
# Get the eye tracker device.
tracker = io.devices.tracker
Print all eye tracker events received for 2 seconds:
# Check for and print any eye tracker events received...
tracker.setRecordingState(True)
stime = getTime()
while getTime()-stime < 2.0:
    for e in tracker.getEvents():
        print(e)
Print current eye position for 5 seconds:
# Check for and print current eye position every 100 msec.
stime = getTime()
while getTime()-stime < 5.0:
    print(tracker.getPosition())
    wait(0.1)
tracker.setRecordingState(False)
# Stop the ioHub Server
io.quit()
Clears any DeviceEvents that have occurred since the last call to the device's getEvents() or clearEvents() methods.
Note that calling clearEvents() at the device level only clears the given device’s event buffer. The ioHub Process’s Global Event Buffer is unchanged.
Parameters: None
Returns: None
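For example, the device event buffer can be cleared just before a trial starts so that only events from that trial are returned later (a minimal sketch using the tracker device from the examples above):

# Discard any eye tracker events queued before the trial starts.
tracker.clearEvents()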
enableEventReporting is functionally identical to the eye tracker device-specific setRecordingState method.
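A minimal sketch of the equivalence, using the tracker device from the examples above (either call starts event reporting / recording):

# The two calls below have the same effect for the eye tracker device:
tracker.enableEventReporting(True)   # generic ioHub device method
tracker.setRecordingState(True)      # eye tracker specific method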
Retrieve the configuration settings information used to create the device instance. This will be the default settings for the device, found in iohub.devices.<device_name>.default_<device_name>.yaml, updated with any device settings provided via launchHubServer(…).
Changing any values in the returned dictionary has no effect on the device state.
Parameters: None
Returns: (dict) The dictionary of the device configuration settings used to create the device.
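For example, the returned dictionary can be inspected to confirm which MouseGaze settings are in effect (a minimal sketch; the 'runtime_settings' and 'controls' keys correspond to the default device settings shown later on this page):

cfg = tracker.getConfiguration()
# Inspect the merged settings; changing this dict does not affect the device.
print(cfg.get('runtime_settings'))
print(cfg.get('controls'))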
Retrieve any DeviceEvents that have occurred since the last call to the device’s getEvents() or clearEvents() methods.
Note that calling getEvents() at a device level does not change the Global Event Buffer’s contents.
Parameters:
    event_type_id (int) – If specified, provides the ioHub DeviceEvent ID for which events should be returned. Events that have occurred but do not match the specified event ID are ignored. Event type IDs can be accessed via the EventConstants class; all available event types are class attributes of EventConstants.
    clearEvents (int) – Can be used to indicate if the events being returned should also be removed from the device event buffer. True (the default) removes the returned events from the device event buffer. False results in events being left in the device event buffer.
    asType (str) – Optional kwarg giving the object type to return events as. Valid values include 'namedtuple' (the default).
Returns: (list) New events that the ioHub has received since the last getEvents() or clearEvents() call to the device. Events are ordered by the ioHub time of each event, with the oldest event at index 0. The event object type is determined by the asType parameter passed to the method. By default a namedtuple object is returned for each event.
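For example, following the parameters described above, only MouseGaze sample events can be requested while leaving other event types in the device buffer (a minimal sketch; EventConstants is imported from psychopy.iohub.constants):

from psychopy.iohub.constants import EventConstants

# Return only MouseGaze sample events and leave other event types in the buffer.
samples = tracker.getEvents(event_type_id=EventConstants.MONOCULAR_EYE_SAMPLE,
                            clearEvents=False)
for s in samples:
    print(s.time, s.gaze_x, s.gaze_y)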
The getLastGazePosition method returns the most recent eye gaze position received from the eye tracker. This is the position on the calibrated 2D surface that the eye tracker is reporting as the current eye position, given in the units in use by the ioHub Display device.
If binocular recording is being performed, the average position of both eyes is returned.
If no samples have been received from the eye tracker, or the eye tracker is not currently recording data, None is returned.
Parameters: None
Returns:
    None: If the eye tracker is not currently recording data or no eye samples have been received.
    tuple: Latest (gaze_x, gaze_y) position of the eye(s).
If this method is not supported by the eye tracker interface, EyeTrackerConstants.EYETRACKER_INTERFACE_METHOD_NOT_SUPPORTED is returned.
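Because the return value is None until samples are available, callers typically check the type before unpacking (a minimal sketch):

gpos = tracker.getLastGazePosition()
# A (gaze_x, gaze_y) tuple is only available while recording, after samples arrive.
if isinstance(gpos, (tuple, list)):
    gaze_x, gaze_y = gpos
    print('Latest gaze position:', gaze_x, gaze_y)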
The getLastSample method returns the most recent eye sample received from the Eye Tracker. The Eye Tracker must be in a recording state for a sample event to be returned, otherwise None is returned.
Parameters: None
Returns:
    None: If the eye tracker is not currently recording data.
    EyeSample: If the eye tracker is recording in a monocular tracking mode, the latest sample event of this event type is returned.
    BinocularEyeSample: If the eye tracker is recording in a binocular tracking mode, the latest sample event of this event type is returned.
If this method is not supported by the eye tracker interface, EyeTrackerConstants.FUNCTIONALITY_NOT_SUPPORTED is returned.
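A minimal sketch of polling the newest sample while recording, assuming the returned sample exposes the monocular sample fields listed later on this page:

sample = tracker.getLastSample()
if sample is not None:
    # MouseGaze records monocularly, so a monocular sample event is expected.
    print(sample.time, sample.gaze_x, sample.gaze_y)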
See getLastGazePosition().
isRecordingEnabled returns the recording state from the eye tracking device.
True == the device is recording data; False == recording is not occurring.
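A minimal sketch of checking the recording state before polling the current position:

if tracker.isRecordingEnabled():
    print('MouseGaze is recording; current gaze position:', tracker.getPosition())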
runSetupProcedure displays a mock calibration procedure. No calibration is actually done.
MouseGaze generates monocular eye samples. A MonocularEyeSampleEvent is created every 10 or 20 msec depending on the sampling_rate set for the device.
The following fields of the MonocularEyeSample event are supported:
The MonocularEyeSampleEvent represents the eye position and eye attribute data collected from one frame or reading of an eye tracker device that is recording one eye of a participant.
Event Type ID: EventConstants.MONOCULAR_EYE_SAMPLE
Event Type String: 'MONOCULAR_EYE_SAMPLE'
time: Time of the event, in sec.msec format, using the psychopy timebase.
gaze_x: The horizontal position of MouseGaze on the computer screen, in Display Coordinate Type Units.
gaze_y: The vertical position of MouseGaze on the computer screen, in Display Coordinate Type Units.
pupil_measure1: MouseGaze pupil diameter, static at 5 mm.
status: Indicates if the eye sample contains 'valid' position data. 0 = MouseGaze position is valid. 2 = MouseGaze position is missing (in simulated blink).
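Given these fields, a minimal sketch of consuming MouseGaze samples during recording (the status check skips samples generated during a simulated blink; EventConstants is imported from psychopy.iohub.constants):

from psychopy.iohub.constants import EventConstants

for sample in tracker.getEvents(event_type_id=EventConstants.MONOCULAR_EYE_SAMPLE):
    if sample.status == 0:
        # 0 = valid simulated gaze position (not inside a simulated blink).
        print(sample.time, sample.gaze_x, sample.gaze_y, sample.pupil_measure1)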
MouseGaze also creates basic fixation, saccade, and blink events based on mouse event data.
A FixationStartEvent is generated when the beginning of an eye fixation (in very general terms, a period of relatively stable eye position) is detected by the eye tracker's sample parsing algorithms.
Event Type ID: EventConstants.FIXATION_START
Event Type String: ‘FIXATION_START’
time: Time of the event, in sec.msec format, using the psychopy timebase.
eye: EyeTrackerConstants.RIGHT_EYE.
gaze_x: The horizontal 'eye' position on the computer screen at the start of the fixation. Units are the same as the Window units.
gaze_y: The vertical 'eye' position on the computer screen at the start of the fixation. Units are the same as the Window units.
A FixationEndEvent is generated when the end of an eye fixation (in very general terms, a period of relatively stable eye position) is detected by the eye tracker's sample parsing algorithms.
Event Type ID: EventConstants.FIXATION_END
Event Type String: ‘FIXATION_END’
time: Time of the event, in sec.msec format, using the psychopy timebase.
eye: EyeTrackerConstants.RIGHT_EYE.
start_gaze_x: The horizontal 'eye' position on the computer screen at the start of the fixation. Units are the same as the Window units.
start_gaze_y: The vertical 'eye' position on the computer screen at the start of the fixation. Units are the same as the Window units.
end_gaze_x: The horizontal 'eye' position on the computer screen at the end of the fixation. Units are the same as the Window units.
end_gaze_y: The vertical 'eye' position on the computer screen at the end of the fixation. Units are the same as the Window units.
average_gaze_x: Average calibrated horizontal eye position during the fixation, specified in Display Units.
average_gaze_y: Average calibrated vertical eye position during the fixation, specified in Display Units.
duration: Duration of the fixation in sec.msec format.
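As a usage sketch, fixation end events can be retrieved like any other ioHub event; the attribute names follow the field list above:

from psychopy.iohub.constants import EventConstants

for ev in tracker.getEvents(event_type_id=EventConstants.FIXATION_END):
    print('Fixation lasted %.3f sec, average gaze (%.2f, %.2f)' %
          (ev.duration, ev.average_gaze_x, ev.average_gaze_y))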
Default device settings:

eyetracker.hw.mouse.EyeTracker:
    # True = Automatically start reporting events for this device when the experiment starts.
    # False = Do not start reporting events for this device until enableEventReporting(True)
    # is called for the device.
    auto_report_events: False

    # Should eye tracker events be saved to the ioHub DataStore file when the device
    # is recording data?
    save_events: True

    # Should eye tracker events be sent to the Experiment process when the device
    # is recording data?
    stream_events: True

    # How many eye events (including samples) should be saved in the ioHub event buffer before
    # old eye events start being replaced by new events. When the event buffer reaches
    # the maximum event length of the buffer defined here, older events will start to be dropped.
    event_buffer_length: 1024

    runtime_settings:
        # How many samples / second should MouseGaze generate.
        # 50 or 100 Hz are supported.
        sampling_rate: 50

        # MouseGaze always generates monocular right eye samples.
        track_eyes: RIGHT_EYE

    controls:
        # Mouse button used to make a MouseGaze position change.
        # LEFT_BUTTON, MIDDLE_BUTTON, RIGHT_BUTTON.
        move: RIGHT_BUTTON

        # Mouse button(s) used to make MouseGaze generate a blink event.
        # LEFT_BUTTON, MIDDLE_BUTTON, RIGHT_BUTTON.
        blink: [LEFT_BUTTON, RIGHT_BUTTON]

        # Threshold for saccade generation. Specified in visual degrees.
        saccade_threshold: 0.5

    # MouseGaze creates (minimally populated) fixation, saccade, and blink events.
    monitor_event_types: [MonocularEyeSampleEvent, FixationStartEvent, FixationEndEvent, SaccadeStartEvent, SaccadeEndEvent, BlinkStartEvent, BlinkEndEvent]
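Any of these defaults can be overridden by passing the changed values in the device configuration dict given to launchHubServer (a minimal sketch, assuming a 100 Hz sampling rate and left-button gaze movement are wanted; unspecified settings keep their default values):

from psychopy.iohub import launchHubServer

eyetracker_config = {'runtime_settings': {'sampling_rate': 100},
                     'controls': {'move': 'LEFT_BUTTON'}}
io = launchHubServer(**{'eyetracker.hw.mouse.EyeTracker': eyetracker_config})
tracker = io.devices.tracker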
Last Updated: March, 2021