MouseGaze

MouseGaze simulates an eye tracker using the computer mouse.

Platforms:

  • Windows 7 / 10

  • Linux

  • macOS

Required Python Version:

  • Python 3.6 +

Supported Models:

  • Any Mouse. ;)

Additional Software Requirements

None

EyeTracker Class

class psychopy.iohub.devices.eyetracker.hw.mouse.EyeTracker[source]

To start ioHub with the MouseGaze simulated eye tracker, add the full iohub device name as a kwarg passed to launchHubServer:

eyetracker.hw.mouse.EyeTracker

Examples

  1. Start ioHub with the Mouse Simulated eye tracker:

    from psychopy.iohub import launchHubServer
    from psychopy.core import getTime, wait
    
    iohub_config = {'eyetracker.hw.mouse.EyeTracker': {}}
    
    io = launchHubServer(**iohub_config)
    
    # Get the eye tracker device.
    tracker = io.devices.tracker
    
  2. Print all eye tracker events received for 2 seconds:

    # Check for and print any eye tracker events received...
    tracker.setRecordingState(True)
    
    stime = getTime()
    while getTime()-stime < 2.0:
        for e in tracker.getEvents():
            print(e)
    
  3. Print current eye position for 5 seconds:

    # Check for and print current eye position every 100 msec.
    stime = getTime()
    while getTime()-stime < 5.0:
        print(tracker.getPosition())
        wait(0.1)
    
    tracker.setRecordingState(False)
    
    # Stop the ioHub Server
    io.quit()
    
clearEvents(event_type=None, filter_id=None, call_proc_events=True)

Clears any DeviceEvents that have occurred since the last call to the device’s getEvents() or clearEvents() methods.

Note that calling clearEvents() at the device level only clears the given device’s event buffer. The ioHub Process’s Global Event Buffer is unchanged.

Parameters:

None

Returns:

None
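
For example, a minimal sketch (assuming the io and tracker objects from the examples above) that discards any buffered eye events before a trial starts:

    # Drop any eye tracker events buffered before the trial begins, so that
    # subsequent getEvents() calls only return events from the trial itself.
    tracker.clearEvents()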

enableEventReporting(enabled=True)[source]

enableEventReporting is functionally identical to the eye tracker device-specific setRecordingState() method.

getConfiguration()

Retrieve the configuration settings information used to create the device instance. This will be the default settings for the device, found in iohub.devices.<device_name>.default_<device_name>.yaml, updated with any device settings provided via launchHubServer(…).

Changing any values in the returned dictionary has no effect on the device state.

Parameters:

None

Returns:

The dictionary of the device configuration settings used to create the device.

Return type:

(dict)
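
As an illustrative sketch (again assuming the tracker object from the examples above), the returned dictionary can be inspected to confirm the settings the device was created with:

    # Print the configuration the MouseGaze device was created with,
    # e.g. the runtime_settings -> sampling_rate value.
    config = tracker.getConfiguration()
    for setting, value in config.items():
        print(setting, ':', value)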

getEvents(*args, **kwargs)

Retrieve any DeviceEvents that have occurred since the last call to the device’s getEvents() or clearEvents() methods.

Note that calling getEvents() at a device level does not change the Global Event Buffer’s contents.

Parameters:
  • event_type_id (int) – If specified, provides the ioHub DeviceEvent ID for which events should be returned. Events that have occurred but do not match the specified event ID are ignored. Event type IDs can be accessed via the EventConstants class; all available event types are class attributes of EventConstants.

  • clearEvents (int) – Can be used to indicate if the events being returned should also be removed from the device event buffer. True (the default) removes the returned events from the device event buffer; False results in events being left in the device event buffer.

  • asType (str) – Optional kwarg giving the object type to return events as. Valid values include 'namedtuple' (the default).

Returns:

New events that the ioHub has received since the last getEvents() or clearEvents() call to the device. Events are ordered by the ioHub time of each event, with the oldest event at index 0. The event object type is determined by the asType parameter passed to the method. By default a namedtuple object is returned for each event.

Return type:

(list)
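
For instance, a hedged sketch that requests only eye sample events as namedtuples, using the event_type_id and asType kwargs described above together with the EventConstants class from psychopy.iohub.constants (and the tracker object from the examples above):

    from psychopy.iohub.constants import EventConstants

    # Request only monocular eye sample events, returned as namedtuples.
    samples = tracker.getEvents(event_type_id=EventConstants.MONOCULAR_EYE_SAMPLE,
                                asType='namedtuple')
    for s in samples:
        print(s.time, s.gaze_x, s.gaze_y)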

getLastGazePosition()

The getLastGazePosition method returns the most recent eye gaze position received from the Eye Tracker. This is the position on the calibrated 2D surface that the eye tracker is reporting as the current eye position. The units are the same as those in use by the ioHub Display device.

If binocular recording is being performed, the average position of both eyes is returned.

If no samples have been received from the eye tracker, or the eye tracker is not currently recording data, None is returned.

Parameters:

None

Returns:

int: EyeTrackerConstants.EYETRACKER_INTERFACE_METHOD_NOT_SUPPORTED if this method is not supported by the eye tracker interface.

None: If the eye tracker is not currently recording data or no eye samples have been received.

tuple: Latest (gaze_x, gaze_y) position of the eye(s).

Return type:

int, None, or tuple
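
A short sketch of the typical guard around this return value (assuming the tracker object from the examples above):

    # getLastGazePosition() may return None (or a constant) rather than a
    # position, so check the type before unpacking.
    gpos = tracker.getLastGazePosition()
    if isinstance(gpos, (tuple, list)):
        gaze_x, gaze_y = gpos
        print('Gaze position:', gaze_x, gaze_y)
    else:
        print('No gaze position available (not recording or in a simulated blink).')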

getLastSample()

The getLastSample method returns the most recent eye sample received from the Eye Tracker. The Eye Tracker must be in a recording state for a sample event to be returned, otherwise None is returned.

Parameters:

None

Returns:

int: EyeTrackerConstants.FUNCTIONALITY_NOT_SUPPORTED if this method is not supported by the eye tracker interface.

None: If the eye tracker is not currently recording data.

EyeSample: If the eye tracker is recording in a monocular tracking mode, the latest sample event of this type is returned.

BinocularEyeSample: If the eye tracker is recording in a binocular tracking mode, the latest sample event of this type is returned.

Return type:

int, None, EyeSample, or BinocularEyeSample
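
Similarly, a minimal sketch that reads the latest sample only when one is available:

    # None is returned while the tracker is not recording.
    sample = tracker.getLastSample()
    if sample is not None:
        # For MouseGaze this is the most recent (monocular) eye sample.
        print(sample)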

getPosition()

See getLastGazePosition().

isRecordingEnabled()[source]

isRecordingEnabled returns the recording state from the eye tracking device.

Returns:

True == the device is recording data; False == recording is not occurring

Return type:

bool

runSetupProcedure(calibration_args={})[source]

runSetupProcedure displays a mock calibration procedure. No calibration is actually done.

setRecordingState(recording)[source]

setRecordingState is used to start or stop the recording of data from the eye tracking device.
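
Taken together, a hedged sketch of a typical recording block using runSetupProcedure, setRecordingState, and isRecordingEnabled (assuming the tracker object and the wait import from the examples above, and that a PsychoPy Window is available for the mock calibration graphics):

    # Show the mock calibration screen (no real calibration is performed).
    tracker.runSetupProcedure()

    # Record MouseGaze data for one second, confirming the recording state.
    tracker.setRecordingState(True)
    if tracker.isRecordingEnabled():
        wait(1.0)
    tracker.setRecordingState(False)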

trackerSec()[source]

Same as trackerTime().

trackerTime()[source]

Current eye tracker time.

Returns:

current eye tracker time in seconds.

Return type:

float
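
For example, a brief sketch showing that the two methods report the same clock (assuming the tracker object from the examples above):

    # Both report the eye tracker's current time in seconds.
    print('trackerTime:', tracker.trackerTime())
    print('trackerSec: ', tracker.trackerSec())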

Supported Event Types

MouseGaze generates monocular eye samples. A MonocularEyeSampleEvent is created every 20 or 10 msec, depending on whether the device's sampling_rate is set to 50 or 100 Hz.

The following fields of the MonocularEyeSample event are supported:

class psychopy.iohub.devices.eyetracker.BinocularEyeSampleEvent(*args, **kwargs)[source]

The BinocularEyeSampleEvent event represents the eye position and eye attribute data collected from one frame or reading of an eye tracker device that is recording both eyes of a participant.

Event Type ID: EventConstants.BINOCULAR_EYE_SAMPLE

Event Type String: ‘BINOCULAR_EYE_SAMPLE’

time

time of event, in sec.msec format, using psychopy timebase.

gaze_x

The horizontal position of MouseGaze on the computer screen, in Display Coordinate Type Units.

gaze_y

The vertical position of MouseGaze on the computer screen, in Display Coordinate Type Units.

left_pupil_measure_1

MouseGaze pupil diameter, static at 5 mm.

status

Indicates if eye sample contains ‘valid’ position data. 0 = MouseGaze position is valid. 2 = MouseGaze position is missing (in simulated blink).
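
For instance, a minimal sketch (assuming the tracker object from the examples above, the device is recording, and the event_type_id kwarg documented for getEvents() earlier) that uses the status field to skip samples generated during a simulated blink:

    from psychopy.iohub.constants import EventConstants

    for sample in tracker.getEvents(event_type_id=EventConstants.MONOCULAR_EYE_SAMPLE):
        # status == 0: MouseGaze position is valid.
        # status == 2: position is missing (simulated blink).
        if sample.status == 0:
            print('valid sample at', sample.gaze_x, sample.gaze_y)
        else:
            print('sample during simulated blink')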

MouseGaze also creates basic fixation, saccade, and blink events based on mouse event data.

class psychopy.iohub.devices.eyetracker.FixationStartEvent(*args, **kwargs)[source]

A FixationStartEvent is generated when the beginning of an eye fixation (in very general terms, a period of relatively stable eye position) is detected by the eye tracker's sample parsing algorithms.

Event Type ID: EventConstants.FIXATION_START

Event Type String: ‘FIXATION_START’

time

time of event, in sec.msec format, using psychopy timebase.

eye

EyeTrackerConstants.RIGHT_EYE.

gaze_x

The horizontal ‘eye’ position on the computer screen at the start of the fixation. Units are same as Window.

gaze_y

The vertical eye position on the computer screen at the start of the fixation. Units are same as Window.

class psychopy.iohub.devices.eyetracker.FixationEndEvent(*args, **kwargs)[source]

A FixationEndEvent is generated when the end of an eye fixation (in very general terms, a period of relatively stable eye position) is detected by the eye tracker's sample parsing algorithms.

Event Type ID: EventConstants.FIXATION_END

Event Type String: ‘FIXATION_END’

time

time of event, in sec.msec format, using psychopy timebase.

eye

EyeTrackerConstants.RIGHT_EYE.

start_gaze_x

The horizontal ‘eye’ position on the computer screen at the start of the fixation. Units are same as Window.

start_gaze_y

The vertical ‘eye’ position on the computer screen at the start of the fixation. Units are same as Window.

end_gaze_x

The horizontal ‘eye’ position on the computer screen at the end of the fixation. Units are same as Window.

end_gaze_y

The vertical ‘eye’ position on the computer screen at the end of the fixation. Units are same as Window.

average_gaze_x

Average calibrated horizontal eye position during the fixation, specified in Display Units.

average_gaze_y

Average calibrated vertical eye position during the fixation, specified in Display Units.

duration

Duration of the fixation in sec.msec format.
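
A short sketch of watching for these parser events while recording (again assuming the tracker object and EventConstants import shown earlier):

    from psychopy.iohub.constants import EventConstants

    # Report fixation starts and ends as MouseGaze's simple parser emits them.
    for e in tracker.getEvents():
        if e.type == EventConstants.FIXATION_START:
            print('fixation started at', e.gaze_x, e.gaze_y)
        elif e.type == EventConstants.FIXATION_END:
            print('fixation ended, duration:', e.duration)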

Default Device Settings

eyetracker.hw.mouse.EyeTracker:
    #   True = Automatically start reporting events for this device when the experiment starts.
    #   False = Do not start reporting events for this device until enableEventReporting(True)
    #   is called for the device.
    auto_report_events: False

    # Should eye tracker events be saved to the ioHub DataStore file when the device
    # is recording data ?
    save_events: True

    # Should eye tracker events be sent to the Experiment process when the device
    # is recording data ?
    stream_events: True

    # How many eye events (including samples) should be saved in the ioHub event buffer before
    # old eye events start being replaced by new events. When the event buffer reaches
    # the maximum event length of the buffer defined here, older events will start to be dropped.
    event_buffer_length: 1024
    runtime_settings:
        # How many samples / second should MouseGaze generate.
        # 50 or 100 Hz are supported.
        sampling_rate: 50

        # MouseGaze always generates Monocular Right eye samples.
        track_eyes: RIGHT_EYE

    controls:
        # Mouse Button used to make a MouseGaze position change.
        # LEFT_BUTTON, MIDDLE_BUTTON, RIGHT_BUTTON.
        move: RIGHT_BUTTON

        # Mouse Button(s) used to make MouseGaze generate a blink event.
        # LEFT_BUTTON, MIDDLE_BUTTON, RIGHT_BUTTON.
        blink: [LEFT_BUTTON, RIGHT_BUTTON]

        # Threshold for saccade generation. Specified in visual degrees.
        saccade_threshold: 0.5

    # MouseGaze creates (minimally populated) fixation, saccade, and blink events.
    monitor_event_types: [MonocularEyeSampleEvent, FixationStartEvent, FixationEndEvent, SaccadeStartEvent, SaccadeEndEvent, BlinkStartEvent, BlinkEndEvent]
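
As a sketch of overriding some of these defaults from an experiment script (the keys are taken from the YAML above; the particular values are only assumptions for illustration):

    from psychopy.iohub import launchHubServer

    # Run MouseGaze at 100 Hz and move the gaze position with the middle
    # mouse button instead of the default right button.
    eyetracker_config = {
        'runtime_settings': {'sampling_rate': 100, 'track_eyes': 'RIGHT_EYE'},
        'controls': {'move': 'MIDDLE_BUTTON',
                     'blink': ['LEFT_BUTTON', 'RIGHT_BUTTON'],
                     'saccade_threshold': 0.5},
    }
    io = launchHubServer(**{'eyetracker.hw.mouse.EyeTracker': eyetracker_config})
    tracker = io.devices.tracker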

Last Updated: March 2021

