Gazepoint

Platforms:

  • Windows 7 / 10 only

Required Python Version:

  • Python 3.6+

Supported Models:

  • Gazepoint GP3

Additional Software Requirements

To use your Gazepoint GP3 during an experiment you must first start the Gazepoint Control software on the computer running PsychoPy®.

EyeTracker Class

class psychopy.iohub.devices.eyetracker.hw.gazepoint.gp3.EyeTracker[source]

To start iohub with a Gazepoint GP3 eye tracker device, add a GP3 device to the device dictionary passed to launchHubServer or the experiment’s iohub_config.yaml:

eyetracker.hw.gazepoint.gp3.EyeTracker

Note

The Gazepoint control application must be running while using this interface.

Examples

  1. Start ioHub with Gazepoint GP3 device and run tracker calibration:

    from psychopy.iohub import launchHubServer
    from psychopy.core import getTime, wait
    
    iohub_config = {'eyetracker.hw.gazepoint.gp3.EyeTracker':
        {'name': 'tracker', 'device_timer': {'interval': 0.005}}}
    
    io = launchHubServer(**iohub_config)
    
    # Get the eye tracker device.
    tracker = io.devices.tracker
    
    # run eyetracker calibration
    r = tracker.runSetupProcedure()
    
  2. Print all eye tracker events received for 2 seconds:

    # Check for and print any eye tracker events received...
    tracker.setRecordingState(True)
    
    stime = getTime()
    while getTime()-stime < 2.0:
        for e in tracker.getEvents():
            print(e)
    
  3. Print current eye position for 5 seconds:

    # Check for and print current eye position every 100 msec.
    stime = getTime()
    while getTime()-stime < 5.0:
        print(tracker.getPosition())
        wait(0.1)
    
    tracker.setRecordingState(False)
    
    # Stop the ioHub Server
    io.quit()
    
clearEvents(event_type=None, filter_id=None, call_proc_events=True)

Clears any DeviceEvents that have occurred since the last call to the device’s getEvents(), or clearEvents() methods.

Note that calling clearEvents() at the device level only clears the given device’s event buffer. The ioHub Process’s Global Event Buffer is unchanged.

Parameters:

None

Returns:

None

enableEventReporting(enabled=True)[source]

enableEventReporting is functionally identical to the eye tracker device-specific setRecordingState method.

getConfiguration()

Retrieve the configuration settings used to create the device instance. These are the default settings for the device, found in iohub.devices.<device_name>.default_<device_name>.yaml, updated with any device settings provided via launchHubServer(…).

Changing any values in the returned dictionary has no effect on the device state.

Parameters:

None

Returns:

The dictionary of the device configuration settings used to create the device.

Return type:

(dict)
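The "defaults updated with launchHubServer settings" behavior can be sketched in plain Python. This is a hypothetical illustration of the merge semantics, not the iohub implementation; note that because getConfiguration() hands back a copy, mutating the returned dictionary does not affect the device.

```python
import copy

def merged_config(defaults, overrides):
    """Sketch: start from the device's default settings and update them
    with any settings passed to launchHubServer(). A deep copy is
    returned, so mutating the result leaves the stored settings intact."""
    config = copy.deepcopy(defaults)
    config.update(copy.deepcopy(overrides))
    return config

defaults = {'name': 'tracker',
            'device_timer': {'interval': 0.005},
            'model_name': 'GP3'}
overrides = {'device_timer': {'interval': 0.002}}

config = merged_config(defaults, overrides)
config['model_name'] = 'changed'   # no effect on the stored defaults
```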

getEvents(*args, **kwargs)

Retrieve any DeviceEvents that have occurred since the last call to the device’s getEvents() or clearEvents() methods.

Note that calling getEvents() at a device level does not change the Global Event Buffer’s contents.

Parameters:
  • event_type_id (int) – If specified, provides the ioHub DeviceEvent ID for which events should be returned. Events that have occurred but do not match the specified ID are ignored. Event type IDs can be accessed via the EventConstants class; all available event types are class attributes of EventConstants.

  • clearEvents (int) – Indicates whether the events being returned should also be removed from the device event buffer. True (the default) removes the returned events; False results in events being left in the device event buffer.

  • asType (str) – Optional kwarg giving the object type to return events as. Valid values include 'namedtuple' (the default).

Returns:

New events that the ioHub has received since the last getEvents() or clearEvents() call to the device. Events are ordered by the ioHub time of each event, oldest event at index 0. The event object type is determined by the asType parameter passed to the method. By default a namedtuple object is returned for each event.

Return type:

(list)

getLastGazePosition()

The getLastGazePosition method returns the most recent eye gaze position received from the Eye Tracker. This is the position on the calibrated 2D surface that the eye tracker is reporting as the current eye position, given in the coordinate units in use by the ioHub Display device.

If binocular recording is being performed, the average position of both eyes is returned.

If no samples have been received from the eye tracker, or the eye tracker is not currently recording data, None is returned.

Parameters:

None

Returns:

If this method is not supported by the eye tracker interface, EyeTrackerConstants.EYETRACKER_INTERFACE_METHOD_NOT_SUPPORTED is returned.

None: If the eye tracker is not currently recording data or no eye samples have been received.

tuple: Latest (gaze_x, gaze_y) position of the eye(s).

Return type:

int, None, or tuple
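The binocular averaging described above can be sketched as a small helper. This is an illustrative, hypothetical function (not the iohub implementation): it takes the latest (x, y) position for each eye, with None standing in for a missing eye, and returns the averaged position, a single eye's position, or None when no data is available.

```python
def average_gaze(left, right):
    """Sketch of binocular gaze averaging: each argument is the latest
    (x, y) position for one eye, or None if that eye has no data."""
    if left is None and right is None:
        return None                    # no samples received / not recording
    if left is None:
        return right                   # only the right eye has data
    if right is None:
        return left                    # only the left eye has data
    # Both eyes available: return the midpoint of the two positions.
    return ((left[0] + right[0]) / 2.0, (left[1] + right[1]) / 2.0)
```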

getLastSample()

The getLastSample method returns the most recent eye sample received from the Eye Tracker. The Eye Tracker must be in a recording state for a sample event to be returned, otherwise None is returned.

Parameters:

None

Returns:

If this method is not supported by the eye tracker interface, EyeTrackerConstants.FUNCTIONALITY_NOT_SUPPORTED is returned.

None: If the eye tracker is not currently recording data.

EyeSample: If the eye tracker is recording in a monocular tracking mode, the latest sample event of this event type is returned.

BinocularEyeSample: If the eye tracker is recording in a binocular tracking mode, the latest sample event of this event type is returned.

Return type:

int, None, EyeSample, or BinocularEyeSample

getPosition()

See getLastGazePosition().

isRecordingEnabled()[source]

isRecordingEnabled returns the recording state from the eye tracking device.

Returns:

True == the device is recording data; False == Recording is not occurring

Return type:

bool

runSetupProcedure(calibration_args={})[source]

Start the eye tracker calibration procedure.

setRecordingState(recording)[source]

setRecordingState is used to start or stop the recording of data from the eye tracking device.

Parameters:

recording (bool) – if True, the eye tracker will start recording available eye data and sending it to the experiment program if data streaming was enabled for the device. If recording == False, the eye tracker stops recording eye data and streaming it to the experiment.

If the eye tracker is already recording and setRecordingState(True) is called, the eye tracker simply continues recording and the method call is a no-op. Likewise if the system has already stopped recording and setRecordingState(False) is called again.

Returns:

bool: the current recording state of the eye tracking device

trackerSec()[source]

Same as the GP3 implementation of trackerTime().

trackerTime()[source]

Current eye tracker time in the eye tracker’s native time base. The GP3 system uses a sec.usec timebase based on the Windows QPC, so on a single computer setup iohub can directly read the current Gazepoint time. When running with a two computer setup, current Gazepoint time is assumed to equal current local time.

Returns:

current native eye tracker time in sec.msec format.

Return type:

float
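The relationship between tracker time and local (psychopy) time can be sketched with a one-time offset measurement. This is a hypothetical illustration of the timebase reasoning above, not iohub code: on a single-computer setup both clocks share the Windows QPC base, so a constant offset relates the two.

```python
def tracker_to_local_time(tracker_time, offset):
    """Sketch: convert a native tracker timestamp (sec) to local time
    by applying an offset measured once, e.g.
    offset = getTime() - tracker.trackerTime()."""
    return tracker_time + offset

# Hypothetical values: offset measured once at startup.
offset = 120.25
local_t = tracker_to_local_time(3.5, offset)
```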

Supported Event Types

The Gazepoint GP3 provides real-time access to binocular sample data. iohub creates a BinocularEyeSampleEvent for each sample received from the GP3.

The following fields of the BinocularEyeSample event are supported:

class psychopy.iohub.devices.eyetracker.BinocularEyeSampleEvent(*args, **kwargs)[source]

The BinocularEyeSampleEvent event represents the eye position and eye attribute data collected from one frame or reading of an eye tracker device that is recording both eyes of a participant.

Event Type ID: EventConstants.BINOCULAR_EYE_SAMPLE

Event Type String: ‘BINOCULAR_EYE_SAMPLE’

time

time of event, in sec.msec format, using psychopy timebase.

left_gaze_x

The horizontal position of the left eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data. Uses Gazepoint LPOGX field.

left_gaze_y

The vertical position of the left eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data. Uses Gazepoint LPOGY field.

left_raw_x

The uncalibrated x position of the left eye in a device specific coordinate space. Uses Gazepoint LPCX field.

left_raw_y

The uncalibrated y position of the left eye in a device specific coordinate space. Uses Gazepoint LPCY field.

left_pupil_measure_1

Left eye pupil diameter (in camera pixels). Uses Gazepoint LPD field.

right_gaze_x

The horizontal position of the right eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data. Uses Gazepoint RPOGX field.

right_gaze_y

The vertical position of the right eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data. Uses Gazepoint RPOGY field.

right_raw_x

The uncalibrated x position of the right eye in a device specific coordinate space. Uses Gazepoint RPCX field.

right_raw_y

The uncalibrated y position of the right eye in a device specific coordinate space. Uses Gazepoint RPCY field.

right_pupil_measure_1

Right eye pupil diameter (in camera pixels). Uses Gazepoint RPD field.

status

Indicates if eye sample contains ‘valid’ data for left and right eyes. 0 = Eye sample is OK. 2 = Right eye data is likely invalid. 20 = Left eye data is likely invalid. 22 = Eye sample is likely invalid.
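The status codes listed above can be decoded into per-eye validity flags. This is a hypothetical helper for illustration (not part of the iohub API), directly encoding the 0 / 2 / 20 / 22 values from the field description:

```python
def decode_sample_status(status):
    """Sketch: map a BinocularEyeSample status code to
    (left_eye_ok, right_eye_ok) validity flags."""
    left_ok = status not in (20, 22)    # 20 or 22: left eye likely invalid
    right_ok = status not in (2, 22)    # 2 or 22: right eye likely invalid
    return left_ok, right_ok
```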

iohub also creates basic start and end fixation events using the Gazepoint FPOG* fields. Duplicate fixation events, identical except for the eye field, are created for the left and right eyes.

class psychopy.iohub.devices.eyetracker.FixationStartEvent(*args, **kwargs)[source]

A FixationStartEvent is generated when the beginning of an eye fixation (in very general terms, a period of relatively stable eye position) is detected by the eye tracker’s sample parsing algorithms.

Event Type ID: EventConstants.FIXATION_START

Event Type String: ‘FIXATION_START’

time

time of event, in sec.msec format, using psychopy timebase.

eye

Eye that generated the event. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

gaze_x

The calibrated horizontal eye position on the computer screen at the start of the fixation. Units are same as Display. Calibration must be done prior to reading (meaningful) gaze data. Uses Gazepoint FPOGX field.

gaze_y

The calibrated vertical eye position on the computer screen at the start of the fixation. Units are same as Display. Calibration must be done prior to reading (meaningful) gaze data. Uses Gazepoint FPOGY field.

class psychopy.iohub.devices.eyetracker.FixationEndEvent(*args, **kwargs)[source]

A FixationEndEvent is generated when the end of an eye fixation (in very general terms, a period of relatively stable eye position) is detected by the eye tracker’s sample parsing algorithms.

Event Type ID: EventConstants.FIXATION_END

Event Type String: ‘FIXATION_END’

time

time of event, in sec.msec format, using psychopy timebase.

eye

Eye that generated the event. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

average_gaze_x

Average calibrated horizontal eye position during the fixation, specified in Display Units. Uses Gazepoint FPOGX field.

average_gaze_y

Average calibrated vertical eye position during the fixation, specified in Display Units. Uses Gazepoint FPOGY field.

duration

Duration of the fixation in sec.msec format. Uses Gazepoint FPOGD field.
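Because the interface emits duplicate fixation events for both eyes, a typical analysis keeps one eye and pairs each start with the following end. The sketch below assumes a simplified, hypothetical event shape (dicts with 'type', 'eye', and 'time' keys) rather than the actual iohub event objects:

```python
def fixations_from_events(events):
    """Sketch: pair FIXATION_START / FIXATION_END events for a single eye
    into (start_time, end_time, duration) tuples, dropping the duplicate
    right-eye copies the GP3 interface produces."""
    fixations, start_time = [], None
    for ev in events:
        if ev['eye'] != 'LEFT_EYE':         # keep one eye only
            continue
        if ev['type'] == 'FIXATION_START':
            start_time = ev['time']
        elif ev['type'] == 'FIXATION_END' and start_time is not None:
            fixations.append((start_time, ev['time'], ev['time'] - start_time))
            start_time = None
    return fixations
```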

Default Device Settings

eyetracker.hw.gazepoint.gp3.EyeTracker:
    # Indicates if the device should actually be loaded at experiment runtime.
    enable: True

    # The variable name of the device that will be used to access the ioHub Device class
    # during experiment run-time, via the devices.[name] attribute of the ioHub
    # connection or experiment runtime class.
    name: tracker

    # Should eye tracker events be saved to the ioHub DataStore file when the device
    # is recording data ?
    save_events: True

    # Should eye tracker events be sent to the Experiment process when the device
    # is recording data ?
    stream_events: True

    # How many eye events (including samples) should be saved in the ioHub event buffer before
    # old eye events start being replaced by new events. When the event buffer reaches
    # the maximum event length of the buffer defined here, older events will start to be dropped.
    event_buffer_length: 1024

    # The GP3 implementation of the common eye tracker interface supports the
    # BinocularEyeSampleEvent event type.
    monitor_event_types: [ BinocularEyeSampleEvent, FixationStartEvent, FixationEndEvent]

    device_timer:
        interval: 0.005

    calibration:        
        # target_duration is the number of sec.msec that a calibration point should
        # be displayed before moving onto the next point.
        # (Sets the GP3 CALIBRATE_TIMEOUT)
        target_duration: 1.25
        # target_delay specifies the target animation duration in sec.msec.
        # (Sets the GP3 CALIBRATE_DELAY)
        target_delay: 0.5
        
    # The model name of the device.
    model_name: GP3

    # The serial number of the GP3 device.
    serial_number:

    # manufacturer_name is used to store the name of the maker of the eye tracking
    # device. This is for informational purposes only.
    manufacturer_name: GazePoint

Last Updated: January, 2021

