Pupil Core is a wearable eye tracker. The system consists of two inward-facing eye cameras and one forward-facing world camera mounted on a wearable eyeglasses-like frame.
Pupil Core provides gaze data in its world camera’s field of view, regardless of the wearer’s head position. As such, gaze can be analysed with the wearer looking and moving freely in their environment.
Pupil Core differs from remote eye trackers often used with PsychoPy®. Remote eye trackers employ cameras mounted on or near a computer monitor. They provide gaze in screen-based coordinates, and this facilitates closed-loop analyses of gaze based on the known position of stimuli on-screen and eye gaze direction.
In order to use Pupil Core for screen-based work in PsychoPy®, the screen will need to be robustly located within the world camera’s field of view, and Pupil Core’s gaze data subsequently transformed from world camera-based coordinates to screen-based coordinates. This is achieved with the use of AprilTag Markers.
For a detailed overview of wearable vs remote eye trackers, check out this Pupil Labs blog post.
Join the Pupil Labs Discord community to share your research and/or questions.
Pupil Capture v2.0 or newer
Platforms:
Windows 10
macOS 10.14 or newer
Ubuntu 16.04 or newer
Supported Models:
Pupil Core headset
Follow Pupil Core’s Getting Started guide to set up the headset and the Capture software
Open experiment settings in the Builder Window (cog icon in top panel)
Open the Eyetracking tab
Modify the properties as follows:
Select Pupil Labs from the Eyetracker Device drop down menu
Pupil Remote Address / Port - Defines how to connect to Pupil Capture. See Pupil Capture’s Network API menu to check that the address and port are correct. PsychoPy® will wait the number of milliseconds declared in Pupil Remote Timeout (ms) for the connection to be established. An error will be raised if the timeout is reached.
Pupil Capture Recording - Enable this option to tell Pupil Capture to record the eye tracker’s raw data during the experiment. You can read more about that in Pupil Capture’s official documentation. Leave Pupil Capture Recording Location empty to record to the default location.
Gaze Confidence Threshold - Set the minimum data quality received from Pupil Capture. Ranges from 0.0 (all data) to 1.0 (highest possible quality). We recommend using the default value of 0.6.
Pupillometry Only - If this mode is selected you will only receive pupillometry data. No further setup is required. If you are interested in gaze data, keep this option disabled and read on below.
To receive gaze, enable Pupil Capture’s Surface Tracking plugin:
Start by printing four AprilTag markers and attaching them at the screen corners. Avoid occluding the screen and leave sufficient white space around the marker squares. Read more about the general marker setup here.
Define a surface and align its surface corners with the screen corners as closely as possible
Rename the surface to the name set in the Surface Name field of the eye tracking project settings (default: psychopy_iohub_surface)
Run the PsychoPy® calibration component as part of your experiment
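If you prefer to work from a Coder script, the same settings can be passed to ioHub directly. The following is a minimal sketch, assuming the default address, port, and surface name shown in the configuration listed at the end of this page; adapt these values to your own setup:

# Minimal Coder-style sketch; the settings mirror the default configuration
# listed at the end of this page.
from psychopy.iohub import launchHubServer

iohub_config = {
    'eyetracker.hw.pupil_labs.pupil_core.EyeTracker': {
        'name': 'tracker',
        'runtime_settings': {
            'pupil_remote': {
                'ip_address': '127.0.0.1',
                'port': 50020,
                'timeout_ms': 1000,
            },
            'pupil_capture_recording': {'enabled': True},
            'pupillometry_only': False,
            'confidence_threshold': 0.6,
            'surface_name': 'psychopy_iohub_surface',
        },
    },
}

io = launchHubServer(**iohub_config)
tracker = io.devices.tracker            # 'tracker' matches the 'name' setting above

tracker.runSetupProcedure()             # run Pupil Capture's calibration choreography
tracker.setRecordingState(True)         # start streaming (and raw recording, if enabled)
# ... run trials, e.g. poll tracker.getLastGazePosition() every frame ...
tracker.setRecordingState(False)
io.quit()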
Bases: EyeTrackerDevice
Implementation of the Common Eye Tracker Interface for the Pupil Core headset.
Uses ioHub’s polling method to process data from Pupil Capture’s Network API.
To synchronize time between Pupil Capture and PsychoPy, the integration estimates the offset between their clocks and applies it to the incoming data. This step effectively transforms time between the two applications while taking the transmission delay into account. For details, see this real-time time-sync tutorial.
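The tutorial linked above estimates the offset by querying Pupil Remote’s clock and assuming the reply arrives halfway through the round trip. The snippet below is a rough standalone illustration of that idea, not the integration’s own code, and assumes Pupil Remote is reachable at its default address and port:

# Rough standalone illustration of the clock-offset estimate (not the
# integration's own code). Assumes Pupil Remote at tcp://127.0.0.1:50020.
import time
import zmq

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

t_before = time.monotonic()
pupil_remote.send_string("t")               # ask Pupil Capture for its current time
t_pupil = float(pupil_remote.recv_string())
t_after = time.monotonic()

# Assume the reply was generated halfway through the round trip; this is how
# the transmission delay is taken into account. local_time ~ pupil_time + offset
offset = (t_before + t_after) / 2.0 - t_pupil
print(f"round trip: {(t_after - t_before) * 1000:.1f} ms, offset: {offset:.4f} s")

In the actual integration the offset is expressed against the PsychoPy/ioHub clock rather than time.monotonic(); the estimation principle is the same.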
This class operates in two modes, depending on the pupillometry_only runtime setting:
If the pupillometry_only setting is set to True, the integration will only receive eye-camera-based metrics, e.g. pupil size, its location in eye camera coordinates, etc. The advantage of this mode is that it does not require calibrating the eye tracker or setting up AprilTag markers for the AoI tracking. To receive gaze data in PsychoPy screen coordinates, see the Pupillometry+Gaze mode below.
Internally, this is implemented by subscribing to the pupil. data topic.
If the Pupillometry only setting is set to False, the integration will receive positional data in addition to the pupillometry data mentioned above. For this to work, one has to set up Pupil Capture’s built-in AoI tracking system and perform a calibration for each subject. The integration takes care of translating the spatial coordinates to PsychoPy display coordinates.
Internally, this mode is implemented by subscribing to the gaze.3d. and the corresponding surface name data topics.
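For orientation, these topics can also be inspected directly over Pupil Capture’s Network API (ZeroMQ sockets carrying msgpack-encoded payloads). The sketch below is independent of the PsychoPy integration and assumes the default Pupil Remote port and the default surface name; the surface topic prefix shown here is an assumption based on Pupil Capture’s surface tracker:

# Standalone sketch: subscribe to the data topics named above (the PsychoPy
# integration does this internally). Requires pyzmq and msgpack.
import zmq
import msgpack

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")   # Pupil Remote default address/port
pupil_remote.send_string("SUB_PORT")            # ask where data is published
sub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
# Pupillometry-only mode needs "pupil." only; Pupillometry+Gaze additionally uses
# "gaze.3d." and the surface topic (assumed here to be "surfaces.<surface name>").
for topic in ("pupil.", "gaze.3d.", "surfaces.psychopy_iohub_surface"):
    subscriber.setsockopt_string(zmq.SUBSCRIBE, topic)

topic, payload = subscriber.recv_multipart()
print(topic.decode(), msgpack.unpackb(payload))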
Note
Only one instance of EyeTracker can be created within an experiment. Attempting to create > 1 instance will raise an exception.
The getLastGazePosition method returns the most recent eye gaze position received from the Eye Tracker. This is the position on the calibrated 2D surface that the eye tracker is reporting as the current eye position. The units are in the units in use by the ioHub Display device.
If binocular recording is being performed, the average position of both eyes is returned.
If no samples have been received from the eye tracker, or the eye tracker is not currently recording data, None is returned.
Returns: Latest (gaze_x, gaze_y) position of the eye(s), or None if the eye tracker is not currently recording data or no eye samples have been received.
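A typical per-frame usage pattern, assuming tracker is the ioHub device from the earlier sketch and gaze_dot is any PsychoPy stimulus you want to place at the gaze point:

# Only use the position when a valid sample is available; getLastGazePosition()
# returns None otherwise.
gaze_pos = tracker.getLastGazePosition()
if isinstance(gaze_pos, (tuple, list)):
    gaze_dot.pos = gaze_pos          # e.g. a psychopy.visual.Circle stimulus
    gaze_dot.draw()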
The getLastSample method returns the most recent eye sample received from the Eye Tracker. The Eye Tracker must be in a recording state for a sample event to be returned, otherwise None is returned.
Returns one of:
MonocularEyeSampleEvent – Gaze mapping result from a single pupil detection. Only emitted if a second eye camera is not being operated or the confidence of the pupil detection was insufficient for a binocular pair. See also this high-level overview of the Pupil Capture Data Matching algorithm.
BinocularEyeSampleEvent – Gaze mapping result from two combined pupil detections.
None – If the eye tracker is not currently recording data.
isConnected returns whether the ioHub EyeTracker Device is connected to Pupil Capture or not. A Pupil Core headset must be connected and working properly for any of the Common Eye Tracker Interface functionality to work.
Returns: bool – True if the eye tracking hardware is connected, False otherwise.
The isRecordingEnabled method indicates if the eye tracker device is currently recording data.
Returns: True if the device is recording data; False if recording is not occurring.
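For example, a simple guard before starting data collection, assuming tracker is the ioHub device from the earlier sketch:

# Start streaming only when connected and not already recording.
if tracker.isConnected() and not tracker.isRecordingEnabled():
    tracker.setRecordingState(True)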
The runSetupProcedure method starts the Pupil Capture calibration choreography.
Note
This is a blocking call for the PsychoPy Process and will not return to the experiment script until the calibration procedure either succeeds, is aborted, or fails.
calibration_args – This argument will be ignored and has only been added for the purpose of compatibility with the Common Eye Tracker Interface
Returns one of:
EyeTrackerConstants.EYETRACKER_OK if the calibration was successful
EyeTrackerConstants.EYETRACKER_SETUP_ABORTED if the choreography was aborted by the user
EyeTrackerConstants.EYETRACKER_CALIBRATION_ERROR if the calibration failed; check the logs for details
EyeTrackerConstants.EYETRACKER_ERROR if any other error occurred; check the logs for details
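A sketch of checking the outcome before continuing, assuming tracker and io are the objects from the earlier sketch:

# Abort the experiment if the calibration choreography did not succeed.
from psychopy import core
from psychopy.iohub.constants import EyeTrackerConstants

result = tracker.runSetupProcedure()
if result != EyeTrackerConstants.EYETRACKER_OK:
    io.quit()
    core.quit()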
setConnectionState either enables (setConnectionState(True)) or disables (setConnectionState(False)) active communication between the ioHub and Pupil Capture.
Note
A connection to the Eye Tracker is automatically established when the ioHub Process is initialized (based on the device settings in the iohub_config.yaml), so there is no need to explicitly call this method in the experiment script.
Note
Connecting an Eye Tracker to the ioHub does not necessarily collect and send eye sample data to the ioHub Process. To start actual data collection, use the Eye Tracker method setRecordingState(bool) or the ioHub Device method (device type independent) enableEventRecording(bool).
enable (bool) – True = enable the connection, False = disable the connection.
bool: indicates the current connection state to the eye tracking hardware.
The setRecordingState method is used to start or stop the recording and transmission of eye data from the eye tracking device to the ioHub Process.
If the pupil_capture_recording.enabled runtime setting is set to True, a corresponding raw recording within Pupil Capture will be started or stopped.
should_be_recording will also be passed to EyeTrackerDevice.enableEventReporting().
recording (bool) – if True, the eye tracker will start recording data; if False, it will stop recording data.
bool: the current recording state of the eye tracking device
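A typical trial-scoped usage, assuming tracker and io are the objects from the earlier sketch; when pupil_capture_recording.enabled is True, this also starts and stops the corresponding raw recording in Pupil Capture:

# Record only while a trial is running; clear stale events at trial onset.
io.clearEvents()
tracker.setRecordingState(True)
# ... present stimuli and collect responses ...
tracker.setRecordingState(False)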
Read-only Pupil Capture subscription topic to receive data from the configured surface
The Pupil Core–PsychoPy® integration provides real-time access to monocular and binocular sample data. In pupillometry-only mode, all events will be emitted as MonocularEyeSampleEvents.
In pupillometry+gaze mode, the software only emits BinocularEyeSampleEvents if Pupil Capture is driving a binocular headset and the detections from both eyes have sufficient confidence to be paired. See this high-level overview of the Pupil Capture Data Matching algorithm for details.
The supported fields are described below.
A MonocularEyeSampleEvent represents the eye position and eye attribute data collected from one frame or reading of an eye tracker device that is recording from only one eye, or is recording from both eyes and averaging the binocular data.
Event Type ID: EventConstants.MONOCULAR_EYE_SAMPLE
Event Type String: ‘MONOCULAR_EYE_SAMPLE’
time at which the sample was received in PsychoPy®, in sec.msec format, using PsychoPy clock
psychopy.iohub.constants.EyeTrackerConstants.RIGHT_EYE (22) or psychopy.iohub.constants.EyeTrackerConstants.LEFT_EYE (21)
x component of gaze location in display coordinates. Set to float("nan") in pupillometry-only mode.
y component of gaze location in display coordinates. Set to float("nan") in pupillometry-only mode.
z component of gaze location in display coordinates. Set to float("nan") in pupillometry-only mode. Set to 0.0 otherwise.
phi angle / horizontal rotation of the 3d eye model location in radians. -pi/2 corresponds to looking directly into the eye camera
theta angle / vertical rotation of the 3d eye model location in radians. pi/2 corresponds to looking directly into the eye camera
x component of the pupil center location in normalized coordinates
y component of the pupil center location in normalized coordinates
The BinocularEyeSampleEvent event represents the eye position and eye attribute data collected from one frame or reading of an eye tracker device that is recording both eyes of a participant.
Event Type ID: EventConstants.BINOCULAR_EYE_SAMPLE
Event Type String: ‘BINOCULAR_EYE_SAMPLE’
time at which the sample was received in PsychoPy, in sec.msec format, using PsychoPy clock
x component of gaze location in display coordinates. Set to float("nan") in pupillometry-only mode. Same as right_gaze_x.
y component of gaze location in display coordinates. Set to float("nan") in pupillometry-only mode. Same as right_gaze_y.
z component of gaze location in display coordinates. Set to float("nan") in pupillometry-only mode. Set to 0.0 otherwise. Same as right_gaze_z.
phi angle / horizontal rotation of the 3d eye model location in radians. -pi/2 corresponds to looking directly into the eye camera
theta angle / vertical rotation of the 3d eye model location in radians. pi/2 corresponds to looking directly into the eye camera
x component of the pupil center location in normalized coordinates
y component of the pupil center location in normalized coordinates
x component of gaze location in display coordinates. Set to float("nan") in pupillometry-only mode. Same as left_gaze_x.
y component of gaze location in display coordinates. Set to float("nan") in pupillometry-only mode. Same as left_gaze_y.
z component of gaze location in display coordinates. Set to float("nan") in pupillometry-only mode. Set to 0.0 otherwise. Same as left_gaze_z.
phi angle / horizontal rotation of the 3d eye model location in radians. -pi/2 corresponds to looking directly into the eye camera
theta angle / vertical rotation of the 3d eye model location in radians. pi/2 corresponds to looking directly into the eye camera
x component of the pupil center location in normalized coordinates
y component of the pupil center location in normalized coordinates
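A sketch of consuming buffered samples each frame through ioHub’s generic getEvents() device method, assuming tracker is the device from the earlier sketch; the gaze field names are the ones documented above (gaze_x/gaze_y for monocular samples, left_gaze_x/left_gaze_y for binocular samples):

# Dispatch on the event type and read the gaze fields documented above.
from psychopy.iohub.constants import EventConstants

for ev in tracker.getEvents():
    if ev.type == EventConstants.BINOCULAR_EYE_SAMPLE:
        x, y = ev.left_gaze_x, ev.left_gaze_y     # same values as right_gaze_x / right_gaze_y
    elif ev.type == EventConstants.MONOCULAR_EYE_SAMPLE:
        x, y = ev.gaze_x, ev.gaze_y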
eyetracker.hw.pupil_labs.pupil_core.EyeTracker:
    # Indicates if the device should actually be loaded at experiment runtime.
    enable: True
    # The variable name of the device that will be used to access the ioHub Device class
    # during experiment run-time, via the devices.[name] attribute of the ioHub
    # connection or experiment runtime class.
    name: tracker
    device_number: 0
    #####
    model_name: Pupil Core
    model_number: "0"
    serial_number: N/A
    manufacturer_name: Pupil Labs
    software_version: N/A
    hardware_version: N/A
    firmware_version: N/A
    #####
    monitor_event_types: [MonocularEyeSampleEvent, BinocularEyeSampleEvent]
    # Should eye tracker events be saved to the ioHub DataStore file when the device
    # is recording data ?
    save_events: True
    # Should eye tracker events be sent to the Experiment process when the device
    # is recording data ?
    stream_events: True
    # How many eye events (including samples) should be saved in the ioHub event buffer before
    # old eye events start being replaced by new events. When the event buffer reaches
    # the maximum event length of the buffer defined here, older events will start to be dropped.
    event_buffer_length: 1024
    # Do not change this value.
    auto_report_events: False
    device_timer:
        interval: 0.005
    #####
    runtime_settings:
        pupil_remote:
            ip_address: 127.0.0.1
            port: 50020
            timeout_ms: 1000
        pupil_capture_recording:
            enabled: True
            location: Null # Use Pupil Capture default recording location
        # Subscribe to pupil data only, does not require calibration or surface setup
        pupillometry_only: False
        confidence_threshold: 0.6
        # Only relevant if pupillometry_only is False
        surface_name: psychopy_iohub_surface
Last Updated: February, 2022