class EyeTracker(EyeTrackerDevice):
    """
    Implementation of the :py:class:`Common Eye Tracker Interface `
    for the Pupil Core headset.

    Uses ioHub's polling method to process data from
    `Pupil Capture's Network API `_.

    To synchronize time between Pupil Capture and PsychoPy, the integration
    estimates the offset between their clocks and applies it to the incoming
    data. This step effectively transforms time between the two programs while
    taking the transmission delay into account. For details, see this
    `real-time time-sync tutorial `_.

    This class operates in two modes, depending on the ``pupillometry_only``
    runtime setting:

    #. Pupillometry-only mode

       If the ``pupillometry_only`` setting is set to ``True``, the
       integration will only receive eye-camera based metrics, e.g. pupil
       size, its location in eye camera coordinates, etc. The advantage of
       this mode is that it does not require calibrating the eye tracker or
       setting up AprilTag markers for the AoI tracking. Internally, this is
       implemented by subscribing to the ``pupil.`` data topic. To receive
       gaze data in PsychoPy screen coordinates, see the Pupillometry+Gaze
       mode below.

    #. Pupillometry+Gaze mode

       If the ``pupillometry_only`` setting is set to ``False``, the
       integration will receive positional data in addition to the
       pupillometry data mentioned above. For this to work, one has to set up
       Pupil Capture's built-in AoI tracking system and perform a calibration
       for each subject. The integration takes care of translating the
       spatial coordinates to PsychoPy display coordinates. Internally, this
       mode is implemented by subscribing to the ``gaze.3d.`` and the
       corresponding surface name data topics.

    .. note::

        Only **one** instance of EyeTracker can be created within an
        experiment. Attempting to create > 1 instance will raise an
        exception.
    """
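The clock-offset estimation described in the docstring can be sketched as follows. This is a minimal illustration of the general technique (round-trip midpoint to compensate for transmission delay), not the integration's actual implementation; `remote_clock` is a hypothetical stand-in for a time request sent to Pupil Capture.

```python
import time

def estimate_clock_offset(remote_clock, local_clock=time.perf_counter, n_samples=10):
    """Estimate the offset between a remote clock and the local clock.

    `remote_clock` is a callable returning the remote time (here a
    hypothetical stand-in for querying Pupil Capture). The remote reading
    is assumed to correspond to the midpoint of the round trip, which
    compensates for the transmission delay.
    """
    offsets = []
    for _ in range(n_samples):
        t_sent = local_clock()
        t_remote = remote_clock()
        t_received = local_clock()
        t_midpoint = (t_sent + t_received) / 2.0
        offsets.append(t_remote - t_midpoint)
    # Average over several samples to smooth out per-request jitter.
    return sum(offsets) / len(offsets)

def remote_to_local(t_remote, offset):
    """Translate a remote timestamp into the local clock's time base."""
    return t_remote - offset
```

Applying `remote_to_local` to each incoming sample's timestamp is what "transforms time between the two programs" amounts to.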
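The two modes differ only in which Network API data topics are subscribed to. A small sketch of that decision, assuming the surface data stream is published under a ``surfaces.<name>`` topic prefix (the surface name itself, ``"psychopy_surface"`` below, is a hypothetical example of whatever was configured in Pupil Capture's surface tracker):

```python
def subscription_topics(pupillometry_only, surface_name="psychopy_surface"):
    """Return the Network API topic prefixes to subscribe to.

    Sketch only: topic prefixes follow the docstring's description
    (``pupil.`` for eye-camera metrics, ``gaze.3d.`` plus the surface
    stream for mapped gaze).
    """
    if pupillometry_only:
        # Eye-camera based metrics only: pupil size, location, etc.
        return ["pupil."]
    # Gaze mode additionally needs mapped gaze and the AoI surface stream.
    return ["pupil.", "gaze.3d.", "surfaces." + surface_name]
```

In practice these prefixes would be passed to a ZeroMQ SUB socket connected to Pupil Capture's subscription port.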