PlumX Metrics

Improving cognitive-state analysis from eye gaze with synthetic eye-movement data

Computers & Graphics, ISSN: 0097-8493, Vol: 119, Page: 103901
2024
  • Citations: 6
  • Usage: 0
  • Captures: 17
  • Mentions: 1
  • Social Media: 0

Metrics Details

  • Citations: 6
    • Citation Indexes: 6
  • Captures: 17
  • Mentions: 1
    • News Mentions: 1
      • News: 1

Most Recent News

Reports Outline Networks Study Results from University of Potsdam (Improving Cognitive-state Analysis From Eye Gaze With Synthetic Eye-movement Data)

2024 MAY 23 (NewsRx) -- By a News Reporter-Staff News Editor at Network Daily News -- Researchers detail new data in Networks. According to news…

Article Description

Eye movements can be used to analyze a viewer’s cognitive capacities or mental state. Neural networks that process the raw eye-tracking signal can outperform methods that operate on scan paths preprocessed into fixations and saccades. However, the scarcity of such data poses a major challenge. We therefore develop SP-EyeGAN, a neural network that generates synthetic raw eye-tracking data. SP-EyeGAN consists of Generative Adversarial Networks; it produces a sequence of gaze angles indistinguishable from human ocular micro- and macro-movements. We explore the use of these synthetic eye movements for pre-training neural networks with contrastive learning. We find that pre-training on synthetic data does not help for biometric identification, while results are inconclusive for ADHD detection and gender classification. However, for the eye-movement-based assessment of higher-level cognitive skills such as general reading comprehension, text comprehension, and the distinction of native from non-native readers, pre-training on synthetic eye-gaze data improves the models’ performance and even advances the state of the art for reading comprehension. The SP-EyeGAN model, pre-trained on GazeBase, along with the code for developing your own raw eye-tracking machine learning model with contrastive learning, is available at https://github.com/aeye-lab/sp-eyegan.
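To make the pre-training idea concrete, the sketch below shows one way a sequence encoder could be pre-trained with a contrastive objective on synthetic gaze-angle sequences such as those SP-EyeGAN generates. It is a minimal illustration only: the PyTorch framework, the encoder architecture, the NT-Xent loss, the jitter augmentation, and all names and shapes are assumptions made here for clarity, not the authors' implementation; the actual code is in the linked repository.

```python
# Illustrative sketch (not the authors' code): contrastive pre-training of a
# small 1D-convolutional gaze encoder on synthetic gaze-angle sequences.
# All architecture choices, names, and augmentations are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GazeEncoder(nn.Module):
    """Encodes a (batch, 2, T) sequence of yaw/pitch gaze angles into an embedding."""
    def __init__(self, embed_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(2, 32, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=9, padding=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.proj = nn.Linear(64, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.net(x).squeeze(-1)                 # (batch, 64)
        return F.normalize(self.proj(h), dim=-1)    # unit-norm embeddings

def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """NT-Xent contrastive loss between two augmented views of the same batch."""
    z = torch.cat([z1, z2], dim=0)                  # (2N, D)
    sim = z @ z.t() / temperature                   # pairwise cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float("-inf"))           # exclude self-similarity
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)            # positives are the paired views

def jitter(x: torch.Tensor, sigma: float = 0.01) -> torch.Tensor:
    """Simple augmentation: additive Gaussian noise on the gaze angles."""
    return x + sigma * torch.randn_like(x)

# One pre-training step on a batch of synthetic gaze sequences (stand-in tensor
# here; in practice these would come from a generator such as SP-EyeGAN).
encoder = GazeEncoder()
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-4)
synthetic_batch = torch.randn(16, 2, 1000)          # (batch, yaw/pitch, time steps)
loss = nt_xent_loss(encoder(jitter(synthetic_batch)), encoder(jitter(synthetic_batch)))
optimizer.zero_grad(); loss.backward(); optimizer.step()
```

After such pre-training, the encoder would typically be fine-tuned on the (scarce) labeled real eye-tracking data for the downstream task, e.g. reading-comprehension assessment.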
