Eye movement pattern recognition based on fused features and optimized random forest
DOI:
Author:
Affiliation:

1. College of Electrical Engineering, Sichuan University, Chengdu 610065, China; 2. Key Laboratory of Information and Automation Technology of Sichuan Province, Chengdu 610065, China

CLC Number:

TP181; R318

Fund Project:

    Abstract:

    To fully exploit eye movement pattern information, maximize model performance, and improve eye movement pattern recognition accuracy, this paper proposes an eye movement pattern recognition method based on fused features and an optimized random forest. First, three groups of feature parameters are extracted: conventional eye movement features, eye movement sequence sub-pattern features, and Gaussian distribution features of gaze points. ReliefF is then applied to select the important features and build a fused feature matrix. Next, particle swarm optimization is used to search globally for the random forest model parameters, yielding an optimized random forest recognition model. The effectiveness of the proposed method is verified on an open eye movement dataset of autistic patients. The experimental results show that the proposed method better distinguishes the eye movement patterns of normal subjects from those of autistic patients, improving classification accuracy by 9.57% over a random forest using only conventional eye movement features. The fused features therefore better exploit the information contained in eye movement patterns, and particle swarm optimization effectively improves the recognition model, providing a new approach to eye movement pattern recognition.
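    The global parameter search described in the abstract can be sketched with a minimal particle swarm optimizer. This is an illustrative sketch, not the paper's implementation: the `surrogate` objective, the parameter names (`n_estimators`, `max_features`), their ranges, and all swarm hyperparameters are assumptions standing in for the cross-validated random forest accuracy the method would actually maximize.

```python
import random

def pso(fitness, bounds, n_particles=20, n_iters=60, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (maximization).

    bounds: list of (lo, hi) pairs, one per parameter dimension.
    Returns the best position found and its fitness value.
    """
    rng = random.Random(seed)
    dim = len(bounds)
    # Random initial positions inside the bounds, zero initial velocities.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_f = [fitness(p) for p in pos]         # and its fitness
    g = max(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]    # swarm-wide best
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Standard velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move, clamping to the search bounds.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            f = fitness(pos[i])
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f > gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Hypothetical smooth surrogate for cross-validated accuracy over
# (n_estimators, max_features); in the paper's setting this would call
# a random forest cross-validation evaluation on the fused feature matrix.
def surrogate(x):
    n_trees, max_feats = x
    return -((n_trees - 300) / 100) ** 2 - (max_feats - 8) ** 2

best, score = pso(surrogate, bounds=[(10, 500), (1, 20)])
```

    In practice the surrogate would be replaced by an expensive model evaluation, which is why a gradient-free global search such as PSO is a natural fit for tuning random forest parameters.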

History
  • Received:
  • Revised:
  • Accepted:
  • Online: January 08,2024
  • Published: