
Scanpath modeling and classification with hidden Markov models

Overview of attention for article published in Behavior Research Methods, April 2017

About this Attention Score

  • Good Attention Score compared to outputs of the same age (70th percentile)
  • Good Attention Score compared to outputs of the same age and source (70th percentile)

Mentioned by

11 X users

Citations

91 Dimensions

Readers on

220 Mendeley
1 CiteULike
Title
Scanpath modeling and classification with hidden Markov models
Published in
Behavior Research Methods, April 2017
DOI 10.3758/s13428-017-0876-8
Authors

Antoine Coutrot, Janet H. Hsiao, Antoni B. Chan

Abstract

How people look at visual information reveals fundamental information about them: their interests and their states of mind. Previous studies showed that the scanpath, i.e., the sequence of eye movements made by an observer exploring a visual stimulus, can be used to infer observer-related (e.g., task at hand) and stimuli-related (e.g., image semantic category) information. However, eye movements are complex signals, and many of these studies rely on limited gaze descriptors and bespoke datasets. Here, we provide a turnkey method for scanpath modeling and classification. This method relies on variational hidden Markov models (HMMs) and discriminant analysis (DA). HMMs encapsulate the dynamic and individualistic dimensions of gaze behavior, allowing DA to capture systematic patterns diagnostic of a given class of observers and/or stimuli. We test our approach on two very different datasets. First, we use fixations recorded while viewing 800 static natural scene images, and infer an observer-related characteristic: the task at hand. We achieve an average correct classification rate of 55.9% (chance = 33%). We show that correct classification rates positively correlate with the number of salient regions present in the stimuli. Second, we use eye positions recorded while viewing 15 conversational videos, and infer a stimulus-related characteristic: the presence or absence of the original soundtrack. We achieve an average correct classification rate of 81.2% (chance = 50%). HMMs make it possible to integrate bottom-up, top-down, and oculomotor influences into a single model of gaze behavior. This synergistic approach between behavioral research and machine learning will open new avenues for the simple quantification of gaze behavior. We release SMAC with HMM, a Matlab toolbox freely available to the community under an open-source license agreement.
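The classification scheme the abstract describes, training one gaze model per class and assigning a new scanpath to the class whose model explains it best, can be sketched with a plain discrete HMM and the forward algorithm. The sketch below is illustrative only: the paper uses variational Gaussian-emission HMMs plus discriminant analysis and its Matlab toolbox, whereas here the two HMMs, their parameters, the class names ("free_viewing", "search"), and the three region-of-interest symbols are all invented for the example.

```python
import math

def _logsumexp(xs):
    """Numerically stable log(sum(exp(x))) over an iterable."""
    xs = list(xs)
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def forward_loglik(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm in log space."""
    n = len(start)
    alpha = [math.log(start[s]) + math.log(emit[s][obs[0]]) for s in range(n)]
    for o in obs[1:]:
        alpha = [
            math.log(emit[s][o])
            + _logsumexp(alpha[r] + math.log(trans[r][s]) for r in range(n))
            for s in range(n)
        ]
    return _logsumexp(alpha)

# Two hypothetical 2-state HMMs over 3 coarse regions of interest
# (symbol 0 = faces, 1 = background, 2 = text). In practice these
# parameters would be learned from labeled training scanpaths; the
# numbers here are made up for illustration.
hmm_free_viewing = dict(
    start=[0.6, 0.4],
    trans=[[0.7, 0.3], [0.4, 0.6]],
    emit=[[0.7, 0.2, 0.1], [0.2, 0.6, 0.2]],
)
hmm_search = dict(
    start=[0.3, 0.7],
    trans=[[0.5, 0.5], [0.2, 0.8]],
    emit=[[0.1, 0.3, 0.6], [0.3, 0.3, 0.4]],
)

def classify(obs):
    """Maximum-likelihood classification: pick the class whose HMM
    assigns the scanpath the highest log-likelihood."""
    scores = {
        "free_viewing": forward_loglik(obs, **hmm_free_viewing),
        "search": forward_loglik(obs, **hmm_search),
    }
    return max(scores, key=scores.get)

scanpath = [0, 0, 1, 0, 0]   # a fixation sequence dominated by faces
print(classify(scanpath))    # prints "free_viewing"
```

Replacing the per-class likelihood comparison with discriminant analysis on the fitted HMM parameters, as the paper does, additionally reveals which features of gaze dynamics drive the separation between classes.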

X Demographics

The data shown below were collected from the profiles of 11 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 220 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
Austria 1 <1%
Unknown 219 100%

Demographic breakdown

Readers by professional status Count As %
Student > Ph. D. Student 59 27%
Researcher 31 14%
Student > Master 28 13%
Student > Bachelor 27 12%
Student > Doctoral Student 13 6%
Other 30 14%
Unknown 32 15%
Readers by discipline Count As %
Psychology 48 22%
Computer Science 37 17%
Engineering 23 10%
Neuroscience 20 9%
Medicine and Dentistry 11 5%
Other 35 16%
Unknown 46 21%
Attention Score in Context

This research output has an Altmetric Attention Score of 6. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 12 August 2020.
All research outputs
#6,334,439
of 25,382,440 outputs
Outputs from Behavior Research Methods
#782
of 2,526 outputs
Outputs of similar age
#93,937
of 324,612 outputs
Outputs of similar age from Behavior Research Methods
#12
of 44 outputs
Altmetric has tracked 25,382,440 research outputs across all sources so far. This one has received more attention than most of these and is in the 74th percentile.
So far Altmetric has tracked 2,526 research outputs from this source. They typically receive more attention than average, with a mean Attention Score of 8.2. This one has received more attention than average, scoring higher than 69% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 324,612 tracked outputs that were published within six weeks on either side of this one in any source. This one has received more attention than average, scoring higher than 70% of its contemporaries.
We're also able to compare this research output to 44 others from the same source that were published within six weeks on either side of this one. This one has received more attention than average, scoring higher than 70% of its contemporaries.