Participants
Fifty-eight typically developing Chinese adults (31 males, 27 females) participated in this study. Ten participants were excluded from the analysis because of insufficient raw data collection (n = 7) or poor eye-movement data quality (n = 3). The final sample therefore comprised 48 participants (24 males and 24 females) aged 19–30 years. All participants were free from mental disorders and had normal or corrected-to-normal vision. The study was approved by the Ethics Committee of Tianjin Children's Hospital, and written informed consent was obtained from all participants.
Measure of Autistic Traits
Participants' autistic traits were assessed with the Chinese version of the Autism Spectrum Quotient (AQ). The AQ is a widely used self-report measure of autistic traits consisting of 50 items that cover five aspects of autism symptomatology: social skills, attention switching, attention to detail, communication, and imagination. The maximum total score is 50, with up to 10 points per aspect; higher AQ scores indicate stronger autism-like traits [18]. In our sample, total AQ scores ranged from 9 to 32, with two participants scoring at the cut-off of 32. Because there was insufficient evidence that these two participants had autism, they were retained.
Stimuli
Colored photographs of ten male and ten female actors posing neutral, happy, sad, and angry expressions were selected from the Karolinska Directed Emotional Faces (KDEF) database [24], each in three views: 0° (left or right), 45° (left or right), and 90°. All photographs were the same size (width: 17° of visual angle; height: 23°), and the critical facial features were aligned to the same locations. To enhance ecological validity, we designed two dynamic presentation sequences of the expression photographs that simulate real-life observation: order 1 presents the views from 0° to 90° and back to 0°, and order 2 from 90° to 0° and back to 90°. The experimental paradigm was implemented and presented in MATLAB with Psychtoolbox.
Procedure
Participants were seated in a quiet, softly lit room, approximately 60 cm from a 27-inch monitor (1920 × 1080 pixels), to complete an emotion-discrimination task. Eye-movement data were recorded with a Tobii Pro Spectrum eye tracker at a sampling rate of 1200 Hz.
After completing the AQ, participants received task instructions from the experimenter to ensure they fully understood the procedure, and their gaze was calibrated with the five-point calibration procedure in Tobii Pro Lab software. The task was then started by the experimenter pressing the space key.
The task contained 5 blocks of 16 trials each, for a total of 80 trials. In each trial, as shown in Fig. 1, a cartoon character was first presented for 3 s to draw participants' attention to the center of the screen. Five photographs of the same actor's facial expression were then presented in one of the two orders (order 1 or order 2), each lasting 2 s; because the successive photographs showed different views, the sequence appeared as a rotating face. Participants were required to identify the expression within the 10 s during which the five photographs were presented, responding at the moment of recognition by clicking the corresponding icon on the screen with the mouse. The computer recorded reaction time and accuracy without informing the participants. The next trial then began with five new photographs; there was no interval between the 16 trials of a block. Each actor's photographs of a given expression appeared only once across the 80 trials, and the two presentation orders and the four emotional conditions were balanced in number and presented in random order.
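As an illustration, the following MATLAB/Psychtoolbox sketch shows the timing of a single trial as described above. It is a minimal sketch, not our actual paradigm code: the image variables (imgs, cartoon.png) are hypothetical, timing is simplified via WaitSecs, and mouse-response collection is omitted.

    % Minimal single-trial sketch (assumes Psychtoolbox is installed and that
    % imgs is a 1x5 cell array holding the five photographs of one actor,
    % already arranged in the required view order; names are hypothetical).
    win = Screen('OpenWindow', max(Screen('Screens')), 128);

    % The two view sequences described above (five views per trial).
    order1 = [0 45 90 45 0];   % order 1: 0 deg -> 90 deg -> 0 deg
    order2 = [90 45 0 45 90];  % order 2: 90 deg -> 0 deg -> 90 deg

    % Attention cue: cartoon character at screen center for 3 s.
    cueTex = Screen('MakeTexture', win, imread('cartoon.png'));
    Screen('DrawTexture', win, cueTex);
    Screen('Flip', win);
    WaitSecs(3);

    % Five photographs of the same actor, 2 s each (10 s in total).
    for k = 1:5
        faceTex = Screen('MakeTexture', win, imgs{k});
        Screen('DrawTexture', win, faceTex);
        Screen('Flip', win);
        WaitSecs(2);
    end
    Screen('CloseAll');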
Data Analysis
Behavioral data were exported for offline analysis in MATLAB. The average accuracy rate and the average response time on correctly discriminated trials were calculated for each gender group.
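A minimal sketch of this summary step, assuming the behavioral data have been collected into a MATLAB table T with one row per trial and hypothetical columns subj, gender, correct, and rt:

    % Per-participant accuracy and mean RT on correct trials, grouped by gender
    % (T and its column names are hypothetical).
    accBySubj = groupsummary(T, {'subj', 'gender'}, 'mean', 'correct');
    rtBySubj  = groupsummary(T(T.correct == 1, :), {'subj', 'gender'}, 'mean', 'rt');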
Raw eye-movement data were extracted from Tobii Pro Lab software, and fixations were defined with the Tobii I-VT filter using the default preset. Trials with a total fixation duration of less than 4000 ms were excluded, and participants for whom invalid trials exceeded 20% of all trials were excluded. Three participants were excluded from further analyses on this criterion.
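The exclusion rule can be summarized with the following sketch, assuming fixDur is an nSubj-by-nTrials matrix of total fixation duration per trial in milliseconds (a hypothetical variable):

    invalid  = fixDur < 4000;        % trials with under 4000 ms of total fixation
    propBad  = mean(invalid, 2);     % proportion of invalid trials per participant
    keepSubj = propBad <= 0.20;      % participants above 20% invalid are excluded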
To visualize gender differences in gaze patterns, we created heat maps for each gender and computed difference maps between the groups. Following the approach of the iMAP4 toolbox [25], we wrote our own analysis code. Heat maps were computed as the ratio of fixation duration at each location in the stimulus picture to the total stimulus presentation duration (2 s), and each map was then spatially smoothed with a Gaussian kernel. Finally, the male group's heat maps were subtracted from the female group's to obtain the difference maps.
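The following MATLAB sketch illustrates this pipeline under stated assumptions: fixations is a hypothetical struct array with fields x, y (pixel coordinates), dur (duration in seconds), and isMale; the map size and the smoothing width sigma are placeholders, and per-participant averaging is omitted. It follows the description above rather than the iMAP4 code itself.

    imgH = 1080; imgW = 1920;            % assumed map size in pixels
    stimDur = 2;                         % each photograph is shown for 2 s
    mapM = zeros(imgH, imgW);
    mapF = zeros(imgH, imgW);
    for i = 1:numel(fixations)
        f = fixations(i);
        r = min(max(round(f.y), 1), imgH);
        c = min(max(round(f.x), 1), imgW);
        if f.isMale                      % accumulate duration ratios per location
            mapM(r, c) = mapM(r, c) + f.dur / stimDur;
        else
            mapF(r, c) = mapF(r, c) + f.dur / stimDur;
        end
    end
    sigma = 10;                          % Gaussian kernel width (assumed)
    mapM = imgaussfilt(mapM, sigma);     % spatial smoothing
    mapF = imgaussfilt(mapF, sigma);
    diffMap = mapF - mapM;               % female minus male difference map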
We defined an eye region of interest (ROI) for each face stimulus (0°: 9.96° × 4.23° of visual angle; 45°: 8.48° × 3.84°; 90°: 5.78° × 3.69°). The ROIs for left- and right-facing views were symmetrical. The proportional fixation duration within the eye ROI was then calculated.
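A sketch of this measure for one trial, assuming roi = [left top right bottom] in pixels for the current view and fixs is a hypothetical struct array of fixations with fields x, y, and dur:

    inROI = [fixs.x] >= roi(1) & [fixs.x] <= roi(3) & ...
            [fixs.y] >= roi(2) & [fixs.y] <= roi(4);
    propEye = sum([fixs(inROI).dur]) / sum([fixs.dur]);  % proportional fixation duration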
Participants' age, AQ scores, and behavioral performance were compared with two-sample t-tests for normally distributed data or Mann-Whitney U tests for non-normally distributed data. Because some eye-movement measures were not normally distributed, they were analyzed with generalized estimating equations (GEE). The significance level was set at 0.05, with Bonferroni adjustment for multiple comparisons.
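The test-selection logic can be sketched as follows for a single measure, assuming x and y hold that measure's values for the two gender groups and nComparisons is the number of tests entering the Bonferroni correction (hypothetical variables); the GEE modelling is not shown.

    alphaAdj = 0.05 / nComparisons;          % Bonferroni-adjusted threshold
    if ~lillietest(x) && ~lillietest(y)      % neither sample deviates from normality
        [~, p] = ttest2(x, y);               % two-sample t-test
    else
        p = ranksum(x, y);                   % Mann-Whitney U test
    end
    isSignificant = p < alphaAdj;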