Facial affect recognition is associated with neuropsychological status and psychiatric disease. We hypothesized that facial affect recognition is associated with psychological status and with the perception of one affect as another.
A total of 80 images depicting facial affect, comprising 20 Neutral, 20 Angry, 20 Fear, and 20 Sad expressions, were screened for use in our research. One hundred healthy individuals were asked to rate these images on a 10-point Likert scale and to complete psychological scales assessing emotional status and cognitive function.
Participants’ aggression, attention, and impulsivity may have been associated with their interpretation of the Angry facial expressions, and participants often rated the Angry expressions as Fear. Fear images were rated as Angry or Sad. In response to the Sad facial expressions, attention and impulsivity were associated with the facial expression ratings, and participants rated the Sad expressions as Angry or Fear.
The psychological status of the participants was significantly correlated with their interpretation of facial affects. In particular, attention was often correlated with incorrect affect ratings, and attention and impulsivity could affect the rating of the Sad facial expressions.
Studies on facial affect recognition have been instrumental in gaining insights into cognition and emotion, and in influencing the design of computational models and perceptual interfaces. Such studies have been conducted for several decades [
Deficits in facial affect recognition are thought to arise from individual emotional states such as depression, anxiety, and aggression [
In addition to emotional factors, difficulties in facial affect recognition are associated with cognitive impairments, including attention and impulsivity [
We hypothesized that facial affect recognition would be affected by participants’ emotional status, including depression, anxiety, and aggression, as well as by cognitive functions, including attention and impulsivity. Additionally, we hypothesized that one facial affect can be perceived as another, influenced by individual emotional and cognitive factors.
Effect size was determined using Cohen’s d [
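As a concrete illustration of this statistic (a minimal sketch; the function name `cohens_d` is ours, and the groups compared are those of the study design), Cohen’s d for two independent samples is the difference in means divided by the pooled standard deviation:

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d for two independent samples: the difference in means
    divided by the pooled (Bessel-corrected) standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5
```

By Cohen’s conventional benchmarks, absolute values of about 0.2, 0.5, and 0.8 correspond to small, medium, and large effects.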
A total of 103 participants were recruited based on the following criteria: 1) participants had to be at least 18 years of age, and 2) they could not have a history of psychiatric disorders such as schizophrenia, other psychotic disorders, intellectual disability, or other mental disorders, or of neurological disease. After screening with the Mini International Neuropsychiatric Interview (MINI) and an interview with a psychiatrist (DHH), three participants were excluded from the study: two because of major depressive disorder and one because of substance dependence. Therefore, data from a total of 100 participants were used in the analyses (
After screening for psychiatric comorbidities and completing surveys for psychological status, all participants were asked to rate facial affects in response to images depicting facial expressions, including Neutral, Angry, Fear, and Sad.
Psychiatric comorbidities were screened using the Korean version of the MINI. The MINI is a semi-structured diagnostic interview that is generally used to assess the presence of co-occurring mental disorders [
Before rating the facial expression images, all participants were asked to complete psychological surveys in order to assess the emotional status of depression, anxiety, and aggression, as well as cognitive functions of attention and impulsivity (
Depression was assessed using the Beck Depression Inventory II (BDI-II) [
Attention problems were assessed using the Korean version of the Adult Attention Deficit/Hyperactivity Disorder Self-Report Scale (K-ASRS). The total K-ASRS score ranges from 0 (best) to 72 (worst) [34]. The questions in the K-ASRS are divided into two sections: A (six questions) and B (12 questions). Four or more positive answers in Section A can indicate symptoms consistent with adult ADHD [
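The Section A screening rule described above can be sketched as follows (an illustrative sketch only; the function name and the boolean encoding of responses are ours, not part of the K-ASRS):

```python
def asrs_section_a_positive(answers):
    """Apply the ASRS Section A screening rule: `answers` is a list of six
    booleans (True = response in the positive/symptomatic range).
    Four or more positive answers suggest symptoms consistent with
    adult ADHD and warrant further assessment."""
    if len(answers) != 6:
        raise ValueError("Section A has exactly six questions")
    return sum(answers) >= 4
```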
Eighty images depicting facial expressions were screened for our study, comprising 20 Neutral (N), 20 Angry (A), 20 Fear (F), and 20 Sad (S) facial images. All images were randomly selected from the neutral, angry, fear, and sad categories of a set of 176 Korean facial expression images [
The presentation of the facial expression images consisted of 20 blocks, each containing one image from each of the four categories (N, A, F, and S) in a varying order; each of the 20 images in a category was thus assigned to exactly one block. The presentation orders across the blocks were as follows: N-A-F-S, N-A-S-F, N-F-A-S, N-S-A-F, N-F-S-A, N-S-F-A, A-F-S-N, A-S-F-N, A-N-S-F, A-N-F-S, A-S-N-F, A-F-N-S, F-N-A-S, F-N-S-A, F-A-N-S, F-A-S-N, F-S-A-N, S-A-F-N, A-F-S-N, and S-F-A-N. Each image (5×7 cm²) was shown to the participant for three seconds, and the participant then had three seconds to rate it; a total of 480 seconds was therefore required to rate all 80 images in the four categories.
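The block structure above can be sketched as follows (a minimal sketch under our assumptions: image indices are assigned to blocks at random, and `build_blocks` and the category labels are our own names, not from the study protocol):

```python
import random

CATEGORIES = ["N", "A", "F", "S"]  # Neutral, Angry, Fear, Sad

def build_blocks(orders, n_images=20, seed=0):
    """Assemble presentation blocks: block i shows one image per category,
    in the category order given by orders[i]. Each category's images are
    shuffled once, so every image appears in exactly one block."""
    rng = random.Random(seed)
    # One shuffled deck of image indices per category
    decks = {c: rng.sample(range(1, n_images + 1), n_images)
             for c in CATEGORIES}
    return [[f"{cat}{decks[cat][i]}" for cat in order]
            for i, order in enumerate(orders)]
```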
If a participant did not respond within three seconds, the trial timed out and was discarded from the analyses. Participants underwent response training for 10 minutes to reduce the percentage of discarded trials. Of the 8,000 trials (80 trials for each of the 100 participants), 38 (0.48%) were discarded as timed out.
Linear mixed-effects models were used to estimate the effects of participants’ psychological status on the rating scores, with 95% confidence intervals, after adjusting for participants’ sex. The rating of each facial expression category was then compared with that of the neutral face, which served as the reference. In addition, the rating scores for each image were fitted using the estimated coefficients of the linear mixed-effects models. All tests were two-sided, and differences were considered statistically significant at a significance level of 0.05. All statistical analyses were performed using the lmer function of the lme4 package in R software (version 3.6.3; R Foundation for Statistical Computing, Vienna, Austria).
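Under our reading of the Methods (a sketch, not the authors’ exact specification), the model fitted for each rated emotion can be written as:

```latex
\mathrm{Rating}_{ij} =
  \beta_0 + \beta_1\,\mathrm{Sex}_i
  + \beta_2\,\mathrm{KASRS}_i + \beta_3\,\mathrm{BIS}_i + \beta_4\,\mathrm{AQ}_i
  + \beta_5\,\mathrm{BDI}_i + \beta_6\,\mathrm{BAI}_i
  + \sum_{k \in \{\mathrm{A},\mathrm{F},\mathrm{S}\}} \gamma_k\,\mathrm{Emotion}_{k,ij}
  + u_i + \varepsilon_{ij}
```

where $i$ indexes participants and $j$ trials, $\mathrm{Emotion}_{k,ij}$ are indicator variables for the image category (with Neutral as the reference), $u_i$ is a participant-level random intercept, and $\varepsilon_{ij}$ is the residual error.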
The clinical characteristics and psychological state of the participants are presented in
In response to fearful facial expressions, the emotional status of aggression and cognitive function of attention were associated with participants’ ratings (
In response to the Sad facial expression, the cognitive functions of attention and impulsivity were associated with the participants’ facial affect ratings (
Among the neutral facial expression images, those depicting Neutral 11 had the lowest fitted rating scores, while Neutral 1 had the highest fitted rating scores in the Angry, Fear, and Sad ratings. Among the facial expression images depicting anger, the image depicting Angry 13 had the highest fitted rating score, and Angry 6 had the lowest fitted rating score in the Angry group. Among the images depicting fear, that of Fear 18 had the highest fitted rating score, and Fear 7 had the lowest fitted rating score in the Fear group. Among the images depicting sad facial expressions, Sad 2 had the highest fitted rating score, and Sad 11 had the lowest fitted rating score in the Sad group (
In the present study, participants’ depressive mood and anxiety were not linked to the rating of facial emotional expressions. This differs from the results of previous studies [
The cognitive function of the participants was significantly correlated with their interpretation of the facial affects. Attention, in particular, was correlated with affect ratings. Attention and aggression levels may have affected the ratings of fearful facial expressions in the present study. Attention and impulsivity may have affected ratings of sad facial expressions.
These results are in line with those of previous studies on the correlation between facial expression and attention [
Additionally, the emotional and motivational value of social signals derived from facial expressions may be associated with the attention system [
The core deficits of facial expression recognition in ADHD might be caused by a failure to correctly interpret affects due to inattention or impulsivity [
However, whether abnormal executive function in subjects with ADHD can cause deficits in emotional recognition remains controversial [
In the present study, participants were more likely to interpret facial expressions as emotions that they had previously felt. However, images depicting Fear could be rated as Angry or Sad, while pictures depicting a Sad facial expression could be rated as Angry or Fear. Even after controlling for emotional status and cognitive function, healthy individuals could misinterpret facial expressions as other emotions. Shioiro et al. [
The present study has several limitations. First, the small number of participants and unbalanced sex distribution are insufficient to generalize the results, although we considered them in the statistical analyses. Second, in the present study, we did not perform thorough standardized cognitive function tests to assess attention and intelligence. Future studies should include a larger number of participants, a more balanced sex distribution, and cognitive function tests.
In conclusion, our findings suggest that the interpretation of facial expressions can be affected by psychological status and that one affect can be misperceived as another. Researchers should consider these factors when planning facial expression studies.
The datasets generated or analyzed during the study are available from the corresponding author upon reasonable request.
The authors have no potential conflicts of interest to disclose.
Conceptualization: Doug Hyun Han, Young Don Son. Data curation: Sujin Bae. Formal analysis: Beom Seuk Hwang, Eunhee Rhee. Funding acquisition: Doug Hyun Han, Young Don Son. Investigation: Doug Hyun Han. Methodology: Doug Hyun Han, Sujin Bae. Project administration: Eunhee Rhee. Validation: Sujin Bae, Ji Hyun Bae. Writing—original draft: Doug Hyun Han, Young Don Son. Writing—review & editing: Doug Hyun Han, Sujin Bae.
This work was supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (NRF-2020R1A4A1019623).
We acknowledge the contributions of colleagues, institutions, and agencies that aided the efforts of the authors.
Diagram for research processing. MINI, Mini International Neuropsychiatric Interview; MDD, major depressive disorder; SUD, substance use disorder.
Demographic and psychological characteristics of the participants
Variable | Value |
---|---|
Sex | |
Male | 78 (78.0) |
Female | 22 (22.0) |
Age (yr) | 22.9±2.6 |
Education (yr) | 14.5±1.7 |
Adult Attention Deficit/Hyperactivity Disorder Self-Report Scale | 7.3±6.4 |
Barratt Impulsiveness Scale–11 | 63.5±7.2 |
The Buss–Perry Aggression Questionnaire | 51.9±13.3 |
Beck Depression Inventory II | 10.7±8.8 |
Beck Anxiety Inventory | 5.6±8.2 |
Values are presented as number (%) or mean±standard deviation
Effects of psychological status on the rating of facial emotions
Predictor | Estimate (Angry face) | CI (Angry face) | p-value (Angry face) | Estimate (Fear face) | CI (Fear face) | p-value (Fear face) | Estimate (Sad face) | CI (Sad face) | p-value (Sad face) |
---|---|---|---|---|---|---|---|---|---|
Intercept | -2.65 | -5.56 to -0.27 | 0.075 | -3.86 | -7.56 to -0.15 | 0.041 | -2.65 | -5.75 to 0.46 | 0.032 |
Sex | 0.30 | -0.45 to 1.04 | 0.434 | 0.32 | -0.64 to 1.27 | 0.516 | 0.56 | -0.23 to 1.36 | 0.165 |
K-ASRS | -0.05 | -0.10 to 0.01 | 0.079 | -0.10 | -0.17 to 0.03 | 0.006 | -0.07 | -0.13 to -0.01 | 0.023 |
BIS-11 | 0.04 | -0.00 to 0.09 | 0.060 | 0.04 | -0.00 to 0.11 | 0.051 | 0.05 | 0.00 to 0.09 | 0.045 |
AQ | 0.02 | -0.00 to 0.05 | 0.071 | 0.02 | 0.00 to 0.07 | 0.024 | 0.02 | -0.00 to 0.05 | 0.065 |
BDI-II | 0.01 | -0.04 to 0.06 | 0.602 | 0.01 | -0.06 to 0.07 | 0.787 | 0.04 | -0.02 to 0.09 | 0.202 |
BAI | 0.00 | -0.06 to 0.05 | 0.927 | 0.00 | -0.04 to 0.10 | 0.391 | 0.02 | -0.04 to 0.08 | 0.604 |
Angry emotion | 6.78 | 6.22 to 7.34 | <0.001 | 1.25 | 0.66 to 1.84 | <0.001 | -0.74 | -1.36 to -0.45 | 0.005 |
Fear emotion | 1.95 | 1.39 to 2.50 | <0.001 | 4.37 | 3.78 to 4.97 | <0.001 | 0.15 | -0.47 to 0.77 | 0.019 |
Sad emotion | 0.37 | -0.19 to 0.93 | 0.193 | 3.01 | 2.41 to 3.60 | <0.001 | 4.90 | 4.28 to 5.52 | <0.001 |
Linear mixed-effects model adjusted for sex.
* Statistically significant.
K-ASRS, the Korean version of the Adult Attention Deficit/Hyperactivity Disorder Self-Report Scale; BIS-11, Barratt Impulsiveness Scale–11; AQ, Buss–Perry Aggression Questionnaire; BDI-II, Beck Depression Inventory II; BAI, Beck Anxiety Inventory; CI, confidence interval
Fitted rating scores of the facial emotion expression pictures
For each picture, the three columns to its right give the fitted rating scores for the perceived emotions Angry, Fear, and Sad.

Neutral pictures | Angry | Fear | Sad | Angry pictures | Angry | Fear | Sad | Fearful pictures | Angry | Fear | Sad | Sad pictures | Angry | Fear | Sad |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Neutral 11 | 0.51 | 0.66 | 1.08 | Angry 13 | 8.74 | 2.33 | 1.33 | Fear 18 | 1.70 | 6.20 | 2.58 | Sad 2 | 0.78 | 4.22 | 7.49 |
Neutral 9 | 0.61 | 1.23 | 2.52 | Angry 14 | 8.59 | 2.65 | 1.23 | Fear 20 | 2.00 | 6.86 | 3.03 | Sad 4 | 0.94 | 4.38 | 6.38 |
Neutral 18 | 0.64 | 1.56 | 2.68 | Angry 1 | 8.59 | 2.96 | 1.62 | Fear 17 | 2.00 | 6.20 | 2.38 | Sad 7 | 1.00 | 2.99 | 5.47 |
Neutral 3 | 0.79 | 0.91 | 1.72 | Angry 11 | 8.56 | 2.51 | 1.07 | Fear 19 | 2.06 | 6.38 | 2.22 | Sad 10 | 1.01 | 3.59 | 5.77 |
Neutral 4 | 0.83 | 0.69 | 0.76 | Angry 3 | 8.36 | 2.38 | 1.00 | Fear 12 | 2.08 | 6.77 | 1.95 | Sad 5 | 1.04 | 4.16 | 6.98 |
Neutral 16 | 0.94 | 2.36 | 2.81 | Angry 17 | 8.30 | 2.64 | 1.64 | Fear 15 | 2.32 | 6.72 | 2.19 | Sad 9 | 1.06 | 3.96 | 6.27 |
Neutral 17 | 1.01 | 1.18 | 1.91 | Angry 15 | 8.11 | 2.79 | 1.15 | Fear 16 | 2.43 | 6.46 | 1.96 | Sad 6 | 1.08 | 4.51 | 6.98 |
Neutral 13 | 1.05 | 0.99 | 1.86 | Angry 18 | 8.08 | 2.67 | 1.87 | Fear 10 | 2.54 | 5.58 | 2.65 | Sad 3 | 1.11 | 3.84 | 6.34 |
Neutral 7 | 1.07 | 1.11 | 2.04 | Angry 8 | 8.06 | 2.50 | 1.37 | Fear 13 | 2.54 | 6.24 | 1.77 | Sad 1 | 1.11 | 4.25 | 8.29 |
Neutral 2 | 1.17 | 2.36 | 3.83 | Angry 19 | 8.04 | 2.41 | 1.18 | Fear 11 | 2.72 | 5.98 | 1.85 | Sad 8 | 1.11 | 3.89 | 7.00 |
Neutral 14 | 1.18 | 1.18 | 1.51 | Angry 7 | 7.96 | 2.65 | 1.34 | Fear 3 | 3.24 | 4.34 | 2.02 | Sad 18 | 1.47 | 4.37 | 7.67 |
Neutral 19 | 1.21 | 0.75 | 1.36 | Angry 20 | 7.94 | 2.98 | 1.43 | Fear 1 | 3.41 | 4.04 | 1.29 | Sad 13 | 1.52 | 3.98 | 7.40 |
Neutral 20 | 1.23 | 0.88 | 1.50 | Angry 4 | 7.92 | 2.64 | 1.09 | Fear 14 | 3.50 | 6.30 | 1.68 | Sad 14 | 1.77 | 4.69 | 7.95 |
Neutral 10 | 1.25 | 1.76 | 2.82 | Angry 9 | 7.80 | 2.37 | 1.14 | Fear 8 | 3.73 | 5.64 | 1.95 | Sad 12 | 1.95 | 4.49 | 7.56 |
Neutral 15 | 1.34 | 1.41 | 2.69 | Angry 16 | 7.78 | 2.64 | 1.20 | Fear 2 | 3.76 | 5.39 | 2.32 | Sad 19 | 2.01 | 4.36 | 7.77 |
Neutral 6 | 1.46 | 1.43 | 1.69 | Angry 12 | 7.76 | 2.66 | 1.21 | Fear 4 | 4.03 | 5.24 | 2.30 | Sad 16 | 2.05 | 4.11 | 7.47 |
Neutral 12 | 1.65 | 0.98 | 1.53 | Angry 5 | 7.73 | 2.51 | 1.05 | Fear 6 | 4.12 | 4.09 | 2.09 | Sad 17 | 2.07 | 4.63 | 7.77 |
Neutral 8 | 1.65 | 1.89 | 2.49 | Angry 2 | 7.45 | 2.23 | 1.41 | Fear 9 | 4.39 | 4.64 | 2.47 | Sad 20 | 2.25 | 6.77 | 2.74 |
Neutral 5 | 1.90 | 1.09 | 1.26 | Angry 10 | 7.23 | 2.31 | 1.21 | Fear 5 | 4.80 | 4.91 | 2.19 | Sad 15 | 2.70 | 4.64 | 7.52 |
Neutral 1 | 2.23 | 1.84 | 3.01 | Angry 6 | 6.38 | 2.38 | 1.68 | Fear 7 | 5.32 | 5.70 | 3.21 | Sad 11 | 3.08 | 4.57 | 8.20 |
Coefficients of the linear mixed-effects models