R.W. Massof, G.D. Barnett, C. Rainey, C. Epstein, R. Palmer, K. Chen; Performance Measure of Facial Expression Discrimination in Low Vision. Invest. Ophthalmol. Vis. Sci. 2006;47(13):2303.
Purpose: To develop and validate a performance measure of low vision patients' ability to discriminate facial expressions using realistic stimuli.
Methods: Five-second color video clips were made of 10 adult models representing both genders and a wide range of appearances. Each clip began with a profile image of the model exhibiting a neutral expression. The model then turned his/her head to face the camera and produced one of four expressions: excited, pleased, indifferent, or angry. One set of clips was made at an equivalent viewing distance of 10 feet and another at 3 feet, yielding a total of 80 different video clips that served as stimuli. The 80 clips were distributed into four overlapping sets of 40, with 50% of the stimuli in each set duplicating 50% of the stimuli in each of the other sets. Data were obtained from 200 low vision patients consecutively recruited from the Wilmer Low Vision Clinic. Each patient was presented with one of the four sets of video clips and was instructed to imagine that he/she had just entered the room and the person in the video was responding to his/her presence. For each clip, the patient made a forced choice among the four possible expressions, and the response was scored as correct or incorrect. In addition, as part of an intake survey conducted before the clinic visit and data collection session, patients rated the difficulty they have recognizing facial expressions at 10 feet, 6 feet, and 3 feet.
Results: Rasch analysis of the 200-person × 80-item matrix of dichotomous scores provided estimates of difficulty for each video clip and of face discrimination ability for each patient on an interval scale. Construct validity was evaluated with mean square fit statistics; both item and person infit mean squares were consistent with a unidimensional construct that varied between people and video clips. Separation reliabilities (the fraction of observed variance attributable to variance between persons or items) were 0.83 for person measures and 0.88 for item measures. Person measures were significantly correlated with self-reported ratings of facial expression difficulty (Spearman ρ = 0.35 at 10 ft, 0.42 at 6 ft, and 0.40 at 3 ft).
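As a concrete illustration of the statistics reported above, the dichotomous Rasch model and the separation-reliability index can be sketched in Python. This is a minimal sketch, not the authors' software; the ability/difficulty values and standard errors below are invented for illustration and are not the study's data.

```python
import numpy as np

def rasch_prob(theta, b):
    """Dichotomous Rasch model: probability that a person with
    ability theta (logits) responds correctly to an item with
    difficulty b (logits)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def separation_reliability(measures, se):
    """Separation reliability: fraction of the observed variance in
    the estimated measures attributable to true variance between
    persons (or items), i.e. (observed - error) / observed variance,
    where error variance is the mean squared standard error."""
    measures = np.asarray(measures, dtype=float)
    observed_var = np.var(measures, ddof=1)
    error_var = np.mean(np.asarray(se, dtype=float) ** 2)
    return (observed_var - error_var) / observed_var

# Hypothetical example: a patient 0.5 logits above an item's difficulty
p = rasch_prob(theta=0.5, b=0.0)  # probability of a correct judgment

# Hypothetical person measures (logits) with their standard errors
reliability = separation_reliability([0.0, 1.0, 2.0, 3.0],
                                     [0.5, 0.5, 0.5, 0.5])
```

A reliability near the study's 0.83 (persons) or 0.88 (items) indicates that most of the spread in estimated measures reflects real differences rather than estimation error.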
Conclusions: This study demonstrates that, with the application of Rasch analysis, valid and reliable measures of low vision patients' facial expression discrimination ability can be estimated from performance on realistic stimuli.