Clinical and Epidemiologic Research  |   January 2007
Children Unable to Perform Screening Tests in Vision In Preschoolers Study: Proportion with Ocular Conditions and Impact on Measures of Test Accuracy
Investigative Ophthalmology & Visual Science January 2007, Vol.48, 83-87. doi:10.1167/iovs.06-0384
The Vision in Preschoolers Study Group
Abstract

Purpose. To examine the relative prevalence of ocular conditions among children who are unable to perform preschool vision screening tests and the impact on measures of screening test performance.

Methods. Trained nurse and lay screeners each administered a Lea Symbols visual acuity (VA) test (Good-Lite, Inc., Streamwood, IL), the Stereo Smile II test (Stereo Optical, Inc., Chicago, IL), and Retinomax Autorefractor (Right Manufacturing, Virginia Beach, VA) and SureSight Vision Screener (Welch Allyn, Inc., Skaneateles Falls, NY) examinations to 1475 children who later received a comprehensive eye examination to identify amblyopia, strabismus, significant refractive error, and unexplained reduced VA. The examination outcomes for children for whom screeners were unable to obtain results (Unables) were compared with the outcomes for children who passed and children who failed each screening test. When estimating sensitivity, specificity, and positive and negative predictive values (PPV and NPV), Unables were classified as either screening failures or screening passers.

Results. Less than 2% of children were classified as Unables for each test. The percentage with an ocular condition was at least two times higher for Unables than for screening passers for six of the eight modes of screening (P < 0.05). Considering Unables as screening failures, rather than screening passers, increased the estimate of sensitivity by 1% to 3% (depending on test) and decreased the estimate of specificity by 0% to 2%; PPV decreased by 0% to 4% for most tests, whereas NPV increased by <1%.

Conclusions. Preschool children who are unable to perform VIP screening tests are more likely to have vision disorders than are children who pass the tests. Because ≤2% of children were unable to perform each test, referring these children for an eye examination had little impact on the PPV and NPV of the tests, as administered in VIP.

The Vision in Preschoolers (VIP) Study is a multicenter, multidisciplinary, phased study designed to evaluate the performance of vision screening tests for identifying preschool children who would benefit from a comprehensive eye examination. Phase I of the VIP study compared 11 screening tests administered by licensed eye care professionals. 1 Phase II of the VIP study compared the performance of nurses and lay screeners in administering selected screening tests. 2 The screening tests used in phase II were the crowded Linear Lea Symbols Visual Acuity (VA) test, the crowded Single Lea Symbol VA test, the Stereo Smile II test, the Retinomax Autorefractor, and the SureSight Vision Screener. Comprehensive (“gold standard”) eye examinations (GSEs) conducted by study-certified optometrists and ophthalmologists were used to identify ocular conditions that had been targeted for detection (amblyopia, strabismus, significant refractive error, and unexplained low VA). The sensitivity and specificity for the screening tests administered by nurse and lay screeners were calculated. 
In the VIP phase II analysis, children who were unable to perform the screening test (Unables) were classified as screening failures because of the hypothesis that some children are unable to perform a screening test because they have an ocular problem. The purpose of this article is to describe the characteristics of children who were unable to perform vision screening tests and to investigate the effects on the sensitivity and specificity of the screening tests when Unables were classified as screening failures or screening passers. 
Methods
The design and methods for phase II of the VIP Study have been published in detail. 2 Children who participated during the 2003 academic year, when both nurse screeners and lay screeners administered screening tests, are included in this report. The aspects of the study with direct bearing on the interpretation of this report are provided in the following sections. 
Participants
Children enrolled in a Head Start program near one of five VIP clinical centers (Berkeley, CA; Boston, MA; Columbus, OH; Philadelphia, PA; and Tahlequah, OK) who were ≥3 and <5 years of age at the beginning of the 2003 academic year (September 1) were eligible for the study. Head Start is a national, comprehensive child-development program that serves preschool children and their families; its goal is to increase the school readiness of children from low-income families. To obtain a sample enriched with children who had vision problems, recruitment was based on the results of a regular vision screening conducted by local Head Start personnel. Screening procedures varied by site, ranging from tumbling-E tests to noncycloplegic retinoscopy. The goal was to recruit approximately 350 children who had one or more targeted conditions. To accomplish this goal, all children at participating Head Start centers who had failed the Head Start vision screening were asked to participate in the VIP Study, as were a randomly selected subset of children who had not failed the screening. The research adhered to the tenets of the Declaration of Helsinki and was approved by the appropriate local institutional review board(s) associated with each VIP center. Parents or legal guardians of children provided written informed consent.
VIP Screening Procedure
VIP screenings were performed within local Head Start centers. Screeners participated in training sessions and certification activities before the first study screening sessions. Each child was tested by a nurse screener and a lay screener, each of whom conducted four screening tests: Lea Symbols VA test (with the nurse screener using the crowded linear-array test at 10 feet and the lay screener using the crowded single-symbol test at 5 feet), Stereo Smile II stereoacuity test, and Retinomax Autorefractor and SureSight Vision Screener examinations. Children were assigned randomly to either the nurse or lay screener first. Each screener conducted the subjective tests (VA and stereoacuity) first, with test order assigned randomly. The Retinomax Autorefractor and SureSight Vision Screener examinations were administered after the subjective tests, with test order assigned randomly. 
Screening Tests
Linear Lea Symbols VA Test.
The testing material consisted of an 18 × 18 cm lap card on which four optotypes (heart, house, circle, and square) were printed, four 9 × 9 cm cards each containing a single optotype, a card with four 10/100 optotypes surrounded by a crowding bar, and a booklet of linear arrays of five optotypes surrounded on all four sides by a crowding bar (Precision Vision, Inc., La Salle, IL, and Good-Lite, Inc., Streamwood, IL). 1 3 Screening began with a binocular pretest, in which the screener showed the cards containing single, large optotypes to the child at a distance of approximately 1 meter. The child’s task was to match each optotype to the correct one on the lap card or to identify the optotype verbally. The child was allowed two attempts to identify each optotype. Children who could not correctly identify the four symbols during the pretest were considered unable to perform the test. If the child successfully completed the pretest, screening of the right eye began with presentation of the largest optotype card at 10 feet and continued with presentation of cards with increasingly smaller optotypes until the child did not identify at least three of four optotypes on a card or until all cards were completed. Testing of the left eye followed, beginning with the largest optotype card at 10 feet.
Crowded Single Lea Symbol VA Test.
The test involved presentation of Lea Symbols optotypes (Good-Lite, Inc.) at a distance of 5 feet. Optotypes were surrounded on all four sides by a crowding bar and printed on a disc that had an overlay mask with a window, allowing presentation of single crowded symbols. The disc contained four optotypes at each size level. Screening began with the same binocular pretest at 3 feet as used for the Linear Lea Symbols VA test. Children who could not correctly identify the four symbols during the pretest were considered unable to perform the test. If the child successfully completed the pretest, screening of the right eye began. The screener first presented a 5/50 card at 5 feet, followed by the disc for the right eye. Screening continued with presentation of sequences of four optotypes at increasingly smaller sizes until the child did not identify at least three of the optotypes of a particular size or until all sizes were completed. Testing of the left eye followed, beginning with the 5/50 card at 5 feet.
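For illustration, the per-eye testing flow shared by the two Lea Symbols tests (binocular pretest, then progressively smaller optotype sizes until fewer than three of four optotypes are identified) can be sketched as follows. This is a hypothetical Python sketch, not study software; the identify callback, which reports how many optotypes the child identified at a given size, and the list of sizes are illustrative assumptions.

def run_lea_va_test(identify, sizes):
    # identify(eye, size, n_shown) -> number of optotypes correctly identified;
    # 'pretest' stands in for the binocular matching pretest described above.
    # Returns ('unable', None) if the pretest fails; otherwise ('tested', result),
    # where result maps each eye to the smallest optotype size passed (None if no
    # size was passed).
    if identify('both eyes', 'pretest', 4) < 4:     # all four symbols must be matched
        return 'unable', None
    result = {}
    for eye in ('right', 'left'):
        smallest_passed = None
        for size in sizes:                          # presented from largest to smallest
            correct = identify(eye, size, 4)        # four optotypes at each size level
            if correct < 3:                         # fewer than 3 of 4: stop this eye
                break
            smallest_passed = size
        result[eye] = smallest_passed
    return 'tested', result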
Stereo Smile II Test.
The test (Stereo Optical, Inc.) consisted of a “blank” plate (a random-dot pattern), a demonstration plate (a nonstereo “smile face” on a background of random dots), and three plates, each displaying a random-dot stereo smile face of successively finer levels of stereoacuity. Testing was conducted at 40 cm, with the child wearing Polaroid glasses (Polaroid, Cambridge, MA). Screening began with a pretest in which the child had to correctly identify the demonstration plate on four of four or four of five presentations of the demonstration plate paired with the “blank” plate. Children who could not correctly identify the demonstration plate four times during the pretest were considered unable to perform the test. If the child successfully completed the pretest, the screener presented the blank plate paired with each stereo plate for up to five presentations, proceeding to finer disparities as long as the child correctly identified the stereo plate on four of four or four of five presentations. 
Retinomax Autorefractor and SureSight Vision Screener.
The Retinomax Autorefractor (Right Manufacturing, Virginia Beach, VA) and the SureSight Vision Screener (Welch Allyn, Inc., Skaneateles Falls, NY; software version 2.12) are handheld autorefractors used to measure refractive error monocularly. If the reliability rating for the summary reading of an eye was less than the manufacturer’s recommended minimum value (8 for the Retinomax, 6 for the SureSight Vision Screener), the process was repeated. A maximum of three readings per eye could be made. Children with at least one printed refractive error for each eye, regardless of reliability rating, were considered able to be screened. Children who did not allow the machine to be positioned properly or for whom the machine did not provide a refractive error reading for an eye were considered unable to perform the test. A printed series of 9’s for the SureSight Vision Screener was not considered a refractive error reading. 
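A minimal sketch of the able/unable rule just described, assuming a hypothetical representation of the printout as a list of up to three readings per eye, each recording whether a refractive error was printed, its reliability rating, and (for the SureSight) whether the printout was a series of 9's:

RELIABILITY_MIN = {'retinomax': 8, 'suresight': 6}   # manufacturers' recommended minima
MAX_READINGS_PER_EYE = 3

def needs_repeat(device, readings):
    # Repeat the measurement when the best reliability so far is below the
    # recommended minimum and fewer than three readings have been taken.
    if len(readings) >= MAX_READINGS_PER_EYE:
        return False
    best = max((r['reliability'] for r in readings), default=0)
    return best < RELIABILITY_MIN[device]

def screening_status(device, right_eye_readings, left_eye_readings):
    # 'able' if each eye has at least one printed refractive error, regardless of
    # reliability rating; a SureSight printout of all 9's does not count as a reading.
    def has_reading(readings):
        return any(r['printed'] and not (device == 'suresight' and r.get('all_nines'))
                   for r in readings)
    if has_reading(right_eye_readings) and has_reading(left_eye_readings):
        return 'able'
    return 'unable'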
Gold Standard Examination
Comprehensive eye examinations were conducted in a VIP van by optometrists and ophthalmologists who were experienced in providing care to children and who had participated in training sessions and certification activities. 4 Screeners and GSE examiners were masked to each other’s results. Monocular distance VA assessment, 5 6 cover testing at distance and near, and cycloplegic retinoscopy were used to determine the presence of amblyopia, strabismus, significant refractive error, and/or unexplained reduced VA. Unilateral amblyopia was defined as a 3-line (presumed amblyopia) or 2-line (suspected amblyopia) interocular acuity difference accompanied by strabismus and/or anisometropia. Reduced VA was defined as VA worse than 20/50 in 3-year-olds and worse than 20/40 in 4-year-olds. Bilateral amblyopia was defined as reduced VA and an amblyogenic factor in each eye (astigmatism >2.50 D, hyperopia >5.00 D, or myopia >8.00 D). Significant refractive error was defined as astigmatism >1.50 D, hyperopia >3.25 D, myopia >2.00 D, or anisometropia. These targeted conditions were further categorized into three groups based on the severity of the ocular condition. 1 Group 1 conditions, considered to be the most severe and very important to detect and treat early, included bilateral amblyopia, presumed unilateral amblyopia with worse-eye VA ≤20/64, constant strabismus, hyperopia ≥5.0 D, astigmatism ≥2.5 D, myopia ≥6.0 D, or severe anisometropia (interocular difference >2.0 D in hyperopia, >3.0 D in astigmatism, or >6.0 D in myopia).
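The refractive-error cutoffs above translate directly into classification rules. The sketch below (hypothetical Python, with refractive magnitudes passed as positive diopters per eye) restates the significant refractive error and group 1 refractive criteria; the anisometropia criterion for significant refractive error is taken as a yes/no input because its threshold is defined in the study protocol rather than restated here.

def significant_refractive_error(hyperopia_d, myopia_d, astigmatism_d, anisometropia):
    # Astigmatism >1.50 D, hyperopia >3.25 D, myopia >2.00 D, or anisometropia.
    return (astigmatism_d > 1.50 or hyperopia_d > 3.25
            or myopia_d > 2.00 or anisometropia)

def group1_refractive_error(hyperopia_d, myopia_d, astigmatism_d,
                            aniso_hyperopia_d, aniso_astigmatism_d, aniso_myopia_d):
    # Group 1 refractive criteria: hyperopia >=5.0 D, astigmatism >=2.5 D,
    # myopia >=6.0 D, or severe anisometropia (interocular difference >2.0 D in
    # hyperopia, >3.0 D in astigmatism, or >6.0 D in myopia).
    severe_anisometropia = (aniso_hyperopia_d > 2.0 or aniso_astigmatism_d > 3.0
                            or aniso_myopia_d > 6.0)
    return (hyperopia_d >= 5.0 or astigmatism_d >= 2.5
            or myopia_d >= 6.0 or severe_anisometropia)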
Data Analysis
For each screening test, the results were classified according to the screening failure criteria used in the previous report on phase II results. 2 These criteria were such that overall sensitivity for detecting any targeted condition was maximized with specificity set to 0.90. Comparisons among proportions were evaluated using exact tests for categorical data. 
Sensitivity was calculated as the proportion of children who failed a screening test among children with a targeted condition, and specificity as the proportion of children who passed a screening test among those without any targeted condition. Because of the random sampling of children who had passed the local Head Start screening, specificity was derived as a weighted average of the specificity for children who had failed the Head Start screening or were identified by their teachers as high risk (weight 1/6) and the specificity for children who had not failed the Head Start screening (weight 5/6). The sensitivities and specificities were compared under three options for handling the results of children unable to perform a screening test: classifying Unables as screening failures, classifying them as screening passers, or excluding them from the calculations.
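A sketch of these calculations, assuming a hypothetical per-child record of (screening result, targeted condition on GSE, high-risk sampling stratum); the 1/6 and 5/6 weights are those described above, and the function and field names are illustrative:

def sensitivity_and_specificity(records, unable_as):
    # records: iterable of (screen, condition, high_risk) tuples, where screen is
    # 'pass', 'fail', or 'unable'; condition is True if any targeted condition was
    # found on the GSE; high_risk is True for children who failed the Head Start
    # screening or were teacher-identified as high risk (sampling weight 1/6).
    # unable_as: 'fail', 'pass', or 'exclude'.
    def effective(screen):
        if screen != 'unable':
            return screen
        return None if unable_as == 'exclude' else unable_as

    tp = fn = 0
    tn = {True: 0, False: 0}     # true negatives, tallied within each sampling stratum
    fp = {True: 0, False: 0}     # false positives, tallied within each sampling stratum
    for screen, condition, high_risk in records:
        s = effective(screen)
        if s is None:
            continue             # Unables dropped under the 'exclude' option
        if condition:
            tp += (s == 'fail')
            fn += (s == 'pass')
        else:
            tn[high_risk] += (s == 'pass')
            fp[high_risk] += (s == 'fail')

    sensitivity = tp / (tp + fn)
    spec_high_risk = tn[True] / (tn[True] + fp[True])
    spec_other = tn[False] / (tn[False] + fp[False])
    specificity = (1 / 6) * spec_high_risk + (5 / 6) * spec_other   # sampling weights
    return sensitivity, specificity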
The positive predictive value (PPV) is the probability that children who fail the screening test actually have one of the targeted conditions, and the negative predictive value (NPV) is the probability that children who pass the screening test actually do not have any of the targeted conditions. Predictive values are a function of the prevalence of targeted conditions, as well as the sensitivity and specificity of the screening test. Approximately 32% of the VIP phase II study population had a targeted condition; however, because children who failed the Head Start screening were oversampled for the study, this proportion is not applicable to the general population. Using information collected during recruitment for the study, the estimated prevalence of targeted conditions in the Head Start population is approximately 20%. Because low-income Head Start children are at increased risk of vision problems relative to the general population, a lower estimate of 15% was used for the prevalence in the general population in the analyses presented. PPV and NPV were compared under two methods of incorporating Unables into the screening results: classifying Unables as screening failures or as screening passers.
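Because the predictive values follow from sensitivity, specificity, and prevalence by the standard formulas, the entries in Table 2 can be reproduced; the short sketch below does so for the lay-screener SureSight values with Unables counted as screening failures (sensitivity 0.61, specificity 0.90, assumed prevalence 15%).

def predictive_values(sensitivity, specificity, prevalence):
    # Standard Bayes formulas for PPV and NPV given the prevalence of the condition.
    ppv = (sensitivity * prevalence) / (
        sensitivity * prevalence + (1 - specificity) * (1 - prevalence))
    npv = (specificity * (1 - prevalence)) / (
        specificity * (1 - prevalence) + (1 - sensitivity) * prevalence)
    return ppv, npv

print(predictive_values(0.61, 0.90, 0.15))   # approximately (0.52, 0.93), as in Table 2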
Results
At least one screening test was performed on 1541 children. A few children left the screening area before seeing both the nurse screener and the lay screener or before all screening tests could be performed by a screener, so that the number of children with screening results varied from 1534 to 1537 across tests. GSEs were initiated for 1475 children (95.7%), and 1452 (94.2%) had examinations complete enough to provide classification with respect to the targeted conditions and severity level.
As reported previously, 2 the proportion of children classified as Unable was 2% or less for each test (Table 1). Children who were unable to perform one or more tests with one screener were likely to be unable to perform one or more tests with the other screener; among the 106 children who were unable to perform at least one test, 23 (22%) were unable to perform at least one test with each screener. Often, but not always, the child was unable to do the same test with each screener. Children who were unable to perform one of the subjective tests (VA or stereoacuity) were generally able to perform the objective tests (autorefractors) and vice versa; only three children in the entire study were unable to perform both one of the subjective tests and one of the objective tests. The average age of children unable to perform one or more screening tests was 43.7 months, whereas the average age of children able to perform all tests was 47.2 months (P < 0.0001). Children who were unable to perform a screening test were less likely to have a complete GSE. With the exception of the SureSight Vision Screener administered by lay screeners, the percentage of children with an incomplete examination was 1% among children able to perform the screening test, but 10% or more among those who were unable (P < 0.05 for each screening test administered by each type of screening personnel). Most children with an incomplete GSE (21 [91%] of 23) were unable to complete threshold VA testing.
The percentage of Unables who were found to have one or more targeted conditions on the GSE (GSE failure) was generally between the percentage for children classified as pass and the percentage for children classified as fail for each screening test (Table 1) . For the two autorefractors administered by lay screeners, the percentage failing the GSE in the Unable group was higher than in the screening fail group, but not to a statistically significant degree (P > 0.05). For each screening test, the percentage of GSE failures among the Unable group was significantly higher than the percentage among the pass group for one or both types of screeners. Unables for the autorefractors who were GSE failures also tended to have a higher percentage of group 1 (the most severe) conditions. Although approximately 45% of all GSE failures had a condition in severity group 1, all the GSE failures in the Unable groups for the Retinomax Autorefractor had a group-1 condition as did nearly all the GSE failures in the Unable groups for the SureSight Vision Screener (11 [85%] of 13 for lay screeners; 11 [79%] of 14 for nurse screeners). 
The impact on the characteristics (such as sensitivity and specificity) of the screening tests of choosing different options for handling the results of children unable to perform the test is shown in Table 2. Because children in this study who were unable to perform a test were generally more likely to have a targeted condition than children who passed the test, but less likely than children who failed the test, sensitivity is higher and specificity is lower when Unable children are considered screening failures than when they are considered as passing the screening test. Often, participants unable to perform a screening test are excluded from the analysis. 7 8 If the Unable group is excluded from the calculation of sensitivity and specificity, then sensitivity and specificity fall between the values obtained by classifying all Unables as screening passers or as screening failures.
Under the assumption that the combined prevalence of all the targeted conditions is 15%, and with the percentage with a targeted condition in the Unable group intermediate between the percentages for those passing and failing the screening, the PPV is higher when the Unable group is considered as passing the screening test. The exception in Table 2 is the SureSight Vision Screener administered by lay screeners; in this case, the proportion of the Unable group with a targeted condition was higher than the proportion in the screening failure group (Table 1). The NPV would be expected to be lower when the Unable group is considered as passing; however, because the percentage of Unables is low and the proportion of children without a targeted condition is high (85%), the choice of classification for Unable children had no impact on the values for NPV within two decimal places. If the Unable group is ignored in the calculation of sensitivity and specificity, predictive values for a screening test are not defined.
Discussion
When screened for vision problems with four available tests, 2 preschool children who were unable to perform the screening test were at higher risk of having amblyopia, strabismus, significant refractive error, or unexplained low VA than were children who passed the screening test. The Unable group had higher risk whether the tests were administered by nurse or lay screeners. 
A high proportion of children unable to perform a screening test were also unable to perform threshold VA testing during the GSE. The finding that eye care professionals experienced in working with children aged 3 to 5 years could not complete testing suggests that at least some of the children who are unable to perform screening tests may have behavioral or learning characteristics that interfere with evaluation of the child’s visual system. Nonetheless, among the children for whom a complete examination could be performed, children unable to perform a screening test generally had a risk of having an eye condition that was intermediate between the risks for children passing and children failing the screening test. Of interest, children who were unable to perform one of the subjective tests were nearly always able to provide a reading on an autorefractor, and those unable to provide a reading on an autorefractor were nearly always able to perform the subjective screening tests. Unfortunately, our study population was too small to determine whether it is better to refer such a child for an examination or to screen with a different test.
In practice, children who are unable to perform a screening test are often not referred for a comprehensive eye examination and are instead managed as if they had passed the screening. Given that the prevalence rates of the targeted conditions in the Unable groups were usually intermediate between the prevalence rates in the screening pass and fail groups, the VIP phase II data demonstrate that including the Unable group with the screening failure group increases sensitivity to a modest degree (1%–3%, Table 2). The percentage of all screened children who were Unable on a test was so small (≤2%) that the additional number of Unable children without any condition who were considered screening failures caused only very small decreases (0%–2%) in specificity relative to classifying the Unable group as passing the screening test. These differences in sensitivity and specificity yielded small decreases (0%–4%) in positive predictive value when the Unable group was considered as failing the screening and no meaningful change (<1%) in negative predictive value when the prevalence of targeted conditions was assumed to be 15%.
The results of phase II of the VIP Study show that very few children between the ages of 3 and 5 years are unable to perform these preschool vision screening tests when the tests are administered by trained screeners. However, children who are unable to perform a test are at higher risk of having a targeted condition than are children who perform the test and pass it. There are several options for handling these children: refer (fail) them for a comprehensive eye examination, screen with a different test, screen again at a later date, or pass them. These results suggest that, when these tests are used by trained screeners, it is better to refer or rescreen the child (either with a different test or at a later date) than to pass the child. Screening performed in other settings or by less well-trained screeners may yield a higher proportion of children who are unable to perform a test, and the excess risk of having a targeted condition among those children may be lower than that observed in the VIP phase II study. Whether it is better for a particular screening program to refer or to rescreen depends on the resources available and on whether there is an opportunity to screen children at a later date.
Appendix 1
The Vision in Preschoolers Study Group
Executive Committee.
Paulette Schmidt, (Chair), Agnieshka Baumritter, Elise Ciner, Lynn Cyert, Velma Dobson, Beth Haas, Marjean Taylor Kulp, Maureen Maguire, Bruce Moore, Deborah Orel-Bixler, Ellen Peskin, Graham Quinn, Maryann Redford, Janet Schultz, Gui-shuang Ying. 
Writing Committee.
Maureen Maguire (Chair), Chengcheng Liu, Velma Dobson, Graham Quinn. 
Participating Centers
AA, Administrative Assistant; BPC, Back-up Project Coordinator; GSE, Gold Standard Examiner; LS, Lay Screener; NS, Nurse Screener; PI, Principal Investigator; PC, Project Coordinator; PL, Parent Liaison; PR, Programmer; VD, Van Driver; NHC, Nurse/Health Coordinator.
University of California Berkeley School of Optometry (Berkeley, CA).
Deborah Orel-Bixler (PI/GSE), Pamela Qualley (PC), Dru Howard (BPC/PL), Lempi Miller Suzuki (BPC), Sarah Fisher (GSE), Darlene Fong (GSE), Sara Frane (GSE), Cindy Hsiao-Threlkeld (GSE), Selim Koseoglu (GSE), A. Mika Moy (GSE), Sharyn Shapiro (GSE), Lisa Verdon (GSE), Tonya Watson (GSE), Sean McDonnell (LS/VD), Erika Paez (LS), Darlene Sloan (LS), Evelyn Smith (LS), Leticia Soto (LS), Robert Prinz (LS), Joan Edelstein (NS), Beatrice Moe (NS). 
New England College of Optometry (Boston, MA).
Bruce Moore (PI/GSE), Joanne Bolden (PC), Sandra Umaña (PC/LS/PL), Amy Silbert (BPC), Nicole Quinn (GSE), Heather Bordeau (GSE), Nancy Carlson (GSE), Amy Croteau (GSE), Micki Flynn (GSE), Barry Kran (GSE), Jean Ramsey (GSE), Melissa Suckow (GSE), Erik Weissberg (GSE), Marthedala Chery (LS/PL), Maria Diaz (LS), Leticia Gonzalez (LS/PL), Edward Braverman (LS/VD), Rosalyn Johnson (LS/PL), Charlene Henderson (LS/PL), Maria Bonila (PL), Cathy Doherty (NS), Cynthia Peace-Pierre (NS), Ann Saxbe (NS), Vadra Tabb (NS). 
The Ohio State University College of Optometry (Columbus, OH).
Paulette Schmidt (PI), Marjean Taylor Kulp (Co-investigator/GSE), Molly Biddle (PC), Jason Hudson (BPC), Melanie Ackerman (GSE), Sandra Anderson (GSE), Michael Earley (GSE), Kristyne Edwards (GSE), Nancy Evans (GSE), Heather Gebhart (GSE), Jay Henry (GSE), Richard Hertle (GSE), Jeffrey Hutchinson (GSE), LeVelle Jenkins (GSE), Andrew Toole (GSE), Keith Johnson (LS/VD), Richard Shoemaker (VD), Rita Atkinson (LS), Fran Hochstedler (LS), Tonya James (LS), Tasha Jones (LS), June Kellum (LS), Denise Martin (LS), Christina Dunagan (NS), Joy Cline (NS), Sue Rund (NS). 
Pennsylvania College of Optometry (Philadelphia, PA).
Elise Ciner (PI/GSE), Angela Duson (PC/LS), Lydia Parke (BPC), Mark Boas (GSE), Shannon Burgess (GSE), Penelope Copenhaven (GSE), Ellie Francis (GSE), Michael Gallaway (GSE), Sheryl Menacker (GSE), Graham Quinn (GSE), Janet Schwartz (GSE), Brandy Scombordi-Raghu (GSE), Janet Swiatocha (GSE), Edward Zikoski (GSE), Leslie Kennedy (LS/PL), Rosemary Little (LS/PL), Geneva Moss (LS/PL), Latricia Rorie (LS), Shirley Stokes (LS/PL), Jose Figueroa (LS/VD), Eric Nesmith (LS), Gwen Gold (BPC/NHC/PL), Ashanti Carter (PL), David Harvey (LS/VD), Sandra Hall (NS), Lisa Hildebrand (NS), Margaret Lapsley (NS), Cecilia Quenzer (NS), Lynn Rosenbach (NHC/NS). 
Northeastern State University College of Optometry (Tahlequah, OK).
Lynn Cyert (PI/GSE), Linda Cheatham (PC/VD), Anna Chambless (BPC/PL), Colby Beats (GSE), Jerry Carter (GSE), Debbie Coy (GSE), Jeffrey Long (GSE), Shelly Rice (GSE), Shelly Dreadfulwater, (LS/PL), Cindy McCully (LS/PL), Rod Wyers (LS/VD), Ramona Blake (LS/PL), Jamey Boswell (LS/PL), Anna Brown (LS/PL), Jeff Fisher (NS), Jody Larrison (NS). 
Study Center: The Ohio State University College of Optometry.
Paulette Schmidt (PI), Beth Haas (Study Coordinator). 
Coordinating Center: University of Pennsylvania, Department of Ophthalmology.
Maureen Maguire (PI), Agnieshka Baumritter (Project Director), Mary Brightwell-Arnold (Systems Analyst), Christine Holmes (AA), Andrew James (PR), Aleksandr Khvatov (PR), Lori O’Brien (AA), Ellen Peskin (Project Director), Claressa Whearry (AA), Gui-shuang Ying (Biostatistician). 
National Eye Institute (Bethesda, MD).
Maryann Redford. 
 
Table 1. Proportion of Children with Ocular Conditions by the Results of Screening Tests

Screener  Screening Test  Result   n      Ocular Condition, n (%)   P*
Lay       Single Lea      Fail     371    277 (74.66)               0.24
Lay       Single Lea      Pass     1070   180 (16.82)               0.01
Lay       Single Lea      Unable   9      5 (55.56)
Lay       Stereo Smile    Fail     258    179 (69.38)               0.002
Lay       Stereo Smile    Pass     1169   275 (23.52)               0.22
Lay       Stereo Smile    Unable   23     8 (34.78)
Lay       Retinomax       Fail     399    283 (70.93)               0.68
Lay       Retinomax       Pass     1045   174 (16.65)               <0.001
Lay       Retinomax       Unable   6      5 (83.33)
Lay       SureSight       Fail     383    269 (70.23)               0.08
Lay       SureSight       Pass     1052   179 (17.02)               <0.001
Lay       SureSight       Unable   14     13 (92.86)
Nurse     Linear Lea      Fail     333    221 (66.37)               0.45
Nurse     Linear Lea      Pass     1108   236 (21.30)               0.07
Nurse     Linear Lea      Unable   8      4 (50.00)
Nurse     Stereo Smile    Fail     301    196 (65.12)               0.07
Nurse     Stereo Smile    Pass     1126   255 (22.65)               0.02
Nurse     Stereo Smile    Unable   22     10 (45.45)
Nurse     Retinomax       Fail     422    310 (73.46)               0.12
Nurse     Retinomax       Pass     1022   149 (14.58)               0.16
Nurse     Retinomax       Unable   5      2 (40.00)
Nurse     SureSight       Fail     386    279 (72.28)               0.006
Nurse     SureSight       Pass     1030   168 (16.31)               <0.001
Nurse     SureSight       Unable   30     14 (46.67)
Table 2. Screening Test Characteristics with Different Options for Handling Results of Children Unable to Perform the Test

                      Lay Screener                               Nurse Screener
             Unable   Single  Stereo  Retino-  Sure-     Linear  Stereo  Retino-  Sure-
             Option   Lea     Smile   max      Sight     Lea     Smile   max      Sight
Sensitivity  Fail     0.61    0.40    0.62     0.61      0.49    0.45    0.68     0.64
             Pass     0.60    0.39    0.61     0.58      0.48    0.43    0.67     0.61
             Exclude  0.61    0.39    0.62     0.60      0.48    0.44    0.68     0.62
Specificity  Fail     0.91    0.90    0.90     0.90      0.89    0.90    0.90     0.90
             Pass     0.91    0.92    0.90     0.90      0.90    0.91    0.90     0.91
             Exclude  0.91    0.92    0.90     0.90      0.90    0.91    0.90     0.91
PPV          Fail     0.55    0.42    0.52     0.52      0.45    0.43    0.53     0.51
             Pass     0.56    0.46    0.52     0.51      0.45    0.45    0.54     0.55
NPV          Fail     0.93    0.90    0.93     0.93      0.91    0.90    0.94     0.93
             Pass     0.93    0.90    0.93     0.93      0.91    0.90    0.94     0.93
References
1. Vision In Preschoolers (VIP) Study Group. Comparison of preschool vision screening tests as administered by licensed eye care professionals in the Vision In Preschoolers (VIP) Study. Ophthalmology. 2004;111:637–650.
2. Vision In Preschoolers (VIP) Study Group. Preschool vision screening tests administered by nurse screeners compared to lay screeners in the Vision in Preschoolers Study. Invest Ophthalmol Vis Sci. 2005;46:2639–2648.
3. Vision In Preschoolers (VIP) Study Group. Preschool visual acuity screening with HOTV and Lea symbols: testability and between-test agreement. Optom Vis Sci. 2004;81:678–683.
4. Vision In Preschoolers (VIP) Study Group. Implementation of a preschool vision screening program in a mobile setting. The NHSA [National Head Start Association] Dialog. 2005;8:16–24.
5. Moke PS, Turpin AH, Beck RW, et al. Computerized method of visual acuity testing: adaptation of the Amblyopia Treatment Study visual acuity testing protocol. Am J Ophthalmol. 2001;132:903–909.
6. Vision In Preschoolers (VIP) Study Group. The electronic visual acuity tester: testability in preschool children. Optom Vis Sci. 2004;81:238–244.
7. Spierer A, Royzman Z, Chetrit A, Novikov I, Barkay A. Vision screening of preverbal children with Teller Acuity Cards. Ophthalmology. 1999;106:849–854.
8. Merritt JC, Game S, Williams OD, Blake D. Visual acuity in preschool children: the Chapel Hill-Durham day-care vision study. J Natl Med Assoc. 1996;88:709–712.