Over the median observation period of 8.2 years, 8 of the 95 OHT eyes converted to manifest glaucoma. Conversion and nonconversion were predicted to different degrees with different PERG measures (A, amplitudes to 0.8° checks; B, age-corrected amplitudes to 0.8° checks; C, amplitudes to 16° checks; D, the PERG ratio [i.e., A divided by C]). The predictive values of these measures are depicted in
Table 2. All measures showed some correlation with the development of glaucomatous field defects; the PERG ratio had the largest ROC area. With the threshold set at 1.06 (minimum error score), the PERG ratio predicted conversion or nonconversion for the year after an examination with a sensitivity of 80%, a specificity of 71%, a positive predictive value of 23%, and a negative predictive value of 97%.
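To make the relation between these four predictive values and the underlying counts explicit, the following sketch computes them from a 2 × 2 confusion matrix; the counts are hypothetical, chosen only to roughly reproduce the rates reported above, and are not taken from the study’s data.

```python
def predictive_values(tp, fn, fp, tn):
    """Sensitivity, specificity, PPV, and NPV from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)   # converters correctly flagged
    specificity = tn / (tn + fp)   # nonconverters correctly cleared
    ppv = tp / (tp + fp)           # flagged eyes that truly convert
    npv = tn / (tn + fn)           # cleared eyes that truly stay stable
    return sensitivity, specificity, ppv, npv

# Hypothetical counts (not the study's data), chosen to give similar rates:
sens, spec, ppv, npv = predictive_values(tp=4, fn=1, fp=13, tn=32)
print(f"sens={sens:.0%}  spec={spec:.0%}  PPV={ppv:.0%}  NPV={npv:.0%}")
```

The sketch also shows why, at a low conversion rate, even a reasonably specific test yields a modest positive predictive value while the negative predictive value remains high.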
The slight superiority of the PERG ratio over the other PERG measures suggests that, indeed, normalization by the response to very large checks reduces variability. PERG amplitudes can vary by a factor of three between individuals.
18 Because an individual with a large 0.8° PERG will also have a large 16° PERG, it is useful to compute the PERG ratio to reduce interindividual variability.
18 Moreover, the PERG ratio has the inherent advantage that electrode placement from session to session and electrode type are of little relevance: To a first approximation, the PERG ratio would not differ between electrode type, be it corneal or skin electrode,
42 because the amplitude transfer factor would not depend on the check size and would thus be factored out by computing the ratio. Given this theoretical advantage, we are somewhat surprised that the PERG ratio performed only approximately 10% better than the response amplitude to 0.8° checks.
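The cancellation argument can be illustrated with a toy calculation; the amplitudes and transfer factors below are purely hypothetical.

```python
# A session- or electrode-specific transfer factor k scales both responses
# equally, so it cancels in the PERG ratio (all values hypothetical).
true_small = 2.0   # response to 0.8° checks, µV
true_large = 1.8   # response to 16° checks, µV

for k in (0.5, 1.0, 1.7):              # e.g., skin vs. corneal electrode
    ratio = (k * true_small) / (k * true_large)
    print(f"k={k}: PERG ratio = {ratio:.3f}")   # identical for every k
```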
With more severe glaucoma, the PERG to 16° checks also decreases, and hence the ratio misleadingly improves. The PERG ratio would therefore be a poor choice for monitoring advanced glaucoma. In OHT, however, we are concerned with very early stages, in which the responses to 0.8° checks and to 16° checks are affected to different degrees.
The lower yield of the age-corrected amplitude compared with the uncorrected amplitude suggests that age was a partial confounder. Indeed, converters were somewhat older than nonconverters (55.6 vs. 49.4 years of age), but the difference was not statistically significant.
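Age correction of an amplitude, as in measure B above, might be sketched as a simple linear adjustment to a reference age; the slope and reference age below are illustrative assumptions, not the study’s normative values.

```python
# Hypothetical linear age correction (slope and reference age assumed).
SLOPE_UV_PER_YEAR = -0.015   # assumed amplitude decline per year of age
REF_AGE = 50                 # reference age for the corrected amplitude

def age_corrected(amplitude_uv, age_years):
    """Adjust a measured amplitude to the reference age."""
    return amplitude_uv - SLOPE_UV_PER_YEAR * (age_years - REF_AGE)

print(age_corrected(1.8, 60))   # an older eye is adjusted upward
```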
Some problems in interpretation arose from the fact that treatment was allowed, which probably distorted the natural course. First, some patients may have received unnecessary treatment, whereas others might have converted had they not been treated. However, this problem does not interfere with the study’s intent, which was to analyze the PERG’s ability to detect glaucoma early, irrespective of treatment. Second, all converting eyes had received treatment before conversion, and the data suggest that this particular treatment did not prevent conversion. However, as long as no conversion was present, treatment was only mild (the mean achieved intraocular pressure was still 24.3 mm Hg); more intensive treatment might have been able to halt the development of glaucoma. It is true that today’s treatment may not always be satisfactory in terms of preventing conversion from OHT to glaucoma.
2 43 We hope that other established or innovative therapies will be more effective. If so, techniques for early detection of glaucoma such as the PERG will be especially valuable.
This is the third longitudinal PERG study of glaucoma: Arai et al.
25 looked at the transient PERG and found a decrease of the N95 after 40 months in OHT eyes, but not in normal eyes. Of their 15 OHT eyes, one developed glaucoma 2 years after PERG decrease. Pfeiffer et al.
20 looked at high-risk eyes and thus, despite their short observation period (11–31 months), found that glaucoma developed in 5 of 29 eyes. This was predicted by the PERG with a sensitivity of 100% and a specificity of 71%. These longitudinal studies show that the suggestions based on cross-sectional studies
14 21 22 23 24 26 27 28 44 seem valid: The PERG amplitude shows signs of glaucomatous damage earlier than is obvious from morphologic or psychophysical measures (though it has been reported that not every patient with glaucoma has a pathologic PERG amplitude
45 ).
The PERG is one of several diagnostic methods designed to detect early dysfunction preceding glaucoma. On the one hand, there are subjective tests designed to detect visual dysfunction before defects become evident on standard white-on-white (w/w) perimetry, such as blue-on-yellow (b/y) perimetry or frequency-doubling technology (FDT) perimetry. In their longitudinal study, Demirel and Johnson
46 demonstrated that in patients with OHT the prevalence of visual field defects was much higher with b/y perimetry (9.2%) than with w/w perimetry (1.4%). This finding fits well with results obtained by Horn et al.
47 who showed that VEP responses obtained by b/y stimulation are a very early indicator of glaucomatous damage. Johnson et al.
48 showed that patients with OHT with a pathologic b/y perimetry finding have an increased risk for development of w/w perimetry defects within 5 years. Another OHT study with a 3-year follow-up showed that b/y perimetry can predict w/w defects with a sensitivity of 73% at a specificity of 68%.
49 B/y perimetry, however, can be impaired by media opacities.
50 The FDT has been reported to detect pre-w/w perimetric glaucoma with a sensitivity of 39% at a specificity of 95% as confirmed with morphometric methods.
51 Longitudinal data on the FDT in OHT have not been published to date. Subjective tests such as FDT and b/y perimetry depend on the patient’s vigilance, which can fluctuate from measurement to measurement and can show improvement simply due to a learning effect.
50 When electrode placement is performed in a standardized fashion, the PERG shows little intraindividual variation (coefficient of variation, ≈15%),
52 although reproducibility may well be different between normal subjects and patients.
Another class of diagnostic methods relies on the hypothesis that early nerve fiber loss precedes visual field defects in standard perimetry. This early nerve fiber loss becomes manifest as an increasing optic disc cup and a decrease in nerve fiber density. Morphometric data on the optic disc can be obtained by scanning laser tomography (e.g., HRT), optical coherence tomography (e.g., OCT3; Carl Zeiss Meditec, Oberkochen, Germany), or optic disc photography. Mardin et al.
53 reported a sensitivity of 42% and a specificity of 95% for the HRT in a cross-sectional study. The HRT data obtained during our study are published elsewhere.
54 The most valid HRT parameter turned out to be the cup-shape measure in the superior temporal sector, with a sensitivity of 56% and a specificity of 70%. These values are lower than those found in the current study for the PERG ratio.
The OCT was found to detect a significantly thinner retinal nerve fiber layer in patients with suspected glaucoma who had normal w/w perimetry results but pathologic b/y results, compared with control subjects;
55 data on sensitivity or specificity were not provided. In a study of 813 patients, photographic monitoring of optic disc cupping predicted glaucomatous visual field loss in only 19%.
56 A decrease in peripapillary nerve fiber density can be detected with nerve fiber photography or scanning laser polarimetry (e.g., GDx; Carl Zeiss Meditec). Nerve fiber photography is the only morphometric method for which longitudinal data are currently available. The respective studies yielded the following sensitivity/specificity values for the detection of visual field defects: 91%/100% 1 to 2 years in advance (Goldmann perimetry),
57 31% to 61%/89% to 96% 1 year in advance,
58 and 54%/68% 5 years in advance.
56 However, in practice, nerve fiber photography is greatly impaired by variations in image quality and interobserver variability.
59 For the GDx nerve fiber analyzer, longitudinal studies on OHT conversion to glaucoma are not yet available. An overall problem of morphometric methods is the distinction of glaucomatous changes from age-related optic disc changes. The PERG shares this problem. This obstacle can be circumvented, however, by calculating the PERG ratio, which stays largely constant during life and decreases only in the presence of disease.
Refractive errors decrease amplitudes for small check sizes more than for large check sizes, mainly because of the reduction in visual acuity.
60 In our study, we sidestepped this pitfall by including only patients with a best-corrected visual acuity of ≥0.8; nevertheless, this issue somewhat limits the general applicability of the PERG for early glaucoma detection.
A comprehensive cross-sectional comparison of methods was performed in a study of 43 patients with early glaucoma and 43 healthy individuals
15 : The authors compared light threshold perimetry, short-wavelength automated perimetry, high-pass resolution perimetry, motion detection, flicker contrast sensitivity, and flickering and isoluminantly matched letter tests. The objective tests were pattern and flash electroretinography. Of all parameters, the PERG performed best (sensitivity 85.4%, specificity 87.8%).
In principle, and if noise affects all methodologies similarly, one would expect clinical function tests (like the PERG) to detect dysfunction preceding glaucoma earlier than would morphometric tests, because a diseased ganglion cell would first lose its specific function and then die and disappear.
The above comparison of methods is hampered by the choice of threshold for each method, especially because there is no gold standard for the detection of glaucoma. For example, if one relaxed the criteria for progression on w/w perimetry (i.e., increased its sensitivity), the advantage of the PERG, at the given PERG ratio threshold, would disappear; at the same time, however, the visual field test would produce more false positives. A comparison of methods therefore remains feasible when the thresholds of the tests are set at optimal sensitivity/specificity trade-offs.
In conclusion, our long-term results suggest that the PERG, with the stimulus parameters applied in this study, helps to predict stability versus glaucomatous progression in OHT. Objectivity, reproducibility, and relative independence from age changes are advantages of this particular glaucoma tool—namely, the PERG ratio. It can help to discriminate between OHT eyes that may develop glaucomatous visual field defects and those that probably will not.
The authors thank Joerg Meyer for substantial help in early parts of the study; Margret Schumacher for recording of the PERGs; Gabriele Graf for recording the visual fields and performing retinal tomography; and Ursula Sessler, Vanessa Inguscio, and Tanja Schwibinger for taking the optic disc photographs.