ARVO Annual Meeting Abstract | April 2009
Volume 50, Issue 13
Predicting Visual Estimates of Depth of Focus From Optical Data
Author Affiliations & Notes
  • S. Manzanera
    Laboratorio de Optica, Universidad de Murcia, Murcia, Spain
  • P. M. Prieto
    Laboratorio de Optica, Universidad de Murcia, Murcia, Spain
  • C. Canovas
    Laboratorio de Optica, Universidad de Murcia, Murcia, Spain
  • H. Weeber
    R & D, AMO Groningen, Groningen, The Netherlands
  • P. Piers
    R & D, AMO Groningen, Groningen, The Netherlands
  • P. Artal
    Laboratorio de Optica, Universidad de Murcia, Murcia, Spain
  • Footnotes
    Commercial Relationships  S. Manzanera, AMO, F; P.M. Prieto, AMO, F; C. Canovas, AMO, F; H. Weeber, AMO, E; P. Piers, AMO, E; P. Artal, AMO, F; AMO, C.
    Support  "Ministerio de Educación y Ciencia", Spain (grant FIS2007-63123), "Fundación Séneca", Murcia, Spain (grant 04524/GERM/06), and AMO.
Investigative Ophthalmology & Visual Science April 2009, Vol.50, 1120.

Purpose: A procedure to predict subjective depth of focus (DoF) from optical data would be very useful for designing and testing improved presbyopia-correcting phase profiles. In this study, we compared subjective (visual) estimates of DoF with estimates obtained from optical metrics for different subjects, taking individual contrast threshold data into account.

Methods: We used an adaptive optics visual simulator to simultaneously manipulate the effective ocular aberrations and the stimulus vergence while measuring the subject's visual performance with a word-reading acuity test. The instrument consisted of a Hartmann-Shack wavefront sensor together with a liquid crystal spatial light modulator and a stimulus generator. The experiment was performed on three subjects for a 4.8 mm pupil, with paralyzed accommodation, in green light. The system included an optometer that allowed the vergence of the stimulus to be modified. For different letter sizes and different aberrations, the visual DoF was estimated as the defocus range over which the words were readable. At the threshold defocus value, the modulation transfer function (MTF) was computed for the subject's actual aberrations. The modulation value at the main spatial frequency of the word test for each letter size was taken as the contrast threshold.

Results: For each subject, phase condition, and stimulus letter size, we compared the visually determined DoF with that calculated from the optical data. The latter was obtained by computing the amount of defocus that brings the MTF for the combination of the phase profile and the subject's aberrations below the contrast threshold at the main spatial frequency of the stimulus. The two DoF estimates showed a similar tendency for some pure Zernike modes, but there was no good correlation in general, with differences larger than the experimental errors.
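The optical DoF calculation described above — finding the defocus range over which the MTF at the stimulus's main spatial frequency stays above the subject's contrast threshold — can be sketched as follows. This is a minimal illustration, not the authors' code: the pupil grid geometry, the ~543 nm green wavelength, the restriction to a single Zernike defocus term Z(2,0), the FFT-based MTF, and the frequency index are all simplifying assumptions.

```python
import numpy as np

def pupil_function(n=128, defocus_um=0.0, wavelength_um=0.543):
    """Generalized pupil with a Zernike defocus term Z(2,0).

    The grid spans twice the pupil diameter so the FFT-based
    autocorrelation (the OTF) does not wrap around. The geometry
    and wavelength are illustrative assumptions.
    """
    y, x = np.mgrid[-2:2:n * 1j, -2:2:n * 1j]
    r2 = x**2 + y**2
    aperture = (r2 <= 1.0).astype(float)
    # Z(2,0) = sqrt(3) * (2 r^2 - 1), coefficient in microns RMS
    phase = (2 * np.pi / wavelength_um) * defocus_um * np.sqrt(3) * (2 * r2 - 1)
    return aperture * np.exp(1j * phase)

def mtf(pupil):
    """MTF: magnitude of the OTF, i.e. of the Fourier transform of the PSF."""
    psf = np.abs(np.fft.fft2(pupil)) ** 2   # PSF = |FT(pupil)|^2
    m = np.abs(np.fft.fft2(psf))            # |OTF| = |FT(PSF)|
    return m / m[0, 0]                      # normalize to 1 at zero frequency

def depth_of_focus(contrast_threshold, freq_index, step_um=0.05, max_um=1.5):
    """Contiguous defocus range (in um RMS of Z(2,0)) around best focus where
    the MTF at the stimulus's main spatial frequency stays above threshold."""
    defocus = np.arange(-max_um, max_um + step_um, step_um)
    above = [mtf(pupil_function(defocus_um=d))[freq_index, 0] >= contrast_threshold
             for d in defocus]
    i0 = int(np.argmin(np.abs(defocus)))    # index of best focus (d ~ 0)
    if not above[i0]:
        return 0.0
    lo, hi = i0, i0
    while lo > 0 and above[lo - 1]:
        lo -= 1
    while hi < len(above) - 1 and above[hi + 1]:
        hi += 1
    return float(defocus[hi] - defocus[lo])
```

For example, `depth_of_focus(0.2, freq_index=8)` returns a range at least as wide as `depth_of_focus(0.6, freq_index=8)`, mirroring the fact that a lower contrast threshold yields a larger DoF.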

Conclusions: The estimates of DoF from optical data did not accurately reproduce the values obtained by visual testing, even when individually and simultaneously measured contrast thresholds were incorporated into the calculations. This result suggests that, at this stage, visual measurements cannot be avoided when testing the DoF of different phase profiles. However, incorporating further parameters and factors into the procedure could improve the predictions.

Keywords: presbyopia • optical properties 
