Open Access
Clinical and Epidemiologic Research  |   February 2023
Blink Rate Measured In Situ Decreases While Reading From Printed Text or Digital Devices, Regardless of Task Duration, Difficulty, or Viewing Distance
Author Affiliations & Notes
  • Ngozi Charity Chidi-Egboka
    School of Optometry and Vision Science, Faculty of Medicine and Health, UNSW Sydney, Sydney, NSW, Australia
  • Isabelle Jalbert
    School of Optometry and Vision Science, Faculty of Medicine and Health, UNSW Sydney, Sydney, NSW, Australia
  • Jiaying Chen
    School of Optometry and Vision Science, Faculty of Medicine and Health, UNSW Sydney, Sydney, NSW, Australia
  • Nancy E. Briggs
    School of Optometry and Vision Science, Faculty of Medicine and Health, UNSW Sydney, Sydney, NSW, Australia
    Mark Wainwright Analytical Centre, UNSW Sydney, Sydney, NSW, Australia
  • Blanka Golebiowski
    School of Optometry and Vision Science, Faculty of Medicine and Health, UNSW Sydney, Sydney, NSW, Australia
  • Correspondence: Ngozi Charity Chidi-Egboka, School of Optometry and Vision Science, Faculty of Medicine and Health, Level 3, North Wing, Rupert Myers Building, Gate 14 Barker St, UNSW Sydney, NSW 2052, Australia; [email protected] 
Investigative Ophthalmology & Visual Science February 2023, Vol. 64, 14. https://doi.org/10.1167/iovs.64.2.14
Abstract

Purpose: To compare blinking measured in situ during various tasks and examine relationships with ocular surface symptoms. The day-to-day repeatability of the blink rate and interblink interval was assessed.

Methods: Twenty-four students (28.6 ± 6.3 years; 8 male and 16 female) completed six reading tasks (printed text, laptop, TV, smartphone, smartphone at 50% brightness, smartphone with complex text), and two nonreading tasks (conversation, walking) in a randomized cross-over study. Ocular surface symptoms and clinical signs were assessed. The blink rate and interblink interval were measured using a wearable eye tracking headset. Blink parameters were compared across tasks and time (linear mixed model and post hoc comparisons with Bonferroni correction). Associations between blinking, symptoms, ocular surface, and clinical signs were assessed (Spearman's correlation). The smartphone reading task was completed twice to determine the coefficient of repeatability.

Results: The blink rate was lower (mean 10.7 ± 9.7 blinks/min) and the interblink interval longer (mean 9.6 ± 8.7 seconds) during all reading tasks compared with conversation (mean 32.4 ± 12.4 blinks/min; 1.5 ± 0.6 seconds) and walking (mean 31.3 ± 15.5 blinks/min; 1.9 ± 1.3 seconds) (P < 0.001). There were no significant differences in blink parameters between any of the reading tasks or between conversation and walking. Changes in blinking occurred within 1 minute of starting the task. No associations were evident between blink rate or interblink interval and ocular surface symptoms or signs. The coefficient of repeatability was ±12.4 blinks/min for blink rate and ±18.8 seconds for interblink interval.

Conclusions: Spontaneous blinking can be measured reliably in situ. The blink rate was decreased and the interblink interval increased during reading compared with conversation and walking. Changes in blinking were immediate, sustained, and not associated with ocular surface symptoms or signs.

Blinking maintains a stable tear film, thereby sustaining ocular surface integrity and visual function.1 Disruptions to blinking disturb ocular surface homeostasis and may contribute to ocular discomfort and dry eye.2,3 
Blinking is affected by the type, complexity or difficulty, and cognitive demand of the task undertaken during measurement.4–8 Differences in viewing distance and factors such as font size, contrast, and the device used create different demands on blinking.5,9 Previous studies have found increased discomfort linked with impaired blinking during smartphone and computer use.3,9–11 Blinking has been investigated during various tasks (e.g., conversation, reading, playing computer games, watching a film, listening to music, resting quietly) of various complexities, on various devices including printed text, desktop and laptop computers, and tablets, and at various viewing distances and gaze positions.11–18 However, blink assessment remains hampered by the lack of a gold standard method and standardized conditions of measurement. 
A wide range of mean blink rates has been reported previously, ranging from 11 to 36 blinks/min during conversation, 4 to 14 blinks/min during reading, and 5 to 26 blinks/min during rest and directed fixed gaze in adults.19 This wide range can be explained in part by differences in the definitions of spontaneous blinking. Various definitions of a blink include a “25% downward movement of the upper eyelid” from the fully open position,20 an “obvious downward eyelid movement,”21 the “upper eyelid reaching downwards from the top of the pupil,”22 a “downward movement of the upper eyelid covering 30%–75% of the cornea,”7 and a “15% decrease in the height of the upper eyelid.”23 Blinking is difficult to assess clinically or in situ outside of the laboratory setting; thus, a method that allows more natural measurement may be helpful in standardizing the definition of spontaneous blinking. 
Blink measurements in previous studies have typically occurred in settings not representative of real-life situations, often requiring participants to keep a stationary head position.19 A fixed head position during measurement may heighten participants’ awareness of being observed and potentially alter blink dynamics.24 Fixed head positions are also likely to be unsuitable for activities involving shifts in gaze and facial orientation,25 such as reading, conversation, and walking. Robust blink measurement requires the whole anterior eye to be constantly visible so that the full range of eyelid movements can be observed.26,27 Head-mounted eye tracking technologies with cameras in close proximity to the eye allow more robust blink monitoring.25 Blink detection accuracy greater than 95%, relative to pupil detection, has previously been demonstrated with head-mounted eye tracking technology that allows free head positioning.26 Measuring blinking in real time, with the head positioned as in real life, is therefore desirable to improve understanding of blink behavior. 
A recent study in children demonstrated that blinks counted by the Pupil software blink detection algorithm from a wearable eye tracker were in agreement with a manual count.27 Blinking in situ can therefore be measured reliably using a wearable eye tracking headset; this approach showed a rapid decrease in blink rate during 1 hour of smartphone gaming, which was linked to ocular discomfort.27,28 However, it is not clear whether this effect was due to the use of a smartphone per se or to the task of reading itself. 
The repeatability of recent and commonly used blink measurement methods has not been assessed. Repeated measurements of blink rate have been reported for electrophysiologic methods (magnetic search coil technique and electro-oculography)6,18,29 and for manual counting of blinks from eye video recordings30–32 conducted in laboratory settings where participants’ head position was fixed. However, none of these studies reported standard measures of repeatability.33 
The current study aimed to compare blink parameters (blink rate, interblink interval) during various reading and nonreading tasks measured in situ using a wearable eye tracking headset and to examine associations with ocular surface symptoms. In addition, the day-to-day repeatability of the blink rate and interblink interval measurement was assessed. 
Methods
A randomized cross-over study was conducted. Approval was obtained from the University of New South Wales (UNSW) Human Research Ethics Advisory Panel and the tenets of the Declaration of Helsinki were adhered to. Informed consent was obtained from all participants before participation. 
Participants
Students aged 18 to 40 years were recruited from the UNSW Sydney campus. Enrollment required minimum unaided visual acuity of 0.1 logMAR at 6 m and 40 cm and binocular vision (accommodation and convergence) normal for age, including a minimum amplitude of accommodation of 5 diopters (D) (push-up to blur with the Royal Air Force rule) and a near phoria equal to or less than 6 prism diopters (modified Thorington test).34 Participants were excluded if they wore spectacles or contact lenses, or had a history of ocular conditions (including eye allergies), systemic conditions (e.g., Parkinson's disease, diabetes), or medication use (e.g., corneal cold thermoreceptor stimulants such as menthol ointment, dopamine antagonist drugs) likely to affect blinking.35,36 A sample size calculation (SAS 9.4; SAS Institute, Cary, NC, USA) showed that 24 participants were required to detect a difference in blink rate between tasks of 5.8 blinks/min,12,15 with 90% power at an alpha (α) level of 0.05/7 (statistical significance corrected for multiple comparisons across seven tasks), allowing for possible 20% attrition. Twenty-four participants were also sufficient to assess the day-to-day repeatability of the blink rate and interblink interval, based on a desired precision of ±30%, expressed as a percentage of the within-person SD, with two repeated measurements.37 
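As an illustration of this type of power calculation, the following is a minimal sketch in Python (the study used SAS 9.4, so this is not the authors' code). The within-person SD is a placeholder assumption, chosen only so the example lands near a total of 24; the study's actual variance inputs are not reported in this section.

```python
# Illustrative sample size sketch (assumptions: paired t-test framework and a
# placeholder within-person SD of 6 blinks/min; not the study's actual inputs).
from math import ceil

from statsmodels.stats.power import TTestPower

DELTA = 5.8        # difference in blink rate to detect (blinks/min)
ASSUMED_SD = 6.0   # placeholder within-person SD (blinks/min) -- assumption
ALPHA = 0.05 / 7   # Bonferroni-corrected alpha for seven tasks
POWER = 0.90
ATTRITION = 0.20   # allow for 20% dropout

# Number of participants completing the study, from a paired t-test power analysis.
n_complete = TTestPower().solve_power(
    effect_size=DELTA / ASSUMED_SD, alpha=ALPHA, power=POWER, alternative="two-sided"
)
print(f"analysable n ≈ {ceil(n_complete)}, recruit n ≈ {ceil(n_complete / (1 - ATTRITION))}")
```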
Procedures
All participants attended two visits (Fig. 1) during which they completed a questionnaire on demographics and daily hours of digital device use, and the eight tasks described below. Ocular surface symptoms and ocular surface clinical indices were assessed, and in situ blinking was measured. In line with the coronavirus disease 2019 safety protocols and guidelines that came into effect in Sydney, Australia, partway through the study, some participants wore a surgical mask covering nose to chin for all assessments during both study visits (Figs. 2C, 2D). 
Figure 1.
 
Flowchart of study visits and order of clinical assessments. Note: Visit 2 was conducted 2 days after visit 1. The smartphone task was completed twice, before the other tasks, at each visit for assessment of repeatability. The remaining tasks, allocated in random order, comprised six reading tasks (printed text, laptop, smart TV at 6 m, smartphone, smartphone at 50% brightness, smartphone with more complex text) and two nonreading tasks (conversation, walking indoors). DEQ-5, Dry Eye Questionnaire 5; LLT, lipid layer thickness; NIBUT, noninvasive tear break-up time; OSDI, Ocular Surface Disease Index; TMH, tear meniscus height; SANDE, Symptoms Assessment in Dry Eye.
Figure 2.
 
Study set-up showing the wearable eye tracking headset (Pupil Labs GmbH) with two inbuilt high-speed adjustable eye cameras and a scene camera for real-time monitoring from the participant’s vantage point. The headset was worn by study participants during various tasks, including reading from (a) printed text, (b) laptop, (c) smart TV at 6 m, and (d) smartphone, and during (e) walking indoors and conversation (not shown). The wearable eye tracking headset was connected to a laptop for task monitoring and data acquisition for all tasks other than walking indoors, for which an Android phone was used for the same purpose; the examiner followed behind the participant holding the Android phone to monitor the recording (e). Participants’ consent was obtained for use of these images.
Tasks
The tasks comprised six reading tasks (printed text, laptop, smart TV at 6 m, smartphone, smartphone at 50% brightness, smartphone with more complex text) and two nonreading tasks (conversation, walking indoors). Text complexity throughout this article refers to the difficulty of text comprehension according to the Flesch–Kincaid grade level scale.39 All tasks were 15 minutes in duration. Tasks were completed in random order, other than the smartphone task, which was completed first at each visit (i.e., repeated twice) (Fig. 1). Data from repeat 1 of the smartphone task were used for all analyses except repeatability, for which both repeats were used. 
A reading level of the fifth to seventh grade was selected for the reading tasks.38,39 Text at the reading level of a university graduate was selected for the complex smartphone reading task.39,40 For all reading tasks, the default text was 16-pixel (equivalent to 12-point) black Times New Roman. However, viewing distance, screen or display size, and the differing pixel densities of the devices may affect the actual angular extent of the text and therefore alter the effective font size.41,42 The printed text was presented one-sided in A4 format. Conversation was elicited using age-appropriate conversation starters.43 The walking indoors task was conducted in a level corridor of a temperature-controlled university building. 
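For readers unfamiliar with the Flesch–Kincaid grade level, the sketch below shows how such a score is computed from word, sentence, and syllable counts. The study used an online calculator,39 so this Python snippet is only an illustration; its syllable counter is a crude heuristic.

```python
# Minimal illustration of the Flesch-Kincaid grade level formula (not the tool used in the study).
import re


def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels; real calculators use dictionaries.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))


def flesch_kincaid_grade(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n_words = max(1, len(words))
    # Standard Flesch-Kincaid grade level formula.
    return 0.39 * (n_words / sentences) + 11.8 * (syllables / n_words) - 15.59


print(round(flesch_kincaid_grade("The cat sat on the mat. It was warm there."), 1))
```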
The same smartphone (iPhone 8 Plus, 5.5-inch, 1920 × 1080 pixels at 401 ppi, 2017) was used for all smartphone reading tasks. A MacBook Pro (13.3-inch, 2560 × 1600 pixels at 227 ppi built-in display, 2019) was used for the laptop task, and a smart TV (NEC, model V754Q, 75-inch, 3840 × 2160 pixels at 59 ppi) for reading at 6 m (Fig. 2). Participants were instructed to hold the smartphone at their habitual reading distance, to scroll to the next page using one finger (smartphone) or the side arrow key on the laptop keyboard, and not to alter the screen brightness or font size. The smartphone was set at maximum screen brightness, with a measured luminance of 380 cd/m2 (Konica Minolta CS-100A), for two tasks and at half brightness (measured luminance 121 cd/m2) for the smartphone at 50% brightness task. The laptop screen and smart TV were also set at maximum screen brightness, with measured luminances of 316 cd/m2 and 316 cd/m2, respectively. The measured luminance for the printed text reading task was 77 cd/m2. 
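To illustrate the earlier point that the nominally identical 16-pixel font subtends different visual angles on different devices and at different viewing distances, the following sketch computes the approximate angular height of the text from the pixel densities above. The near viewing distances used here are illustrative assumptions, not measured values from the study; only the 6 m TV distance is taken from the task description.

```python
# Approximate angular subtense of 16-pixel text on each display (illustrative only;
# the 0.35 m and 0.50 m viewing distances are assumptions).
from math import atan, degrees

INCH_M = 0.0254  # metres per inch


def text_angle_arcmin(font_px: float, ppi: float, distance_m: float) -> float:
    """Angular height (arcmin) of text of height font_px on a display with the given ppi."""
    height_m = (font_px / ppi) * INCH_M
    return degrees(2 * atan(height_m / (2 * distance_m))) * 60


for device, ppi, dist in [("smartphone", 401, 0.35), ("laptop", 227, 0.50), ("smart TV", 59, 6.0)]:
    print(f"{device:10s}: {text_angle_arcmin(16, ppi, dist):.1f} arcmin")
```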
Ocular Surface Symptoms and Ocular Surface Clinical Assessments
Baseline ocular surface symptoms were assessed using the Instant Ocular Symptoms Survey (IOSS),44 Dry Eye Questionnaire 5,45 Symptoms Assessment in Dry Eye,46 and Ocular Surface Disease Index47 questionnaires, self-completed by participants. The IOSS (printed text) was also completed by participants before and after each task. The IOSS has been shown to be an effective tool for instant symptom measurement with good diagnostic ability; it was developed to measure instantaneous symptoms, that is, symptoms at the time of administration (whereas the other questionnaires record symptoms experienced over the preceding weeks), and is therefore appropriate for repeated comfort assessment.44 
The following baseline tear film clinical assessments were conducted before blink measurements: tear film lipid layer thickness (LipiView interferometer; TearScience, Morrisville, NC, USA), tear meniscus height, and noninvasive tear break-up time (Oculus Keratograph 5; Oculus, Arlington, WA, USA). The lipid layer thickness index based on the mean interferometric color units was recorded.48 The tear meniscus height was assessed vertically below the pupil center and directly under the nasal and temporal corneal limbal edges (determined using the integrated ruler), to account for variability along the length of the lower meniscus, and the average of the three measurements was recorded.49 The automated detection of the first tear break-up was recorded for the noninvasive tear break-up time.50 The noninvasive technique was considered preferable for measuring tear break-up time51,52 because it is automated, whereas subjective methods such as the videokeratoscope or Tearscope have been found to vary between sessions and observers.52,53 
Ocular surface clinical assessments were performed on the right eye only, in the same temperature-controlled examination room, in ascending order of invasiveness,52 after all tasks were completed, as shown in Figure 1. These comprised general ocular surface health, corneal staining (fluorescein) and conjunctival staining (lissamine green strips, GreenGlo) graded on the Oxford scale,54,55 telangiectasia,56 and meibomian gland expressibility. Meibography imaging of the upper eyelid (Oculus Keratograph 5) was scored for loss of meibomian glands using the meiboscore (0 = no gland loss; 1 = ≤25% gland area loss; 2 = 26%–50% gland area loss; 3 = 51%–75% gland area loss; 4 = ≥75% gland area loss),57 and the pattern of meibomian gland morphologic changes was assessed.57–59 
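The meiboscore grading just described maps percentage gland area loss to a 0 to 4 grade; a small illustrative helper (not study code) is sketched below.

```python
# Small illustrative helper implementing the meiboscore grading described above
# (0-4 scale based on percentage meibomian gland area loss); not study code.
def meiboscore(percent_gland_loss: float) -> int:
    """Map percentage gland area loss to a meiboscore grade of 0-4."""
    if percent_gland_loss <= 0:
        return 0   # no gland loss
    if percent_gland_loss <= 25:
        return 1   # <=25% gland area loss
    if percent_gland_loss <= 50:
        return 2   # 26%-50% gland area loss
    if percent_gland_loss < 75:
        return 3   # 51%-75% gland area loss (exactly 75% maps to grade 4 here, per the >=75% criterion)
    return 4       # >=75% gland area loss


print([meiboscore(p) for p in (0, 10, 30, 60, 80)])  # -> [0, 1, 2, 3, 4]
```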
In Situ Blink Measurement
Blink assessment was conducted after the tear film assessment, following 10 minutes of rest. In situ assessment of blink parameters was conducted during each task using a binocular wearable eye tracking headset (Pupil Core; Pupil Labs GmbH, Berlin, Germany)60 (Fig. 2). Data were analyzed using the mean values for each minute, as well as mean values over the 12 or 15 minutes of recording. 
The wearable eye tracking headset recorded participants’ eyes using two inbuilt adjustable eye cameras with a resolution of 192 × 192 pixels at 120 Hz (Fig. 2).60 The eye cameras (providing a view of the participant’s eyes), together with the scene camera (providing a view of what the participant is looking at) (Fig. 2), enabled continuous monitoring of participant adherence in real time. Blink activity was detected using the open-source eye tracking software Pupil v2.0 (Pupil Labs GmbH), based on the visibility of the pupil, as previously described.27,60 Briefly, the Pupil software assigns a quality measure to the detected pupil in each video frame, referred to as pupil confidence. The pupil confidence value indicates how accurately the edge of the detected pupil fits an ellipse (range, 0 [no fit] to 1 [good fit]).27,60 Blinks are assumed to occur during drops in pupil confidence, evident when the pupil is obscured; thus, pupil confidence is a proxy measure for blink detection.60 Blink data were extracted from the eye tracker recordings as CSV files using the Pupil Player module (Pupil Labs GmbH).27 Blink rate (number of blinks per minute) and interblink interval (the time from the end of one blink to the start of the following blink) were estimated using the Pupil software blink detection algorithm, as described elsewhere.27 Interblink interval data were determined from the timestamps of blink onset and offset identified by the Pupil software blink detection algorithm, and are therefore not simply the inverse of the blink rate.27 
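As a rough sketch of this kind of post-processing, the snippet below derives blink rate and interblink interval from an exported blink table, following the definitions above. The file name and column names ("start_timestamp", "end_timestamp") are assumptions about the export format rather than a documented reproduction of the study pipeline.

```python
# Minimal sketch: blink rate and interblink interval from an exported blink table
# ("blinks.csv" with "start_timestamp"/"end_timestamp" columns is an assumed format).
import pandas as pd

blinks = pd.read_csv("blinks.csv").sort_values("start_timestamp")

# Recording duration in minutes, approximated here from the blink timestamps themselves;
# a full analysis would use the actual recording length.
duration_min = (blinks["end_timestamp"].iloc[-1] - blinks["start_timestamp"].iloc[0]) / 60

blink_rate = len(blinks) / duration_min  # blinks per minute

# Interblink interval: end of one blink to the start of the next (seconds).
interblink = (
    blinks["start_timestamp"].iloc[1:].to_numpy()
    - blinks["end_timestamp"].iloc[:-1].to_numpy()
)

print(f"blink rate: {blink_rate:.1f} blinks/min")
print(f"mean interblink interval: {interblink.mean():.1f} s")
```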
For the reading from a smartphone task (repeats 1 and 2), data from the first 3 minutes of video recording were discarded and the remaining 12 minutes were used for the analysis to allow for adjustment and adaptation to wearing the headset as recommended.61 Complete recordings (15 minutes) were analyzed for all other tasks, because the participants continued with each of the subsequent tasks without removing the headset. 
Repeatability of Blink Measurements
Participants completed the reading from a smartphone task (at maximum screen brightness) twice, at separate study visits occurring 2 days apart at the same time of day (between 10 am and 11 am). The time of day was controlled because the blink rate has been reported to exhibit diurnal variation (higher in the evening).62 
Statistical Analyses
Statistical analyses were performed using IBM SPSS Statistics (version 26, 2019; IBM Corp., Armonk, NY, USA). A linear mixed model with fixed effects of task and mask wear and their interaction was used to examine differences in blink parameters between tasks and the effect of mask wearing on those differences. A separate linear mixed model with a fixed effect of time was used to compare differences in blink parameters across time within each task. Another model, with fixed effects of task, time, and mask wear, was used to examine differences in ocular surface symptoms across tasks and time. All models included a random effect for individual to account for repeated measures within persons. Model-estimated means were obtained, and post hoc pairwise comparisons were performed between tasks, between each minute within each task, and between pretask and post-task symptoms within each task; P values were corrected for multiple comparisons by Bonferroni adjustment. Spearman's bivariate correlation was used to examine associations between blinking and changes in ocular surface symptoms, ocular surface indices, and tear film indices; the P values for the correlations were adjusted for multiple comparisons using the one-step Bonferroni method. The statistical approach suggested by Bland and Altman was used to examine the repeatability of the blink rate and interblink interval. The coefficient of repeatability (CoR = 1.96 × SD of the differences between the two repeats), the mean difference (bias) between repeats, and the limits of agreement (bias ± CoR) were calculated, and paired t-tests were used to examine agreement between repeats.33 Significance was established at a P value of 0.05 or less. 
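The repeatability calculation described above is straightforward to reproduce; a minimal sketch is shown below, in Python rather than SPSS, with simulated placeholder values standing in for the two visits' blink rates.

```python
# Minimal Bland-Altman repeatability sketch: bias, coefficient of repeatability (CoR),
# limits of agreement, and a paired t-test. The repeat1/repeat2 arrays are simulated
# placeholders, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
repeat1 = np.clip(rng.normal(11, 10, 24), 0, None)            # placeholder blink rates, visit 1
repeat2 = np.clip(repeat1 + rng.normal(0, 6, 24), 0, None)    # placeholder blink rates, visit 2

diff = repeat2 - repeat1
bias = diff.mean()                           # mean difference between repeats
cor = 1.96 * diff.std(ddof=1)                # coefficient of repeatability
loa_low, loa_high = bias - cor, bias + cor   # 95% limits of agreement
t_stat, p_value = stats.ttest_rel(repeat2, repeat1)

print(f"bias = {bias:.1f} blinks/min, CoR = ±{cor:.1f}, "
      f"LoA = {loa_low:.1f} to {loa_high:.1f}, P = {p_value:.2f}")
```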
Results
Twenty-four participants with normal ocular surface health completed the study. Participants were aged 18 to 40 years (mean, 28.6 ± 6.3 years), 67% were female, and comprised different ethnicities: African (38%), South Asian (21%), Middle Eastern (17%), East Asian (12%), and Caucasian (12%). Fourteen participants wore a surgical mask during data collection. 
Thirteen data points were excluded where more than 60% of pupil confidence values were less than 0.6, as per the manufacturer's recommendation27: three from the printed text task, one from the smart TV task, five from the smartphone task, two from the smartphone (50% brightness) task, and two from the smartphone (more complex text) task. Ocular surface symptoms and clinical signs reported by participants whose data points were excluded were within the range of the other participants. 
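For clarity, the quality rule applied here can be expressed as a small filter; the sketch below assumes a per-sample "confidence" column in the exported pupil data and is illustrative only.

```python
# Illustrative quality filter: discard a task recording when more than 60% of its
# pupil confidence samples fall below 0.6 (the column name "confidence" is assumed).
import pandas as pd


def keep_recording(pupil_positions: pd.DataFrame,
                   confidence_threshold: float = 0.6,
                   max_low_fraction: float = 0.6) -> bool:
    """Return True if the recording passes the pupil confidence quality criterion."""
    low_fraction = (pupil_positions["confidence"] < confidence_threshold).mean()
    return low_fraction <= max_low_fraction
```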
Baseline ocular surface symptoms and clinical assessments are presented in the Table. The examination room temperature was maintained at 21.9 ± 0.7°C. 
Table.
 
Baseline Ocular Surface Symptoms and Clinical Assessments for 24 Students With Healthy Eyes
Differences in Blink Parameters Between Tasks
There were significant differences in blink rate (F = 29.94, P < 0.001) and interblink interval (F = 38.32, P < 0.001) between tasks. The blink rate was lower and the interblink interval longer during all reading tasks compared with conversation (P < 0.001) and walking indoors (P < 0.001) (Fig. 3). There were no significant differences in the blink rate or interblink interval between conversation and walking indoors, or between any of the reading tasks. Interactions between task and mask wear were not significant, indicating that mask wear did not affect the differences in blink rate (P = 0.65) or interblink interval (P = 0.72) between tasks. Blink rate and interblink interval remained unchanged throughout the measurement duration of each task (P > 0.05) (Fig. 4). 
Figure 3.
 
(a) Blink rate and (b) interblink interval during various tasks of 15 minutes’ duration, measured using a wearable eye tracking headset (Pupil Labs GmbH) for 24 students with healthy eyes. Note that data from the first 3 minutes of the smartphone task were discarded and the remaining 12 minutes used for analysis. Data are presented as median and interquartile range. Open circles represent mild outliers (measurements >1.5 to 3.0 times the interquartile range) and stars represent extreme outliers (measurements >3 times the interquartile range).
Figure 4.
 
(a) Blink rate and (b) interblink interval during various tasks of 15 minutes’ duration, measured using a wearable eye tracking headset (Pupil Labs GmbH) for 24 students with healthy eyes. The tasks comprised six reading tasks (printed text, laptop, smart TV at 6 m, smartphone, smartphone at 50% brightness, smartphone with more complex text) and two nonreading tasks (conversation, walking indoors). *Note: data from the first 3 minutes of the smartphone task were discarded and the remaining 12 minutes used for analysis.
Differences in Ocular Surface Symptoms Before and After a Task and Association With Blinking
Ocular surface symptoms (IOSS) before and after tasks differed between tasks (F = 4.69, P < 0.001). Symptoms worsened after reading from a smartphone when the text was more complex (P = 0.01) or at 50% brightness (P = 0.02), and after reading from a smart TV (P < 0.001), but did not change with the other tasks (Fig. 5). There was no evidence that mask wearing influenced these differences (mask wear × task × time interaction, P = 1.00). These changes in symptoms were not associated with blink rate or interblink interval (rho −0.09 to 0.41, P = 1.00) (Supplementary Table S1). There were no associations between blinking and baseline ocular surface symptoms (Ocular Surface Disease Index, Symptoms Assessment in Dry Eye, Dry Eye Questionnaire 5, IOSS), tear film, or other clinical indices (rho −0.01 to 0.45, P = 1.00) (Supplementary Table S2). 
Figure 5.
 
Ocular surface symptoms of discomfort and dryness (median and interquartile range scores) measured using the IOSS before and after various tasks of 15 minutes’ duration for 24 students with healthy eyes. Note: data from the first 3 minutes of the smartphone task were discarded and the remaining 12 minutes used for analysis. Higher IOSS scores indicate worse discomfort. Blue and red circles represent mild outliers (symptom scores >1.5 to 3.0 times the interquartile range).
Repeatability of the Blink Rate and Interblink Interval
The group mean blink rate while reading from a smartphone was 10.6 ± 10.4 blinks/min for the first repeat and 11.3 ± 10.4 blinks/min for the second repeat. The interblink interval was 10.3 ± 9.7 seconds and 9.7 ± 11.2 seconds for the first and second repeats, respectively. There was no significant difference between the two repeated measurements for blink rate (P = 0.62) or interblink interval (P = 0.55). The Bland and Altman plots for blink rate and interblink interval, showing the bias and limits of agreement, are presented in Figure 6. The CoR was ±12.4 blinks/min for blink rate and ±18.8 seconds for interblink interval. 
Figure 6.
 
Differences between the two repeats of (a) blink rate and (b) interblink interval, measured using the wearable eye tracking headset (Pupil Labs GmbH) while reading an easy book series on a smartphone for 12 minutes, plotted against their mean for 24 students with healthy eyes. The dotted line shows a bias of (a) −0.7 blinks/min (P = 0.62) and (b) 0.7 seconds (P = 0.55). The dashed lines represent the limits of agreement of (a) +11.7 to −13.1 blinks/min and (b) +19.5 to −18.2 seconds.
Discussion
A wearable eye tracking headset can be used to measure blinking reliably in a variety of real-life settings, and the measurements were found to be repeatable from day to day. The blink rate was consistently lower and the interblink interval longer during the various reading tasks, including reading from printed text, a laptop, a smart TV at 6 m, and a smartphone, compared with conversation or walking, irrespective of reading task difficulty, screen brightness, viewing distance, or device used. Changes in the blink rate and interblink interval occurred immediately upon starting a task and persisted throughout the 15-minute task duration. No relationship was apparent between blinking and ocular surface comfort or clinical signs. 
The blink rate during reading (mean for all reading tasks, 10.7 ± 9.7 blinks/min) and conversation (32.4 ± 12.4 blinks/min) in this study aligns with previous findings of a slower blink rate while reading printed text and on a computer (pooled mean, 7.9 ± 3.3 blinks/min) than during conversation in adults (mean, 21 blinks/min)4 and in children (20.5 blinks/min).27 A lower blink rate has been consistently reported with computer or smartphone reading and gaming relative to conversation28,63,64 and rest or primary gaze.5,7,12–14,59,65 The blink rate while reading on a smartphone (10.6 blinks/min) is similar to a previously reported mean of 8.9 blinks/min within 1 minute of gaming on a smartphone28 and a median of 12.5 blinks/min within 10 minutes of reading on a smartphone.50 A lower blink rate during reading tasks compared with conversation and walking is as expected, because tasks involving higher cognitive demand and concentration are associated with a slower blink rate compared with tasks of a lower cognitive demand.4,5,8,9,66,67 
The interblink interval measured during conversation in this study (1.5 ± 0.6 seconds) is shorter than the only previously reported value of 6 ± 3 seconds in healthy adults.68 The interblink interval has not previously been measured during reading or walking. Other reported values were obtained during rest, predetermined gaze, or steady fixation, or while viewing a game or movie on a computer, with mean interblink intervals ranging from 3 to 10 seconds.15,18,19,68–71 As with the blink rate, the interblink interval has been speculated to be unconsciously adjusted depending on the importance of perceived visual information, being prolonged with greater cognitive demand.70 
Enabled by the portability of the wearable eye tracking headset, this study was the first to report a blink rate while walking. Blinking during walking did not differ from that during conversation. A previous study speculated that the cognitive demand during conversation compares with that during orientation simulated in a laboratory, similar to walking.72 
Blinking was not affected by the type of device used in this study. These results align with previous reports that the blink rate remains unchanged when an identical reading task is performed in print and on any type of digital device.8,9,11,65,73,74 
Text complexity did not modulate the effect of reading on blinking in this study, in agreement with a previous study that compared the blink rate while reading regular words with that while reading reordered mirrored images of the same words.67 In contrast, another study found a small decrease in the blink rate during complex reading compared with noncomplex reading on a tablet and printed text.8 Other studies3,5,10,65 that report a reduced blink rate while reading complex text on a computer, tablet, or printed text did not directly compare texts of differing complexities. The likely high reading comprehension ability of the university student participants may have limited this study's ability to demonstrate an effect of text complexity. 
Screen brightness did not affect the blink rate in the present study. Another study found decreased blinking while reading from a computer with high screen brightness compared with low screen brightness.75 The higher blink rate with low screen brightness under standard background luminance was speculated to be caused by increased glare discomfort.21,75 An effect of screen brightness on the blink rate may not be expected, because in the photopic range the eye and visual system constantly and rapidly adapt to luminance changes.76 
The presentation of the reading tasks at near or at distance did not affect blinking in this study: reading letters of a smaller angular size on a smart TV at 6 m had no different effect on blinking than reading letters of a larger angular size on a smartphone, laptop, or printed text. A previous study reported a lower blink rate during reading of text printed in a small font compared with viewing a picture; however, the effect was not modulated by induced accommodative and convergence visual demand.77 Another study reported a decreased blink rate while reading text set at 100% font display on a desktop and an increased blink rate while reading the same text set at a 330% expanded font display at the same desktop distance.9 A relationship between screen viewing distance and blink rate has not been reported previously. The direction of gaze during tasks may also modulate blink behavior. Tasks involving a downward gaze, such as reading printed text or a smartphone, may be less likely to trigger blinking than tasks involving an upward (e.g., smart TV at 6 m)21,69 or primary gaze.67 An upward gaze direction could increase the exposed ocular surface area, thereby stimulating blinks.9,21,78 
The impact of task on the blink rate and interblink interval in this study was immediate and remained unchanged throughout the task duration, in agreement with earlier work. A study that assessed the blink rate every 30 seconds over the course of 10 minutes of reading on a tablet also found no changes in the blink rate.8 A study using the same eye tracker in school-aged children similarly found a rapid slowing of the blink rate and lengthening of the interblink interval within the first minute of gaming on a smartphone, and these parameters remained unchanged throughout 1 hour of gaming.28 A study in adults found no difference in blink rate over the course of 1 hour of gaming on a smartphone.79 An intervention study in adults found an increase in the rate of incomplete blinks from 1 to 60 minutes of smartphone reading, but no change in the rate of complete blinks.80 
Ocular surface symptoms of dryness and discomfort worsened when reading on a smartphone at 50% screen brightness, reading more complex text on a smartphone, and reading on a smart TV at 6 m, but there was no association between these changes in symptoms and the blink rate or interblink interval. In agreement with this study, previous studies have shown that ocular surface symptoms of dryness and discomfort worsen after as few as 12 to 20 minutes of reading story books or compiled words from fiction novels.74,81,82 In addition, a previous study reported both increased eyestrain symptoms and ocular surface symptoms after reading from a smartphone for 1 hour.80 Whereas previous studies in adults, like this study, did not find direct associations between symptoms and blink rate during digital device use,3,11,22 an increased occurrence of incomplete blinking has been implicated in the worsening of ocular surface symptoms while reading on a computer or smartphone.11,22,80,83 Complete blinking is essential to the replenishment of the tear film and maintenance of ocular comfort,84 and incomplete blinking can potentially impact dry eye symptoms.85,86 Blink amplitude was not characterized in the current study, but its usefulness as a possible marker of ocular surface health warrants exploration. 
The day-to-day CoR of the blink rate was ±12.4 blinks/min; this value represents the smallest change in blink rate measurable in longitudinal studies. Closer inspection of the limits of agreement in Figure 6 suggests that the CoR may vary with the magnitude of the blink rate; therefore, the CoR was calculated separately for blink rates of 10 blinks/min or less and greater than 10 blinks/min. Repeatability was better at blink rates of 10 blinks/min or less (CoR, ±5.4 blinks/min) and poorer at more than 10 blinks/min (CoR, ±18.8 blinks/min) (Supplementary Table S3). Previous studies intending to report the repeatability of blink rate did not provide a standard repeatability measure to enable comparison with the present findings.6,18,29,30 These findings suggest that blink repeatability may be better during reading than during conversation and walking. These results provide a basis on which to estimate sample sizes in future studies. 
As for the blink rate, the repeatability of the interblink interval was better for values of less than 10 seconds (CoR, ±3.9 seconds) and poorer for longer interblink intervals (CoR, ±29.6 seconds) (Supplementary Table S3). The overall CoR for the interblink interval is higher than the normal range previously reported.19 No studies have previously examined the repeatability of the interblink interval. 
The strengths of this study lie in the measurement of blinking in situ without the need for head restraint, and in the comparison of blink rate and interblink interval between various tasks on differing devices, controlling for complexity, viewing distance, direction of gaze, and luminance, within one study. Based on these results, in situ measurement of blink parameters may not be feasible in a small proportion of participants owing to poor pupil detection confidence, the causes of which require further investigation. Poor pupil confidence unrelated to blinks can occur when using the wearable headset owing to extreme gaze angles or pupil obscuration by eyelashes.60 The excluded data in the current study were likely unrelated to gaze angle, because no extreme gaze angles were observed during the continuous eye monitoring throughout data collection. Future studies should explore whether this limitation is related to individual participants' eye characteristics (e.g., long eyelashes).60 
Incomplete blinking has been reported as an important marker of ocular surface symptoms during reading on a smartphone or computer,9,11,80 as well as during driving.87 Future studies using the wearable eye tracker will enable examination of blink amplitude in situ during various tasks and conditions. 
Conclusions
The blink rate was decreased and the interblink interval increased during reading compared with conversation and walking. Changes in the blink rate and interblink interval were immediate and sustained for all tasks, suggesting that blinking is a rapidly responsive marker. The similarity of the blink rate and interblink interval responses across the variety of reading tasks, including on a smartphone, suggests that the lower blink rate during reading is not driven by the type of device used, viewing distance, screen brightness, or the duration or complexity of the task, but rather is intrinsic to the task of reading itself. There was no apparent relationship between changes in blinking and ocular surface comfort or signs. 
The blink rate measured using a wearable device in situ was repeatable from day to day. The current study established solid foundations for the usefulness of blinking as a repeatable and responsive marker of ocular surface health when measured in situ. Future research should explore its usefulness in the settings of dry eye diagnosis and monitoring of treatment effectiveness. 
Acknowledgments
The authors thank Peter Wagner for technical support during data collection with the wearable eye tracking headset, and the Eye Research Group at the School of Optometry and Vision Science for the provision of clinical facilities in support of this research. 
Data Availability Statement: The datasets generated and/or analyzed during the current study are available in the Mendeley Data repository, https://data.mendeley.com/drafts/j63x6bxj8k, doi:10.17632/j63x6bxj8k.1 
Funding: This research did not receive specific funding from agencies in the public, commercial, or not-for-profit sectors. 
The first author, NC, received a UNSW Tuition Fee Remission Postgraduate Research Scholarship and the Australian Government Research Training Program Thesis Completion Scholarship. The research was also supported by the Dorothy Carlborg Research Grant from the Cornea and Contact Lens Society of Australia and the UNSW Faculty of Science Research Infrastructure Scheme. The funding sources had no involvement in the study design; the conduct of the research; the collection, analysis, and interpretation of data; the writing of the report; the preparation of the article; or the decision to submit the article for publication. 
Disclosure: N.C. Chidi-Egboka (N); I. Jalbert (N); J. Chen (N); N.E. Briggs (N); B. Golebiowski (N) 
References
Evinger C, Bao JB, Powers AS, Kassem IS, Schicatano EJ, Henriquez VM, et al. Dry eye, blinking, and blepharospasm. Mov Disord. 2002; 17(Suppl 2): S75–S78. [PubMed]
Nakamori K, Odawara M, Nakajima T, Mizutani T, Tsubota K. Blinking is controlled primarily by ocular surface conditions. Am J Ophthalmol. 1997; 124: 24–30. [CrossRef] [PubMed]
Dumery B, Van Toi V. Relationship between blink rate, ocular discomfort, and visual tasks. Invest Ophthalmol Vis Sci. 1997; 38: 326.
Doughty MJ . Consideration of three types of spontaneous eyeblink activity in normal humans: during reading and video display terminal use, in primary gaze, and while in conversation. Optom Vis Sci. 2001; 78: 712–725. [CrossRef] [PubMed]
Jaiswal S, Asper L, Long J, Lee A, Harrison K, Golebiowski B. Ocular and visual discomfort associated with smartphones, tablets and computers: what we do and do not know. Clin Exp Optom. 2019; 102: 463–477. [CrossRef] [PubMed]
Stern JA, Boyer D, Schroeder D. Blink rate: a possible measure of fatigue. Hum Factors. 1994; 36: 285–297. [CrossRef] [PubMed]
Cardona G, García C, Serés C, Vilaseca M, Gispets J. Blink rate, blink amplitude, and tear film integrity during dynamic visual display terminal tasks. Curr Eye Res. 2011; 36: 190–197. [CrossRef] [PubMed]
Rosenfield M, Jahan S, Nunez K, Chan K. Cognitive demand, digital screens and blink rate. Comput Human Behav. 2015; 51: 403–406. [CrossRef]
Argilés M, Cardona G, Pérez-Cabré E, Rodríguez M. Blink rate and incomplete blinks in six different controlled hard-copy and electronic reading conditions. Invest Ophthalmol Vis Sci. 2015; 56: 6679–6685. [CrossRef] [PubMed]
Tanaka Y, Yamaoka K. Blink activity and task difficulty. Percept Mot Skills. 1993; 77: 55–66. [CrossRef] [PubMed]
Chu CA, Rosenfield M, Portello JK. Blink patterns: reading from a computer screen versus hard copy. Optom Vis Sci. 2014; 91: 297–302. [CrossRef] [PubMed]
Bentivoglio AR, Bressman SB, Cassetta E, Carretta D, Tonali P, Albanese A. Analysis of blink rate patterns in normal subjects. Mov Disord. 1997; 12: 1028–1034. [CrossRef] [PubMed]
Tsubota K, Nakamori K. Dry eyes and video display terminals. N Engl J Med. 1993; 328: 584. [CrossRef] [PubMed]
Himebaugh NL, Begley CG, Bradley A, Wilkinson JA. Blinking and tear break-up during four visual tasks. Optom Vis Sci. 2009; 86: E106–E114. [CrossRef] [PubMed]
Doughty MJ . Further assessment of gender- and blink pattern-related differences in the spontaneous eyeblink activity in primary gaze in young adult humans. Optom Vis Sci. 2002; 79: 439–447. [CrossRef] [PubMed]
Cardona G, Quevedo N. Blinking and driving: the influence of saccades and cognitive workload. Curr Eye Res. 2014; 39: 239–244. [CrossRef] [PubMed]
Wu Z, Begley CG, Situ P, Simpson T, Liu H. The effects of mild ocular surface stimulation and concentration on spontaneous blink parameters. Curr Eye Res. 2014; 39: 9–20. [CrossRef] [PubMed]
Borges FP, Garcia DM. Distribution of spontaneous inter-blink interval in repeated measurements with and without topical ocular anesthesia. Arq Bras Oftalmol. 2010; 73: 329–332. [CrossRef] [PubMed]
Chen J, Chidi-Egboka NC, Jalbert I, Golebiowski B. Is there a consistent way to measure spontaneous blinking? A narrative review. 2022; invited review, conditionally accepted.
Navascues-Cornago M, Morgan PB, Maldonado-Codina C, Read ML. Characterisation of blink dynamics using a high-speed infrared imaging system. Ophthalmic Physiol Opt. 2020; 40: 519–528. [CrossRef] [PubMed]
Doughty MJ . Spontaneous eyeblink activity under different conditions of gaze (eye position) and visual glare. Graefes Arch Clin Exp Ophthalmol. 2014; 252: 1147–1153. [CrossRef] [PubMed]
Portello JK, Rosenfield M, Chu CA. Blink rate, incomplete blinks and computer vision syndrome. Optom Vis Sci. 2013; 90: 482–487. [CrossRef] [PubMed]
Tsubota K, Hata S, Okusawa Y, Egami F, Ohtsuki T, Nakamori K. Quantitative videographic analysis of blinking in normal subjects and patients with dry eye. Arch Ophthalmol. 1996; 114: 715–720. [CrossRef] [PubMed]
Muntz A, Turnbull PR, Kim AD, Gokul A, Wong D, Tsay TS-W, et al. Extended screen time and dry eye in youth. Cont Lens Anterior Eye. 2021; 45: 101541. [CrossRef] [PubMed]
Lam C, Epps J, Chen S. Wearable fatigue detection based on blink-saccade synchronisation. IEEE. 2021: 1186–1191.
Chen S, Epps J. Efficient and robust pupil size and blink estimation from near-field video sequences for human–machine interaction. IEEE Trans Cybernet. 2014; 44: 2356–2367. [CrossRef]
Chidi-Egboka NC, Jalbert I, Wagner P, Golebiowski B. Blinking and normal ocular surface in school-aged children, and the effects of age and screen time. Br J Ophthalmol. 2022;24:bjophthalmol-2022-321645, doi:10.1136/bjophthalmol-2022-321645.
Chidi-Egboka NC, Jalbert I, Golebiowski B. Smartphone gaming induces dry eye symptoms and reduces blinking in school-aged children. Eye. 2022; 6: 1–8. doi:10.1038/s41433-022-02122-2.
Hidalgo-Lopez E, Zimmermann G, Pletzer B. Intra-subject consistency of spontaneous eye blink rate in young women across the menstrual cycle. Sci Rep. 2020; 10: 1–8. [CrossRef] [PubMed]
Yong PT, Arif N, Sharanjeet-Kaur S, Hairol MI. Double eyelid tape wear affects anterior ocular health among young adult women with single eyelids. Int J Environ Res Public Health. 2020; 17: 7701. [CrossRef] [PubMed]
Doughty MJ . Effects of background lighting and retinal illuminance on spontaneous eyeblink activity of human subjects in primary eye gaze. Eye Contact Lens. 2013; 39: 138–146. [CrossRef] [PubMed]
Doughty MJ . Influence of mouth and jaw movements on dynamics of spontaneous eye blink activity assessed during slitlamp biomicroscopy. Clin Exp Optom. 2018; 101: 345–353. [CrossRef] [PubMed]
Bland JM, Altman DG. Measuring agreement in method comparison studies. Stat Methods Med Res. 1999; 8: 135–160. [CrossRef] [PubMed]
Scheiman M, Wick B. Clinical Management of Binocular Vision: Heterophoric, Accommodative, and Eye Movement Disorders. 3rd ed. Philadelphia, PA: Lippincott; 2008.
Gowrisankaran S, Nahar NK, Hayes JR, Sheedy JE. Asthenopia and blink rate under visual and cognitive loads. Optom Vis Sci. 2012; 89: 97–104. [PubMed]
Rodriguez JD, Lane KJ, Ousler GW 3rd, Angjeli E, Smith LM, Abelson MB. Blink: characteristics, controls, and relation to dry eyes. Curr Eye Res. 2018; 43: 52–66. [PubMed]
McAlinden C, Khadka J, Pesudovs K. Precision (repeatability and reproducibility) studies and sample-size calculation. J Cataract Refract Surg. 2015; 41: 2598–2604. [PubMed]
Tucker KA . Ten Tiny Breaths. Stouffville, Ontario, Canada: Papoti Books; 2012.
Good Calculators. Flesch Kincaid calculator, https://goodcalculators.com/flesch-kincaid-calculator/. Accessed September 12, 2019.
Cohen S, Konstantinou L. The Legacy of David Foster Wallace: University of Iowa Press; 2012.
Gill K, Mao A, Powell AM, Sheidow T. Digital reader vs print media: the role of digital technology in reading accuracy in age-related macular degeneration. Eye. 2013; 27: 639–643. [PubMed]
Sanchez CA, Goolsbee JZ. Character size and reading to remember from small displays. Computers & Education. 2010; 55: 1056–1062.
Jasper. 100 Awesome Conversation Starters to Help You Break the Ice Every Time, https://www.mantelligence.com/conversation-starters/. Accessed September 12, 2019.
Boga A, Stapleton F, Briggs N, Golebiowski B. Daily fluctuations in ocular surface symptoms during the normal menstrual cycle and with the use of oral contraceptives. Ocul Surf. 2019; 17: 763–770. [PubMed]
Chalmers RL, Begley CG, Caffery B. Validation of the 5-Item Dry Eye Questionnaire (DEQ-5): discrimination across self-assessed severity and aqueous tear deficient dry eye diagnoses. Cont Lens Anterior Eye. 2010; 33: 55–60. [PubMed]
Schaumberg DA, Gulati A, Mathers WD, Clinch T, Lemp MA, Nelson JD, et al. Development and validation of a short global dry eye symptom index. Ocul Surf. 2007; 5: 50–57. [PubMed]
Schiffman RM, Christianson M, Jacobsen G, Hirsch JD, Reis BL. Reliability and validity of the ocular surface disease index. Arch Ophthalmol. 2000; 118: 615–621. [PubMed]
Eom Y, Lee J-S, Kang S-Y, Kim HM, Song J-S. Correlation between quantitative measurements of tear film lipid layer thickness and meibomian gland loss in patients with obstructive meibomian gland dysfunction and normal controls. Am J Ophthalmol. 2013; 155: 1104–1110.e2. [PubMed]
Baek J, Doh SH, Chung SK. Comparison of tear meniscus height measurements obtained with the keratograph and Fourier domain optical coherence tomography in dry eye. Cornea. 2015; 34: 1209–1213. [PubMed]
Hong J, Sun X, Wei A, Cui X, Li Y, Qian T, et al. Assessment of tear film stability in dry eye with a newly developed keratograph. Cornea. 2013; 32: 716–721. [PubMed]
Bron AJ, Abelson MB, Ousler G, Pearce E, Tomlinson A, Yokoi N, et al. Methodologies to diagnose and monitor dry eye disease: report of the Diagnostic Methodology Subcommittee of the International Dry Eye WorkShop (2007). Ocul Surf. 2007; 5: 108–152. [PubMed]
Wolffsohn JS, Arita R, Chalmers R, Djalilian A, Dogru M, Dumbleton K, et al. TFOS DEWS II diagnostic methodology report. Ocul Surf. 2017; 15: 539–574. [PubMed]
Nichols JJ, Nichols KK, Puent B, Saracino M, Mitchell GL. Evaluation of tear film interference patterns and measures of tear break-up time. Optom Vis Sci. 2002; 79: 363–369. [PubMed]
Bron AJ, Evans VE, Smith JA. Grading of corneal and conjunctival staining in the context of other dry eye tests. Cornea. 2003; 22: 640–650. [PubMed]
Delaveris A, Stahl U, Madigan M, Jalbert I. Comparative performance of lissamine green stains. Cont Lens Anterior Eye. 2018; 41: 23–27. [PubMed]
Foulks GN, Bron AJ. Meibomian gland dysfunction: a clinical scheme for description, diagnosis, classification, and grading. Ocul Surf. 2003; 1: 107–126. [PubMed]
Pult H, Riede-Pult B. Comparison of subjective grading and objective assessment in meibography. Cont Lens Anterior Eye. 2013; 36: 22–27. [PubMed]
Tomlinson A, Bron AJ, Korb DR, Amano S, Paugh JR, Ian Pearce E, et al. The International Workshop on Meibomian Gland Dysfunction: report of the diagnosis subcommittee. Invest Ophthalmol Vis Sci. 2011; 52: 2006–2049. [PubMed]
Kim JS, Wang MTM, Craig JP. Exploring the Asian ethnic predisposition to dry eye disease in a pediatric population. Ocul Surf. 2019; 17: 70–77. [PubMed]
PupilLabs. Pupil docs - master, https://docs.pupil-labs.com/core/software/pupil-capture/#blink-detection. Accessed August 20, 2021.
Zaman ML, Doughty MJ. Some methodological issues in the assessment of the spontaneous eyeblink frequency in man. Ophthalmic Physiol Opt. 1997; 17: 421–432. [PubMed]
Barbato G, Ficca G, Muscettola G, Fichele M, Beatrice M, Rinaldi F. Diurnal variation in spontaneous eye-blink rate. Psychiatry Res. 2000; 93: 145–51. [PubMed]
Patel P, Henderson R, Bradley L, Galloway B, Hunter L. Effect of visual display unit use on blink rate and tear stability. Optom Vis Sci. 1991; 68: 888–892. [PubMed]
Freudenthaler N, Neuf H, Kadner G, Schlote T. Characteristics of spontaneous eyeblink activity during video display terminal use in healthy volunteers. Graefes Arch Clin Exp Ophthalmol. 2003; 241: 914–920. [PubMed]
Talens-Estarelles C, García-Marqués JV, Cervino A, García-Lázaro S. Use of digital displays and ocular surface alterations: a review. Ocul Surf. 2021; 19: 252–265. [PubMed]
Stern JA, Walrath LC, Goldstein R. The endogenous eyeblink. Psychophysiology. 1984; 21: 22–33. [PubMed]
Cho P, Sheng C, Chan C, Lee R, Tam J. Baseline blink rates and the effect of visual task difficulty and position of gaze. Curr Eye Res. 2000; 20: 64–70. [PubMed]
Mantelli F, Tiberi E, Micera A, Lambiase A, Visintini F, Bonini S. MUC5AC overexpression in tear film of neonates. Graefes Arch Clin Exp Ophthalmol. 2007; 245: 1377–1381. [PubMed]
Zaman M, Doughty M, Button N. The exposed ocular surface and its relationship to spontaneous eyeblink rate in elderly Caucasians. Exp Eye Res. 1998; 67: 681–686. [PubMed]
Ranti C, Jones W, Klin A, Shultz S. Blink rate patterns provide a reliable measure of individual engagement with scene content. Sci Rep. 2020; 10: 1–10. [PubMed]
Jansen ME, Begley CG, Himebaugh NH, Port NL. Effect of contact lens wear and a near task on tear film break-up. Optom Vis Sci. 2010; 87.
Fu HH, White KA, Collings RD. The effects of conversation arousal level on attention processes. Psi Chi J Psychol Res. 2020; 25.
Chu C, Rosenfield M, Portello J. Computer vision syndrome: blink rate and dry eye during hard copy or computer viewing. Investig Opthalmology Vis Sci. 2010; 51: 951.
Hue JE, Rosenfield M, Saá G. Reading from electronic devices versus hardcopy text. Work. 2014; 47: 303–307. [PubMed]
Benedetto S, Carbone A, Drai-Zerbib V, Pedrotti M, Baccino T. Effects of luminance and illuminance on visual fatigue and arousal during digital reading. Comput Human Behav. 2014; 41: 112–119.
Bertalmío M . Chapter 4 - adaptation and efficient coding. In: Bertalmío M (ed.). Vision Models for High Dynamic Range and Wide Colour Gamut Imaging. New York: Academic Press; 2020;65–93.
Gowrisankaran S, Sheedy JE, Hayes JR . Eyelid squint response to asthenopia-inducing conditions. Optom Vis Sci. 2007; 84: 611–619.
Tsubota K, Nakamori K. Effects of ocular surface area and blink rate on tear dynamics. Arch Ophthalmol. 1995; 113: 155–158. [PubMed]
Park JS, Choi MJ, Ma JE, Moon JH, Moon HJ. Influence of cellular phone videos and games on dry eye syndrome in university students. J Korean Acad Community Health Nurs. 2014; 25: 12–23.
Golebiowski B, Long J, Harrison K, Lee A, Chidi-Egboka N, Asper L. Smartphone use and effects on tear film, blinking and binocular vision. Curr Eye Res. 2020; 45: 428–434. [PubMed]
Chu C, Rosenfield M, Portello JK, Benzoni JA, Collier JD. A comparison of symptoms after viewing text on a computer screen and hardcopy. Ophthalmic Physiol Opt. 2011; 31: 29–32. [PubMed]
Talens-Estarelles C, Sanchis-Jurado V, Esteve-Taboada JJ, Pons ÁM, García-Lázaro S. How do different digital displays affect the ocular surface? Optom Vis Sci. 2020; 97: 1070–1079. [PubMed]
Rosenfield M . Computer vision syndrome: a review of ocular causes and potential treatments. Ophthalmic Physiol Opt. 2011; 31: 502–515. [PubMed]
Braun RJ, King-Smith PE, Begley CG, Li L, Gewecke NR. Dynamics and function of the tear film in relation to the blink cycle. Prog Retin Eye Res. 2015; 45: 132–164. [PubMed]
Pult H, Riede-Pult BH, Murphy PJ. The relation between blinking and conjunctival folds and dry eye symptoms. Optom Vis Sci. 2013; 90: 1034. [PubMed]
Pult H, Murphy P, Riede-Pult , B . Velocity of upper lid spontaneous complete blinks and dry eye. Cont Lens Anterior Eye. 2015; 38: e10–e11.
Soleimanloo SS, Wilkinson VE, Cori JM, Westlake J, Stevens B, Downey LA, et al. Eye-blink parameters detect on-road track-driving impairment following severe sleep deprivation. J Clin Sleep Med. 2019; 15: 1271–1284.
Figure 1. Flowchart of study visits and order of clinical assessments. Visit 2 was conducted 2 days after visit 1. The smartphone task was completed twice at each visit, before the other tasks, to assess repeatability. The remaining tasks were allocated in randomized order: six reading tasks (printed text, laptop, smart TV at 6 m, smartphone, smartphone at 50% brightness, smartphone with more complex text) and two nonreading tasks (conversation, walking indoors). DEQ-5, Dry Eye Questionnaire 5; LLT, lipid layer thickness; NIBUT, noninvasive tear break-up time; OSDI, Ocular Surface Disease Index; SANDE, Symptoms Assessment in Dry Eye; TMH, tear meniscus height.
Figure 2.
 
Study set-up showing the wearable eye tracking headset (Pupil Labs GmbH) with two inbuilt high-speed adjustable eye cameras and a scene camera for real time monitoring from participants’ vantage point. The headset was worn by study participants during various tasks including reading from (a) printed text, (b) laptop, (c) smart TV at 6 m, (d) smartphone, and (e) walking indoors and conversation (not shown). The wearable eye tracking headset was connected to a laptop for task monitoring and data acquisition for all tasks other than walking indoors, where an android phone was used for the same purpose, while the examiner followed behind the participant holding the android phone to monitor recording (Fig. 2e). Participants’ consents were obtained for use of these images.
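For readers wanting to reproduce this kind of in situ measurement, the headset's blink detector outputs can be loaded straight into an analysis script. The sketch below is a minimal illustration only, not the authors' pipeline; it assumes a blinks.csv export with start_timestamp, end_timestamp, and confidence columns (as in a typical Pupil Player export), which should be checked against the actual recording.

```python
# Minimal sketch: load blink events recorded by the headset during one task.
# File name and column names are assumptions (typical Pupil Player export);
# adapt them to whatever the recording software actually produces.
import pandas as pd

def load_task_blinks(csv_path, task_start_s, task_end_s, min_confidence=0.5):
    """Return blink events whose onset falls within the task recording window."""
    blinks = pd.read_csv(csv_path)
    in_task = blinks["start_timestamp"].between(task_start_s, task_end_s)
    confident = blinks["confidence"] >= min_confidence
    return blinks.loc[in_task & confident, ["start_timestamp", "end_timestamp"]]
```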
Figure 3. (a) Blink rate and (b) interblink interval during various tasks of 15 minutes' duration, measured using a wearable eye tracking headset (Pupil Labs GmbH) for 24 students with healthy eyes. Note that data from the first 3 minutes of the smartphone task were discarded and the remaining 12 minutes used for analysis. Data are presented as median and interquartile range. Open circles represent mild outliers (measurements >1.5 to 3.0 times the interquartile range) and stars represent extreme outliers (measurements >3 times the interquartile range).
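The plotted parameters follow directly from the blink onset times: blink rate is the number of blinks per analysed minute, and the interblink interval is the time between consecutive blinks. The sketch below is an illustrative reconstruction under those definitions (not the study's analysis code), including the exclusion of the first 3 minutes of the smartphone task and the interquartile-range outlier rule used in the figure legend.

```python
# Illustrative reconstruction of the blink parameters; not the study's code.
import numpy as np

def blink_metrics(blink_starts_s, task_duration_s, discard_first_s=0.0):
    """Blink rate (blinks/min) and mean interblink interval (s) for one task.

    blink_starts_s: blink onset times in seconds from task start.
    discard_first_s: e.g. 180 s for the smartphone task, whose first
    3 minutes were excluded from analysis.
    The interblink interval is approximated here as onset-to-onset time.
    """
    starts = np.sort(np.asarray(blink_starts_s, dtype=float))
    starts = starts[starts >= discard_first_s]
    analysed_minutes = (task_duration_s - discard_first_s) / 60.0
    blink_rate = len(starts) / analysed_minutes
    interblink_interval = np.diff(starts).mean() if len(starts) > 1 else np.nan
    return blink_rate, interblink_interval

def classify_outliers(values):
    """Tukey-style flags matching the figure legend: 'mild' if 1.5-3.0 times the
    IQR beyond the quartiles, 'extreme' if more than 3 times the IQR beyond."""
    q1, q3 = np.percentile(values, [25, 75])
    iqr = q3 - q1
    flags = []
    for v in values:
        dist = max(q1 - v, v - q3, 0.0)  # distance beyond the nearer quartile
        if dist > 3.0 * iqr:
            flags.append("extreme")
        elif dist > 1.5 * iqr:
            flags.append("mild")
        else:
            flags.append("typical")
    return flags
```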
Figure 4. (a) Blink rate and (b) interblink interval during various tasks of 15 minutes' duration, measured using a wearable eye tracking headset (Pupil Labs GmbH) for 24 students with healthy eyes. The tasks comprised six reading tasks (printed text, laptop, smart TV at 6 m, smartphone, smartphone at 50% brightness, smartphone with more complex text) and two nonreading tasks (conversation, walking indoors). *Data from the first 3 minutes of the smartphone task were discarded and the remaining 12 minutes used for analysis.
Figure 5. Ocular surface symptoms of discomfort and dryness scores (median and interquartile range) measured using the IOSS before and after various tasks of 15 minutes' duration for 24 students with healthy eyes. Note that data from the first 3 minutes of the smartphone task were discarded and the remaining 12 minutes used for analysis. Higher IOSS scores indicate worse discomfort. Blue and red circles represent mild outliers (symptom scores >1.5 to 3.0 times the interquartile range).
Figure 6. Differences between (a) blink rate and (b) interblink interval measured using the wearable eye tracking headset (Pupil Labs GmbH) during two repeats, plotted against their mean for 24 students with healthy eyes, while reading an easy book series on a smartphone for 12 minutes. The dotted line shows a bias of (a) −0.7 blinks/min (P = 0.62) and (b) 0.7 seconds (P = 0.55). The dashed lines represent the limits of agreement of (a) +11.7 to −13.1 blinks/min and (b) +19.5 to −18.2 seconds.
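These bias and limits-of-agreement values follow the standard Bland-Altman construction, and the coefficient of repeatability is the half-width of those limits (1.96 times the standard deviation of the between-visit differences); the limits in panel (a), −13.1 to +11.7 blinks/min around a bias of −0.7, correspond to a coefficient of repeatability of about ±12.4 blinks/min. A minimal sketch of that calculation, assuming paired per-participant values from the two smartphone repeats, is shown below (illustrative only, not the study's code).

```python
# Minimal Bland-Altman sketch for the repeatability analysis (illustrative only).
import numpy as np

def bland_altman(repeat_1, repeat_2):
    """Bias, 95% limits of agreement, and coefficient of repeatability
    for paired measurements, e.g. blink rate on the two smartphone repeats."""
    diff = np.asarray(repeat_1, dtype=float) - np.asarray(repeat_2, dtype=float)
    bias = diff.mean()                                   # mean between-visit difference
    sd = diff.std(ddof=1)                                # SD of the differences
    lower, upper = bias - 1.96 * sd, bias + 1.96 * sd    # 95% limits of agreement
    cor = 1.96 * sd                                      # coefficient of repeatability
    return bias, (lower, upper), cor
```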
Table. Baseline Ocular Surface Symptoms and Clinical Assessments for 24 Students With Healthy Eyes