Abstract
Purpose:
To compare online patient reviews from the website Yelp with peer reviews of the same 10 ophthalmologists.
Methods:
An online search of the website Yelp was conducted to collect patient reviews, posted through January 1, 2009, of 10 primary care ophthalmologists from the San Francisco Bay area. These online Yelp scores were compared with 1) peer reviews from 5 ophthalmologists in the same community and 2) surgical complication rates.
Results:
Yelp online patient reviews showed low correlation with peer reviews of ophthalmologists (ρ=0.018, p=0.96). Fifty percent of Yelp patient reviews pertained to the quality of the physician and 50% to other factors, including staff, wait time, finances, office building, availability, and location. Twenty-seven percent of patient reviews were negative, and these most commonly (53%) concerned staff or wait time. The 5 peer reviewers showed high inter-rater agreement (ρ=0.635, p<0.001). Cataract complication rates correlated significantly better with peer reviews (ρ=1.00, p<0.0001) than with Yelp patient reviews (ρ=0.13, p=0.75).
Conclusions:
Patient reviews from the website Yelp demonstrate minimal correlation with peer reviews of primary care ophthalmologists in the San Francisco Bay area.
Keywords: clinical (human) or epidemiologic studies: health care delivery/economics/manpower • clinical (human) or epidemiologic studies: outcomes/complications