Based on DISCERN scores reported across studies, information about rotator cuff tears ranked below information about osteosarcoma and juvenile idiopathic arthritis but above information about scoliosis, cervical spine surgery, and ACL reconstruction (Table). Because there are no established thresholds defining good or poor DISCERN scores, scores must be interpreted by comparison across studies.
Of the 4 studies that analyzed the percentage of websites citing peer-reviewed sources, only our study and the study of cervical spine surgery18 reported both that percentage and a DISCERN score. The percentage of websites citing peer-reviewed sources was 26% for rotator cuff tears and 24% for cervical spine surgery; the respective DISCERN scores were 44 and 43.6. Because only these 2 studies could be compared, no real correlation between the percentage of websites citing peer-reviewed sources and the quality of content on a given topic can be assessed, and more research into this relationship is needed. One association that has already been delineated is the correlation between HONcode-certified sites and high DISCERN scores.21 For high-quality medical information, physicians can direct their patients both to academic institution websites and to HONcode-certified websites.
When we compared the present study with previous investigations, we found large differences in search results for a given topic. In 2013, Duncan and colleagues6 and Bruce-Brand and colleagues9 used similar study designs (eg, search terms, search engines) for their investigations of the quality of web information, yet their results differed widely. For example, the percentage of websites with industry authorship was 4.5% in the study by Duncan and colleagues6 and 64% in the study by Bruce-Brand and colleagues9. This inconsistency between studies conducted during similar periods might be related to what appears at the top of the results queue for a search: Duncan and colleagues6 analyzed 200 websites, whereas Bruce-Brand and colleagues9 analyzed only 45. Industries may have made financial arrangements and used search engine optimization techniques to have their websites listed first in search results.
In our study, we also analyzed how web information has changed over time. Because information on the Internet changes daily, we hypothesized that our 2 searches (2011, 2014) would yield different results. Surprisingly, the data were similar, particularly with respect to authorship (Figures 1, 2). In both searches, the largest authorship source was private physicians or physician groups (38% in 2011 and 2014), and other authorship sources showed little change in percentage between searches. As for content, we found both increases and decreases in specific web information. The ability to contact authors increased from 21% (2011) to 50% (2014); we think it is important that websites offer a communication channel to people who read the medical information the sites provide. The percentage of websites discussing nonoperative treatment options increased from 11.5% to 61%, meaning patients in 2014 were being introduced to more options (in addition to surgery) for managing shoulder pain, an improvement in the quality of information between the searches. The percentage of websites discussing surgical eligibility, however, decreased from 43% to 18%, and the percentage discussing surgical complications decreased from 42% to 25%, both negative developments in information quality. Interestingly, 6% of websites discussed double- versus single-row surgery in 2011, but none did so in 2014. Given the data as a whole, with both negative and positive changes, it appears the quality of web content has not improved significantly.
Lost in the discussion of quality and reliability of information is whether patients comprehend what they are reading.23 Yi and colleagues19 recently assessed the readability of arthroscopy information in articles published online by the American Academy of Orthopaedic Surgeons (AAOS) and the Arthroscopy Association of North America (AANA). The investigators used the Flesch-Kincaid readability test, which expresses readability as a school grade level. They found that the majority of the patient education articles on the AAOS and AANA sites had a readability level far above the national average; only 4 articles were written at or below the eighth-grade level, the current average reading level in the United States.24 Information that is not comprehensible is of no use to patients, and information that physicians and researchers consider high-quality might not be what patients consider high-quality. As we pursue higher-quality web content, we need to consider that its audience includes nonmedical readers, our patients. In the present study, we found no correlation between a website's readability and its DISCERN score (Figure 5); for information about rotator cuff repairs, therefore, higher-quality websites are no harder for patients to comprehend than lower-quality sites. The Flesch-Kincaid readability test is limited in that it considers only the number of syllables per word and words per sentence and does not account for nontextual elements of patient education materials, such as illustrations on a website. The mean grade level of 10.98 found in our study is higher than the levels found in most of the studies reviewed by Yi and colleagues.19
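For readers unfamiliar with the metric, the sketch below illustrates the standard Flesch-Kincaid grade-level formula, 0.39 × (words per sentence) + 11.8 × (syllables per word) − 15.59. The syllable counter shown is a simplified heuristic and the sample text is invented for illustration only; neither reflects the software used in our analysis.

    import re

    def count_syllables(word):
        # Rough syllable heuristic: count runs of consecutive vowels,
        # drop a silent trailing "e", and floor at one syllable per word.
        word = word.lower()
        count = len(re.findall(r"[aeiouy]+", word))
        if word.endswith("e") and count > 1:
            count -= 1
        return max(count, 1)

    def flesch_kincaid_grade(text):
        # The grade level depends only on words per sentence and syllables
        # per word; illustrations and other nontextual elements are ignored.
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return (0.39 * (len(words) / len(sentences))
                + 11.8 * (syllables / len(words))
                - 15.59)

    sample = ("Rotator cuff tears are a common cause of shoulder pain. "
              "Nonoperative treatment includes physical therapy and injections.")
    print(round(flesch_kincaid_grade(sample), 1))

As the formula makes clear, two websites with very different amounts of supporting imagery, layout, and structure can receive the same grade level, which is why the score should be read as a measure of textual complexity only.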