Comment
In addition to the convenience of having an educational tool in their white-coat pocket, learners of dermatology have been shown to benefit from supplementing their curriculum with mobile apps, which sets the stage for formal integration of mobile apps into dermatology teaching in the future.8 Prior to widespread adoption, mobile apps must be evaluated for content and utility, starting with an objective rubric.
Without official scientific standards in place, it was unsurprising that only half of the dermatology education applications in this study were classified as adequate. Among the app types offered—clinical decision support tools, journals, references, a podcast, games, learning modules, and self-evaluation tools—certain categories scored higher than others. The formats with the highest average score (16.5 out of 20) were journals and the podcast.
One barrier to utilization of these apps was that a subscription to the journals and the podcast was required to access all available content. Students and trainees can seek out library resources at their academic institutions to take advantage of journal subscriptions available to them at no additional cost. Dermatology residents can take advantage of their complimentary membership in the American Academy of Dermatology (AAD) for a free subscription to AAD Dialogues in Dermatology (otherwise $179 annually for nonresident members and $320 annually for nonmembers).
On the other hand, learning modules were the lowest-rated format (average score, 11.3 out of 20), with only Medical Student: Dermatology qualifying as adequate (total score, 16). This finding is worrisome given that students and residents might look to learning modules for quick, targeted lessons on specific topics.
The lowest-scoring app, a clinical decision support tool called Naturelize, received a total score of 7. Although it listed the indications and contraindications for dermal filler types used in different locations on the face, it had a clear conflict of interest, an oversimplified design, and little evidence-based education. These shortcomings mirror the current state of cosmetic dermatology training in residency, in which trainees report feeling inadequately prepared for aesthetic procedures and comparative effectiveness research is lacking.9-11
At the opposite end of the spectrum, MyDermPath+ was a reference app that earned a total score of 20. The app cited credible authors with a medical degree (MD) and had an easy-to-use, well-designed interface, including a reference guide, differential builder, and quiz covering a range of topics within dermatology. Because the app was a free download without in-app purchases or advertisements, there was no evidence of a conflict of interest. The position of a dermatopathology app as the top dermatology education mobile app might reflect an increased emphasis on dermatopathology education in residency as well as a transition to digitization of slides.5
The second-highest scoring apps (total score, 19) were Dermatology Database and VisualDx. Both were references covering a wide range of dermatology topics. Dermatology Database was a comprehensive search tool for diseases, drugs, procedures, and terms that was simple and entirely free to use but did not cite references. VisualDx, as its name suggests, offered quality clinical images, complete guides with references, and a unique differential builder. An annual subscription costs $399.99, but the process to gain free access through a participating academic institution was simple.
Games were a unique mobile app format; however, 2 of the 3 games scored only in the somewhat adequate range. DiagnosUs, which tested users' ability to differentiate skin cancer and psoriasis from dermatitis on clinical images, would benefit from more comprehensive content as well as professional verification of the true diagnoses; these shortcomings earned the app only 2 points in each of the content and accuracy categories. The Unusual Suspects tested the ABCDE algorithm in a short learning module, followed by a simple game that involved identification of melanoma in a timed setting. Although the design was novel and interactive, the game was limited to the same 5 melanoma tumors overlaid on pictures of normal skin. The narrow scope earned 1 point for content, the redundancy of the game earned 3 points for design, and the lack of real clinical images earned 2 points for educational objectives. Although game-format mobile apps can challenge the user's knowledge with a built-in feedback or reward system, improvements should be made to ensure that these apps are as educational as they are engaging.
AAD Dialogues in Dermatology, the only app in podcast form, provided expert interviews along with disclosures, transcripts, commentary, and references. More than half of the content in the app could not be accessed without a subscription, earning the app 2.5 points in the conflict of interest category. Additionally, several flaws, including inconsistent availability of transcripts, poor sound quality on some episodes, difficulty distinguishing new episodes from those already played, and a glitch that removed the episode duration, resulted in a design score of 2.5. Still, the app was a valuable and comprehensive resource, with clear objectives and cited references. With improvements in content, affordability, and user experience, apps in unique formats such as games and podcasts might appeal to kinesthetic and auditory learners.
An important factor to consider when discussing mobile apps for students and residents is cost. With rising prices of board examinations and preparation materials, supplementary study tools should not come with an exorbitant price tag. Therefore, we limited our evaluation to apps that were free or cost less than $5 to download. Even so, subscriptions and other in-app purchases were an obstacle in one-third of apps, ranging from $4.99 to unlock additional content in Rash Decisions to $69.99 to access most topics in Fitzpatrick’s Color Atlas. The highest-rated app in our study, MyDermPath+, historically cost $19.99 to download but became free with a grant from the Sulzberger Foundation.12 An initial investment to develop quality apps for the purpose of dermatology education might pay off in the end.
To evaluate the apps from the perspective of the target demographic of this study, 2 medical students—one in the preclinical stage and the other in the clinical stage of medical education—and a dermatology resident graded the apps. Certain limitations exist in this type of study, including the evaluators' differing learning styles, which might influence the types of apps they found most impactful to their education. Interestingly, some apps earned a higher resident score than student score. In particular, RightSite (a reference that helps with anatomically correct labeling) and Mohs Surgery Appropriate Use Criteria (a clinical decision support tool for determining whether to perform Mohs surgery) each had a 3-point discrepancy (data not shown). A resident might benefit from these practical apps in day-to-day practice, but a student would be less likely to find them useful as a learning tool.
Still, by defining adequate teaching value using the specific categories of educational objectives, content, accuracy, design, and conflict of interest, we attempted to minimize the effect of personal preference on the grading process. Although we acknowledge a degree of subjectivity, we found that utilizing a previously published rubric with defined criteria was crucial to minimizing bias.
Conclusion
Further studies should evaluate additional apps available on tablets such as Apple's iPad, as well as apps on other operating systems, including Google's Android. To ensure that mobile apps serve as adequate educational tools, they should, at a minimum, be peer reviewed prior to publication or before widespread use by current and future providers. To maximize free access to the highly valuable resources available in the palm of their hand, students and trainees should contact the library at their academic institution.