The competitive nature of academic research requires objective metrics for career milestones such as promotion and funding procurement. While various criteria are used to assess performance in academia, publications and research funding are weighted particularly heavily.1 Quantifying research dollars is relatively straightforward, but measuring research productivity is more complex. Not all articles are created equal, and articles vary in both the effort they require and their ultimate impact. In 2005, a physicist created the h-index to measure both research impact and productivity.2 As a bibliometric, the h-index equals the number of publications h that have been cited at least h times. Given its simplicity, the h-index has gained wide popularity in diverse medical specialties, including orthopedic surgery.3 Other recent studies have applied the h-index to hand surgery and spine surgery.4,5
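The definition above can be illustrated with a short sketch (Python is used purely for illustration; the language, function name, and citation counts are not from the study):

```python
def h_index(citation_counts):
    """Return the largest h such that h papers each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites < rank:
            break
        h = rank
    return h

# A hypothetical investigator with 5 papers cited 10, 8, 5, 4, and 3 times:
print(h_index([10, 8, 5, 4, 3]))  # 4 -- four papers have at least 4 citations each
```

Note that the h-index is insensitive to a few very highly cited papers: the investigator above would keep an h-index of 4 even if the top paper gained hundreds of citations.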
Importantly, some authors have raised concerns regarding potential limitations of the h-index. One potentially significant limitation is the ability of authors to artificially inflate their h-index via self-citation. The impact of this practice is of particular interest as the h-index is increasingly adopted as a metric for promotion at many academic institutions.6,7 Furthermore, scholarly productivity remains a critical component of successful grant procurement, and future grant applications may evaluate the h-index.8-10
The purpose of this study was to determine the prevalence and impact of self-citation on the h-index in a large cohort of orthopedic investigators. Given their high level of investment in academic orthopedic surgery, we focused on program directors, chairpersons, and National Institutes of Health (NIH)-funded research faculty at orthopedic surgery residency programs.
METHODS
INCLUSION CRITERIA
This study qualified as non-human, non-animal research and was exempted under the standing policy of the Institutional Review Board. The Fellowship and Residency Electronic Interactive Database (FREIDA) was used to generate lists of allopathic orthopedic surgery residency programs and their program directors.11 Official program websites were reviewed to identify orthopedic chairpersons. Lastly, the NIH RePORTER was queried to identify basic science orthopedic investigators who received funding at any time from 2011 to 2014.12 This approach was necessary because program websites do not consistently list basic science investigators. The list of NIH-funded orthopedic investigators was cross-referenced via an online search to isolate a cohort of PhD investigators.
Orthopedic faculty were defined as chairpersons, program directors, or NIH-funded investigators. In cases of overlap, preference was given in that order. Orthopedic investigators who had not published an article after 1995 were excluded (6 chairpersons, 1 program director).
BIBLIOMETRIC ANALYSIS
While several resources exist to calculate the h-index, the Scopus database (Elsevier) is among the most straightforward to use.13 Author entries are aggregated by institutional affiliation, obviating the need for manual reconciliation. Investigators were identified on Scopus by “author last name” and “first name, middle initial.” For each author, publications were screened for relevance to the field of orthopedics, and affiliated institutions were cross-referenced against information obtained from individual program websites. The “view h-graph” feature was used to calculate the number of publications, the h-index, and the number of citations. The “Exclude self-citations” feature was then used to calculate the corrected citation count and the h-index excluding self-citations. All metrics were calculated over a 2-day period.
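The correction performed by the “Exclude self-citations” feature can be mimicked on per-article counts: subtract each article's self-citations before recomputing h. A minimal sketch with hypothetical numbers (the data below are illustrative, not drawn from the study cohort):

```python
def h_index(citation_counts):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites < rank:
            break
        h = rank
    return h

# (total citations, self-citations) per article -- hypothetical values
articles = [(10, 5), (8, 4), (6, 3), (5, 2), (4, 2)]

raw = h_index([total for total, _ in articles])
corrected = h_index([total - self_cites for total, self_cites in articles])
print(raw, corrected)  # 4 3 -- here self-citation inflated the h-index by 1
```

Because the h-index depends only on citations near the h threshold, even heavy self-citation changes the index only when it pushes articles across that threshold, which is why a per-investigator corrected recount is needed rather than a simple citation total.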