Traditional article-level metrics, such as the number of times a journal article is cited, are certainly useful in establishing the degree of impact a particular research study has had on its field. This is especially true in the basic sciences, where citations are the currency with which researchers acknowledge the work that inspired their new ideas, as their findings build on those of others. Citation counts, however, are limited in their usefulness: they require a considerable time lag to accumulate and so rarely provide an immediate indication of impact.
Similarly, it is not uncommon in clinical research for a published study to have far-reaching, practice-changing implications yet never garner a particularly noteworthy citation count, especially in comparison with papers in the basic sciences. Because not all practicing clinicians take part in clinical research, it is quite possible for many health professionals to heavily consume a piece of published information but, since they do not conduct research or publish articles of their own, never cite (i.e., acknowledge in the scholarly record) information that may have been critical to their practice.
In comes a tool like PlumX.