New Database! iThenticate

The MSK Library provides all MSK staff with access to the plagiarism detection software iThenticate.

iThenticate is a tool that lets researchers and writers check their original work for potential plagiarism. The database compares documents against 93% of top-cited journal content and more than 70 billion current and archived web pages.

Check out our Support for Authors guide for more writing and publishing resources.

Systematic Review Course: Cochrane Interactive Learning

The library recently acquired a subscription to Cochrane Interactive Learning, an online introductory course on how to conduct a systematic review of interventions. This 15-hour self-paced course, divided into 11 modules, covers the following topics:

  • Module 1: Introduction to conducting systematic reviews
  • Module 2: Writing the review protocol
  • Module 3: Searching for studies
  • Module 4: Selecting studies and collecting data
  • Module 5: Introduction to study quality and risk of bias
  • Module 6: Analyzing the data
  • Module 7: Interpreting the findings
  • Module 8: Reporting the review
  • Module 9: Introduction to health economics
  • Module 10: Network meta-analysis
  • Module 11: Health equity in systematic reviews

This dynamic and flexible course is designed for both beginners and experienced review authors. It is broken into more than 100 short learning units that build knowledge one step at a time, with interactive activities and quizzes throughout. Each module ends with an assessment that lets learners demonstrate their skills and download a certificate of completion.

For technical support accessing the course, please refer to the registration instructions.

Reproducibility in Pre-clinical Cancer Research

The Reproducibility Project: Cancer Biology, led by Tim Errington, recently reported the results of an eight-year effort to replicate 193 experiments from 53 high-impact preclinical cancer biology papers published between 2010 and 2012. (Preclinical studies are those conducted in cells or animals before research is carried out in humans.) The team was able to complete only 50 of the 193 experiments, largely because of missing data, incomplete reporting, or lack of access to original materials.
To assess whether those 50 experiments replicated, the team applied five criteria and “found that just 18% succeeded on all five, while 20% failed on all five. Overall, 46% of effects succeeded on most criteria.”
The results are reported in two eLife articles: “Challenges for assessing replicability in preclinical cancer biology” and “Investigating the replicability of preclinical cancer biology.”

Nature recently published a comment titled “Five keys to writing a reproducible lab protocol,” with advice on improving transparency and avoiding similar issues in the future.