It’s Time to Reschedule Your Cancer Screenings

A recent survey of more than 1,000 American adults found that 35 percent skipped previously scheduled cancer screening appointments during the pandemic. A USA Today article on the survey quotes MSK’s Dr. Jeffrey Drebin, who says that pushing back screening has a domino effect, as it could also delay life-saving treatment.

In the New York Daily News, MSK’s Dr. Lisa DeAngelis emphasizes the importance of cancer care and notes that early detection saves lives. She urges New Yorkers to schedule mammograms, colonoscopies, skin checks, and other screenings now, while New York City’s COVID numbers are low and stable.

Covidence: Better SR Data Quality & Integrity

The Covidence systematic review (SR) data management software is essentially a research electronic data capture tool, similar to REDCap. In an SR, however, the “study population” consists not of patients but of literature database search results (i.e., references), while the “survey” administered to each “study subject” consists of the inclusion and exclusion criteria.

Unlike in a typical clinical study, a unique feature of the systematic review study design is that all of the information is (ideally) captured in duplicate, by two human screeners/reviewers working independently of each other. In other words, the same “survey” is administered twice to the same “study subject,” and the two data captures are then compared to identify any disagreements.

This is where Covidence’s functionality goes beyond REDCap’s. Covidence not only documents the decisions of the two reviewers, but also compares them and automatically separates out any conflicts that need to be resolved – providing built-in quality control.
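To make that comparison step concrete, here is a minimal sketch in Python of the kind of logic involved. The reference IDs and votes are made up for the example; this is not Covidence’s own code.

# Conceptual sketch of dual-reviewer screening comparison (not Covidence's
# actual code). Reference IDs and votes below are hypothetical.
reviewer_a = {"ref001": "include", "ref002": "exclude", "ref003": "include"}
reviewer_b = {"ref001": "include", "ref002": "include", "ref003": "include"}

def find_conflicts(votes_a, votes_b):
    """Return the references where the two independent votes disagree."""
    shared = votes_a.keys() & votes_b.keys()
    return sorted(ref for ref in shared if votes_a[ref] != votes_b[ref])

print(find_conflicts(reviewer_a, reviewer_b))  # ['ref002'] -> set aside for conflict resolution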

In fact, Covidence requires reviewers to address all screening discrepancies before they can move on to the next stage of the review. In the full-text review stage, where an explanation for every exclusion must be provided, even if both reviewers vote to exclude an item, Covidence will flag any discrepancy in their exclusion reasons and require the team to resolve the conflict before proceeding.

Data integrity features are also prominent in Covidence. For example, reviewers have the ability to reverse a decision (i.e., make changes to collected data); however, if the second reviewer has already voted on that item, both reviewers will have to re-screen the record from the beginning so that both judgements are re-captured (i.e., this undoes all of the votes associated with the reference from that stage).

Also, to minimize the introduction of bias into the review process, the individual decisions made by the two reviewers are blinded to the team. That way, if a conflict has to be resolved by a third party, the third party will not be influenced by knowing who made which decision (and, for example, unconsciously side with the more senior reviewer). While a specific batch of records cannot be assigned to or linked to a particular reviewer, a particular task in the review process can be assigned to a specific team member (for example, resolving conflicts may be set to be handled solely by the project PI).

Another feature of Covidence that leads to better data is its quality assessment and data extraction process. If two reviewers assess each study for bias, a comparison of assessments and a consensus of judgements are needed to complete this stage. Data extraction, likewise completed independently by two reviewers, is followed by a consensus step. If the consensus step is skipped, the data will appear blank in the export, as only the “consensus judgements of data extraction” can be exported to Excel. In other words, if the data is not first “cleaned” by the team, they will literally not be able to get it out of Covidence.
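As a rough sketch of what that consensus requirement means for the export (the field names and values here are hypothetical, not Covidence’s actual schema):

# Only fields with a recorded consensus value end up populated in the export;
# anything still unresolved comes out blank. Field names are hypothetical.
extraction = {
    "sample_size": {"reviewer_a": 120, "reviewer_b": 120, "consensus": 120},
    "mean_age":    {"reviewer_a": 61,  "reviewer_b": 59,  "consensus": None},
}

def export_row(fields):
    """Build an export row from consensus values only; unresolved fields stay blank."""
    return {name: (vals["consensus"] if vals["consensus"] is not None else "")
            for name, vals in fields.items()}

print(export_row(extraction))  # {'sample_size': 120, 'mean_age': ''}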

Although Covidence does not include any data visualization or statistical analysis functionality, it does allow you to export the data as a spreadsheet: “The goal of this format is to facilitate import of data extracted in Covidence into statistical programs such as Stata, R, or SPSS.”
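For example, a team working in Python rather than Stata or SPSS could read the export with pandas. The file name and column name below are placeholders for whatever your own export contains.

# Minimal sketch: loading a Covidence spreadsheet export for analysis in Python.
# "covidence_export.csv" and "sample_size" are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("covidence_export.csv")  # use pd.read_excel() for an .xlsx export
print(df.head())                          # inspect the consensus extraction data
print(df["sample_size"].describe())       # summary statistics for one extracted field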

To learn more about Covidence, register for an upcoming workshop or Ask Us.

A Document Delivery Milestone: 400,000 ILLiad Requests

We’re celebrating a milestone here on the library team: the counter has tipped from 399,999 to 400,000 requests in ILLiad. This means that since 2009, the ILL team at the MSK Medical Library has processed 400k requests for books, articles, standards, dissertations, meeting abstracts, conference papers, tables of contents, and more for researchers at MSK and, on the flip side, for libraries all over the world that request items from our collection.


We’ve learned some tricks here and there to fill these requests. One blog [now defunct] referred to our specialization as Forensic Bibliographic Reconstruction, and I, for one, take that as a huge compliment. Most requests come to us with all the fields filled in; others take more research. I’ve seen requests for “The Smith Article from last week” or “Not sure.” Sometimes it can be more of a scavenger hunt or a wild goose chase, but we usually get the article into the user’s hands.

We provide this service at no cost to anyone with an MSK email address and ID number, as long as the request is relevant to work done at MSK. We’ve processed requests for physics, psychology, nutrition, medicine, physical therapy, chemistry, administration, HR, etc. But sometimes there are things we just can’t get.

With libraries around the world currently shut down or in various stages of quarantine, getting articles to users has been trickier. We are learning just how many items are only available from the shelves of shuttered libraries. The partnerships and channels of interlibrary loan have become more important than ever.    

I’m happy to use this opportunity to announce that ILLiad will soon be getting a new look! The webpages are being redesigned to be easier to use, especially on mobile devices. We’re currently working to make ordering and receiving articles easier for our user community. More details will follow in the coming weeks.

If you’d like to know more about our service or place a request, visit us at https://illiad.mskcc.org/illiad/Logon.html. We’re happy to answer any questions at ill@mskcc.org.