Systematic Review: Screening process steps

Screening

When reviewing the final search results from your chosen databases (and other sources, if relevant), multiple reviewers (you and your supervisors or co-reviewers) decide which articles to include or exclude based on the criteria specified in your protocol. The first stage is usually based on titles and abstracts; a full-text analysis then follows before data extraction.

  • Pre-screening:  Record the number of results from each database or source before screening commences.
  • Remove duplicates:  Covidence automates this process, but you can also choose to de-duplicate references within EndNote.
  • Title/abstract screening:  Reviewers scan titles and abstracts to see whether each article matches the criteria or has some value to the systematic review. This may be done by a single reviewer, but having multiple reviewers screen separately and then compare results reduces the likelihood of bias.
  • Full-text screening:  Multiple reviewers individually read the full text of the included articles to fine-tune the final collection of articles that will contribute to the review.

Document your searches using an Excel workbook, or use one of the many tools available to store citations and manage the screening process.
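
If you prefer a scripted record alongside Excel or Covidence, a minimal sketch is shown below. It is written in Python and assumes your citations have been exported to CSV files (one per database) with hypothetical "title" and "doi" columns; the file locations and column names are assumptions, not a prescribed workflow.

```python
import csv
from pathlib import Path

def load_citations(csv_path):
    """Load one database export; assumes hypothetical 'title' and 'doi' columns."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def deduplicate(citations):
    """Keep the first occurrence of each record, matching on DOI or normalised title."""
    seen, unique = set(), []
    for record in citations:
        key = (record.get("doi") or record.get("title", "")).strip().lower()
        if key and key not in seen:
            seen.add(key)
            unique.append(record)
    return unique

# Record the pre-screening count for each source, then combine and de-duplicate.
sources = {path.stem: load_citations(path) for path in Path("exports").glob("*.csv")}
for name, citations in sources.items():
    print(f"{name}: {len(citations)} records retrieved")

combined = [record for citations in sources.values() for record in citations]
screening_pool = deduplicate(combined)
print(f"{len(combined)} records in total, {len(screening_pool)} after de-duplication")
```

Keeping these per-source counts makes it straightforward to report the pre-screening and post-deduplication numbers later (for example, in a PRISMA flow diagram).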

EndNote and Covidence

At Monash Library, we support EndNote (reference manager) and Covidence (software package for reviews), which can help you streamline the review process. Self-enrol in our Integrating EndNote and Covidence tutorial to familiarise yourself with using these tools for systematic reviews.

Other tools that may be useful for screening (and other review management processes) include the following (note that Monash Library does not offer support for these tools):

  • Abstrackr (free)
  • Colandr
  • EPPI-Reviewer (subscription)
  • HubMeta (AI Assistant learns from screening decisions to continuously sort remaining articles based on relevance and likelihood of inclusion in the review - currently free)
  • Litstream|ICF
  • PICO Portal (free; only one review at a time, and the Ask AI feature is not available in the free version)
  • Rayyan (free)
  • RevMan Web (for Cochrane and other reviews). RevMan Web (RMW) simplifies creating meta-analyses, forest plots and risk-of-bias tables. It may be used for teaching, learning and research activities related to systematic reviews, and it is mandatory for research that is to be published in the Cochrane Library (i.e. 'Cochrane Reviews'). Register with Cochrane using your Monash email, then click "sign up now" on this page to connect to RMW. Once confirmed via email, you can link to RMW directly, then click "My Portfolio" to create a new review and access support via the help menu or RMW training resources.

  • SUMARI (for JBI reviews mainly) - enter via any Ovid database and look for the EBP tools on the top blue toolbar

 

Apply Criteria

As each reviewer assesses the articles returned by the searches, they must adhere to the inclusion and exclusion criteria that were defined in the protocol. A checklist or table will assist with this.

Example: a screening table might include columns such as:

  • Author
  • Date
  • Journal
  • Study aim
  • Hypothesis
  • Research questions
  • Location
  • Study design
  • Participants
  • Data collection & analysis methods
  • Results/findings
  • Relevant to topic/research question
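
If you keep this checklist in a spreadsheet, the same structure can also be captured programmatically. The sketch below is a minimal Python example; the field names are hypothetical and simply mirror the columns above, and the sample record is placeholder data, not a recommendation for any particular fields.

```python
import csv
from dataclasses import dataclass, asdict, fields

@dataclass
class ScreeningRecord:
    # Field names are hypothetical and mirror the example checklist columns above.
    author: str
    date: str
    journal: str
    study_aim: str
    hypothesis: str
    research_questions: str
    location: str
    study_design: str
    participants: str
    data_collection_and_analysis: str
    results_findings: str
    relevant_to_question: bool
    rationale: str = ""

records = [
    ScreeningRecord(
        author="Smith et al.", date="2021", journal="Example Journal",
        study_aim="...", hypothesis="...", research_questions="...",
        location="Australia", study_design="RCT", participants="Adults 18-65",
        data_collection_and_analysis="Survey; regression", results_findings="...",
        relevant_to_question=True, rationale="Meets all inclusion criteria",
    ),
]

# Write the table to CSV so each reviewer's decision and rationale are documented.
with open("screening_table.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(ScreeningRecord)])
    writer.writeheader()
    writer.writerows(asdict(r) for r in records)
```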

Your table also serves as documentation of each reviewer's rationale in selecting or rejecting articles. A "cross-check" is performed to make sure that all reviewers have agreed on the included articles (based on their abstracts).
Example: Screening selection tool
Full texts of the included articles are retrieved at this point. There are a number of ways to do this:

 

  • Use the DOI or URL of the article (a short sketch for generating DOI links follows this list)
  • Use the Find Full Text feature in EndNote (note that this is very limited)
  • Search library catalogues such as TROVE.
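
If you have recorded DOIs during screening, you can generate resolver links for them in bulk. The Python sketch below uses the public https://doi.org resolver; the DOI strings are placeholders to be replaced with the DOIs from your own screening table.

```python
# Build https://doi.org resolver URLs for a batch of DOIs recorded during screening.
# The DOIs below are placeholders; substitute the DOIs from your own records.
dois = [
    "10.1000/example.doi.1",
    "10.1000/example.doi.2",
]

for doi in dois:
    url = f"https://doi.org/{doi.strip()}"
    print(url)  # open in a browser, or pass through your library's link resolver
```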

The screening process is then repeated based on the full text of each article selected from the title and abstract screen. This step is usually performed independently by multiple reviewers to reduce bias. Reviewers then compare their results until an agreement is reached. Sometimes an additional reviewer is needed at this stage if the inclusion of any articles is particularly contentious. The articles remaining are the ones that will be evaluated and analysed.
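
When reviewers compare their independent decisions, a simple agreement statistic such as Cohen's kappa can help document how consistent the screening was before disagreements are discussed. The sketch below is a minimal Python example; the two decision lists are hypothetical, with each position corresponding to one article.

```python
# Cohen's kappa for two reviewers' independent include/exclude decisions.

def cohens_kappa(decisions_a, decisions_b):
    """Return Cohen's kappa for two equal-length lists of 'include'/'exclude' labels."""
    assert len(decisions_a) == len(decisions_b)
    n = len(decisions_a)
    observed = sum(a == b for a, b in zip(decisions_a, decisions_b)) / n
    labels = set(decisions_a) | set(decisions_b)
    expected = sum(
        (decisions_a.count(lbl) / n) * (decisions_b.count(lbl) / n) for lbl in labels
    )
    return 1.0 if expected == 1 else (observed - expected) / (1 - expected)

# Hypothetical decisions from two reviewers, one entry per screened article.
reviewer_1 = ["include", "exclude", "include", "exclude", "include"]
reviewer_2 = ["include", "exclude", "exclude", "exclude", "include"]

print(f"Cohen's kappa: {cohens_kappa(reviewer_1, reviewer_2):.2f}")
# Articles where the decisions differ are then discussed, or sent to a third reviewer.
```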