Systematic review searching


Introduction

see also: Bibliographic citation software, Expert searching, Filters (ie. hedges), Hand-searching & Systematic reviews

What is the purpose of a systematic review (SR)?

The purpose of a systematic review is to collect ALL available evidence (or as near "complete recall" as possible), whether published in the peer-reviewed literature or found outside regular publishing and indexing channels (the grey literature). Systematic reviews (SRs) are often described as 'papers that summarize other papers' and defined as "overviews of primary studies that have used explicit and reproducible methods". Typically, SRs synthesize findings from important clinical trials and may or may not include a meta-analysis. SRs are also said to constitute the best-available evidence on a topic, where the research has been identified, selected, appraised and synthesized in a systematic and transparent way to inform decision-making.

When undertaking information retrieval for a systematic review, it is important to document search strategies so that they can be reproduced.
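As a practical illustration, the Python sketch below shows one way a search could be logged in machine-readable form for later reporting. The field names, the Ovid-style strategy line and the file name are illustrative assumptions, not a prescribed standard.

  # A minimal, hypothetical search log written as CSV so strategies can be
  # reported and re-run later. Field names and example values are illustrative.
  import csv
  from datetime import date

  search_log = [{
      "database": "MEDLINE (Ovid)",                # database and interface searched
      "date_searched": date.today().isoformat(),   # when the search was run
      "strategy_line": "exp Osteoporosis/ AND exp Decision Support Systems, Clinical/",
      "limits": "English; 2000-current",           # any limits applied
      "records_retrieved": 0,                      # fill in after running the search
  }]

  # Write the log so the strategy can be reported and reproduced later.
  with open("search_log.csv", "w", newline="") as f:
      writer = csv.DictWriter(f, fieldnames=search_log[0].keys())
      writer.writeheader()
      writer.writerows(search_log)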

PRISMA Statement

According to the PRISMA Statement reprinted in the Annals of Internal Medicine in 2009, a "Systematic Review attempts to collate all empirical evidence that fits pre-specified eligibility criteria to answer a specific research question. It uses explicit, systematic methods that are selected to minimize bias, thus providing reliable findings from which conclusions can be drawn and decisions made".

Consultation with a health librarian

While health librarians often play key roles in systematic reviews, searching is also performed by researchers, with health librarians assuming more consultative roles. Several issues must be considered when this type of consultative work is undertaken. Efficient retrieval plays a central role in answering clinical questions and in gathering evidence for the SR. In the initial stages, it is critical that relevant studies are found and that multiple sources are searched, including sources in the deep (hidden) web and the grey literature. Your goal as an information retrieval specialist or expert searcher is to maximize recall of the literature on the topic and to minimize bias in doing so; your search strategy must reflect this kind of thinking. Comprehensive searching for all relevant studies is essential, as is documentation of explicit strategies; consequently, consult a health librarian about your searches.

You must carefully document and report your search strategies. Hand-searching and snowballing are required, as is searching for grey literature. The latter involves locating additional citations that are not normally indexed in the major biomedical databases such as MEDLINE or Embase.

Search strategies

SRs are scientific investigations in themselves, with pre-planned methods and an assembly of original studies as their "subjects." They synthesize the results of many primary studies using strategies that limit bias and error. Ideally, an information specialist should review your searches, or help formulate them, before the review proceeds.

In summary, the role of the SR searcher is to:

  1. Understand systematic review methods
  2. Plan search strategies
  3. Undertake faceted searching using various tools, engines and databases (see the sketch following this list)
  4. Be able to locate and use methodological search filters
  5. Understand budgeting and costing of search support for research grants
  6. Be able to document search processes, and work towards reproducibility
  7. Have an awareness of managing references using citation managers such as RefWorks and social reference management tools such as Zotero & Mendeley
  8. Understand the importance of searching in the systematic review process

For an excellent overview of expert search strategies, see Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI-Centre).

Scoping or narrative review searching

see also Rapid evidence-assessments (REAs) and Scoping studies

Preparing for a scoping search involves a preliminary literature review (or set of reviews) for the clinical team, performed to identify existing systematic reviews and major clinical trials. From the initial scoping searches, the clinical team focuses (or refocuses) the clinical question and the direction of their proposed research. There is no need to replicate studies that have already been registered as a protocol at the Cochrane Library or elsewhere. Scoping searches are therefore done beforehand in the major biomedical databases such as the Cochrane Library, MEDLINE, CINAHL, Google Scholar and PsycINFO.

Scoping involves a refining process: identifying what information sources should be searched. Full literature searching aimed at retrieving relevant studies or articles in a given discipline starts with scoping the search. The scoping process is iterative and helps to estimate the size of the literature in question and the costs of searching it. Some health librarians are beginning to offer their information retrieval skills for rapid evidence-assessments (REAs), because interdisciplinary search topics increase the likelihood of time-consuming and expensive searching, and it is important to know this before a large-scale project is undertaken. Arksey and O'Malley (2005) outline a methodological framework that identifies different types of scoping studies and how they compare to systematic reviews.

Identifying key, searchable sources of information is a process that varies considerably depending on the research or clinical question. The best tools and searching protocols are devised accordingly, but searching for systematic reviews requires search tracking far beyond scoping. Searching the top five or ten biomedical databases is not enough; additional resources must be identified, including non-English materials and the grey literature. Some sources of information are well-developed and stable, but others hidden in the deep web are not.

Systematic review searching differs from searching for a narrative review in that the search strategies are explicitly outlined, as is how the searcher identified the available literature. In addition, how the literature was critically appraised and interpreted must be reported. This makes it possible for other researchers, readers and users to reproduce your searches and to see how thoroughly your review was conducted.

Search filters

see also Filters (ie. hedges)

"A search term or terms (such as 'random allocation') that select studies that are at the most advanced stages of testing for clinical application." (Haynes RB, Wilczynski N, McKibbon 1995, p436)

Search filters (also called hedges, optimal search strings or clinical queries) are predefined strategies designed to retrieve high levels of evidence (i.e., randomized controlled trials, meta-analyses or systematic reviews) and/or articles that address clinical queries (diagnosis, prognosis, etiology, treatment). Search filters are called 'methodological' strategies because they consist of terms that relate to study design. RCT filters, for example, contain terms such as double blinded, randomized and clinical trial; diagnosis filters contain terms such as sensitivity and specificity. While RCTs are achievable in clinical settings, public health interventions can rarely replicate the controlled environment of the clinic, so researchers, policy-makers and decision-makers often rely on other types of study designs for evidence. There is currently no standard model for synthesizing results of studies that do not have controls. Synthesis methods, critical appraisal tools and studies that deal with appraising and synthesizing quantitative studies without control groups are needed, as are precautions when including non-controlled studies as evidence. (For more information, see Fitzpatrick-Lewis D et al. Methods for the Synthesis of Studies without Control Groups.)
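
As a rough illustration of how a methodological filter is combined with a subject search, the Python sketch below sends a filtered query to PubMed through NCBI's E-utilities (esearch) and reports the hit count. The topic string and the shortened RCT-style filter are illustrative assumptions, not a validated hedge such as the published clinical queries.

  # Combine a topic search with a simplified RCT-style filter and ask PubMed
  # (via NCBI E-utilities esearch) how many records match. Standard library only.
  import json
  import urllib.parse
  import urllib.request

  topic = "osteoporosis AND decision support systems, clinical[MeSH Terms]"   # illustrative topic
  rct_filter = ("(randomized controlled trial[pt] OR randomized[tiab] "
                "OR placebo[tiab] OR randomly[tiab] OR trial[ti])")           # shortened, illustrative filter
  query = f"({topic}) AND {rct_filter}"

  params = urllib.parse.urlencode({
      "db": "pubmed",
      "term": query,
      "retmode": "json",
      "retmax": 0,          # only the hit count is needed here
  })
  url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?" + params

  with urllib.request.urlopen(url) as resp:
      result = json.load(resp)

  print("PubMed records matching the filtered query:", result["esearchresult"]["count"])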

Expert searches

see also: Expert searching

After the research questions and search strategies are determined, exhaustive searches can be carried out to produce as thorough a list as possible of studies, both published and unpublished, that may fit the inclusion criteria. Expert searching for SRs is generally considered the province of health librarians. SR search specialists should be consulted during the early stages of planning, as they are trained to search efficiently and to stay current with information trends. Health librarians help to locate documents from the deep web and assist in formatting references using citation management tools. Some librarians specializing in expert searching work in hospital or university libraries; others work outside traditional contexts, closer to established research institutes, epidemiology groups, technology assessment programs and other clinical research programs. Some are entrepreneurial and work on a contract basis with various health groups.

Horizon estimation

Researchers have no empirically based stopping rule when searching for potentially relevant articles for inclusion in systematic reviews. One research team tested a stopping strategy based on capture-mark-recapture (CMR; i.e., the Horizon Estimate) statistical modeling to estimate the total number of articles in the domain of clinical decision support tools for osteoporosis disease management, using four large bibliographic databases (MEDLINE, EMBASE, CINAHL, and EBM Reviews).

When planning a clinical trial, or when collecting information to include in a meta-analysis, an investigator searches the available literature to get the most up-to-date bibliographic information. Such a literature search, or systematic review, is usually performed by searching databases, but it may also involve a search of books, journals, and websites. Common database searches for medical researchers include the Cochrane Library, MEDLINE, EMBASE, PsycINFO, and CINAHL, to name a few. Typically the goal of the review is to find as many relevant articles as possible, but such a review need not be limited to articles; the search may instead be for relevant journals rather than articles. It is quite likely that a review will miss some, if not many, articles or journals. Some journals could be missed because the review was limited in its key words, to one or a small number of languages, to a particular time frame, or to a small number of databases. Knowing that it is unlikely that a review will find all relevant journals, it is important for a researcher to estimate the number of journals that may have been missed. Adding this estimate to the total number of journals identified by the review defines the Horizon. If a review identifies a small proportion of journals relative to the Horizon, the research project being planned may be based on inadequate information. One published method uses Poisson regression modeling to estimate the number of journals that have been missed.

Researchers conducting systematic reviews need to search multiple bibliographic databases such as MEDLINE and EMBASE; however, they have no rational stopping rule when looking for potentially relevant articles. One team empirically tested a stopping rule based on the concept of capture-mark-recapture (CMR), first pioneered in ecology. The principles of CMR can be adapted to systematic reviews and meta-analyses to estimate the Horizon of articles in the literature, with a confidence interval. The team retrospectively tested this Horizon Estimation using a systematic review of randomized controlled trials (RCTs) that evaluated clinical decision support tools for osteoporosis disease management. The Horizon Estimation was calculated from the 4 bibliographic databases that served as the main data sources for the review, in the following order: MEDLINE, EMBASE, CINAHL, and EBM Reviews. The systematic review captured 68% of the articles estimated to exist across the 4 data sources, leaving an estimated 592 articles missing from the Horizon.
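
To make the capture-mark-recapture idea concrete, the Python sketch below applies the two-source, Chapman-corrected Lincoln-Petersen estimator to hypothetical retrieval counts from two databases. The studies summarized above used four databases and Poisson regression models, so this is only a simplified illustration of the underlying logic; all counts here are invented.

  # Chapman-corrected Lincoln-Petersen estimate of the total number of
  # relevant records (the "Horizon") from two overlapping database searches.
  def chapman_estimate(n1, n2, m):
      """n1: records found in database 1
         n2: records found in database 2
         m:  records found in both (the 'recaptured' overlap)"""
      return (n1 + 1) * (n2 + 1) / (m + 1) - 1

  n_medline, n_embase, overlap = 180, 150, 90   # hypothetical retrieval counts
  horizon = chapman_estimate(n_medline, n_embase, overlap)
  found = n_medline + n_embase - overlap        # unique records actually retrieved

  print(f"Estimated horizon: {horizon:.0f} records")
  print(f"Retrieved {found} unique records ({found / horizon:.0%} of the estimated horizon)")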

