Methods Guide – Chapter – Jan. 5, 2011
Finding Evidence for Comparing Medical Interventions
This is a chapter from the methods guide, "Methods Guide for Effectiveness and Comparative Effectiveness Reviews."
This report has also been published in edited form: Relevo R, Balshem H. Finding evidence for comparing medical interventions: AHRQ and the Effective Health Care Program. J Clin Epidemiol 2011 Nov;64(11):1168-77. PMID: 21684115.
Table of Contents
- Key Points
- Regulatory and Clinical Trials Searching
- Scientific Information Packets: Requests to Industry
- Developing the Published Literature Search
- Strategies for Finding Observational Studies
- Specialized Database Searching
- Using Key Articles
- Hand Searching Journals
- Corresponding With Researchers
- Updating and Reporting the Search Strategy
- Concluding Remarks
- Appendix A. Techniques for Observational Studies and/or Harms
- Appendix B. Specialized Databases
- Appendix C. CONSORT-Style Flow Diagram of Literature Search, Annotated
Comparative Effectiveness Reviews are systematic reviews of existing research on the effectiveness, comparative effectiveness, and harms of different health care interventions. They provide syntheses of relevant evidence to inform real-world health care decisions for patients, providers, and policymakers. Strong methodologic approaches to systematic review improve the transparency, consistency, and scientific rigor of these reports. Through a collaborative effort of the Effective Health Care (EHC) Program, the Agency for Healthcare Research and Quality (AHRQ), the EHC Program Scientific Resource Center, and the AHRQ Evidence-based Practice Centers have developed a Methods Guide for Effectiveness and Comparative Effectiveness Reviews. This Guide presents issues key to the development of Comparative Effectiveness Reviews and describes recommended approaches for addressing difficult, frequently encountered methodological issues.
The Methods Guide for Effectiveness and Comparative Effectiveness Reviews is a living document and will be updated as further empirical evidence develops and our understanding of better methods improves. Comments and suggestions on the Methods Guide for Effectiveness and Comparative Effectiveness Reviews and the Effective Health Care Program can be made at www.effectivehealthcare.ahrq.gov.
This document was written with support from the Effective Health Care Program at AHRQ.
None of the authors has a financial interest in any of the products discussed in this document.
Suggested citation: Relevo R, Balshem H. Finding Evidence for Comparing Medical Interventions. Agency for Healthcare Research and Quality; January 2011. Methods Guide for Comparative Effectiveness Reviews. AHRQ Publication No. 11-EHC021-EF. Available at http://effectivehealthcare.ahrq.gov/.
Rose Relevo, M.L.I.S., M.S., AHIP1
Howard Balshem, M.S.1
1Scientific Resource Center, AHRQ Effective Health Care Program, Oregon Health & Science University, Portland, OR.
The views expressed in this paper are those of the authors and do not represent the official policies of the Agency for Healthcare Research and Quality, the Department of Health and Human Services, the Department of Veterans Affairs, the Veterans Health Administration, or the Health Services Research and Development Service.
Key Points
- A librarian or other expert searcher should be involved in the development of the search.
- Sources of grey literature, including regulatory data, clinical trial registries, and conference abstracts, should be searched in addition to bibliographic databases.
- Requests should be made to industry for additional sources of unpublished data.
- For the main published literature search, more than one bibliographic database needs to be searched.
- Searches should be carefully documented and fully reported.
While this article both describes and advises on the process of literature searching in support of comparative effectiveness reviews (CERs) for the Effective Health Care Program, it does not address searching for previously published systematic reviews, which is discussed in other articles in this series.1,2
Searches to support systematic reviews often require judgment calls about where to search, how to balance recall and precision, and when the point of diminishing returns has been reached. Searchers with more experience with complex search strategies are better equipped to make these decisions.3 A number of reviews of the quality of systematic reviews suggest that those reviews that employed a librarian or other professional searcher had better reporting of and more complex search strategies.4-6
Table 1 describes the various search activities discussed in this paper and identifies who is responsible for performing each of these tasks. As is evident from the table, the EPC conducting the review is responsible for most of these activities. Because the EPC is involved in the development of the Key Questions, is familiar with the literature, and consults with experts regarding studies relevant to the topic, the EPC is in the best position to develop the required search strategies. Because grey literature can provide primary documents to verify published results, EPCs should routinely search regulatory data, clinical trial registries, and conference papers and abstracts for all CERs. This has been a centralized activity conducted by the Scientific Resource Center (SRC), but is now an activity conducted by the EPCs. However, one aspect of the search strategy benefits from centralization. Centralizing the request to drug and device manufacturers for data on their products—what we call the Scientific Information Packet (SIP)—ensures that all requests to industry are conducted in the same manner; this also minimizes or eliminates contact between manufacturers and the EPC involved in writing the report.
Table 1. Search activities and responsible parties

| Activity | Sources | Who does it |
| --- | --- | --- |
| Key Questions and analytic framework | n/a | Evidence-Based Practice Center |
| Grey literature search | Clinical trial registries; regulatory data; conference papers and abstracts | Evidence-Based Practice Center |
| Scientific Information Packets | Manufacturers of products under review | Scientific Resource Center |
| Main literature search | MEDLINE (plus in-process and other un-indexed citations); Cochrane Central Register of Controlled Trials | Evidence-Based Practice Center |
| Specialized database search | Variable (see Appendix B) | Evidence-Based Practice Center |
| Forward citation search | Scopus; Web of Science | Evidence-Based Practice Center |
| Backwards citations (reading references) | Results of main literature search | Evidence-Based Practice Center |
| Hand search | Targeted journals | Evidence-Based Practice Center |
| Corresponding with researchers | Publication authors | Evidence-Based Practice Center |
Regulatory and Clinical Trials Searching
In addition to searching for studies that have been formally published (as described below), a comprehensive search will include a search of the grey literature.7,8 Grey literature is defined as, “that which is produced on all levels of government, academics, business and industry in print and electronic formats, but which is not controlled by commercial publishers.”9 Grey literature can include abstracts presented at conferences, unpublished trial data, government documents, or manufacturer information. Grey literature is, by definition, not systematically identified, stored, or indexed and therefore it can be difficult to locate.
The primary goal of the grey literature search is to identify and overcome publication and reporting bias.10,11 Published literature does not always accurately represent trial results. Often, only articles with positive results are published, while those with “null” or negative results are not. And even when studies are published, reporting can be biased in many other ways. Systematic reviews and meta-analyses based solely on published literature will therefore tend to exaggerate estimates of effectiveness. McAuley et al.12 showed an exaggeration of 12 percent when grey literature was excluded, and Hopewell et al.13 found a 9 percent exaggeration.
The usefulness of the grey literature naturally varies by topic, but it is particularly helpful in areas where there is little published evidence, where the field or intervention is new or changing,14 when the topic is interdisciplinary,15 and with alternative medicine.16,17
Despite these reasons to include grey literature, there are also potential problems. From a practical standpoint, grey literature is the least efficient body to search18 and may not turn up more evidence to evaluate. Even if grey literature is located it may be of low quality or may not contain usable data.19 Often unpublished studies are (or at least are perceived to be) of lower quality,17,20 although there is limited evidence to support this.13
Because we have found them to be the most useful for identifying primary documents to compare with published results, EPCs routinely search the following three types of grey literature for all CERs: regulatory data, clinical trial registries, and conference papers and abstracts.
Regulatory Information
The approval process for new drugs and devices involves submission to the Food and Drug Administration (FDA) of data that may not be published elsewhere. These approval documents—which can be found at Drugs@FDA—may help identify publication bias even when complete methodological details of unpublished trials are not available.21,22 This information is not available prior to a drug’s approval and may be redacted. When approval documents are available, reviewers can compare results of published and unpublished trials, identify inconsistencies, and often find additional data. In one meta-analysis, investigators found that published trials reported larger estimates for the efficacy of quinine than did FDA documents.23 Similar discrepancies have been found by Turner24 for the efficacy of antidepressants.
The SRC identifies, for potential inclusion, all available medical and statistical reviews for all drugs under consideration, regardless of indication. This is partly because it is difficult to distinguish specific indications in the database, but also because the actual clinical data within the reviews may cover more than one indication, and harms data are of importance regardless of indication. In addition to searching for regulatory documents from the FDA, the SRC also searches the Health Canada Drug Products Database25 and the European Medicines Agency's European Public Assessment Reports.26
Trial Registries
Online trial registries such as ClinicalTrials.gov may include results of completed but unpublished clinical trials. In a prospective study of two systematic reviews, Savoie27 found trial registries to be useful in identifying studies eligible for inclusion in systematic reviews; registries were more sensitive sources than scanning references, hand searching, and personal communication. Trial registries can be helpful in identifying otherwise unreachable trials and in providing additional details about trials that have been published. Mathieu has found that selective outcome reporting is prevalent when trial registry information is compared with published results.28 Even without results, knowledge that a trial exists can be helpful to reviewers because the principal investigator can be contacted for more information.13 The FDA Amendments Act of 2007 mandates the expansion of ClinicalTrials.gov to include results of completed trials of approved drugs and devices. The results database now contains 2,279 entries, 1,958 of them from industry.29 Although ClinicalTrials.gov contains both completed and ongoing trials, we search only for completed trials, as those are the only trials that could have data for inclusion in a systematic review. In addition to ClinicalTrials.gov, we routinely search the following trial registries: Current Controlled Trials,30 Clinical Study Results,31 and the WHO International Clinical Trials Registry Platform.32
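Registry searches can often be scripted. The sketch below builds a query URL restricted to completed studies; note that the ClinicalTrials.gov v2 REST endpoint and parameter names are assumptions for illustration (the interface has changed since this chapter was written) and should be checked against the current API documentation before use.

```python
from urllib.parse import urlencode

def completed_trials_url(condition, intervention):
    """Build a ClinicalTrials.gov query limited to completed studies.

    The v2 endpoint and parameter names below are assumptions for
    illustration; verify them against the current API documentation.
    """
    base = "https://clinicaltrials.gov/api/v2/studies"
    params = {
        "query.cond": condition,
        "query.intr": intervention,
        # Only completed trials can contribute data to a review.
        "filter.overallStatus": "COMPLETED",
        "pageSize": 100,
    }
    return base + "?" + urlencode(params)

url = completed_trials_url("major depressive disorder", "sertraline")
```

The same status filter mirrors the policy stated above: ongoing trials are excluded because they cannot yet supply data for inclusion.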
Abstracts and Conference Proceedings
Finally, abstracts and conference proceedings should be searched because results presented at conferences often never appear as full publications,33,34 and the formally published results often differ from the preliminary data presented in abstracts.19,34 The SRC searches general databases of conference proceedings routinely and may search specific meetings as suggested by EPCs and key informants.
Scientific Information Packets: Requests to Industry
When interventions identified in key questions involve drugs or devices (or other products for which a manufacturer can be identified), it is important to supplement the literature search with a request to the manufacturer for a SIP. The SIP includes information about products available from the product label, as well as information about published and unpublished trials or studies of the product. Requests for SIPs should not be confused with specific requests to authors of publications for clarification of data or for additional information. The latter are ad hoc scientist-to-scientist communications and represent a different activity from the systematic request for SIPs from industry.
SIPs are important for two reasons. One is to overcome publication bias by identifying trials that remain unpublished. Manufacturers are not required to report results of studies of products marketed before 2008 to ClinicalTrials.gov, and so information on these studies may not be found when searching this data source. SIPs may also inform researchers about soon-to-be-published studies so that they can be included in the review without waiting for formal publication. A second reason for requesting SIPs is that they provide an explicit and transparent opportunity for drug and device manufacturers to be actively involved in the CER and to provide data the manufacturer believes is important to the review of the topic. As noted above, to ensure consistency in the way SIPs are requested and to ensure transparency by eliminating contact between the EPC conducting the review and the manufacturers of products being reviewed, the SRC requests SIPs from manufacturers on behalf of the EPCs for all CERs and technical briefs.
Developing the Published Literature Search
The published literature search for a CER must begin with the concepts identified in the analytic framework and key questions that define the scale and scope of a project. The development of the key questions, the scope of the review, and the analytic framework is a formalized process undertaken by the systematic review team at an EPC.2 Librarian involvement in the initial stages of the process, including reading the background materials that are prepared as the topic is developed, is an essential first step to understanding the key questions and crafting a pilot search. The searcher responsible for the main literature search is a member of the research team at the EPC performing the search. The analytic framework developed in the scoping explicitly describes both the relevant clinical concepts, as well as the logic underlying the mechanisms by which an intervention may improve health outcomes. Searchers should utilize the analytic framework to build queries for specific elements of the framework.
One thing to keep in mind while developing the search for a CER is that the retrieved results will be reviewed by hand with explicit inclusion and exclusion criteria dictated by the key questions and scope of the report. We recommend that the search be developed in tandem with these criteria.10 Many aspects of the key question may not be adequately addressed in the search because index terms for the relevant concepts are poor or nonexistent.35 While developing the search, if there are concepts that are difficult to articulate using search criteria alone, be sure to specify that these aspects need to be addressed in the inclusion and exclusion criteria.
The results of the pilot search can be used to help resolve questions about the boundaries of the key questions. Checking the indexing of known relevant articles provided by experts or found via reference lists can suggest additional terms and concepts that can be added to the strategy to improve its effectiveness.35,36
In the development of the main bibliographic search, we recommend the use of any validated hedges (filters) that exist for any of the concepts.37-39 Hedges are predefined search strategies designed to retrieve citations based on criteria other than the subject of the article, such as study methodology, or to identify papers dealing with harms.39 Using hedges will save the work of developing the search from scratch and add a level of consistency to the Effective Health Care Program’s CERs. One set of hedges is the clinical queries developed by Haynes et al. for MEDLINE.40 Additional filters are available from the Cochrane Collaboration,41 the Scottish Intercollegiate Guidelines Network,42 and the InterTASC Information Specialists' Sub-Group.43 The Canadian Agency for Drugs and Technologies in Health (CADTH) has developed a pragmatic critical appraisal tool for search filters to assist expert searchers working on systematic review teams in judging the methodological quality of a search filter.44 For a comparison of filters designed to retrieve randomized controlled trials, see McKibbon et al.39
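As an illustration, a topic-specific query can be combined programmatically with a published methodology hedge rather than writing one from scratch. The filter text below is modeled on the Cochrane highly sensitive search strategy for retrieving randomized trials in PubMed; the exact current wording should be verified in the Cochrane Handbook before relying on it.

```python
# An RCT hedge modeled on the Cochrane highly sensitive strategy for
# PubMed (abbreviated here; verify the current published wording).
RCT_HEDGE = (
    "(randomized controlled trial[pt] OR controlled clinical trial[pt]"
    " OR randomized[tiab] OR placebo[tiab] OR randomly[tiab] OR trial[ti])"
    " NOT (animals[mh] NOT humans[mh])"
)

def with_rct_hedge(topic_query):
    """AND a topic query together with the RCT methodology hedge."""
    return f"({topic_query}) AND {RCT_HEDGE}"

q = with_rct_hedge("warfarin[tiab] OR anticoagulants[mh]")
```

Storing the hedge once and reusing it across reviews is one way to achieve the consistency the program aims for.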
Additionally, be sure to use advanced searching techniques as described in Sampson et al.'s 2008 Peer Review of Electronic Search Strategies.45 This is a tool developed for peer review of expert searches that can also be useful as a check of the search strategy. Items to consider are:
- Spelling errors
- Line errors—when searches are combined using line numbers, be sure the numbers refer to the searches intended
- Boolean operators used appropriately
- Search strategy adapted as needed for multiple databases
- All appropriate subject headings are used, appropriate use of explosion
- Appropriate use of subheadings and floating subheadings
- Use of natural language terms in addition to controlled vocabulary terms
- Truncation and spelling variation as appropriate
- Appropriate use of limits such as language, years, etc.
- Field searching, publication type, author, etc.
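The "line errors" item above lends itself to a mechanical check. The sketch below assumes a numbered strategy in which combination lines reference earlier sets as #n (the PubMed history convention); the function name is illustrative.

```python
import re

def check_line_references(strategy):
    """Flag 'line errors': any #n reference that does not point to an
    earlier line of the strategy. `strategy` is a list of search lines
    in the order they were run (line numbers are 1-based)."""
    errors = []
    for lineno, line in enumerate(strategy, start=1):
        for ref in re.findall(r"#(\d+)", line):
            if int(ref) >= lineno:  # must refer to a preceding line
                errors.append((lineno, int(ref)))
    return errors

strategy = [
    "exp Hypertension/",
    "hypertens*.ti,ab.",
    "#1 OR #2",
    "#3 AND #5",   # mistake: set 5 does not exist yet
]
issues = check_line_references(strategy)
```

A check like this catches only dangling references; whether the combined sets are the ones intended still requires human review.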
Although many of the items on the list are self-explanatory, some need further clarification. Use of both natural language and indexing terms is essential for a comprehensive search.37,46 Indexing is an important tool, but it often fails for any of the following reasons: lag time of indexing, inappropriate indexing, lack of appropriate indexing terms, or changes in indexing terms over time. Using only controlled vocabulary will miss any in-process citations in MEDLINE. As these represent the most recently published articles, it is important to include natural language searching to retrieve them. When using natural language terms, be sure to check for spelling errors, use truncation, and be aware of spelling variants such as anaemia, oesophagus, and paralyse.
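The truncation and spelling-variant advice can be sketched as a small helper. The variant table below is illustrative and far from exhaustive; real strategies should be checked term by term.

```python
# Pairs of American/British stems, written as stems so they can be
# paired with truncation. Illustrative only, not a complete list.
VARIANT_PAIRS = [
    ("anemia", "anaemia"),
    ("esophag", "oesophag"),
    ("paralyz", "paralys"),
]

def expand_textwords(stems):
    """Return truncated textword terms covering known spelling variants."""
    terms = []
    for stem in stems:
        terms.append(stem + "*")
        for us, uk in VARIANT_PAIRS:
            if stem.startswith(us):
                terms.append(stem.replace(us, uk, 1) + "*")
            elif stem.startswith(uk):
                terms.append(stem.replace(uk, us, 1) + "*")
    return terms

terms = expand_textwords(["esophageal", "anemia"])
```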
Although the use of limits such as date ranges or age ranges may help improve the efficiency of the search, we don’t recommend the use of the English language limit. Although the resources available to read or translate non-English language full text articles will vary, English language abstracts are usually available for reviewers to make an initial assessment of the study. Routinely limiting searches to English risks producing biased results.47
We recommend the use of a bibliographic management software package such as EndNote or RefWorks to keep track of the results.10 We have no recommendation on specific software; however, Hernandez et al.48 describe many currently available products. While many of these products have features that allow searches to be performed in databases such as MEDLINE from within the software itself, we do not recommend the use of these features, as they do not allow the complex searches needed for CERs.49
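When exports from several databases are merged in bibliographic management software, duplicate records must be removed. A minimal sketch, assuming generic exported records as Python dictionaries (matching on PMID when present, otherwise on a normalized title) rather than any particular package's schema:

```python
import re

def normalize_title(title):
    """Lowercase and strip punctuation so near-identical titles match."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

def deduplicate(records):
    """Keep the first copy of each record seen across database exports."""
    seen_pmids, seen_titles, unique = set(), set(), []
    for rec in records:
        pmid = rec.get("pmid")
        key = normalize_title(rec.get("title", ""))
        if pmid and pmid in seen_pmids:
            continue
        if not pmid and key in seen_titles:
            continue
        if pmid:
            seen_pmids.add(pmid)
        seen_titles.add(key)
        unique.append(rec)
    return unique

records = [
    {"pmid": "21684115", "title": "Finding evidence for comparing medical interventions"},
    {"pmid": "21684115", "title": "Finding evidence for comparing medical interventions."},
    {"title": "Finding Evidence for Comparing Medical Interventions"},  # second database, no PMID
]
unique = deduplicate(records)
```

Title-only matching is deliberately conservative here; commercial reference managers use richer matching (authors, year, journal) and should be preferred for production use.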
Strategies for Finding Observational Studies
CERs emphasize the use of randomized controlled trials when they are available, as this study design is least susceptible to bias and can produce high quality evidence. However, CERs include a broad range of types of evidence to confirm pertinent findings of trials and to assess gaps in the trial evidence.50 A common use of observational studies is to compare results of trials with results in broader populations or in everyday practice.51
Searches for observational studies should always be included in reviews when harms and adverse effects are studied, or if the topic itself is unlikely to have been studied with randomized controlled trials.52 For the most part, the decision on how to include observational studies will be made as the topic is being developed and is driven by the formulation of key questions and inclusion and exclusion criteria.53 Unfortunately there is little empirical evidence on how best to approach a systematic search for observational studies.54-56 In the absence of evidence the following is advice based on the consensus of the Cochrane Adverse Effect Methods Group57 and other experts.58,59
A search for adverse effects should be more inclusive than a search for effectiveness.53 While a search for studies about effectiveness would include only studies of the indication of interest, harms data should not be limited in this way; data about harms are of interest regardless of indication. The targeted search for adverse effects is best accomplished by combining the intervention search with terms to identify harms, without limiting to any particular study type.51,54
Golder et al.60 describe a number of approaches to search strategies for harms in both EMBASE and MEDLINE. In general, remember to use textwords and MeSH headings, as well as floating subheadings, to identify adverse effects.51 Because most hedges for adverse effects were designed within the context of a specific report, they may need to be adapted for new topics. For example, a term such as “adverse drug reaction” would not be appropriate for nondrug interventions. Appendix A contains specific examples of these techniques and hedges.
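As an illustration of combining an intervention search with a harms concept in Ovid-style syntax: the floating subheadings and textwords below are examples of the techniques just described, not a validated filter, and would need adapting for nondrug topics.

```python
# Floating subheadings: ae (adverse effects), co (complications),
# de (drug effects), to (toxicity). Illustrative, not validated.
HARMS_SUBHEADINGS = "(ae or co or de or to).fs."
HARMS_TEXTWORDS = "(safe or safety or side effect* or adverse or harm*).ti,ab."

def harms_search(intervention_line):
    """AND an intervention search line with the harms concept,
    deliberately without any study-design limit."""
    harms = f"({HARMS_SUBHEADINGS} or {HARMS_TEXTWORDS})"
    return f"({intervention_line}) and {harms}"

line = harms_search("exp Anticoagulants/ or warfarin.ti,ab.")
```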
Observational Studies in Other Situations
It can be challenging to search for observational studies because there are many designs and vocabulary is not used consistently.56 Furlan et al.,61 Fraser et al.,58 and the SIGN group62 have all explored hedges for retrieving observational studies. While they have not been validated outside of the reviews they were designed for, they offer a starting point for developing a strategy suited to the topic of the review and are described in detail in Appendix A.
While it is currently difficult to construct searches for observational studies, in the future, improved reporting and improved indexing may make it possible to develop standardized generic hedges that would be appropriate for systematic reviews. The STROBE statement59,63 gives specific advice for the reporting of observational studies, which is a necessary first step to more accurate indexing and retrieval of observational studies.
Specialized Database Searching
While the Cochrane Central Register of Controlled Trials and MEDLINE are necessary for a thorough search of the literature, they are hardly sufficient.64 Many topics of interest to the Effective Health Care Program are interdisciplinary in nature and are concerned with more than strictly biomedical sciences. It is common, for example, to search databases such as CINAHL or PsycINFO for topics related to nursing or mental health, respectively. Failure to search multiple databases risks biasing the CER to the perspective of a single discipline and, because there is often little overlap between different databases,46,65,66 has a high risk of missing studies that would affect the outcome of a systematic review. Sampson et al.67 investigated the effect of such failure on meta-analyses and found that the intervention effect was increased by an average of 6 percent when only those studies found using MEDLINE were used.
When performing additional database searches, adapt search terms for each database. While keeping the conceptual structure of the original search, review the controlled vocabulary headings for each database to identify appropriate terms. Often headings that have similar scopes or definitions may vary slightly in the terminology used or differ in granularity from one database to another. Finally, keep in mind that search syntax will be different with every database, so be sure to review each database’s unique syntax before performing the search.68 Many of the more specialized databases do not have the advanced search interfaces needed to conduct complex searches, thus the searches need to be simplified. The loss in precision from the simplified search is often made up for by the fact that the databases contain a smaller number of citations, so the absolute number of citations needed to be screened—even with a simplified search—is often small.
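A toy example of why syntax must be reviewed for each database: even translating two PubMed field tags into Ovid MEDLINE syntax takes care, and the mapping below is deliberately incomplete (it handles only single-word subject headings).

```python
import re

def pubmed_to_ovid(query):
    """Translate a small subset of PubMed field tags to Ovid syntax.

    Handles only [tiab] and single-word [mh] headings; everything else
    is left untouched, which is exactly why per-database review of the
    full strategy is still needed.
    """
    # term[tiab] -> term.ti,ab.
    query = re.sub(r"([\w*]+)\[tiab\]", r"\1.ti,ab.", query)
    # heading[mh] -> exp heading/
    query = re.sub(r"(\w+)\[mh\]", r"exp \1/", query)
    return query

ovid = pubmed_to_ovid("warfarin[tiab] OR anticoagulants[mh]")
```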
Finally, it is always helpful to ask key informants if they know of any databases specific to the topic of interest. Consult Appendix B for a listing of possible databases of interest.
Using Key Articles
Consultation with experts will identify key articles, and these can be an important resource. If these key articles were not identified in the initial search, investigate why. By looking at the indexing terms applied to key articles, additional search terms can be identified.35,36 Additionally, citation tracking—looking at both forward and backward citations of these key articles—can be invaluable for identifying studies.
Citation Tracking—Forward Citations
Citation tracking is an important way to identify articles because it relies on the author’s choice to cite an article rather than keywords or indexing.69 Therefore, citation tracking often identifies unique and highly relevant items. It can also be an efficient way of locating subsequent and tertiary articles stemming from a landmark trial, as these studies will all cite the original trial.
The Web of Science (which includes the Science Citation Index) is the original citation tracking service. In recent years, a number of other citation tracking databases have become available, including Google Scholar,70 Scopus,71 PubFocus,72 and PubReMiner.73 In addition, many publishers offer citation tracking within the pools of journals they publish.
While all citation tracking databases reveal who cited what, there is considerable variability in their coverage and search interfaces. Databases differ both in the number of journals included as well as the number of years that are tracked, with Web of Science covering more years than the others.74
Recent comparisons of Scopus, Web of Science, and Google Scholar found that there were unique items located with each source75,76 and that the amount of overlap varied considerably depending on the topic of interest.74,77 Because the variation between databases is sensitive to the topic being researched, it is difficult to determine beforehand which database would be most fruitful based on content coverage alone. The decision of what database to use for citation tracking will likely be driven by more pragmatic differences between databases such as cost, availability, and search interfaces.
Web of Science and Scopus are both subscription-based services. If access is available to either of these databases, we recommend their use as they have the most developed search and export interfaces. Free citation tracking databases include: PubReMiner, PubFocus, and Google Scholar. Of these, we recommend Google Scholar for its broader coverage and superior interface. Google Scholar offers the ability to download citations into bibliographic management software as well as to link through to full-text with Google Scholar’s Scholar Preferences settings.
Although many publishers offer citation tracking within the set of journals that they publish, we do not recommend their use because the citations are limited to results from that single publisher. Similarly, we do not recommend the “find citing articles” feature of OVID Medline, as that is restricted to journals available from Journals@OVID and does not represent all forward citations.
Reading References—Backward Citations
In addition to finding what articles have cited key studies, articles the key study has cited are a valuable resource. Sources of grey literature such as conference proceedings or poorly indexed journals relevant to the key questions are often discovered in this manner.
Reading the references of key articles is standard practice for systematic reviews78,79 although this practice has not been systematically evaluated for effectiveness.80 This step is often performed by the researchers tasked with reading the full text of studies and abstracting data. Since these people are often not the same people doing the literature searching, it is important to make sure that they communicate with each other during this process so that insights are not lost. We recommend that any articles that are identified through the reading of references be reviewed by the librarian conducting the search to examine why the original search strategy did not identify the article in question.
Often key articles are previous systematic reviews. The decision on when and how to use an existing review’s search strategy and references is part of a larger question on how to utilize existing systematic reviews;1 searchers should work closely with the review team to determine how to approach the use of previously published systematic reviews.
Related Articles Algorithms
Another way to use key articles is as a starting point for “related article” algorithms. Many databases offer a link to “related articles.”37 These links can be helpful in the preliminary, exploratory, and scoping stages of a search. However, we do not recommend them for the formal part of the search for a CER; it is difficult to be systematic about and report on these types of searches, and generally, they are impossible to reproduce.
Hand Searching Journals
Not all journals of interest will be indexed by the databases searched; often, abstracts, supplements, and conference proceedings are not indexed, even if the rest of the content of a journal is. Because many studies first appear (or only appear) in these nonindexed portions of a journal, hand searching journals can be an effective method for identifying trials.
We recommend that journals be hand searched if they are highly relevant to the topic of the report but are not fully indexed35,81,82 or not indexed at all by MEDLINE.83 Articles are often missed by the initial search strategy because the journal in which they appear is poorly indexed. Asking key informants about specific journals or conferences related to the topic is another way to identify candidates for hand searching.84,85
Hand searching does not necessarily mean searching the print journal (although that may be appropriate in some cases). Now that tables of contents and abstracts are often available electronically, hand searching can be done online by systematically reviewing a journal's content on an issue-by-issue basis. A more focused hand search may limit the number of years searched, or focus only on supplements or conference abstracts.
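Where a journal's contents are available electronically, the issue-by-issue review can be seeded from a bibliographic database. As a rough sketch (the journal name and year are arbitrary examples; this only surfaces what PubMed indexes, so truly nonindexed supplements still require checking the journal itself), one can build an E-utilities query that pulls every record for one journal in one year:

```python
from urllib.parse import urlencode

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def build_journal_search_url(journal, year, retmax=200):
    """Build a PubMed E-utilities esearch URL retrieving every record
    for one journal in one year, for issue-by-issue review."""
    term = f'"{journal}"[Journal] AND {year}[pdat]'
    return ESEARCH + "?" + urlencode({"db": "pubmed", "term": term,
                                      "retmax": retmax})

# Hypothetical example: fetching this URL returns the record set to
# review; supplements and conference abstracts may still be absent
# if PubMed does not index those portions of the journal.
url = build_journal_search_url("J Med Libr Assoc", 2009)
```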
Corresponding With Researchers
During the course of preparing a CER, it may be necessary to contact investigators and authors.
E-mail makes author correspondence quite easy. Gibson et al.92 found that the response rate to e-mail was higher than to postal mail. Aside from the usual Google search, e-mail addresses can be found on the author's institutional Web site. PubMed is also a good source of e-mail addresses, as they are often included in the author affiliation field shown in the abstract display.
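Because affiliation fields are free text, a simple pattern match can pull candidate addresses out of exported records. A minimal sketch (the sample affiliation string is invented):

```python
import re

# Loose e-mail pattern; PubMed often appends a period to the address.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def extract_emails(affiliation: str) -> list[str]:
    """Return e-mail addresses found in a free-text affiliation field,
    stripping any trailing period."""
    return [m.rstrip(".") for m in EMAIL.findall(affiliation)]

affil = ("Department of Medicine, Example University, "
         "Anytown, USA. jane.doe@example.edu.")
extract_emails(affil)  # → ['jane.doe@example.edu']
```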
Updating and Reporting the Search Strategy
While conducting the search, be sure to take detailed notes; these will be useful for reporting as well as for rerunning the search in the future. EPC Program policy requires saving the main bibliographic searches so that they can be rerun at the time the draft is sent for peer review. In addition, detailed notes about the full search strategy should be kept in order to report the search strategy accurately in the review. Transparency and reproducibility of the systematic review require clear reporting;93 critical appraisal is impossible if the search strategy is not thoroughly reported.94
Unfortunately, there is no consensus on how to report search strategies in systematic reviews. Sampson et al.94 identified 11 instruments, some specific to search strategy reporting and others more global reporting instruments that include elements addressing the search strategy. From these 11 instruments, they extracted the following elements:
- Database used
- Dates covered by the search
- Date search was conducted
- Statement of the search terms used
- Statement of any language restrictions
- Statement of nondatabase methods used
- Additional inclusion/exclusion criteria
- Presentation of the full electronic search strategy
- Statement of any publication status restrictions
- Platform or vendor for electronic database
- End date of the search
- List of excluded references
- Qualifications of the searcher
- Is the reported strategy repeatable?
- Number of references identified
- CONSORT-style flow diagram or other accounting for all references
- Evidence of effectiveness of the search strategy
- Statement of any methodological filters used
- Description of the sampling strategy
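Capturing these elements in a structured form from the start makes them easy to carry into the report and its appendices. A minimal sketch, with field names of our own choosing rather than any standard schema:

```python
from dataclasses import dataclass, field, asdict

@dataclass
class SearchRecord:
    """One database search, capturing several of the reporting
    elements listed above. Field names are illustrative only."""
    database: str        # database used, e.g. "MEDLINE"
    platform: str        # platform or vendor, e.g. "Ovid"
    date_run: str        # date the search was conducted
    dates_covered: str   # dates covered by the search
    strategy: list[str]  # full line-by-line electronic search strategy
    filters: list[str] = field(default_factory=list)  # methodological filters/hedges
    limits: list[str] = field(default_factory=list)   # language, publication status
    results: int = 0     # number of references identified

# Invented example; asdict() gives a form easy to dump into an appendix.
search = SearchRecord(
    database="MEDLINE", platform="Ovid",
    date_run="2010-09-21", dates_covered="1950 to September Week 2 2010",
    strategy=["1 exp Hypertension/", "2 limit 1 to humans"],
    limits=["English"], results=1432)
record = asdict(search)
```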
The CONSORT-style flow diagram refers to a chart that accounts for all citations identified from all sources as well as accounting for all citations that were later excluded and why.95,96 See Appendix C for an annotated example.
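The bookkeeping behind such a diagram is simple arithmetic: every citation identified must end up either removed as a duplicate, excluded for a stated reason, or included. A sketch with invented counts:

```python
def flow_counts(identified_by_source, duplicates, excluded_by_reason, included):
    """Verify that a CONSORT-style flow diagram accounts for every citation:
    identified - duplicates = screened, and screened = excluded + included."""
    identified = sum(identified_by_source.values())
    screened = identified - duplicates
    excluded = sum(excluded_by_reason.values())
    assert screened == excluded + included, "flow diagram does not balance"
    return {"identified": identified, "screened": screened,
            "excluded": excluded, "included": included}

# Invented counts for illustration:
counts = flow_counts(
    identified_by_source={"MEDLINE": 1200, "Embase": 800, "hand search": 25},
    duplicates=425,
    excluded_by_reason={"wrong population": 900, "wrong study design": 650},
    included=50)
```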
Another element that falls outside of the basic mechanics of the search is evidence of the effectiveness of the search strategy.94 The evidence of the effectiveness of the search strategy may be difficult to ascertain conclusively. However, reporting what techniques were used to check a strategy—such as expert review, use of previously published strategies or hedges, or testing against a group of known relevant articles (for example, from a previous review)—may be helpful.
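Testing against a group of known relevant articles amounts to computing the recall of the candidate strategy against that gold set. A sketch with invented record identifiers:

```python
def recall(retrieved, known_relevant):
    """Fraction of a gold set of known relevant records (for example, the
    included studies of a previous review) captured by a candidate strategy,
    plus the records it missed, which are candidates for strategy revision."""
    known = set(known_relevant)
    found = known & set(retrieved)
    missed = sorted(known - found)
    return len(found) / len(known), missed

# The strategy retrieved four records; three records are known relevant.
rate, missed = recall(
    retrieved=["111", "222", "333", "555"],
    known_relevant=["111", "222", "444"])
# rate is 2/3; record "444" goes back to the librarian: why was it missed?
```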
With the lack of consensus on reporting, it is hardly surprising that current reporting of search strategies for systematic reviews is variable. In a recent review, Yoshii et al.93 provided a good overview of studies of the reporting of search strategies in systematic reviews; they also examined the reporting in Cochrane reviews. Reporting of search strategies is an area of systematic review methodology that can be improved, and the problems with poor reporting go beyond being unable to reproduce a search or build on it for updates. There is very little evidence on the effectiveness of various search strategies for CERs, and primary research is needed to identify the characteristics of valid searches.94 Such research is currently difficult to conduct precisely because reporting is so poor; completely reported search strategies will build the evidence base from which effective search techniques can be identified.
In the absence of reporting standards, we recommend working with the team writing the report to determine what to report in the review. Page limitations of journal publications may necessitate abbreviating the reporting in journal publications, but there is always room for complete reporting in the online appendices of the CER that are posted to the Effective Health Care Web site or included with the e-published version of the journal article.
One of the most difficult aspects of conducting a comprehensive search is knowing with confidence when to stop searching, and unfortunately there is little guidance on how to determine that point. Spoor et al.97 suggest capture-mark-recapture statistical modeling to retrospectively estimate how close a search came to capturing the total body of literature, but there is currently no tool that can easily be applied to searches for CERs. In the end, we rely on experienced searchers' judgment as to whether the labor of searching additional sources is likely to yield new and unique items or whether the search has reached the point of saturation. As with other decisions, such as the sensitivity of the search, the desire for comprehensiveness must be balanced against available resources.
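The capture-mark-recapture idea can be illustrated with the Lincoln-Petersen estimator (shown here with the Chapman small-sample correction), treating two independently developed searches as two "captures"; the counts below are invented:

```python
def chapman_estimate(n1, n2, overlap):
    """Chapman-corrected Lincoln-Petersen estimate of the total number of
    relevant articles, given two independent searches that found n1 and n2
    relevant records, with `overlap` records found by both."""
    if overlap == 0:
        raise ValueError("no overlap between searches; estimate undefined")
    return (n1 + 1) * (n2 + 1) / (overlap + 1) - 1

# Invented counts: two searches find 40 and 30 relevant records, 20 in
# common, so 50 unique records are in hand out of an estimated ~59.5 total.
total = chapman_estimate(40, 30, 20)
```

When the estimate barely exceeds the number of unique records already found, additional searching is unlikely to yield much; when it is far larger, the search has probably not reached saturation.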
Much of the methodology described here is not yet evidence based, but rather rests on principles of expert searching and searcher experience. In order to develop more evidence-based methods, we must first have an evidence base to work with. Poor reporting of search strategies in comparative effectiveness and other systematic reviews has hindered evaluations of the effectiveness of various techniques. Clear reporting of search strategies, therefore, is the first step needed to support further research on the effectiveness of various search techniques.
Within the AHRQ Effective Health Care Program, searching lacks the type of quality control that is found in many other steps in the process of conducting CERs, such as dual abstraction and internal peer review. The SRC, therefore, has initiated projects such as peer review of search strategies and improved structures for communication and dissemination of techniques intended to identify best practices that will help librarians share expertise across EPCs.
- Whitlock EP, Lin JS, Chou R, et al. Using existing systematic reviews in complex systematic reviews. Ann Intern Med 2008 May 20;148(10):776-782.
- Whitlock EP, Lopez SA, Chang S, et al. Identifying, selecting, and refining topics for comparative effectiveness systematic reviews: AHRQ and the Effective Health Care program. J Clin Epidemiol 2009 Jun 18.
- Medical Library Association. Role of expert searching in health sciences libraries: Policy statement by the Medical Library Association adopted September 2003. J Med Libr Assoc 2005 Jan;93(1):42-44.
- McGowan J, Sampson M. Systematic reviews need systematic searchers. J Med Libr Assoc 2005 Jan;93(1):74-80.
- Golder S, Loke Y, McIntosh HM. Poor reporting and inadequate searches were apparent in systematic reviews of adverse effects. J Clin Epidemiol 2008 May;61(5):440-448.
- Mokkink LB, Terwee CB, Stratford PW, et al. Evaluation of the methodological quality of systematic reviews of health status measurement instruments. Qual Life Res 2009 Apr;18(3):313-333.
- Alberani V, De Castro Pietrangeli P, et al. The use of grey literature in health sciences: a preliminary survey. Bull Med Libr Assoc 1990 Oct;78(4):358-363.
- Illig J. Archiving "event knowledge:" bringing "dark data" to light. J Med Libr Assoc 2008 Jul;96(3):189-191.
- Grey Literature Network Service, editor. New frontiers in grey literature. Fourth International conference on Grey Literature; 1999 Oct 4-5; Washington, DC: GL'99 proceedings.
- Conn VS, Valentine JC, Cooper HM, et al. Grey literature in meta-analyses. Nurs Res [Review] 2003 Jul-Aug;52(4):256-261.
- Dickersin K, Scherer R, Lefebvre C. Identifying relevant studies for systematic reviews. BMJ [Review]. 1994 Nov 12;309(6964):1286-1291.
- McAuley L, Pham B, Tugwell P, et al. Does the inclusion of grey literature influence estimates of intervention effectiveness reported in meta-analyses? Lancet 2000 Oct 7;356(9237):1228-1231.
- Hopewell S, McDonald S, Clarke MJ, et al. Grey literature in meta-analyses of randomized trials of health care interventions. Cochrane Database of Systematic Reviews 2007;(2). Available at: http://www.mrw.interscience.wiley.com/cochrane/clsysrev/articles/MR000010/frame.html .
- Hartling L, McAlister FA, Rowe BH, et al. Challenges in systematic reviews of therapeutic devices and procedures. Ann Intern Med 2005 Jun 21;142(12 Pt 2):1100-1111.
- Helmer D, Savoie I, Green C, et al. Evidence-based practice: extending the search to find material for the systematic review. Bull Med Libr Assoc 2001 Oct;89(4):346-52.
- Shekelle PG, Morton SC, Suttorp MJ, et al. Challenges in systematic reviews of complementary and alternative medicine topics. Ann Intern Med 2005 Jun 21;142(12 Pt 2):1042-7.
- Egger M, Juni P, Bartlett C, et al. How important are comprehensive literature searches and the assessment of trial quality in systematic reviews? Empirical study. Health Technol Assess 2003;7(1):1-76.
- Cook AM, Finlay IG, Edwards AG, et al. Efficiency of searching the grey literature in palliative care. J Pain Symptom Manage 2001 Sep;22(3):797-801.
- Fergusson D, Laupacis A, Salmi LR, et al. What should be included in meta-analyses? An exploration of methodological issues using the ISPOT meta-analyses. Int J Technol Assess Health Care 2000 Autumn;16(4):1109-1119.
- van Driel ML, De Sutter A, De Maeseneer J, et al. Searching for unpublished trials in Cochrane reviews may not be worth the effort. J Clin Epidemiol 2009 Aug;62(8):838-844e3.
- Bennett DA, Jull A. FDA: untapped source of unpublished trials. Lancet 2003;361:1402-1403.
- MacLean CH, Morton SC, Ofman JJ, et al. How useful are unpublished data from the Food and Drug Administration in meta-analysis? J Clin Epidemiol 2003 Jan;56(1):44-51.
- Man-Son-Hing M, Wells G, Lau A. Quinine for nocturnal leg cramps: a meta-analysis including unpublished data. J Gen Intern Med 1998 Sep;13(9):600-606.
- Turner EH, Matthews AM, Linardatos E, et al. Selective publication of antidepressant trials and its influence on apparent efficacy. N Engl J Med 2008 Jan 17;358(3):252-260.
- Health Canada Drug Products Database. Health Canada. Available at: http://webprod.hc-sc.gc.ca/dpd-bdpp/index-eng.jsp . Accessed September 21, 2010.
- European Public Assessment Reports. European Medicines Agency. Available at: http://www.ema.europa.eu/ema/index.jsp?curl=pages/medicines.jsp&mid=WC0b01ac058001d124 . Accessed September 21, 2010.
- Savoie I, Helmer D, Green CJ, et al. Beyond MEDLINE: reducing bias through extended systematic review search. Int J Technol Assess Health Care 2003;19(1):168-178.
- Mathieu S, Boutron I, Moher D, et al. Comparison of registered and published primary outcomes in randomized controlled trials. JAMA 2009 Sep 2;302(9):977-984.
- U.S. National Institutes of Health. ClinicalTrials.gov. Available at: http://clinicaltrials.gov. Accessed September 21, 2010.
- Current Controlled Trials. BioMed Central. Available at: http://www.controlled-trials.com/ . Accessed September 21, 2010.
- Clinical Study Results. Available at: http://www.clinicalstudyresults.org/home/ . Accessed September 21, 2010.
- WHO International Clinical Trials Registry Platform. World Health Organization. Available at: http://apps.who.int/trialsearch/ . Accessed September 21, 2010.
- von Elm E, Costanza MC, Walder B, et al. More insight into the fate of biomedical meeting abstracts: a systematic review. BMC Med Res Methodol 2003 Jul 10;3:12.
- Toma M, McAlister FA, Bialy L, et al. Transition from meeting abstract to full-length journal article for randomized controlled trials. JAMA 2006 Mar 15;295(11):1281-1287.
- Matthews EJ, Edwards AG, Barker J, et al. Efficient literature searching in diffuse topics: lessons from a systematic review of research on communicating risk to patients in primary care. Health Libr Rev 1999 Jun;16(2):112-120.
- Brettle AJ, Long AF. Comparison of bibliographic databases for information on the rehabilitation of people with severe mental illness. Bull Med Libr Assoc 2001 Oct;89(4):353-362.
- O'Leary N, Tiernan E, Walsh D, et al. The pitfalls of a systematic MEDLINE review in palliative medicine: symptom assessment instruments. Am J Hosp Palliat Care 2007 Jun-Jul;24(3):181-184.
- Glanville JM, Lefebvre C, Miles JN, et al. How to identify randomized controlled trials in MEDLINE: ten years on. J Med Libr Assoc 2006 Apr;94(2):130-136.
- McKibbon KA, Wilczynski NL, Haynes RB, et al. Retrieving randomized controlled trials from medline: a comparison of 38 published search filters. Health Info Libr J 2009 Sep;26(3):187-202.
- Haynes RB, Wilczynski N, McKibbon KA, et al. Developing optimal search strategies for detecting clinically sound studies in MEDLINE. J Am Med Inform Assoc 1994 Nov-Dec;1(6):447-458.
- Cochrane Handbook for Systematic Reviews of Interventions : 6.4.11 Search filters 2008 updated September 2008; Version 5.0.1. Available at: http://www.cochrane-handbook.org . Accessed September 21, 2010.
- Scottish Intercollegiate Guidelines Network (SIGN). Search Filters. Edinburgh, updated August 3, 2009. Available at: http://www.sign.ac.uk/methodology/filters.html . Accessed September 21, 2010.
- InterTASC Information Specialists' Sub-Group. Search Filter Resource updated July 2, 2009. Available at: http://www.york.ac.uk/inst/crd/intertasc/diag.htm . Accessed September 21, 2010.
- Bak G, Mierzwinski-Urban M, Fitzsimmons H, et al. A pragmatic critical appraisal instrument for search filters: introducing the CADTH CAI. Health Info Libr J 2009 Sep;26(3):211-219.
- Sampson M, McGowan J, Lefebvre C, et al. PRESS: Peer Review of Electronic Search Strategies. Ottawa: Canadian Agency for Drugs and Technologies in Health; 2008.
- Conn VS, Isaramalai SA, Rath S, et al. Beyond MEDLINE for literature searches. J Nurs Scholarsh [Review] 2003;35(2):177-182.
- Morrison A, Moulton K, Clark M, et al. English-language restriction when conducting systematic review-based meta-analyses: systematic review of published studies. Ottawa: Canadian Agency for Drugs and Technologies in Health. Available at: http://www.mrw.interscience.wiley.com/cochrane/clcmr/articles/CMR-13119/frame.html .
- Hernandez DA, El-Masri MM, Hernandez CA. Choosing and using citation and bibliographic database software (BDS). Diabetes Educ 2008 May-Jun;34(3):457-474.
- Gomis M, Gall C, Brahmi FA. Web-based citation management compared to end note: options for medical sciences. Med Ref Serv Q 2008 Fall 2008;27(3):260-271.
- White CM, Ip S, McPheeters M, et al. Using existing systematic reviews to replace de novo processes in conducting Comparative Effectiveness Reviews. Rockville, MD. Available at: http://effectivehealthcare.ahrq.gov/repFiles/methodsguide/systematicreviewsreplacedenovo.pdf. Accessed September 21, 2010.
- Loke YK, Price D, Herxheimer A, et al. Systematic reviews of adverse effects: framework for a structured approach. BMC Med Res Methodol 2007;7(32).
- Chou R, Helfand M. Challenges in systematic reviews that assess treatment harms. Ann Intern Med [Review] 2005 Jun 21;142(12 Pt 2):1090-1099.
- Chou R, Aronson N, Atkins D, et al. Assessing harms when comparing medical interventions: AHRQ and the Effective Health-Care Program. J Clin Epidemiol 2008 Sep 25.
- Derry S, Kong Loke Y, Aronson JK. Incomplete evidence: the inadequacy of databases in tracing published adverse drug reactions in clinical trials. BMC Med Res Methodol 2001;1(7).
- Golder S, McIntosh HM, Duffy S, et al. Developing efficient search strategies to identify reports of adverse effects in MEDLINE and EMBASE. Health Info Libr J 2006 Mar;23(1):3-12.
- Wieland S, Dickersin K. Selective exposure reporting and Medline indexing limited the search sensitivity for observational studies of the adverse effects of oral contraceptives. J Clin Epidemiol 2005 Jun;58(6):560-567.
- Loke YK, Price D, Herxheimer A. Appendix 6b. Including adverse effects. In: Higgins JP, Green S, eds. Cochrane Handbook for Systematic Reviews of Interventions 425 [updated May 2005]. Chichester, UK: Cochrane Collaboration; Cochrane Adverse Effects Subgroup; 2007.
- Fraser C, Murray A, Burr J. Identifying observational studies of surgical interventions in MEDLINE and EMBASE. BMC Med Res Methodol 2006 Aug 18;6(41).
- von Elm E, Altman DG, Egger M, et al. The strengthening the reporting of observational studies in epidemiology (STROBE) statement: Guidelines for reporting observational studies. Ann Intern Med 2007;147(8):573-577.
- Golder S, Loke Y, McIntosh HM. Room for improvement? A survey of the methods used in systematic reviews of adverse effects. BMC Med Res Methodol 2006;6(3).
- Furlan AD, Irvin E, Bombardier C. Limited search strategies were effective in finding relevant nonrandomized studies. J Clin Epidemiol 2006 Dec;59(12):1303-1311.
- Scottish Intercollegiate Guidelines Network—Search Filters. Available at: http://www.sign.ac.uk/methodology/filters.html .
- Vandenbroucke JP, von Elm E, Altman DG, et al. Strengthening the reporting of observational studies in epidemiology (STROBE): Explanation and elaboration. Ann Intern Med 2007;147:W163-W194.
- Zheng MH, Zhang X, Ye Q, et al. Searching additional databases except PubMed are necessary for a systematic review. Stroke 2008 Aug;39(8):e139; author reply e40.
- Suarez-Almazor ME, Belseck E, Homik J, et al. Identifying clinical trials in the medical literature with electronic databases: MEDLINE alone is not enough. Control Clin Trials 2000 Oct;21(5):476-487.
- Betran AP, Say L, Gulmezoglu AM, et al. Effectiveness of different databases in identifying studies for systematic reviews: experience from the WHO systematic review of maternal morbidity and mortality. BMC Med Res Methodol 2005 Jan 28;5(1):6.
- Sampson M, Barrowman NJ, Moher D, et al. Should meta-analysts search Embase in addition to Medline? J Clin Epidemiol 2003 Oct;56(10):943-955.
- DeLuca JB, Mullins MM, Lyles CM, et al. Developing a comprehensive search strategy for evidence based systematic reviews. Evid Based Libr Inf Pract 2008;3(1):3-32.
- Kuper H, Nicholson A, Hemingway H. Searching for observational studies: what does citation tracking add to PubMed? A case study in depression and coronary heart disease. BMC Med Res Methodol 2006;6:4.
- Google Scholar. Available at: http://scholar.google.com/ . Accessed September 21, 2010.
- Scopus. Elsevier. Available at: http://www.scopus.com/home.url . Accessed September 21, 2010.
- PubFocus. Available at: http://pubfocus.com/ . Accessed September 21, 2010.
- PubMed PubReMiner. Available at: http://bioinfo.amc.uva.nl/human-genetics/pubreminer/ . Accessed September 21, 2010.
- Falagas ME, Pitsouni EI, Malietzis GA, et al. Comparison of PubMed, Scopus, Web of Science, and Google Scholar: strengths and weaknesses. FASEB J 2008 Feb;22(2):338-342.
- Salisbury L. Web of Science and scopus : a comparative review of content and searching capabilities. The Charleston Advisor 2009 July;11(1):5-18.
- Jasco P. As we may search—Comparison of major features of the Web of Science, Scopus, and Google Scholar citation-based and citation-enhanced databases. Curr Sci 2005;89(9):1537-1547.
- Bakkalbasi N, Bauer K, Glover J, et al. Three options for citation tracking: Google Scholar, Scopus and Web of Science. Biomed Digit Libr 2006;3(7).
- Jadad AR, McQuay HJ. Searching the literature. Be systematic in your searching [comment]. BMJ 1993 Jul 3;307(6895):66.
- Gotzsche PC. Reference bias in reports of drug trials. Br Med J (Clin Res Ed) 1987 Sep 12;295(6599):654-656.
- Armour T, Dingwall O, Sampson M. Contribution of checking reference lists to systematic reviews. Poster presentation at: XIII Cochrane Colloquium 2005.
- Al Hajeri A, Al Sayyad J, Eisinga A. Handsearching the EMHJ for reports of randomized controlled trials by U.K. Cochrane Centre (Bahrain). East Mediterr Health J 2006;12 Suppl 2:S253-S257.
- Jadad AR, Moher D, Klassen TP. Guides for reading and interpreting systematic reviews: II. How did the authors find the studies and assess their quality? Arch Pediatr Adolesc Med 1998 Aug;152(8):812-817.
- Hopewell S, Clarke M, Lusher A, et al. A comparison of handsearching versus MEDLINE searching to identify reports of randomized controlled trials. Stat Med 2002 Jun 15;21(11):1625-1634.
- Avenell A, Handoll HH, Grant AM. Lessons for search strategies from a systematic review, in The Cochrane Library, of nutritional supplementation trials in patients after hip fracture. Am J Clin Nutr 2001 Mar;73(3):505-510.
- Armstrong R, Jackson N, Doyle J, et al. It's in your hands: the value of handsearching in conducting systematic reviews of public health interventions. J Public Health (Oxf) 2005 Dec;27(4):388-391.
- Zarin DA, Ide NC, Tse T, et al. Issues in the registration of clinical trials. JAMA 2007 May 16;297(19):2112-2120.
- Tramer MR, Reynolds DJ, Moore RA, et al. Impact of covert duplicate publication on meta-analysis: a case study. BMJ. 1997 Sep 13;315(7109):635-640.
- Reveiz L, Cardona AF, Ospina EG, et al. An e-mail survey identified unpublished studies for systematic reviews. J Clin Epidemiol 2006 Jul;59(7):755-758.
- Kelley GA, Kelley KS, Tran ZV. Retrieval of missing data for meta-analysis: a practical example. Int J Technol Assess Health Care 2004 Summer;20(3):296-299.
- Peinemann F, McGauran N, Sauerland S, et al. Negative pressure wound therapy: potential publication bias caused by lack of access to unpublished study results data. BMC Med Res Methodol 2008;8:4.
- Rennie D. Fair conduct and fair reporting of clinical trials. JAMA 1999 Nov 10;282(18):1766-1768.
- Gibson CA, Bailey BW, Carper MJ, et al. Author contacts for retrieval of data for a meta-analysis on exercise and diet restriction. Int J Technol Assess Health Care 2006 Spring;22(2):267-270.
- Yoshii A, Plaut DA, McGraw KA, et al. Analysis of the reporting of search strategies in Cochrane systematic reviews. J Med Libr Assoc 2009;97(1):21-29.
- Sampson M, McGowan J, Tetzlaff J, et al. No consensus exists on search reporting methods for systematic reviews. J Clin Epidemiol 2008 Aug;61(8):748-754.
- Egger M, Juni P, Bartlett C, et al. Value of flow diagrams in reports of randomized controlled trials. JAMA 2001 Apr 18;285(15):1996-1999.
- Hopewell S, Clarke M, Moher D, et al. CONSORT for reporting randomized controlled trials in journal and conference abstracts: explanation and elaboration. PLoS Med 2008 Jan 22;5(1):e20.
- Spoor P, Airey M, Bennett C, et al. Use of the capture-recapture technique to evaluate the completeness of systematic literature searches. BMJ. 1996 Aug 10;313(7053):342-343.
Appendix A. Techniques for Observational Studies and/or Harms
|MEDLINE (OVID)||EMBASE (OVID)|
|(preoperat$ or pre operat$).mp||(preoperat$ or pre operat$).mp|
|(compare$ or compara$).tw||(compare$ or compara$).tw|
|||major clinical study/|
|specified adverse effects||Drug terms AND Exp LIVER DISEASES/ci||Drug terms AND Exp LIVER DISEASE/si|
|subheadings linked to drug name||exp DRUG NAME/ adverse events, po, to||exp DRUG NAME/ adverse events, to|
|floating subheadings||Drug terms AND (ae OR po OR to OR co OR de).fs.||Drug terms AND (ae OR to OR co).fs.|
|text word synonyms of “adverse effects” and related terms||Drug terms AND (safe OR safety OR side-effect$ OR undesirable effect$ OR treatment emergent OR tolerability OR toxicity OR adrs OR [adverse adj2 (effect or effects or reaction or reactions or event or events or outcome or outcomes)])||Drug terms AND (safe OR safety OR side-effect$ OR undesirable effect$ OR treatment emergent OR tolerability OR toxicity OR adrs OR [adverse adj2 (effect or effects or reaction or reactions or event or events or outcome or outcomes)])|
|indexing terms for “adverse effects”||Drug terms AND exp DRUG TOXICITY/||Drug terms AND (exp ADVERSE DRUG REACTION/ OR Exp Side-Effect/ )|
1 Epidemiologic studies/
2 Exp case control studies/
3 Exp cohort studies/
4 Case control.tw.
5 (cohort adj (study or studies)).tw.
6 Cohort analy$.tw.
7 (Follow up adj (study or studies)).tw.
8 (observational adj (study or studies)).tw.
11 Cross sectional.tw.
12 Cross-sectional studies/
1 Clinical study/
2 Case control study
3 Family study/
4 Longitudinal study/
5 Retrospective study/
6 Prospective study/
7 Randomized controlled trials/
8 6 not 7
9 Cohort analysis/
10 (Cohort adj (study or studies)).mp.
11 (Case control adj (study or studies)).tw.
12 (follow up adj (study or studies)).tw.
13 (observational adj (study or studies)).tw.
14 (epidemiologic$ adj (study or studies)).tw.
15 (cross sectional adj (study or studies)).tw.
1 Prospective studies/
2 Exp case control studies/
3 Correlational studies/
4 Nonconcurrent prospective studies/
5 Cross sectional studies/
6 (cohort adj (study or studies)).tw.
7 (observational adj (study or studies)).tw.
Fraser C, Murray A, Burr J. Identifying observational studies of surgical interventions in MEDLINE and EMBASE. BMC Med Res Methodol 2006 Aug 18;6(41).
Furlan AD, Irvin E, Bombardier C. Limited search strategies were effective in finding relevant nonrandomized studies. J Clin Epidemiol 2006 Dec;59(12):1303-1311.
Golder S, McIntosh HM, Duffy S, et al. Developing efficient search strategies to identify reports of adverse effects in MEDLINE and EMBASE. Health Info Libr J 2006 Mar;23(1):3-12.
Loke YK, Price D, Herxheimer A, et al. Systematic reviews of adverse effects: framework for a structured approach. BMC Med Res Methodol 2007;7(32).
Scottish Intercollegiate Guidelines Network (SIGN). Search Filters. Edinburgh 2009. Available at: http://www.sign.ac.uk/methodology/filters.html . Accessed September 21, 2010.
Appendix B. Specialized Databases
Please note that the topics listed are not the only topics indexed by each database; rather, they are a subset of covered topics likely to be of interest to the Effective Health Care Program. References are to articles that discuss specific search strategies, present a general overview of the database, or discuss the use of these databases in systematic reviews. The URLs listed point to the database itself for free resources, or to a page describing the product for subscription databases. Many of these databases are available from multiple vendors, and the choice of URL does not indicate a preference for or endorsement of any particular vendor. If you are unsure about a subscription database, free trials can often be arranged so that you can evaluate its usefulness to your program.
|C2-SPECTR (Campbell Collaboration’s Social, Psychological, Educational and Criminology Trials Register)||http://geb9101.gse.upenn.edu/||Trial register for the social sciences|
(similar to DARE)
|ERIC (Education Resources Information Center)||http://www.eric.ed.gov/||Education, including the education of health care professionals as well as educational interventions for patients||Anon, 2006|
|IBIDS (International Bibliographic Information on Dietary Supplements)||http://ods.od.nih.gov/Health_Information/IBIDS.aspx||Dietary supplements||Tomasulo, 2000|
(Index to Chiropractic Literature)
(New Abstracts and Papers in Sleep)
|OTseeker (Occupational Therapy Systematic Evaluation of Evidence)||http://www.otseeker.com/||Occupational therapy||Bennett, 2003|
|PEDro (Physiotherapy Evidence Database)||http://www.pedro.org.au/||Physical therapy||Sherrington, 2000|
|PILOTS||http://www.ptsd.va.gov/ptsd_adv_search.asp||PTSD and traumatic stress||Banks, 1995|
|PopLine||http://www.popline.org||Population, family planning & reproductive health||Adebonojo, 1994|
|PubMed||http://www.ncbi.nlm.nih.gov/pubmed/||Biology and health sciences|
|RDRB (Research and Development Resource Base)||http://www.rdrb.utoronto.ca/about.php||Medical education||Anne, 1995|
|Social Care Online||http://www.scie-socialcareonline.org.uk/||Social care including: healthcare, social work and mental health||Gwynne-Smith, 2007|
|TRIS (Transportation Research Information Service)||http://ntlsearch.bts.gov/tris/index.do||Transportation research||Wang, 2001|
|WHO Global Health Library||http://www.who.int/ghl/medicus/en/||International biomedical topics. Global Index Medicus.|
|Hoffecker, 2006; Pilkington, 2007|
|Social Services Abstracts||http://www.csa.com/||Applied social sciences, with emphases on the law, health services, and gerontology|
Adebonojo LG, Earl MF, POPLINE. A valuable supplement for health information. Database Magazine 1994:112-115.
Aker PD, McDermaid C, Opitz BG, et al. Searching chiropractic literature: a comparison of three computerized databases. J Manipulative Physiol Ther 1996 Oct;19(8):518-524.
Anne T-V. Information needs of CME providers: research and development resource base in continuing medical education. J Contin Educ Health Prof 1995;15(2):117-121.
Arnold SJ, Bender VF, Brown SA. A review and comparison of psychology-related electronic resources. Journal of Electronic Resources in Medical Libraries 2006;3(3):61-80.
Avenell A, Handoll HH, Grant AM. Lessons for search strategies from a systematic review, in The Cochrane Library, of nutritional supplementation trials in patients after hip fracture. Am J Clin Nutr 2001 Mar;73(3):505-510.
Bakkalbasi N, Bauer K, Glover J, et al. Three options for citation tracking: Google Scholar, Scopus and Web of Science. Biomed Digit Libr 2006;3(7).
Banks JL. PILOTS (Published International Literature on Traumatic Stress) database. J Trauma Stress 1995 Jul;8(3):495-497.
Bennett S, Hoffmann T, McCluskey A, et al. Introducing OTseeker (Occupational Therapy Systematic Evaluation of Evidence): a new evidence database for occupational therapists. Am J Occup Ther 2003 Nov-Dec;57(6):635-638.
Bennett S, McKenna K, Tooth L, et al. Searches and content of the OTseeker database: informing research priorities. Am J Occup Ther 2006 Sep-Oct;60(5):524-530.
Betran AP, Say L, Gulmezoglu AM, et al. Effectiveness of different databases in identifying studies for systematic reviews: experience from the WHO systematic review of maternal morbidity and mortality. BMC Med Res Methodol 2005 Jan 28;5(1):6.
Brettle AJ, Long AF. Comparison of bibliographic databases for information on the rehabilitation of people with severe mental illness. Bull Med Libr Assoc. 2001 Oct;89(4):353-362.
DeLuca JB, Mullins MM, Lyles CM, et al. Developing a comprehensive search strategy for evidence based systematic reviews. Evidence Based Library & Information Practice. 2008;3(1):3-32.
Eady AM, Wilczynski NL, Haynes RB. PsycINFO search strategies identified methodologically sound therapy studies and review articles for use by clinicians and researchers. Journal of Clinical Epidemiology 2008 Jan;61(1):34-40.
Falagas ME, Pitsouni EI, Malietzis GA, et al. Comparison of PubMed, Scopus, Web of Science, and Google Scholar: strengths and weaknesses. FASEB J 2008 Feb;22(2):338-342.
Fishman DL, Stone VL, DiPaula BA. Where should the pharmacy researcher look first? Comparing International Pharmaceutical Abstracts and MEDLINE. Bull Med Libr Assoc 1996 Jul;84(3):402-408.
Fitzpatrick RB. Global health database. Med Ref Serv Q 2006 Summer;25(2):59-67.
Fitzpatrick RB. REHABDATA: a disability and rehabilitation information resource. Med Ref Serv Q. 2007 Summer;26(2):55-64.
Fitzpatrick RB. PEDro: a physiotherapy evidence database. Med Ref Serv Q. 2008 Summer;27(2):189-198.
Flemming K, Briggs M. Electronic searching to locate qualitative research: evaluation of three strategies. J Adv Nurs. 2007 Jan;57(1):95-100.
Giglia E. PEDro: this well-known, unknown. Physiotherapy Evidence Database. Eur J Phys Rehabil Med 2008 Dec;44(4):477-480.
Gwynne-Smith D. The development of social care online. Legal Information Management 2007 Spring;7(1):34-41.
Hochstein C, Arnesen S, Goshorn J. Environmental health and toxicology resources of the United States National Library of Medicine. Med Ref Serv Q 2007 Fall;26(3):21-45.
Hoffecker L, Reiter CM. A review of seven complementary and alternative medicine databases. Journal of Electronic Resources in Medical Libraries 2006;3(4):13-31.
Jacsó P. As we may search—comparison of major features of the Web of Science, Scopus, and Google Scholar citation-based and citation-enhanced databases. Current Science 2005;89(9):1537-1547.
Kubany ES. Searching the traumatic stress literature using PILOTS and PsycLIT. J Trauma Stress 1995 Jul;8(3):491-494.
LaGuardia C. Database & disc reviews. Library Journal 2002:166.
Lerner F; National Center for Posttraumatic Stress Disorder. PILOTS database user's guide. White River Junction, VT: Department of Veterans Affairs, National Center for Posttraumatic Stress Disorder; 2007.
Minozzi S, Pistotti V, Forni M. Searching for rehabilitation articles on MEDLINE and EMBASE. An example with cross-over design. Arch Phys Med Rehabil 2000 Jun;81(6):720-722.
Moseley AM, Herbert RD, Sherrington C, et al. Evidence for physiotherapy practice: a survey of the Physiotherapy Evidence Database (PEDro). Aust J Physiother 2002;48(1):43-49.
Murphy LS, Reinsch S, Najm WI, et al. Searching biomedical databases on complementary medicine: the use of controlled vocabulary among authors, indexers and investigators. BMC Complement Altern Med 2003 Jul 7;3:3.
Petrosino A, Boruch R, Cath R, et al. The Campbell Collaboration social, psychological, educational and criminological trials register (C2-SPECTR) to facilitate the preparation and maintenance of systematic reviews of social and educational interventions. Evaluation and Research in Education 2000;14(3-4):206-219.
Pilkington K. Searching for CAM evidence: an evaluation of therapy-specific search strategies. J Altern Complement Med 2007 May;13(4):451-459.
Plikus MV, Zhang Z, Chuong CM. PubFocus: semantic MEDLINE/PubMed citations analytics through integration of controlled biomedical dictionaries and ranking algorithm. BMC Bioinformatics 2006 Oct 2;7:424.
Salisbury L. Web of science and scopus: a comparative review of content and searching capabilities. The Charleston Advisor 2009 July;11(1):5-18.
Sampson M, Barrowman NJ, Moher D, et al. Should meta-analysts search Embase in addition to Medline? J Clin Epidemiol 2003 Oct;56(10):943-955.
Sherrington C, Herbert RD, Maher CG, et al. PEDro. A database of randomized trials and systematic reviews in physiotherapy. Man Ther 2000 Nov;5(4):223-226.
So whatever happened to ERIC? Searcher. 2006;14(2):10-18.
Stevinson C, Lawlor DA. Searching multiple databases for systematic reviews: added value or diminishing returns? Complement Ther Med 2004 Dec;12(4):228-232.
Suarez-Almazor ME, Belseck E, Homik J, et al. Identifying clinical trials in the medical literature with electronic databases: MEDLINE alone is not enough. Controlled Clinical Trials 2000 Oct;21(5):476-487.
Subirana M, Sola I, Garcia JM, et al. A nursing qualitative systematic review required MEDLINE and CINAHL for study identification. J Clin Epidemiol 2005 Jan;58(1):20-25.
Taylor B, Wylie E, Dempster M, et al. Systematically retrieving research: a case study evaluating seven databases. Research on Social Work Practice 2007:697-706.
Tomasulo P. A new source of herbal information on the Web: the IBIDS database. Med Ref Serv Q 2000 Spring;19(1):53-57.
Tomasulo P. MANTIS—Manual, alternative, and natural therapy index system database. Med Ref Serv Q 2001 Fall;20(3):45-55.
Tomasulo PA. AgeLine: free and valuable database from AARP. Med Ref Serv Q 2005 Fall;24(3):55-65.
Ulincy L. EMCare. J Med Libr Assoc 2006;94(3):357-360.
Walker-Dilks C, Wilczynski NL, Haynes RB. Cumulative Index to Nursing and Allied Health Literature search strategies for identifying methodologically sound causation and prognosis studies. Appl Nurs Res 2008 May;21(2):98-103.
Wang J. TRIS Online. Charleston Advisor 2001:43-46.
Wolfe C. International Pharmaceutical Abstracts: what's new and what can IPA do for you? Am J Health Syst Pharm 2002 Dec 1;59(23):2360-2361.
Wong SS, Wilczynski NL, Haynes RB. Optimal CINAHL search strategies for identifying therapy studies and review articles. J Nurs Scholarsh 2006;38(2):194-199.
Appendix C. CONSORT-Style Flow Diagram of Literature Search, Annotated
For more on the CONSORT flow diagram, see http://www.consort-statement.org/consort-statement/flow-diagram0/.
Adapted from: Qayyum R, Bolen S, Maruthur N, et al. Systematic review: comparative effectiveness and safety of premixed insulin analogues in type 2 diabetes. Ann Intern Med 2008 Oct 21;149(8):549-559.