Research Report - Final – Jun. 10, 2011
Evaluating the Potential Use of Modeling and Value-of-Information Analysis for Future Research Prioritization Within the Evidence-based Practice Center Program
Systematic reviews conducted as part of the Evidence-based Practice Center (EPC) program routinely identify evidence gaps and suggest further research to help close them, but there is little evidence that these suggestions lead to the needed research actually being performed. As part of an EPC-wide effort to evaluate potential mechanisms for ensuring that research needs identified by systematic reviews are addressed, the Duke EPC reviewed the use of modeling techniques, including value-of-information (VOI) analysis, for prioritizing research gaps, under the assumption that quantitative prioritization could facilitate the performance of research to address those gaps.
We first searched PubMed® for relevant English-language literature published between 1990 and 2010, using search terms related to research prioritization and VOI analysis, to understand how modeling and VOI are currently used in research prioritization. Inclusion/exclusion screening criteria were aimed at identifying articles that focused on research prioritization using a formal framework or process and reported specific prioritization recommendations, with a special emphasis on modeling and VOI.
To supplement this search, we then conducted a nonsystematic review of research prioritization processes used by major research-sponsoring organizations in the United States and abroad. We searched organization Web sites and the results of our literature search, and contacted the organizations by e-mail and/or telephone. Materials were reviewed for information on the focus of the prioritization process and the methods and criteria used for prioritization, again with a special emphasis on modeling/VOI.
Finally, we performed two case studies of the potential use of modeling techniques in research prioritization. First, we developed a model for the use of angiotensin-converting enzyme inhibitors (ACEIs) or angiotensin II receptor antagonists (ARBs) in the management of ischemic heart disease based on the results of a prior comparative effectiveness review, then engaged nine stakeholders in a prioritization process that involved both a consensus-based approach and the use of model results. Second, we adapted a model on the outcomes of treatment of uterine fibroids developed for a previous systematic review and conducted a VOI analysis; these results were then shared with nine participants in a separate consensus-based research prioritization process. In both case studies, we elicited stakeholder feedback on the potential use of modeling and VOI in research prioritization.
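The VOI analyses described above center on the expected value of perfect information (EVPI): the difference between the expected payoff if parameter uncertainty could be resolved before choosing a treatment strategy and the expected payoff of choosing now under uncertainty. The report does not specify the models' internals, so the following is only a minimal Monte Carlo sketch with invented parameters; the distributions, willingness-to-pay threshold, and two-treatment structure are all illustrative assumptions, not the case-study models themselves.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-treatment decision problem (all parameters are assumptions
# for illustration): net monetary benefit NB = lambda * QALYs - cost.
n = 100_000
lam = 50_000  # assumed willingness-to-pay per QALY

# Uncertain model inputs, sampled from assumed distributions
qaly_a = rng.normal(8.0, 0.5, n)   # QALYs under treatment A
qaly_b = rng.normal(8.2, 0.8, n)   # QALYs under treatment B (more uncertain)
cost_a = rng.gamma(100, 200, n)    # lifetime cost of A
cost_b = rng.gamma(100, 260, n)    # lifetime cost of B

# Net monetary benefit of each option in each simulated "state of the world"
nb = np.column_stack([lam * qaly_a - cost_a,
                      lam * qaly_b - cost_b])

# Expected NB of committing to the best option now, under current uncertainty
enb_current = nb.mean(axis=0).max()
# Expected NB if uncertainty were fully resolved before each choice
enb_perfect = nb.max(axis=1).mean()

evpi = enb_perfect - enb_current   # per-patient EVPI (never negative)
print(f"Per-patient EVPI: {evpi:,.0f}")
```

Scaled by the size of the affected population, a per-patient EVPI of this kind gives an upper bound on the value of further research on a question, which is what makes it a candidate criterion for ranking research gaps.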
Only 6 of the 214 papers identified during the literature search reported using a previously published systematic review as the basis for identifying research gaps. Of the 60 unique modeling-based papers, all but 8 used cost-effectiveness analysis and VOI, with most focused on the question of immediate adoption versus further research for a specific health intervention. The United Kingdom (UK) Health Technology Assessment (HTA) program conducted 19 of the 52 VOI analyses.
Of the 31 research organizations providing information on prioritization processes, only the UK National Institute for Clinical Excellence (NICE), through the HTA program, explicitly included modeling and VOI in its recommendations for future research.
Although the results of the modeling exercises for both case studies provided insight into the underlying decision problems, both models require further development. Despite this, stakeholders from both case study groups reported that the results of the modeling exercises were helpful in thinking about research prioritization, although none thought that modeling alone could substitute for a consensus-based approach. Opinions differed on the optimal timing of the modeling: some stakeholders felt the results would be most helpful as background to a consensus-based process, while others preferred a parallel, iterative process combining modeling and consensus.
Outside of the UK NICE/HTA program, systematic reviews were rarely cited as important sources for identifying evidence gaps for research prioritization. Cost-effectiveness and VOI analyses were the most commonly used modeling-based methods, but, outside of the UK, it is unclear to what degree the priorities identified by these methods were translated into actual research funding. Stakeholders in our two case studies found modeling and VOI to be potentially useful tools, but a variety of methodological and operational issues must be resolved before these methods can assist with prioritizing research gaps identified through systematic reviews. These include: comparing the impact of different prioritization methods on the likelihood that priority questions will be answered through research; identifying the appropriate resources (including technical expertise) to conduct the analyses; defining the appropriate timing of the modeling and analyses; and determining the appropriate level of modeling complexity.