Professor Julian Higgins explains why he believes the systematic review and meta-analysis methods described in many highly cited papers are routinely misunderstood or misused. Julian Higgins is Professor of Evidence Synthesis at the Bristol Evidence Synthesis, Appraisal and Modelling (BEAM) Centre at the University of Bristol. His research has focussed on the methodology of systematic review and meta-analysis and he has been senior editor of the Cochrane Handbook for Systematic Reviews of Interventions since 2003. He is an NIHR Senior Investigator and currently co-directs the NIHR Bristol Evidence Synthesis Group.
Systematic reviews and meta-analyses have become influential and popular. Papers describing aspects of the systematic review and meta-analysis toolkit have become some of the most highly cited papers. I will review those that appear at the top of the most-cited list and explain why I believe the methods described are routinely misunderstood or misused. These include a test for asymmetry in a funnel plot, the I-squared statistic for measuring inconsistency across studies, the random-effects meta-analysis model and the PRISMA reporting guideline.
The overwhelming volume of evidence, and its frequent lack of relevance to patient care and decisions, means health professionals need skills to sift evidence more efficiently: discarding what doesn't make a difference in order to focus on the evidence that matters for health. This talk will present a simple, effective appraisal system based on two first steps for rapidly appraising and sifting evidence for its relevance and application to actual patient care, before assessing its validity.
Professor Carl Heneghan is Director of CEBM and an NHS Urgent Care GP, and has been interested for over twenty years in how we can use evidence in real-world practice.
This talk is being held as part of the Practice of Evidence-Based Health Care module which is part of the MSc in Evidence-Based Health Care and the MSc in EBHC Systematic Reviews. Creative Commons Attribution-Non-Commercial-Share Alike 2.0 UK: England & Wales; http://creativecommons.org/licenses/by-nc-sa/2.0/uk/
Evidence-Based Health Care