Evidence-Based Health Care
Oxford University
10 episodes
4 months ago
Professor Julian Higgins explains why he believes the systematic review and meta-analysis methods described in many highly cited papers are routinely misunderstood or misused. Julian Higgins is Professor of Evidence Synthesis at the Bristol Evidence Synthesis, Appraisal and Modelling (BEAM) Centre at the University of Bristol. His research has focussed on the methodology of systematic review and meta-analysis, and he has been senior editor of the Cochrane Handbook for Systematic Reviews of Interventions since 2003. He is an NIHR Senior Investigator and currently co-directs the NIHR Bristol Evidence Synthesis Group.

Systematic reviews and meta-analyses have become influential and popular, and papers describing aspects of the systematic review and meta-analysis toolkit rank among the most highly cited. I will review those that appear at the top of the most-cited list and explain why I believe the methods described are routinely misunderstood or misused. These include a test for asymmetry in a funnel plot, the I-squared statistic for measuring inconsistency across studies, the random-effects meta-analysis model, and the PRISMA reporting guideline.
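For readers unfamiliar with the methods the abstract names, here is a minimal sketch (not taken from the episode) of how two of them are typically computed: Cochran's Q with the I-squared statistic for inconsistency across studies, and DerSimonian-Laird random-effects weights. The effect sizes, standard errors, and variable names below are illustrative assumptions only.

# Minimal sketch: inverse-variance weights, Cochran's Q, the I-squared
# statistic, and a DerSimonian-Laird random-effects pooled estimate.
# The study effects and standard errors are made-up example numbers.
import numpy as np

effects = np.array([0.10, 0.30, 0.25, 0.60, 0.45])  # per-study effect estimates (e.g. log odds ratios)
se      = np.array([0.12, 0.15, 0.10, 0.20, 0.18])  # per-study standard errors

w = 1.0 / se**2                                      # inverse-variance (fixed-effect) weights
fixed = np.sum(w * effects) / np.sum(w)              # fixed-effect pooled estimate

Q = np.sum(w * (effects - fixed)**2)                 # Cochran's Q heterogeneity statistic
k = len(effects)
I2 = max(0.0, (Q - (k - 1)) / Q) * 100               # I-squared: % of variability beyond chance

# DerSimonian-Laird estimate of between-study variance tau^2
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1.0 / (se**2 + tau2)                          # random-effects weights
pooled_re = np.sum(w_re * effects) / np.sum(w_re)    # random-effects pooled estimate

print(f"Q = {Q:.2f}, I^2 = {I2:.1f}%, tau^2 = {tau2:.4f}")
print(f"fixed-effect = {fixed:.3f}, random-effects = {pooled_re:.3f}")

A common misuse the talk alludes to is reading I-squared as the amount of heterogeneity itself; it is the proportion of observed variability attributed to heterogeneity rather than chance, which depends on the precision of the included studies.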
Education
Not just "what," but also "how well:" Intervention fidelity in clinical trials of complex interventions in healthcare
Evidence-Based Health Care
41 minutes
9 months ago
This presentation examines the concept of intervention fidelity and how it can influence the results of clinical trials. The focus of clinical trials is typically interventions' efficacy, or whether they attain their desired outcomes. Comparatively less attention is paid to understanding how or why interventions succeed or fail to attain those outcomes. This may be particularly important in trials of complex interventions such as surgery or physiotherapy, which are multifaceted and often tailored to individual participants, providers, or settings, increasing the potential for variations in intervention delivery and effects. Intervention fidelity is the correspondence between the intervention that was planned and what was actually delivered in a trial. In this presentation, we will discuss intervention fidelity and related concepts such as participant adherence (the actions of patients and participants in a clinical trial), how they can influence the results of a clinical trial, and how they shape our level of confidence in the results of published trials. A checklist for assessing intervention fidelity in clinical trial publications will also be presented.

Dr. Paez is a post-doctoral fellow at the Sleep, Cognition, and Neuroimaging Laboratory at Concordia University, Montreal, an Assistant Professor of Medicine and Clinical Skills Training, NAPCA, and a Senior Lecturer at the Bouvé College of Health Sciences, Northeastern University, Boston. He obtained an MSc and DPhil in Evidence-based Healthcare from the University of Oxford, UK, a PhD in Health and Exercise Science from Concordia University, and a doctoral degree in Physiotherapy from Northeastern University, Boston. Dr. Paez is also a visiting scholar and council member of the IDEAL Collaboration, Nuffield Department of Surgical Sciences, University of Oxford, which focuses on improving innovation and evidence for complex interventions in healthcare, such as surgery and rehabilitation.