WEBVTT NOTE Created by CaptionSync from Automatic Sync Technologies www.automaticsync.com 00:00:00.356 --> 00:00:04.576 align:middle >> These individuals are pioneers, innovators in medicine. 00:00:04.576 --> 00:00:07.066 align:middle >> They've ushered in an era of evidence 00:00:07.246 --> 00:00:10.966 align:middle and made scientific research an essential basis of medical practice. 00:00:11.516 --> 00:00:15.436 align:middle [ music ] 00:00:15.936 --> 00:00:17.756 align:middle >> [Howard Bauchner:] Hello, I'm Howard Bauchner, 00:00:17.756 --> 00:00:20.536 align:middle Editor in Chief of JAMA and the JAMA Network. 00:00:20.536 --> 00:00:22.856 align:middle >> [Fiona Godlee:] And I'm Fiona Godlee, Editor in Chief of the BMJ. 00:00:23.426 --> 00:00:25.836 align:middle >> [Howard Bauchner:] We hope you enjoy these inspiring stories. 00:00:26.516 --> 00:00:32.756 align:middle [ music ] 00:00:33.256 --> 00:00:35.786 align:middle >> [Richard Smith:] So Gordon, how did you come to evidence-based medicine? 00:00:36.046 --> 00:00:39.396 align:middle >> [Gordon Guyatt:] Well after doing two years of post-graduate training in medicine 00:00:39.456 --> 00:00:46.286 align:middle in Toronto, I came back to McMaster and found Brian Haynes and Dave Sackett there. 00:00:46.526 --> 00:00:48.986 align:middle And got very interested in what they were doing. 00:00:48.986 --> 00:00:58.476 align:middle I ended up in the health research methodology master's program and then became skilled 00:00:58.476 --> 00:01:03.706 align:middle in what was critical appraisal, and what Dave Sackett then called bringing critical appraisal 00:01:03.706 --> 00:01:04.696 align:middle to the bedside. 00:01:05.176 --> 00:01:10.476 align:middle And as we were doing this in clinical practice we found more and more 00:01:10.666 --> 00:01:14.606 align:middle that we were doing a different form of medical practice.
00:01:14.686 --> 00:01:20.866 align:middle And in 1990, I took over the internal medicine residency program to run the program 00:01:21.286 --> 00:01:29.096 align:middle with the notion that I wanted to train residents in this new way of doing medical practice. 00:01:29.096 --> 00:01:31.786 align:middle And I wanted to attract residents who were interested. 00:01:32.116 --> 00:01:37.446 align:middle And I wanted to advertise our program as one that did this. 00:01:37.566 --> 00:01:39.756 align:middle So what we were doing 00:01:40.046 --> 00:01:40.976 align:middle needed a name. 00:01:41.506 --> 00:01:47.086 align:middle And my first attempt at coming up with a name for it was scientific medicine. 00:01:47.486 --> 00:01:54.166 align:middle And in a meeting where I was introduced by the department chair to the members 00:01:54.166 --> 00:01:59.226 align:middle of the department of medicine, the basic scientists were so enraged 00:01:59.226 --> 00:02:03.046 align:middle that I was calling what we were doing scientific medicine 00:02:03.256 --> 00:02:05.666 align:middle that I thought, back to the drawing board. 00:02:06.036 --> 00:02:10.526 align:middle And the second notion of what we might call it was evidence-based medicine. 00:02:10.826 --> 00:02:13.356 align:middle And boy did that turn out to be a good choice. 00:02:13.616 --> 00:02:14.706 align:middle >> [Richard Smith:] So you've never regretted it? 00:02:14.836 --> 00:02:15.586 align:middle >> [Gordon Guyatt:] Of course not. 00:02:17.326 --> 00:02:19.216 align:middle >> [Richard Smith:] Teddy [assumed spelling], how did you get to evidence-based medicine? 00:02:19.896 --> 00:02:24.216 align:middle >> [Kay Dickersin:] Well I came through the route of experimental science, cell biology. 00:02:24.216 --> 00:02:29.156 align:middle And I then joined the doctoral program in clinical trials at Johns Hopkins.
00:02:29.636 --> 00:02:32.396 align:middle where Curt Meinert was my mentor. 00:02:32.776 --> 00:02:35.926 align:middle And I was looking for a thesis project. 00:02:35.926 --> 00:02:40.166 align:middle I talked to Tom Chalmers, who was in Boston on a sabbatical at that time. 00:02:40.166 --> 00:02:46.056 align:middle And Tom was a real trialist and really ahead of his time in everything he was doing. 00:02:46.056 --> 00:02:49.066 align:middle He had a grant at that time from the National Library of Medicine and was looking 00:02:49.066 --> 00:02:53.046 align:middle at the literature and even thinking about and teaching a class actually, 00:02:53.046 --> 00:02:55.126 align:middle a seminar on meta-analysis at Harvard. 00:02:55.726 --> 00:02:58.216 align:middle And so I said well what do you think I should do my thesis on? 00:02:58.216 --> 00:03:00.866 align:middle I really am interested in registration of trials. 00:03:00.866 --> 00:03:04.106 align:middle And he said oh no, registration of perinatal trials had been done 00:03:04.106 --> 00:03:05.596 align:middle by a guy named Iain Chalmers. 00:03:05.996 --> 00:03:08.226 align:middle And he showed me this letter in the Lancet. 00:03:08.226 --> 00:03:08.886 align:middle And he... 00:03:08.886 --> 00:03:09.986 align:middle >> [Richard Smith:] When are we talking about here? 00:03:09.986 --> 00:03:14.236 align:middle >> [Kay Dickersin:] Oh sorry yeah, that was about 1983 I'd say, or '84. 00:03:14.756 --> 00:03:21.446 align:middle And so later on that year, in the summer, he said why don't you go over 00:03:21.446 --> 00:03:24.816 align:middle and talk to Iain Chalmers and see about doing a project. 00:03:24.816 --> 00:03:26.576 align:middle Maybe he'd be interested. 00:03:26.576 --> 00:03:27.926 align:middle And I wrote a letter to Iain. 00:03:27.926 --> 00:03:30.856 align:middle I said hey, you have your register of perinatal trials.
00:03:31.196 --> 00:03:33.316 align:middle Have you considered unpublished trials? 00:03:33.316 --> 00:03:35.446 align:middle Is that something that you're working on? 00:03:35.446 --> 00:03:37.756 align:middle And Iain said come on over, let's talk about it. 00:03:37.756 --> 00:03:42.526 align:middle Let's write a grant, which we did, to try to find out how many unpublished trials 00:03:42.526 --> 00:03:46.486 align:middle in the perinatal field there were that would contribute to the register that he had 00:03:46.486 --> 00:03:48.076 align:middle with Murray Enkin and Marc Keirse. 00:03:48.586 --> 00:03:50.456 align:middle And that was the beginning for me. 00:03:51.086 --> 00:03:53.786 align:middle >> [Richard Smith:] And how many trials did you find were unpublished at that time? 00:03:55.096 --> 00:03:59.106 align:middle >> [Kay Dickersin:] Well in the study that came out of that, 00:03:59.276 --> 00:04:03.186 align:middle we looked at the Johns Hopkins School of Public Health and the Johns Hopkins School of Medicine, 00:04:03.596 --> 00:04:08.686 align:middle and probably about a third were unpublished at the School of Public Health and probably 00:04:08.686 --> 00:04:11.906 align:middle about 20 percent at the School of Medicine. 00:04:11.956 --> 00:04:15.466 align:middle So, that was a high publication rate compared to what we know now 00:04:15.466 --> 00:04:17.496 align:middle from studies done by many, many different people. 00:04:17.496 --> 00:04:20.656 align:middle >> [Richard Smith:] Did that figure shock you or surprise you? 00:04:22.636 --> 00:04:27.356 align:middle >> [Kay Dickersin:] I don't know if it did by that time, because I'd done so many interviews. 00:04:27.356 --> 00:04:30.146 align:middle I think at that time, because I was working on my doctoral thesis, 00:04:30.686 --> 00:04:32.766 align:middle it was those interviews that made a big difference.
00:04:32.766 --> 00:04:36.796 align:middle I'd call people and ask them about publishing and say why didn't you publish? 00:04:37.096 --> 00:04:40.126 align:middle And the people who were working on theses said because I was sick of it. 00:04:40.416 --> 00:04:41.446 align:middle And I was sick of it. 00:04:41.856 --> 00:04:44.946 align:middle So, that really made an impact on me. 00:04:44.946 --> 00:04:50.336 align:middle But I think that was a very interesting finding, as to why people weren't publishing. 00:04:50.336 --> 00:04:51.436 align:middle Because they were sick of it. 00:04:51.966 --> 00:04:55.066 align:middle Because they just, you know, they moved on to other things. 00:04:55.106 --> 00:04:56.716 align:middle The findings weren't interesting. 00:04:57.116 --> 00:04:59.606 align:middle And that was more interesting than the percent, I think. 00:04:59.686 --> 00:05:01.456 align:middle >> [Richard Smith:] And the editors weren't really the problem? 00:05:01.716 --> 00:05:02.876 align:middle >> [Kay Dickersin:] They weren't really the problem. 00:05:02.876 --> 00:05:05.266 align:middle They were a very tiny fraction of the problem. 00:05:05.266 --> 00:05:09.686 align:middle It was really the non-authors who were the problem, in that they hadn't bothered to submit. 00:05:10.776 --> 00:05:12.496 align:middle >> [Richard Smith:] But the editors still tend to get blamed. 00:05:12.496 --> 00:05:13.486 align:middle So on to an editor. 00:05:13.916 --> 00:05:15.416 align:middle >> [Drummond Rennie:] Of course they get blamed. 00:05:16.366 --> 00:05:20.616 align:middle And unfairly, as Teddy has told you. 00:05:22.396 --> 00:05:23.936 align:middle I had a piece of luck. 00:05:23.986 --> 00:05:25.856 align:middle I got severely injured climbing. 00:05:26.556 --> 00:05:31.036 align:middle I was then unable to do my research at very high altitudes. 00:05:31.076 --> 00:05:35.586 align:middle I was a physiologist and had to change my career.
00:05:36.026 --> 00:05:40.456 align:middle I went to the New England Journal, where I was then the sole deputy editor. 00:05:41.146 --> 00:05:47.556 align:middle And in November of 1977, I got a manuscript from a guy, Tom Chalmers, 00:05:47.926 --> 00:05:52.136 align:middle whom you've just heard about, and this entranced me. 00:05:52.256 --> 00:05:55.136 align:middle It was the first meta-analysis I'd ever read. 00:05:55.986 --> 00:06:04.016 align:middle And it seemed to me to solve a huge number of problems at one time. 00:06:04.466 --> 00:06:09.876 align:middle This was a way, a logical way of dealing with the medical literature. 00:06:09.876 --> 00:06:14.746 align:middle Either you threw away everything of the past, or you used this system. 00:06:16.006 --> 00:06:17.056 align:middle That was the first. 00:06:17.936 --> 00:06:23.266 align:middle This impressed me also because it was dealing with a practical patient problem 00:06:23.266 --> 00:06:28.106 align:middle that I'd heard argued constantly all the way through medical school, internship, 00:06:28.406 --> 00:06:33.796 align:middle residency in London and so on, and then again in the States. 00:06:34.446 --> 00:06:39.466 align:middle And that was: to give anticoagulants or not after myocardial infarction? 00:06:40.346 --> 00:06:44.176 align:middle And this meta-analysis showed it was a good idea. 00:06:45.346 --> 00:06:47.916 align:middle And proved it, or seemed to, to me. 00:06:47.916 --> 00:06:59.156 align:middle At the same time, in this change, I met Dave Sackett. 00:06:59.336 --> 00:07:06.946 align:middle Dave Sackett came from McMaster and he came to see the editor 00:07:07.666 --> 00:07:10.606 align:middle and the other editor, me, the little editor. 00:07:11.436 --> 00:07:16.756 align:middle And he came to talk about a series with the editor. 00:07:17.316 --> 00:07:23.496 align:middle I was very impressed by Dave and I kept in contact with him in subsequent years.
00:07:23.496 --> 00:07:28.026 align:middle I was exceedingly impressed by his way of thinking. 00:07:29.366 --> 00:07:39.446 align:middle This led to an invitation from him when I moved to JAMA. 00:07:39.536 --> 00:07:40.876 align:middle I went to McMaster. 00:07:41.546 --> 00:07:49.136 align:middle And one way or another met everybody, and from that came a long series of articles, 00:07:49.346 --> 00:07:53.666 align:middle the Users' Guides series with Gord Guyatt. 00:07:54.016 --> 00:08:01.956 align:middle And then the Rational Clinical Examination, using features of the examination 00:08:02.126 --> 00:08:06.106 align:middle and the history taking and regarding them as clinical tests 00:08:06.236 --> 00:08:10.686 align:middle with operating characteristics that you could define. 00:08:11.456 --> 00:08:14.916 align:middle >> [Richard Smith:] And the New England Journal published that Tom Chalmers meta-analysis? 00:08:14.916 --> 00:08:16.126 align:middle >> [Drummond Rennie:] Yes indeed it did. 00:08:16.336 --> 00:08:18.876 align:middle >> [Richard Smith:] But am I right to think it became a bit snotty 00:08:19.216 --> 00:08:22.536 align:middle about systematic reviews and meta-analyses subsequently? 00:08:23.466 --> 00:08:25.046 align:middle >> [Drummond Rennie:] Well, the facts are these. 00:08:26.546 --> 00:08:29.586 align:middle In fact I've discussed this 00:08:29.586 --> 00:08:33.556 align:middle with a senior statistical editor at the time. 00:08:33.556 --> 00:08:36.736 align:middle A great friend of mine, John Bailar, 00:08:37.376 --> 00:08:44.176 align:middle and his problem was you're never comparing apples with apples if you do a meta-analysis.
00:08:44.766 --> 00:08:54.096 align:middle To which I would say, unless you junk the entire literature and only rely 00:08:54.096 --> 00:09:00.566 align:middle on one trial, you have to have a little accommodation there 00:09:00.806 --> 00:09:02.506 align:middle and you do the very best you can. 00:09:03.036 --> 00:09:05.216 align:middle >> [Richard Smith:] So Brian, how did you get to evidence-based medicine? 00:09:05.716 --> 00:09:09.596 align:middle >> [Brian Haynes:] When I was a second-year medical student at the University of Alberta, 00:09:09.596 --> 00:09:14.016 align:middle we had a lecture by [inaudible] psychiatry on Freud's theories. 00:09:14.196 --> 00:09:18.106 align:middle At the end of that session I asked what's the evidence 00:09:18.106 --> 00:09:19.866 align:middle that Freud's theories are true? 00:09:19.866 --> 00:09:24.186 align:middle And the person who was doing the lecture sort of broke from his cameo role 00:09:24.966 --> 00:09:27.496 align:middle and said well, I don't really think they are true. 00:09:27.496 --> 00:09:29.236 align:middle There's no evidence to support them. 00:09:29.236 --> 00:09:32.626 align:middle But the chair of the department's a Freudian and he asked me to give this lecture. 00:09:33.066 --> 00:09:40.866 align:middle And I just had this sudden realization: how much of my medical training 00:09:40.956 --> 00:09:44.456 align:middle to that point had been based on theories that were not supported by facts? 00:09:46.006 --> 00:09:50.056 align:middle It caused me to stew for a while, and then I went to the University of Toronto, 00:09:50.646 --> 00:09:53.276 align:middle Toronto General Hospital, for my residency training. 00:09:53.356 --> 00:09:58.316 align:middle Thinking that the truth might not be in Edmonton but it could be in Toronto. 00:09:58.946 --> 00:10:02.646 align:middle And of course they always purported to have a lock on truth.
00:10:03.676 --> 00:10:09.626 align:middle The only difference I could see between the two places was that in Toronto, 00:10:09.626 --> 00:10:11.756 align:middle if you asked them what's the evidence, they'd get mad at you. 00:10:11.966 --> 00:10:16.426 align:middle So I stewed about that some more. 00:10:16.426 --> 00:10:23.486 align:middle And I was fortunate enough in Toronto that Dave Sackett had just started the Department 00:10:23.486 --> 00:10:27.686 align:middle of Clinical Epidemiology down the road at McMaster in Hamilton, Ontario. 00:10:28.766 --> 00:10:33.026 align:middle And one of the people, Jack Laidlaw, who subsequently became the Dean at McMaster 00:10:33.236 --> 00:10:37.306 align:middle but was at Toronto at the time, invited Dave Sackett to give a little talk 00:10:38.096 --> 00:10:41.236 align:middle on Is Healthcare Researchable, which is exactly what I wanted to know. 00:10:41.976 --> 00:10:46.786 align:middle I certainly didn't know what to do about the situation of figuring 00:10:46.786 --> 00:10:48.436 align:middle out what things worked and what didn't work. 00:10:48.926 --> 00:10:51.016 align:middle And I figured I needed to get some methodologic training. 00:10:51.816 --> 00:10:57.066 align:middle And Dave's talk was just right on target, showing examples where you could get 00:10:57.356 --> 00:11:03.226 align:middle to a credible answer for any number of medical problems. 00:11:03.226 --> 00:11:07.016 align:middle So I came over to McMaster after that, went through their graduate program. 00:11:08.306 --> 00:11:13.976 align:middle Went away again and came back on the faculty there to work with Dave on the original series 00:11:13.976 --> 00:11:20.886 align:middle on Critical Appraisal of the Medical Literature and the initial teaching that we did on that. 00:11:20.886 --> 00:11:23.446 align:middle And recruiting young squirts like Gord Guyatt, etc.
00:11:24.106 --> 00:11:27.006 align:middle >> [Richard Smith:] I can't help but imagine there must have been a little glow of pleasure 00:11:27.006 --> 00:11:29.286 align:middle in each of your professors as they saw you sitting there 00:11:29.286 --> 00:11:32.626 align:middle in the audience, wondering what you were going to ask next. 00:11:32.956 --> 00:11:35.116 align:middle So Paul, how did you get to it? 00:11:35.706 --> 00:11:38.446 align:middle >> [Paul Glasziou:] Well I had a somewhat similar experience to Brian's. 00:11:38.446 --> 00:11:39.566 align:middle When I was in medical school, 00:11:39.566 --> 00:11:44.556 align:middle I wondered what the basis of making decisions was, and started to see these disagreements 00:11:44.556 --> 00:11:48.146 align:middle between the physicians saying one thing, the surgeons saying another for the same condition, 00:11:48.546 --> 00:11:50.466 align:middle or even two physicians saying different things. 00:11:51.016 --> 00:11:52.686 align:middle But I couldn't work out how to resolve that. 00:11:52.756 --> 00:11:58.036 align:middle I did medical practice for a while but became a bit disenchanted by not finding this basis. 00:11:58.586 --> 00:12:02.076 align:middle And moved into epidemiology and clinical trials. 00:12:02.076 --> 00:12:06.926 align:middle I worked with two guys in Sydney, [inaudible], who'd done work at McMaster, 00:12:07.296 --> 00:12:11.576 align:middle and John Simes, who'd been at Harvard where Tom Chalmers was 00:12:12.096 --> 00:12:14.816 align:middle and had done some early work on publication. 00:12:14.816 --> 00:12:19.256 align:middle [Inaudible] taught me about clinical trials and publication bias and meta-analyses. 00:12:20.096 --> 00:12:22.926 align:middle But it wasn't until Dave Sackett came out to Sydney 00:12:23.126 --> 00:12:25.386 align:middle at [inaudible] invitation and did a workshop.
00:12:26.116 --> 00:12:33.186 align:middle That's when I really became very interested in this evidence-practice gap and the fact 00:12:33.246 --> 00:12:38.886 align:middle that there was a basis for making decisions, the evidence, but it wasn't being used in practice, 00:12:39.226 --> 00:12:41.556 align:middle and that you needed to work in both camps to do that. 00:12:41.556 --> 00:12:42.806 align:middle So I actually went and retrained. 00:12:42.806 --> 00:12:45.906 align:middle I did emergency department work for a while, 00:12:45.906 --> 00:12:48.946 align:middle on Friday evenings in a busy emergency clinic. 00:12:49.416 --> 00:12:52.206 align:middle But eventually I retrained as a general practitioner 00:12:52.206 --> 00:12:56.526 align:middle so that I could see how it was, rather than just trying to get at it from the researcher's end. 00:12:56.526 --> 00:13:01.436 align:middle That was incredibly useful, and inspired by Dave, because he'd done a very similar thing, 00:13:02.226 --> 00:13:04.166 align:middle to be able to work at both ends at once. 00:13:04.496 --> 00:13:08.336 align:middle To see how research is unusable for the practitioner and how to make it usable. 00:13:08.816 --> 00:13:11.856 align:middle And also how to get researchers to do the right sort of research that's going 00:13:11.856 --> 00:13:13.456 align:middle to be useful for clinical practice. 00:13:13.856 --> 00:13:16.796 align:middle And those sort of bridge people, I think, are absolutely crucial. 00:13:16.796 --> 00:13:18.226 align:middle We need more of those in medicine. 00:13:18.926 --> 00:13:24.206 align:middle So anyway, after that, a lot of work in the area, and that culminated in moving to Oxford 00:13:24.206 --> 00:13:28.106 align:middle and taking up the Centre for Evidence-Based Medicine after Dave Sackett had left.
00:13:28.516 --> 00:13:30.806 align:middle >> [Richard Smith:] If you want to be serious about evidence-based medicine, 00:13:30.806 --> 00:13:33.256 align:middle do you have to be an epidemiologist in some way? 00:13:34.156 --> 00:13:39.186 align:middle >> [Paul Glasziou:] No, but I think for the person just wanting to practice it, 00:13:39.416 --> 00:13:42.536 align:middle there's a modest depth that you need to have practiced [inaudible] 00:13:42.636 --> 00:13:45.796 align:middle through medical school so that you're skilled at it. 00:13:46.176 --> 00:13:50.166 align:middle It's like being shown how to use a stethoscope once; you know, it's not enough. 00:13:50.166 --> 00:13:54.746 align:middle You've actually got to listen to a lot of hearts and become skilled at doing it. 00:13:54.746 --> 00:13:58.026 align:middle The same thing is true here, but you don't have to do a full-blown PhD 00:13:58.026 --> 00:14:00.876 align:middle or master's program in how to use a stethoscope. 00:14:01.796 --> 00:14:04.746 align:middle But if you want to teach it and move the boundaries 00:14:04.746 --> 00:14:09.186 align:middle of evidence-based medicine, then I think much greater depth is needed. 00:14:09.186 --> 00:14:10.906 align:middle And we'll probably come on to discuss the sort 00:14:10.906 --> 00:14:14.126 align:middle of evolution that's occurring in evidence-based practice. 00:14:14.126 --> 00:14:15.546 align:middle There's still a lot of work to do. 00:14:15.546 --> 00:14:19.516 align:middle >> [Richard Smith:] Right, well you've heard lots of references to Dave Sackett. 00:14:19.516 --> 00:14:23.706 align:middle So I went to the Kilgore Trout Research Institute. 00:14:23.706 --> 00:14:27.416 align:middle Those of you that are Vonnegut fans will recognize Kilgore Trout. 00:14:27.576 --> 00:14:30.796 align:middle And I interviewed Dave for about an hour and three-quarters.
00:14:30.796 --> 00:14:34.536 align:middle And I think we will be, you know, putting that together into something 00:14:34.536 --> 00:14:36.206 align:middle that everybody will be able to see eventually. 00:14:36.506 --> 00:14:40.136 align:middle But one of the questions I asked him, he talked about how they'd first got 00:14:40.136 --> 00:14:41.926 align:middle into critical appraisal, as Gordon was saying. 00:14:41.926 --> 00:14:46.406 align:middle And I said what's the difference between critical appraisal and evidence-based medicine? 00:14:46.406 --> 00:14:47.956 align:middle And this is what Dave said. 00:14:49.026 --> 00:14:52.316 align:middle So, how would you say evidence-based medicine is different from critical appraisal? 00:14:52.626 --> 00:14:54.166 align:middle >> [Dave Sackett:] It goes beyond it. 00:14:54.226 --> 00:14:58.806 align:middle In other words, what it does is integrate the science 00:14:59.216 --> 00:15:03.286 align:middle and the literature with your best clinical skills. 00:15:03.286 --> 00:15:05.756 align:middle So you have to get the diagnosis right. 00:15:05.756 --> 00:15:05.926 align:middle >> [Richard Smith:] Right. 00:15:05.926 --> 00:15:10.906 align:middle >> [Dave Sackett:] But it also incorporates patient values, you know, so someone like Sharon Straus 00:15:10.906 --> 00:15:16.916 align:middle for example developed a bedside strategy for very quickly being able 00:15:16.916 --> 00:15:22.646 align:middle to tell a patient the likelihood that we're going to help versus harm you if we follow 00:15:22.646 --> 00:15:25.756 align:middle down this path, using patient values. 00:15:25.956 --> 00:15:30.836 align:middle The typical case being someone with nonvalvular atrial fibrillation at risk of a stroke, 00:15:31.116 --> 00:15:35.766 align:middle a small risk, but terrible if they had the stroke; on warfarin, 00:15:36.136 --> 00:15:38.556 align:middle pretty safe but you could have a bleed.
00:15:39.066 --> 00:15:42.596 align:middle And so we would get patients to weigh these two outcomes. 00:15:42.976 --> 00:15:46.356 align:middle A bleed, yeah that's bad but it's going to be over in a few weeks. 00:15:46.356 --> 00:15:48.066 align:middle A stroke, that's forever. 00:15:48.516 --> 00:15:52.906 align:middle And most of them would tend to see a stroke as about four or five times as bad as a bleed. 00:15:53.016 --> 00:15:53.306 align:middle >> [Richard Smith:] Yeah. 00:15:53.466 --> 00:15:56.406 align:middle >> [Dave Sackett:] We could integrate that with the number needed to treat and the number needed 00:15:56.406 --> 00:16:01.966 align:middle to harm, to come out with: we're about 11 times as likely to help versus harm you with this. 00:16:02.406 --> 00:16:06.746 align:middle >> [Richard Smith:] So Gordon, Dave put a huge emphasis there on, you know, patient values. 00:16:06.746 --> 00:16:11.446 align:middle But I think it's fair to say that when you first used the phrase evidence-based medicine, there 00:16:11.446 --> 00:16:13.856 align:middle wasn't such an emphasis on patient care. 00:16:13.856 --> 00:16:16.536 align:middle >> [Gordon Guyatt:] Yeah, if you look at the first publication, 00:16:16.536 --> 00:16:23.296 align:middle such as the 1992 JAMA publication that really introduced it to the broader world, 00:16:23.646 --> 00:16:28.476 align:middle you have to look hard to find any reference to it at all. 00:16:29.056 --> 00:16:35.436 align:middle And so at the beginning, the original series had been a readers' guide. 00:16:35.666 --> 00:16:39.136 align:middle We moved, in the series championed by Drummond, 00:16:39.306 --> 00:16:43.466 align:middle to a users' guide, getting physicians to use it in practice.
00:16:43.466 --> 00:16:49.166 align:middle And as we were doing this, and as we were practicing clinically with the emphasis of okay, 00:16:49.166 --> 00:16:54.176 align:middle we're considering course of action A versus B, 00:16:54.486 --> 00:16:57.216 align:middle what is going to happen to the outcomes of interest? 00:16:57.516 --> 00:17:01.746 align:middle And we would invariably find there are some good things and there are some bad things. 00:17:02.046 --> 00:17:03.816 align:middle And then how do you make the decision? 00:17:04.296 --> 00:17:09.606 align:middle And having done this repeatedly, it became evident that every time there were value 00:17:09.606 --> 00:17:11.396 align:middle and preference judgments going on. 00:17:11.666 --> 00:17:16.446 align:middle And then the next thing is whose values and preferences should they be; it becomes pretty evident 00:17:16.446 --> 00:17:20.746 align:middle as soon as you begin to think about it that it should be the patient's values and preferences. 00:17:20.976 --> 00:17:27.886 align:middle And so over the next five years that became more and more central, and now for at least, 00:17:27.886 --> 00:17:35.216 align:middle at least 15 years, values and preferences have been a core principle of evidence-based medicine. 00:17:35.496 --> 00:17:37.656 align:middle We talk about it as something of an irony. 00:17:37.956 --> 00:17:41.616 align:middle One of the core principles of evidence-based medicine is that evidence 00:17:41.616 --> 00:17:43.486 align:middle by itself never tells you what to do. 00:17:43.706 --> 00:17:47.046 align:middle It's always evidence in the context of values and preferences, 00:17:47.186 --> 00:17:50.846 align:middle which, as Dave was pointing out, has been a big emphasis. 00:17:51.086 --> 00:17:52.696 align:middle >> [Richard Smith:] But I think it's fair to say 00:17:52.696 --> 00:17:56.356 align:middle that initially there was quite a backlash against evidence-based medicine.
00:17:56.356 --> 00:18:00.676 align:middle It's cookbook medicine, it's all dreamt up by managers and insurance companies. 00:18:00.676 --> 00:18:04.026 align:middle It's all about statisticians and horrible people like that. 00:18:04.236 --> 00:18:05.726 align:middle It's not patient-centered. 00:18:05.726 --> 00:18:07.546 align:middle It's all about randomized trials. 00:18:07.546 --> 00:18:08.756 align:middle It's too reductionist. 00:18:09.176 --> 00:18:11.406 align:middle Tell us about your experience of that backlash. 00:18:11.406 --> 00:18:12.516 align:middle Did you experience it, Drummond? 00:18:13.536 --> 00:18:16.326 align:middle >> [Drummond Rennie:] Well of course, absolutely. 00:18:16.686 --> 00:18:25.386 align:middle And it seems to me so obvious; it is human that if you don't have the law on your side 00:18:25.626 --> 00:18:29.266 align:middle and you don't have the facts on your side, you attack, 00:18:30.216 --> 00:18:33.176 align:middle and you attack particularly with a mindless passion. 00:18:33.546 --> 00:18:36.316 align:middle And there's a tremendous amount of that going on. 00:18:37.116 --> 00:18:40.336 align:middle >> [Richard Smith:] Well also, I mean there was this sort of implication that everything 00:18:40.336 --> 00:18:42.526 align:middle that had gone before had not really paid any attention 00:18:42.526 --> 00:18:45.206 align:middle to evidence at all, which was clearly absurd. 00:18:45.206 --> 00:18:46.666 align:middle >> [Drummond Rennie:] Well of course it's absurd. 00:18:46.666 --> 00:18:53.066 align:middle I mean Tom Chalmers' meta-analysis was a perfect example of paying great attention 00:18:53.066 --> 00:18:56.816 align:middle to what had gone on before, far more actually 00:18:56.816 --> 00:18:59.766 align:middle than people who'd just looked at the latest trial.
00:19:01.126 --> 00:19:03.296 align:middle >> [Richard Smith:] So a question I want to ask you, Brian. 00:19:03.296 --> 00:19:07.126 align:middle There's a sense that actually it's just really hard 00:19:07.126 --> 00:19:12.616 align:middle to do this evidence-based medicine, especially the way it came about in its first incarnation, 00:19:12.616 --> 00:19:17.666 align:middle you know, every patient you're going to kind of respond to by going and searching the literature, 00:19:17.666 --> 00:19:21.296 align:middle etc. That was never going to work, was it? 00:19:21.356 --> 00:19:25.366 align:middle >> [Brian Haynes:] That was definitely part of this antibody response that people had 00:19:26.156 --> 00:19:28.936 align:middle to evidence-based medicine, that it was too much work. 00:19:30.246 --> 00:19:34.806 align:middle That even if we agreed with you in principle, in practice there's no way we could possibly do it. 00:19:35.556 --> 00:19:38.416 align:middle And right from the start we started to figure out other ways 00:19:38.416 --> 00:19:40.036 align:middle that we could simplify this process. 00:19:40.096 --> 00:19:45.266 align:middle Can we provide a way of defining which studies are more important? 00:19:46.066 --> 00:19:53.146 align:middle Can we provide those as information inputs to textbooks, to journals, to guidelines, 00:19:53.286 --> 00:19:57.216 align:middle to decision aids, computerized decision [inaudible] and so on. 00:19:57.296 --> 00:20:02.456 align:middle That's why I created the Health Information Research Unit, 00:20:02.896 --> 00:20:07.936 align:middle to try to figure out how to make this easy enough that people could actually do it. 00:20:08.846 --> 00:20:10.966 align:middle And we're still in that evolution. 00:20:11.436 --> 00:20:12.716 align:middle It's not perfected.
00:20:12.756 --> 00:20:15.426 align:middle But the resources that are available now, so that people don't have 00:20:15.426 --> 00:20:18.486 align:middle to do the primary critical appraisal themselves, are much, 00:20:18.486 --> 00:20:20.586 align:middle much better than they ever have been before. 00:20:21.016 --> 00:20:23.036 align:middle Whether that will be enough to move to the next step, 00:20:23.086 --> 00:20:26.506 align:middle where that can actually change their behavior in a timely fashion, 00:20:27.186 --> 00:20:31.916 align:middle that's the forefront right now, the frontier which we're trying to deal 00:20:32.006 --> 00:20:36.196 align:middle with in research endeavors called knowledge translation, 00:20:36.196 --> 00:20:40.796 align:middle implementation science, comparative effectiveness research and so on. 00:20:40.796 --> 00:20:43.106 align:middle >> [Richard Smith:] Gordon you're teaching evidence based medicine 00:20:43.106 --> 00:20:45.046 align:middle and you've kind of changed the way you teach it? 00:20:45.226 --> 00:20:50.436 align:middle >> [Gordon Guyatt:] We haven't changed the way we teach it but our targets have changed. 00:20:50.766 --> 00:20:57.046 align:middle So when I took over the internal medicine residency program in 1990, I had it in my head 00:20:57.046 --> 00:21:01.016 align:middle that at the end of residency training with us, 00:21:01.096 --> 00:21:05.516 align:middle individuals would be able to pick up a randomized trial, critically appraise it, 00:21:05.866 --> 00:21:10.146 align:middle assess it, understand the results very well. 00:21:10.306 --> 00:21:14.246 align:middle And they'd be able to do that for studies of prognosis or diagnosis 00:21:14.246 --> 00:21:17.056 align:middle or systematic reviews and meta-analyses. 
00:21:17.136 --> 00:21:22.516 align:middle And at the end of 7 years of running the residency training I had learned that very few 00:21:22.516 --> 00:21:25.916 align:middle of my graduates were actually able to do that. 00:21:26.336 --> 00:21:29.326 align:middle And so the target became different. 00:21:29.596 --> 00:21:36.896 align:middle The target became to have people appreciate the principles of what makes evidence trustworthy 00:21:37.206 --> 00:21:42.606 align:middle or not trustworthy, and be able to go to secondary sources of information 00:21:42.926 --> 00:21:47.406 align:middle that produced summaries of evidence that were trustworthy 00:21:47.566 --> 00:21:52.616 align:middle and identified the confidence that one could put in those. 00:21:52.616 --> 00:22:00.386 align:middle And we published in I believe 2002 a paper in the BMJ in which we identified 00:22:00.386 --> 00:22:03.476 align:middle that every clinician 00:22:03.636 --> 00:22:07.516 align:middle who is practicing evidence based medicine doesn't have to be an evidence based expert. 00:22:07.786 --> 00:22:13.406 align:middle They need to understand the principles and they need to be able to identify the appropriate resources 00:22:13.586 --> 00:22:17.906 align:middle that present processed evidence to them in a way that they can apply it 00:22:17.906 --> 00:22:22.596 align:middle to clinical care, understanding the underlying confidence in the evidence. 00:22:22.686 --> 00:22:29.716 align:middle And some of my colleagues, as we were talking about this on email beforehand, referred to this 00:22:29.716 --> 00:22:36.316 align:middle as evidence based capitulation, but in fact it is realistic, 00:22:36.546 --> 00:22:41.826 align:middle and that is what evidence based practice means for most clinicians. 
00:22:41.826 --> 00:22:49.216 align:middle And as Brian points out, the strategy is to get to the highest level of easily accessed, 00:22:49.296 --> 00:22:52.236 align:middle easily understood preprocessed evidence. 00:22:52.746 --> 00:22:55.606 align:middle >> [Richard Smith:] Teddy, you were telling us what I think was a very interesting story 00:22:55.606 --> 00:22:58.286 align:middle about your experience of teaching medical students. 00:22:58.286 --> 00:23:01.506 align:middle I don't know whether, you know, you found it kind of hard. 00:23:01.626 --> 00:23:02.876 align:middle Perhaps you could tell that story. 00:23:02.876 --> 00:23:04.346 align:middle I think it'll interest people. 00:23:04.796 --> 00:23:06.436 align:middle >> [Kay Dickersin:] That was another interesting story, 00:23:06.436 --> 00:23:08.816 align:middle where I was teaching a medical school class. 00:23:08.876 --> 00:23:15.536 align:middle I taught evidence based health care in medical school for 16 years. 00:23:15.696 --> 00:23:22.996 align:middle And we had gone through in the class screening, epidemiology, everything you need to know 00:23:22.996 --> 00:23:25.356 align:middle about being able to critically assess the literature. 00:23:25.356 --> 00:23:28.656 align:middle And so we had an empty spot, I forget why. 00:23:28.656 --> 00:23:33.056 align:middle And it was right around the time the breast cancer, the mammography guidelines came 00:23:33.056 --> 00:23:37.676 align:middle out for younger women, saying that really there wasn't enough evidence 00:23:37.676 --> 00:23:39.386 align:middle for younger women to have mammography. 00:23:39.436 --> 00:23:45.076 align:middle So we had a debate that day and I brought in the head of the breast service from the hospital 00:23:45.076 --> 00:23:47.676 align:middle at the university where I was teaching and the President 00:23:47.676 --> 00:23:51.296 align:middle of the National Breast Cancer Coalition, who was very evidence based. 
00:23:51.716 --> 00:23:56.016 align:middle The head of the breast service, however, came in wearing a white coat and a bowtie, 00:23:56.016 --> 00:24:00.506 align:middle he kept referring to Harvard where he had been before, had a stethoscope, 00:24:01.096 --> 00:24:05.026 align:middle and it really established his authority as a doctor. 00:24:05.026 --> 00:24:07.526 align:middle And anyway, so I sat in the front row. 00:24:07.526 --> 00:24:12.156 align:middle I couldn't see the audience behind me but the whole time I was thinking this was so fantastic, 00:24:12.156 --> 00:24:16.146 align:middle everything the two of them were saying was just what the students had learned. 00:24:16.556 --> 00:24:22.606 align:middle The consumer advocate was right on the money with evidence and, you know, lead time bias, 00:24:22.606 --> 00:24:27.016 align:middle etc., and he was saying well we know mammography is useful because before I was head 00:24:27.016 --> 00:24:29.226 align:middle of the breast service all these women died. 00:24:29.226 --> 00:24:32.816 align:middle And after I came in and instituted mammography everybody lived. 00:24:33.286 --> 00:24:35.106 align:middle And I said okay, they've got to get this. 00:24:35.426 --> 00:24:38.716 align:middle And then it was clear when it was finished that he had won the debate. 00:24:39.096 --> 00:24:42.256 align:middle And what I thought was perfect was wrong. 00:24:42.256 --> 00:24:47.446 align:middle And there was a historian also there in the front row and she said to me, and it woke me up, 00:24:47.906 --> 00:24:51.256 align:middle she said this just shows the importance of authority. 00:24:51.566 --> 00:24:56.576 align:middle They each established their authority in different ways and his was successful. 00:24:56.576 --> 00:25:01.516 align:middle He said I'm head of the breast service, he had on a white coat, I went to Harvard, I, 00:25:01.516 --> 00:25:03.436 align:middle you know, he had a lot of Harvard stories. 
00:25:03.826 --> 00:25:09.246 align:middle And she came in, a lawyer and a consumer advocate, and did not establish her authority 00:25:09.246 --> 00:25:12.016 align:middle in a way that made sense to the medical students. 00:25:12.416 --> 00:25:16.986 align:middle And so that's one of the reasons I've raised issues about experts and so forth. 00:25:16.986 --> 00:25:21.066 align:middle I've become very interested in authority and transmission of the evidence. 00:25:21.066 --> 00:25:23.126 align:middle >> [Richard Smith:] Did you not have any mini Brian Haynes 00:25:23.126 --> 00:25:25.546 align:middle in the audience to challenge this guy? 00:25:26.386 --> 00:25:28.066 align:middle >> [Kay Dickersin:] No we didn't actually. 00:25:28.256 --> 00:25:29.856 align:middle The students were quite good. 00:25:29.856 --> 00:25:33.016 align:middle As a matter of fact there were a bunch of advocates there and I heard from one 00:25:33.016 --> 00:25:36.946 align:middle of them sitting behind me that one of the students was doing texting or whatever. 00:25:37.386 --> 00:25:40.906 align:middle And she went, pay attention, what's being said is correct. 00:25:41.566 --> 00:25:45.216 align:middle And so there were people in the audience who were trying to get them to listen 00:25:45.216 --> 00:25:48.356 align:middle to the good evidence, but it didn't work. 00:25:48.906 --> 00:25:51.156 align:middle >> [Richard Smith:] So does that story make you despair Paul? 00:25:51.736 --> 00:25:53.886 align:middle >> [Paul Glasziou:] It is slightly despairing. 00:25:53.886 --> 00:25:58.086 align:middle I think we need more people like Brian who have the effrontery to ask what the evidence is. 00:25:58.726 --> 00:26:00.906 align:middle But Brian grew up that way. 00:26:00.906 --> 00:26:03.086 align:middle I don't know what it is that made him that way. 
00:26:03.606 --> 00:26:07.796 align:middle We need to know that though, because [inaudible] we need all of Teddy's class 00:26:07.796 --> 00:26:11.326 align:middle to be more skeptical and to think about the rules 00:26:11.326 --> 00:26:14.476 align:middle of evidence rather than being swayed by authority. 00:26:14.906 --> 00:26:16.926 align:middle And to realize that that's what persuaded them. 00:26:16.926 --> 00:26:20.476 align:middle In fact it would have been good to have extended that class and got them to think 00:26:20.476 --> 00:26:23.606 align:middle about what it was that persuaded them one way or the other. 00:26:23.606 --> 00:26:26.536 align:middle Was it an evidence thing or was it something to do with authority? 00:26:27.446 --> 00:26:30.246 align:middle And we don't know how to do that at the moment, at least I don't. 00:26:30.246 --> 00:26:34.916 align:middle >> I have an alternative solution to the problem, which I'm trying to work on, which is 00:26:34.916 --> 00:26:37.716 align:middle to get all the authorities to be evidence based. 00:26:37.896 --> 00:26:41.816 align:middle And I'm not being, I'm not being in the least facetious. 00:26:41.866 --> 00:26:47.606 align:middle So what we're trying to do now is, for instance, educate all the folks who are on guideline panels. 00:26:47.846 --> 00:26:51.696 align:middle And that's a big initiative within evidence based medicine. 00:26:52.136 --> 00:26:56.276 align:middle >> [Paul Glasziou:] And I think at the same time, unless you've got an educated public, both 00:26:56.276 --> 00:27:00.806 align:middle in terms of the clinicians and perhaps the wider public as well, they won't appreciate 00:27:00.806 --> 00:27:02.906 align:middle that they need evidence based authority. 00:27:02.906 --> 00:27:04.486 align:middle >> I completely agree that we... 
00:27:04.486 --> 00:27:06.456 align:middle >> [Richard Smith:] So wait a minute, how are you going to educate the public 00:27:06.456 --> 00:27:08.286 align:middle when you can't educate Harvard professors? 00:27:08.706 --> 00:27:13.286 align:middle >> [Paul Glasziou:] I'm not, I am actually optimistic about this Richard, because when I was 00:27:13.286 --> 00:27:15.146 align:middle in Oxford I was getting depressed. 00:27:15.146 --> 00:27:17.286 align:middle I thought the evidence based movement had gotten nowhere. 00:27:17.286 --> 00:27:23.066 align:middle And Tim Lancaster, who's one of the medical teachers there and who ran the tobacco 00:27:23.066 --> 00:27:28.066 align:middle and addiction Cochrane review group, said to me look Paul, the conversation has changed. 00:27:28.446 --> 00:27:32.616 align:middle Since the term evidence based medicine was invented there's been this big shift. 00:27:32.616 --> 00:27:37.746 align:middle You now can't talk about a policy issue like that without bringing in the evidence. 00:27:38.036 --> 00:27:41.736 align:middle Whereas before that it was conceivable that you could have the conversation 00:27:41.736 --> 00:27:43.586 align:middle without it, with just who said what. 00:27:44.376 --> 00:27:46.616 align:middle So the conversation has certainly changed. 00:27:46.616 --> 00:27:48.036 align:middle There's been a big shift there. 00:27:48.666 --> 00:27:51.636 align:middle I think that's a sort of superficial change though, if you like; 00:27:51.636 --> 00:27:56.066 align:middle we actually need a deeper understanding and skepticism built into our medical students 00:27:56.436 --> 00:27:59.536 align:middle so that they don't obey the white coat from Harvard. 00:27:59.986 --> 00:28:01.226 align:middle >> [Richard Smith:] So let me share two things 00:28:01.226 --> 00:28:03.786 align:middle that really bother me these days about evidence based medicine. 
00:28:03.786 --> 00:28:08.296 align:middle One is that increasingly healthcare is about people 00:28:08.296 --> 00:28:11.046 align:middle with multiple problems, not single problems. 00:28:11.046 --> 00:28:15.546 align:middle And yet the evidence is often not very good about those people with multiple problems. 00:28:15.896 --> 00:28:20.546 align:middle Plus we realize, as Teddy was saying, evermore that so much 00:28:20.546 --> 00:28:22.626 align:middle of the evidence just isn't published. 00:28:22.626 --> 00:28:24.916 align:middle And what is published is probably biased. 00:28:25.356 --> 00:28:30.986 align:middle So we're trying to apply biased irrelevant evidence to highly complex problems. 00:28:30.986 --> 00:28:32.846 align:middle And this seems a little tricky to me. 00:28:33.396 --> 00:28:38.616 align:middle >> [Paul Glasziou:] As a general practitioner I think I'll have an attempt at that. 00:28:38.616 --> 00:28:43.636 align:middle First of all I think you're right that some patients are excluded from trials based 00:28:43.636 --> 00:28:46.326 align:middle on comorbidities, but not the common comorbidities. 00:28:46.326 --> 00:28:51.966 align:middle If you look at the statin or the antihypertensive agent trials, the asthmatics, the depressives, 00:28:51.966 --> 00:28:57.386 align:middle etc., people with various conditions were generally included in those trials. 00:28:57.626 --> 00:29:01.096 align:middle A few things are excluded but not most of the comorbidities. 00:29:01.486 --> 00:29:08.146 align:middle So I think it's a misconception that we can only find trials on people with a single condition. 00:29:08.856 --> 00:29:11.036 align:middle So I don't think the evidence is a problem in that case. 00:29:11.546 --> 00:29:17.716 align:middle I think where the problem comes in is trying to deal with patients with multiple conditions, 00:29:18.326 --> 00:29:22.146 align:middle and a lot of it's about not the applicability but the prioritization. 
00:29:22.826 --> 00:29:26.686 align:middle There's a lovely paper in JAMA by Cynthia Boyd where she took a typical patient, 00:29:26.726 --> 00:29:31.076 align:middle I think it was a 55 year old woman, with what seemed like a reasonable set of conditions, 00:29:31.076 --> 00:29:35.886 align:middle and said well, what would the guidelines recommend for this particular patient? 00:29:35.886 --> 00:29:38.116 align:middle And you come up with this very extensive list 00:29:38.116 --> 00:29:40.566 align:middle that both the clinician and the patient had to do. 00:29:40.976 --> 00:29:43.566 align:middle You read the list and you say this is not possible. 00:29:43.566 --> 00:29:45.456 align:middle You can't deal with all of these things. 00:29:46.826 --> 00:29:48.206 align:middle So two problems there. 00:29:48.206 --> 00:29:52.736 align:middle One is that the guidelines themselves often focus on single conditions, 00:29:53.936 --> 00:29:57.846 align:middle and that makes it very difficult for me as a general practitioner dealing 00:29:57.846 --> 00:30:01.696 align:middle with somebody who's actually got five conditions to prioritize things. 00:30:02.276 --> 00:30:04.906 align:middle The second problem is how do we go about that prioritization? 00:30:04.906 --> 00:30:07.476 align:middle Clearly we can't listen to all of those guidelines. 00:30:07.806 --> 00:30:09.916 align:middle We have to have the conversation with the patient 00:30:09.946 --> 00:30:12.836 align:middle about what's the important thing to them. 00:30:13.376 --> 00:30:14.676 align:middle What are their priorities? 00:30:14.676 --> 00:30:15.506 align:middle Get an idea of that. 00:30:15.506 --> 00:30:18.026 align:middle And then think about well, how would the evidence that I know 00:30:18.336 --> 00:30:21.306 align:middle about how I can manage each of those things help? 
00:30:22.256 --> 00:30:25.926 align:middle And that's part of this shared decision making process. I think there's a strengthening 00:30:25.926 --> 00:30:29.896 align:middle within evidence based medicine, a very strong dialogue going 00:30:29.896 --> 00:30:34.306 align:middle on between the evidence based medicine group and the shared decision making groups, 00:30:34.306 --> 00:30:38.436 align:middle with Victor Montori, one of Gordon's students, sort of leading the way there. 00:30:38.436 --> 00:30:41.346 align:middle >> [Richard Smith:] Yeah, we're going to come on to the future in a minute. 00:30:41.666 --> 00:30:45.446 align:middle Teddy, what about this problem that a lot of the evidence just isn't there? 00:30:47.176 --> 00:30:49.016 align:middle >> [Kay Dickersin:] Well you're absolutely right. 00:30:49.016 --> 00:30:54.126 align:middle There's failure to publish and so we don't know where it's sitting. 00:30:54.126 --> 00:30:59.376 align:middle It could be sitting in somebody's file drawer, it could be a decision, 00:30:59.376 --> 00:31:01.426 align:middle a conscious decision not to publish. 00:31:01.896 --> 00:31:04.206 align:middle There could be outcomes that are missing. 00:31:04.206 --> 00:31:09.406 align:middle For example perhaps just the outcomes that made the drug 00:31:09.406 --> 00:31:11.636 align:middle or the intervention look positive were published. 00:31:11.766 --> 00:31:14.086 align:middle So there's a lot we don't know about. 00:31:14.086 --> 00:31:18.526 align:middle And I think it's increasingly worrisome, all that's sitting 00:31:18.526 --> 00:31:20.306 align:middle in file drawers and not published. 00:31:20.306 --> 00:31:25.416 align:middle And so it does raise the question, what is the evidence? 00:31:25.416 --> 00:31:27.406 align:middle Is there some evidence we don't know about? 00:31:27.406 --> 00:31:29.566 align:middle Can we rely on what we do know about? 
00:31:30.046 --> 00:31:32.876 align:middle And those two things make it worrisome, no question about it. 00:31:33.246 --> 00:31:34.126 align:middle >> [Richard Smith:] So that's the problem. 00:31:34.126 --> 00:31:34.876 align:middle What's the solution? 00:31:35.446 --> 00:31:38.806 align:middle >> [Drummond Rennie:] Take the money out of the system. 00:31:39.356 --> 00:31:46.176 align:middle Give a contract to Northwestern and say okay, you win this contract to test drug A against drug B. 00:31:46.536 --> 00:31:52.416 align:middle By the way, that might actually be a relevant and interesting test to do. 00:31:53.856 --> 00:31:58.006 align:middle You have to get the money out of the system to make the system credible. 00:31:58.296 --> 00:32:03.596 align:middle Because the system is incredible, in the one sense of that term. 00:32:05.246 --> 00:32:06.466 align:middle >> [Richard Smith:] Is this going to happen Drummond? 00:32:06.956 --> 00:32:07.836 align:middle >> [Drummond Rennie:] I don't know. 00:32:08.146 --> 00:32:16.786 align:middle The other thing is that Cochrane has to go on, for example, because one of the chief things 00:32:16.856 --> 00:32:19.966 align:middle that Cochrane has been showing, in a systematic 00:32:19.966 --> 00:32:25.526 align:middle and enormous way, is for how many conditions there is no evidence whatsoever 00:32:26.356 --> 00:32:29.186 align:middle for any treatment either way. 00:32:30.286 --> 00:32:33.566 align:middle Or there's no evidence for any effective treatment. 00:32:33.746 --> 00:32:36.586 align:middle Now that may be depressing but it's realistic. 00:32:37.656 --> 00:32:42.036 align:middle And that should be an enormous opportunity for everybody out here in this room. 00:32:42.036 --> 00:32:42.286 align:middle >> [Richard Smith:] Okay. 00:32:43.246 --> 00:32:44.726 align:middle Paul you wanted to make a point. 00:32:44.726 --> 00:32:48.056 align:middle >> [Paul Glasziou:] Well I was going to say I like Drummond's idea. 
00:32:48.286 --> 00:32:50.756 align:middle It's happening to a small extent in the UK 00:32:50.756 --> 00:32:53.906 align:middle within the National Institute for Health Research program. 00:32:54.436 --> 00:32:59.916 align:middle So the health technology assessment program commissions trials on important uncertainties 00:32:59.916 --> 00:33:04.086 align:middle that come up through Cochrane reviews or through the [inaudible] guidelines process. 00:33:04.526 --> 00:33:07.896 align:middle They've got about 100 million pounds a year being spent on this. 00:33:07.896 --> 00:33:09.176 align:middle They commission the trial. 00:33:09.546 --> 00:33:12.306 align:middle This has been going on for about 15 years now, I think. 00:33:12.946 --> 00:33:16.986 align:middle Their publication rate at the moment stands at 98 percent. 00:33:17.836 --> 00:33:20.996 align:middle If anyone can beat that I'd dearly like to know about it. 00:33:21.506 --> 00:33:24.006 align:middle But they have a number of tricks in doing that. 00:33:24.006 --> 00:33:26.236 align:middle One is to have a journal that everyone in the HTA program can publish in, 00:33:26.236 --> 00:33:29.046 align:middle the health technology assessment reports. 00:33:29.456 --> 00:33:31.596 align:middle They also withhold 10 percent of the funding 00:33:31.596 --> 00:33:34.886 align:middle until you've not just published something or given in the report 00:33:34.946 --> 00:33:36.996 align:middle but actually made it publicly available. 00:33:37.416 --> 00:33:38.986 align:middle Until then you can't get that last 10 percent. 00:33:39.506 --> 00:33:42.896 align:middle But Rory Mill [assumed spelling], who runs that program, tells me that 00:33:43.236 --> 00:33:47.626 align:middle that's a nice stick, or carrot if you like, to hold over people. 00:33:47.686 --> 00:33:51.526 align:middle But it's actually then checking with people and problem solving. 
00:33:51.526 --> 00:33:54.326 align:middle And going, you know, why haven't you published so far? 00:33:54.886 --> 00:33:59.056 align:middle You've got the carrot and stick there, but it's actually monitoring it and making sure 00:33:59.056 --> 00:34:01.326 align:middle that it actually gets published which makes the big difference. 00:34:01.326 --> 00:34:05.186 align:middle And at the moment most funders don't; they give the money out and then they don't seem to care 00:34:05.186 --> 00:34:07.176 align:middle that the main results get published. 00:34:08.016 --> 00:34:12.256 align:middle It's not just pharmaceutical companies sitting on this stuff and hiding it, 00:34:12.666 --> 00:34:18.376 align:middle it's also, as Teddy was saying, other people losing interest in their trial 00:34:18.376 --> 00:34:20.786 align:middle after they've finished it and then not publishing. 00:34:20.866 --> 00:34:23.626 align:middle So it happens almost as much in the public sector 00:34:23.756 --> 00:34:25.406 align:middle as it does in the commercial sector. 00:34:25.816 --> 00:34:27.476 align:middle >> [Richard Smith:] Good, well before we come to you, 00:34:27.476 --> 00:34:32.316 align:middle we're just going to have a little conversation about the future, and I asked both Iain Chalmers 00:34:32.316 --> 00:34:37.976 align:middle and Muir Gray to look forward and think what was important to them about the future. 00:34:38.076 --> 00:34:42.336 align:middle So let's watch those clips and then I'll ask the panel to say something 00:34:42.336 --> 00:34:43.756 align:middle about what they see in the future. 00:34:48.456 --> 00:34:54.906 align:middle >> [Iain Chalmers:] I still think that it's going to be important for it to promote 00:34:55.646 --> 00:34:58.306 align:middle that idea of the need to find out what we know already, 00:34:58.446 --> 00:35:00.946 align:middle because that's actually where I started out, in Gaza. 
00:35:00.946 --> 00:35:04.226 align:middle What was known already was not available to me. 00:35:04.226 --> 00:35:06.066 align:middle And as a consequence my patients suffered. 00:35:06.186 --> 00:35:09.066 align:middle So I come full circle around to that. 00:35:09.066 --> 00:35:10.896 align:middle And we haven't got anywhere near that yet. 00:35:11.256 --> 00:35:13.896 align:middle Things have improved but it's still a long way away. 00:35:18.246 --> 00:35:22.226 align:middle >> [Muir Gray:] The paradigm has shifted now and the first big job is over. 00:35:22.226 --> 00:35:25.526 align:middle Now people know when they make a proposition, 00:35:25.526 --> 00:35:28.436 align:middle as the fancy philosophers [inaudible], when they make a statement: 00:35:28.996 --> 00:35:30.336 align:middle what is the basis of that statement? 00:35:30.476 --> 00:35:31.656 align:middle Is it based on evidence? 00:35:31.656 --> 00:35:35.276 align:middle And if so, what evidence, and how secure are you in that? 00:35:35.276 --> 00:35:39.186 align:middle So we are now in a world in which people are clear 00:35:39.186 --> 00:35:42.736 align:middle that knowledge has a quality as well as a quantity. 00:35:42.736 --> 00:35:46.796 align:middle So we can't relax about what needs to go 00:35:46.796 --> 00:35:49.416 align:middle on to make the knowledge better, and that's one thing. 00:35:50.126 --> 00:35:57.766 align:middle But I still think with evidence based medicine there is also a problem in healthcare, 00:35:57.766 --> 00:36:02.476 align:middle that people have quite a limited attention span, I mean clever people, powerful people. 00:36:02.556 --> 00:36:04.986 align:middle So they think oh, we've done evidence based medicine. 
00:36:05.096 --> 00:36:13.806 align:middle So I think personalized is now a term I'm using more, because actually 00:36:13.806 --> 00:36:19.426 align:middle in reading the original definition we put in the BMJ, you know, it's not only evidence, 00:36:19.426 --> 00:36:23.916 align:middle it's the more thoughtful identification and compassionate use 00:36:23.916 --> 00:36:27.636 align:middle of individual patients' predicaments, rights and preferences in making a clinical decision. 00:36:27.636 --> 00:36:28.396 align:middle >> [Richard Smith:] And genes. 00:36:28.536 --> 00:36:28.836 align:middle >> [Muir Gray:] Yeah. 00:36:28.876 --> 00:36:31.086 align:middle Now we come to what's called stratified medicine. 00:36:31.086 --> 00:36:36.716 align:middle But the same thing is, what is the evidence that you're claiming [inaudible]. 00:36:36.716 --> 00:36:38.566 align:middle Now I'm still on the case. 00:36:38.766 --> 00:36:39.566 align:middle We're not finished yet. 00:36:39.566 --> 00:36:44.036 align:middle It's very clear now: this is the century of the patient. 00:36:44.756 --> 00:36:46.896 align:middle The last century was the century of the doctor. 00:36:46.896 --> 00:36:48.296 align:middle This is the century of the patient. 00:36:48.836 --> 00:36:50.636 align:middle And we now have the knowledge. 00:36:51.096 --> 00:36:54.416 align:middle We now have the technology to deliver it. 00:36:55.066 --> 00:36:56.446 align:middle >> [Richard Smith:] So Gordon, what do you see for the future? 00:36:56.446 --> 00:36:59.456 align:middle There's Iain saying we still don't know what we know. 00:36:59.766 --> 00:37:04.156 align:middle And there's Muir saying this is the century of the patient, forget doctors. 00:37:04.456 --> 00:37:09.286 align:middle >> [Gordon Guyatt:] Well, in terms of summarizing evidence, the industry goes on. 
00:37:09.676 --> 00:37:14.506 align:middle Cochrane may eventually get to its goal of covering all the randomized trials 00:37:14.506 --> 00:37:16.216 align:middle and move on to other sorts of studies. 00:37:16.526 --> 00:37:19.486 align:middle So I think we've gone a long way as far as that's concerned. 00:37:19.856 --> 00:37:24.436 align:middle The challenge now is in terms of getting the information out to people 00:37:24.736 --> 00:37:30.386 align:middle and in being more effective, as we've talked about, with preprocessed information. 00:37:30.546 --> 00:37:31.616 align:middle >> [Richard Smith:] And you feel optimistic? 00:37:31.826 --> 00:37:34.746 align:middle >> [Gordon Guyatt:] Oh I feel extremely optimistic. 00:37:34.826 --> 00:37:41.376 align:middle As Muir Gray said at the end, we now have the technology, we have the knowledge, 00:37:41.636 --> 00:37:45.116 align:middle and we've learned a lot very recently. 00:37:45.376 --> 00:37:49.746 align:middle I was at the guidelines international meeting with [inaudible] and his colleagues, 00:37:50.026 --> 00:37:55.656 align:middle where we were sitting down with the people who know about the evidence, the clinicians who know 00:37:55.656 --> 00:38:00.996 align:middle about the flow, the information technology people who know how to 00:38:00.996 --> 00:38:06.056 align:middle get the electronic systems going, and the designers who know how to make it work for patients. 00:38:06.096 --> 00:38:10.326 align:middle And, as I say, where before we would just produce these things. 00:38:10.326 --> 00:38:11.316 align:middle They looked good to us. 00:38:11.616 --> 00:38:18.176 align:middle We're now doing tons of user testing to ensure that in fact the products we put 00:38:18.296 --> 00:38:20.496 align:middle out are presented in a way that people can use them. 00:38:20.496 --> 00:38:21.726 align:middle So I'm very optimistic. 00:38:21.936 --> 00:38:23.046 align:middle >> [Richard Smith:] Okay, Teddy. 
00:38:24.136 --> 00:38:26.936 align:middle >> [Kay Dickersin:] Well I have some wishes 00:38:26.936 --> 00:38:30.616 align:middle for how the future would go that may be a little far out. 00:38:30.686 --> 00:38:34.026 align:middle So I'll give the far out ones first, because I really do wish they'd happen. 00:38:34.096 --> 00:38:37.906 align:middle First of all I think we should change the academic reward system. 00:38:37.906 --> 00:38:41.876 align:middle Because I think the way it is now we're publishing a lot of junk. 00:38:41.906 --> 00:38:44.196 align:middle People are doing studies that aren't very good. 00:38:44.646 --> 00:38:48.856 align:middle And there are all sorts of rewards in there that keep us 00:38:48.976 --> 00:38:52.426 align:middle from being evidence based in what we produce and use. 00:38:53.286 --> 00:38:58.866 align:middle The second thing I'd like to see done is the medical literature cleaned up, with fewer journals 00:38:59.316 --> 00:39:02.266 align:middle and publishing only good research. 00:39:02.476 --> 00:39:07.036 align:middle And Drummond thinks I'm crazy, because you don't have good research unless you have bad research. 00:39:07.036 --> 00:39:09.696 align:middle But I'd like to have just good research published. 00:39:10.286 --> 00:39:15.726 align:middle And then the third thing, which I think is more doable, is related 00:39:15.726 --> 00:39:17.486 align:middle to learning more about safety. 00:39:17.486 --> 00:39:20.416 align:middle I don't think we've been very good about learning about safety 00:39:20.416 --> 00:39:22.156 align:middle and the harms of our interventions. 
00:39:22.296 --> 00:39:26.876 align:middle And one of the reasons I don't have confidence in the harms literature is 00:39:26.966 --> 00:39:31.546 align:middle that we haven't done all the methodologic research that all of us and all of you 00:39:31.546 --> 00:39:36.106 align:middle and others in the world have done to say what's a good trial, 00:39:36.106 --> 00:39:38.916 align:middle what's reporting bias, how do you find a good trial? 00:39:39.466 --> 00:39:43.666 align:middle We haven't done any of that work really for observational studies. 00:39:43.736 --> 00:39:46.016 align:middle Brian's done a little bit about how to find it. 00:39:46.456 --> 00:39:52.646 align:middle But there's far less energy going into observational research about harms for example. 00:39:53.116 --> 00:39:56.606 align:middle And I'd like to see a real investment there because it doesn't make sense 00:39:56.676 --> 00:40:00.296 align:middle that we're just looking at effectiveness and not harms with the same emphasis. 00:40:00.396 --> 00:40:01.716 align:middle >> [Richard Smith:] We're only half the picture. 00:40:01.716 --> 00:40:02.966 align:middle There's a challenge to people there. 00:40:03.096 --> 00:40:03.616 align:middle So Drummond? 00:40:04.676 --> 00:40:06.866 align:middle >> [Drummond Rennie:] How it works is like this. 00:40:07.436 --> 00:40:13.146 align:middle What should happen is that we only publish really good stuff. 00:40:14.376 --> 00:40:20.526 align:middle But the really good stuff you can show on economic principles, [inaudible] and others, 00:40:21.256 --> 00:40:30.276 align:middle that it cannot exist without this vast mass of other not quite 00:40:30.276 --> 00:40:38.376 align:middle such good scientists doing not quite such good work forming, making the community 00:40:38.376 --> 00:40:42.136 align:middle of science beavering away at that. 
00:40:42.486 --> 00:40:48.056 align:middle And then the system is full of noise and the noise you're hearing is 00:40:48.056 --> 00:40:53.206 align:middle of scientists working together, educating themselves and trying to get better. 00:40:54.116 --> 00:40:58.276 align:middle I think that's the better way of looking at it as opposed to saying 00:40:58.276 --> 00:41:02.026 align:middle as I used to say, who wrote this junk? 00:41:02.536 --> 00:41:06.056 align:middle >> [Richard Smith:] Hurrah for the mediocre. 00:41:07.426 --> 00:41:08.926 align:middle So, Brian? 00:41:09.476 --> 00:41:14.766 align:middle >> [Brian Haynes:] Well my mission for the last 10, 15 years has been to try to provide a cool, 00:41:15.396 --> 00:41:21.706 align:middle clear stream of high quality evidence that can be used by various resources to try to make sure 00:41:21.706 --> 00:41:25.776 align:middle that they're feeding on work that they don't have to create themselves. 00:41:25.886 --> 00:41:27.786 align:middle There's no point in everybody trying to do this. 00:41:27.856 --> 00:41:30.226 align:middle So we feed into textbooks. 00:41:30.226 --> 00:41:32.476 align:middle We feed into guidelines. 00:41:32.526 --> 00:41:36.366 align:middle We feed into systematic reviews, clinical decision aids. 00:41:37.166 --> 00:41:41.316 align:middle Try to just make it so cheap to be able to access current best evidence 00:41:41.426 --> 00:41:44.576 align:middle that people will have that in hand when they try to make their resources. 00:41:45.686 --> 00:41:46.546 align:middle I think that's not enough. 00:41:47.086 --> 00:41:51.336 align:middle And I think that the next phase of things has to deal with some knotty issues. 00:41:51.476 --> 00:41:52.756 align:middle >> [Richard Smith:] Naughty, not knotty? 00:41:53.126 --> 00:41:57.256 align:middle >> [Brian Haynes:] Not naughty, k n o t t y issues. 00:41:58.126 --> 00:42:00.126 align:middle They might be naughty as well. 
00:42:01.126 --> 00:42:02.626 align:middle >> [Richard Smith:] Well then [inaudible] attention to them. 00:42:02.626 --> 00:42:08.596 align:middle >> [Brian Haynes:] Related to the fact that there are no teeth in continuing education programs 00:42:08.596 --> 00:42:12.546 align:middle around the world so that practitioners don't have to learn new things or can't figure 00:42:12.546 --> 00:42:18.846 align:middle out ways to do new things after medical school which is why we still have a generation effect 00:42:18.846 --> 00:42:20.306 align:middle in new evidence getting into practice. 00:42:20.306 --> 00:42:22.326 align:middle We have to wait for the old doctors to die off. 00:42:23.566 --> 00:42:27.656 align:middle And second we have a problem at the patient level with behavior change 00:42:27.716 --> 00:42:31.136 align:middle because if they're going to take advantage of current best treatments, 00:42:31.676 --> 00:42:33.686 align:middle they need to change the way that they're doing things. 00:42:33.686 --> 00:42:37.866 align:middle And we don't, we need better ways to help them do that than we have at the present time. 00:42:37.866 --> 00:42:41.786 align:middle So I see a big research agenda here if we're ever going to have the great traction we want. 00:42:42.436 --> 00:42:46.016 align:middle The areas are called knowledge translation research, implementation science, 00:42:47.096 --> 00:42:49.436 align:middle those are the areas we need to invest in now. 00:42:49.436 --> 00:42:55.016 align:middle And we're starting to see the payoff from lobbying for that through the funding agencies. 00:42:55.016 --> 00:42:57.176 align:middle We're now getting much more funds into those areas. 00:42:57.206 --> 00:43:01.106 align:middle We've got a new crop of researchers who have been trained in how 00:43:01.106 --> 00:43:03.056 align:middle to do that kind of complex research. 
00:43:03.746 --> 00:43:06.496 align:middle So unfortunately though we'll have to wait for another era 00:43:06.586 --> 00:43:10.076 align:middle to pass before we get that figured out. 00:43:10.076 --> 00:43:12.946 align:middle >> [Richard Smith:] Where there's death, there's hope. 00:43:12.946 --> 00:43:16.316 align:middle >> [Brian Haynes:] Right, so [inaudible] are in place but there's still work to be done in terms 00:43:16.316 --> 00:43:18.556 align:middle of understanding the barriers to implementation. 00:43:19.736 --> 00:43:20.266 align:middle >> [Richard Smith:] Good, thank you. 00:43:20.416 --> 00:43:20.636 align:middle Paul? 00:43:20.636 --> 00:43:25.166 align:middle >> [Paul Glasziou:] Well first of all to say Richard I don't think there's any one thing. 00:43:25.166 --> 00:43:32.156 align:middle I think we actually need a systems approach where we map out the whole process that begins 00:43:32.156 --> 00:43:40.216 align:middle with a clinical uncertainty, a piece of research gets done, synthesized, published, disseminated, 00:43:40.366 --> 00:43:44.706 align:middle implemented and what happens in the consulting room and beyond the consulting room, 00:43:44.706 --> 00:43:48.626 align:middle what the patient actually does with it to map out their entire pathway 00:43:48.626 --> 00:43:52.576 align:middle and understand what goes right and wrong within that is essential 00:43:52.576 --> 00:43:53.576 align:middle if we're going to improve things. 00:43:53.576 --> 00:43:56.636 align:middle And that includes everything that everyone's said along here. 00:43:57.046 --> 00:44:01.216 align:middle So I'm going to say I've got a wish list of two things that I think are crucial in that. 00:44:01.786 --> 00:44:04.366 align:middle But that it's that whole pathway that needs fixing. 
00:44:05.106 --> 00:44:12.096 align:middle My two wish things would be the first one is that we're too slow at doing systematic reviews, 00:44:12.096 --> 00:44:14.766 align:middle the process of integrating research. 00:44:14.766 --> 00:44:17.266 align:middle It takes about two years to do a systematic review. 00:44:17.266 --> 00:44:20.856 align:middle By that time more trials have been published and it's just too slow. 00:44:21.576 --> 00:44:25.576 align:middle We need to get that down to two weeks or two days or even two minutes. 00:44:25.956 --> 00:44:31.346 align:middle And it's feasible if we work on it like they've done with the human genome project which took, 00:44:31.346 --> 00:44:35.396 align:middle I don't know, I've forgotten how many years it took for that first human genome sequencing. 00:44:35.856 --> 00:44:37.566 align:middle But now it's just much easier to do. 00:44:37.616 --> 00:44:39.036 align:middle The costs have been plummeting. 00:44:39.526 --> 00:44:45.896 align:middle We need to invest the same effort, funds, technology, brains into working out how 00:44:45.896 --> 00:44:49.026 align:middle to do the same thing with integrating our research literature. 00:44:49.516 --> 00:44:53.736 align:middle So we can do those systematic reviews much more rapidly and it is feasible. 00:44:54.756 --> 00:44:58.416 align:middle That's one wish list that I think underlies this whole process. 00:44:59.316 --> 00:45:04.746 align:middle The second for me is that a lot of the work that's been done has been on pharmaceuticals. 00:45:04.746 --> 00:45:10.406 align:middle I'd like to see the same amount of work go into non-pharmaceutical interventions like, you know, 00:45:10.406 --> 00:45:16.566 align:middle exercise for heart failure or the Mediterranean diet, etc. 
We don't have those things 00:45:16.566 --> 00:45:20.046 align:middle as readily available and so we're working on a thing called the handbook 00:45:20.046 --> 00:45:22.846 align:middle of non-drug interventions which is a [inaudible] 00:45:22.996 --> 00:45:27.046 align:middle and non-pharmaceuticals, a non [inaudible] if you like. 00:45:27.396 --> 00:45:32.956 align:middle And we need that, nutrition and physiotherapy and the things that I do in primary care. 00:45:33.616 --> 00:45:36.556 align:middle But it again needs to be underpinned by all the good evidence. 00:45:36.586 --> 00:45:40.846 align:middle But there's an extra layer of problem in that getting the details of those interventions, 00:45:41.276 --> 00:45:43.896 align:middle getting sufficient information to be able to implement 00:45:44.086 --> 00:45:47.226 align:middle in practice is much harder than the pharmaceuticals. 00:45:47.876 --> 00:45:50.556 align:middle But the effects can be just as large if not greater for some 00:45:50.556 --> 00:45:52.476 align:middle of the non-pharmaceutical interventions. 00:45:52.686 --> 00:45:53.116 align:middle >> [Richard Smith:] Hear, hear. 00:45:53.546 --> 00:45:56.336 align:middle Good, well I want to thank all of our speakers. 00:45:56.336 --> 00:45:57.946 align:middle I think it's been extraordinarily interesting. 00:45:57.946 --> 00:46:01.816 align:middle I thought I knew a fair bit about this but actually I've learned a lot. 00:46:01.816 --> 00:46:04.726 align:middle We've looked at the origins of evidence based medicine. 00:46:04.726 --> 00:46:10.286 align:middle But what I think has become very clear at the end here is just what a long way we have to go. 00:46:10.546 --> 00:46:14.266 align:middle So if you're a young student, wow, what a prospect you've got. 00:46:14.476 --> 00:46:15.346 align:middle Thank you everybody. 00:46:16.516 --> 00:46:45.500 align:middle [ Music ]