Podcast interview with Dr. Robert Wachter and Dr. Arpana Vidyarthi: Part 2 (transcript)
This is the transcript of Part 2 of my recent podcast interview with Dr. Robert Wachter and Dr. Arpana Vidyarthi of UCSF. In Part 1, Vidyarthi and Wachter provided an overview of the traditional case review process and discussed their progress in shifting to a new, technology-enabled process. They discussed key principles of their approach, along with the relationship between culture and technology.

David Williams: Are you using this system outside of UCSF, or does it have that potential? With the electronic platform and security, it seems like there might be the potential to use it across different institutions. Is that something that you would want to do or that would have any value?

Dr. Arpana Vidyarthi: We're using Acesis for our local case review process and a peer review process. I don't see any value necessarily in collaborating with other institutions regarding the case review process itself. I definitely see value in communicating with other institutions regarding some of our data analysis and the trends, and value in disseminating what we've been doing so that other institutions can learn from the stumbles that go along with building something brand new. I think there is also value in using Acesis not only in our division, but across divisions and departments within our own institution.

Dr. Robert Wachter: Nationally, the question I think you're getting at is one of the very big questions in patient safety. As individual institutions --everything from a three-person doctor's office to a 1,000-bed hospital-- begin analyzing their errors, or cases where their outcomes weren't as good as they should have been, how do you disseminate that information? How do you benchmark and compare one place to another to look for best practices or see who needs to improve?

Presently most institutions are not particularly enthusiastic about doing it because they worry about disseminating data --where you're looking at things that didn't go well-- beyond their walls.
It's hard enough to get people to do it even within your walls, but there are some legal protections for that in the malpractice system. Data that are peer reviewed under this framework for quality improvement purposes are not subject to being sequestered for a malpractice case, for example.

Congress passed a law a few years ago that allowed for the creation of Patient Safety Organizations (PSOs), which create a scaffolding for multiple organizations to come together for quality improvement or patient safety improvement. The Agency for Healthcare Research and Quality (AHRQ) in Washington was put in charge of managing this process. It's still in the fairly early phases. At this point maybe more than 100 organizations have applied to become PSOs. Some of them are already members of other affiliated groups. They might belong to the same hospital network or have the same owner. Some of them are using the same computer system to do order entry.

Once PSOs come into being, they will be looking for data sharing tools that have all the attributes Arpana described: a secure environment, facilities for moving data around, analytical tools, those sorts of things. So we're doing it thinking quite locally about how to analyze cases in our own division, and thinking about how that might roll up within a 600-bed hospital to be used more broadly through the organization. But fairly soon, if the tool continues to perform well, organizations that are forming networks will be thinking about using it or something like it to try to promote this kind of activity across the silos.

Williams: How do you expect the case review process to evolve over the next five to seven years, and how will it be supported by technology?

Wachter: It's important to see this peer review process as one of the building blocks of a robust quality and safety enterprise. I almost divide it into two separate buckets. One is: how does an organization know where its problems are, what's going wrong?
That can range from patients having bad outcomes (tracking outcomes that veer off from expected outcomes) to 'trigger tools,' where, for example, a patient needed an antidote to a certain medication. That gives you a clue that maybe the medication was given incorrectly or at the wrong dose. It also includes self-reports coming from providers or from patients. There are a lot of different signals that organizations are going to need to tap into in order to find out what's actually happening out there that's not perfect.

Then on the back end, once they find that out, the question is how they analyze it, how they move the information around, how they create stories that motivate people to learn, and how they create new processes or structures that ensure that the care gets better.

What is very clear is that the motivation to do this kind of work has grown tremendously in the last ten years as the public, the media, Congress and others have become more interested in it. Part of this was a recognition of the quality and safety defects in American health care. There is much more pressure to do this kind of work from a lot of different pathways: regulatory, accreditation, media scrutiny, public interest. The business case to do this work is growing.

We need new tools, new models, new ways of educating people on how to do this work, and new cultures of openness. It's all pretty new for us. Arpana went to medical school ten or 15 years ago. I went 25 or 30 years ago, but I can say that neither of us learned much of anything about the things we've talked about today in terms of how you analyze this kind of data, how you change organizations, leadership, and so on.

So the tools become an absolutely critical part of trying to answer both of these questions. How do we collect these signals in a balanced way? It can't all just be from self-reports by providers; there have to be other mechanisms, because you only find out certain things that way. Then, how do you analyze it?
And much more importantly than anything else, how do you then make changes that work and stick, without creating unintended consequences? I think the tools are an absolutely critical part of that, but they have to be embedded in a culture where people are doing the right thing and where you have true experts who know how to do this kind of work.

Vidyarthi: The majority of case review or peer review processes won't cut the mustard moving forward. A one-off process where you look at a few cases here and there not only is not going to meet the letter of the law, as those laws change over time, but it's also not going to give you the information you need. So I think all sorts of health care organizations, from very small clinics to very large organizations, are going to need to start thinking about how better to learn from their cases: building a robust process and having some sort of technology assistance to make that process efficient and effective and allow for real change.

Wachter: And the technology of course is not in a vacuum. You have a system like Acesis that facilitates case review, increasingly in hospitals, particularly as the Federal government throws $20 billion or $30 billion at this. Hospitals will also have more robust information technology systems, medical records, computerized order entry and bar coding systems. Ultimately all of these systems have to speak to each other. It may be that the generation of a case that needs to be reviewed comes not from a patient or a provider raising a concern but from some signal out of the computerized medical record showing that the patient had a very bad outcome or that some norm was violated.

It still requires a human being to review it. I don't think we're anywhere near the point where this is so straightforward and clear cut that the review can be based on all measurable parameters. There is a lot of judgment that goes on here.
Sometimes you really need an expert in the area to look at it to know whether there was actually a problem in the care the patient received. So we need tools for all of these things, but ultimately we are going to have to make sure that they speak to each other in a way that's really seamless.

Right now, we're still in a mode of creating individual tools for individual functions. I think we're going to go through a fairly gangly period where those tools don't work very well together and you're moving data, porting data, from one electronic tool to another. Part of what the Federal Office of Health Care Information Technology is trying to do is create interoperability standards, so that when you have different ways of collecting data about patients, the data can move from System One to System Two to System Three in a way that's appropriately private and that uses the same language for the same thing --so that you don't have railroads with different gauges trying to connect to each other.

We're really at the early stage of that kind of interoperability, but that's what we need, where all these things create or serve a unique function. For example, I can't envision a world anytime in the next decade or two where the electronic tools you use for medical charting or computerized order entry also serve the functions that Acesis serves, and vice versa. But ultimately all these tools have to interdigitate so that you don't have wasted time and effort, and the information moves seamlessly from one to another when necessary.

Williams: I've been speaking today with Dr. Robert Wachter, Chief of the Division of Hospital Medicine at UCSF, and Dr. Arpana Vidyarthi, Director of Quality. Thank you both for your time today.

Wachter: A pleasure.

Vidyarthi: Thanks a lot.