Joining me on this week’s episode is Kelli Bravo, the vice president of healthcare and life sciences for Pegasystems. Kelli is responsible for Pega’s go-to-market strategy for engagement and CRM solutions across payer, provider, life sciences, and PBM organizations. I had the pleasure of working with Kelli more than 15 years ago, back when the Treo and Palm Pilot were top-of-the-line mobile devices.
Kelli discusses a number of important ethical issues, including:
- The ethical considerations in admitting an error in your software
- How to ethically fix what is broken in the healthcare system
- Top ethics challenges in healthcare communication, patient engagement, and collaboration
- How to ethically balance transparency and confidentiality
Why don’t you tell us a little bit more about yourself and your career?
I’ve been in healthcare nearly my entire career, and I’ve really been focused on growth and change in organizations like athenahealth, Skyscape, Microsoft, Adobe, and McKesson. I focused on operations, practice management, marketing strategy, medical and medical review content, and more recently, care management engagement. So, I’m really, really fortunate today that my role encompasses all aspects of my prior roles, plus more. It enables me to build out our CRM strategy for decisioning, customer service, care management, sales automation, and platform. And I can’t forget that I’m also a proud mom of twin 12-year-old girls.
Thinking back over your career in healthcare, what is the most difficult ethical challenge you ever confronted?
I’ve been fortunate to have worked in so many great organizations that take their ethics very, very seriously, so it really took me a while to think of a great example. But I do recall one past instance.
A while back, I worked at a healthcare IT company that also provided healthcare content. One year, one of our software content releases had a major error in it. Not just a typo, but one that affected the workflow. And the dilemma really centered around what to do next. Do we re-release the product, which costs a lot of time and money, or do we just communicate about the error and hope folks catch it when they use the content?
Back then, software releases followed a waterfall approach, which meant it could be a three to six-month release cycle because it wasn’t just a patch. You needed a full QA process. You had to reissue the software and retrain the clients on how it was used and how it worked, which was time consuming and expensive. We had to decide as a team if we were going to really re-release it, or if we could just issue a notice of correction and be done with it.
So, we sat back and we said, what should we do now? We conferred with our software team, the content developers, and a bunch of our clients to determine the best path forward. We knew it was expensive. It was going to be time consuming to reissue this software. But at the end of the day, it really was the right decision.
We ultimately moved forward with a new release, but it took a lot of conversation. This new release required reissuing the software to all the customers, new client training, and really admitting the fact that we had made a mistake. And if you think about today’s agile environment with low code or no code, this process would have had a much lower cost, a more rapid turnaround, and a lower client impact. It would have made the decision a whole lot easier, but ultimately, I think we would have absolutely come to the same conclusion. You really have to do the right thing.
How do you go about framing that decision with the leadership team and working to secure buy-in for what you believe was the right thing to do?
It was all about great communication. Talking about the tradeoffs and being able to tell the organization that at the end of the day, we really all want to work for an organization that acknowledges its mistakes. Because your reputation and the reputation of your company really hold fast when you do the right thing. And the right thing includes admitting when you make a mistake. And as long as you learn from that, it’s okay. It really shows that you and your company are mature, accountable, and really care about your customers. When you lay it out and show not just the financial impact, but the reputation impact, the feedback from your clients, the feedback from your employees, you build a story about what the right thing is to do.
You were in a situation where you were a content provider, and I could see some executives pushing back and saying, wait a minute, this is undermining confidence in our product. Are customers going to start questioning our expert advice and looking at other solutions? Was that one of the factors that came into the discussion?
I think you have those conversations all the time when you’re in competitive environments about how your system is differentiated, why your content is better, why your software and your workflows are better. But at the end of the day, people do business with organizations and other people they trust. And if they were to find out that we hadn’t been trustworthy, we would have lost more than we would have gained from the financial impact. It’s a long-term play. If you can’t continue to maintain trust with your customer base, you will not continue to grow and succeed. The right decision comes forth as a result of really putting your thinking cap on and having that collaborative discussion with your executives to help them understand and frame the problem.
You mentioned that in today’s low code, no code environment, it wouldn’t be as much of an issue. But today the question is, do you just add it as a patch note and a bullet point, which people may or may not notice, or do you actually call it out? How do you recommend we address these content mistakes in a low code, no code environment?
I’m pretty transparent. At the end of the day, hiding something in a bullet on page 98 isn’t really going to be the process that works for me or my team. I think what we would espouse with any kind of communication going forward is that you’re very direct and clear. We recognize people don’t typically hear the message the first time. You need to share it two or three times to ensure that people really understand the difference.
And what we’re talking about here isn’t just simply something that could have been ignored in the back end; it was a process and content in a workflow that could change the outcome of a clinical decision. And when you actually have a longer-term impact, either on that patient or on that health system or on that payer, you really have to take responsibility for that. We would have wanted to ensure that the customer really understood the decision of taking that patch today. In low code/no code, you can spin something out relatively quickly, but you really want to make sure that customers take that upgrade or that patch so that they have the ability to use the software in the manner it was expected to be used.
Looking back over that one example, is there anything you would’ve done differently?
We learned a couple of things. I think first, it’s important to gather all the input before you make a final decision. You were smart to mention what the executives were thinking. But it wasn’t just that; it was, what do your employees think? What do the developers think? What do the content developers and clinicians think? And what do your clients think? And then at the end of the day, it’s all around following your gut.
We knew what the right decision was right from the start, but we needed to gather our facts, and getting input helped us validate what was right. We knew what we had to do regardless of the cost or the time and resources. And so, I feel like you can’t be swayed from your convictions by things like cost or convenience when it really is the right thing to do.
And it really comes back to what I said before around your reputation. It’s okay to make a mistake, but you need to be accountable for it and show that you care about your customers. So to answer your question, would I do anything differently? No, definitely not. We made the right call. We reissued the software. We retrained those customers. We communicated transparently and clearly. I wish we could have made our decision a smidge more rapidly, because we might’ve been able to get our updates out faster and the team would have been under less pressure related to balancing the cost. But the way I look at it is, if it takes time to come to the right decision, I wouldn’t want to rush that decision.
Beyond your own personal experience, what are you seeing as some of the key communications, marketing, and business ethics challenges for today and tomorrow?
I have to say recent events bring an awful lot of things to mind, right?
It’s crazy out there right now. Normally, you and I would be doing this interview in person, but that’s not going to happen this time of year. We should look at how we fix what’s broken in the healthcare system. We know that we still don’t give patients the best experiences in navigating the healthcare system. Even for a veteran healthcare person like myself, it’s hard. There’s a major discrepancy between how patients rate engagement or experience with a healthcare organization and how those organizations rate their own performance.
I actually did a survey back in January of 2,000 patients and healthcare IT engagement leaders. And what we found was that only 33% of healthcare organizations think patients would switch providers due to poor communication or engagement, but a whopping 78% of patients surveyed said they would actually switch providers due to this poor experience.
And it goes against what we traditionally believe. Historically, whatever your doctor or provider said, that’s what patients did. But in today’s environment, that’s just not the way it is anymore. People have choice, and we now have a big disconnect. Consumerism is driving this divide even further. Healthcare consumers expect to be treated the way they are treated in any other industry and to be engaged with the organization on a much more proactive and personalized level.
If you think about your digital and self-service experiences at Amazon, you get this complete experience. You never have to talk to anybody. You never wait on hold. It just happens seamlessly. But healthcare is the only place where you buy a product and service and have no idea what it’s going to cost for 30 days. It’s just so crazy. So bridging this gap is a communication challenge, and not doing it correctly may drive real ethics challenges, as well as patient privacy and data privacy issues going forward.
You’re right, especially I think with the shifting demographics. Because when I look at my mother and helping her with some serious medical issues last year, just the way she interacted with her physician is different than me, which is different than my kids. I think you’re definitely going to see that the expectations are shifting.
This COVID pandemic and today’s discussions between payers and providers are bringing a lot of challenges to light, because people are doing unique things in response to the pandemic. The industry is recovering, but businesses are still continuing to consolidate, and we still need to collaborate in ways never before executed or imagined. It all drives toward this better patient engagement experience.
We know a better engaged patient is actually a healthier patient, a much healthier patient. And as part of this process of learning through this COVID pandemic, we also have been looking at what we can do to drive costs out of this skyrocketing system. And I personally hope that we can continue to lean in, like we’ve done during the pandemic, enabling organizations to work better together, breaking down silos, and really just doing the right thing.
Payers, providers, even government agencies have all come together to help during the pandemic. They’ve spun out new processes, agreements, benefits, and programs, and still maintained very ethical behavior. These are the things you expect to happen when people do the right thing. Like let’s make COVID testing free, so everyone has access for the greater good, because it’s better to get tested than to spread the virus.
That kind of stuff makes sense. It wasn’t easy to do before, but during this crisis, everybody pulled together to do it. And we’ve really seen amazing work and collaboration over the past few months, everyone doing the right thing for the right reason. And we really need to keep that up. We need healthcare to behave more like other industries while still keeping our industry’s promise and our ethics: to protect privacy and personal health information, and to deliver better outcomes for your mom, or for my daughter, who was in the hospital a couple of weeks ago. What can we do to help this process? Because if we don’t personalize it, if we’re not proactive, we’re not driving better outcomes. And really, I feel like nothing is more personal than health.
You mentioned privacy a number of times. We’re getting more connected and gathering more information on people. We have HIPAA, GDPR and CCPA. How do you balance that privacy versus benefit in terms of understanding what you can do with data?
I’ll take that in two parts. I see there are some current challenges, even something as thought-provoking as the 21st Century Cures Act, with some patient notification around treatments. And then during the pandemic, there was some relaxing of the rules around HIPAA data compliance, which had to happen in order to keep track of what was going on. But then we’re going to need to put some rules back in place, or document what we’re doing more clearly going forward, because even the tracing of people who had the virus sent up red flags for some people and organizations, relative to the policies you just mentioned, like GDPR. As wonderful as it is to break down those silos and barriers to health information and benefits and value-based care and payments, we really need to ensure privacy is maintained.
The second part of my thought is that if you think about the way we’re using AI, this is a place where we can actually build some of those bridges. Looking to the future, I see a lot of teams and a lot of people using AI. So, we need to ensure it is unbiased in its decisioning, its algorithms, and its prioritizations. While the risks and rewards of AI are really staggering and hugely beneficial, organizations must remain accountable for the actions of their AI-driven systems. Otherwise, you risk really collapsing your engagement programs, your supply chains, your reputation, and even the bottom line.
So what I see, and certainly what we’re striving toward in our organization, is that there’s an obligation on the part of every enterprise to develop your AI in a responsible manner: focusing on transparency, so people understand how the decisions were made; acting with empathy, so we really take that individual into account and treat them based on their circumstances at that moment in time; ensuring we’re accountable, so the answers and the results that we deliver are the right ones; and really focusing on reducing or eliminating bias, both in the algorithm and in the way things are rolled out. We need to use AI that has empathy for people in their situations to really ensure we do the right thing, make the right decisions, and get the best results.
I agree with you. I think AI is going to be more transformative than the internet was in the mid-90s and it’s chilling and it’s really exciting at the same time. We’re not data scientists. How do you as a communicator and a strategist engage with the data scientists and the teams to make sure those ethical issues are addressed when they’re implementing AI?
There are a couple of different ways I look at that. One is just having this conversation, like the conversation we’re having right now, ensuring people are aware that there are unconscious biases. There are ways that you can program and develop code that can build in biases, so you really need to step back, have a diverse team of people review and go through your data, identify your personas, evaluate your strategies, and then really use the best technology out there. There is technology like what we provide at Pega that allows you to understand and look at the transparency of your decisions, look at whether or not things are biased, and look at dialing up and dialing down the empathy in the decisions that are made.
You must have the ability to test and review before you go live with the program. And I think what we’re seeing more broadly in the industry is that people are really taking into account those four key themes, right? Transparency, empathy, accountability, and eliminating bias. You’ve got to have a full program that gives you access to all of those decision triggers, so that your team can be educated and can deliver great results, while also ensuring that they’re fair results.
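To make the idea of testing for bias before go-live a bit more concrete, here is a minimal sketch of one common kind of check: comparing outcome rates across groups against a four-fifths threshold. It is purely illustrative; the field names, sample data, and threshold are assumptions, not a description of Pega’s tooling.

```python
# Illustrative pre-go-live bias audit: compare outcome rates across groups.
# Field names ("group", "approved") are hypothetical placeholders.
from collections import defaultdict

def approval_rates_by_group(decisions):
    """decisions: iterable of dicts like {"group": "A", "approved": True}."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for d in decisions:
        totals[d["group"]] += 1
        approvals[d["group"]] += int(d["approved"])
    return {g: approvals[g] / totals[g] for g in totals}

def disparate_impact_flags(rates, threshold=0.8):
    """Flag any group whose rate falls below `threshold` times the best group's
    rate (the common 'four-fifths' rule of thumb)."""
    best = max(rates.values())
    return {g: (r / best) < threshold for g, r in rates.items()}

if __name__ == "__main__":
    # Hypothetical sample decisions; in practice these would come from a
    # pre-production test run of the decisioning strategy.
    sample = [
        {"group": "A", "approved": True}, {"group": "A", "approved": True},
        {"group": "A", "approved": False}, {"group": "B", "approved": True},
        {"group": "B", "approved": False}, {"group": "B", "approved": False},
    ]
    rates = approval_rates_by_group(sample)
    print(rates)                          # {'A': 0.666..., 'B': 0.333...}
    print(disparate_impact_flags(rates))  # {'A': False, 'B': True}
```

Any flagged group would then go back to the diverse review team Kelli describes before the program ships.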
Transparency has been a buzzword in communication for at least the past decade, if not more. Especially when it comes to healthcare, there’s transparency in what you can share and then what you can’t share. How do you balance that need for transparency with confidentiality?
Well, there are pretty strict guidelines on what you can and can’t share. And I think what it comes down to is, again, really following your gut. Following the rules, making sure that you’re only sharing the information that’s absolutely necessary with only the people who need to know. I think where people get a little out of hand is when they rush to share information too broadly, or when they don’t control access based on roles. You really need to be able to do that, but you also need to be transparent in the decision process.
You should be able to identify, at any point in time, what decision was made and what led to that decision. What were the conditions? What was the situation? And if that same situation came up again, you’d expect that you would make the same decision, and you need to be able to prove that.
I think it’s just really important that you use your AI, or any of these advanced technologies, in a responsible manner, and that you’re very open. I hate to say the word transparent again, but I think that is the goal. Be very clear on what the expectations are and how you’re delivering on that promise.
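As a complement to Kelli’s point about provable, repeatable decisions, here is a minimal sketch of a decision audit record that captures the conditions behind each automated decision and lets you replay it later. The decision rule, field names, and helpers are hypothetical, shown only to illustrate the idea.

```python
# Illustrative decision audit trail: record the inputs and outcome of each
# automated decision so it can be explained and replayed later.
import hashlib
import json
from datetime import datetime, timezone

def decide(case):
    """A stand-in deterministic decision rule (an assumption, not a real product rule)."""
    return "approve" if case.get("risk_score", 1.0) < 0.5 else "review"

def audit_record(case, decision):
    """Capture what was decided, when, and under exactly which conditions."""
    payload = json.dumps(case, sort_keys=True)
    return {
        "decided_at": datetime.now(timezone.utc).isoformat(),
        "inputs_hash": hashlib.sha256(payload.encode()).hexdigest(),
        "inputs": case,
        "decision": decision,
    }

def replay_matches(record):
    """Prove that the same situation still yields the same decision."""
    return decide(record["inputs"]) == record["decision"]

if __name__ == "__main__":
    case = {"member_id": "12345", "risk_score": 0.3}  # hypothetical inputs
    record = audit_record(case, decide(case))
    print(record["decision"], replay_matches(record))  # approve True
```

Stored records like this are what let you answer, after the fact, what the conditions were and why the system decided the way it did.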
You mentioned the 21st Century Cures Act. What are some of the ethical concerns around it that we should be keeping in mind?
I think the one that caught my attention was specifically around when you’re treating a patient: if you’re providing some kind of care that doesn’t actually impact the outcome of that care, say you’re using a pressure bandage instead of a regular bandage in a certain situation, you don’t necessarily need to tell the patient that you’re using this new pressure bandage.
I’m a little bit more transparent myself in that situation. I prefer full disclosure and would want people to know, but I also recognize the need to come rapidly to results. I think that’s the best way of saying it: part of the reason for that provision was to get to the end goal as quickly as possible, in order to help a larger number of people and to support some of the medical device research. So to me, it’s kind of a fine line. I think it’s important that people understand what kind of care they’re being delivered, but it’s also important that we’re able to deliver care in a broad way. And so, I think it’s really a compromise of ensuring that we don’t get too close to the gray area in those kinds of situations. Really give the patient the benefit of the doubt and the opportunity to make an educated decision.
What is the best piece of ethics advice you were ever given?
I believe you really have to follow your gut. That was advice I was given very early on in my career. If you feel uncomfortable with the direction of a conversation, a decision, or a team’s direction, you should question it. You should speak up. It’s very much like what we’re seeing in the world today. You need to do the right thing. I know you’ll feel a lot better, and I know you know what’s right. So if you follow your gut, you’re going to do the right thing. You’re going to put all the right pieces and questions and answers together. And at the end of the day, you’re going to make the best decisions for both yourself and your organization.
Now, is there anything I didn’t ask you that you wanted to highlight?
I am so excited about what I’m seeing with the collaboration between payers and providers, pharma companies, and med device companies. What I want to see going forward is that we continue that collaboration and continue the acceleration of what we can do in creating a better healthcare world, a better healthcare ecosystem, because we’re getting results. We’re seeing those results. And when people lean in, we can make a difference. I really just hope we continue that ethically. It will really help us drive to better outcomes, both health outcomes and business outcomes.
How can we drive that greater collaboration?
People really just need to take a step back and look at what they just accomplished. Over the past few months, people were doing the right thing. People were collaborating, they were breaking down barriers and silos, and they were getting great results. And if they continue to do that, we’ll be able to accelerate the rate of change for success going forward. If people step back and take a look at what they were able to accomplish, we will be able to continue that momentum. And that’s what’s really exciting. It’s having people lean in, do the right thing, and get the right outcome. And it doesn’t have to be during a pandemic. It can be at any point in time.
Listen to the full interview, with bonus content, here: