Evidence-Based Management
Module 1: Principles of evidence-based management
This episode accompanies Module 1 of the course which covers the basic principles of evidence-based management and gives an overview of the steps involved in taking this approach.
Everyone uses evidence when making decisions, right? But only one source, or many? And are those sources assessed for their reliability and trustworthiness? Are we actively trying to identify the biases that so often lead us down the wrong path?
There is a fundamental problem with how many decisions are currently made in organisations, so the first important step is to examine current practice and identify its shortcomings. The next is to appreciate that many current sources of evidence (i.e. colleagues, experts, gurus and academics) might not be the best sources of reliable evidence.
And really, who has time to take an evidence-based approach, when organisations just want to get on with "stuff"?
This is tricky territory which needs careful navigation as our guests discuss.
Host: Karen Plum
Guests:
- Eric Barends, Managing Director, Center for Evidence Based Management (CEBMa)
- Denise Rousseau, H J Heinz University Professor, Carnegie Mellon University
- Rob Briner, Professor of Organizational Psychology at Queen Mary University of London
Find out more about the course here: https://cebma.org/resources-and-tools/course-modules/
00:00:00 Karen Plum
Hello and welcome to the evidence-based management podcast. This episode focuses on the basic principles of evidence-based management - what it is, why we need it, what counts as evidence, and how we gather, assess and evaluate its reliability before making decisions.
I'm Karen Plum, a fellow student of evidence-based management, and I'm joined by some leading experts to answer the questions I had after doing this module. Here we go.
00:00:34 Karen Plum
I found this module fascinating, and if I'm honest, a little frustrating. There seems to be so much to know and explore. I quickly realized that I didn't understand anywhere near as much as I thought I did. It started to dawn on me that I needed to change the way I think, and that made me rather uncomfortable because it sounded like quite a big ask. And it made me appreciate the poor level of decision making that I've witnessed, and probably been a party to, in the past.
So in this episode I'm talking to Eric Barends and Denise Rousseau, the authors of the book upon which the course is based; and leading evidence-based management authority Rob Briner.
Eric is managing director of the Center for Evidence-Based Management, Denise is University Professor of Organizational Behavior and Public Policy at Carnegie Mellon University and Rob is Professor of Organizational Psychology at Queen Mary University of London. So you couldn't be in better hands.
I started by asking Rob how organisations typically make decisions.
00:01:40 Rob Briner
That's a very good question and a very important question. I think we don't really know, I would say. So my understanding, which is very limited, from the literature on sort of organizational decision making, is, you know, there are some examples of failures in decision making, whatever that means. But in terms of sort of on-the-ground, detailed evidence about how decisions are made, I think there isn't much around.
My guess, and it really is a guess, is that all organisations always use evidence and data, but they use it in a very partial and limited way. So this is one of the problems, I think, with the term evidence-based management. Because people look at you a bit puzzled sometimes and say, well, we always use evidence, and we go, yeah, that's true, you do and we always do, but evidence-based management implies something a bit different, which you probably don't do.
So the short answer is I don't know, but my guess is it's a bit of evidence, it's a bit of politics, it's a bit of cognitive biases, it's a bit of people deciding what they would just want to do anyway and then justifying it later - all that kind of usual stuff.
00:02:44 Karen Plum
There's no doubt that experienced people have strong opinions and put a lot of weight on their own experience when it comes to making decisions. They may also have power and influence in an organization and have other political objectives that they're trying to achieve. Rob explains why taking a structured approach, as opposed to simply relying on one's gut instinct and 20 years’ experience, can be quite challenging.
00:03:10 Rob Briner
I think it's a lot more effortful, and I think it's harder to be political. So some of the things you might want to do as a leader or a manager involve politics, involve getting your own way, involve power, and there's something, I think, about following a structured approach which also threatens that.
So it's less, I think, that it threatens the sense of autonomy and expertise, and more that it threatens your ability, through making decisions, to achieve other goals that you want, that aren't necessarily for the benefit of the organization.
And I think there's also a paradox here. I think sometimes people, when they're doing their - to use the cliché - choosing a car or something like that, or deciding where to buy a house or whatever it happens to be - those kinds of fairly big consumer decisions - people oddly do seem to be quite prepared to take a structured approach. So there's a sort of paradox that sometimes, when the outcomes are important to people, they're prepared to do it.
But in a more organizational, political, management context, even though the outcome could be important, they may be rather more reluctant to take that approach. So there's something interesting going on here which I don't quite understand.
00:04:13 Karen Plum
So I asked Rob whether organisations have evidence that what they do currently actually works and how they measure or justify their approach.
00:04:23 Rob Briner
I think it happens sometimes, and I think there are two main things that determine it. One is whether there is a requirement, often external to the organization, to make decisions in a way that can be shown to be well informed, and also there's accountability. So maybe, for example, in something like the airline industry or very safety-critical contexts, when people make decisions they are very aware it could go wrong, something could happen and there may be an inquiry. People may ask, what did you do? Why did you do it?
So I think sometimes in those situations people do attempt to use and gather evidence and have to be accountable for their decisions. I think the other context where it probably happens more is where it's easier to actually get data and evidence and information and feedback about whether what you're doing is working.
So for example, if you are working in some kind of activity where the outcome is very easily quantifiable, you can get data very quickly, the data you get are quite reliable, and you can tinker and play with stuff - for example in sales, or the design of a website where you're selling something. I think the easier it is to experiment and try things out, to get quick feedback that's quite valid and reliable data, the more likely it is people will find out whether what they do works.
I think often the decisions that are made are quite strategic, they're very long term, they're very hard to measure, and people don't want to be pinned down to particulars - you know, how do we know it's worked or not worked? They prefer to keep it ambiguous, for obvious reasons.
So I think outside those two things I think people often don't know too well what's worked or not. They know when something has gone wrong, but that's not quite the same thing, is it?
00:06:04 Karen Plum
It certainly isn't. It's also clear that many people get excited by new approaches, fads, and fashions. There is a great pull towards doing the latest thing that everyone else is doing: the cult of personality from the latest guru, and the stories told by people you know and trust that they've tried this new thing and it works.
It's very tempting to do something that seems to get results, and we can find out quickly how to do it. So what's wrong with getting on the latest bandwagon? Here's Eric.
00:06:35 Eric Barends
Well, the problem with that is that it's often not based on solid evidence, because in most cases, what consulting firms claim to be true, or to be an important new model or a new insight by a guru, is very often driven by commercial interests. So it's not based on proper research or on years of accumulated experience. It is based on a new insight that was developed by a consulting firm or a guru or whatsoever, and it sounds nice, it sells well and it's appealing and it has nice buzzwords, and it taps into the insight of a lot of people, or the experience of a lot of managers, that it's hard to change organizations. It's hard to cut costs.
I mean, it's like the dieting industry. It's hard to lose weight. So every three or four months there is a new diet promising that if you follow this diet you will finally lose some weight. And the same goes for a lot of models or insights or six-step approaches that are promoted by consulting firms or by management authors, gurus. There's a whole industry behind it selling these ideas.
There are thousands of books on agile, on lean, on - you name it. We are not claiming that this is all BS, that none of it works, but you need to be very, very careful and critically appraise those claims and try to figure out what they are based on. It may be an interesting new insight, a promising insight, but nine out of ten times there are principles or insights that really are evidence-based - that we know positively affect your organization - that are less sexy or less novel or less fashionable, but may yield better results.
So that is why you need to be very careful regarding best practices, new insights, and new cutting-edge methods that are promoted by firms, people or businesses that have a commercial interest.
00:08:59 Karen Plum
This also extends to any ideas and approaches that are newly discovered and held up as the latest quick fix or solution, as Rob explains.
00:09:07 Rob Briner
It captures quite well, or plays into very well, this kind of 'what is the newest, latest thing' bias that I think many of us have, in all walks of life, in all professions. We think the latest thing is the best thing, or the newest thing is the coolest thing, or that the newest thing is the cutting-edge thing, but most times that's nonsense.
Mainly because these new, exciting things are by definition the things for which we have the least evidence, and most things don't really work particularly well, or any better than previous things. So I think it not only makes it interesting, entertaining, accessible; it also, I think - my guess is - helps readers feel they're on top of the latest thinking, as though that is better, somehow.
00:09:49 Karen Plum
It's interesting to ponder how often these types of approaches have been used in organisations for a while, and then quietly they slip into the background. Maybe because the approach didn't bear the fruit that was expected. Or maybe it was used to fix a problem that had been wrongly diagnosed, or maybe not diagnosed at all. Maybe people just thought this would be a good thing to do.
My suspicion is that people don't measure outcomes. It's not always easy to do, and just as solutions often feel obvious - something we should just get on with - often the outcomes are also felt to be obvious and therefore not worth spending our time on.
So I was interested to know what evidence there is that being evidence-based leads to better outcomes. Here's Eric's response.
00:10:36 Eric Barends
But it is indeed known from lots of research - even several Nobel Prize laureates have done research on this topic - that the way people make decisions, regardless of whether it's you in your own private situation or as a manager in an organization, is often flawed. I think if you would ask me, or ask an evidence-based practitioner, what are the two major core principles of evidence-based practice? One, evidence from multiple sources - not just one, not a single-source decision.
Companies say, oh, but we're very good at data analytics, all our decisions are data-driven. Yes, thank you, but that's a single-source decision. You should also take into account the other sources of evidence. And second, critical appraisal of the trustworthiness of the evidence. We know that if you put these two things in place - and there's a lot of research that demonstrates this beyond any reasonable doubt - your decisions will improve and you will reach better outcomes.
But again, it's not a one-off event. It is an approach. I mean, it's not guaranteed that your decisions will be successful. Sometimes things fail even when you take an evidence-based decision. But in the long run, if you base your decisions as far as possible on the best available evidence, you will be more successful than by just basing your decisions on opinions, or hunches, or gut feeling or experience.
00:12:22 Karen Plum
Another aspect is the notion that there's no certainty of good outcomes when you take an evidence-based approach. When you start to become an advocate for this approach, people inevitably challenge you: how do you know that this leads to better outcomes? The new is always challenged in a way that the existing never seems to be.
Asking your manager how they measure the effectiveness of their current way of operating can be irritating and potentially career limiting. Taking an evidence-based approach doesn't tell you what to do, nor does it guarantee outcomes.
Eric explains this in terms of probabilities, and this is more in line with what you typically hear from medical professionals who also promote an evidence-based approach.
00:13:06 Eric Barends
Doctors that are educated in evidence-based practice always talk in terms of probabilities. They won't say, well, if you take this medicine you will be cured! No, they explain things in terms of risk reductions, or X percent of the patients that take this treatment will be cured or will have fewer symptoms within a year. They always explain things in likelihoods and probabilities or risks, but that is not very common in the domain of management, leadership, and business, because we are not used to thinking in terms of probabilities, but always make bold statements - we should do this because this will be the outcome. It helps if you ask, I don't know, your HR director who makes such a claim to put a bet on it. How certain are you? Are you willing to bet your annual salary on this outcome or not? And when they say, Eric, that's ridiculous, why would I do that? No, it's just that I want to know how certain you are. Maybe a mid-sized car? Would you be willing to bet that? Eric, this is ridiculous. Maybe a bottle of wine, then?
So try to figure out how certain this person is in making the claim; that will help you get a better idea of the probability that something good or bad will come of it.
00:14:40 Karen Plum
When dealing in probabilities, it also becomes obvious that we aren't going to find that something will definitely work, but that there is a good chance that it will work better than the alternatives, because we've considered all the evidence. How much better it will work of course depends on the situation and the specific circumstances.
So module one is all about raising awareness, helping us as students to ask questions, to challenge assumptions, to quiz people about how certain they are that things will work, and gathering multiple sources of evidence. We're gradually empowering ourselves to make a different contribution to the conversation about the decision at hand and how solutions might be implemented.
Here's Denise Rousseau.
00:15:26 Denise Rousseau
The most valuable impact, I think, an evidence-based practitioner can have is when he or she is themselves the manager. You'll still be in a hierarchy, perhaps, but you can be a model for your own employees of a process that improves the quality of information and voices involved, and the ultimate decision.
One of the big issues that we find in our own research on evidence-based management practice is that when managers explain to their employees the evidence that was ultimately used in making a decision, the employees come to trust the manager more, which is important. Trust is a huge currency in our world and often in short supply.
The other part of it is that they begin believing that the organization is doing quality work. You know, there are certain circumstances in which it's hard to judge whether the work is good or not, because we don't have good metrics on end-user reactions or impact on the environment or whatever it is. Employees start feeling more committed to the work that's being done, and that trust and that commitment, I think, are very valuable to strengthen.
And this is not cheap talk. It's actually about helping people see the uses of information and the trustworthiness of the information that was put into the decision. It's very powerful to engage in this.
00:16:56 Karen Plum
Module one shows that we have to critically appraise the evidence. How trustworthy is each source of evidence that we're weighing up as we edge towards a decision? It becomes challenging to think about someone in the organization you've relied on in the past, whose judgment you've trusted and never challenged. Now you're wondering: is their evidence even trustworthy? That feels uncomfortable, and I put this point to Eric.
00:17:23 Eric Barends
You should really differentiate between 'this is a trustworthy person' and 'the evidence this person brings forward is trustworthy'. Some people are very, very trustworthy, but they don't realize that the evidence they bring forward is actually flawed. We are often not aware of our biases. I have preferences; I fall prone to all kinds of biases, like confirmation bias.
You will learn more about this in the module on critical appraisal of evidence from practitioners. But the reason we have this first module about awareness is that most people are not aware that the evidence they have, or the way they make decisions, or the claims they make, are actually not really trustworthy. So these trustworthy people may not be aware of the untrustworthiness of the evidence they bring forward. So, that's a problem.
00:18:29 Karen Plum
Moving on, it's clear that every source of evidence has to be appraised, not just the evidence brought forward by practitioners. Seemingly we can't rely on academics either. There's an increasing trend among colleagues and clients to quote academic research that they've read in, say, the Harvard Business Review, and to take it as gospel.
I asked Rob if the Harvard Business Review is a good source of evidence.
00:18:54 Rob Briner
No, I'd say it's pretty unhelpful. This is not to say there are never any articles in there that are useful and interesting and important. There are, sometimes, but my perception is - and I haven't tried to quantify this - that increasingly the stuff that's in there is more to entertain and excite and pique people's curiosity than to actually provide accurate overviews and summaries of scientific evidence around particular practice problems.
But one of the challenges is, it is very readable. It's very interesting, and I guess it's what we might call edutainment. It's on this cusp between sort of education and sort of entertainment, but ever more, I think, moving towards the entertainment end of that. On the other hand, it is very readable and it's interesting.
00:19:42 Karen Plum
Eric provides some additional views on the trustworthiness of academics and academic research.
00:19:48 Eric Barends
So the statement "research has shown" - I mean, it's a good start that apparently there's research on this topic, but now we need to go to the next step and say, OK, that sounds impressive, but how reliable and trustworthy is this research? And indeed, what you say is absolutely true. You will learn about authority bias in the following modules, but we should not be impressed by the authority of academics.
The fact that the author is from Harvard or Stanford and is a world-famous academic is not relevant. It does not guarantee in any way that the outcomes are trustworthy. Sometimes the most important journals and the most renowned academics have to retract a study because there are flaws, or it can't be replicated, etc.
So evidence-based practice creates a level playing field. It's not the pay grade of the guy or girl making the claim that actually makes the difference, or the reputation of the journal or the academic that has done the research. That's all irrelevant. We only take into account the evidence itself.
00:21:05 Karen Plum
Clearly there will be more on that in a later episode.
In undertaking this first module, I sometimes found the distinction between experts, practitioners and stakeholders confusing. Sometimes people could be in more than one category - perhaps an expert, but also a stakeholder. I have 20 years' experience as a change manager: is that a good source of evidence when making a decision about implementing change? Or does it just give me a set of opinions that I should be suspicious about? I put this to Eric.
00:21:38 Eric Barends
Well, first of all, of course it's a bit arbitrary, artificial, to make a distinction between stakeholders and practitioners. For instance, there's an example in the quiz at the end, or maybe a 'learn by doing' exercise, where the issue at hand is a merger between two hospitals: together they would have a larger market share, and the idea is that you would be able to reduce costs and increase the quality of patient outcomes.
How about physicians? Are they stakeholders? Or are they practitioners with experience? Well, it depends a little on what you ask them. If you ask them how they feel about this merger, they're obviously stakeholders. If you ask them, do you think this merger will be effective, and they say yes, I'm pretty sure it will be effective, you should treat them as practitioners.
And as you will learn in the next modules, there are three requirements that determine whether the expertise of a practitioner is actually valid or trustworthy. I won't get into them now, because that's in the next modules, but one of them is how much experience you have with the situation. Now you could ask the surgeon, how many mergers have you been involved in? Well, you know, in my previous hospital I was…
OK, that's one; that's not many, so that's probably not very reliable. So in this case, you could argue the physicians are mainly stakeholders. But suppose you ask them, will this improve patient outcomes or enhance the medical quality of, I don't know, total hip replacements? And they say yes, because there's actually a lot of evidence that shows that if you have more opportunities to do a surgery you become better at it, and when we join forces we'll have more patients, and I as a surgeon will have more opportunity to perform these surgeries, so I will become better at it.
That is already a different thing - that is practitioner expertise, and that sounds more trustworthy. So you should really, really try to differentiate: when it's about practitioner expertise, does this person indeed have experience with the specific issue at hand that we're interested in? Or is it that it's important to hear this person's opinions or feelings or perceptions, because they will be affected by the outcome?
Surely a physician will be affected by a merger, because things will change, and when they're absolutely against it - even when they're against it for subjective reasons - that is stakeholder evidence that you should take into account.
So yes, it's not always clear. And to answer your question about your years of experience as a change manager: you will be able to figure that out when you do the modules on practitioner expertise. You will find out whether your experience is trustworthy, yes or no.
00:24:55 Karen Plum
In terms of the process outlined in module one, I for one found it rather overwhelming. As with anything new that challenges your view of the world, it felt like there was so much to learn, and it was going to take a lot of energy. As a perfectionist, I also want to make sure I get this right. However, as with all learning, I'm realizing that being patient and more content with making small gains might be good enough for now. Here's how Denise puts it.
00:25:24 Denise Rousseau
We need to be respectful of the defenses and positions that others in our environment have, and evidence-based practice starts with the individual developing his or her own skills. So we first need to broaden our capabilities and our critical thinking - that's a foundational feature.
My second idea is that the perfect is the enemy of the good. What we're trying to do is to make things better. We're looking for progress, not perfection - progress. And as evidence-based practitioners, we can be catalysts of a change conversation and a broadening source of evidence.
00:26:05 Karen Plum
And Eric agrees.
00:26:07 Eric Barends
Yes, it's overwhelming. There are so many details, etc. But you will grow into it and become better at it and faster at it, so you can apply an evidence-based approach in a reasonable amount of time.
00:26:24 Karen Plum
The last topic I wanted to cover in this episode was the question of time, and by that I mean how long it takes to adopt an evidence-based approach. The process is thorough and detailed and I wonder who has the time to do all this stuff? I put that question to Eric and then to Rob Briner.
00:26:43 Eric Barends
First of all, a lot of managers are not very much in favor of taking an evidence-based approach, because they say, oh really? Four sources of evidence, and I have to acquire each source - so four times acquire - and then I have to critically appraise it? I don't have the time for that. Come on, we know what needs to be done, guys. So let's go and get on with it and let's have some results. That is the approach.
A lot of managers are not there to do the right thing, but to get things done. So it's decided somewhere that we're going to implement A, B or C, and then you have these big-shot managers that will do a good job of implementing it and getting it done. However, the assumption that decisions need to be made fast is actually a bit of a fallacy.
We often ask this - we always ask this - in class with a group of seasoned executive managers: can you give me an example where a managerial decision has to be made fast, and I mean within 24 hours? And then it's silent for a while, and then, you know, after a while examples come up, like yeah, when someone did something really bad and needs to be fired on the spot.
And you say, well, really? I mean, in those cases you probably need to have the HR consultant in, and maybe the legal advisor, to see whether there are sufficient grounds. And it's not that easy.
But, you know, a hostile takeover? Have you ever been involved in a hostile takeover? It takes months. There needs to be due diligence; the financial people need to have a look at all the books, the legal people come in, et cetera. There is often a sufficient amount of time to take an evidence-based approach, and even in a high-risk or emergency situation, you can prime the pump in advance. The second answer is that an evidence-based approach can be done very, very fast.
00:28:57 Rob Briner
It can take longer, and again it goes back to sort of instant answers or instant fad diets or instant solutions to anything. Yes, they are faster, but on the whole, they just don't work. So it depends what your goal is. If your goal is just to do things fast, sure - don't bother with evidence-based management.
If your goal is to do things that are more likely to work and get you and the organization what it wants, then you should really consider evidence-based management. So it depends what your incentives are, I suppose, and I think the more people are incentivized to do stuff fast and rewarded for that, the less likely they are to be interested in something like evidence-based management.
00:29:33 Karen Plum
I guess the pressure to "do stuff" is very much part of many organisations' cultures these days. Be seen to be doing something, rather than doing the right thing - the thing that you've researched and identified as having the best chance of success given all the circumstances. Still, taking an evidence-based approach doesn't need to take forever.
And that's it for this episode, which has been about raising awareness of the breadth and depth of taking an evidence-based approach. Here's a final thought from Denise Rousseau.
00:30:04 Denise Rousseau
We spend some time in the modules first helping people ask better questions about the situations they face. And then the next issue is: what makes me think that this is a solution to that problem?
And both of those issues - what's the problem I'm trying to solve, and what's a solution to that problem - have their own evidence-based processes. I want to be clear: this doesn't mean you spend a year trying to figure out the problem and the solution. You may spend six minutes, but your processing will be different with the tools of evidence-based management.