Evidence-Based Management

Ask the experts

Season 1 Episode 16

This episode was recorded live with three evidence-based management experts, Denise Rousseau, Rob Briner and Eric Barends, answering and discussing questions sent in by teachers and students of the subject. 

During the discussion, we touched on many aspects of evidence-based management, including:

  • Starting with basic principles rather than complex frameworks makes evidence-based management accessible to busy professionals
  • Problem identification is the most critical and often overlooked step in the evidence-based management process
  • AI tools can support evidence-based decisions but require specific prompting and critical evaluation of outputs
  • Evidence-based management works best as a team sport where colleagues help identify each other's blind spots - including biases
  • Rather than trying to build a whole evidence-based culture, start with your sphere of influence and share evidence supportively
  • Building evidence-based practices requires social networks of support within and outside organizations


If you have questions about evidence-based management that you'd like addressed in future episodes, please send them to us. We're planning to make "Ask the experts" a regular feature of the podcast.

 

Host:
Karen Plum


Guests:

  • Eric Barends, Managing Director, Center for Evidence-Based Management
  • Denise Rousseau, H J Heinz University Professor, Carnegie Mellon University, Pennsylvania, USA
  • Rob Briner, Professor of Organisational Psychology, Queen Mary University of London; Associate Research Director at Corporate Research Forum


 Contact:

 Eric Barends, Managing Director of the Center for Evidence-Based Management 

Karen Plum: 0:01

Well, hello and welcome to this special live episode of the podcast. I'm your host, Karen Plum, and today we're having a kind of 'ask the expert' session, with three evidence-based management experts who really need no introduction, but I'm going to ask them to introduce themselves anyway just to get us going. So let's start with Denise.

 

Denise Rousseau: 0:22

I'm Denise Rousseau. I'm a Professor in Pittsburgh at Carnegie Mellon University and along with my colleague, Eric Barends, we co-founded the Center for Evidence-Based Management.

 

Karen Plum: 0:35

Wonderful. And Rob.

 

Rob Briner: 0:37

Yeah, I'm Rob [Briner] and I'm a half-time academic working at Queen Mary University of London, and I've also got a position in an HR research training consultancy business as well. I've been doing that for a couple of years, so I have a sort of split role between academia and practice.

 

Karen Plum: 0:54

Wow, okay. Last but not least, Eric.

 

Eric Barends: 0:58

Yeah, that was all very short. I think I should add to Rob's profile that he has been the Scientific Director for years, of course, for the Center for Evidence-Based Management. I'm the Managing Director, so I'm not an academic, I'm not a professor, but I do teach, that's true. But I have a long history in businesses and organizations, mostly healthcare. Yeah, that's the short version.

 

Karen Plum: 1:31

Okay, well, welcome to you all. It's great to have you all together on the podcast, because we don't have this luxury very often, and it's very exciting. So I thought it would be fun to ask the three of you to answer some common evidence-based management questions and maybe some not so common ones. Some of the questions have been sent in from students, and we also have one from an evidence-based management teacher, because they get asked these sorts of questions all the time. 

So let's get started with the first one, which indeed is from a teacher, and this one is from Preston Davis, who teaches at Kennesaw State University in Atlanta, Georgia, and he's asked: "What's a practical way to teach evidence-based management thinking to mid-level managers who feel overwhelmed by frameworks or academic terms?" Rob, do you want to kick us off on that one?

 

Rob Briner: 2:22

Yeah, I'm assuming this is about teaching part-time students, if they're also mid-level managers; I make that assumption. In terms of frameworks and academic stuff, I often find it helps to start off quite gently, and thinking about the basic principles of evidence-based management, such as multiple sources, taking a structured approach, making use of the best quality and most relevant evidence, is not a bad place to start. And if they're actually working as well as studying part-time, that's great, because they can come with real examples from their own work and their own organizations.

And one challenge sometimes of teaching this stuff, as I'm sure Eric and Denise have found a lot, is that if students aren't working and have very little organizational experience, they can't bring real practical problems to bear in the classroom. So I think really I'd say, start with the principles and focus on real things that the students, the managers, are actually dealing with in their organizations right now.

 

Karen Plum: 3:22

Yeah, because I guess students very often will say surely those sorts of things don't happen in organizations?

 

Rob Briner: 3:29

Exactly yeah.

 

Karen Plum: 3:31

So, Denise or Eric, do you want to weigh in on this one?

 

Denise Rousseau: 3:35

Well, it's a very real challenge in the classroom to make things come alive. Now almost all my students are graduate students or executives, so they have work experience. But I think one of the key things is the worked problem: taking a situation that is commonplace and understandable and then having them think through how one would gather the best available evidence and, before they know the frameworks, evidence on what?

 

Because, of course, one of the really big issues, probably the critical step that's missed in practice, is solving the right problem: figuring out what's really going on and gathering evidence around the problem of the situation or the case before working on solutions. Solutioneering is a little bit more common in actual practice for busy managers, so the idea of taking time at the beginning to suss out what is going on, using different sources of evidence, is often the most powerful step for people to work on. And that can be from a case they bring, or that can be from the pile of cases that we have available to teach from. One of the things I will say about teaching is, if we work a case in one class, we'll always come back to it in different ways later, because revisiting problems we think we've understood is really valuable as people get a sense of, geez, this is how a systematic process would work. It doesn't have to be a one and done. That would be my response to Preston's question.

 

Karen Plum: 5:27

Super. Well, let's wrap this one up. Eric, do you have any additional thoughts?

 

Eric Barends: 5:32

Well, it reminds me of my own time being a middle manager, and it's horrible, because the top of the organization comes up with all kinds of ideas and an abundance of new policies and things like that. And it reminds me of a t-shirt I once gave to Denise: "Don't panic", or was it "Keep calm and ask for evidence"? I think an evidence-based approach...

 

Denise Rousseau: 5:56

I was wearing it just the other day on a hike.

 

Eric Barends: 5:59

I think an evidence-based approach... I mean, for me it gave me some peace of mind. Like, you know, there's so much stuff going on in organizations, and don't think that everything that happens in an organization is well thought through or is really solving true problems, because a lot of the stuff that happens you probably won't recognize as being a problem. As a middle manager, you go, why are we doing that? I don't think we have a problem here.

 

I mean, some new executive comes in and has this idea, or wrote this Harvard Business Review paper, or came from... oh, but this is a former McKinsey partner, he knows exactly. Well, no, he doesn't, but you know, we're going to do it anyway. So I think taking an evidence-based approach also teaches people to get a little bit of peace of mind and just stay calm and try to ask some questions, and to realize that a lot of the stuff that happens in organizations is just fluff. There's just a lot of things going on, but not always something that you should take too seriously.

 

Karen Plum: 7:05

Making things up as we go along, right. So let's ramp it up a bit, because there were quite a few questions around making decisions when evidence isn't conclusive. And here's the first one, indeed from a Spanish business school called ESADE, and we had loads of questions from there. This one is from Matthias, who asks: "How do you handle situations where the best available evidence is limited or inconclusive, yet a strategic decision still needs to be made?" Denise, what do you reckon?

 

Denise Rousseau: 7:41

It's a great question, and it's realistic, because the biggest issue in evidence-based practice, I think, is to work to reduce the reducible uncertainty, and in some problems that's quite a lot. In other problems, especially in novel situations or in strategy, in emerging contexts where the business is changing, an evidence-based approach is to shift gears from acting on knowing to acting on learning. And to reduce uncertainty, the best bet is to adopt a strategy of trial, test, feedback and learn. So the notion of identifying the kind of decision you are making, and whether the uncertainty is really reducible or will only be surfaced as you behave and practice and learn, causes you to do very different things: learn from the evidence you generate as opposed to the evidence that exists historically.

 

Karen Plum: 8:43

I guess my thought about this one is that, you know, there's going to be the temptation to say, well, the evidence is limited or inconclusive, so let's just go with our gut, or let's just do what we always do.

 

Rob Briner: 8:56

Yeah, and if I can jump in there, this is something I do less with students but more with, you know, real practitioners. They often ask this question, so it's a really good question, it's quite common. And I tend to say to them, well, it depends what kind of problems you're dealing with. Typically in management we're not dealing with two-plus-two-equals-four type problems. So of course the evidence is inconclusive, of course it's partial. That's normal.

 

And the first way to deal with that is to actually say, well, we're not looking for conclusive evidence. The evidence is always partial. We're trying to make a better-informed decision. As Denise said, even when you get to that point, in some contexts you can test and trial things as well. But if people go into this with the expectation that they're going to get a lovely, complete set of evidence (even if such a thing were possible, and whatever that means) and reach a really clear conclusion, I think that's pretty unusual. So I think more typical is: yeah, it's not conclusive, we haven't got everything, that's okay, we're trying to make a better-informed decision.

 

Karen Plum: 9:55

Yeah, and if it was that simple, then we wouldn't be sitting here talking about it.

 

Rob Briner: 10:01

Or we could just get AI to do it if it was that simple, right.

 

Eric Barends: 10:04

Yeah, I would argue that it also depends on the outcome. I totally agree with my colleagues, of course, that evidence-based decision-making is about reducing uncertainty. I mean, that's how it started in medicine. You know, should I do this, should I do that, what's going on with this patient? But you can't reduce it to zero. You won't get a best available evidence approach that gives you a 100% certain outcome. That's not going to happen.

 

Eric Barends: 10:30

So it is important to quantify the uncertainty a little bit, so that you have some idea: how uncertain are we? Is it a crapshoot, and the only thing we have is just intuition and maybe one paper or some expert opinion here and there, or do we have way more? And then it depends, of course, on the outcome.

 

Eric Barends: 10:50

If you're talking about a hospital, you should maybe invest a little bit more time to get a little bit more evidence, when the uncertainty is high and when the outcome is, I don't know, a new treatment for cancer in children or something serious, or you work with a school or educational institution. Which is completely different from when I or Rob or someone has his own business and we're doing an innovation and I put in my own money; then, you know, it's up to me as an entrepreneur to take a lot of risk, and that's perfectly okay. So the outcome also determines how much effort and time you should put in. It has to do with accountability. I mean, if it's a very important outcome and you had just little pieces of evidence and you said, ah well, let's go with it anyway, man, maybe you should think twice and invest a little bit more time.

 

Karen Plum: 11:49

Good stuff. So a few minutes ago, I think it was Rob who mentioned AI, and actually from the students at ESADE we had loads of questions about AI. I guess we're not surprised, but let's talk a little bit about the use of AI in decision making. So here's a question from Christian: "With the growing influence of AI, are you worried that human critical thinking might be eroded as we delegate judgment to AI assistants?"

 

Rob Briner: 12:19

Well, I'm worried it already has been eroded way before AI.

 

Karen Plum: 12:23

Who said that? Who said that?

 

Rob Briner: 12:27

Eric, Denise, what do you think? What do you think?

 

Eric Barends: 12:31

Yeah, I mean, it's so funny because I hear this question all the time, so it's so recognizable. The younger the people are, the employees in an organization, the more they're all like, oh, everyone needs to invest in AI now, regardless of what kind of sector or domain or whatsoever. It's all about AI, especially in the field of change management and HR; there is a lot of AI going on. And, yeah, I do worry if people don't realize that it is so imperfect, and that a lot of the answers are based on what we call hallucinations or on just being nice, especially the American AIs. I mean, they're mostly American.

 

Denise Rousseau: 13:12

Now, now, Eric!

 

Eric Barends: 13:16

They want to please the client, they're very customer-focused. So when you make a proposition, they go like, oh, you're absolutely right. And you go, no, you should say you're absolutely wrong. So yes, it happens, and I'm afraid that it may corrode critical thinking. So it's very important to pay attention to that, absolutely, yeah.

 

Rob Briner: 13:39

Eric, do you think we need more Dutch AI then? More Dutch AI?

 

Eric Barends: 13:42

Yeah, absolutely.

 

Karen Plum: 13:43

Straightforward. Yeah, very straight talking.

 

Eric Barends: 13:47

Yeah, absolutely.

 

Karen Plum: 13:50

Isn't part of this that you know, we have to know what questions to ask the AI?

 

Denise Rousseau: 13:56

You hit the nail on the head, Karen. I'm probably the optimist of the group. I think social media has already ruined our critical thinking, so we can only go up from here. But the good use of AI to somebody who's knowledgeable and says, geez, I want these different kinds of evidence. I want to know industry information with regard to such and such, or I want to get a summary of what is known about efficacy of remote work.

 

Denise Rousseau: 14:36

We have very good AI-based tools. I think Consensus is one that is a lot more effective for many people than going to Google Scholar and trying to get access to research evidence or access to industry information. So there is, you know, a silver lining to the AI cloud, and one of the things I delight in doing in the classes that I teach is having people first learn a process for gathering, let's say, industry evidence or evidence from science for the questions that they want to answer.

And then to look at how AI, Consensus in particular, works. And though they see differences, because they have a more fine-grained understanding of what is a good research method to answer the question or what makes for representative industry data, they can still see the, what do you want to say, the substantive value of quick searches in a well-structured AI tool. And I think that's going to help us, because what everybody will say is, I'm too busy. Well, you're not going to be too busy for this.

 

Eric Barends: 15:45

Yeah, I agree. I mean, I like to be critical towards AI, but I think a lot of people have this idea that, oh, AI is this one thing, this generally intelligent thing that I can go to as a source. I don't think it's going to work that way. Consensus is a group of people that we're in communication with, so it's really great that we can put things like this forward to them. But Consensus, for instance, is a specific tool focused on, you know, looking into the scientific evidence, peer-reviewed papers on a specific topic. So it's very specialized, and that is different, and there are more specializations coming, and it really makes a huge difference.

 

In the past, indeed, in the old days, probably Denise remembers that, but I was not there because I was not into this topic yet, if you wanted to find out if there was any research on a topic, you had to go to the library, and the librarian would say, well, you know, it's probably in the Journal of Organizational Behavior or the journal of this, and then you had to go through all these papers. And, yeah, any idea?

 

No, you just go through a couple of years; maybe there's a study. Then, of course, with the internet, we got all these online research databases that you could send queries to, but you had to know about Boolean operators and systematic searching, which worked. It was already a revolution; it was one of the most important drivers of evidence-based medicine, let me be clear here, and we taught this till about a year ago. But a year ago we had to change our online course modules and our book on how to search, because AI search tools like Elicit and Consensus came around, and that completely changed the landscape. So I think things will improve. I think taking an evidence-based approach will get easier in some parts, but still, you really need to be able to critically appraise and have critical thinking skills, because we know that AI tools, even Consensus and Elicit, still get some very basic stuff wrong.

 

Rob Briner: 18:10

Yeah, just to pick up on that, I think one of the times I've seen AI be most useful in this context is actually in analysing different forms of data. So, whether it's stakeholder evidence or expertise, organizational data, academic stuff, if you've already done the critical appraisal and checked it's relevant, and then you put it into an AI and ask it to help you analyse it, it's pretty good for that, I have to say. But not if it's just scraping around the web looking for all kinds of rubbish. I think it's pretty good if you actually put the stuff in and then use it to quickly help you analyse it.

 

Karen Plum: 18:41

Yeah, so it's using the tool in the best way, I guess.

 

Rob Briner: 18:45

At the moment, yeah, I would say it seems yeah.

 

Karen Plum: 18:48

I mean, there were some other questions about how we validate the evidence that the AI is using. And, to your point, Rob, if you're giving data or sources to the AI and asking for its view on what you've got or the conclusions you're drawing, that's very different from asking the AI, well, what is the evidence for this particular construct or whatever, and then just accepting what it says, rather than saying, well, what are your sources for that, and then investigating those. Is that another valid way to approach the thing?

 

Rob Briner: 19:28

I think so. Yes, I think it is. But again, as Eric and Denise are saying as well, it's also using your own critical thinking and understanding of what the data and evidence from whatever source is, what its quality is and how relevant it is. I think that's still something that feels to me, Denise, like quite a lot of human rather than AI intelligence at the moment, and that may change. That may change very fast, I'm not sure. Yeah.

 

Denise Rousseau: 19:52

But that's a lovely way, I think, to focus on what the value of evidence-based decision-making training is. It's first learning how to ask questions, to understand what the real problem is, and then it's identifying where the evidence is and, as both gentlemen have said, critically appraising its trustworthiness and relevance to the question. And that training, I think, is incredibly important, because often people don't really have tools for thinking about: is this really relevant? Well, the topic's the same, yeah, but I'm asking a causal question and this is a correlational study, so should I really follow its conclusions if my question is really about a more specific kind of outcome?

That's enlightening for people, to the point that we've all had students who wind up correcting other faculty once they complete the evidence-based management course; the faculty make causal claims and the student says, but can you do that with that research method? You know, my little heart just hums when that happens. Fortunately, I'm in a place where my colleagues come up and say, wow, that was really thoughtful of them, as opposed to being really ticked off or embarrassed.

 

Karen Plum: 21:10

Yeah, or they're not winning any popularity contests in their university.

 

Eric Barends: 21:16

I also noticed that AIs are in general very lazy. They take the shortcuts.

 

Karen Plum: 21:24

They like to be loved too, don't they?

 

Denise Rousseau: 21:28

Not the Dutch ones but yes!

 

Eric Barends: 21:32

I have a special name for my AI.

 

Karen Plum: 21:35

You're building up a relationship - is it very Dutch then?

 

Eric Barends: 21:37

Yeah, well, you can ask them, you know, to take your history into account, or give them information which is relevant for you. So if you say, hey, I'm very much into evidence-based decision making or whatsoever, I'm an evidence guy, it will take that into account when you ask a very serious question. But it is important to prompt them in the correct way. And we just did a one-to-one comparison with ChatGPT and Consensus et cetera, and we asked about a paper on human resource interventions, I don't know what the intervention was, flexible working or whatever. In this paper the effect of flexible working on productivity was examined, and there was a positive effect. And we asked, were there any mediators or moderators, or as we call them, contextual factors, that affected the relationship, the outcome, and were they discussed in this paper? And it said, oh yes, there were several, and blah di blah di blah. And then we had a look into it.

 

Yeah, that was in the discussion section, but it was not an original finding of the paper. So you need to be clear and specify: could you check whether, in the results section, there are any original new findings? And we discussed it with Consensus, and now, if you get that question, I know that you should not look at the discussion section but only at the results section; but you need to be very specific. So maybe ask follow-up questions.

 

Like, you know, Usain Bolt has never tested positive for drugs and performance-enhancing stuff, and I asked, how likely is it that this guy has indeed been completely clean his whole career? And the answer was, yeah, well, he never tested positive, so he has probably been completely clean his whole career. And I said, well, could you take into account, what we call the Bayesian prior, that first of all a lot of people have tested positive, and that Lance Armstrong never tested positive and actually used a lot of drugs and things like that to stimulate his performance? And then it starts thinking, like, that's a good point. Yeah, if you take into account that it's so rife, that so many athletes have been caught, yeah, well, probably it's not 100%.

 

Eric Barends: 24:11

So you need to prompt it and say, are you sure, could you please take this into account, and can you have a look, and could you please only look at peer-reviewed journals; you know, make it specific.

 

Karen Plum: 24:36

In a sense, I think what we're saying is that if you're going to use an AI in the pursuit of better evidence, you've got to know the right questions to ask and the process that you're trying to follow, so that you can interrogate it and make sure of the veracity of its answers. You can't just use AI without knowing what it is you're looking for.

 

Denise Rousseau: 24:48

Knowing the standards of evidence or of trustworthiness that you seek to apply, and specifying those criteria to the AI. Perhaps the most critical aspect of evidence-based decision-making training is having a way of judging what is trustworthy and what is not. I think that's actually a competency that has real legs, because it then carries over to how organizational data are put together and to claims that are made in other parts of a person's life. So that's one thing I'm kind of proud of about this evidence-based management movement: what it's opened up in the thought processes of people at the 80 different universities and all the different students and executives that we've been able to reach with these ideas.

 

Karen Plum: 25:49

The thing that struck me a few minutes ago: one of you mentioned the identification of the problem that we're trying to solve. And, looking back over my career as a consultant, how many times have I been pitched into a project with a poorly defined problem that everybody then gathers around and sort of pitches in to try to help solve? And only somewhere down the line do you discover that there aren't many people who agree that that's the problem they have.

 

Denise Rousseau: 26:22

Absolutely. So can we ask you then, Karen, as a consultant, what are your options, or what is your approach, in the context of a problem that may not be, you know, really a valid problem?

 

Karen Plum: 26:39

I think it's really difficult, because obviously you're in a commercial situation where something has been sold to the client in terms of what they're going to get out of the exercise, and sometimes you're just not in a powerful enough position to influence their thinking and to change the trajectory of the project. So if I'm in that sort of situation, I suppose I go for the marginal gains approach of just trying to introduce little tweaks, not of doubt, but trying to encourage deeper thinking about some of the decisions that are being made and the direction that's being taken, because there are only so many battles you can fight. And particularly, I was thinking about this the other day, there are a number of situations where I've been pitched into a project when it's two thirds of the way through, and the client has thought, you know what, we need a bit of change management here, let's hire a change management consultant. And you arrive and you go, oh my God, I really wouldn't start from here. Rob, does that resonate with you?

 

Rob Briner: 27:47

Yeah, and I think Denise mentioned it already, this idea of solutioneering, and I don't know whether it's a bias exactly, but maybe it is. Certainly in the teaching, or the training in particular, that I do, it is really unusual to find a group of people trying to understand the problem first and then think about the solution who don't, either explicitly, implicitly or unconsciously, already have a view about what the solution is. And it is so hard somehow.

 

Rob Briner: 28:21

I've found it very difficult to sort of work out how you can overcome that, apart from taking this very structured approach and saying we're not going to think about the solution yet, let's just stay here.

 

Rob Briner: 28:32

And it seems, I think, almost impossible for all of us to do, because, as you mentioned, Karen, people want to be helpful, people want to do stuff, so they're already thinking about what we should do about this problem. We don't know what the problem is, but what should we do about it? And that's a very powerful sort of barrier, I think, to trying to be evidence-based. Holding people on what is the issue, what is the problem, what thing they're dealing with first, without going into solution mode, seems to be one of the most challenging things, I think. Interestingly, I don't think it's so challenging with inexperienced students, because they're often not aware of what the solutions are anyway.

But in an organization, say in an HR team, which I work with a lot, they know what all the potential solutions are, or they think they do; they know who they can buy them from, they know what their colleagues in other businesses are doing. They're very aware of what the range of solutions is. They're already on to that, and that's really tough, I think, really tough.

 

Eric Barends: 29:31

I want to push back a little. I mean, it's always been easy to bash practitioners and organizations for doing weird, stupid things, and I really like that Rob now has some talks where he explains that that is not always the case. And I do remember that Rob and I were talking with some of the board members of a big banking firm, and we were explaining something about the evidence for something they had introduced in the organization, and I think this guy said, hey, you guys think that we're idiots, that we're not aware that what we're doing is actually not working and not supported by the evidence? We know that. And we were like, yeah, but why are you doing this then? Well, we're not in a vacuum. You think that this organization is run by us and we call all the shots? That's not true. We have regulators, policymakers, et cetera, et cetera, and we do this because the regulator requires us to do it, and we know it's bullshit, and we can't call out bullshit because they will come after us. So we're going to do it anyway, so we can check the box.

 

The same with insurance companies in healthcare; they demand all kinds of crazy stuff from organizations. And, to be honest, if you're a middle manager, with all due respect, specifically in maybe more American (sorry, Denise) organizations where it's a little bit more top-down. I mean, in Holland we always ask why, who says so, et cetera. But in the United States, in a lot of organizations, it's not common to inform everyone from the top to the bottom about why things happen. So for middle managers, things happen, and they go...

 

Eric Barends: 31:25

Why is this happening? And there may be a very legitimate reason why things are happening, even for stuff where you go, but that's not going to change anything, or that's not going to be effective. Yeah, but there is a reason why they do it. So I think we need to be careful, because there are a lot of clever people working in organizations who are in charge of million-dollar decisions, and they really give it some thought. So be careful about assuming that they're clueless or, you know, ignorant; there may well be a reason why things are happening. It may be political, it may be regulatory or whatsoever. So I'm always a little bit careful.

 

Rob Briner: 32:11

By the way, I don't think this is a practitioner thing or about being stupid or clever. I think it's a human thing. Humans want things, or have ideas about what they want to do, and then they configure the world around them to get to the conclusion they've already thought of. It's not about being clever or smart; it's just something people do in all contexts. And from an evidence-based practice perspective, I think the challenge is to say, that's fine for everyday stuff, like buying an air fryer, and thinking buying an air fryer is going to be the solution to every single kitchen issue in my life. Wonderful. It may be, but it probably won't. It's like a sort of temptation: you see something that you really think is going to help, and then, as you say, you configure the evidence to drive you to the conclusion you've already thought of.

 

Eric Barends: 33:06

Let's not forget, sorry to drag on about this, that the pressure to buy that air fryer is huge, because all your friends have that air fryer.

 

Don't you have an air fryer? And that happens with organizations too. In Holland, there was huge pressure on banking firms to adopt agile working. Everyone was working agile, had tribes and, I don't know, scrum and God knows what. And supervisory boards asked the boards, do you already have this in place? Do you have a talent management system in your organization? You should; you know, that is something that is now the standard. Oh, you're not doing agile? This is actually really a new management innovation, you should really look into this. So let's not forget that the pressure is huge.

 

Karen Plum: 33:58

Absolutely. Okay, well, I'm going to exert some discipline here and bring us back to our questions, to see if we can get in a couple more before we wrap up. So here's another one from ESADE, from Christian this time: "What is a really common bias that often flies under the radar?"

 

Denise Rousseau: 34:25

Well, probably confirmation bias. I guess we would say that if it's worked before, it'll work again, and it's easily available because it was applied. As you just said, other people have done it, so maybe we should do it too. And I wouldn't diss people for using social cues and what others are doing as a source of insight, because it's a type of social proof, you know, that something might work. It's just, be critical and say, will it work for us? And compared to what?

 

Karen Plum: 34:57

Yeah, I mean, you know, trying to fight our biases is futile anyway, right? But what other common biases might fly under the radar?

 

Eric Barends: 35:08

I would argue that's the whole thing. The moment you're trained and skilled in an evidence-based approach, you will be the radar. You make it visible. You will notice everything that was indeed, you know, not outspoken or under the radar, like confirmation bias, or authority bias, or just trying to get consensus and confirmation. That is something you will notice if you learn about these biases. I mean, they're hard to correct in yourself, but if you are trained and you learn about all these biases and you step into your organization, it's one of the exercises we do: just, you know, be observant during some of your management or board or executive meetings, or whatever meeting you are in, and try to be a fly on the wall and just observe and see what kind of biases come along, and maybe you can figure out which is the most dominant.

 

Usually, organizations have a dominant bias, one or two, that they always fall for. That's their go-to approach. You're going to be the radar, and then the question is, of course, what are you going to do? Are you going to call them out? But yeah, an evidence-based approach makes those biases visible or brings them to the surface, I would argue.

 

Denise Rousseau: 36:28

At the same time, it's really difficult, as Karen is alluding to, for us as humans to see our own biases. We can see them in other people, and indeed that's probably the insight, which is to say, I'm aware of biases but I can't see my own. So I need other people, in conversation, in reflection, in checking out different sources of ideas, to be able to see, oh, this was my assumption, and that's kind of questionable. So I would say evidence-based practice is a social practice, because it's very hard to do it alone.

 

Rob Briner: 37:07

I'm just going to say exactly what Denise said: I think it's like a team sport. We are really good at spotting other people's biases, particularly when we don't agree with them, but we're really terrible at spotting our own. So I think, as a sort of exercise, doing this with a team, doing it with other people, is really important, so you can interrogate, as it were, each other's assumptions about what they think they're seeing in some data, for example.

 

Karen Plum: 37:33

Okay, thank you for that. So next question, back to Preston, our Kennesaw State University teacher. "How do you build a culture around evidence-based decision-making when leadership prefers visionary or gut-driven styles?"

 

Denise Rousseau: 37:52

I would say start local, because one of the issues in evidence-based practice is that it can be a whole organization, or it can be a group of people, or indeed it could be an individual. I would start in the area in which I have control over my decisions. And one of the things we do know from research on leadership and the practice of evidence-based decision-making is that managers who engage in evidence-based decision-making and explain to their employees how they made a decision, regardless of the outcomes, regardless of the impact on employees, are more trusted. So think about this as a campaign to be trustworthy with regard to your own staff, but also to model for them what you would like them to do, rather than always worrying about what the broader organization is doing, because each and every practitioner has a zone of influence only over his or her own practice. So that's what I say: start there.

 

Rob Briner: 38:53

Yeah, and I also tend to agree there, in a way. If you're making decisions and you're trying to make better-informed decisions, I don't think you need to create a culture as such. I think you need to help shape people's behavior.

 

Rob Briner: 39:10

So I've focused much more, as Denise said, on the behavior and on role modelling rather than this vague concept of culture, and also bear in mind, again, and Eric said this too, that a lot of decisions, a lot of things that go on in organizations, are beyond, have nothing to do with, evidence-based management. So I think you don't need to do it in a widespread way. You need to focus on those activities, those decisions, those areas where it actually makes sense to do it. If people are just telling you to do stuff, or you've got to do it because of a regulator, there's no point in taking an evidence-based management approach. Where you can do it, do it there. So, rather than this culture and everyone doing it, do it specifically when and where it really makes sense to do so and where it's going to have the most impact, I think.

 

Karen Plum: 39:52

I was waiting for one of you to say well, what do you mean by culture?

 

Rob Briner: 39:57

I was there.

 

Eric Barends: 40:00

Normally we would ask that, but let's leave it this time. I would also add, I remember this man, he's very well known in the United States, Quint Studer, and I think he made a comment like, well, if your organization is really not interested in taking an evidence-based approach, you could ask yourself, do you want to be in that organization? So sometimes it's overwhelming, and you have to pick your fights. Don't think for a second that you, on your own, or your colleagues can change the culture or the climate of the organization, if there is such a thing as the culture. So pick your battles.

 

As Denise always says, and as Rob also often points out, a little bit of evidence is more than no evidence, and two sources of evidence are better than one. So I think you can add value here by taking a more supportive approach. If the leadership in your organization tends to take a more visionary or consensus-based approach, and there are all kinds of alternatives to an evidence-based decision-making approach, of course, then just be supportive. Be supportive, and, for instance, if a decision is being made or considered, try to find the evidence regarding that decision and bring it forward and say, hey, I understand we're considering this, I had a look into this. And not in a sort of, you know, lecturing way, like you should know this, but bring it forward: maybe you should have a look at this, I think this is relevant. So support your boss, support your manager, support your executive. Put your executive in a position where he or she has maybe a little bit more evidence that he or she can use in the organization.

 

But I've noticed that if you say, hey, I understand we're considering this, I found this paper, I'm not completely sure, but I think it's relevant, maybe you should have a look at it, they go, oh, thank you. And then they make it their own. So, rather than you pushing it in their face, they own it and read it, saying, no, thank you, this is indeed really important. And then they take it with them to a larger discussion. So don't tell them, hey man, the evidence says that this is absolutely not going to work. You need to be a little bit more, you know, strategic.

 

Eric Barends: 42:40

It's called managing upward. You need to be strategic towards the decision-makers in your organization and show them: hey, I found this, I'm not completely sure, but maybe you should have a look at it. You're probably clever enough to figure out whether this is important, you're way cleverer than me, and it's a meta-analysis that suggests that X actually does or does not work. Maybe it's helpful. And that is a starting point.

 

Karen Plum: 43:08

And then if they ignore it, that probably tells you all you need to know.

 

Denise Rousseau: 43:14

It's worth pointing out that there is research on evidence-based practice in organizations; there are like 109 studies I've found in the published literature. And what's striking to me is that virtually all the research is on nonprofits and governmental organizations. You know, Peter Drucker always said that nonprofits have a lot to teach for-profit organizations. A big part of the reason is that they don't have tons of resources, so they have to be careful. They have to have clear models of how their services are provided and rework them as they come to understand what's successful and what's not. But what's really striking from this body of research is two things. Number one, leadership is hugely important, and it's much easier to engage in evidence-based decision-making at all levels if the leaders are supportive and they do two things: provide resources and model it themselves.

 

And the other part of it is the social nature: people who are practicing using evidence in different ways in their jobs typically don't do it alone. There are networks of people in their own organization who provide information. There's somebody who's everybody's librarian; that would be me if I were in a non-profit, I'd be the local reference librarian, because I like that. They have external ties to universities and organizations that run studies or have access to data. And typically people also see themselves as professionals, so in their professional networks, be it public health or fundraising or program management or whatever it might be, they have access to others who give them insights and information with regard to the changing nature of their practice area.

 

Denise Rousseau: 45:07

This is a much more socially embedded way of using evidence than, I think, when the three of us were first working together, we ever really appreciated, because we were looking at the lone wolves, those who wanted to be the evidence-based practitioner in an organization that was anything but. But the landscape is changing, and it is more social, and there are many more organizational examples of cross-level practices. Does it look like everything in our book on what evidence-based decision-making ought to be? No, but we were focused on the skills that people might want to develop to gather evidence and ask questions about their problem. I don't think we've identified the design of effective evidence-based practice in organizations. In my mind, that's still up for grabs; that's something we're going to be learning in the next 10 to 15 years as these practices are taken up.

 

Karen Plum: 46:16

Well, that sounds like a good enough place to finish; I think we're going to wrap up there. So, my thanks to all three of you, Denise, Rob and Eric, for your time today answering everyone's questions. I found it really interesting to hear all your thoughts and experiences, and I'm sure our listeners will too. I think we're going to make this a regular podcast feature, so keep those questions coming in. Thanks for listening to this episode. See you next time. Goodbye.