Evidence-Based Management

Module 13 Apply - incorporating evidence into decision-making

May 25, 2022 Season 1 Episode 13

This episode accompanies Module 13 of the course, which is about incorporating evidence into the decision-making process.

Is the evidence appropriate for our situation (is it generalizable)? Can we (and should we) action the evidence? And what are the practical aspects of using the evidence we've gathered? The real world is messy and complex, so there are many things to take into consideration.

We share real-world examples from professional HR body CIPD, from Creelman Research, and from a large trial at pharmaceutical organization Sandoz (part of the Novartis Group) - a trial that shows not only the power of capturing stakeholder experiences, but also the value of keeping interventions simple and practical.

Host: Karen Plum

Guests: Eric Barends, Center for Evidence-Based Management; Denise Rousseau, Carnegie Mellon University; Jonny Gifford, CIPD; David Creelman, Creelman Research; Stefanie Nickel, Sandoz

Find out more about the course here: https://cebma.org/resources-and-tools/course-modules/

00:00:00 Karen Plum

Hello and welcome to the Evidence-Based Management Podcast. This episode accompanies Module 13 of the course, which is about incorporating evidence into the decision-making process. Is the evidence appropriate for our situation? Can we, and should we, action it? And what are the practical aspects associated with using the evidence that we've gathered? 

I'm Karen Plum, a fellow student of evidence-based management, and in this episode we hear from two of our regular contributors: Eric Barends, Managing Director of the Center for Evidence-Based Management, and Professor Denise Rousseau from Carnegie Mellon University. And this time we're joined by three new guests - Jonny Gifford, Senior Researcher at CIPD, the professional body for HR and people development; David Creelman, CEO of Creelman Research and expert in HR strategy, analytics and learning; and Stefanie Nickel, Global Head of Diversity and Inclusion at pharmaceutical manufacturer Sandoz, part of the Novartis group. 

Let's hear what they had to say about applying evidence to decision making. 

 

00:01:25 Karen Plum

In this episode I wanted to include some practical examples of using evidence in decision-making, and those come in the later part of the episode. Firstly, I think it's worth saying that the further I got through the course, the more the modules seemed to overlap. I felt that by the time I reached the “Apply” stage, I shouldn't still be evaluating the evidence - that should have been done earlier. So if something isn't generalizable to our situation, it should have been rejected during the “Aggregate” process. 

Perhaps that's being a bit too linear and precise, particularly as life is rarely that cut and dried, and the Apply stage is actually the final opportunity to ensure that only the best evidence is used. 

So let's start by looking at whether the evidence we've gathered is applicable to our situation. I asked Eric and Denise for their thoughts about how to determine whether the findings are generalizable. 

00:02:22 Eric Barends

Yes, the generalizability. Does it apply? Does this evidence also apply to my organization, to my population, or given the outcome I'm aiming for? That is typically something that you should discuss with the experienced people in your organization. So go to managers, go to employees - if it is about a division or a group of employees or a new way of working - talk with the people and say, hey, this is the evidence we found. I think it's actually applicable to our situation, to our organization, to our people - how do you feel about this? 

And then sometimes you get good feedback that gives you pause, because you don't know every aspect of your organization. When I was in charge of a hospital, we wanted to implement performance feedback - that was a no-brainer, you need to give people feedback. And then we had to look at all the research on feedback, and one of the most important moderators of whether feedback has an effect is whether the person is open to feedback. 

And then one of my managers said, whoops, then we have a problem with the surgeons. I said, what do you mean? Well, you know Eric, internists and pediatricians - they're all open to feedback, they're kind of insecure, they want to check and double-check. But hey, surgeons are pretty straightforward, they know what they're doing, so they don't need your feedback. That may be a cliché, a caricature of course, but this is information from your organization that may be valuable, something you need to take into account when you design the intervention in your organization. 

00:04:08 Denise Rousseau

In my experience teaching management students, people are very quick to discount findings they don't like based on the setting. And that's a human bias that is problematic in itself. 

So my advice is to factor the scientific findings you've acquired into your thinking and ask: how might they apply, and what might change in my understanding if I were to give this information some standing and treat it as trustworthy? Because how the evidence helps us think about the problem we're trying to solve, or about our choice of solutions, is really what matters, and being unwilling to think about the implications is a problem. 

I think generalization is an important issue, but to a great extent there are some pretty clear answers. I would say that if we're talking about matters that pertain to human nature, motives and cognition, people are people. There are factors that may lead to some variation - gender, in the sense of socialization for gender roles, or society; some research indicates somewhat different orientations towards relationships in Asian cultures as opposed to Western cultures. 

But on the whole, cognitive and behavioral responses to conditions are pretty stable and consistent. Picking up on a key idea in evidence-based management: single studies don't count, but bodies of studies do, and if they converge, it's likely they will generalize to your situation - provided you've framed the question you're trying to resolve correctly, because if the question you've asked is relevant to the research findings, those findings have a role to play in guiding your thinking. 

00:06:08 Karen Plum

Denise also explained that one area where generalizability needs further thought is where technology is involved. For example, if we're looking at research on how people communicate using technology, then studies done in the 90s with the then-current video conferencing technology would give very different results compared to looking at this today - in a world where we have super-fast broadband, Zoom and high-end webcams.

So we have the evidence; it's applicable to our situation, but how do we decide whether it's actionable? The Center for Evidence-Based Management does a lot of rapid evidence assessments, essentially compiling the best evidence from the scientific literature on different topics like psychological safety, performance management and feedback. All of this is interesting, but how do we apply it? 

00:07:01 Eric Barends

So for instance, one of the outcomes was about ethical climate and people doing nasty things in organizations. It turns out that one of the most important moderators, or drivers, or predictors is whether they perceive their supervisors or the management in the organization as ethical. 

Now that's very nice, nice to know, but then the question is: now what? I can't tell all my directors and the board and the supervisors, please come across as ethical, because that will solve our problems with people doing nasty things. So here again, what we often do is invite managers, employees and practitioners to reflect on these findings and ask the experienced managers: given that this is an important driver, an important aspect, how would you go about this? What would you do with this? And then, of course, you may come across ways to apply this evidence that are far more actionable. 

I mean, in our case the HR people said, well, that means we need to screen our supervisors and managers on ethical behavior, and take it into account when we promote or hire someone, et cetera, because this company had a very serious issue with people doing nasty things. This was a financial services organization, and that is how the evidence was made actionable.

And they decided to develop a workshop and point out to the managers and the supervisors that their behavior, their comments, their attitude, has a direct impact on their subordinates, so they should be aware of that. 

So here's an example of information from research that is not directly actionable. Academics just leave it there and say, OK, well, this is an important moderator, this is what we found, an important predictor or driver. And then you have to take the next step yourself to make it actionable, by consulting people in your organization. 

00:09:15 Karen Plum

And that brings you back to practitioners and stakeholders. But as David Creelman points out, in the circular nature of evidence-based management, if you don't start in the right way, don't be surprised if there's a problem when you get to this stage. 

00:09:29 David Creelman

Sometimes you've got all the evidence, and you've aggregated it, and so as far as you're concerned as an evidence-based practitioner, you're 100% ready to go, and then you find that people don't want to go along with it. 

It's very common to think, well, the problem is that I need to tell a better story. I need to learn to tell the story so that people understand how valuable this evidence is and why this practice is indeed a best practice that's going to lead to good results. And so you think something went wrong at the end, whereas in fact something went wrong at the beginning: you didn't get people's buy-in to whatever you were working on right at the start. 

So for example, suppose you've discovered a thing we should do in all our meetings to make them more effective - something simple. You find all this evidence that having an agenda leads to better outcomes, you've got all kinds of academic research and other data you've gathered, and then you run around telling people - I'm an evidence-based practitioner and a researcher, I found out this useful thing and we should definitely all start using agendas, the evidence is very conclusive.

And the managers are sitting there saying - who asked you? I'm happy with the way the meetings are run, I've got other things on my mind. 

00:10:49 Karen Plum

As we've discussed in other episodes, you have to start with the question you want to answer, not set up a data analytics department and go looking for problems to solve - something David says he's seen many times. 

That can simply result in uncovering relevant, actionable findings, but with no organizational buy-in to fixing the problem they may solve, or indeed no agreement that the problem actually exists in the organization. David was also quick to point out that there are other types of implementation issues, and he gave an example of an organization keen to improve job interview outcomes. Even with buy-in within the organization and plenty of evidence to show what the right solutions are, there are always complications. 

00:11:35 David Creelman

You might have a process based around the idea of a one-hour, face-to-face interview, and then you get to some part of the organization that says, well, first, we never do face-to-face interviews, and they're never an hour - it's just not how it's done in this type of hiring process, and I don't think anybody will want to change that. So you now have to work out how to adapt to this particular situation. 

So I think that implementing anything always has a little difficulty - it does become a problem-solving journey, even after you're pretty clear about what best practice should be. 

A lot of the time people only have so much bandwidth in terms of things they're worried about. So even if you're coming to them with something they care about, and maybe there is an evidence-based way to do it better, they might just think - I don't have the time or the intellectual energy to figure this out and actually implement it. 

Like maybe I have to train people, maybe even though I'm the boss, I have to convince other people, I have to find some way of monitoring how they're doing it, and I have to work out these little glitches that are bound to arise when we get into practice and I just don't have the heart to take on that project. 

00:13:00 Karen Plum

The truth is that the world is much messier than in academic studies. People don't always respond rationally, or they have political motivations or self-interests that get in the way of accepting a new approach, even if there's a ton of evidence to support it. These are the human factors that don't necessarily show up in your analysis, as David put it. 

In determining how to apply the evidence, I talked to Jonny Gifford, who has worked extensively with the Center for Evidence-Based Management to gather the best available scientific evidence on a number of topics of interest in the HR field. 

00:13:36 Jonny Gifford

When you look at the established practices involved in evidence-based practice, you've got these very clear processes that you go through. But when it comes to bringing together the different important sources of evidence, I think it's much more of a craft. We need an iterative approach where we're weaving together those different sources of evidence. 

I can give you an example of a project we did looking at what works in diversity management. We ran three workshops. In the first workshop we got together 20 or so senior D&I professionals. And we essentially said to them, what keeps you awake at night? What are the problems and challenges that you face in your profession, trying to increase inclusion and diversity in your organizations? 

So we got insights from them on that. Then we went away, honed those issues into research questions and did a bit of an evidence review on them. We came back in a second workshop and said, OK, this is what the research looks like. What might that look like in practice, if you were to draw implications from it?

And from that conversation we then went away and drew up some recommendations, and we met again and road-tested them. So I think it genuinely needs to be that iterative process. 

00:14:58 Karen Plum

As Jonny explained to me, that approach can be very resource intensive. So another approach they use when exploring topics for their members is to do some initial research and then circulate a short summary to interested HR directors for their input and comment, which helps bring the subject and its complexity to life. 

00:15:19 Jonny Gifford

Even when we do that latter approach, that lighter touch approach, we still take our steer from practitioners in the first place. What I don't think works is starting just with academic research and only really engaging with practitioners because you want to disseminate your own research. 

I think another way of thinking about it is: what value do practitioners and academics and researchers - different parties - bring to the process? The most important question, I think, that any practitioner can ask researchers in their field is: what works? How can we address this problem, or how can we realize that opportunity for improvement? That, I think, really needs to be the starting point.

00:16:05 Karen Plum

I've been in similar situations with research sponsors. Their practical insights bring the whole discussion to life and highlight the challenges of implementing things that initially seem like no-brainers. My focus in the research was to find practical ways of implementing - keeping them simple and doable. 

David mentioned that being able to nibble away at the complex problem, based on all the things that have been found, is much easier than trying to find one magic bullet or solution. 

00:16:35 David Creelman

As you implement, you’re gathering additional evidence all the way, so your initial studies made it look like this was the right approach. And as you begin to implement a piece at a time, you refine your understanding based on what you're learning as you're doing it. 

Let me just add in one idea 'cause this is what I would have been guilty of when I was younger, you know, I like science, I like numbers and I'd love nothing more than to sit in my cubicle and play with numbers all day and do analysis and then send an email to someone with what I found and never have to deal with all the messy organization. 

And I think, oh, I'm doing a great job as an analyst here, applying evidence to decisions. What you learn after some years of experience is that your role as a smart analyst looking at evidence and data is not as important to the organization as your role as someone who can drive change, someone who's good at change management. So you've got to be out there talking to people, understanding their concerns, getting them onside, involving them, and bringing them evidence a little bit at a time - depending on how big a change you're asking of them - so that they can absorb it and get their head around this new way of thinking.

So all that change management side of evidence-based management is as important as the technical side of it. 

00:18:05 Karen Plum

I think that's a great point. Data won't win the day, but people can if they're patient, if they listen and they're flexible to the needs of stakeholders. 

Another key concept in this module is thinking about whether what you're planning to do will give you the biggest bang for your buck. We know that we're looking for solutions that improve outcomes - things that will work better than what we're doing today, or better than an alternative. But how much better will it be, and is it worth the money, the effort or the time it will take to implement? Here's Eric. 

00:18:39 Eric Barends

Will this work? Yes, it probably will - regardless of whether that's down to the method itself or to the fact that we're finally doing something, and that's OK. But you therefore need to ask: will this work better than what we're doing right now? Will it work better than any of the alternative options? And how much better will it work than the alternatives or the status quo? 

And if the impact is not that big, then you should think twice. So when it comes to increasing performance, the question is: what gives the biggest bang for your buck? Is it Agile, Lean or Six Sigma, or is it maybe setting clear goals and making sure that people know what they're supposed to do? Task clarity, creating a psychologically safe environment, supervisory support, social cohesion. 

Those are very basic things, and they are actually your baseline, because that's where you get the biggest bang for your buck. Make sure you've done that first. So with everything new that comes after that, in terms of apply: does it apply? Yes, it does. Should you apply it? Well, is it the biggest bang for your buck? Maybe we should set clear goals and start there before we start doing something new. 

00:20:14 Karen Plum

Eric mentioned creating a psychologically safe environment, which is a very hot topic these days, particularly with more and more people working remotely. It links nicely to my final guest in this episode, who's been working on strengthening diversity and inclusion at pharmaceutical company Sandoz. 

She is Stefanie Nickel who, having come from a medical background into the HR function to head up diversity and inclusion, was keen to make use of multiple sources of evidence when deciding what interventions would help the organization change its approach - starting with a study to understand the scientific insights about diversity and inclusion, D&I for short. 

00:20:57 Stefanie Nickel

Let's understand the scientific insights into D&I. How is it defined? What are some of the interventions that effectively drive equity and inclusion for an organization? That's when we started collaborating with Eric and the team at the Center for Evidence-Based Management. So the problem statement was really about having good evidence to guide our actions so they would be effective, and traditionally that was not something widely considered in HR. 

00:21:25 Karen Plum

Stefanie explained that the approach, as in so many situations, was for people to rely on one expert opinion or on personal experience, and as we know, these can differ widely. A personal opinion or experience doesn't meet the criteria for practitioner evidence, let alone carry the whole decision. 

00:21:44 Stefanie Nickel

So I think adding this piece around good quality data, and having guidance around what data is available - but also really looking into the methodology and saying, this has higher relevance because it fulfills the criteria of a randomized controlled trial, so it's more valid and can guide you more effectively - was a huge contributor to us being more effective in this space. 

00:22:10 Karen Plum

Looking at multiple sources of evidence, she found herself with evidence of things that don't work, but not a clear picture of things that do work in the D&I arena. The scientific literature identified some interventions that had a high likelihood of working, but rather than go for a firm implementation, the decision was made to test them and assess their effectiveness in the organization. 

Here Stefanie describes the details of their trial, which included 7,000 people across 1,000 teams, taking a non-invasive approach that used data already being gathered and focusing on testing interventions. 

00:22:49 Stefanie Nickel

We randomized teams into three different groups, so as to have one control group, and to really understand: are we being effective at improving psychological safety, which we know is a precursor, if you like, for inclusion? If you don't feel safe, it's difficult to be included, so psychological safety is a big priority for us at Sandoz. And when you look at the scientific literature from another angle - at the drivers of individual performance, effective team collaboration and innovation climate - psychological safety features as a big social enabler of all those important outcomes for an organization. 
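For readers who like to see the mechanics, here is a minimal sketch of what that kind of team-level randomization might look like in Python. The team identifiers, arm labels and seed are illustrative assumptions, not Sandoz's actual procedure.

import random

# Illustrative sketch: assign whole teams (not individuals) to one control
# arm and two intervention arms of roughly equal size.
def randomize_teams(team_ids, arms=("control", "individuation", "goal_conflict"), seed=2022):
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    shuffled = list(team_ids)
    rng.shuffle(shuffled)
    # Deal the shuffled teams round-robin across the arms.
    return {team: arms[i % len(arms)] for i, team in enumerate(shuffled)}

# Hypothetical example: 1,000 teams, as in the trial described above.
assignment = randomize_teams(f"team_{n}" for n in range(1000))

Randomizing at the team level, rather than the individual level, matters here because, as Stefanie explains shortly, psychological safety is a team-level construct.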

00:23:23 Karen Plum

Given the importance of psychological safety, it made a lot of sense to focus on that and to understand how to do that effectively. 

00:23:32 Stefanie Nickel

So we tested very simple interventions - just sending emails to the two intervention groups; the control group was just informed that we were looking at meeting behaviors, and nothing more. Of the two interventions we tested, one was around individuation, which is closely linked to diversity and inclusion - everyone being unique and having different needs. 

And we asked the managers simply to support their team members: schedule the one-on-ones they had anyway, but use them in a very specific way, asking the team members - how can I support you, as your manager? Being empathetic, listening, and really supporting where the associate needed support. 

00:24:08 Karen Plum

I think it's a great example of a really simple intervention. These are probably things that managers already know work, but reminding them and giving simple guidance encourages participation and makes the goal achievable. And for those who didn't appreciate that these actions contribute to psychological safety and inclusion, well, that's a great learning too. 

The other intervention was associated with goal conflict. If people have competing goals, it makes it more difficult to feel psychologically safe. 

00:24:39 Stefanie Nickel

We did a huge ethics survey in the organization last year, and one of the key barriers to psychological safety it identified was goal conflict - when people have competing goals, it's more difficult for them to feel psychologically safe. 

So we said the other intervention is really around the manager asking associates: do you have conflicting goals? Are there any barriers to you succeeding in your work, and how can I be of help? So it's more work-oriented, but still supportive. 

00:25:11 Karen Plum

So another really simple intervention. Stefanie told me that the individuation approach (really treating each person as an individual with their own needs and preferences) came from a review of the scientific literature, which showed that it was likely to have the biggest impact on psychological safety. 

Because they were already measuring levels of psychological safety, they were able to track changes resulting from the new behaviors they asked people to adopt. There was a lot of interest in psychological safety in the organization, so there was good stakeholder interest and involvement, as well as consultation of other sources of evidence. 

But Stefanie said the biggest opportunity was to understand the scientific literature on the topic and to identify the data available in the organization and the gaps that existed, so those gaps could be closed. Although they were already measuring psychological safety and other factors in their regular staff surveys, it was interesting to see that, having reviewed the literature, they changed some of the questions because the existing wording didn't correlate strongly with psychological safety. 

00:26:20 Stefanie Nickel

We asked the organization regularly, ‘Did you feel free to speak your mind without fear of negative consequences?’, which you could argue captures the concept of psychological safety in the broader sense. But when we did the study, we found this question wasn't reactive, right? And one reason might be that psychological safety is a team-level construct. 

So luckily we also have team-level surveys, and the two questions there captured the concept of psychological safety much better. They are also much closer to what Amy Edmondson, based on her research, recommends as validated questions: ‘Different perspectives are valued in my team’ and ‘I feel safe sharing feedback with colleagues’. 

00:26:58 Karen Plum

And if you don't know Amy Edmondson, she's currently Novartis Professor of Leadership and Management at Harvard Business School; her landmark 1999 study brought the term psychological safety to prominence, and she has carried out extensive research into its relationship with team learning and performance. 

In addition, Sandoz also looked at other variables like support for progress, which might contribute to psychological safety. 

00:27:23 Stefanie Nickel

We looked at our team-level surveys and we also looked into support for progress, through questions such as ‘I have support for my career development’, ‘I'm encouraged to find new and better ways to get things done’, ‘I receive ongoing coaching that helps me constantly develop’ and ‘I feel supported when tackling obstacles that hinder my best work’. Those were our markers for support for progress. 

And then we also asked: does this influence how associates see their managers? We actually saw a positive impact from the individuation intervention on both of these - both perceived support for progress and seeing the manager as a role model significantly increased when managers asked, ‘How can I best support you?’

00:28:07 Karen Plum

‘How can I best support you?’ is such a simple and powerful question to ask your colleagues. I was very struck by the simplicity of what they were asking people to do. These aren't huge asks, but as is so often the case, small changes can have big effects. 

Stefanie will be sharing what happened in the next episode of the podcast, which is all about assessing outcomes. So tune in to episode 14 to hear more. And before you ask, yes, they did capture baseline data before starting the trials. 

I'd like to wrap up this episode with a summary from Jonny, in response to my question about whether, and in what ways, having good quality evidence really makes a difference. 

And in the next episode, we'll look at assessing the outcomes of our decision-making. 

00:28:55 Jonny Gifford

I think our understanding is very clearly that having good quality evidence helps HR and other people professionals be much more influential. If you can say, this is the sort of impact that we can reasonably expect from this kind of outcome, and I can show you why I'm confident that this view represents the best research, that puts you in a pretty powerful position, I think.

The one thing that we need to be careful of is that this doesn't turn into a kind of fishing exercise for evidence in order to give weight to your preformed opinion. That's not evidence-based practice. That's a kind of search for practice-based evidence, if you like.

The thing is, it's bad science, but it's always going to be really tempting to do. 

 

 
