Evidence-Based Management

Module 8: Acquire evidence from the organisation

March 23, 2022 | Season 1, Episode 8

This episode accompanies Module 8 of the course, which is about the data, information and evidence gathered within the organisation to aid our multi-source evidence-based decision making. The appraisal of this evidence is covered in Module 9 and its corresponding podcast episode.

In this episode we consider some of the things we need to think about before either using existing organisational data or collecting data specifically geared to address our question or area of interest. As with other sources of evidence, there are a myriad of obstacles and roadblocks to be aware of in our search for organisational data and information to inform our decision making. 

There is a tendency for people to hold organisational evidence in high regard but, as we will see, this faith can often be misplaced – not for nefarious reasons, but in some cases simply due to a lack of understanding when designing systems, or to using data that has been churned out for years without anyone asking where it comes from.

As ever, we keep asking questions!

Host: Karen Plum

Guests:

  • Eric Barends, Managing Director, Center for Evidence-Based Management 
  • Martin Walker, Director of Banking & Finance at the Center for Evidence-Based Management
  • Jeroen Stouten, Professor of Organisational Psychology, KU Leuven University

Article by Jeffrey Pfeffer & Robert Sutton (2006)

Find out more about the course here:   https://cebma.org/resources-and-tools/course-modules/ 

Transcript

00:00:00 Karen Plum

Hello and welcome to the Evidence-Based Management podcast. This episode accompanies Module 8 of the course, which focuses on acquiring evidence from organisations. There is more on organizational evidence in Module 9 and its corresponding podcast episode.

Module 8 concentrates on how to gather data and information that, once analyzed and appraised, will provide a valuable source of evidence to help with our decision making. 

In this episode, we consider some of the issues we need to think about before either using existing organizational data or collecting data specifically geared to address our question or area of interest. 

I'm Karen Plum, a fellow student of evidence-based management, and in this episode, I'm joined by podcast regular Eric Barends, Managing Director of the Center for Evidence-Based Management. This time we have two additional guests making their debut on the podcast: Martin Walker, Director for Banking and Finance at the Center for Evidence-Based Management, and Jeroen Stouten, Professor of Organizational Psychology at KU Leuven University in Belgium. Let's hear what they had to say about organizational evidence. 

 

As I reached the halfway point in the course, I started to see some recurring themes within the modules. When thinking about organizational evidence, it's important once more to ask questions - lots of questions - to ensure we understand the nature of the available data and information. 

Before we go too far, let's have Eric remind us about the difference between data, information, and evidence by posing the question - is Eric older than 50 years of age? 

00:01:51 Eric Barends

In daily practice, in daily life, data, information and even the term evidence are used interchangeably and the difference is not very relevant, because we just use them in common language. But within evidence-based decision-making, data is something different than information, and information is something different than evidence. 

So data are just the raw numbers, the words, the figures, symbols, images, without any context. So 250462 is meaningless - I mean, they're just numbers - and the moment you relate the data to something or someone that is meaningful or relevant or whatsoever, then it becomes information. 

So when you say, well, 25 04 62: 25 is actually the day, 04 is actually the month, and 62 stands for a year. And hey, what do you know, it's Eric's birthday. That's information!

If I want to know how old Eric is, I now have these numbers and they give me meaningful information. Now, one step further up is when it becomes evidence - when it's related to a claim. Evidence of what? So when the claim is ‘Eric is older than 50 years old’ – what's the evidence? 

Well, I've got this number: 25 is the day, 04 is the month and, hey, 62 is the year - meaning, given the fact that it's now 2022, Eric must be older than 50 years old. So indeed, this supports the claim that Eric is older than 50 years old. It helps to think about data, information and evidence in that way, because just collecting data is not very meaningful. 

Collecting information may be helpful, but often the most valuable approach is to tie it to an issue or a claim or an assumption - then it becomes meaningful, relevant evidence. 
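
To make Eric's distinction concrete, here is a minimal sketch in Python. It is not from the episode - the date handling and variable names are purely illustrative - but it walks the same digits from data to information to evidence:

    # Data -> information -> evidence, using Eric's example.
    from datetime import date

    raw_data = "250462"  # data: meaningless digits without context

    # Information: the same digits interpreted as a date of birth (DD MM YY).
    birthday = date(1962, 4, 25)

    # Evidence: the information related to a claim.
    claim = "Eric is older than 50"
    today = date(2022, 3, 23)  # the episode's publication date
    age = today.year - birthday.year - (
        (today.month, today.day) < (birthday.month, birthday.day)
    )

    print(f"Age: {age} -> supports claim '{claim}': {age > 50}")
    # Age: 59 -> supports claim 'Eric is older than 50': True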

00:03:51 Karen Plum

So the numbers weren't a bank sort code then! And I also know that Eric has a big birthday coming up in 2022. 

Now we're clear about that distinction, I want to turn to the question of the reliability of organizational data and information, because the premise here is that we have to question it. In my experience, people tend to put too much emphasis on the data their organisations churn out. Maybe that's because they've become conditioned not to ask questions about it. Or maybe it's just easier and less stressful to go with the flow. 

I asked Martin Walker if people just think that this stuff is reliable and trustworthy because it's being churned out by a computer system. 

00:04:32 Martin Walker

I think there is a very large element of truth in that. I mean, in any organization there are historic functions whose role has been to try to make sure that data, and the processes that produce data, actually work. Many large organizations, for instance, have an internal audit function. But even then, again, it's the excitement about data.

With the range of data visualization tools you have now, you can get very impressive and persuasive-looking data and dashboards, more than ever before, and I think that acts at the perception level - in many organizations it has actually stopped people asking about the quality, asking where the data is actually coming from.

And from my experience, I've seen many sexy graphs and KPIs and dashboards, but actually looking into the process you find it's not come out of a well-designed automated process. It often came from asking people to collate this data, often in short timescales, mixing objective and subjective data together. The end effect can look very impressive, but the actual source data can be quite concerning. 

00:05:45 Karen Plum

Perhaps we're keen to conform and not rock the boat by asking awkward and difficult questions. And also we may not like the answers. It sounds obvious, but before we rush off and use any source of data, we need to ensure we know the source of the information and how it's been obtained, as Martin explained.

It can be tricky to get to the bottom of this if the data has been gathered in the same way for years, with nobody questioning or even using the output. Perhaps the way the data gathering was originally designed and engineered was based on an incomplete or inaccurate understanding of what was required when the system was initially specified. Now isn't that food for thought? 

00:06:26 Martin Walker

You need people in management who understand data as a thing, data in terms of usage - so basic statistics - and data as a process. If you don't have that mindset broadly spread across management, not just among specialists, you won't get the right questions asked about the quality of the data, and you won't get the right questions asked when people are actually putting in new processes or new systems. You have to have that mindset. 

I think what you see in some organisations is you have people whose task is simply to capture data, aggregate it, put it into a particular format, but they're not the ones who actually use the data. 

That's their job. If people are given a job, they'll come to work every day and they'll do it year in, year out, regardless. So it really does fall on the ultimate users of the data - assuming there is a use for the data, and there isn't always. Literally, I've seen reporting produced and distributed to people, and no one looks at it.

So again, it comes back to just basic management skills. If I receive this report every day, I don't look at it and it has no influence on my decision making, organisations actually need to have a mechanism to flag that. Managers aren't stupid: I get this report in my inbox every day and I delete it. They need a mechanism to actually just say, this is useless, this is of no use to me. 
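
A hypothetical sketch of the kind of flagging mechanism Martin describes - the function names and the 5% threshold below are invented for illustration, not anything the course prescribes:

    # Track whether distributed reports are ever opened, and flag candidates
    # for retirement. All names and thresholds are made up.
    from collections import defaultdict

    sends = defaultdict(int)  # report name -> times distributed
    opens = defaultdict(int)  # report name -> times actually opened

    def record_send(report: str) -> None:
        sends[report] += 1

    def record_open(report: str) -> None:
        opens[report] += 1

    def retirement_candidates(min_sends=10, max_open_rate=0.05):
        """Reports that keep being sent but are (almost) never opened."""
        return [r for r, sent in sends.items()
                if sent >= min_sends and opens[r] / sent < max_open_rate]

    # Sixty daily sends, opened once - the report Martin deletes unread.
    for _ in range(60):
        record_send("daily_ops_summary")
    record_open("daily_ops_summary")

    print(retirement_candidates())  # ['daily_ops_summary']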

00:07:57 Karen Plum

So we have to ask questions about the data available to us, but also, as we've said repeatedly, we have to be clear about the question we're trying to answer. 

So when we're looking for data that we can analyze to try to answer our question or questions, there are a number of potential roadblocks, much as there are for practitioner evidence and, in later modules, stakeholder evidence. So where do we start? Here's Eric. 

00:08:22 Eric Barends

Organizational data, organizational information, evidence from the organization should also be easy to acquire, and it can be very helpful if you want to figure out whether a problem is indeed a problem. If you want to know whether an issue is indeed a serious issue, or whether there's urgency, and if you want to know something about the assumed cause of a problem, then organizational data, information and evidence can be very helpful. 

However, it seems like a no-brainer to go into your organization and collect the information you need to support an issue, or to get better information or evidence on the specific issue. But often that information is hard to get, or it's not there - organizations tend to collect tons of data without asking themselves: what do we need this for? 

And also the other way around: asking managers, what is helpful for your daily practice as a manager? What would really support the decisions you often have to make in your position as a manager or an executive? That is a different discussion, and you will probably come up with different sources of information, different data, different organizational evidence than when you just start with what's available and what we can easily collect.

00:09:54 Karen Plum

And that's one of the first problems: data we can easily collect may not be robustly gathered and analyzed, as mentioned earlier. Worse still, as Eric mentioned, organisations can churn out tons of data that nobody uses. And here's Martin to illustrate the point. 

00:10:11 Martin Walker

Literally, in my IT career I came across a scenario where someone had been receiving, and apparently using, a report for a year from a system that had been switched off and just put into suspended animation, purely for backup purposes. 

There was a process still generating this completely useless report, on data that was becoming more and more out of date every day, and it was only when I literally visited those people in their office and they asked me questions about that report that I said: how are you getting data from a system that's not been in use for a year? 

00:10:50 Karen Plum

I remember a parallel in my world, where I was reviewing the storage of an organization that was relocating their office. We wanted to ensure they only moved the hard copy storage they really needed, archiving what they had to keep and only using precious space in the new office for vital hard copy. 

Imagine my surprise to find not one but many large cupboards full of files that had no owner still working in the company. ‘Oh no, that was Jane's stuff, she left two years ago’ - but nobody wanted to get rid of it in case it was useful or important. 

I suspect that as with so many situations, people are inclined to start gathering data without thinking things through. If we've learned anything on this course, it's surely that we need to carefully consider what we need, and how we're going to use it.

00:11:37 Eric Barends

Every organization knows they have to collect data and have a data management system to support their managers and decision makers, but often it is poorly thought through. 

How exactly should this data be gathered? What kind of data should be acquired? And what is actually helpful and relevant to the people who are going to use this data? This is not just go out, collect some data and put them in an Excel sheet - it adds much more value when you really think this through. 

Discuss this with the users: what is relevant for their daily practice? And based on that, design a kind of architecture that structurally acquires these data in a trustworthy and reliable way. But that is a step that organizations seldom take. Usually they're somewhere in between - they need to gather data, they need to have a sort of data management system - and when they're halfway through, they realize: well, actually, what are we doing? Is this the best way to collect these data? Are these indeed the most relevant data? Should we maybe collect other types of data? 

That moment comes later in the process, usually when the managers are trained in an evidence-based way and say: actually, this information is not very helpful. I need these data, because these data help me in my daily job of making decisions in this organization. 

00:13:20 Karen Plum

Once the architecture has been agreed and deployed, it's rather too late to realize that it isn't going to provide the information you need, in the way that's most helpful to your situation. In my experience, there are lots of situations where organizational roadblocks prevent you accessing the data that you need. Organisations are so often organized in functional silos, and the data they collect isn't shared with other silos - either to help them, or to assess how the information they have could be useful to others. 

00:13:50 Martin Walker

If you look at organizations, there are two perspectives when it comes to silos. One is the functional perspective: you have finance, you have risk, you have operational departments, and they all have different perspectives on data, but they very commonly use the same ultimate sources of data.

And I think a lot of that actually comes down to just technology, and making sure that the systems across different functions are sufficiently integrated so that, even if they have different perspectives, they're all essentially working off the same set of data. 

I've seen this progress happening in banking over 20 or 30 years. It's been a hard and expensive process, but I think a lot of that just comes down to the technology side. And I don't mean to belittle that, because it can be a significant challenge in itself. 

But the other kind of silo you get is the different business unit or subsidiary, which is, I think, a lot harder to tackle, because different businesses - or their equivalents in non-profit organisations - can not only have very different perspectives, they can also work under very different incentives, and they can even have completely different systems.

So that one is a bit more of a challenge. And it's way too easy to end up in a situation where central senior management demands data out of different business units and they all kind of munge the data so it looks the way people are expecting - but between that and reality, or the data in that business unit's actual systems, there can be a very, very big gap. 

00:15:34 Karen Plum

And then again, there are more goofy ways in which the data isn't usable because - nobody thought we might want to combine the data from two different departments or systems. Really?!

00:15:43 Eric Barends

Most of the time it is just the way the organization is organized. So as a manager I need financial information, accounting information. Well, that information is in the accounting system and the head of finance, the chief finance officer or accountant or whatsoever is in charge of that information. 

And she or he has their own system where this information is acquired and presented. However, I also need HR information, human resources. So I need to go to the head or director of HR management, or people management, or whatever it's called, and they probably have their own system where all the information about age and pay grade and absenteeism, et cetera, et cetera, is collected and stored - which does not communicate with the accounting system. 

Then I want to know something about risk, or about quality and performance, or customer service, which is a different department. 

Now, if you take an evidence-based approach, you often want to combine data: you want to know which employees of which age or which department are creating the largest revenue stream. And I also want to know whether the customers are happy with the sales agents, yes or no.

So I want to combine that information as well. And that's going to be hard, because all this information, all these data, are stored at different departments, with different people in charge. They may be willing to help me, but then the question is: OK, how do we get it out of the system, and how can we combine all these different data sources? That's absolutely a big challenge for most organizations. 
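
To give a sense of what that reconciliation involves, here is a hedged sketch in Python using pandas. The systems, column names and values are all invented, and real extracts are rarely this tidy:

    # Two departmental extracts describing the same people, keyed
    # differently - a toy version of the silo problem Eric describes.
    import pandas as pd

    hr = pd.DataFrame({
        "employee_id": [101, 102, 103],
        "age": [34, 51, 29],
        "department": ["Sales", "Sales", "Support"],
    })

    finance = pd.DataFrame({
        "emp_ref": ["E-101", "E-102", "E-103"],  # same people, different keys
        "revenue": [120_000, 95_000, 40_000],
    })

    # The join only works once someone reconciles the key formats by hand.
    finance["employee_id"] = (
        finance["emp_ref"].str.replace("E-", "", regex=False).astype(int)
    )

    combined = hr.merge(finance, on="employee_id", how="inner")
    print(combined.groupby("department")["revenue"].sum())

Even in this toy case the combination hinges on a hand-made mapping between key formats; at organizational scale, that mapping often simply doesn't exist.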

00:17:37 Karen Plum

I've been in that situation many times. Not only is it nigh on impossible to get accurate data from one source, but when you come to try to align data from more than one source, there's no common frame of reference to tie the data together. 

Ask an HR department how many people there are in the organization and what the team structure is, and they usually have a meltdown. But once you do get a list of department names and headcount numbers from HR, you then set about trying to talk to each department to ascertain their needs, by interviewing or surveying them - only to find that not only are the numbers and structures different, but the team names you have aren't recognized by the staff. ‘I don't know what team I'm in, I just know that I report to James.’ Argh.

If you're trying to build a solid case for change and you can't get the most basic information correct, it doesn't help your credibility or your implementation plans. It also might lead you into using the data in a way you hadn't originally intended, or tempt you into fishing expeditions, as Jeroen Stouten explains. 

00:18:44 Jeroen Stouten 

There's lots of data, but it's not really connected, so that in itself poses a problem - you have all these little pieces of data. But suppose you have data that is really nicely connected, and then you're going to work with that data. That means you maybe have access to mail information, phone calls of service workers, all kinds of memos that you can extract information from; perhaps satisfaction surveys, performance appraisals.

And suppose you can combine all that information. That is particularly tricky, because employees don't know that you are doing that - their emails, their phone conversations, maybe their random conversations with their leader somehow get noticed, and patterns are being searched for.

And that is a reason to be particularly careful, because it touches on ethical grounds there. 

00:19:48 Karen Plum

Another aspect of this picture is the inevitable consequence of people keeping their own data rather than relying on the established systems. The ones where the data doesn't reach you in the way you need, or you don't trust it. 

00:20:02 Eric Barends

In organizations, I have often asked managers of a department, or even supervisors or team leaders: how trustworthy, how reliable is the information in the management information system of the organization? And too often I heard: well, not at all - it's outdated, it's old, or it's acquired or collected in a way that's unreliable. 

So: I have my own Excel sheet here. So, Eric, if you want to know how many customers we served in the past month, I have the correct, most accurate data. Don't trust the stuff that's in the organizational management information system, because actually that's obsolete, or it's old, or it's outdated, for whatever kind of reason. That is a good point to keep in mind if you step into a real, live organization. 

Ask the managers, the decision makers, how reliable they think the information the organization can provide actually is, and whether they have other sources of information that you should take into account. 

00:21:15 Karen Plum

I think that's excellent advice. Don't assume people are using the information in the standard reports - ask them. And ask yourself whether the data that's gathered, and that is most easily available, is relevant to the question you're considering. 

As an example, when determining the amount of office space an organization needs, I would run a utilization survey, where every desk and meeting space is observed multiple times a day to find out how much it's used. This has traditionally been a labor-intensive activity, and clients also often worry that it sends a message to staff that they're being ‘observed’. 

Despite this being an easily addressed concern, some clients refuse to run the survey and offer instead the data they gather from their security access system as a proxy for the data I really need. This only tells me how many people came into the building, not what spaces they used, how the teams vary in terms of their usage, and a whole raft of useful data I need to build my case for right-sizing their space. 

So you have to be prepared to push back and explain why that data isn't going to cut the mustard for your analysis and the proposals you'll eventually put forward. Eric suggests there's a good way to combat the tendency to opt for the easily available data. 

00:22:33 Eric Barends

It starts with asking the decision makers or policymakers in your organization - regardless of whether we're talking about practitioners like physicians or sales agents or lawyers or whatsoever - what information from the organization they need in their daily practice to make better decisions. 

Often that information is not readily or easily available. That means you need to set this up, and maybe build a whole architecture to collect this information in a routine way. But the pitfall is that if you don't start there, you just take the information that is easily available, that's just there, and you come up with a BS story that this information is maybe not exactly what you need, but it's close - it's probably a good proxy for what you need, so have a look at it anyway. We got it. 

Well, I want to know something about productivity. Yeah, well, we don't have that, uh, we have job satisfaction. Yeah, but that's actually different. No, no - when people are not very happy in their job, they're probably not very productive either. Is that so? 

You need to ask follow-up questions and say: I'm not sure whether that's accurate. Rather than using evidence or data just because it's available, you really have to figure out whether it is indeed a good indicator for the issue at hand. 
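
One simple way to test Eric's 'is it really a good proxy?' question is to compare the easy measure against the one you actually need, for the small subsample where you have both. A minimal sketch with fabricated numbers (statistics.correlation requires Python 3.10+):

    # Is job satisfaction a usable stand-in for productivity? Check the
    # correlation on a subsample where both measures exist. Data is invented.
    import statistics

    job_satisfaction = [3.1, 4.5, 2.8, 4.0, 3.6, 2.2, 4.8, 3.3]  # 1-5 scale
    productivity     = [54, 61, 58, 49, 63, 52, 60, 55]          # units per week

    r = statistics.correlation(job_satisfaction, productivity)  # Pearson's r
    print(f"r = {r:.2f}")

    # A weak correlation suggests the 'easy' measure is not a good proxy,
    # and the honest answer to 'but it's close, right?' may be no.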

00:24:25 Karen Plum

I recently read an article written in 2006 by Jeffrey Pfeffer and Robert Sutton, both professors at Stanford. They quoted an example of a company CEO who had established the practice that if a metric was identified as important, but they currently didn't have a way to collect the necessary data, then the metric would appear on their monthly reports as ‘not available’ or ‘not currently available’. 

They felt that having a persistent reminder of this would motivate them to find a way to address that gap in their metrics. So don't let the perfect be the enemy of the good, as Denise Rousseau always says - but don't take your eye off the ball either: there may be something that would serve your purpose better than the current measurements and metrics. 
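
As a tiny sketch of how that CEO's convention might look in a reporting script - the metric names and values here are invented, and this is just one way to keep the gap visible:

    # Render a monthly metrics report where important-but-unmeasured metrics
    # appear as 'Not currently available' instead of silently disappearing.
    metrics = {
        "Customer retention rate": 0.87,
        "Time to fill vacancies (days)": 42,
        "Cost of unwanted turnover": None,  # flagged as important; no data yet
    }

    for name, value in metrics.items():
        shown = "Not currently available" if value is None else value
        print(f"{name:32} {shown}")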

To remind us that organizational evidence is just one element, here's Jeroen reflecting on the search for the best available sources. 

00:25:17 Jeroen Stouten 

Don’t feel satisfied with just the results of the data. Go look for other pieces of evidence that match it or don't match it. Just look for it and critically examine your own process as well and probably document your process. What have you done to look for solutions for that question? 

You can trace back where you might have gone in a different direction, which is possibly helpful as well. That's what we do when we are confronted with an ethical dilemma: we have a documentation process, different steps we take. We document our different talks and actions so we can trace back later whether we made mistakes or not - that is probably helpful here as well. 

It's a memory trace also for you and for others. 

00:26:07 Karen Plum

I really like that idea. It links back to documenting the search steps we used in Module 5, so we could reproduce the list of studies from which our scientific literature evidence was identified. Keeping track of your journey through an audit trail of sorts would help you remember how you got to where you are now. I think it could be very valuable, and it's something that would be easily overlooked in a busy working day. 
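
An audit trail like this needs no special tooling. Here is a minimal, hypothetical decision-log sketch in Python - the file format and field names are assumptions, not something the course prescribes:

    # Append-only log of the steps taken while acquiring and appraising
    # evidence, so the trail can be revisited later.
    import json
    from datetime import datetime, timezone

    LOG_FILE = "evidence_log.jsonl"  # one JSON record per line

    def log_step(action: str, source: str, notes: str = "") -> None:
        record = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "action": action,
            "source": source,
            "notes": notes,
        }
        with open(LOG_FILE, "a", encoding="utf-8") as f:
            f.write(json.dumps(record) + "\n")

    log_step("queried", "HR headcount extract", "disagrees with finance by 4%")
    log_step("discarded", "badge access data", "proxy only; no space usage")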

The last thing I wanted to explore was the idea that if the only data you have is flawed, and you know it's flawed, but there isn't any other data available - can it be said to be the best available evidence, and do we use it with a health warning, maybe? 

00:26:47 Eric Barends

So that's the reason why we always, always have to critically appraise the evidence we have at hand, regardless of whether it's evidence from a meta-analysis of 62 great studies or the organizational data my IT staff person comes up with. I ask critical questions and it turns out: well, I'm not sure how reliable this is. That means you can be less certain of the outcome of your decision based on this information. It's not very rock solid, so there's a risk involved here. There's uncertainty, and when the decision to be made is very important and affects a lot of employees or customers or clients or whatsoever, you maybe need to go back to the drawing board and see if you can acquire organizational data that is more reliable.

If the outcome is actually not very relevant, or does not affect a lot of people, you can take the risk and say: OK, well, let's go with it and see what happens. If we're wrong, there's no man overboard. So yes, the best available evidence is what we have. But sometimes the best available evidence is not good enough, and in terms of research you can say, well, there isn't any, it's just not there. We can't call up a few academics or researchers - hey, could you do some more research because…

Yeah, they could, but it would probably take several years before they came back with the findings. In an organization, there are often alternatives for getting better evidence and more reliable information and data. 

00:28:35 Karen Plum

Particularly if what is at stake is critical to organizational success and the risks of getting it wrong are high. 

To finish off, back where we started - I make no apology for saying this again: asking critical questions is a vital first step to keep us on the right track. 

I'll leave you with Eric's final thoughts and we'll pick up on the organizational evidence in Episode 9 where we'll be looking at big data, analytics and KPIs to continue to challenge our understanding of what's most useful and when. 

00:29:07 Eric Barends

Asking critical questions is, in terms of organizational evidence, again, the most important step. How relevant is this information? Why do we want to know this? What does this information suggest? Is this really helpful for the issue at hand? What kind of information does it provide? And of course, does it help me make a better decision? That is a very important question to start with, because again, information and data are collected on a wide scale, and just because it's there doesn't mean it's helpful.