Kevin G. Campbell
Employee Experience Scientist, Qualtrics

On today’s episode of the RecruitingDaily Podcast, William Tincup speaks to Kevin from Qualtrics about the end of people analytics as we know it.

Some Conversation Highlights:

I think it goes back to why people analytics, in many ways, could use a reorientation and a different way of approaching this. Our colleagues in customer experience don’t ask that question because, for them, it’s always a closed loop process, wherever possible. If somebody’s having problems with ordering on your website or getting something accomplished, there’s usually a closed loop process on the customer side that enables that person to get what they need done, done. It’s not a matter of how often we review it, because the action associated with that feedback is integrated into the feedback process itself and happens as part of a workflow. So, if someone’s having trouble onboarding, understanding the impact of that onboarding trouble on things like belongingness, intent to stay, engagement, and customer outcomes is all very important, and we want to continue to make those connections.

But equally, if not more, important is being able to take action on people analytics in a proactive way, so that the problem is either nipped in the bud or doesn’t become a problem to begin with. So, think about how you can move your decision-making process upstream so that you’re no longer reacting in the moment or trying to make a business case down the road. How do you use analytics in a proactive motion, so that you’re looking at people as they onboard and asking yourself, “How is William doing transitioning into this new role? What do the data tell us about what we can do to better support William?” Rather than saying, “Oh, this is how onboarding impacts engagement for our employees.” You want to do both of those things really well.

Tune in for the full conversation.

Listening time: 27 minutes


Enjoy the podcast?

Thanks for tuning in to this episode of The RecruitingDaily Podcast with William Tincup. Be sure to subscribe through your favorite platform.

Announcer: 00:00 This is RecruitingDaily’s recruiting live podcast, where we look at the strategies behind the world’s best talent acquisition teams. We talk recruiting, sourcing, and talent acquisition. Each week, we take one overcomplicated topic and break it down so that your three-year-old can understand it. Make sense? Are you ready to take your game to the next level? You’re at the right spot. You’re now entering the mind of a hustler. Here’s your host, William Tincup.

William Tincup: 00:34 Ladies and gentlemen, this is William Tincup, and you are listening to the RecruitingDaily podcast. Today, we have Kevin on from Qualtrics, specifically the EX Science part of Qualtrics. Our topic is the end of people analytics as we know it. This is going to be fun, y’all. Can’t wait. Kevin, would you do us a favor and introduce both yourself and the work that you do at Qualtrics EX Science?

Kevin: 01:00 Yeah, absolutely. So, my role is as an employee experience scientist at Qualtrics, which beyond being a very interesting job title is essentially a mix between an organizational psychologist and a consultant. So, my role is to help organizations find, identify and eliminate gaps in the employee experience. The part that really excites me about that is connecting the impact of those gaps and the impact of that improvement back to customer and business outcomes. Most importantly, helping organizations act on those findings in a way that overall improves the employee experience, but ultimately improves the success of the organization.

William Tincup: 01:52 So, as we find those gaps, how often do we audit and look for those gaps, or how often do they come up? Then, as you triage them, I love the way that you bring them back to the business and what’s important to the business, and put action around them. It’s one thing to find the gap and say, “Oh, we have a terrible onboarding process with remote employees.” Okay, there’s a gap. But then there’s fixing the gap. So, there are two parts to that question. One is, how often are we looking for those gaps? The second part is, once we find them, what do we do?

Kevin: 02:30 I love that question. I think it goes back to why people analytics, in many ways, could use a reorientation and a different way of approaching this. Our colleagues in customer experience don’t ask that question because, for them, it’s always a closed loop process, wherever possible. If somebody’s having problems with ordering on your website or getting something accomplished, there’s usually a closed loop process on the customer side that enables that person to get what they need done, done. It’s not a matter of how often we review it, because the action associated with that feedback is integrated into the feedback process itself and happens as part of a workflow. So, if someone’s having trouble onboarding, understanding the impact of that onboarding trouble on things like belongingness, intent to stay, engagement, and customer outcomes is all very important, and we want to continue to make those connections.

But equally, if not more, important is being able to take action on it in a proactive way, so that the problem is either nipped in the bud or doesn’t become a problem to begin with. So, think about how you can move your decision-making process upstream so that you’re no longer reacting in the moment or trying to make a business case down the road. How do you use analytics in a proactive motion, so that you’re looking at people as they onboard and asking yourself, “How is William doing transitioning into this new role? What do the data tell us about what we can do to better support William?” Rather than saying, “Oh, this is how onboarding impacts engagement for our employees.” You want to do both of those things really well.

William Tincup: 04:33 The way that we looked at people analytics historically, has it been too reflective? Again, a lot of it came out of business intelligence. A lot of BI products came about, and then we applied them to humans, to talent and employees, to people. So, a lot of it was dashboards, especially in the beginning: dashboards, red, yellow, green, here’s what’s going on. But reactive by and large, not proactive, and definitely not foreshadowing, not giving us insight into real time or recommendations on what to do. So, what’s your take on how people analytics have come to market, and why some of this is the way it is?

Kevin: 05:30 Yeah, that’s a great question. I think part of it is you always want to keep the reflective component, because you do want to reflect on what you’re doing and derive those overall insights. So, I don’t think it’s an either-or between reflective and proactive. I think it’s a both-and. One thing I do think needs to shift, however, is the mindset: is it analytics and data for inspection, or is it data and analytics for learning? What I mean by inspection versus learning is, is this a score that we’re using to track people and measure people and hold people accountable? Not to say that I’m against accountability, I’m just saying that the accountability should be for the outcome and not for the outputs. The outcome is that we want to have happy, engaged, hardworking employees. The outcome is that we want to have a place where people feel included and belong.

The metric is just a tool for measuring that outcome. So, how can we use the metrics to learn, rather than treating them as the actual outcome that we’re driving toward? You don’t want to incentivize managers on their employee engagement score too much, because that can lead to people doing things like having a pizza party on the days that you do the employee engagement survey. That’s pretty innocuous, but I’ve seen things as egregious as, “Hey, you better give me all fives, or we’re going to have to have a conversation about whether this is the right place for you to work.” In which case, you’ve just completely eliminated the whole point of your program. So, I think it’s more about the mindset we apply to the data: thinking about it less as scores on an exam or your final grade for a class, and more like the speedometer on your car. It’s information that allows you to adjust your behavior in order to help you accomplish your goal.

William Tincup: 07:37 So, let’s keep the metaphor of the odometer or speedometer on your car. If we were building the dashboard of the car for talent and people, what would those things be?

Kevin: 07:52 I love that question. I think you want to look at both the outcomes and the drivers. Too often, organizations will over-index on one or the other. So, the outcomes you’re looking for are things like employee wellbeing, whether or not people intend to stay, whether or not people are motivated in their job, whether they have pride in your organization. But those aren’t enough. You can’t walk up to somebody and say, “Feel proud to work here.” Although people try to do that stuff all the time, it doesn’t actually make you feel like you belong. I think one of the funniest things is when organizations have trouble with belongingness, and they put up these big banners everywhere that say, “You belong here.” Which is well intended, but if you already don’t feel like you belong there, and there’s a huge banner telling you that you do, that might actually lead to people feeling less like they belong.

So, it’s important to also have the driver inputs. What are the drivers of belongingness? Well, here’s a good one. In a retail organization, we found that some managers were so busy that on some people’s first day of work, they didn’t actually sit down and have a meeting with them. They introduced themselves over a text message or an email. Now, intuitively you might be able to say, “Okay. Yeah, that’s probably not going to be the best way to onboard somebody.” But when you can use the reflective component to say, “Hey, this is going to lead to people leaving and not belonging, taking their time and talent somewhere else,” that can help make the business case, so that hiring managers know, “Hey. Yeah, I get that you have the burning platform of whatever fire you’re putting out that day, but you’re going to get a better return on your action by actually taking the time to meet with someone one-on-one versus shooting them a text message.”

Now, that said, the part where people go way off the rails on this is when they start to mandate that managers do that, and use the survey as a way to track it, because then you’ll just have people gaming the system. So, to go back to your initial question of what the metrics are: you want to measure the things that people want to ultimately drive toward, and you want to help them prioritize and focus in on the specific behaviors that are going to yield the best bang for their time, the best bang for their buck, the best return on action around the drivers that will help them meet those outcomes.

William Tincup: 10:35 So, change is both constant and difficult. I advise a bunch of technology startups, and there’s kind of a parlor-trick question, if you will. During the onboarding process, I always ask about their competitors. “Who’s your competitor? Who do you feel like you’re competing against?” They’ll list off competitor A, competitor B. Usually it’s another software company. I’m like, “Yeah, no, that’s not it.” Your competitor is actually the status quo: people doing it the exact same way that they did it yesterday. That’s actually who you’re competing with. You can navel-gaze at these other folks, but really it’s people just being lazy and doing it the same way. So, with people analytics in particular, especially the way that we’ve come to understand people analytics as it is today, how do we change people’s minds about how to truly look at people analytics in the future?

Kevin: 11:40 That’s a great point and a great question, and you’re right. The status quo is everyone’s biggest competitor. Unless you can build the case to show that what you’re doing now is costing you money and costing you credibility, you’re not going to be able to make the case to take the next step. So, if you think about the status quo when it comes to this stuff, it’s usually something that’s led by HR, and it’s more often than not a survey driven program that measures things like employee engagement, maybe a couple of life cycle surveys. The action taking is usually done at a company-wide level and consists of leaders making sweeping process changes. The whole point of the program is for leaders and HR folks to really have an understanding of the employee.

That all sounds fine and dandy; it doesn’t sound like there’s necessarily a problem with that status quo until you look at the results that come along with it. Because one of the things that happens when everything is led by HR and seen as an employee engagement program is that people analytics becomes seen as an HR thing rather than a business priority. When you create a program with the survey at the center, when the survey becomes the focal point rather than a tool to measure employee engagement or your culture or the health of your business through your people, people will start to game the system. When you focus action taking only at the leadership level, that creates a delay in the action that gets taken, because those organization-wide changes can take half a year to a full year to implement, and people are left wondering why it takes a year to act on a seemingly simple survey.

Ultimately, while gathering feedback for the purpose of understanding is nice, what really drives change in your organization is taking action on that understanding. If you fail to take action, you’re going to lose credibility as a people analytics function, and you’re ultimately going to create poorer experiences for your workforce, which defeats the whole point of doing this stuff to begin with.

William Tincup: 14:20 So, there are several things I want to unpack there. One is the world of engagement as we’ve seen it grow up in front of our eyes over the last 10 years: the annual employee satisfaction survey whose results no one ever looked at. Then we went to a pulse survey. So, we’re going to do it more frequently. Now we’re going to ask people questions that we don’t really care about their answers to, but we’ll do it more frequently. In fact, I’ve said publicly that engagement is not the end goal, that it’s a speed bump on the way to retention. I’ve gotten into trouble for that from some of the engagement companies. They didn’t like it. I’m like, “Yeah, you don’t have a thing. It’s really a thing that’s on the way to a thing.” So, I want to get your take on engagement, and then I want to get your take on retention as well.

Kevin: 15:21 Yeah, I love that. It’s interesting. So, Qualtrics had to pivot recently with regard to our official definition of engagement. The official definition of engagement at Qualtrics, which most organizations still follow, has a component related to retention, or intent to stay, which is a leading indicator of retention. But I think it’s really helpful to parse it out as its own outcome that you look at not in lieu of engagement, but alongside engagement. So, think about it as a two-by-two. You have people that are engaged, as defined by being motivated and having pride in the organization. Some of those people also intend to stay. If you think about it as a two-by-two, those are the folks that are in the upper right-hand corner. They’re engaged and they intend to stay. That’s what you really want.

You have people in the lower left-hand corner, who are not engaged and don’t intend to stay. Well, that’s fine, actually. If you’re not engaged, I’d prefer that you not intend to stay. The tricky part is when you have people that intend to stay but are not engaged; I’d rather you want to leave if you’re not going to be engaged in your work. Oftentimes, that’s where you have things like golden handcuffs, where maybe you have someone that just can’t get another job because they don’t interview well, or whatever. Maybe you’re the best job in town, and they’re going to stick with you because there’s no other option. Then you have the people that don’t intend to stay, but they’re going to be really motivated, say really good things about your organization, and work really hard for the time that they’re there. Those are the people you want to keep motivated and engaged, and somehow get to intend to stay.
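Kevin’s two-by-two can be sketched in a few lines of code. This is a minimal illustration, not Qualtrics’ actual model: the 1-to-5 score scale and the cutoff of 3 are assumptions chosen for the example.

```python
# Hypothetical sketch of the engagement / intent-to-stay two-by-two.
# Scores are assumed to be 1-5 survey averages; the cutoff is arbitrary.

def quadrant(engagement: float, intent_to_stay: float, cutoff: float = 3.0) -> str:
    """Place an employee in one of the four cells Kevin describes."""
    engaged = engagement >= cutoff
    staying = intent_to_stay >= cutoff
    if engaged and staying:
        return "engaged and staying"      # upper right: what you want
    if engaged:
        return "engaged but leaving"      # motivated now, but a flight risk
    if staying:
        return "disengaged but staying"   # golden handcuffs
    return "disengaged and leaving"       # lower left: fine, actually

for name, eng, stay in [("Ana", 4.5, 4.8), ("Ben", 4.2, 1.9),
                        ("Cam", 1.7, 4.6), ("Dee", 1.5, 1.2)]:
    print(name, "->", quadrant(eng, stay))
```

The value of the segmentation is that each cell suggests a different action, e.g. a retention conversation for the “engaged but leaving” group rather than a blanket engagement initiative.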

William Tincup: 17:26 I love that. So, retention, in terms of a people analytics perspective and data, and again getting back to the measurables for the business: how should we be looking at retention?

Kevin: 17:40 So I think, first, you have to understand that it’s messy, and that’s not an overstatement by any stretch of the imagination. Because if you’re thinking about it from an analytics perspective, there are so many variables you need to take into account. So, if you build a model that’s just based on who stays and who leaves, that model’s going to include both regrettable and non-regrettable churn, it’s going to include people who left for family and personal reasons, and it’s going to include both preventable and non-preventable leavers. So, when you’re measuring retention, you have to be really specific about who you really want to know about. Who are the people whose leaving you really want to understand, the why and the how? Make sure that you’re targeting that. Because if you don’t think about that upfront when you’re building your model, you’re going to have a lot of messiness in that data. You have to know whether or not you can even track that to begin with.
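The data-cleaning step Kevin describes, deciding who belongs in a retention model before building it, can be sketched like this. The record layout and reason codes are illustrative assumptions, not a real HRIS schema.

```python
# Hedged sketch: filter departure records down to the leavers you actually
# want a retention model to explain (regrettable and potentially preventable),
# rather than training on all churn indiscriminately.
from dataclasses import dataclass

@dataclass
class Departure:
    employee_id: str
    regrettable: bool   # would the organization have rehired them?
    reason: str         # exit-interview reason code (illustrative values)

# Assumed set of reason codes the organization could plausibly act on.
PREVENTABLE = {"new job", "manager conflict", "compensation"}

def modelable_churn(departures: list) -> list:
    """Keep only regrettable, potentially preventable departures."""
    return [d for d in departures if d.regrettable and d.reason in PREVENTABLE]

records = [
    Departure("e1", True, "new job"),   # keep: regrettable and preventable
    Departure("e2", True, "family"),    # drop: not preventable
    Departure("e3", False, "new job"),  # drop: non-regrettable churn
]
print([d.employee_id for d in modelable_churn(records)])  # -> ['e1']
```

The design choice is the one Kevin makes verbally: the filter is defined upfront, before any modeling, so the model answers a specific question rather than averaging over noise.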

But I think a radical simplification of that goes back to arming frontline managers and teams and leaders with something that’s not as exact but is much more actionable. So, going back to that reflective and proactive, I think there’s a reflective component to this, which is getting really specific about who do you want to know about, and building those models, and what we think of as the traditional people analytics piece. But not thinking about that as the end, thinking about that as the beginning to a more important thing, which is how do you give people on the front line a simple way of saying, “Of all the things that you can do, according to this quick and dirty survey, what’s the thing that’s going to make the biggest impact on whether or not your people want to stay?”

Have a conversation with your people around this stuff and go try some stuff. It might not work, it might work, but as long as you’re including them in the conversation, and you’re proactively having that conversation and working on it together as a team, that’s going to go further than anything, because you’re literally involving and engaging people at that point. The more you include your people in the process of solving problems, using not just their labor but their mind and talent, the better off you’re going to be. They can’t turn around and say, “Hey, you should have done what I suggested,” because you’re actually helping them be a part of the conversation around what they would suggest.

William Tincup: 20:33 I love that. Dumb question alert. So, in the model of reflective and proactive, reactive is somewhere in the middle or is it a part of both?

Kevin: 20:46 That’s a great question. I think reactive gets a bad rap.

William Tincup: 20:51 Right.

Kevin: 20:53 I think it is in the middle, and it’s not as bad as it sounds. Maybe reactive is firefighting, whereas responsive is the positive version of reactive. If I order something from Amazon and, for whatever reason, it gets sent to the wrong address, either because I clicked on the wrong thing or I forgot to update my address, and I send something into their customer service department and they take care of it, is that reactive? Yeah, but I prefer to think about that as responsive. So, from an employee experience perspective, if you’re coming back from disability leave or maternity leave, and you can’t log into the systems that you need to do your job, yeah, I want someone that’s going to be reactive/responsive to that request.

I think the reactive version is you do some analytics down the road and you find out that that’s a problem. So, you create some onboarding program that tells everybody to be really nice to the people coming back from leave, and it doesn’t actually solve the technology issues that people really care about. If you ever talk to somebody coming back from parental leave and ask them about the disconnect between all of the material HR puts out, about how it’s supposed to be an easy transition, and you’re supposed to be eased back into your role, and you can work part-time, and your manager’s going to have conversations with you about this, and the reality of what they actually experience, you’ll hear about that gap firsthand. I think that’s the difference between reflective and reactive, because reactive just says, “Oh, this is an issue. Let’s throw a bunch of soft skills training at the problem, hope it goes away, and say we did something about it.”

William Tincup: 22:46 I love that. Then I agree with that. I love the way that we’ve thought about reflective and responsive and proactive. Again, all these things, they all have a place, they’re all important. They’re just used in different ways. So, I love that. You had mentioned impact and biggest bang, biggest impact, etc., and you’d also mentioned the actions. So, last thing I want to unpack with you is impact and actions and the relationship between impact and actions and what people analytics should be in the future.

Kevin: 23:22 Yeah. I think it should be empowering people to take actions on the things that are likely to yield the most positive impact on the organization, and to do so in ways that are not necessarily costly and don’t have a huge risk associated with them. So, you can have really sophisticated analytics that will tell you the specific dollar impact of a particular action. Those are really useful for big decisions that could cost the organization a lot of money. But there are also simpler analytics that are easy for frontline managers and leaders to understand, that may only be directionally true, and that’s okay because the risk associated with taking those actions is extremely low. So, I’ll give you an example. I was working with a telecom company, and we found that frontline technicians who felt like people at that company were recognized for outstanding customer service were nine times more likely to resolve issues for their customers on the first try. Now, the action that any manager could take as a result of that is to do a better job of recognizing employees.

We found that that was not only something that had a big impact, but also something where only 10% of their population strongly agreed that people were recognized for great customer service. So, there’s huge room for improvement there. Now, a more traditional way of looking at that would be to say, “Okay, what’s the exact dollar amount related to this variable of recognition on customer problem resolution? Is that less than what it would cost for us to buy this recognition software program?” Well, I’ll tell you what, sometimes a handwritten note from your boss is more impactful than any kind of impersonal software recognition program. Not to downplay or poo-poo software recognition programs. I think they’re great. We use one at Qualtrics that’s amazing.

But sometimes that handwritten note from your manager is just as impactful, if not more impactful, and there’s zero cost associated with it. So you can encourage people to do that all day long and not have to worry about negative repercussions. The only time you’d have to worry about negative repercussions is if you made it a performance metric and demanded that people do it, because [inaudible 00:26:14]

But if you show the business connection, you encourage the behavior and you have people spend some time thinking about, how can we do a better job with this? Sit down with your team and say, “Hey, how can we do a better job of recognizing each other? How can I, as your manager do a better job of recognizing you? We found this has a huge impact. What do we want to do about it?” So, that’s what the way forward looks like.
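The “nine times more likely” figure Kevin cites is the kind of driver statistic that can come from a simple two-by-two table of survey responses against outcomes. Here is a hedged sketch with made-up counts; the actual analysis behind the telecom finding is presumably more sophisticated.

```python
# Illustrative odds ratio from a 2x2 table of "felt recognized" vs.
# "resolved the customer's issue on the first try". Counts are invented.

def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """a, b: recognized group (resolved / not resolved);
    c, d: not-recognized group (resolved / not resolved)."""
    return (a / b) / (c / d)

# Suppose 90 of 100 recognized technicians resolved on the first try,
# versus 50 of 100 technicians who did not feel recognized.
print(odds_ratio(90, 10, 50, 50))  # -> 9.0
```

A statistic like this is “directionally true” in exactly the sense Kevin describes: precise enough to prioritize a low-risk action, without claiming a dollar-exact causal estimate.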

William Tincup: 26:36 I love it.

Kevin: 26:37 It’s going back to bread and butter human behavior.

William Tincup: 26:40 Yeah, and it becomes more collaborative. That’s what I love about it: it’s not top-down, it’s not bottom-up, it’s collaborative. You come together and go, “Hey, this is what the data’s telling us. Let’s take a look at it and let’s try some things.” I love that you said that earlier: “You know what, we’re going to learn some things. We’ll try some stuff. Some of it will work. Some of it will fail, and that’s okay.” I love that. I could talk to you forever. Kevin, thank you so much for your time and for coming on the podcast.

Kevin: 27:09 My pleasure.

William Tincup: 27:10 Absolutely. Thanks for everyone listening to the RecruitingDaily podcast, until next time.

Announcer: 27:16 You’ve been listening to the recruiting live podcast by RecruitingDaily. Check out the latest industry podcast, webinars, articles, and news at recruit…

The RecruitingDaily Podcast

Authors
William Tincup

William is the President & Editor-at-Large of RecruitingDaily. At the intersection of HR and technology, he’s a writer, speaker, advisor, consultant, investor, storyteller & teacher. He's been writing about HR and Recruiting related issues for longer than he cares to disclose. William serves on the Board of Advisors / Board of Directors for 20+ HR technology startups. William is a graduate of the University of Alabama at Birmingham with a BA in Art History. He also earned an MA in American Indian Studies from the University of Arizona and an MBA from Case Western Reserve University.

