Welcome to the Use Case Podcast, episode 220. Today we have Siadhal Magos from Metaview, talking about the use case, or business case, for why his customers use Metaview.
Metaview provides unique data, analytics, and actionable feedback to help interviewers improve.
Give the show a listen and please let me know what you think.
Thanks, William
Show length: 28 minutes
Enjoy the podcast?
Be sure to check out all our episodes and subscribe through your favorite platform. Of course, comments are always welcome. Thanks for tuning in to this episode of the Use Case Podcast!
Music:
Welcome to RecruitingDaily’s Use Case podcast, a show dedicated to the storytelling that happens, or should happen, when practitioners purchase technology. Each episode is designed to inspire new ways and ideas to make your business better as we speak with the brightest minds in recruitment and HR tech. That’s what we do. Here’s your host, William Tincup.
William Tincup:
Ladies and gentlemen, this is William Tincup, and you are listening to the Use Case Podcast. Today, we have Siadhal from Metaview and we’ll be learning about the business case or use case or cost-benefit analysis, however you want to think about it, of why his prospects and customers use Metaview. So without any further ado or things that are getting in the way, Siadhal, would you do us a favor and introduce both yourself and Metaview?
Siadhal Magos:
Sure thing. Hey, thanks for inviting me, very excited to have this chat with you. I’m Siadhal. I’m one of the co-founders and the CEO at Metaview, as William said. Metaview is a leading platform for driving the quality of interviews at modern, high-growth companies. The reason we exist is that we’ve seen that when hiring at scale, it’s, frankly, impossible to keep interviews high quality and consistent. There are just too many different interviewers running too many different interviews for too many different roles in too many geos for you to stay on top of. And this can result in missing out on the best people, and it can result in making bad hires. Those things are painful, and they really damage velocity and culture. So yeah, Metaview’s all about giving unprecedented visibility into what is actually happening inside interviews, and then unique data insights and training flows to help interviewers calibrate and improve.
William Tincup:
I love that. So, quality interviews. The word quality is the modifier everyone needs, right, because if you do quantity interviews with a poor interviewing process, you’ve done nothing but make more of a problem for yourself. So along those lines for Metaview, where do you start in the interviewing process? Is it data that comes from sourcing or from recruitment marketing or from applications? And then you can rate or rank the quality over quantity? How does the workflow work?
Siadhal Magos:
Yeah. So Metaview starts at the point of actually running interviews. So it’s probably helpful to give a bit of a top level on the actual product functionality itself. Essentially, we have a piece of software that seamlessly joins and then records the interviews themselves. So when we talk about giving you visibility into what is actually happening in interviews, and insights and training flows, the core asset really is being able to record the conversation itself, then transcribe it, and then run all sorts of analysis on the actual recording and transcript. There are great tools that are higher in the funnel than interviewing. What we really focus on is that tip of the spear, the bit where the decisions are actually being made about who we are going to hire, by people who are not always recruiting professionals, right?
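Metaview hasn’t published its internals, but as a rough mental model of the record-transcribe-analyze flow Siadhal describes, here is a minimal sketch. Every name in it is a hypothetical illustration, not Metaview’s actual API.

```python
# Minimal sketch of the record -> transcribe -> analyze flow described
# above. All names are hypothetical illustrations, not Metaview's API.
from dataclasses import dataclass

@dataclass
class Utterance:
    speaker: str       # "interviewer" or "candidate"
    start_sec: float
    end_sec: float
    text: str

def transcribe(audio_path: str) -> list[Utterance]:
    """Stand-in for a real speech-to-text + speaker-diarization step."""
    return [
        Utterance("interviewer", 0.0, 12.0, "Tell me about a recent project."),
        Utterance("candidate", 12.5, 95.0, "Sure. Last quarter I led..."),
    ]

def analyze(transcript: list[Utterance]) -> dict[str, float]:
    """Derive a simple conversation metric: each side's share of speaking time."""
    talk_time: dict[str, float] = {}
    for u in transcript:
        talk_time[u.speaker] = talk_time.get(u.speaker, 0.0) + (u.end_sec - u.start_sec)
    total = sum(talk_time.values()) or 1.0
    return {speaker: secs / total for speaker, secs in talk_time.items()}

print(analyze(transcribe("interview_recording.wav")))
```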
Siadhal Magos:
Of course, recruiters spend a ton of time perusing resumes or running interviews, but your hiring managers or your frontline interviewers, they’ve got other day jobs. It is not their day job to run interviews, but conversely, or paradoxically, they are actually the people who decide who joins the company. So you’ve got this disconnect between the point in time where the decisions are actually made, which is the interviews themselves, and the level of data you have about that part of the flow. That’s what we’re looking to solve by actually capturing the interview itself.
William Tincup:
So one of the things I love about this is the recording, because down the road you can apply some machine learning to that to figure out what’s actually working and what’s not working. And as you said, you can use that to train people: this is a great question, this is not a great question. Are you looking to, and are your customers looking to, standardize the interview process, or are you looking to help them standardize the interview process?
Siadhal Magos:
I think standardize can mean different things to different people. I think there’s a spectrum. What we are not aiming for, and the type of customers that we work with aren’t looking for, is robotic, perfectly-scripted interviews from beginning to end.
William Tincup:
I would like to ask you the next question.
Siadhal Magos:
Go for it.
William Tincup:
No. I mean the robotic part.
Siadhal Magos:
Oh, I see. I thought you were big [inaudible 00:04:37].
William Tincup:
Someone reading from a script you give them, yeah.
Siadhal Magos:
What we are deep believers in, frankly, is that we are all three-dimensional human beings, all unique snowflakes of different descriptions. So having a completely scripted interview is not quite right, but equally, you don’t want every interviewer being completely different from the next interviewer in line, i.e. who you hire is determined by who happened to be available to run that interview today. You want very good calibration between your interviewers without necessarily needing them to run the exact same interview. So I think there’s a balance there, basically. But to answer your initial point: what are we looking to help customers with?
Siadhal Magos:
What we do is identify things like anomaly interviews. So, if there’s an interview where the amount of time spent speaking by the interviewer is off the charts, they’re dominating the conversation, speaking 80% of the time, the candidate hardly speaks, and yet they still go on to reject that candidate when everyone else was a yes on that interview loop, that might be something you’re interested in, because there might be a calibration issue there. Or if there are very few questions asked during the interview, or lots and lots of questions with very short answers, maybe the interviewer is not doing a great job of asking open-ended questions that elicit descriptive answers. There are all these signals you can pick up without having to say, “Hey, we need you to interview just like everyone else.” You can still give nudges and moments of self-reflection to interviewers, so they can think about how to get better at this really key skill.
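As a sketch of what flagging those anomaly interviews could look like, assuming per-interview metrics have already been computed, something like the following would work. The thresholds are invented for illustration and are not Metaview’s actual rules.

```python
# Sketch of flagging anomaly interviews from simple conversation metrics,
# per Siadhal's examples. Thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class InterviewMetrics:
    interviewer: str
    interviewer_talk_share: float  # fraction of speaking time, 0..1
    question_count: int
    avg_answer_secs: float
    decision: str                  # "yes" or "no"

def flags(m: InterviewMetrics, loop_decisions: list[str]) -> list[str]:
    out = []
    # Interviewer dominated the conversation yet was the lone rejection.
    if m.interviewer_talk_share > 0.8 and m.decision == "no" and all(
        d == "yes" for d in loop_decisions
    ):
        out.append("dominated conversation, lone rejection on the loop")
    if m.question_count < 5:
        out.append("very few questions asked")
    # Rapid-fire questions with short answers: few open-ended questions?
    if m.question_count > 30 and m.avg_answer_secs < 20:
        out.append("many questions, short answers: few open-ended questions?")
    return out

m = InterviewMetrics("alex", 0.83, 34, 14.0, "no")
print(flags(m, ["yes", "yes", "yes"]))  # other interviewers on the loop
```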
William Tincup:
So one of the things I want to ask you is are you recording video? Are you recording mostly audio? Or is it a mixture of both? What’s the recording look like? Take us into that.
Siadhal Magos:
Yeah. It’s completely up to the customer. Audio is required; you can’t get a lot out of an interview without recording the audio. Video is on by default as well, but you can switch it off. Most of our customers record both. That’s the typical configuration.
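A minimal sketch of that configuration, assuming a simple per-job settings object (the field names are hypothetical): audio has no off switch, and video defaults to on.

```python
# Sketch of the recording configuration described: audio is always
# required, video defaults to on but can be disabled, and settings can
# vary per job. Field names are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class RecordingConfig:
    video: bool = True  # default on, can be switched off
    # Note: no audio toggle; per the episode, audio is required.

@dataclass
class Job:
    title: str
    recording: RecordingConfig = field(default_factory=RecordingConfig)

jobs = [
    Job("Full Stack Developer"),                            # audio + video
    Job("Support Engineer", RecordingConfig(video=False)),  # audio only
]
for j in jobs:
    print(j.title, "- video on" if j.recording.video else "- audio only")
```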
William Tincup:
Oh, that’s awesome. And job by job, they can turn those things on for certain roles and off for others. Done. I love that. Do you have kind of a library or a recommendation engine, if not now then down the road? “Okay, this job is a full stack developer. These are the types of questions we see trending, the ones candidates respond well to, et cetera.” Do you see a path forward to having that library or templates or recommendation engine, whatever you might call it?
Siadhal Magos:
Yeah, for sure, I see a path towards that. It’s not a focus right now. Building up that data asset is really key, and it’s something that will pay dividends further down the road. But where people actually get the most value right now is in that world of, “How do we train up new interviewers to be just as good as the interviewers that got us here?” So let’s imagine you are a 500-person company growing to a 1,000-person company over the course of a year or two. That’s pretty explosive growth. You made a ton of great decisions to get to the point where you were 500 people.
Siadhal Magos:
What you probably want is your next generation of interviewers to run interviews in a way that is at least inspired by, and somewhat similar to, the people that came before them. So rather than coming up with AI-driven recommendations that you should ask this question, why not just actually check out one of the recordings from one of your most tenured interviewers, just to see how they do it? That’s a really soft and almost user-generated way of training your next batch of interviewers that doesn’t need to be over-engineered. It’s a pretty simple and effective way to do it.
William Tincup:
I love that. So we’re training people up. It’s the art and science of interviews, right? And I want to get your take on: is it more art or more science? Is interviewing, are interview questions, subjective to the company and the role? Meaning, when you’re training people up, let’s say you and I are at SpaceX and I’ve done an interview. You’ve listened to it, you liked parts of it, and you thought parts of it could have been better from an interview perspective. Is that subjective? Do you feel, right now, that’s subjective? Or is it, “No, William, there’s science to this. It’s not subjective. It’s more objective than subjective.”
Siadhal Magos:
So, I think the answer depends on your sample size, right? Any one interviewer running an interview on any one occasion, that is going to be a subjective assessment of whether that was a good interview, something that, as an organization, you have to take a position on: do we believe this is how we are going to attract great talent? Now, pre-Metaview, you’ve got no idea how that interview’s even being run. With Metaview, of course, you have the game tape. You have this observability and you can see it. So when the N is one, when the sample size of interviews is one, I think it’s pretty subjective.
Siadhal Magos:
Where you get to the more objective side of things is when you’re looking at things in aggregate. If you are seeing changes over time in how your interviews are being run, and at the same time you’re seeing a drop in your offer acceptance rate, let’s say, then you can start to at least explore that correlation and develop really data-driven hypotheses as a talent leader or a recruitment leader, and speak to those functional leaders about what might be going on and what role their teams can play in bringing that back in line.
Siadhal Magos:
So, again, when you’re looking at one interview, I think we’re not at the point right now where it’s objective. Humans are highly complex. That applies to the interviewer and to the candidate, and they’re going to be bouncing off each other. You want there to be chemistry, and you want there to be tangents that people go down, and that’s good. We do not want to change that. But when you look at things in aggregate, you should start to understand the complexion of this process that’s taking up tens of thousands of people-hours in your organization.
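As a toy illustration of that aggregate analysis, here is a sketch that checks whether a drifting interview metric moves together with offer acceptance rate. The monthly numbers are fabricated purely for illustration, and, as Siadhal says, a correlation is a hypothesis to explore with functional leaders, not proof of causation.

```python
# Toy illustration of exploring the correlation described above:
# monthly interviewer talk share vs. offer acceptance rate.
# All numbers are fabricated for illustration only.
from math import sqrt

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

interviewer_talk_share = [0.42, 0.48, 0.55, 0.61, 0.66]  # rising over 5 months
offer_acceptance_rate  = [0.81, 0.78, 0.70, 0.66, 0.59]  # falling over 5 months

# A strong negative value here suggests a data-driven hypothesis worth
# raising with the team, nothing more.
print(round(pearson(interviewer_talk_share, offer_acceptance_rate), 2))
```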
William Tincup:
So I am a horrible interviewer, horrible, because the questions I ask are esoteric, because I’m trying to figure out how people think. Like, “What’s your favorite Beatles album?” Stuff like that. My favorite interview question is: we’re all misunderstood, so how are you misunderstood? Really esoteric stuff, so I am a horrible interviewer. Horrible. So this would actually help me, because the training that comes afterwards would go, “Yeah, that’s actually bias. You should not do that.”
William Tincup:
I wanted to talk about two things: one that you brought up, which is velocity, and the one I just brought up, which is bias. Velocity, usually translated for everyone, is speed, and we all know candidates are moving really fast right now, so the faster we can take them through a great process, the better. There’s also velocity in terms of training, getting people up to speed with how, institutionally, great interviews happen at Company X. So talk a little bit more about velocity and speed and how you see it play out with your customers.
Siadhal Magos:
Yeah. Yeah. Frankly, you just did a phenomenal job highlighting two of the areas of velocity that we think about, which I’m happy to expand on. One is the macro effect on your business of being great at interviewing, and what that does for your ability to move faster as a business. Frankly speaking, when you run a great process with really confident, empowered interviewers, you’re more likely to extend offers to, and have offers accepted by, the right candidates. Frankly, that’s hard to prove by the numbers, because of course, what is a great candidate? That is in the eye of the beholder. But we’ve got a ton of customers who believe that to be true, and who, even before Metaview, were operating on that basis.
Siadhal Magos:
So the most general area of velocity is: if you hire people who end up being a bad fit, your business goes slower. If you hire people who are a bad fit and you get a one-bad-apple-spoils-the-barrel-type situation where other people go on to leave, then you’re slowing down the velocity of your business. That’s just a fundamental truth that I think most business leaders believe in.
Siadhal Magos:
The second part of velocity is, specifically, how does Metaview impact your ability to hire faster, which then affects your velocity? One way is that when you’ve got more confident interviewers, they’re more likely to have the confidence to say yes to a candidate. If you speak to any interviewer or hiring manager, they will tell you: when I’m uncertain on a candidate, I fall back to no, because I don’t want to get judged if I make a bad hire. If I’m not certain, I’m not going to do it.
Siadhal Magos:
So if you can help interviewers run higher-signal interviews and really have access to that data afterwards, because they can see the transcript, then they’re more likely to have the confidence to say yes to candidates that actually deserved the yes all along. So, that’s one element.
Siadhal Magos:
The second is that if you can remove the need for big ceremonies and big human-driven processes to train up new interviewers, and you can do it in a much more ongoing, automated fashion, as Metaview enables, then your pool of available interviewers is larger. And when you have a large pool of available interviewers, you as a recruiter have more calendar slots to choose from. It’s very simple: I’ve got a candidate. They’re already under offer from, let’s continue to use your example, SpaceX. I want to interview them as soon as possible. I look at who’s available to interview, and there are no calendar slots until two weeks down the line. If you have 10x as many interviewers, you can interview them tomorrow. And so your speed of getting people through your hiring pipeline just explodes. That’s really good for velocity too. So yeah, that’s me unpacking velocity.
William Tincup:
What questions are you receiving, are you fielding, right now about bias, hiring bias in general?
Siadhal Magos:
You mean almost during the buying process or do you mean something else?
William Tincup:
No, bias. Like, sometimes you have biases that come out in weird ways. I think SHRM defines seven distinct hiring biases, like the “like me” bias: I like this candidate a lot because they’re a lot like me. The last interview is actually a hiring bias, ironically. And then everyone’s looking at diversity, inclusion, belonging, and equity, and thinking: we want a more diverse, more inclusive place, et cetera. Are you fielding questions about this? Because again, hiring managers, as you eloquently stated, this isn’t their job. They’ve got a job. You’re trying to bring a level of elegance and sophistication to their job to make them confident, but that might or might not bring out biases. So I’m curious whether you’re handling or fielding any questions about bias.
Siadhal Magos:
Yeah, thanks for the clarification. It comes up a ton. For most of our customers, part of the reason they buy Metaview, and I would even say part of the definition they have for quality interviewing, which is what we are all about, and part of our definition as well, is fairness and the avoidance of those biases. So the thing that we and our customers, and thankfully we’re pretty well aligned on this, tend to focus on when it comes to bias, and be curious about, is, at one level, almost the more pernicious but less evil version of bias, which is the shortcuts people take when they haven’t got to certainty. If you’re in a hiring process, you are under the gun. You need to hire someone for this role by the end of the month or end of the quarter, otherwise you’re not going to hit your objectives. You’ve seen a bunch of candidates, and you never got to certainty on any of them. You’re going to fall back on what’s called resume bias, right? It’s almost like the interviews might as well not have happened. You might as well just have looked at the resumes in the first place and made your decision from that.
Siadhal Magos:
So, that’s the most common. Anecdotally, the feedback we get from customers is that the most common way Metaview helps them with bias is, “Well, I can avoid that resume bias, because if I’m not sure, I can get a second opinion.” It doesn’t matter if someone wasn’t in the interview with me; I can have them review some of the key parts. They don’t have to listen to the whole thing. They can listen at 1.5x, and I can point them to the most important part of the interview. And they can give me their perspective as a really trusted hiring manager, and I can feel much more confident about my decision, and therefore take what previously might have been seen as a risk on hiring that candidate. So, that’s super important.
Siadhal Magos:
The other, though, is that a few of our customers have specific goals, many of them around redressing the balance of underrepresented folks within their organization. At the moment, they have a ton of data on diversity before the interview funnel, at the very top of the funnel, i.e. who they’re sourcing and who they’re managing to get into that first interview. They have a ton of data about who makes up their current team, who they extend offers to, and their different backgrounds.
Siadhal Magos:
What they don’t have any data on is how these folks from different backgrounds are treated within the interview process itself. Are there systemic ways that we might be treating different genders or different races differently in our interview process? That all plays out in the interview metrics, but it’s much more specific to individual customers, where they need to look at their own data. There are a few broad trends, I would say, but they really need to look at their data to understand if there’s an issue there.
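Purely as a sketch of that kind of funnel check, the analysis could start as simply as comparing interview pass-through rates across self-reported groups. The data and group labels below are fabricated for illustration.

```python
# Sketch of the funnel analysis described: compare how candidates from
# different self-reported groups fare at the interview stage.
# Data and group labels are fabricated for illustration only.
from collections import defaultdict

candidates = [
    {"group": "A", "passed_interview": True},
    {"group": "A", "passed_interview": False},
    {"group": "A", "passed_interview": True},
    {"group": "B", "passed_interview": False},
    {"group": "B", "passed_interview": False},
    {"group": "B", "passed_interview": True},
]

totals: dict[str, int] = defaultdict(int)
passes: dict[str, int] = defaultdict(int)
for c in candidates:
    totals[c["group"]] += 1
    passes[c["group"]] += c["passed_interview"]  # bool counts as 0/1

for g in sorted(totals):
    print(g, f"pass-through rate: {passes[g] / totals[g]:.0%}")
```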
William Tincup:
What I love about that is it exposes bad managers, right? There’s a layer of transparency being applied here, so we can easily start to see: this person just doesn’t hire women. From a sourcing perspective, there are plenty of women that come through the process, plenty of qualified women, and they just don’t hire women. Now that gets exposed. There’s a light that gets shone on that, which I love. I’m assuming, workflow-wise, because of where you sit, you’re integrated into ATSs?
Siadhal Magos:
Yeah. So, that’s exactly correct. So all of the scheduling and the core candidate information still sits in the ATS and we play nicely with the ATS.
William Tincup:
Good, good, good, good. Demo. I’m not going to ask you who your favorite child is, but what’s your favorite part of the demo? When you show somebody Metaview for the first time, what’s the part where you go, “I love this part”?
Siadhal Magos:
Yeah. It’s funny, because there’s the new thing that I’m personally really excited about, and then there’s the thing that historically has just been an absolute delighter for folks when they see it. I’ll go with the new thing, and if you let me, maybe I’ll mention the other thing too. The new thing we’ve just shipped is basically a new report that I really love showing off. In a very clever way, it helps the customer understand if there are any pockets of inconsistency in the interviews being run across the company. And the reason I say it’s clever is that it’s easy, and not interesting, to say, “Hey, the interviews in your company are inconsistent.” And you’re like, “Hey, duh, of course my software engineer coding interview should look different, from a metrics perspective, to my design manager portfolio review interview. There’s no reason they should look similar.”
Siadhal Magos:
Really, the key is understanding which apple you can compare to which apple, and which apples you should not be comparing to oranges, because it’s not relevant. Being able to identify what is appropriate to compare to what, so that you can identify where inconsistency exists, is a big thing we’ve been working on recently. And it’s proven really popular with customers as we start to roll it out. So yeah, I’m very excited when I get to speak to customers about that.
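The report itself is Metaview’s, but the apples-to-apples idea can be sketched: only compare interviews within the same role and interview-type bucket, then look for buckets with unusually high spread. The grouping keys, the metric, and the threshold here are illustrative assumptions.

```python
# Sketch of the apples-to-apples idea: compare interviews only within
# the same (role, interview type) bucket, then flag buckets with
# unusually high spread. Keys, metric, and threshold are illustrative.
from collections import defaultdict
from statistics import mean, stdev

interviews = [
    {"role": "Software Engineer", "stage": "coding", "talk_share": 0.35},
    {"role": "Software Engineer", "stage": "coding", "talk_share": 0.38},
    {"role": "Software Engineer", "stage": "coding", "talk_share": 0.72},
    {"role": "Design Manager", "stage": "portfolio review", "talk_share": 0.55},
    {"role": "Design Manager", "stage": "portfolio review", "talk_share": 0.58},
]

buckets: dict[tuple[str, str], list[float]] = defaultdict(list)
for iv in interviews:
    buckets[(iv["role"], iv["stage"])].append(iv["talk_share"])

for key, shares in buckets.items():
    if len(shares) < 2:
        continue  # nothing to compare within a bucket of one
    cv = stdev(shares) / mean(shares)  # coefficient of variation
    status = "possible inconsistency" if cv > 0.25 else "consistent"
    print(key, round(cv, 2), status)
```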
William Tincup:
All right. So go ahead and tell us. What’s historically been your favorite?
Siadhal Magos:
Sure, thank you. Thank you. So that feature I just mentioned is to everyone’s benefit, and it enables great conversations between recruiting and functional leadership where there are those pockets of inconsistency. But really, Metaview has been built by interviewers and hiring managers. That is my background. That’s the experience I had when Shahriar, my co-founder, and I decided to start Metaview. And so we are very keen to make sure that the interviewers themselves are direct beneficiaries of the fact that the company decides to use this product. The feature that really delights them, as well as the folks in recruiting, is coaching. Based on a subset of interviews that I might have conducted on the platform, Metaview will automatically generate suggestions for how I can tweak my interview technique.
Siadhal Magos:
Whether that’s, “We’ve noticed that a lot of the questions you ask result in short answers, so you might want to ask open-ended questions; here are some recommended open-ended questions based on the types of roles you hire for.” Or, “Your introductions tend to be very short; here are some good elements to include in your introduction to get the candidate pumped about the company.” There’s essentially a full syllabus we’ve created of these nudges and tips, and they’re only given to interviewers when they actually exhibit the behavior that makes that tip relevant. So it’s highly contextual, highly personalized. Ninety-one percent of interviewers say that it helps them run better interviews. And yeah, we really love to hang our hats on that functionality.
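As a sketch of how such contextual nudges might be wired up, a simple rule list where each tip fires only when its triggering behavior is observed would look like this. The rules and thresholds are illustrative assumptions, not Metaview’s actual syllabus.

```python
# Sketch of contextual coaching nudges that only fire when an
# interviewer actually exhibits the relevant behavior, per the
# description above. Rules and thresholds are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable

@dataclass
class InterviewerStats:
    avg_answer_secs: float  # average candidate answer length, recent interviews
    avg_intro_secs: float   # average length of the interviewer's intro

NUDGES: list[tuple[Callable[[InterviewerStats], bool], str]] = [
    (lambda s: s.avg_answer_secs < 20,
     "Many of your questions get short answers. Try more open-ended questions."),
    (lambda s: s.avg_intro_secs < 30,
     "Your introductions run short. Try adding a hook about the company."),
]

def coach(stats: InterviewerStats) -> list[str]:
    # Surface only the tips whose triggering behavior was observed.
    return [tip for trigger, tip in NUDGES if trigger(stats)]

print(coach(InterviewerStats(avg_answer_secs=15.0, avg_intro_secs=25.0)))
```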
William Tincup:
I love it. What’s the most recent customer story that you’ve just fallen in love with? Without brand names and stuff like that, just a story where you’re like, “I love how they’re using this.”
Siadhal Magos:
Yeah. Yeah. Well, there was this pretty cool case recently, actually, that comes to mind. We tend to be pretty attentive to these customer use cases, but there was one thing that was pretty unexpected. I mentioned that one of the flows Metaview facilitates is essentially this virtualization of shadowing: listen to this great interviewer’s recording of running this particular interview type at your company, in order to learn how to interview. We knew that was going to be a thing, and we knew that would delight customers, because we were part of those shadowing processes ourselves, and we saw it first hand.
Siadhal Magos:
What we weren’t expecting was the effort put in by the tenured interviewer, the person who ran the interview that was highlighted as, “This is a great interview; we should use this to train others.” The effort they would put in to actually annotate the transcript and the recording with their background thinking on why they took the approach they did in the interview. So these are people really nerding out on their own interview and trying to pass on their actual thought process, which then becomes this organic, user-generated, completely proprietary training content that is hyper-relevant to that one company and helps their future interviewers be even better. And that was just completely organic. We didn’t even facilitate that. They just used the transcript annotation functionality for that purpose, which was really cool to see.
William Tincup:
Oh, that is fantastic. I mean, that’s capturing, as you said earlier, institutional knowledge. Once we know what a great interview looks like, we don’t have to relearn it; we can train on it. So, I love this. I love what you’ve built, and I’m so happy that you came on the Use Case Podcast.
Siadhal Magos:
Thanks so much for having me, William. It’s been a blast.
William Tincup:
Absolutely. And thanks for everyone listening to the Use Case Podcast. Until next time…
The Use Case Podcast
Authors
William Tincup
William is the President & Editor-at-Large of RecruitingDaily. At the intersection of HR and technology, he’s a writer, speaker, advisor, consultant, investor, storyteller & teacher. He's been writing about HR and Recruiting related issues for longer than he cares to disclose. William serves on the Board of Advisors / Board of Directors for 20+ HR technology startups. William is a graduate of the University of Alabama at Birmingham with a BA in Art History. He also earned an MA in American Indian Studies from the University of Arizona and an MBA from Case Western Reserve University.