#5 - Webflow's Playbook for Hiring Engineers in the AI era | with Akash Jain (Senior Engineering Manager at Webflow) & Sargun Kaur (CEO at Byteboard)
It doesn't have to be a binary approach.
Hey friends! 👋
Welcome to this new edition of our series of in-depth, actionable articles about engineering leadership. If you’re new here, check out our previous articles about Architecture, Engineering Efficiency, Migrations, and Developer Productivity right here. And never miss a post again by clicking the big, blue button below :)
For today’s article, we picked the brains of Akash Jain, Senior Engineering Manager at Webflow, and Sargun Kaur, CEO at Byteboard. They’re sharing their exact blueprint to hire engineers in a generative AI era, right below. Let’s start, as usual, with some background about them 👇
Akash Jain, Senior Engineering Manager at Webflow & Sargun Kaur, CEO at Byteboard
If I needed help from technical interview experts, I wouldn’t think twice before getting in touch with Sargun Kaur and Akash Jain.
The former, Sargun Kaur, is a seasoned engineering professional who began her journey at Google, contributing to the backend of Google My Maps and later the Android app for Google Photos. Her career then took a pivotal turn toward leadership: she spent nearly four years as an Engineering Manager, during which she built Byteboard and received a Forbes 30-under-30 distinction. In October 2021, after 7 years at Google, she spun Byteboard out and has grown the company ever since as its CEO. Byteboard is on a mission to revolutionize technical interviewing and has enabled over 20,000 technical interviews to date. Their goal has always remained the same: fostering equitable interviewing and hiring practices in engineering, ensuring fairness for every candidate.
The latter, Akash Jain, has more than 11 years of experience as a technology leader at companies like Visa, CBRE, DroneDeploy, Asana, and, since December 2022, Webflow. He’s currently a Senior Engineering Manager at Webflow, where he oversees the whole pricing, billing, subscription, payment, and checkout process, all while improving the company’s technical hiring process. During his career, he’s also conducted hundreds of interviews.
But in the last few years, the landscape has seen tremendous changes with the rise of ChatGPT and the like. How do you find the right process in an era of generative AI?
At Elevate 2023, we were fortunate to have Sargun and Akash on stage, where they shared insights on enhancing technical interviews in an AI-driven era. And we’re in luck again: they created the ‘byte-sized’ playbook below, which describes Webflow’s current process for hiring the right engineers when ChatGPT is around.
You can follow Akash Jain and Sargun Kaur on LinkedIn. Let’s go!
Here’s Webflow’s playbook for hiring engineers in the AI era
Engineering teams today are under intense pressure. Business and customer demands are more pronounced, many teams have had to cut staff, and the nature of building technical products is getting disrupted by the rise of generative AI.
As software engineers and managers, we feel this pain very acutely. And, while we can’t solve all these challenges, we want to share an approach that helps with one of the more pressing issues: hiring in the age of AI.
We’ve both experienced this firsthand. One of us, Akash Jain, has seen all sides of it: as an engineer, as an engineering manager, and now as Webflow’s Senior Engineering Manager responsible for overhauling the entire company’s technical hiring process. This article’s other author, Sargun Kaur, was an engineer at Google before leaving to become the co-founder and CEO of Byteboard, the all-in-one platform for technical hiring based on real-world, not test-taking, skills. Combined, we’ve been in hundreds of technical interviews, and the Byteboard platform has administered over 20,000 of them.
Transforming technical hiring to be more effective for all parties is a topic we’re very familiar with and passionate about. Especially considering how solvable it is. This topic is particularly timely as generative AI and AI collaboration tools are completely transforming both the nature of software engineering and the process of hiring for it. This inflection point in building technical products and teams must be met with a similar evolution to technical hiring. There’s a clear playbook for technical hiring in the AI era and that is what we are excited to share with you here.
The problems with current technical interview processes
When it comes to technical interviews today, many hiring managers rely on LeetCode. But LeetCode prompts only demonstrate whether someone is good at solving these specific types of problems, not how they’d perform on the job. In a world where many companies are committed to building an equitable hiring process, this gives an unfair advantage to candidates who have many extra hours to devote to studying, and it still doesn’t translate into hiring the most qualified candidates.
Plus, these interviews lead to long, expensive hiring cycles that create a sub-par candidate experience. In other words, it’s a lose-lose situation.
We took a closer look at the technical interview process at Webflow and identified a few major areas for improvement.
One of the biggest problems was that we were asking candidates to solve hypothetical, isolated problems, because LeetCode doesn’t reflect the way engineers work every day. Not only was this a slow and expensive way to evaluate candidates, but it also led us to make regrettable hiring decisions, and candidates felt they weren’t being given the chance to highlight their skills.
“LeetCode doesn’t reflect the way engineers work every day”
As engineers, we don’t work in a vacuum. We face a lot of complexity, but we also can rely on resources like StackOverflow, Google, and our peers. Our real work involves collaborating to build and solve complex problems.
And there’s a new problem we’re experiencing due to recent advances in generative AI. Here’s just a quick overview of all the things a tool like ChatGPT is capable of today:
Discussing the trade-offs between different solutions
Discussing the Big O performance of a particular solution
Generating alternative solutions
Explaining code in English
With these advances, hiring managers are finding that it’s much easier for candidates to game the system. ChatGPT can already quickly solve LeetCode questions and explain its decisions in English (or many other languages). In fact, when Sargun fed ChatGPT a “hard” LeetCode question, it could solve it within seconds—much faster than she could have gotten her left pointer variable initialized.
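To make that anecdote concrete, here is the style of question being described: a classic “hard” two-pointer LeetCode problem (trapping rain water, shown as an illustrative example, not the specific question Sargun used). A tool like ChatGPT can produce a solution of this shape, along with a Big O analysis, in seconds:

```python
def trap(height: list[int]) -> int:
    """Water trapped between bars of the given heights.

    Two-pointer approach: O(n) time, O(1) space. The side with the
    lower bar is the one whose trapped water is fully determined, so
    we advance that pointer and accumulate (running max - bar height).
    """
    left, right = 0, len(height) - 1
    left_max = right_max = 0
    water = 0
    while left < right:
        if height[left] < height[right]:
            left_max = max(left_max, height[left])
            water += left_max - height[left]
            left += 1
        else:
            right_max = max(right_max, height[right])
            water += right_max - height[right]
            right -= 1
    return water

# trap([0,1,0,2,1,0,1,3,2,1,2,1]) → 6
```

The point is not that this code is hard to write, but that a candidate can paste the prompt into a chatbot and get exactly this back, explanation included, which is why such questions no longer separate strong candidates from weak ones.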
A mentor of Sargun’s ran an experiment with his executive assistant. Simply by using the internet and ChatGPT, she was able to pass Amazon’s technical phone screen, despite not having any technical background or experience. Woven did a similar study, pitting a marketer using ChatGPT against a junior engineer using ChatGPT.
While some companies are turning to anti-cheat tactics like retina scans or screen captures to limit reliance on generative AI, to put it bluntly, these tactics don’t work. Not only are you adding costs to your organization, but you’re starting from a place of distrust and severely limiting the candidate’s abilities to showcase their skills. You’re also adding unnecessary stress and unfamiliarity to the interview process, which the majority of candidates already find unnerving. Oh, and by the way, did we mention that you’re still not getting an accurate assessment of how the candidate will perform on the job?
What it all comes down to is this: The real issue is that the questions we’re asking candidates are not getting to what makes a good software engineer. We need to rethink our approach to technical interviews so we’re focusing on the skills that equate to being a good engineer.
The solution: Defining what makes a good software engineer today
So what makes a good engineer? Every technical lead might have their own slightly different answer to that question, but for us, it boils down to a few core qualities, which include the ability to ask the right questions, understand context, and reason about tradeoffs.
As we looked to redesign the interview process at Webflow, we had four key principles we kept in mind:
Make all interviews real-world, open-book, and project-based (this reduces the likelihood that candidates can game the system and instead get the chance to showcase what they can really do)
Outsource skill-based assessments to a high-quality third party like Byteboard (this limits engineers’ involvement in the process before candidates are vetted)
Match interview rubrics with Webflow’s qualities of a good engineer like communication, problem-solving on the spot, and asking for context (this ensures the candidates who are progressing fit our definition of “good”)
Align behavioral interviews with company values in a condensed single interview (again, this reduces engineers’ time spent on interviews while still providing a stronger signal that a candidate is a good culture fit)
And here’s what our new four-step interview process looks like.
Step 1: Recruiter phone screen
This stage remains largely unchanged, but it’s important because it provides the first one-to-one contact between a Webflow employee and the candidate. Our recruiters ask questions to learn more about the candidate’s interest in the role and company and assess whether there is a good match between the candidate’s experience and the skill set required for the open role. This is also a good opportunity to tell candidates about Webflow’s remote-first culture and learn any other important details.
Step 2: Byteboard assessment
The Byteboard assessment allows us to assess a candidate’s technical skills for a range of roles such as front-end engineer, back-end engineer, or site reliability engineer. Every Byteboard assessment has two parts: the project requirements (the design part) and the coding part. Each assessment is adapted to the appropriate coding language and level for the candidate.
LeetCode problems are hypothetical and isolated. In contrast, Byteboard uses a project setup, where candidates look at project requirements and use their system design skills to come up with solutions and fill in the gaps. This means they can’t simply copy and paste a solution. They need to look at the skeleton of the code to build context.
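For illustration, here is a toy sketch of what a “fill in the gaps” skeleton might look like (a hypothetical Python example, not an actual Byteboard prompt). The candidate reads the surrounding code and requirements to build context, then implements the marked methods:

```python
from collections import defaultdict


class PageVisitTracker:
    """One piece of a larger (hypothetical) analytics project.

    The project requirements document defines the behavior; the
    candidate implements the methods marked TODO, shown here with
    one possible solution filled in.
    """

    def __init__(self) -> None:
        # Maps user_id -> set of page ids the user has visited.
        self._visits: defaultdict[str, set[str]] = defaultdict(set)

    def record_visit(self, user_id: str, page_id: str) -> None:
        # TODO(candidate): record the visit; repeat visits to the same
        # page by the same user must not be double-counted.
        self._visits[user_id].add(page_id)

    def unique_pages(self, user_id: str) -> int:
        # TODO(candidate): return the number of distinct pages visited.
        return len(self._visits[user_id])
```

Because the answer depends on reading the existing structure and requirements rather than recalling a memorized algorithm, a copy-pasted solution is of little use here.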
This assessment is purposely designed to be completed in under two hours, which we believe is enough time for a candidate to demonstrate their skills while still being a reasonable amount of time for them to dedicate to their application.
We looked at many different technical assessment tools, and we picked Byteboard because it was the best tool for identifying the right engineer with the greatest efficiency. We’ve found Byteboard assessments give a clear picture of a candidate’s real-world skills rather than just their test-taking ability, because of their unique offering of asynchronous, project-based assessments (which candidates like to take). Those assessments are then graded by trained assessment engineers working from structured rubrics that capture the detail and nuance we need to make the best decision. These rubrics were built with I/O psychologists, so we know they’re fair and consistent and give us a clear, actionable picture of how well a candidate will do in a particular role. Ultimately, we get the highest-quality signal most efficiently. Only about 37% of candidates at Webflow move from this stage on to the next round.
Step 3: Hiring manager screen
Now that we have context on a candidate’s technical ability from the Byteboard assessment, we know where their technical skills are strong and where they are weaker. The hiring manager can then probe the weaker areas to learn whether they will be an issue for this particular role. We don’t ask candidates to code at this stage; instead, we try to learn more about how they approach problem-solving.
Beyond that, this stage is mostly about evaluating the candidate for team fit. We’re trying to determine the following:
Will a candidate’s experience, skills, and approach line up with the team that they'd be joining?
Are their interests aligned with the team’s needs?
Is there a different role in an adjacent team that might even be better?
Because we’re a remote-first team, it’s also critical that we get a sense of a candidate’s capacity for working in a highly collaborative remote environment. This is the point where that signal can be extracted in a time-efficient manner. We forward a candidate to the next round only if we believe there is a better-than-50% chance they will pass the onsite loop.
Step 4: Onsite loop
This is the most expensive and intensive round (a full day that requires a lot of engineering time), so we try to ensure that candidates who make it this far have a good chance of clearing this round. All of the work we've done up until now is about filtering for candidates who will be successful in this on-site phase. This is why we pay so much attention to the onsite-to-offer ratio. And that's one of the metrics that has improved so much with this new approach.
First, we evaluate for system design. We’ll ask the candidate to discuss and design a well-known product like Twitter or Uber. We don’t ask them to prepare anything in advance for this stage; it’s all done in real time during the interview.
We want to test a candidate's understanding of every component of engineering infrastructure and, crucially, how those components interact. Here we focus on a candidate's ability to design resilient, highly available systems that can scale. Even though a candidate won't be working across the whole system, they must consider how their work fits into the overall system.
For the coding interview, we’ll have a question that’s more relevant to what we do at Webflow. There are different versions for front-end, back-end, and data engineering roles. In each case, we provide a skeleton component and we ask the candidate to provide more logic to it. We conduct these as "open book" interviews, empowering candidates to utilize resources like Google and StackOverflow.
We use the “open-book” approach because it reflects our real work environment; in an interview, it’s less relevant for a candidate to demonstrate their ability to memorize syntax. At the end of the day, we want to understand their problem-solving skills and critical thinking. Letting candidates look up syntax and use whatever resources are available online gives us the best picture of how they would work if hired.
The project walkthrough is another facet of gauging problem-solving. In this case, the recruiter will let the candidate know ahead of time that this will be part of the interview so they have the opportunity to prepare for it. We ask the candidates to choose one project they’ve been involved in (either as a contributor or leader) and talk us through it. What was the project itself? What were the challenges? How were they overcome and what did they learn? We are watching for their critical thinking and ability to effectively diagnose the sorts of issues that will arise in any engineering project.
The last interview of the onsite is titled “Core Behaviors.” Here, a senior engineer evaluates whether a candidate’s thinking and attitude align with our company values. For example, one of our values is “make your mark,” so we ask something like, “When was the last time you did something innovative?” Beyond technical skills (which we assess in the system design, coding, and project walkthrough interviews), we’re also looking at the candidate’s communication skills throughout this stage.
Our new process has several advantages, including:
The Byteboard assessment allows us to filter out many candidates before we’ve spent any engineering time on them.
By the time someone speaks to a hiring manager, we know they’re technically sound and we can focus on other areas, like whether they’d be a good fit for the team.
Having an open-book interview removes some of the stress for candidates and allows them to showcase their skills. Our goal is not to trick them, but rather to learn how they work.
Because we’ve designed each interview to gain more context about the candidate, we no longer need multiple rounds of behavioral interviews.
Results: What we’ve seen so far—and what this means for you
There are several metrics to gauge the impact of a technical hiring process. We pay attention to several, including:
Engineering hours spent: both overall hours and hours spent at each stage. We want to return as many engineering hours as possible while ensuring we’re still very involved at the high-impact stage of onsite interviewing.
Time to hire: The sooner we can get a hire made, the more effective our team is.
Onsite-to-offer rate: A better ratio here means that we’re only spending expensive engineering time for onsite interviews with strong matches. [Note, though, that this ratio will never be 1:1 because you might have five great candidates on-site, but only have one role to fill.]
Candidate NPS: This is a vital gauge of how well we engage candidates, regardless of a hiring decision. Our company’s success depends upon being admired among developers, so this is an important touchpoint for preserving or even enhancing brand affinity among a crucial audience.
Number of mis-hires: This is a bit more subjective, but there is a huge cost to hiring someone who doesn’t work out. The Byteboard process gives us a clearer picture of how a candidate will actually perform on the job, which saves untold time and heartache in the long run.
Since we revamped our interview process, we’ve seen some impressive results, including a 34% higher conversion rate from onsite to offer. This means that by the time someone gets to the onsite stage, they’re much more likely to be a good fit and we haven’t wasted time and engineering resources on lengthy interviews with candidates who don’t meet our criteria. Speaking of time saved, having Byteboard handle the initial technical assessment has saved us at least 480 engineering hours in just four months.
As the demands on engineers change, we need to adapt our interviewing processes to meet them. It no longer makes sense to focus on rehearsed answers or fixed code solutions. We see the future of our industry as project-based interviews that are messy and complex. That’s the world engineers are working in, and that’s how we need to assess them. Now’s the time to start modifying your interview process to better match your needs as a hiring manager, and to reflect the skills you’ll need in your team for the more collaborative, AI-centric world that is coming.
Now what?
We hope this has been an in-depth and actionable resource about how the process for hiring engineers can be improved for better outcomes and efficiency—especially as AI reshapes the way engineering is done.
Now you’ve seen that AI is impacting the nature of technical hiring, but this is just a fraction of what companies need to think about when preparing for the impact of AI. In the course of Sargun’s work at Byteboard, the team has found that AI is reshaping technical hiring in three key ways: how you hire, who you hire, and the roles you hire for. Here’s a quick overview, but you can also explore the full Playbook for hiring in the age of AI.
Tasks: AI is impacting the tasks that engineers perform. Engineering has always been more than coding, but when AI-assisted code is ubiquitous, the nature of engineering work will shift toward code review, attention to detail, and correcting the flaws of AI-generated code (which often appear correct but can have well-hidden flaws).
Processes: AI is impacting the way engineers are hired. As discussed in this article, from the top-of-funnel search process to the interview loop, the approach to finding and acquiring talent has changed in nearly every dimension. In today’s hiring landscape, we advise against elaborate efforts to catch AI-based cheating and instead prioritize a hiring process that welcomes AI-assisted work, since that’s most reflective of the tasks mentioned in point #1.
Roles: AI is impacting the positions we hire for. Generative AI offers a “speed hack” for engineering, but it also introduces new challenges. The people building software in this future state will be able to accomplish some aspects of their work much faster, freeing them up to address the “what” and “why” of software engineering while AI handles the majority of the “how” questions.
What does this mean for you? The definition of a good software engineer is always evolving—it’s different today than it was ten years ago, and we can be sure that it will look considerably different a few years from now.
The engineers and engineering teams best equipped to capitalize on this evolution will be those that think deeply—and early—about how the change is happening and what to do about it now. We hope that we’ve given you a clear-eyed and actionable roadmap—or at least a step in that direction. If you’re an engineering leader who would like to dig deeper into this topic or Byteboard’s solution to it, please feel free to email Sargun here!
That’s a wrap! Let us know what you thought below!
Thanks in advance for your feedback.
Quang & the Plato team