03 December 2014

There's been a fair amount of conversation around how to recruit software developers effectively. I've participated in some of it. But just today, a blog post crossed my desk(top) and finally prompted me to get around to blogging about it.

In "Resumes Suck. Here's the Data", Aline Lerner talks about a scientific process she used to actually try to gauge the efficacy of resumes in the hiring process. (I probably should say "pseudo-scientifically", only because while it looks pretty legit to me, and frankly it's more legit science than most companies will engage around hiring, it probably has a few statistical/scientific holes in it that could lead to some misleading data; that said, it's better than the traditional answer of "Well, I know how to interview people, so....", which to me is an outright so-you-buy-into-your-own-bullshit answer.)

Her conclusion?

... it may not be a matter of being good or bad at judging resumes but rather a matter of the task itself being flawed — at the end of the day, the resume is a low-signal document. ... Very smart people, who are otherwise fantastic writers, seem to check every ounce of intuition and personality at the door and churn out soulless documents expounding their experience with the software development life cycle or whatever... because they’re scared that sounding like a human being on their resume or not peppering it with enough keywords will eliminate them from the applicant pool before an engineer even has the chance to look at it.

Yeah. I can relate.

If you combine her findings here with my anecdotal evidence around how most companies' interview process is fundamentally broken (which Google seems to have confirmed), what you come to realize is that for most companies, the entire recruiting process is just one giant coin-flip: equal parts intuition and mutual fakery with the best of intentions ("This company is great!" "I'm an amazing candidate!"), with almost no science--or accountability, or independent verification of the results, or testing--involved.

So here's what we do at iTrellis (the company where I'm currently the CTO):

... and so on. The goal here is to push your boundaries a little, and yes, maybe pile on a little stress to see how you react. But at every step, it's something you'll actually have to deal with, every day, when working for a consulting company.

That's our recruiting process.

By the way, I say all of this here because none of this is Secret Sauce(TM). We want the entire process to be entirely transparent. No "hidden agenda". Yes, I'll admit, right up front, what each stage is designed to do, and why. I'll even tell you, right from the beginning, if you ask. Yes, it's a little longer than your average "one-hour interview with the hiring manager"; but if you notice, the time commitment on the iTrellis side is 30 minutes (recruiter call) + code evaluation time, probably not more than 30 minutes (my or an engineer's time) + pairing time, not more than 30-45 minutes per pair (of an engineer's time), which is actually pretty comparable to your average "stand at the whiteboard and code" interview time. Yes, you've put 10 or so hours into the coding project--but it's coding, and tell me right now, which would you rather do: a full eight-hour "interview loop" in front of people at a whiteboard, or hacking away gloriously, snuggled up with the cat in front of the TV, using your favorite IDE and your favorite libraries with your favorite tunes blasting through your favorite headphones?

Yeah, I thought so. (So do most of the candidates who come through, whether we accept them or not. Of course, they could just be trying to brown-nose me; hard to say.)
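If you want to sanity-check that time math, here's a quick back-of-the-envelope sketch. The per-step numbers are the ones above; the number of pairing sessions and the shape of the "traditional" loop are assumptions I'm making purely for the comparison.

```python
# Back-of-the-envelope interviewer-time comparison.
# The per-step numbers come from the process described above; the number of
# pairing sessions (2) and the shape of the "traditional" on-site loop
# (5 one-hour whiteboard sessions) are assumptions for illustration only.

recruiter_call_min = 30        # recruiter phone call
code_review_min = 30           # evaluating the coding project ("probably not more than 30 minutes")
pairing_sessions = 2           # assumed; the post doesn't pin down a count
pairing_min_per_session = 45   # upper end of the 30-45 minute range

itrellis_side = recruiter_call_min + code_review_min + pairing_sessions * pairing_min_per_session

whiteboard_sessions = 5        # assumed shape of a typical all-day loop
whiteboard_min_per_session = 60

traditional_loop = whiteboard_sessions * whiteboard_min_per_session

print(f"iTrellis-side interviewer time: {itrellis_side} minutes")        # 150
print(f"Traditional-loop interviewer time: {traditional_loop} minutes")  # 300
```

Adjust the assumed counts to match your own process; the point is just that the interviewer-side cost here isn't some wild outlier compared to a conventional loop.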

How do I know it works? A few reasons. One, we've gotten some great hires out of it. (OK, yes, that could just be luck--random chance could give you that, too.) Two, we've had no "bozo" hires. (OK, we've only been around a year, so again, we could just be lucky.) Three, we've had more than a few candidates who looked great on paper either walk away from the coding challenge entirely (buh-bye!) or turn in some pretty weak results. Sure, they might've been great, but frankly, if your audition tape is bad, your band is not going to get gigs, either.

But mostly, I have faith in this process because it lends itself well to unit-testing; at some point (haven't done it yet, but this is in my plans for 2015), I'm going to slip a few "known quantities" into the system and see how they do: a complete newbie with great confidence and a few buzzwords, to see if they can get all the way through, and an amazing coder with a crappy resume, just to see how we do. And, by the way, note that I'm not part of the recruiting process: as soon as I could get myself out of the way of it, I did, because I didn't want to create a company of Ted-clones. (Which is also something that we tend to do intuitively: prefer the familiar, and nothing is more familiar than the self. This is part of the reason why a team of white men tends to stay a team of white men.)
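For the code-minded, here's roughly what that "unit test" idea looks like if you squint and pretend the pipeline were code. Everything in this sketch is hypothetical--run_pipeline() is just a stand-in for the real (human) recruiter-call/coding-project/pairing flow, and the two candidates are the "known quantities" described above.

```python
# A sketch of "unit-testing" the hiring pipeline, as described above.
# Hypothetical throughout: run_pipeline() stands in for the real (human)
# recruiter-call -> coding-project -> pairing flow, and the two candidates
# below are the "known quantities" the post plans to slip into it.

def run_pipeline(candidate):
    """Stand-in for the real process. Here it simply pretends the pipeline
    keys on actual coding ability rather than confidence or resume polish."""
    return "offer" if candidate["can_code"] else "no offer"

def test_confident_buzzword_newbie_gets_filtered_out():
    newbie = {"can_code": False, "confidence": "high", "buzzwords": ["SDLC", "synergy"]}
    assert run_pipeline(newbie) == "no offer"

def test_great_coder_with_weak_resume_gets_through():
    coder = {"can_code": True, "resume_quality": "weak"}
    assert run_pipeline(coder) == "offer"

if __name__ == "__main__":
    test_confident_buzzword_newbie_gets_filtered_out()
    test_great_coder_with_weak_resume_gets_through()
    print("Both 'known quantity' checks passed.")
```

The real version of this is people and offers rather than asserts, but the shape is the same: feed the process inputs whose correct outcome you already know, and check whether it produces that outcome.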

If you've made it this far, thanks for reading. Two more important points, then I'm out:

Peace out, y'all.

(UPDATE 2021: Hey, I no longer work for iTrellis, but I think the process is still interesting. I've updated the post accordingly.)


Tags: industry   management   philosophy  

Last modified 03 December 2014