by justswim on 4/4/25, 1:41 AM with 708 comments
by no-dr-onboard on 4/7/25, 7:10 PM
Yeah, hiring is scary. Hiring is insanely expensive on all fronts. Firing people is difficult; it's expensive and legally exposing. Hiring the wrong person, allowing them access to your systems, and potentially letting them exfiltrate your IP is a hazardous but necessary venture.
The thing is, none of these things really changed with AI. People have been lying about their experience for literally centuries. IMO the advent of AI-laden candidates is going to nudge the hiring process back to how we did it 10 years ago, with a good old fashioned face-to-face interview and whiteboard questions. This means a lot of the things we've grown accustomed to in the past 5 years are going to have to melt away.
- people are probably going to have to fly out for interviews, again.
- awkward and/or neurodivergent people are going to have to learn social skills again.
- And yeah, you guys, it's time to buy a suit.
Companies should consider reverting to forking over the upfront $1,300-1,500 for a set of plane tickets for their hiring team and rented conference rooms for a week. It's a whole lot cheaper than spending $50k because you hired the wrong person for half a year.
by jamesgasek on 4/8/25, 11:56 AM
This shouldn't be surreal at all. A candidate just wasn't able to make up relevant experiences on the spot.
by iamleppert on 4/8/25, 3:35 PM
If they say they don't remember, that's a red flag. If they can't describe how something works, that's a bigger red flag. You're not looking for photographic memory, but it's very obvious once you do it a few times who is real and who is lying.
It's common sense, if you don't put in at least a tiny bit of effort in your hiring process, you can only expect to attract similar low effort candidates.
by neilv on 4/7/25, 6:40 PM
1. Don't use blur to redact documents. Whatever blur was used can probably be reversed.
2. Don't try to hide the identity of someone you're talking about by redacting a few details on their resume. With the prevalence of public and private resume databases, that's probably easy to match up with a name.
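Point 1 deserves emphasis. Here's a toy sketch (not a real attack tool) of why blur and pixelation leak: a mosaic averages blocks of pixels, and if an attacker can re-render candidate values the same way, they can match the blurred output against each candidate (the "Depix-style" brute force). The "rendering" below is just character codes standing in for glyph rasterization, and all names and values are invented for illustration.

```python
def render(text):
    """Stand-in for rasterizing text: one 'pixel' per character code."""
    return [ord(c) for c in text]

def pixelate(pixels, block=4):
    """Average each block of pixels, like a mosaic/blur redaction."""
    return [sum(pixels[i:i + block]) / len(pixels[i:i + block])
            for i in range(0, len(pixels), block)]

def deblur(redacted, candidates, block=4):
    """Brute force: pixelate every candidate and return the ones that match."""
    return [c for c in candidates if pixelate(render(c), block) == redacted]

secret = "salary: 180000"
leaked = pixelate(render(secret))              # what the "redacted" doc shows
guesses = [f"salary: {n}000" for n in range(100, 300)]
matches = deblur(leaked, guesses)              # narrows 200 guesses to a handful,
                                               # the true value among them
```

Even this crude model collapses 200 candidates to the few whose blur collides with the secret's; real tools working on actual glyph bitmaps do far better, which is why solid-box redaction is the only safe option.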
by VohuMana on 4/7/25, 8:02 PM
I think at this point we are in a world where the cat is out of the bag, and the question is no longer whether you're using AI but how you're using it. I personally don't care if a candidate wants to use AI, but be up front about it and make sure you still understand what it is doing. If you can't explain what the code it generated is doing and why, then you won't be able to catch the mistakes it will eventually make.
by Foofoobar12345 on 4/7/25, 9:17 PM
What we do instead is send out a test - something like a mental ability test - with hundreds of somewhat randomized questions. Many of these are highly visual in nature, making them hard to copy-paste into an AI for quick answers. The idea is that smarter candidates will solve these questions in just a few seconds - faster than it would take to ask an AI. They do the test for 30 minutes.
It’s not expected that anyone finishes the test. The goal is to generate a distribution of performance, and we simply start interviewing from the top end and make offers every week until we hit our hiring quota. Of course, this means we likely miss out on some great candidates unfortunately.
We bring the selected candidates into our office for a full day of interviews, where we explicitly monitor for any AI usage. The process generally appears to work.
On a different note, things are just getting weird.
by gwbas1c on 4/7/25, 8:50 PM
What I don't understand is, what did the candidate do with AI? Did they use the AI as a coach? Did they use it to suggest edits to the resume?
---
I once interviewed a candidate who was given my questions in advance. (I should point out that it was quite time consuming for me to design an interview, so I couldn't just make up new questions for every candidate.)
When the candidate started taking the "schoolboy" tone of a well-rehearsed speech, I realized that they had practiced their answers, like practicing for an exam. I immediately threw in an unscripted question, got the "this wasn't supposed to be on the test" response, and ended the interview.
by dakiol on 4/8/25, 6:36 AM
I think the message here is: don’t ask for the moon, you are not Google.
by hbsbsbsndk on 4/7/25, 7:27 PM
Candidates who rely on AI seem to just be totally turning their brains off. At least a candidate who was embellishing in the old days would try to BS if they were caught. They could try and fill in the blanks. These candidates give plausible-sounding answers and then truly just give up and say "ummm" when you reach the end of their preparation.
I've been interviewing for 10+ years across multiple startups and this was never a problem before. Even when candidates didn't have a lot of relevant experience we could have a conversation and they could demonstrate their knowledge and problem-solving skills. I've had some long, painful sessions with a candidate who was completely lost but they never just gave up completely.
Developers I've worked with and interviewed who rely on AI daily are just completely helpless without it. It's amazing how some senior+ engineers have just lost their ability to reason or talk about code.
by myrandomcomment on 4/8/25, 5:37 PM
Interviewing is hard. Over the years the one thing I have learned is that for a technical role you want to interview people for how they THINK and REASON. This is hard and requires a time investment in the interview.
Back in the day, when interviewing people for roles in networking, data center design, etc., I used to start by saying: I am going to ask you a question, and unless you have seen this very specific issue before you will NOT know the answer, and I do not want you to guess. What I care about is whether you can reason about it and ask questions that lead down a path that gets you closer to an answer. This is the only technical question I will be asking, and you have the full interview time to work through it. I have had people with 4+ CCIE family certs (this is back when they were the gold standard) and 10 years of experience have no idea how to even reason about the issue. The candidates who could reason and work the problem logically became very successful.
For coding at my company now we take the same approach. We give candidates a problem with a set of conditions and a goal, and ask them to work through their approach, how they would go about testing it, and then have them code it in a shared environment of their choosing. The complexity of the problem depends on the level the candidate is interviewing for. For higher-level engineers, besides the coding, we include a system architecture interview: presenting a requirement, taking the time to answer any questions, and then asking the candidate how they would implement it. At the end we do not care if it compiles; what we care about is whether the candidate approached the problem reasonably. Did they make sure to ask questions and clarifications when needed? Did their solution look reasonable? Could they reason about how to test it? Did their solution show that they thought about the question - i.e., did they take the time to consider and understand before jumping in?
Anyone can learn to code (for the most part). Being able to think, on the other hand, seems to be something that is in short supply.
by msravi on 4/8/25, 7:25 AM
by lysecret on 4/8/25, 12:21 PM
by inertiatic on 4/7/25, 6:24 PM
Not being able to remember small details about certain projects is also perfectly fine for people who have worked for more than a couple of years. Unless you can discover a pattern of lying like the author supposedly did, I would just be perfectly fine moving on to another topic.
by ChicagoBoy11 on 4/7/25, 6:45 PM
by neilv on 4/7/25, 7:03 PM
That was typical before some students got handed a lot of dotcom boom money.
(And then somehow most interviews throughout the industry became based on what a CS student with no experience thought professional software development was about. Then it became about everyone playing to the bad metrics and rituals that had been institutionalized.)
You can ask questions based on a resume without them disclosing IP, nor the appearance of it.
That resume-based questions thwarted a cheater in this case was a bonus.
by disambiguation on 4/8/25, 12:02 AM
I assume the folks at Kapwing are monitoring the responses, so if you're really open to ideas then I offer the following for your consideration:
The best interview I've had to date has been a live debugging challenge. Given an hour, a printed sheet of requirements, and a mini git repo of mostly working code, try to identify and solve as many bugs as possible, with minimum requirements and bonus goals for the ambitious.
This challenge checks all the boxes of a reliable and fair assessment. It can't be faked by bullshittery or memorized leetcode problems. It's in person, so cheating and AI are out of the equation, but more importantly it allows for conversation, asking questions, sharing ideas, and demonstrating, rather than explaining, their problem-solving process. Finally, it's a test that actually resembles what we do on a daily basis, rather than the typical abstract puzzles and trivia that look more like a bizarre IQ test.
Stumbling upon this format was such a revelation to me, and I'm stunned it hasn't been more widely adopted. You'll meet many more "Sams" as your company grows - many will fool you, some already have. But a well designed test doesn't lie. It's up to you and your company to have the discipline to turn down cheap and easy interviewing tactics and do things the right way.
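To make the format concrete, here is a hypothetical sketch of the kind of planted bug such a repo might contain (the function name and spec are invented for illustration). The requirements sheet would say: "moving_average(xs, window) returns, at each position, the mean of the last `window` values."

```python
def moving_average(xs, window):
    """Buggy version handed to the candidate."""
    out = []
    for i in range(len(xs)):
        # BUG: this slice takes up to window + 1 values once i >= window.
        chunk = xs[max(0, i - window):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def moving_average_fixed(xs, window):
    """The fix a candidate is expected to find and justify."""
    out = []
    for i in range(len(xs)):
        chunk = xs[max(0, i - window + 1):i + 1]  # exactly `window` values
        out.append(sum(chunk) / len(chunk))
    return out
```

The off-by-one only shows up once the window fills, so spotting it requires reading against the spec rather than pattern-matching - exactly the kind of conversation this interview format is built to produce.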
by crazygringo on 4/7/25, 9:18 PM
People have been lying about their experience since time immemorial. You don't need an AI to do it, you can just ask a friend with experience to invent a few plausible projects you could have worked on, and solutions you might have found. Or just look at a bunch of resumes online and read a few blog posts of people describing their work.
I'm not surprised this happened. I'm surprised by why the author was surprised. Maybe "Sam" was exceptionally bad at "faking it" in person, but I've done tons of interviews where the candidate had exaggerated their experience and couldn't answer basic questions that they should have been able to.
Honestly, this is why some companies do whiteboard coding interviews before getting to the interviews about experience, because it does a decent initial job at filtering out people who have no idea what they're doing.
by lurker919 on 4/7/25, 6:37 PM
by theamk on 4/4/25, 3:21 AM
I've also had an AI cheater during phone screen, but they were pretty clumsy... A question of form "You mentioned you used TechX on your resume, tell me more what you did with it" was answered with a long-winded but generic description of TechX and zero information about their project or personal contribution.
Another thing that I can take away from that is that a "take home project" is no longer a good idea in AI times - the simple ones that candidates can do in reasonable time are too easy for AI, and if we do a realistic system, it's too hard for honest candidates.
by a_t48 on 4/8/25, 3:34 PM
by ed_mercer on 4/8/25, 10:59 AM
Was it really necessary to take the moral high ground and lecture the candidate? As if companies are honest and well-meaning in interviews. You caught him and that's the end of it.
by forthwall on 4/7/25, 10:00 PM
What happened though was the candidate decided to paste the entire challenge prompt into Cursor, and I watched Cursor fail at completing the assignment. I tried to nudge them to use their own skills or research abilities, but alas, it did not come to fruition, and I had to end the interview.
The crazy part was they had 8 years of experience, so they have definitely worked without AI before. It was very strange that they did that, especially since they remarked that the challenge was going to be easy.
by UncleOxidant on 4/7/25, 8:38 PM
by tetromino_ on 4/7/25, 8:52 PM
by robocat on 4/7/25, 9:41 PM
Is that really good advice?
If you have the wisdom of knowing when to embellish and when to blur, then you're more likely to get a job and more likely to fit in.
I'm on the spectrum, and generally I'm over-truthful, and I notice this habit regularly affects me negatively.
by vonneumannstan on 4/7/25, 6:47 PM
by oulu2006 on 4/8/25, 5:36 AM
Twilio indeed can't handle batching of SMS requests -- even to this day several years after I asked them to :)
To be specific, what I want is what sendgrid offers, copy + replacements, so I can send the copy I want to send, a list of recipients and a list of replacements for each recipient in a single request.
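For concreteness, here is a sketch of the "copy + per-recipient replacements in one request" shape described above, modeled on SendGrid's v3 mail-send payload (one `personalizations` entry per recipient, each with its own `substitutions`). The `personalizations`/`substitutions` field names come from SendGrid's API; the recipients and values are made up, and this only builds the request body - no network call.

```python
def batch_payload(copy, subject, sender, recipients):
    """One request body covering every recipient with their own replacements."""
    return {
        "from": {"email": sender},
        "subject": subject,
        "content": [{"type": "text/plain", "value": copy}],
        # Each entry pairs one recipient with the replacements for the copy.
        "personalizations": [
            {"to": [{"email": r["email"]}], "substitutions": r["subs"]}
            for r in recipients
        ],
    }

payload = batch_payload(
    copy="Hi -name-, your order -order- has shipped.",
    subject="Shipping update",
    sender="noreply@example.com",
    recipients=[
        {"email": "a@example.com", "subs": {"-name-": "Ana", "-order-": "1001"}},
        {"email": "b@example.com", "subs": {"-name-": "Ben", "-order-": "1002"}},
    ],
)
```

The point of the complaint: one POST carries the shared copy plus N small replacement maps, instead of N full messages - which is what batched SMS would look like too.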
by ChrisMarshallNY on 4/8/25, 10:46 AM
I would probably have been fooled by the applicant's screening interview, but it would have rapidly come apart in the ensuing steps.
My team was a very small team of high-functioning C++ programmers, where each member was Responsible for some pretty major functionality.
This kind of thing might be something they could get away with in larger organizations, where they would get lost in the tall grass, but smaller outfits (especially ones where everyone is on the critical path, like startups) would expose the miscreant fairly quickly.
by minimally on 4/8/25, 9:41 PM
Yes, developers use AI in 2025, and this will only increase as the technology gets better. Shaming the use of AI is like taking away a plumber's toolbox because you'd prefer they work with their hands alone. Developers at all levels have a use for AI, and given two developers with the same skill level, why wouldn't you prefer the one who can use AI as a tool?
If you are already hiring an engineer on their output over their comprehension, rate the output that they give you.
by cynicalsecurity on 4/7/25, 9:03 PM
I can't stop repeating it, just invite the candidate to your office. That's it, that's how simple the problem is solved.
by janalsncm on 4/7/25, 7:14 PM
by Aurornis on 4/7/25, 6:22 PM
The AI was used as a tool to generate false stories, but that's not what I assumed when I read the title. It's common for people to "prepare" with LLMs by having them review resumes and suggest changes, but asking an LLM to wholesale fabricate things for you is something else entirely.
I do think this experience will become more common, though. There's an attitude out there that cheating on interviews is fair or warranted as retaliation for companies being bad at interviewing. In my experience, the people who embrace cheating (with or without LLMs) either end up flaming out of interview processes or get disappointed when they land a job and realize the company that couldn't catch their lies was also not great at running a business.
by gregncheese on 4/8/25, 6:11 AM
To add to your experience, I became increasingly suspicious of the "perfect fit" resumes. It's insane how so many people just put the right keywords. I think it might work to pass in larger companies, where HR uses automated systems to triage applicants.
by kazinator on 4/7/25, 8:20 PM
by jorgesborges on 4/7/25, 7:08 PM
by bitlad on 4/7/25, 6:21 PM
by Juliapierson1 on 4/13/25, 8:04 AM
by crabbone on 4/8/25, 11:58 AM
Oh my... I don't think I've ever seen a resume that didn't embellish or straight up lie about the applicant. AI does make lies more convincing and lets people go further with them, though.
Also, I'm impressed and upset that it takes so much effort to get a job doing something that sounds like entry-level Node.js / React stuff :( And the effort on the part of the applicant to manufacture this fake identity and experience to apply for this kind of job... and they are a master's student! Like... shouldn't that alone qualify you for this kind of low-stakes, undemanding job?
by ofrzeta on 4/7/25, 10:15 PM
by atoav on 4/8/25, 5:30 AM
by rDr4g0n on 4/8/25, 7:14 PM
Scamming may not be new, but a person using AI in this way is able to penetrate quite deeply into a (long, tedious, time-consuming) interview process if folks aren't keeping an eye out for it (and this article, like many personal experiences, indicates that people aren't yet). Having an AI voice in your ear, rapidly providing you answers in real time, is something new; at least in terms of how easily accessible it is.
It's amazing to me that folks have the audacity to come to interviews like this. I think some candidates genuinely feel that it is a reasonable thing to do along the lines of stuffing their resumes with keywords to get through the various recruiter filters. It's like hey, everyone in baseball is doping, so I have to do it to keep up!
The behaviors are obvious once you've seen them before, but as an engineer and not a "talent acquisition" person, I feel deeply uncomfortable implying that some candidate I'm interviewing is lying or cheating, so it took me a bit to speak up about it.
These types of articles need to continue to come out and the conversation elevated, if just to save some poor devs hours of interviews with candidates who were able to bluff their way through the less technical initial conversations.
by sashimimono on 4/8/25, 8:45 PM
Remember, you are trying to hire a ${coder, admin, }, not the next TV news presenter; being on screen is not a mandatory skill in most jobs.
By asking for something that makes people uncomfortable, you will exclude a lot of likely brilliant candidates.
People who refuse to do video interviews may be, for example:
- people who value privacy, not only their own, but most likely yours too
- people who feel very uncomfortable being watched by strangers, and who think or even know that they will perform significantly worse than in an audio-only interview situation
- people who simply don't own a camera
- people who use text-only computers off the job
- people who have experienced that your 'standard' video-chat app may not work, maybe because they use Linux, BSD, OS/2 or nonstandard operating systems
- people who don't have broadband internet (yes, there are still people like that)
- people who pay for every bit sent (and yes, a not-so-cheap phone/internet contract is still common in some areas)
- people who feel uncomfortable letting strangers into their bedroom, even virtually
- people who have disabilities or cosmetic issues that they fear may distract you
- people who have disabilities where moving and out-of-sync pictures distract them
- people who tend to refuse unreasonable requests, and who therefore regard you as unqualified to be their next employer
- ...
All of them have good reasons not wanting video interviews.
You, as an employer, may miss your best fit.
by Glyptodon on 4/7/25, 7:01 PM
Not my favorite AI driven change as I think live coding is so high pressure it can give wrong signals.
by thih9 on 4/8/25, 6:55 AM
Off topic, why have such a take home exercise then?
by acjacobson on 4/8/25, 11:33 AM
One candidate was absolutely stumped and could not answer why and when they became interested in technology. They couldn't say anything about themselves personally. It was baffling.
by speckx on 4/8/25, 6:10 PM
All the candidates did really well on the online intake questions and the general meet and greet over video. However, once they arrived for the in-person part of the interview and it got relatively technical, most did nowhere near as well as they had online. Only one or two admitted to using AI.
by ipunchghosts on 4/7/25, 11:46 PM
by qoez on 4/7/25, 6:48 PM
by yieldcrv on 4/7/25, 7:06 PM
so all I can say is fix your assessments, because this whole "they cheated" idea isn't universal, and more likely matches what people do on your job already
but for anyone that didn't read this article yet, this one is just about embellished experience custom tailored to get the interview, and there was no technical assessment
by mvdtnz on 4/7/25, 9:14 PM
So why bother with it?
by austin-cheney on 4/8/25, 11:19 AM
The people in these positions are scared to death to write original code and then have the balls to whine about people who use AI to provide unoriginal answers.
by fgonzalesv on 4/13/25, 9:21 PM
by thaumasiotes on 4/8/25, 7:12 AM
Why are we calling these "phone screens"?
by JSR_FDED on 4/8/25, 7:42 AM
by coolThingsFirst on 4/8/25, 9:30 AM
Except it doesn’t. If he hadn’t stretched the truth in his bombastic resume, he would never have received an interview.
I will defend him because companies do the same thing of stretching the truth.
by j45 on 4/9/25, 5:58 PM
Because the preparation ended up not being sufficient.
Assuming people doing the hiring can be outsmarted in all cases like this is part of the problem.
Maybe 'preparation' can evolve into candidates asking AI for a crash course and a way to start actually using the technology, instead of just talking the talk.
It never ceases to amaze me that people are surprised it's hard to BS your way through tech jobs at tech companies. Maybe it works with tech positions at non-tech companies.
by Clubber on 4/7/25, 9:44 PM
I have no doubt as well, but I couldn't help but noticing, "Don't bother with take home tests," wasn't on the list of remedies.
by mk89 on 4/8/25, 7:10 PM
I think you're drawing the wrong conclusions from this experience, and if you believe they're right, it means you weren't interviewing before AI.
It was exactly like that. The only difference was the lack of availability of tools that can give you the answer right away, fake the voice, etc.
But even then, if it stinks, trust your gut.
by veunes on 4/8/25, 6:58 AM
by slcjordan on 4/8/25, 11:32 AM
It just makes me wonder about the importance that an understanding and commitment to ethics will play as people start to use AI more and more in their daily life.
by outside1234 on 4/8/25, 5:16 PM
by Bluescreenbuddy on 4/8/25, 3:28 PM
by giantg2 on 4/8/25, 1:05 PM
by foobahify on 4/7/25, 7:55 PM
by rdtsc on 4/7/25, 6:44 PM
Well who are they? How would the next member of the community know this is a fake candidate. I like the idea in general of finding a way to eliminate these time-wasters but how would that work? The candidate can adjust a bit and improve the AI "foo" to come up with online answers for them.
by feverzsj on 4/8/25, 6:27 PM
by designAndCode on 4/8/25, 3:28 AM
by yahoozoo on 4/8/25, 11:45 AM
by nsonha on 4/8/25, 1:09 PM
Actually, it would be interesting if the interviewer had an AI to counter these tactics.
by p0sixlang on 4/9/25, 3:08 AM
by intalentive on 4/8/25, 4:07 PM
by greenavocado on 4/7/25, 6:49 PM