I'm all for anything that moves us closer to making LeetCode quizzes, whiteboard hazing, high-pressure coding challenges, complex take-home projects, and intrusive interview-proctoring malware obsolete. Interviewing for a tech job has become a total nightmare from the candidate's point of view. Not that I have a solution—if I did, I'd be selling it.
Note that the privacy policy states that they will share the captured audio and screen grabs with third parties. So anyone you interact with who's using this will draw your content and audio into their ecosystem. Kinda like Microsoft Recall, only sexier.
If our interviews are so broken that a worse candidate could get a job over me by cheating with AI, I honestly just want to do something different. It’s just a race to the bottom at that point.
What this trend is going to push is more contract-to-hire, where companies hire people as contractors for a month and let go anyone who doesn't perform as expected.
This exists today, and I've always refused to play the game. But I guess companies can either go back to in-person interviews or find other methods.
The only candidates who would accept such an abusive offer are those who are desperate and don't already have a decent job. So it can sort of work to put butts in seats but you're already filtering out many of the best candidates.
I suspect that in California (and probably much of the rest of the US), a person on the contract phase of contract-to-hire could make a good legal case that they are actually an employee.
Is it that hard to have something "undetectable" visible on your own monitor without the video-conferencing software seeing it? If it's just an AI chatbot that's actively listening or reading the screen, keeping it viewable to only one party seems rather trivial.
Soon, video conference software will have “anti-cheat” features at parity with Steam and video game consoles.
There is already a huge anti-cheat industry built around college-level testing and certifications, including live, remote proctors who watch and flag anything "suspicious". So build it into Zoom and Teams with hooks to detect third-party add-ons, and whatever the dude runs on his own computer will be detectable to the remote side.
Folks will just intercept webcam feeds to spoof their eye movements and cheat on another device.
Fundamentally, companies cannot avoid the expense of flying candidates on-site and interviewing them in person anymore. It's kind of a miracle they ever could.
Some remote home-testing software also requires connecting a second camera so that a proctor can watch the user and their screen. Now, in theory it might be possible to intercept and modify that video stream in real time to filter out evidence of cheating, but that would be quite difficult to do with high fidelity.
I've caught a couple of people using AI, which I found pretty surprising. It's possible people I've hired used it, but the people I've caught were so damn obvious it's hard to imagine much success.
Now, AI to bypass recruiters and folks hiring by keyword? Yes please, I would absolutely pay for that, and frankly I'd probably pay for it on the hiring side as well. Most recruiters I've worked with really struggle with what to look for in a candidate; the ones with an engineering background are worth at least 4x those without.
You called it.
Fact is, whoever can exploit AI the best is who a company needs. Things are going to move fast for a while ahead, and there is going to be a wave of "orphaned technology". Anybody talking about ethics, morality, legality, etc. should stick to literature and art history.
Columbia has it right. In a just world, this behavior would be derided and considered anathema. This is moral bankruptcy, plain and simple, and ought to be considered a violation of the CFAA.
Even if you are against modern interviewing practices, you should be more against deceiving your prospective coworkers. This is borderline sociopathic.