[Image: Two people sitting in modern armchairs facing each other, their heads replaced by colorful digital glitch effects, representing AI's interference in human interaction]

The Human Factor: Navigating AI in Modern Job Interviews

by Bruno Gil

Interviews are, by nature, complex. Candidates seek to highlight their strengths and minimize their weaknesses; interviewers, in turn, want to understand exactly what is solid and what is not. It's not about rejecting someone for failing to distinguish RabbitMQ from Kafka, but about assessing the strength of their fundamentals, which is, in practice, the most important asset a developer can have, along with the ability to apply lessons learned in one context to solve problems in another.

The AI Revolution in Hiring

The statistics paint a clear picture of transformation. According to research from DemandSage, 87% of companies now use AI in recruitment, and Resume Builder projects that 70% of companies will use AI for hiring by the end of 2025. Meanwhile, 24% of companies currently use AI for the "entire interview process," with that number projected to increase to 29% by 2025.

With the rapid adoption of artificial intelligence tools, it has become natural to use them in interviews, whether through text or voice. This often turns candidates into mere readers of pre-generated answers. The practice itself is not new: in virtual interviews, candidates have always been able to turn to Google. What has changed is the ease of doing so, and the growing tendency to rely on AI even for topics the candidate already knows.

The Scale of AI Usage by Candidates

The reality is more widespread than many employers realize. One tech leader reported that 80% of candidates used large language models to complete their top-of-the-funnel code test, even though they were explicitly told not to. Recent data from HRD America shows that 1 in 5 employees admit to using AI in job interviews, while other studies suggest the actual numbers may be significantly higher.

Capterra's research revealed that 40% of employees use AI to write or refine their resume, 33% use it to improve their cover letter, 31% use it for mock interviews, and 28% use it to generate interview answers.

The Double-Edged Reality

It's important to make a distinction. Using AI as support in hiring processes is not inherently a problem: we all use it in our daily work, whether to generate code or to help understand complex functions in technical challenges. The problem arises when technology is used to project a deeper knowledge than the candidate actually has. This leads to two equally negative outcomes:

1. The interviewer detects the misuse of AI and dismisses the candidate.

2. The candidate is approved based on knowledge that cannot be sustained in practice.

Both scenarios create losses. In the first, the company may miss out on someone with strong potential, and the candidate loses the opportunity to join an interesting project. In the second, once the candidate's true skills are revealed, credibility is lost, and the team realizes it has not acquired the capabilities it expected through the hire.

The Corporate Response: Back to Basics

Organizations are adapting rapidly. According to a recent Gartner survey, 72.4% of recruiting leaders reported they are currently conducting in-person interviews to combat fraud. Major companies including Google, Cisco, and McKinsey & Co. have all reinstituted in-person interviews for some job candidates over the past year.

However, punishing the use of AI entirely is counterproductive. Rather than restricting it, the best approach is to observe when the technology is being used and then probe deeper in those areas. As advanced as these tools are, when the context shifts or more detail is required, consistency tends to break down. That is when interviewers can distinguish between someone who has real-world experience and someone who is simply reproducing a pattern.

The Candidate Paradox

While 66% of U.S. adults hesitate to apply for AI-screened roles, many of these same individuals use AI tools themselves during the application process. This creates a complex dynamic: candidates are simultaneously wary of being evaluated by AI and eager to be assisted by it.

Candidates must understand the risks. In more conversational interviews, honesty becomes a differentiator. It is unrealistic to expect anyone to know everything, especially professionals who have spent much of their careers in large