People are avoiding jobs with AI interviewers
Last updated: 2025-08-05
A friend's horror story
My friend Sarah told me about her recent interview experience that perfectly captures why people are actively avoiding AI-driven hiring processes. She applied for a marketing role at a mid-sized tech company and was excited about the opportunity. But when she clicked the interview link, instead of meeting with a human, she was greeted by an AI system that analyzed her facial expressions, voice patterns, and responses to behavioral questions. She felt like she was being evaluated by a robot that couldn't understand context, humor, or the nuances that make someone a good fit for a team.
After hearing her story and seeing similar discussions online, I started paying attention to how widespread this phenomenon has become. People aren't just uncomfortable with AI interviews – they're actively screening them out during their job searches.
What's driving the avoidance
Through conversations with job-seeking friends and observations from hiring threads on various forums, I've noticed several consistent concerns:
- The uncanny valley effect: There's something deeply unsettling about being evaluated by a system that mimics human conversation without actually understanding it. People report feeling like they're talking to an advanced chatbot that's pretending to care about their career aspirations.
- Algorithmic bias fears: Many candidates worry that AI systems perpetuate hiring biases in ways that are less transparent than human bias. At least with a human interviewer, you can sense their reactions and potentially address concerns directly.
- No room for context: Human interviewers can understand when someone had a bad day, when technical issues affected their performance, or when a candidate's background doesn't fit neatly into standard categories. AI systems tend to reduce people to data points.
- Performance anxiety amplification: Many people find AI interviews more stressful than human ones because they can't read social cues or build rapport. You're essentially performing for a black box that gives no feedback.
- Privacy concerns: The idea that facial expressions, voice patterns, and behavioral metrics are being recorded and analyzed feels invasive to many candidates, especially when they don't know how this data will be stored or used.
How people are dodging them
Job seekers have developed various strategies to avoid AI interviews entirely:
- Pre-screening questions: Many people ask about the interview process during initial phone calls, or research a company's hiring practices online before applying.
- Company research: Candidates are checking employer review sites like Glassdoor specifically for mentions of AI-driven interviews and avoiding those companies.
- Network referrals: Some people are focusing exclusively on opportunities that come through personal connections, assuming these are less likely to involve AI screening.
- Industry selection: Certain sectors (especially traditional industries) are less likely to use AI interviews, so some candidates are adjusting their target industries accordingly.
The unintended consequences
This avoidance behavior is creating some interesting market dynamics that companies probably didn't anticipate:
- Talent pool filtering: Companies using AI interviews may be inadvertently selecting for candidates who are either less informed about the process or more willing to tolerate impersonal interactions. Neither trait necessarily correlates with actual job performance.
- Diversity impact: If certain demographic groups are more likely to avoid AI interviews due to privacy concerns or previous negative experiences with algorithmic bias, these systems might actually reduce diversity despite being designed to improve it.
- Competitive disadvantage: Companies that rely heavily on AI screening might find themselves competing for a smaller candidate pool, while companies with human-centered processes get access to candidates who actively avoid AI systems.
What I think companies are missing
From my perspective, the push toward AI interviews optimizes for the wrong metrics. Yes, these systems can process more candidates faster and supposedly reduce human bias. But they might also be screening out candidates who value human connection, have strong emotional intelligence, or simply prefer to work for organizations that prioritize people over efficiency.
The irony is that many of these companies are simultaneously trying to build "people-first" cultures while using recruitment processes that explicitly deprioritize human interaction. The disconnect sends a strong signal about company values that many candidates are picking up on.
A hybrid approach might work better
I'm not completely anti-AI in hiring, but I think the current implementations miss the mark. A more thoughtful approach might involve:
- Optional AI pre-screening: Let candidates choose whether to do an AI interview or go straight to human screening, recognizing that some people perform better in different formats.
- Transparent AI augmentation: Use AI to help human interviewers prepare and take notes, rather than replacing human judgment entirely.
- Bias detection tools: Deploy AI to identify potential biases in human interview feedback rather than making the hiring decisions directly.
- Candidate choice: Always offer a human alternative for candidates who prefer it, even if it means a longer process.
The bigger picture
The trend of avoiding AI interviews reflects a broader tension in how we integrate artificial intelligence into human-centered activities. Hiring is fundamentally about human relationships – whether someone will fit with a team, contribute to company culture, and thrive in a particular environment. These are nuanced judgments that benefit from human insight.
Companies that recognize this and find ways to use AI as a tool to enhance rather than replace human judgment in hiring will likely have better outcomes – both in terms of candidate experience and actual hiring quality.