A job seeker named Conor applied for a content architecture position and received an immediate interview offer, only to discover he was being interviewed by a poorly programmed AI system that couldn’t provide basic job details. After the interview, he received an email promoting “mock interviews with an AI interviewer,” leading him to suspect the entire job posting was a fake designed to generate leads for a new product from Alex, the startup behind the interview software.
The big picture: Alex, a $2.8 million startup founded by Brown University dropout John Rytel and former Facebook AI employee Aaron Wang, appears to be using fake job listings to funnel candidates into its AI interview platform, highlighting broader concerns about deceptive practices in AI-powered hiring.
Key details: The startup graduated from Y Combinator, a prominent startup accelerator, in winter 2024 and raised $2.8 million in seed funding led by 1984 Ventures.
• When Futurism reporters applied for the same position, they experienced identical issues: a glitchy AI that refused to provide job details and the same promotional email afterward.
• Alex’s AI denied knowledge of the mock interview service when asked directly, despite automatically sending promotional emails about it.
• The company’s mock interview landing page initially showed only “coming soon” on a blank white page.
What they’re saying: Conor described the experience as fundamentally deceptive and exploitative.
• “It just feels like a new approach to a scam,” he said.
• “I felt so terrible about the interview. They kind of just took all of my information and just didn’t provide me anything about the company.”
Why this matters: The incident exemplifies growing concerns about AI-mediated hiring processes that alienate job seekers in an already challenging market.
• A June report by the Ludwig Institute for Shared Economic Prosperity found nearly 25% of Americans are “functionally unemployed,” with many discouraged workers calling AI interviews an “added indignity.”
• Earlier this year, 62% of US job seekers said they’d be turned off from applying to jobs that use AI recruitment software.
The discrimination problem: AI hiring tools carry significant bias risks that can exclude qualified candidates.
• Researchers found AI hiring programs favor names associated with white applicants 85% of the time.
• A February Harvard study concluded that applicants with internal referrals—typically from well-connected or high-income backgrounds—were most likely to succeed with AI-gatekept positions.
• According to the Washington Center for Equitable Growth, automated hiring systems often exclude qualified applicants while confounding both companies and job seekers.
Company response: Alex didn’t respond to questions about the story but eventually updated its mock interview site to include implausible job listings like “Space Tourism Guide,” “Professional Athlete,” and “Mayor of Springfield, IL.”
• However, clicking “Launch Interview” on these listings caused the site to crash and become unusable.