
AI Hiring Is Now a Legal Risk — What Employers Need to Know About the Eightfold AI Lawsuit

AI in hiring has officially crossed a new threshold. On January 21, 2026, a proposed class action lawsuit was filed in California against Eightfold AI, alleging that the company’s hiring algorithms were used to evaluate candidates without proper notice, transparency, or access, potentially violating federal consumer protection laws and California’s Fair Employment & Housing regulations. For anyone using systems like Eightfold AI outside of the US, the EU Artificial Intelligence Act has likely been breached.

The complaint, brought by Erin Kistler and Sruti Bhaumik, argues that Eightfold AI produced hidden candidate “scores” used by major employers, including Microsoft, PayPal, Salesforce, and Bayer, to screen applicants. The women claim the system denied them job opportunities in STEM fields without ever allowing them to see, understand, or challenge the AI-generated evaluations. 

This case marks a turning point. 

Whether or not the lawsuit succeeds, the message to employers is unmistakable: 

AI hiring systems are no longer just about efficiency.  They are now a compliance, fairness, and trust issue. 

 

What the Lawsuit Alleges 

According to the complaint and related reporting, Eightfold AI’s tools: 

  • Scored job applicants from 0–5 based on their predicted “likelihood of success.” 

  • Used sensitive personal data — including social media profiles, location data, device activity, cookies, and other tracking signals. 

  • Operated secretly, with applicants allegedly receiving:  

    • No disclosure 

    • No consent 

    • No opportunity to correct errors 

    • No copy of the report influencing their job prospects 

“There is no meaningful opportunity to review or dispute Eightfold’s AI-generated report before it informs a decision about one of the most important aspects of their lives — whether or not they get a job.” 

The filing also underscores that: 

“There is no AI exemption to these laws.” 

If the court finds that Eightfold’s algorithmic rankings qualify as “consumer reports” under existing U.S. law (such as the Fair Credit Reporting Act), then the entire category of opaque AI hiring systems may fall under strict regulation. 

Why This Matters for Employers 

The tools implicated in the lawsuit were reportedly used by large brands including Microsoft, PayPal, Salesforce, Bayer, and others, showing how widespread AI-driven candidate evaluation has become. 

The message for employers is clear: 

AI in hiring is not a technical shortcut. It is a regulated activity with legal consequences if misused. 

Three Key Risks Employers Should Now Recognize 

1. Compliance Risk: Undisclosed or Unreviewable AI = Legal Exposure 

If your hiring AI makes decisions that applicants cannot see or challenge, your organization may violate consumer-protection and fair-assessment rules, even unintentionally. 

2. Data Risk: AI Systems Can Pull More Than You Realize 

Tools that ingest social media, device data, location information, or browsing behaviors create high-stakes privacy obligations. 

3. Bias & Fairness Risk: “Black Box” Scoring Can Discard Qualified Candidates 

Opaque AI scoring may eliminate strong applicants before a human ever looks at them, increasing the likelihood of legal claims and damaging the employer’s brand. 

 

How Employers Can Reduce Their AI Hiring Risk Today 

To safely adopt AI, organizations need workflows that are defensible, auditable, and transparent — not just fast. 

Here are the pillars every employer should implement: 

1. Transparency by Design 

Applicants should always know: 

  • When AI is used 

  • What it evaluates 

  • What data it relies on 

Transparency builds trust and reduces regulatory exposure. 
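
As a rough illustration of what “transparency by design” can mean in practice, here is a minimal sketch of an applicant-facing disclosure record captured before any automated evaluation runs. The field names are hypothetical, not taken from any particular vendor, tool, or regulation:

```python
# A sketch only: hypothetical field names for an applicant-facing
# AI-use disclosure, recorded before any automated evaluation runs.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AIDisclosure:
    applicant_id: str
    tool_name: str            # which AI tool will be used
    evaluates: list[str]      # what the tool assesses
    data_sources: list[str]   # exactly which data it relies on
    disclosed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    consent_given: bool = False

# Example: disclose before the skills assessment is scored.
disclosure = AIDisclosure(
    applicant_id="A-1042",
    tool_name="skills-assessment-scoring",
    evaluates=["technical test score", "relevant experience"],
    data_sources=["application form", "skills test results"],
)
```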

2. Human-in-the-Loop Decision-Making 

AI should augment hiring decisions, never replace them. 
A human reviewer should always be part of the final evaluation. 
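
One simple way to make that rule enforceable in software, offered as a sketch rather than a prescription, is to treat the AI output as advisory data that can never finalize an outcome on its own. The structure and names below are hypothetical:

```python
# A sketch only: the AI score is advisory, and no outcome is final
# without a recorded human decision. Names are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CandidateReview:
    candidate_id: str
    ai_score: Optional[float] = None      # advisory signal, never decisive
    human_decision: Optional[str] = None  # "advance" or "decline"
    reviewer_id: Optional[str] = None

def final_outcome(review: CandidateReview) -> str:
    # The gate: without a named human reviewer, the outcome stays open.
    if review.human_decision is None or review.reviewer_id is None:
        return "pending human review"
    return review.human_decision

# Example: an AI score alone leaves the candidate pending.
print(final_outcome(CandidateReview("A-1042", ai_score=2.1)))  # pending human review
```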

3. Applicant Rights: Access, Explanation, and Correction 

If an algorithm generates a score or assessment, applicants should be able to: 

  • See it 

  • Understand how it affected outcomes 

  • Correct inaccurate information 

This mirrors existing fairness and consumer-reporting obligations. 
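
As a hedged sketch of what access, explanation, and correction rights could look like inside a hiring system, the structure below pairs a score with plain-language factors and a dispute channel. All names and values are illustrative:

```python
# A sketch only: a score paired with plain-language factors and a
# dispute channel the applicant can use before a decision is made.
from dataclasses import dataclass, field

@dataclass
class ApplicantReport:
    applicant_id: str
    score: float
    factors: dict[str, str]               # factor -> explanation in plain language
    disputes: list[str] = field(default_factory=list)

    def file_correction(self, claim: str) -> None:
        # Log the dispute for human review; the score should not be
        # relied on until the dispute is resolved.
        self.disputes.append(claim)

report = ApplicantReport(
    applicant_id="A-1042",
    score=3.8,
    factors={
        "skills test": "scored above the role benchmark",
        "experience": "4 of 5 required competencies evidenced",
    },
)
report.file_correction("My 2023 audit role was omitted from the experience review.")
```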

4. Reduce Data Collection to What Is Job-Relevant 

Avoid systems that pull data from: 

  • Social media 

  • Location tracking 

  • Browsing or device activity 

Use only job-related, validated inputs. 
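
One practical pattern, sketched here with hypothetical field names, is an allow-list: only validated, job-relevant inputs ever reach the scoring step, so tracking-derived signals are excluded by construction:

```python
# A sketch only: an allow-list of job-relevant inputs. Anything not on
# the list (social media, location, device signals) never reaches scoring.
ALLOWED_FIELDS = {"application_form", "skills_test", "structured_interview"}

def minimise_inputs(raw_inputs: dict) -> dict:
    """Keep only allow-listed fields; drop everything else before evaluation."""
    return {k: v for k, v in raw_inputs.items() if k in ALLOWED_FIELDS}

raw = {
    "application_form": {"years_experience": 6},
    "skills_test": {"score": 78},
    "social_media_profile": {"followers": 310},  # dropped
    "device_activity": {"sessions": 14},         # dropped
}
clean = minimise_inputs(raw)  # only application_form and skills_test remain
```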

5. Use Tools Aligned with Employment and Consumer Law 

Choose AI hiring vendors who proactively design: 

  • Audit trails 

  • Explainable scoring 

  • Bias testing 

  • Compliance documentation 

If a tool cannot explain its decisions, your legal team will not be able to defend them. 
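
Bias testing in particular can start simply. The sketch below applies the EEOC’s “four-fifths” guideline, flagging any group whose selection rate falls below 80% of the highest group’s rate; the counts are invented for illustration, and a real bias audit would go much further:

```python
# A sketch only: the EEOC "four-fifths" guideline as a routine check.
# Counts are invented for illustration; a real bias audit goes further.
def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    # outcomes maps group -> (number selected, number of applicants)
    return {group: selected / total for group, (selected, total) in outcomes.items()}

def four_fifths_flags(outcomes: dict[str, tuple[int, int]]) -> dict[str, bool]:
    rates = selection_rates(outcomes)
    best = max(rates.values())
    # Flag any group selected at less than 80% of the best-performing group's rate.
    return {group: (rate / best) < 0.8 for group, rate in rates.items()}

example = {"group_a": (40, 100), "group_b": (22, 100)}
print(four_fifths_flags(example))  # group_b flagged: 0.22 / 0.40 = 0.55, below 0.8
```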

A Better Path Forward: Faster Hiring Without the Legal Risk 

Employer demand for faster hiring is real — but speed cannot come at the cost of fairness, transparency, or compliance. 

Resources like our Hire Fast with Confidence Guide show that it is possible to accelerate hiring without: 

  • Losing qualified candidates to opaque AI filters 

  • Relying on undisclosed algorithmic scores 

  • Creating avoidable legal exposure 

The future of AI in hiring belongs to systems that are: 

  • Transparent by design 

  • Human-guided, not human-replaced 

  • Aligned from the ground up with employment and consumer-protection law 

 

Conclusion 

  • The Kistler & Bhaumik v. Eightfold AI lawsuit is more than a single case. It is a signal of what’s coming in 2026 and beyond: 
    AI hiring will be scrutinized in the way credit reporting, background checks, and other regulated decision systems are today. 

  • Employers who prioritize transparency, fairness, and compliance now will be the ones who navigate this shift successfully — and earn greater trust from candidates in the process. 

 

Want to see if Accountests will work for your firm?  

Steve Evans  |  Steve founded Accountests alongside a career spent using his expertise in candidate testing and assessment to help employers attract, recruit, and develop talent. 

Accountests  |  Accountests deliver the world’s only online suite of annually updated and country-specific technical skills, ability and personality tests designed by and for accountants and bookkeepers. 
 
