There’s a decent chance your organization is using an AI tool to screen job candidates, and you may have no idea what that tool is actually doing with applicant data. Not because you’re negligent, but because most of these platforms aren’t telling you.

That gap between what AI hiring tools do and what employers and applicants actually know about them is exactly where a major new lawsuit and sweeping incoming regulations are about to land.

The lawsuits that have everyone’s attention

In January 2026, two job applicants filed a landmark class action against Eightfold AI, a hiring platform whose clients include a significant number of Fortune 500 companies—Microsoft, PayPal, Starbucks, Morgan Stanley, and Chevron among them. The plaintiffs, both women with STEM backgrounds, argue that Eightfold operates more like a background check company than a software tool and that it’s been doing so without following the laws that govern background checks.

The question at the heart of this case is whether Eightfold is secretly functioning as a consumer reporting agency without giving applicants any of the rights that come with that designation.

According to the complaint, Eightfold pulls data from public sources including social media and professional profiles to build detailed dossiers on candidates, then gives each applicant a score between zero and five based on their predicted likelihood of success.

If your organization is using Eightfold or a similar platform, that means you’re likely receiving scores or rankings generated from data sources you didn’t select, can’t audit, and may not even know exist. That’s a compliance exposure most employers didn’t build into their vendor evaluation.

What’s more, applicants never see their scores, can’t review them for accuracy, and have no way to dispute errors. The Fair Credit Reporting Act (FCRA)—which has governed third-party employment screening since 1970—requires exactly those disclosures and rights. The plaintiffs argue there’s no exemption just because the screening tool runs on AI.

Eightfold disputes the claims, stating that its platform operates on data consciously shared by candidates or provided by its customers and that candidates can view and correct their information. But the case is moving forward, and it could be the first of a new wave of class actions targeting AI hiring vendors on consumer protection grounds rather than discrimination theories.

Eightfold isn’t the only AI hiring vendor in court. Workday is fighting a separate collective action alleging its AI screening tools discriminate against applicants over 40. A federal judge granted preliminary certification in May 2025, potentially opening the door for millions of applicants to join.

Together, these cases form a pincer:

  • Eightfold attacks the process—how AI hiring tools collect and use data without transparency.
  • Workday attacks the outcome—whether those tools produce discriminatory results.

Different legal theories, same direction. AI hiring vendors can no longer credibly claim they’re just providing software.

There’s a regulatory gap that makes this worse. In 2024, the Consumer Financial Protection Bureau issued guidance explicitly stating that algorithmic employment scores qualified as consumer reports under the FCRA. That guidance was rescinded in 2025.

The rescission doesn’t change the statute, but it means there’s no active federal regulatory backing for the theory. The courts may end up answering a question the regulators walked away from.

California set the national compliance clock

California’s Privacy Protection Agency has finalized regulations under the CCPA that impose the country’s most stringent requirements on employers’ use of automated decision-making technology in employment decisions. The rules take effect January 1, 2027.

Mid-to-large for-profit businesses—generally those with annual revenues above $26.6 million—operating in California will need to:

  • Conduct formal risk assessments before deploying AI tools for significant employment decisions
  • Provide pre-use notices to applicants and employees
  • Honor opt-out and access rights

A senior executive must certify each assessment, and the state can demand the full document with 30 days’ notice.

If you have applicants in California, these rules apply to you regardless of where your headquarters sits. And California employment law has a decades-long track record of becoming the national baseline, whether other states formally adopt it or not.

What HR and employers should actually do

Two-thirds of recruiters plan to increase their use of AI for pre-screening in 2026. If your organization is in that group, the window to get ahead of this is closing.

Start with your vendor contracts. Research from Jones Walker found that 88% of AI vendors cap their liability, often at no more than the fees paid under the contract (in some cases a single month's subscription). Only 17% warrant regulatory compliance.

When legal exposure materializes, it lands on the employer. You need to know exactly what data sources your AI tools are using, whether they're pulling information from outside candidate applications, and whether they're generating scores that inform employment decisions.

Ask in writing. Get answers in writing. Vague responses about proprietary methods aren’t reassuring. They’re a red flag.

Audit your automated tools against the meaningful human involvement standard California’s regulations establish. A human decision-maker needs to actually understand the AI’s output, evaluate it alongside other information, and have the authority to override it. If your hiring managers are rubber-stamping AI recommendations without real review, that’s both a legal vulnerability and a fairness problem.

Finally, start building documentation now. Risk assessments, vendor due diligence records, notice procedures, and human review protocols are all things you want assembled before a regulator or plaintiff’s attorney asks for them. The organizations that come through this cleanly won’t be the ones that adopted AI hiring tools fastest. They’ll be the ones that built the governance to support them.
