
AI Company Eightfold Sued For Helping Companies Secretly Score Job Seekers (reuters.com)

Eightfold AI, a venture capital-backed AI hiring platform used by Microsoft, PayPal and many other Fortune 500 companies, is being sued in California for allegedly compiling reports used to screen job applicants without their knowledge. From a report: The lawsuit, filed on Tuesday, accuses Eightfold of violating the Fair Credit Reporting Act and shows how consumer advocates are seeking to apply existing law to AI systems capable of drawing inferences about individuals from vast amounts of data.

Santa Clara, California-based Eightfold provides tools that promise to speed up the hiring process by assessing job applicants and predicting whether they would be a good fit for a job using massive amounts of data from online resumes and job listings. But candidates who apply for jobs at companies that use those tools are not given notice and a chance to dispute errors, job applicants Erin Kistler and Sruti Bhaumik allege in their proposed class action. Because of that, they claim Eightfold violated the FCRA and a California law that gives consumers the right to view and challenge credit reports used in lending and hiring.

  • by dmomo ( 256005 ) on Wednesday January 21, 2026 @02:51PM (#65940102)

    I'm not saying that the AI company shouldn't readily seek permission to harvest and use personal data, but it's far from a special case. Take Palantir for instance. Are they not doing the same thing?

    It seems odd that the onus is on the third-party platform to be transparent to the candidates about data collection and AI use, and not the companies using the tool.

    I am all for this lawsuit being successful if it sets a precedent for all AI companies using our data. But if the scope is specifically around the hiring process, I think the individual companies should be held accountable for using the tool without candidates' consent or knowledge.

    • I'm inclined to agree. While setting some jurisprudence around AI company data collection is good, the real problem I see here is the employers' use of this tool... they seem like they should be the ones litigated against in this instance.
      After all, if they're not hit for doing this, they'll just find another way to do it if this one goes away.
      • The reason this is tied only to AI in hiring, rather than to all AI and our personal data, is that the law applies specifically to lending and hiring purposes.
        We would need strong personal privacy laws, and that is never going to happen in the USA.

    • "It seems odd that the onus is on the third-party platform"

      The key problem seems to be the aggregation from multiple sources, not just processing the data given by one client and providing the results back to that client. Since it is the third party that does the aggregation, that's where the liability lies.

      The client companies only dealt with one source, so all they can do is point to that one source.

  • by CommunityMember ( 6662188 ) on Wednesday January 21, 2026 @03:14PM (#65940134)
    Since these individuals live in California, can they not request Eightfold remove all data about them under the recently enacted DROP law? They might still not be interviewed or hired, of course.
    • Re:Use the DROP act? (Score:4, Interesting)

      by dmomo ( 256005 ) on Wednesday January 21, 2026 @03:36PM (#65940160)

      I would be all for this, but the sad reality is that it equates to playing whack-a-mole. Having to opt out of lists just guarantees that another list pops up. It's infuriating and exhausting.

      Forcing companies to only use data where the target has opted in would barely help, too, because the opt-in would be buried deep in a TOS, and it would include allowing them to sell it, essentially opting that person in to third parties... and then the whack-a-mole continues.

      Contracts cannot be used to break laws; agreeing to such a contract would be invalid. What we really need are laws that prohibit third-party transfer of data altogether OR, seeing as how that would be wildly impractical, that require any third-party company to reach out to the target of the data and get an additional explicit opt-in. That would turn this miserable industry on its head. Though I don't imagine something like that will ever happen.

  • I have heard so many horror stories from job candidates. Often with no chance of finding out why, people can lose jobs because:

    1) name is similar to a pedophile from another state.
    2) debt from identity theft - when they did not know it was going on and were NOT told by the employer.
    3) their index finger was too long.
    4) wrong age/race/gender/religion - yes this is illegal in the US but it still happens.

    Companies get inundated with way too many categories, which leads them to be arbitrary. They treat job

  • Attention job seekers. How do you know if you are right for the job--unless you know what your options are?
  • Just what in the name of the Great and Mighty Cthulhu is an AI doing with access to FCRA-scoped data in the first place? As incompetent and... well... leaky as GPT and other AI models are, they shouldn't even be *connected* to personal financial data at all, much less to the credit bureaus and FICO.

    I mean... "vibe coding" is bad enough to clean up. Is "vibe identity theft" about to become a thing?

    We need the Three Laws, godsdammit!
