iTutorGroup's AI screened out older applicants; $365k EEOC settlement


On August 9, 2023, the EEOC's first AI-related discrimination lawsuit reached a settlement. iTutorGroup, a company providing English-language tutoring services to students in China via US-based remote tutors, had programmed its applicant screening software to automatically reject female applicants over 55 and male applicants over 60. Over 200 qualified US applicants were rejected because of their age. The company agreed to pay $365,000, adopt a new anti-discrimination policy, provide training to hiring staff, and submit to EEOC compliance monitoring for at least five years. EEOC Chair Charlotte Burrows called AI a "new civil rights frontier."

Incident Details

Severity: Facepalm
Company: iTutorGroup
Perpetrator: Executive
Incident Date:
Blast Radius: Older job applicants screened out; legal settlement and mandated policy changes.

The Business

iTutorGroup provided English-language tutoring services to students in China. The company hired US-based tutors who worked remotely from their homes, connecting with students online. The tutoring positions were straightforward: native English speakers teaching conversational and academic English to Chinese students via video. The company recruited tutors through an online application process.

That application process included automated screening software. Applicants submitted their information - including their date of birth - and the software evaluated them against criteria set by the company. Qualified applicants were passed through to the next stage of hiring. Unqualified applicants were rejected.

The problem was in how "qualified" was defined.

The Cutoffs

The EEOC's investigation found that iTutorGroup had programmed its hiring software to apply age-based cutoffs. Female applicants over the age of 55 were automatically rejected. Male applicants over the age of 60 were automatically rejected. These weren't the byproducts of a neutral hiring criterion that happened to correlate with age - the dates of birth were being directly compared to hard-coded age thresholds.

The software's logic was explicit: if the applicant's calculated age exceeded the cutoff for their gender, the application was rejected. No human reviewed these rejections. No consideration was given to qualifications, experience, or suitability. The algorithm looked at two fields - date of birth and gender - and threw away any application that failed the check.
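The logic described in the EEOC's complaint is simple enough to reconstruct in a few lines. The sketch below is purely illustrative: iTutorGroup's actual implementation has never been published, and everything other than the 55/60 cutoffs - the field names, function structure, and example applicant - is an assumption made for the sketch.

```python
from datetime import date

# Illustrative reconstruction only. The 55/60 cutoffs are the ones described
# by the EEOC; field names, structure, and example data are assumptions.
FEMALE_CUTOFF = 55
MALE_CUTOFF = 60

def calculate_age(date_of_birth: date, as_of: date) -> int:
    """Whole years of age as of the given date."""
    years = as_of.year - date_of_birth.year
    # Subtract a year if the birthday hasn't happened yet this year.
    if (as_of.month, as_of.day) < (date_of_birth.month, date_of_birth.day):
        years -= 1
    return years

def passes_screen(gender: str, date_of_birth: date, as_of: date) -> bool:
    """Return True if the application clears the automated gate.

    No qualifications, no experience, no human review: just two fields
    compared against hard-coded, gendered age thresholds.
    """
    age = calculate_age(date_of_birth, as_of)
    cutoff = FEMALE_CUTOFF if gender == "female" else MALE_CUTOFF
    return age <= cutoff

# A 56-year-old female applicant is rejected before anyone reads her resume.
print(passes_screen("female", date(1967, 3, 1), date(2023, 8, 9)))  # False
```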

More than 200 qualified US applicants were rejected on this basis. These were people who met the job requirements, had relevant skills and experience, and would have been considered for the positions had they been younger. Their applications never made it past the automated gate.

The Lawsuit

The EEOC filed suit against iTutorGroup in the US District Court for the Eastern District of New York in 2022 (Case No. 1:22-CV-2565). The agency alleged that the company's hiring software violated the Age Discrimination in Employment Act of 1967 (ADEA), which prohibits employers from discriminating against applicants or employees who are 40 years of age or older.

The ADEA is not ambiguous about this. It's one of the clearest employment discrimination statutes on the books: you cannot refuse to hire someone because of their age if they're over 40. It doesn't matter whether a human makes the decision or whether software does. The law applies to the outcome, not the mechanism.

iTutorGroup's software had encoded the exact kind of discrimination the ADEA was written to prevent. The only difference from a traditional age discrimination case was that the discrimination was automated - applied systematically to every applicant rather than on a case-by-case basis. This made it more efficient as discrimination, not less discriminatory.

The Settlement

On August 9, 2023, the EEOC and iTutorGroup filed a Joint Settlement Agreement and Consent Decree. The terms included:

  • $365,000 in monetary relief, distributed among the more than 200 affected applicants.
  • A new anti-discrimination policy that iTutorGroup was required to adopt and distribute.
  • Training for all personnel involved in hiring tutors, covering ADEA compliance and non-discrimination obligations.
  • Injunctions against discriminatory hiring based on age or sex, and against requesting applicants' birth dates during the hiring process.
  • EEOC compliance monitoring for at least five years. If iTutorGroup resumed hiring US-based tutors, it was required to notify and interview the applicants who had previously been rejected because of their age.

The settlement was the EEOC's first involving AI-based employment discrimination. The agency made sure everyone knew it.

The EEOC's Signal

EEOC Chair Charlotte Burrows had been signaling for months that AI in hiring was a priority. In an earlier joint statement with officials from the Department of Justice, the Consumer Financial Protection Bureau, and the Federal Trade Commission, the agencies pledged to "vigorously use our collective authorities to protect individuals' rights regardless of whether legal violations occur through traditional means or advanced technologies."

Burrows described AI as a "new civil rights frontier" that threatened "basic values and principles" and carried a risk of discrimination in employment decisions. The iTutorGroup case was the first concrete enforcement action behind those words.

The case was straightforward by discrimination standards - hard-coded age cutoffs in hiring software that rejected applicants without review. It was not a subtle algorithmic bias case where a machine learning system inadvertently learned to correlate protected characteristics with hiring outcomes. This was direct, intentional discrimination implemented in code. The company decided older applicants should not be hired and built software to enforce that decision automatically.

What the Case Meant for AI Hiring Tools

The iTutorGroup settlement was small in dollar terms - $365,000 split among 200+ applicants works out to less than $2,000 per person. But the case established a clear principle: existing employment discrimination law applies to AI-driven hiring decisions. Employers cannot use software as a shield against liability for discriminatory outcomes.

Legal analysis from firms including Sullivan & Cromwell and Greenberg Traurig emphasized the implications. Employers using AI screening tools needed to audit those tools for discriminatory criteria or outcomes, not just at purchase but on an ongoing basis. That a vendor or internal engineering team built the discrimination into the software didn't shift liability away from the employer using it.

The case also highlighted a basic failure of oversight. Someone at iTutorGroup decided that older applicants should be rejected. Someone implemented that decision in code. And then the system ran, rejecting applicants one by one, with no apparent human review of whether this was legal, ethical, or sensible. The automation didn't create the discrimination - it just made it invisible to anyone who wasn't looking at the code.

Reuters described it as the US agency's first bias lawsuit involving AI hiring software. The framing mattered: the EEOC was putting employers on notice that AI tools were not a legal gray area. They were tools, subject to the same laws as every other tool an employer uses to make hiring decisions. If the tool discriminates, the employer is liable.

The $365,000 was a modest settlement. The precedent was not.
