
California’s Automated Hiring Rules: AI Bias & ADS Compliance
California employers should carefully review their hiring, recruitment, and employee-assessment tools. Effective October 1, 2025, new California regulations govern how employers use automated-decision systems (ADS) in hiring, promotion, and other workplace decisions.
These changes, issued under amendments to the state’s Fair Employment and Housing Act (FEHA), place stricter requirements on employers that rely on artificial intelligence, algorithms, and other digital tools in employment practices. The regulations highlight the state’s growing concern about the potential for algorithmic discrimination and the risks of using technology without appropriate safeguards.
What Are Automated Decision Systems?
The rules define ADS broadly, covering nearly any technology that influences job opportunities or employment actions. These systems range from simple resume-screening tools to advanced AI platforms that evaluate applicant behavior. Examples include:
- Keyword filters that screen resumes before a human reviews them
- Online skills or personality tests that predict job performance
- Algorithms that target job advertisements to specific demographics
- Video interview software that measures voice, tone, or facial expressions
- Data-driven platforms that analyze employee performance metrics
California’s regulators made it clear that when these systems skew results against protected groups, the employer is still responsible—even if the software is provided by a vendor.
The Risk of AI Bias in Hiring and Employment
Although marketed as objective, AI-driven hiring tools can magnify discrimination. For instance, reaction-time tests may disadvantage people with certain disabilities, while facial recognition systems often perform worse for individuals of color. Similarly, algorithms that direct ads may inadvertently exclude older workers or women from seeing job postings.
Because ADS can unintentionally embed prejudice, the amended FEHA rules hold employers accountable for identifying and preventing these outcomes.
Employer Responsibilities Under the New FEHA Amendments
The 2025 regulations impose several new obligations:
- Ban on discriminatory use: Employers cannot apply ADS in ways that exclude or disadvantage candidates or employees based on race, gender, disability, age, or other protected categories.
- Bias safeguards: To reduce liability, companies are encouraged to conduct bias audits and anti-discrimination testing, and to use validated methods aligned with federal EEOC guidelines.
- Recordkeeping requirements: Employers must now keep ADS-related records, including data, outcomes, and bias audit reports, for at least four years.
- Third-party liability: Employers remain responsible for unlawful outcomes even when using outside recruiters, software vendors, or testing platforms. Contracts with vendors should include indemnification provisions.
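To make the "bias audit" obligation more concrete, the sketch below computes the EEOC's four-fifths (adverse impact) ratio, a widely used screening metric under which a protected group's selection rate below 80% of the highest group's rate flags potential adverse impact. The function name and the audit figures are illustrative assumptions, not drawn from the regulations themselves.

```python
def adverse_impact_ratios(selections):
    """Compute four-fifths-rule ratios from per-group hiring counts.

    selections: dict mapping group name -> (selected, total_applicants).
    Returns a dict mapping each group to the ratio of its selection rate
    to the highest group's selection rate. Ratios below 0.8 are a
    conventional red flag for adverse impact under EEOC guidelines.
    """
    rates = {g: sel / total for g, (sel, total) in selections.items()}
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}


# Hypothetical audit data: 48 of 120 applicants from group A were
# selected, versus 24 of 90 from group B.
ratios = adverse_impact_ratios({"A": (48, 120), "B": (24, 90)})
# Group A's rate is 0.40 (the highest); group B's rate is about 0.267,
# giving a ratio of roughly 0.67, below the 0.8 threshold and therefore
# a result that would warrant closer review.
```

A real audit would go further, for example testing statistical significance and examining each stage of the selection pipeline, but this ratio is the kind of basic metric a bias audit is expected to surface.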
The Hidden Risks of AI Hiring Tools
Studies and lawsuits have repeatedly undercut the claim that AI hiring tools are objective. For instance:
- Amazon abandoned an internal recruiting algorithm after it penalized resumes containing terms associated with women, such as membership in "women's" organizations.
- HireVue, a popular video-interview platform, came under scrutiny for allegedly analyzing candidates' facial movements and tone, raising concerns about disability and racial bias.
- In 2023, the EEOC settled a case in which an employer's AI-powered hiring system automatically rejected older applicants, showing how ADS can unintentionally embed age discrimination.
These cases show why California regulators are emphasizing safeguards: ADS can reinforce systemic inequalities instead of eliminating them.
Preparing for ADS Compliance
Businesses that use AI or algorithmic tools in recruiting or performance evaluations should begin compliance planning now. Practical steps include:
- Audit all ADS currently in use to assess risks and identify potentially discriminatory practices.
- Implement written policies on responsible ADS use and train HR staff to recognize bias risks.
- Vet technology providers to confirm they conduct bias testing and comply with FEHA standards.
- Document every step of the hiring and promotion process, including decisions, inputs, and any audits performed.
- Stay alert to new developments; California is not alone, and other states and jurisdictions are also moving to regulate AI in employment.
Why ADS Employment Law Matters
These changes highlight a growing recognition that workplace technology, while efficient, is not always neutral. When algorithms disadvantage applicants or employees, the harm can be widespread and difficult to detect. California’s amendments give employees stronger protections and force employers to take proactive measures before relying on digital decision-making tools.
Why Hire The Lyon Firm
The Lyon Firm has extensive experience representing individuals harmed by unlawful workplace practices, including those linked to AI discrimination and automated hiring systems. We investigate whether employers or third-party vendors violated employment laws, build cases for accountability, and pursue fair outcomes for workers.
For businesses, we also provide proactive guidance to navigate complex regulations and limit liability. If you are a job applicant excluded by an unfair system or a current employee learning about your rights and California’s evolving laws, our firm delivers informed legal support tailored to your needs.
FAQs on AI Employment Decisions
When do the new California ADS rules begin?
The regulations take effect on October 1, 2025.
What technologies are covered by the regulations?
Any computational process that makes or influences employment decisions, including resume screeners, personality tests, AI-driven assessments, and video interview software.
If a vendor provides the ADS, is the employer still liable?
Yes. Employers remain responsible under FEHA even if a third-party tool produces the discriminatory outcome.
What records must be kept?
Employers must preserve ADS-related data, outcomes, and bias testing records for at least four years.
What steps lower the risk of lawsuits?
Bias audits, validation of hiring tools, clear anti-discrimination policies, and contracts requiring vendor compliance all help reduce exposure.