SD 3007
SD 3007 was introduced on June 26, 2025, and was referred to the Committee on Advanced Information Technology, the Internet and Cybersecurity on September 4, 2025. The bill would cover “any person, business, organization, or government agency that is subject to the jurisdiction of the Commonwealth of Massachusetts, excluding an individual acting at their own direction and in a non-commercial context.”
Crucially, SD 3007 proposes a ban on algorithmic discrimination, precluding covered entities from using an automated decision system that has the effect of discriminating against a person or class on the basis of a protected characteristic covered by Massachusetts General Laws Chapter 151B. Those characteristics include race, color, religion, national origin, disability, sex, gender identity, sexual orientation, and other protected categories. In the employment context, the bill would protect individuals’ “fundamental opportunities” related to hiring, pay, independent contracting, worker management, promotion, and termination.
The proposed bill also includes several consequential provisions related to audit and notice requirements. The bill would require entities using automated decision systems relating to “fundamental opportunities” to audit their systems every ninety days. It would also require such covered entities not only to provide individuals with notice when these systems are used in a way that relates to their access to “fundamental opportunities,” but also to give individuals the right to opt out of the use of the AI system entirely and instead request a human review process.
SD 3007 proposes a strict liability framework, meaning a covered entity would incur liability under the Act regardless of whether it is aware of the discriminatory effect of the AI system, whether it complies with federal law, or whether it relies on an external service provider for the system. Aggrieved individuals would have a private cause of action under the law and could recover the greater of $5,000 in liquidated damages or actual damages, in addition to attorneys’ fees and costs.
Relation to “lie detector” litigation
Recently, Massachusetts plaintiffs have begun targeting employers who use AI hiring tools under Massachusetts General Laws Chapter 149, Section 19B, the state’s lie detector law. The statute prohibits employers from administering lie detector tests to employees or job applicants as a condition of employment and requires employers to notify job applicants that using lie detector tests in this manner is unlawful. Thus far, challenges have mostly focused on whether companies have complied with that notice requirement.
The lie detector statute broadly defines a lie detector test as “[A]ny test utilizing a polygraph or any other device, mechanism, instrument or written examination, which is operated, or the results of which are used or interpreted by an examiner for the purpose of purporting to assist in or enable the detection of deception, the verification of truthfulness, or the rendering of a diagnostic opinion regarding the honesty of an individual.” Given the breadth of that statutory language, plaintiffs may be able to bring cognizable claims under the law if they believe they were subjected to an AI hiring tool that assesses character traits related to honesty. By explicitly targeting the use of AI tools in a wide variety of employment contexts, SD 3007 could give plaintiffs yet another tool to check employers who use automated decision-making tools in their hiring processes.
Possible impact
If signed into law, SD 3007 would give individuals a clear path to holding employers accountable for the discriminatory impact of their automated decision processes, regardless of whether the employer was aware of the discrimination caused by its AI tools. For employers using AI tools, the law would trigger a slew of internal review and compliance obligations, forcing them to continuously examine their AI decision-making systems for signs of discrimination. Given that state legislatures have been relatively slow to react compared to the rapid pace at which employers have adopted AI tools, a far-reaching law like SD 3007 is much needed and would ensure that impacted employees are better equipped to enforce anti-discrimination laws as employers continue to integrate AI decision-making tools into their processes.
If you have been discriminated against at work, please reach out to our employment lawyers.