AI Meeting Tools Are The Latest Target of Illinois BIPA Class Actions
Plaintiffs' attorneys have turned their attention to AI notetaker applications and other listening tools under the Illinois Biometric Information Privacy Act (BIPA). A proposed class action filed on December 18 alleges that a popular AI notetaker vendor violated this stringent privacy law by collecting and storing voiceprints during a virtual meeting without proper notice, consent, or a compliant data retention policy.
What Happened: A Familiar Workplace Scenario Turns Into a Lawsuit
The complaint outlines a scenario that many employers may find unsettling:
- An Illinois resident participated in a routine virtual meeting.
- She had not signed up for any AI meeting assistant.
- She did not agree to any terms of service.
- However, an AI notetaking tool automatically joined the meeting at the request of another participant.
The AI tool recorded the discussion, identified speakers, and generated transcripts attributing statements to individuals. The lawsuit claims that this process required the notetaker to create and store voiceprints, which qualify as biometric identifiers under BIPA. The statute imposes specific obligations on businesses that collect such data, including written notice, written consent, and a public retention and destruction policy.
The plaintiff, Katelin Cruz, asserts that the AI vendor failed to inform her in writing that her biometric data was being collected or how long it would be retained, and never gave her the opportunity to provide written consent. Furthermore, the vendor allegedly did not publish a compliant biometric data retention and destruction policy.
Why This Matters for Employers
While the lawsuit primarily targets the AI vendor, employers should not dismiss the implications. In Illinois, BIPA is enforced largely through private lawsuits, and employers can be pulled into litigation due to:
- Authorizing or deploying the tool: If an organization licenses or encourages the use of an AI notetaker that captures voiceprints without proper consent, it may be deemed complicit.
- Employee use during work-related meetings: Even if an individual employee activates the AI assistant, the employer may face liability when it happens during business meetings.
- Benefiting from the outputs: If an employer derives value from transcripts or insights generated by the tool, that use could support claims that the employer itself collected or possessed biometric data.
- Lack of guardrails: Courts have shown little leniency regarding claims of automatic or incidental biometric collection. Missing policies or training can exacerbate liability.
Even companies based outside Illinois can face BIPA claims if their meetings include participants physically located in Illinois.
6 Steps Employers Should Take Now
Employers do not need to abandon AI notetaking tools, but they must manage their use effectively to mitigate legal risks. Consider the following steps:
- Inventory AI Meeting Tools in Use: Identify all platforms and tools being used across the organization, including those not officially approved.
- Understand What the Tool Actually Collects: Assess whether the tool performs functions that involve biometric data, such as voice identification or speaker recognition.
- Clarify Who Can Enable Recording or AI Assistants: Restrict who may activate AI notetakers, especially in meetings with external participants.
- Update (and Enforce) Meeting and Recording Policies: Ensure policies clearly outline when AI tools may be used and what notifications and approvals are required.
- Coordinate With Vendors, But Don’t Outsource Compliance: Conduct due diligence to confirm that AI vendors have BIPA-compliant consent and retention mechanisms in place, and document these discussions.
- Train Employees on “Meeting Hygiene”: Train employees to understand that enabling an AI assistant is not a neutral act and may trigger consent obligations for everyone on the call.
This lawsuit signals an important shift: AI tools that capture and process voice data will face increasing scrutiny under BIPA, and proactive measures are essential for compliance.