DOL, CFPB Issue Guidance For Employers On AI

December 18, 2024 10:36 pm

Federal agencies are keeping a close eye on artificial intelligence (AI), with both the Department of Labor (DOL) and the Consumer Financial Protection Bureau (CFPB) recently issuing guidance for employers.

In “Artificial Intelligence and Worker Well-Being: Principles and Best Practices for Developers and Employers,” the DOL emphasized valuing workers as essential resources even in a moment of technological change.

To guide employers, the agency provided a set of eight principles.

The “North Star” of the principles focuses on centering worker empowerment, which the agency said means that workers and their representatives—especially those from underserved communities—should be informed of and have genuine input in the design, development, testing, training, use and oversight of AI systems for use in the workplace.

Other principles include ethically developing AI systems, which should be designed, developed, and trained in a way that protects workers, the DOL said; establishing AI governance, meaning clear governance systems, procedures, human oversight, and evaluation processes for AI systems in the workplace; and ensuring transparency in AI use. On transparency, employers should give workers advance notice and appropriate disclosure when they intend to use worker-impacting AI, along with information about what data will be collected and stored and for what purposes AI systems will use that data.

The agency also said employers should protect employee rights so that, for example, AI systems do not violate or undermine workers’ right to organize or jeopardize health and safety rights, wage and hour rights, or anti-discrimination and anti-retaliation protections.

According to the principles, AI systems should also be used to enable workers by assisting and complementing them and improving job quality. Workers impacted by AI should be supported, including by providing appropriate training opportunities and, whenever feasible, prioritizing the retraining and reallocation of workers displaced by AI to other jobs within the organization.

Finally, the DOL emphasized the need for responsible use of worker data. The data of workers collected, used or created by AI systems should be limited in scope and location, used only to support legitimate business aims, and protected and handled responsibly, the agency advised.

The CFPB took a different approach in its discussion of AI, cautioning employers about the use of third-party reports with “Background Dossiers and Algorithmic Scores for Hiring, Promotion, and Other Employment Decisions.”

“Can an employer make employment decisions utilizing background dossiers, algorithmic scores, and other third-party consumer reports about workers without adhering to the Fair Credit Reporting Act (FCRA)?” the agency asked, answering in the negative.

Background dossiers that convey scores about workers—used to make hiring, promotion, reassignment or retention decisions—are often governed by the statute, the CFPB explained, as they can qualify as “consumer reports” under the FCRA. Other types of consumer reports may include reports that convey scores assessing a current worker’s risk level or performance, for example.

According to the CFPB’s circular, “Employers that use consumer reports—both initially when hiring workers and for subsequent employment purposes—must comply with FCRA obligations, including the requirement to obtain a worker’s permission to procure a consumer report, the obligation to provide notices before and upon taking adverse actions, and a prohibition on using consumer reports for purposes other than the permissible purposes described in the FCRA.”

The rise of AI has increased the ease and speed with which reports based on employee data can be generated, the CFPB noted, triggering potential liability for employers.

Employers today can purchase reports from third parties that monitor workers’ sales interactions, track their driving habits, measure the time they take to complete tasks, record the number of messages they send and the quantity and duration of meetings they attend, and calculate time spent off-task by documenting web browsing, taking screenshots of computers, and measuring keystroke frequency.

To avoid running afoul of the FCRA, employers should consider whether the use of data qualifies as a use for “employment purposes” under the FCRA (defined as “a report used for the purpose of evaluating a consumer for employment, promotion, reassignment or retention as an employee”), which covers both initial assessments and data used for ongoing employment purposes.

Employers also need to consider whether the report was obtained from a “consumer reporting agency,” meaning an entity that “assembles” or “evaluates” consumer information, such as an entity that collects consumer data in order to train an algorithm that produces scores or other assessments about workers for employers.

To read the DOL’s guidance, click here

To read the CFPB’s circular, click here

Why it matters: The DOL vowed to “remain vigilant in protecting workers from the potential harms of AI, while at the same time, recognizing that this is a moment of tremendous opportunity,” while the CFPB encouraged employers to review their current practices to ensure compliance with FCRA requirements, particularly in light of the increasing prevalence of AI tools in the workplace.

© Copyright 2024 Credit and Collection News