
SEC Urged To Investigate OpenAI For Anti-Whistleblower NDAs

By Jessica Corso · 2024-07-15 22:59:06 -0400

At least one whistleblower has urged the U.S. Securities and Exchange Commission to investigate artificial intelligence pioneer OpenAI for allegedly requiring employees to sign agreements discouraging them from reporting potential wrongdoing to federal regulators, according to a letter shared with Law360 on Monday.

The ChatGPT developer allegedly required employees to sign nondisclosure agreements that, among other things, mandated prior consent before confidential information could be handed over to federal authorities and prohibited employees from receiving any whistleblower compensation from the government, according to a letter sent to the commission on July 1 by "one or more" anonymous whistleblowers.

It wasn't immediately clear whether the agreements, sometimes referred to in the letter as "prior NDAs," were still in effect.

The letter was shared with Law360 by the office of Republican Sen. Chuck Grassley of Iowa, who noted that it was sent to his office by legally protected whistleblowers.

"OpenAI's policies and practices appear to cast a chilling effect on whistleblowers' right to speak up and receive due compensation for their protected disclosures," Grassley said in an email statement to Law360. "In order for the federal government to stay one step ahead of artificial intelligence, OpenAI's nondisclosure agreements must change."

OpenAI didn't immediately respond to a request for comment Monday evening.

The whistleblower letter calls on the agency to open an investigation into OpenAI, saying the company should be forced to provide the SEC with every employment agreement, severance agreement, investor agreement or other agreement that contains an NDA. It should also be fined and required to notify all past and current employees that they have the right to confidentially report any violations of law to the SEC, according to the letter.

Doing so would send a message to the entire industry, the letter said.

"Given the risks associated with the advancement of AI, there is an urgent need to ensure that employees working on this technology understand that they can raise complaints or address concerns to federal regulatory or law enforcement authorities," it said.

As an example of those risks, the letter cites in a footnote a statement signed last year by policymakers and industry experts, including OpenAI CEO Sam Altman, that warned of the "risk of extinction" posed by AI and said that avoiding that possibility "should be a global priority."

An SEC spokesperson on Monday said that the agency "does not comment on the existence or nonexistence of a possible whistleblower submission."

The agency has not shied away from penalizing companies that require counterparties to sign agreements it believes could discourage them from contacting its whistleblower office.

Activision Blizzard Inc., the company behind popular games like Call of Duty and Candy Crush, agreed to pay $35 million in 2023 to settle accusations that it violated the agency's whistleblower protection rules by requiring departing employees to sign agreements promising that they would inform the company if a regulator contacted them during an investigation.

And earlier this year a JPMorgan Chase & Co. unit agreed to pay an $18 million fine for allegedly using client confidentiality agreements that the SEC said violated whistleblower protections.

--Editing by Adam LoBelia.
