Some of the world’s best known AI brands have accidentally exposed sensitive credentials online, according to new research.
Cloud security firm Wiz says it found leaked secrets, including API keys, tokens, and access credentials, across hundreds of public GitHub repositories linked to major AI developers.
The exposed data was traced to platforms such as Google APIs, Weights & Biases, Hugging Face, ElevenLabs, LangChain, Infura, and Flickr.
Some of the leaked data could expose private models, training data, and organizational structures, putting both the companies and their users at risk, Wiz said.
Wiz found that 65% of the companies on the Forbes AI 50 list had leaked secrets in their public repositories. Together, those firms are valued at more than $400bn.
“Some of these leaks could have exposed organizational structures, training data, or even private models. For teams building the future of AI, speed and security have to move together,” Wiz said.
The Response
Wiz attempted to alert all of the affected companies. Some, including ElevenLabs and LangChain, responded quickly and patched the exposed secrets.
However, Wiz noted that nearly half of the organizations it contacted either failed to reply or had no disclosure process in place for receiving security reports.
“Many companies lacked an official disclosure channel, failed to reply, and/or failed to resolve the issue,” Wiz said.
Analysts say this reflects a broader problem in Silicon Valley’s “move fast” culture, where innovation often takes precedence over security and compliance.
Next Steps
In its report, Wiz urged AI companies to make secret scanning mandatory for public version control systems and to create clear channels for responsible disclosure.
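To illustrate what that recommendation means in practice, here is a minimal, hypothetical secret-scanning sketch. The patterns below are illustrative examples only (the AWS key shown is AWS's documented placeholder, `AKIAIOSFODNN7EXAMPLE`), not Wiz's tooling; production scanners such as gitleaks or TruffleHog ship hundreds of rules plus entropy analysis and git-history traversal.

```python
import re

# Illustrative rules only -- real scanners use far larger rule sets
# plus entropy checks to catch high-randomness strings.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "github_pat": re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
    "generic_api_key": re.compile(
        r"(?i)\b(?:api[_-]?key|token|secret)\b\s*[:=]\s*['\"]([^'\"]{16,})['\"]"
    ),
}

def scan_text(text: str) -> list[tuple[str, int]]:
    """Return (rule_name, line_number) for every pattern match in `text`."""
    findings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for name, pattern in SECRET_PATTERNS.items():
            if pattern.search(line):
                findings.append((name, lineno))
    return findings

# Example input with two planted, fake credentials.
sample = 'aws_key = "AKIAIOSFODNN7EXAMPLE"\napi_key = "sk-1234567890abcdef1234"\n'
print(scan_text(sample))  # → [('aws_access_key_id', 1), ('generic_api_key', 2)]
```

A check like this would typically run as a pre-commit hook or CI step on every push to a public repository, blocking the commit when any rule fires.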
The company also recommended that organizations treat external organization members and contributors as part of their attack surface, since attackers can target employees' personal repositories to exfiltrate data.
“Treat your employees as part of your company’s attack surface and your VCS org members and contributors as an extension of your SDLC infrastructure,” Wiz said.
“We recommend creating a VCS member policy to apply during the onboarding process (i.e. create a new GitHub user without revealing the name of the employer, use MFA for personal orgs, keep all personal activity in personal accounts etc.).”
“While modern secret scanning has elevated the ‘defense waterline,’ our investigation clearly shows that threats lurk deep below the surface—in deleted forks, gists, and developer repos. For AI innovators, the message is clear: speed cannot compromise security.”