Accidental Data Leak: Microsoft AI Researchers Unintentionally Expose 38TB of Company Data

A Microsoft AI research team inadvertently exposed 38TB of personal data while uploading training data to GitHub as part of an effort to offer other researchers open-source code and AI models for image recognition. Cybersecurity firm Wiz discovered a link included in the files that gave access to backups of Microsoft employees’ computers. Those backups contained passwords to Microsoft services, secret keys and over 30,000 internal Teams messages from hundreds of the tech giant’s employees, Wiz says. In its own report on the incident, however, Microsoft assures that “no customer data was exposed, and no other internal services were put at risk.”

The link was deliberately included with the files so that interested researchers could download pretrained models; that part was no accident. Microsoft’s researchers used an Azure feature called “SAS tokens,” which allows users to create shareable links that give other people access to data in their Azure Storage account. Users can choose what data a link shares and what level of access it grants.
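As an illustration of how such links are typically produced, here is a minimal sketch using the azure-storage-blob Python SDK; the account name, container name, and key below are hypothetical placeholders, not values from the incident. The scoping choices shown (a single container, read-only permission, a short expiry) are what determine how much a leaked link can expose.

```python
# A minimal sketch of generating a shareable SAS link with the
# azure-storage-blob Python SDK. Names and the key are placeholders.
from datetime import datetime, timedelta, timezone

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

ACCOUNT_NAME = "exampleresearch"       # hypothetical storage account
CONTAINER_NAME = "pretrained-models"   # hypothetical container
ACCOUNT_KEY = "<storage-account-key>"  # kept secret; never published

# A read-only token scoped to one container, expiring in 7 days.
# A token scoped to the whole account, with broader permissions or a
# far-future expiry, would expose far more if the link ever leaked.
sas_token = generate_container_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER_NAME,
    account_key=ACCOUNT_KEY,
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=7),
)

# The shareable URL that would be published alongside the models.
share_url = (
    f"https://{ACCOUNT_NAME}.blob.core.windows.net/"
    f"{CONTAINER_NAME}?{sas_token}"
)
print(share_url)
```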

2023-09-19 04:46:59
Source: www.engadget.com