Microsoft AI Research Exposes Terabytes of Sensitive Data on GitHub

In a surprising turn of events, Microsoft AI researchers unintentionally exposed tens of terabytes of sensitive data while publishing a bucket of open-source training data on GitHub. The accidental exposure has raised significant concerns about data security at one of the world's largest technology companies.

Discovery by Cloud Security Startup Wiz

Cloud security startup Wiz uncovered the lapse during its ongoing investigation into accidental exposures of cloud-hosted data. According to reports, Wiz stumbled upon a GitHub repository belonging to Microsoft's AI research division, and what it found was deeply concerning.

The repository was intended to provide open-source code and AI models for image recognition, and users were directed to download the models from an Azure…
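For context, downloading a published model from a shared Azure Blob Storage link typically looks like the minimal sketch below. The storage account, container, file name, and query string are hypothetical placeholders, since the article does not reproduce the actual link; a shared access signature (SAS) query string would normally follow the `?` in such a URL.

```python
# Illustrative sketch only: the storage account, container, and model file
# below are hypothetical placeholders, not the actual Microsoft resources.
from azure.storage.blob import BlobClient

# A shared link to a blob in an Azure storage account (hypothetical URL);
# the query string after "?" stands in for a SAS token granting access.
model_url = (
    "https://exampleaccount.blob.core.windows.net/"
    "models/resnet50-robust.pt?sv=...&sig=..."
)

blob = BlobClient.from_blob_url(model_url)

# Stream the model weights to a local file.
with open("resnet50-robust.pt", "wb") as f:
    blob.download_blob().readinto(f)
```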

