Microsoft AI researchers accidentally exposed terabytes of internal sensitive data

By Mian Ashfaq


Microsoft AI researchers accidentally exposed terabytes of sensitive internal data, including passwords and secret keys, while publishing open-source AI code and models to a public GitHub repository.


Cloud security firm Wiz discovered the exposure while scanning the internet for misconfigured storage. The GitHub repository pointed visitors to an Azure Storage URL so they could download the published files, but because of the misconfiguration the link granted access to far more than those files: anyone who followed it could browse, and even modify, the contents of the entire storage account.

The exposed data amounted to 38 terabytes and included the disk backups of two Microsoft employees' workstations. Those backups contained passwords to Microsoft services, secret keys, and more than 30,000 internal Microsoft Teams messages from Microsoft employees.

The data had been exposed since 2020, so it went unnoticed for roughly three years. The access link should have granted read-only permissions, but it was instead configured with "full control," letting anyone who found it overwrite or delete files as well as read them.

According to Wiz, the Microsoft AI developers generated an overly permissive shared access signature (SAS) token. A SAS token is a signed URL that is supposed to grant narrowly scoped permissions to specific Azure Storage resources, but this one was configured to expose the full storage account with far broader access than intended.

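For context, a more restrictive setup would scope a SAS token to a single blob, grant read-only permission, and set a short expiry. The sketch below uses the Azure SDK for Python with hypothetical account, container, and blob names to illustrate the idea; it is not Microsoft's actual configuration.

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

# Hypothetical names, for illustration only.
account_name = "examplestorageaccount"
account_key = "<storage-account-key>"

# Scope the token to one blob, read-only, expiring in one hour,
# rather than granting full-control access to the whole account.
sas_token = generate_blob_sas(
    account_name=account_name,
    container_name="models",
    blob_name="open-model.zip",
    account_key=account_key,
    permission=BlobSasPermissions(read=True),  # read-only
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

download_url = (
    f"https://{account_name}.blob.core.windows.net/"
    f"models/open-model.zip?{sas_token}"
)
print(download_url)
```

Because a SAS token carries its own permissions and is validated against the account key, anyone holding the URL gets whatever access the token encodes, which is why scoping and expiry matter.
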
Wiz reported the issue to Microsoft in June, and Microsoft revoked the token a couple of days later. Microsoft completed its investigation into the potential organizational impact in August.

Microsoft said that despite the scale of the exposure, no customer data was involved and no other internal services were put at risk. The company also said it will tighten how such access tokens are issued and monitored to prevent similar exposures in the future.
