- cross-posted to:
- sysadmin@lemmy.world
Microsoft leaks 38TB of private data via unsecured Azure storage::The Microsoft AI research division accidentally leaked dozens of terabytes of sensitive data starting in July 2020 while contributing open-source AI learning models to a public GitHub repository.
This will definitely make customers less trustful of Microsoft when it comes to their privacy-focused AI projects. Here’s hoping that open-source LLMs become more advanced and optimized.
I am not sure. This was mostly a case of human error in not properly securing URLs/storage accounts. The lack of centralised control over SAS tokens that the article highlights was a contributing factor, but not the root cause, which was human error.
If I leave my front door unlocked and someone walks in and robs my house, who is to blame? Me, for not locking the door? Or the house builder, for not providing a sensor so I can remotely check whether the door is locked?
> If I leave my front door unlocked and someone walks in and robs my house, who is to blame?
In a private setting, one person’s mistake can happen, period.

A corporate environment absolutely needs robust procedures in place to protect the company and all its clients from the huge impact of one person’s mistake.
But that’s a looong tradition at M$ - not having it, I mean.
Azure has a huge problem with SAS tokens. The mechanism is so bad that it invites situations like this.
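Part of why SAS tokens are so hard to control centrally is that they are essentially just HMAC-SHA256 signatures computed client-side from the storage account key: anyone holding the key can mint a token offline, and the service keeps no registry of tokens that were ever issued, so a leaked SAS URL can’t easily be audited or individually revoked. A minimal sketch of that signing step (heavily simplified; the real Azure string-to-sign has many more fields, and the account key and permission fields here are made up):

```python
import base64
import hashlib
import hmac

def generate_sas_signature(account_key_b64: str, string_to_sign: str) -> str:
    """Compute an HMAC-SHA256 signature in the style of an Azure SAS token.

    Simplified sketch: the actual Azure string-to-sign includes more fields
    (signed version, protocol, IP range, canonicalized resource, ...).
    """
    key = base64.b64decode(account_key_b64)
    sig = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(sig).decode("utf-8")

# Hypothetical account key, not a real one.
ACCOUNT_KEY = base64.b64encode(b"not-a-real-key").decode()

# The signing happens entirely on the client: the storage service never
# sees or records this token being created, which is why leaked SAS URLs
# are so hard to audit or revoke without rotating the whole account key.
# "sp" = permissions (read/add/create/write/delete/list), "se" = expiry.
token_sig = generate_sas_signature(ACCOUNT_KEY, "sp=racwdl\nse=2051-10-06")
```

Rotating the account key invalidates every token signed with it at once, which is the blunt instrument Microsoft had to reach for here.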
The root cause is whatever allows the human error to happen in the first place.
If you live in an apartment and the landlord doesn’t replace the front door locks when they break, that’s a better analogy.
Because, of course they did.