The Microsoft AI research division accidentally leaked dozens of terabytes of sensitive data starting in July 2020 while contributing open-source AI learning models to a public GitHub repository.
Gotta love this kind of news. There are always hypothetical discussions about clouds being insecure, and companies generally just ignore them, because clouds are, theoretically, sometimes cheaper.
And then every now and then, half the internet leaks out of one of these clouds, everyone goes "holy crap," and then companies go back to generally just ignoring it, because clouds are, theoretically, sometimes cheaper.
Unfortunately, nobody in charge has faced consequences for their decision to save a few theoretical nickels, so far. But then again, a lot of software/IT-related stuff would look completely different if anybody did.