Microsoft AI researchers inadvertently exposed tens of terabytes of sensitive data, including private keys and passwords, while publishing data on GitHub. The exposure, which included personal backups of two Microsoft employees’ computers, stemmed from a misconfigured Azure Storage URL whose shared access signature (SAS) token granted “full control” permissions rather than read-only access. The issue was discovered by cloud security startup Wiz; Microsoft has since revoked the overly permissive SAS token and expanded GitHub’s secret scanning service to detect such tokens.
Read more at TechCrunch…