Microsoft's Security Measures: According to an internal announcement from Microsoft, certain AI tools, including ChatGPT, became temporarily inaccessible due to security and data concerns. The company emphasized that despite its investments in OpenAI and ChatGPT's built-in safeguards, caution is advised when using external services to protect confidentiality and security.
Concerns for Confidentiality and Security: Microsoft highlighted the risks of using third-party services, including ChatGPT. The cautionary message extended to other AI services such as Midjourney and Replika, and the company urged its employees to exercise care when using them.
Microsoft's Explanation: Microsoft later clarified that the temporary block on ChatGPT was a mistake resulting from testing endpoint control systems for large language models. The controls were unintentionally activated for all employees and were promptly deactivated once the error was discovered. Microsoft encouraged both employees and clients to use services such as Bing Chat Enterprise and ChatGPT Enterprise, which it says offer higher levels of confidentiality and security.
Recommendation for Bing Chat: In response to the incident, Microsoft updated its recommendations, urging users to adopt its own Bing Chat tool, which is built on OpenAI's artificial intelligence models. This year, Microsoft introduced updates to the Windows operating system and Office applications that leverage OpenAI services running on Microsoft Azure cloud infrastructure.
Future Plans for AI Integration: Microsoft has ambitious plans, intending to bring its AI-powered personal assistant, Copilot, to the Windows 10 operating system. This move aligns with the company's ongoing effort to integrate advanced AI capabilities into its products and services.