
Microsoft has officially barred its employees from using DeepSeek, a Chinese-developed AI language model, citing concerns over data security and the potential for the tool to be used as a vehicle for propaganda. The decision was confirmed by Brad Smith, Microsoft’s vice chairman and president, who highlighted the risks of using AI technologies developed in jurisdictions with differing standards for data privacy and information control.
DeepSeek is among a growing number of advanced artificial intelligence models emerging from around the world. The use of such tools by employees of major technology companies, especially those based in the United States, has drawn scrutiny over national security and privacy concerns.
According to Smith, the decision reflects Microsoft’s broader approach to safeguarding sensitive corporate information and ensuring compliance with international privacy standards. “We evaluate all AI tools—not just based on their technical capabilities, but also by considering the frameworks under which they are developed,” Smith explained in a recent statement.
The concerns are not unique to Microsoft. Globally, attention is growing over how AI systems may be shaped by their developers' intentions or by government oversight in countries where information is tightly controlled. Critics note that AI models can reflect biases in the data they are trained on and, in some cases, be manipulated for ideological or political ends.
Smith emphasized that Microsoft continues to support innovation and the responsible integration of AI across its workforce, but said that policies governing AI tool usage remain under continual review as the technological and geopolitical landscape evolves.
This move underscores the growing intersection between technology, regulation, and international relations, as corporations navigate the challenges of integrating powerful AI tools while maintaining the integrity and security of their operations.