Microsoft cuts Israeli military’s access to technology used for mass surveillance of Palestinians

Microsoft has officially terminated the Israeli military’s access to critical technology previously used for mass surveillance of Palestinians.

The decision, confirmed by company sources, affects Unit 8200, Israel’s elite military intelligence unit, which had been collecting millions of phone calls from Gaza and the West Bank daily.

The tech giant informed Israeli officials last week that Unit 8200 had violated Microsoft’s terms of service by storing sensitive surveillance data on the Azure cloud platform.

The action follows an in-depth investigation published by the Guardian in collaboration with local media outlets, revealing the extent of Israel’s use of Microsoft technology in a secret surveillance program.

Unit 8200’s reliance on Azure allowed it to build an expansive system capable of intercepting, storing, and analyzing calls from the Palestinian population. According to internal sources, the unit’s internal mantra, “a million calls an hour,” reflected the massive scale of its operations.

The intercepted communications—estimated at 8,000 terabytes—were initially stored in a Microsoft data center in the Netherlands. Following the Guardian’s report, the data was reportedly moved to Amazon Web Services (AWS) to avoid European jurisdiction.

The revelations ignited protests at Microsoft’s U.S. headquarters and European data centers, led by employee advocacy groups such as No Azure for Apartheid.

Activists called for an immediate end to Microsoft’s support for Israeli military surveillance operations, emphasizing ethical responsibility in the use of AI and cloud technologies.

Brad Smith, Microsoft’s Vice-Chair and President, addressed staff in an email confirming the termination.

“We do not provide technology to facilitate mass surveillance of civilians,” Smith wrote, emphasizing that Microsoft has upheld this principle globally for more than two decades. The company’s decision marks a rare instance of a U.S. technology firm severing ties with a foreign military over ethical concerns.

Sources indicate that Unit 8200’s surveillance program was instrumental in both intelligence gathering and military planning. The collected data enabled the unit to conduct AI-driven analysis, potentially identifying targets for airstrikes in Gaza.

This connection has fueled criticism of Microsoft and other technology companies whose services inadvertently supported military actions resulting in civilian casualties.

The Guardian’s reporting and subsequent internal review prompted Microsoft to conduct an urgent external inquiry. The findings revealed that executives, including CEO Satya Nadella, were unaware of the extent to which Azure had been used to store sensitive intelligence data. Microsoft’s review also noted potential gaps in transparency among Israel-based employees regarding the unit’s use of the platform.

The mass surveillance project underscores how major technology companies have become deeply entwined with military operations in conflict zones.

By providing cloud storage and AI analytics, corporations like Microsoft can inadvertently contribute to state surveillance programs, raising ethical and legal questions. In this case, the termination of access serves as both a corrective measure and a statement of corporate responsibility.

Unit 8200, often compared to the U.S. National Security Agency, leveraged Microsoft’s technology to intercept cellular communications, store data long-term, and run AI-driven analytics.

The project initially targeted the West Bank but expanded to Gaza during heightened military campaigns. Analysts estimate that more than 65,000 Palestinians have been killed during recent operations, with civilians constituting the majority of casualties. The surveillance system’s role in planning military actions has drawn international condemnation.

The United Nations has classified Israel’s actions in Gaza as potentially constituting genocide, citing the systematic targeting of civilian populations. Microsoft’s decision to revoke access to its cloud and AI tools highlights the increasing scrutiny technology companies face when their platforms are misused for mass surveillance.

Microsoft’s termination of services to Unit 8200 does not affect its broader relationship with the Israel Defense Forces (IDF), which remains a longstanding client.

However, the move raises questions about the ethics of storing sensitive military data in foreign cloud platforms and the responsibility of tech providers to monitor usage by clients engaged in human rights violations.

Critics argue that the mass surveillance program exemplifies the dangers of AI-enabled intelligence operations. Automated analysis and cloud storage capabilities amplify the potential for intrusive monitoring, making ethical oversight critical.

Microsoft’s actions set a precedent, signaling that technology companies must carefully assess the implications of their services in conflict zones.

Employees and advocacy groups continue to push for more robust safeguards against misuse of cloud and AI technology. No Azure for Apartheid and similar campaigns demand a complete reevaluation of tech-company involvement in military operations, emphasizing transparency, accountability, and respect for human rights.

The mass surveillance controversy also underscores the geopolitical implications of technology in modern warfare. U.S. corporations providing infrastructure to foreign militaries face increasing scrutiny from both governments and civil society.

Microsoft’s termination of services may influence other tech companies to adopt stricter compliance and ethical review processes when working with military clients.

Unit 8200’s reliance on cloud-based AI for mass surveillance demonstrates the scale and sophistication of modern intelligence operations.

Analysts note that similar systems could emerge in other regions, raising concerns about global surveillance norms and the potential for abuse of civilian data. Microsoft’s decision to cut off access represents a rare example of corporate intervention to prevent misuse of its technology.

The Guardian’s investigation, supported by partner outlets +972 Magazine and Local Call, brought unprecedented transparency to this covert surveillance program.

The reporting highlighted the ethical dilemmas technology companies face when their platforms are co-opted for military intelligence purposes, sparking debates over corporate accountability in the digital age.

Microsoft’s decision to block Israel’s military access to its cloud and AI technology reflects a growing recognition of corporate responsibility in mass surveillance and conflict ethics.

By terminating services to Unit 8200, Microsoft underscores the importance of ethical oversight, compliance with terms of service, and protection of civilian privacy. The case serves as a critical reminder of the intersection between technology, geopolitics, and human rights in the 21st century.
