Elon Musk’s X in Legal Battle Over Terrorist Attack Footage: Christchurch Call Weighs In

Wellington, New Zealand – After a devastating terrorist attack was live-streamed, then-Prime Minister Jacinda Ardern of New Zealand led a campaign against violent extremist content on the internet. The initiative she co-founded, the Christchurch Call, has now engaged with the Australian government over its legal dispute with X, the platform formerly known as Twitter.

The Australian eSafety Commissioner, Julie Inman Grant, is in dispute with X, owned by Elon Musk, over the sharing of footage of an alleged terrorist incident in Sydney’s west. After a stabbing attack at Christ the Good Shepherd Church in Western Sydney, in which Bishop Mar Mari Emmanuel was injured, the footage was classified as depicting abhorrent violent conduct under Australian law.

Under that classification, social media platforms are legally required to remove the content. Although X geoblocked the material in Australia, authorities argue it must be taken down globally, since Australians can still access it using a VPN.

The Christchurch Call was launched five years ago, following the 2019 Christchurch mosque attacks in which 51 people were killed. Ardern, together with French President Emmanuel Macron, founded the initiative to address the spread of violent extremist content online. It has since won support from more than 50 countries and major tech companies.

Musk’s takeover of Twitter in 2022 brought sweeping changes, including the departure of key staff and a markedly looser approach to content moderation. The shift has raised doubts about X’s commitment to initiatives such as the Christchurch Call, as the company breaks with the norms of established social media platforms.

As the legal battle between the eSafety Commissioner and X continues, experts see the case as a crucial test of online-safety legislation worldwide. Countries everywhere are grappling with how to regulate online content, particularly in the aftermath of violent incidents like the one in Sydney.

Efforts to remove extremist material have prompted cooperation between regulators and tech companies. Meta, for instance, has complied with removal notices, recognizing that good relations with regulators matter to its business.

The dispute underscores the ongoing global challenge of combating violent extremist material online. As governments, tech companies, and civil society organizations work together, robust regulation and sustained collaboration on online safety remain paramount.