Australia’s Rushed Social Media Laws: Too Much Ambiguity?

This article continues the discussion we had in ‘Should Social Media Platforms Moderate Their Content More?’ by asking if the new social media laws will achieve their purpose.

We’ll assess the new social media laws to determine whether they meet their objectives. Thereafter, we’ll ask how we can regulate social media to force the moderation of harmful content.

What Do The New Social Media Laws Say?

The Sharing of Abhorrent Violent Material Bill’s objective is to ensure that internet, content and hosting service providers quickly take action to remove ‘abhorrent violent material’.

What is ‘abhorrent violent material’?

This term encompasses any audio and/or visual material that records or streams certain violent conduct, such as a terrorist act or murder. Furthermore, the material needs to be:

  • such that a reasonable person would find it offensive; and
  • produced by a person who has any involvement in the conduct.

Offences Under The Law

Generally, social media platforms commit a crime if they:

  • fail to report abhorrent violent material; or
  • fail to stop people from accessing abhorrent violent material.

Platforms can only commit a crime if they realised that hosting the content would cause harm but allowed access to it anyway.

Defences

The laws have a few exceptions, such as artistic works and news reports that are in the ‘public interest’, though the defendant bears the burden of proving they fall within an exception. Engaging a lawyer is strongly advised.

Does The Law Meet Its Purpose?

The social media laws are a step in the right direction. The eSafety Commissioner can issue a notice to a social media platform about harmful content, requiring it to take steps to remove that content.

Potential Ambiguities

However, the laws are ambiguous in their requirements, meaning social media platforms like YouTube and Facebook may not have violated them when they recently failed to prevent the spread of harmful content.

You’d need to show that, for example, the platform was aware of the abhorrent violent material and was reckless in allowing access to it. But Facebook may not have been aware of the live-stream until it was reported twelve minutes after it began.

Moreover, you’d need to show that they didn’t ensure fast removal of the material. The issue here lies in defining both ‘the material’ and ‘expeditious’. For instance, would you need to show that the platform quickly removed the original video, that it was quickly removing copies, or that the original video and every copy were quickly removed from the service?

On the other hand, how fast is ‘expeditious removal’? Facebook claimed it removed 1.5 million videos, and YouTube expanded its algorithms to increase takedown speed, yet without a defined standard it’s unclear whether either response would qualify.

Jurisdiction

Holding an international company liable is difficult. Platforms such as 4chan typically prioritise free speech over preventing harmful content, but policing platforms based in the US may conflict with the First Amendment of the US Constitution.

Yet, we clearly need some form of regulation to prevent the spread of harmful content, so how can we achieve this?

How Should We Regulate Social Media Content?

Firstly, we’ve previously discussed the problems with Facebook’s complex reporting system. A simpler, clearer reporting system that assures users their reports matter would help remove content that social media algorithms fail to detect.

In response to criticism, Facebook established an independent body to monitor its content moderation and to improve how effectively it identifies copies of harmful content.

Mark Zuckerberg also proposed a third-party body ‘to set standards governing distribution of harmful content and to measure companies against those standards’. Such a regulator could also apply the Productivity Commission’s recommendation for adaptive, risk-based regulation.

Ultimately, this is a global issue. Enforcing standards on global organisations can be difficult without international cooperation. The EU has already taken this step by proposing collaboration between service providers and member states.

Conclusion

Social media regulation is necessary, but the new laws may not be very effective due to their vagueness and lack of enforceability. Regulation needs to be more specific to achieve its purpose.

We have other options available to us, but a global approach would best ensure that global companies can be held accountable for harmful content on their platforms.

Unsure where to start? Contact a LawPath consultant on 1800 529 728 to learn more about customising legal documents and obtaining a fixed-fee quote from Australia’s largest legal marketplace.
