Search engines will soon start filtering adult content under new eSafety rules


By Lisa M. Given

Search engines in Australia will soon have to blur pornographic and violent images in some cases, to reduce the chance that children accidentally encounter this content.

This is one of several rules outlined in a new online safety code covering internet search engines that comes into force on December 27.


Here’s what you need to know.

Why are these rules being introduced?

In 2022, Australia’s online safety body, eSafety, surveyed more than 1,000 Australians aged 16 to 18 years.

The research found that one in three were under age 13 when they were first exposed to pornography. This exposure was “frequent, accidental, unavoidable and unwelcome,” with content described by young people as “disturbing” and “in your face”.

The eSafety Commissioner, Julie Inman Grant, has said “a high proportion” of accidental exposure is through search engines, which are “the primary gateway to harmful content”.

The new code was co-developed by the Digital Industry Group Inc – an industry association representing tech companies including Google, Meta and Microsoft – and the Communications Alliance – the peak body of the Australian telecommunications industry.

The code was announced in July 2025, but has been in development since July 2024.


A single breach could result in fines of up to A$49.5 million.

How will account holders’ age be assured?

The code requires providers of internet search engine services, such as Google and Microsoft (which owns Bing), to “implement appropriate age assurance measures for account holders” in Australia by 27 June 2026.

Age checks will identify whether search engine account holders are over or under 18.

Currently, children as young as 13 can create and hold a Google account. Google is the search engine used by more than 90% of Australians.

The industry code of practice outlines several examples of appropriate age assurance strategies companies can choose, such as:

  • photo identification, including digital ID systems
  • facial age estimation
  • credit card checks
  • authentication by a parent of a child account holder’s age
  • use of artificial intelligence (AI) to estimate age from user data

Some of these approaches (such as digital ID and credit card checks) can verify that a person is over 18. However, age estimation, whether based on facial scanning or an AI assessment of user behaviour, can be inaccurate, and parental attestation may also be unreliable.

The implementation of social media restrictions earlier this month has already highlighted the limitations of age assurance technologies.

Children may also use VPNs to get around the restrictions, or use adults’ accounts on shared devices if those accounts are left logged in.

How will this change how Australians use search engines?

Under the code, companies need to filter content for all account holders under age 18 to reduce the risk of Australian children accessing or being exposed to online pornography or harmful material in search engine results.

This means a child who searches for terms such as “porn sites” will not be shown adult material.

The new code also changes search results retrieved when people are not logged in.

In addition to blurring thumbnail images of pornographic and violent material, providers must “prevent autocomplete predictions that are sexually explicit or violent”.

However, while retrieved images of pornographic and violent content will be blurred for searchers who are not logged in, people of all ages will still be able to click through and access that content.

When people search for information on suicide and self-harm, results that promote this content will need to be downranked. Companies will also need to prominently display crisis-prevention information, such as helplines, in results for queries about topics such as self-harm, suicide and eating disorders.

The code will apply to results generated by AI. For example, results generated by Google’s Gemini AI service fall under the code, alongside traditional search results.

A significant challenge in implementing these changes will be defining and identifying what content to filter.

While companies may easily exclude some content (such as well-known pornography websites), content creators can use various strategies to bypass these filters. TikTok creators, for example, have been known to use the term “unalive” to bypass filters that exclude content on “death” or “suicide”.

Conversely, some exclusions can be too broad and mistakenly filter out relevant, helpful content.

For example, if the term “breast” appears as a blocked keyword for potential pornographic content, the system could inadvertently exclude information on breast self-examinations or breast cancer. Robust, additional checks are needed to ensure exclusions and blurring are applied appropriately.
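
To make this challenge concrete, here is a minimal sketch in Python of naive keyword blocklist filtering, the kind of blunt approach the examples above describe. The terms, carve-outs and function name are hypothetical illustrations, not how any search provider actually filters; the sketch simply shows how an evasive spelling like “unalive” slips through, while a blunt keyword like “breast” over-blocks health queries unless exceptions are added.

```python
# A minimal sketch of naive substring blocklist filtering, showing why
# such filters both miss evasive spellings and over-block legitimate
# health queries. All terms and carve-outs here are hypothetical
# illustrations, not the filters any search provider actually uses.

BLOCKED_TERMS = {"porn", "suicide", "breast"}                    # hypothetical blocklist
ALLOWED_PHRASES = {"breast cancer", "breast self-examination"}   # health carve-outs

def is_blocked(query: str) -> bool:
    """Return True if the query should be filtered under these naive rules."""
    q = query.lower()
    # Check carve-outs first so legitimate health information gets through.
    if any(phrase in q for phrase in ALLOWED_PHRASES):
        return False
    return any(term in q for term in BLOCKED_TERMS)

print(is_blocked("porn sites"))               # True: blocked as intended
print(is_blocked("how to unalive yourself"))  # False: evasive spelling bypasses the list
print(is_blocked("breast cancer screening"))  # False, but only because of the carve-out
```

Production systems typically layer machine-learning classifiers and human review over simple keyword lists for exactly these reasons.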

One of several new codes coming in the new year

This is one of nine new age-restricted material codes set to launch in Australia in coming months.

Age checks will also be required for “high-risk” services, such as pornography websites, for downloading apps rated 18+ from app stores, and for messaging and gaming apps.

AI chatbots and other AI services capable of generating content that is sexually explicit, violent, or related to self-harm will also require age checks.

What isn’t known is whether a person’s age will need to be checked multiple times to use the same service.

The eSafety Commissioner explains this will depend on the service and whether it’s classed as “high risk”, noting providers “must balance effectiveness [of age checks] with usability and privacy”. Many age checks may occur without users realising it, based on data companies already hold.

What we do know is the new year will bring new levels of control over how people engage with online services. We will have to wait and see how account holders respond, including whether they gravitate towards logged-out searching to avoid age checks, where they can.

Lisa M. Given, Professor of Information Sciences & Director, Social Change Enabling Impact Platform, RMIT University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

