Learning more about our child safety standards policy

Google Play takes the safety of children on our platform seriously and is committed to working to keep our store free of child sexual abuse and exploitation. We require apps in the social and dating categories to comply with our Child safety standards policy.

Overview

The Child safety standards policy requires in-scope apps to:

  • publish standards prohibiting child sexual abuse and exploitation (CSAE); 
  • provide an in-app mechanism for user feedback; 
  • take appropriate action to address child sexual abuse material (CSAM); 
  • designate a CSAM point of contact; and 
  • complete the child safety declaration in Play Console. 

Please read the policy in full and make sure that you understand and comply with it. Developers who are not in compliance by the deadline may be subject to enforcement actions.

Timeline information

We anticipate the following timeline for rollout of the Child safety standards policy. Note that this is subject to change; updates will be posted in this article. 

  • April 2024: We announced the new Play child safety standards policy 
  • The declaration form will be available for in-scope apps in Play Console later this year. 
  • 31 January 2025: In-scope apps must comply with the new Play child safety standards policy. Non-compliant apps may face additional enforcement actions in the future, such as removal from Google Play.

Frequently asked questions


What category of apps are in scope for this policy?

Apps in the social and dating categories are currently in scope for this policy.

How do I know if my app is considered part of the social or dating category?

A social app is an app that declares itself as a 'social' app in Play Console or lists itself within the 'social' category on Google Play.

A dating app is an app that declares itself as a 'dating' app in Play Console or lists itself within the 'dating' category on Google Play.

What if my app is not for children or does not allow child users? What if my app is just for adults? What if my app is age-gated?

The presence or absence of child users in your app is irrelevant to this policy. If your app meets the criteria above, then it is within the scope of this policy and must comply with its requirements.

How do you define CSAE?

CSAE refers to child sexual abuse and exploitation, including content or behaviour that sexually exploits, abuses or endangers children. This includes, for example, grooming a child for sexual exploitation, sextorting a child, trafficking of a child for sex or otherwise sexually exploiting a child.

How do you define CSAM?

CSAM stands for child sexual abuse material. It is illegal, and our Terms of Service prohibit using Google products and services to store or share this content. CSAM consists of any visual depiction, including but not limited to photos, videos and computer-generated imagery, involving the use of a minor engaging in sexually explicit conduct. For more information, visit the Transparency Report Help Centre.

What are the requirements for the CSAE published standards?

The published standards should be a web resource that is globally accessible for any individual to learn about your policies and standards around CSAE. The web resource must: 

  • be functional (for example, load without error); 
  • be relevant in scope (for example, mention CSAE or child safety); and 
  • reference the app or developer name (that is, as it appears on your store listing on Google Play). 

You can offer this in many ways, such as through a Help Centre, policy page, Terms of Service, community guidelines or similar resource. We recommend using anchor links and clearly laying out these standards. You must provide a link to these published standards in Play Console.

What kinds of in-app mechanisms should my app have? Can users report through an email or form?

By in-app feedback mechanism, we are referring to any mechanism that is available within your app for users to communicate their concerns to you. You may choose your preferred in-app method so long as users can access it without leaving the app. This may include, but is not limited to, a comprehensive in-app user feedback experience, a support email or chat channel for reports. You must certify that you have an in-app mechanism in Play Console.

What does it mean to take 'appropriate action' to address CSAM?

'Taking appropriate action to address CSAM' means acting in accordance with your published standards and relevant laws. For example, removing CSAM when you obtain actual knowledge of it in your app.

Our standards do not mandate a particular methodology, but we do expect developers to act in accordance with their stated policies, procedures and applicable laws.

Do these standards align with global norms on child safety standards?

Yes. Our standards are also inspired by the Tech Coalition child safety standards. Please seek guidance from your legal team(s) or advisor(s) for regulatory compliance on CSAM and child safety matters.

Who should be my designated CSAM point of contact? Does it need to be a specific role?

Please provide a name and contact information for an individual who is ready and able to speak to your organisation’s CSAM prevention practices and compliance with this policy, should our team need to be in touch. Your CSAM point of contact can serve in a variety of positions or teams within your company. You must designate your point of contact in Play Console.
