Changes are coming to this article
This article will be updated with recently announced changes.
We’re clarifying our User-generated content policy to explain how different UGC experiences may require different in-app moderation efforts. (effective from 31 January 2024)
To preview the updated 'User-generated content' article, visit this page.
User-generated content (UGC) is content that users contribute to an app, and which is visible to or accessible by at least a subset of the app's users.
Apps that contain or feature UGC, including apps which are specialised browsers or clients to direct users to a UGC platform, must implement robust, effective and ongoing UGC moderation that:
- Conducts UGC moderation, as is reasonable and consistent with the type of UGC hosted by the app;
  - In the case of augmented reality (AR) apps, UGC moderation (including the in-app reporting system) must account for both objectionable AR UGC (for example, a sexually explicit AR image) and sensitive AR anchoring locations (for example, AR content anchored to a restricted area, such as a military base, or to private property where AR anchoring may cause issues for the property owner);
- Provides an in-app system for reporting objectionable UGC and users, and takes action against that UGC and/or user where appropriate;
- Provides an in-app system for blocking UGC and users;
- Provides safeguards to prevent in-app monetisation from encouraging objectionable user behaviour.
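The reporting and blocking requirements above can be sketched as a small per-user moderation model. This is an illustrative example only, not code from the policy; the class and method names (`ModerationState`, `reportContent`, `blockUser`, `isVisible`) are hypothetical.

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch of the per-user moderation state a UGC app might
// keep: users can report content and block other users, and content
// authored by a blocked user is hidden from the blocker's feed.
public class ModerationState {
    private final Set<String> blockedUsers = new HashSet<>();
    private final Set<String> reportedContent = new HashSet<>();

    // Record a report against a piece of content; a moderation queue
    // (not shown) would review it and take action where appropriate.
    public void reportContent(String contentId) {
        reportedContent.add(contentId);
    }

    // Block another user so their content is no longer shown.
    public void blockUser(String userId) {
        blockedUsers.add(userId);
    }

    // Feed filter: hide content authored by users this user has blocked.
    public boolean isVisible(String authorId) {
        return !blockedUsers.contains(authorId);
    }

    public boolean isReported(String contentId) {
        return reportedContent.contains(contentId);
    }

    public static void main(String[] args) {
        ModerationState state = new ModerationState();
        state.reportContent("post-42");
        state.blockUser("user-7");
        System.out.println(state.isVisible("user-7")); // prints false
        System.out.println(state.isVisible("user-9")); // prints true
    }
}
```

In a real app this state would be backed by a server-side moderation queue rather than an in-memory set, so that reports are actually reviewed and actioned.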
Incidental sexual content
Sexual content is considered 'incidental' if it appears in a UGC app that (1) provides access to primarily non-sexual content, and (2) does not actively promote or recommend sexual content. Sexual content that is illegal under applicable law, and child endangerment content, are not considered 'incidental' and are not permitted.
UGC apps may contain incidental sexual content if all of the following requirements are met:
- Such content is hidden by default behind filters that require at least two user actions to disable completely (for example, content placed behind an obfuscating interstitial, or precluded from view by default unless 'safe search' is disabled).
- Children, as defined in the Families policy, are explicitly prohibited from accessing your app using age screening systems such as a neutral age screen or an appropriate system as defined by applicable law.
- Your app provides accurate responses to the content rating questionnaire regarding UGC, as required by the content ratings policy.
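The "two user actions" requirement above can be sketched as a simple gate: sensitive content stays obscured until the user performs two distinct, deliberate actions. This is an illustrative sketch only; the names (`ContentGate`, `disableSafeSearch`, `confirmInterstitial`) are hypothetical and not defined by the policy.

```java
// Hypothetical sketch of a two-action gate for incidental sexual content:
// the content is hidden by default and is shown unobscured only after the
// user (1) disables 'safe search' in settings AND (2) explicitly confirms
// through an obfuscating interstitial.
public class ContentGate {
    private boolean safeSearchDisabled = false;    // action 1: settings toggle
    private boolean interstitialConfirmed = false; // action 2: explicit confirm

    public void disableSafeSearch() {
        safeSearchDisabled = true;
    }

    public void confirmInterstitial() {
        interstitialConfirmed = true;
    }

    // Both actions must have occurred before sensitive content is shown.
    public boolean showSensitiveContent() {
        return safeSearchDisabled && interstitialConfirmed;
    }

    public static void main(String[] args) {
        ContentGate gate = new ContentGate();
        System.out.println(gate.showSensitiveContent()); // prints false
        gate.disableSafeSearch();
        System.out.println(gate.showSensitiveContent()); // prints false
        gate.confirmInterstitial();
        System.out.println(gate.showSensitiveContent()); // prints true
    }
}
```

The point of the design is that no single tap or default setting can reveal the content; each of the two flags maps to a separate, deliberate user action.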
Apps whose primary purpose is featuring objectionable UGC will be removed from Google Play. Similarly, apps that end up being used primarily for hosting objectionable UGC, or that develop a reputation among users as a place where such content thrives, will also be removed from Google Play.
Here are some examples of common violations:
- Promoting sexually explicit user-generated content, including implementing or permitting paid features that principally encourage the sharing of objectionable content.
- Apps with user-generated content (UGC) that lack sufficient safeguards against threats, harassment or bullying, particularly toward minors.
- Posts, comments or photos within an app that are primarily intended to harass or single out another person for abuse, malicious attack or ridicule.
- Apps that continually fail to address user complaints about objectionable content.