Please check out the Community Guidelines, a quick guide to YouTube policies that will help you understand what video content is allowed on our site.
How do we implement YouTube's content policies?
The standard of behavior we expect of YouTube users is set out in our Community Guidelines. If a user posts a video or otherwise behaves on the site in a way that violates these Guidelines, we will generally remove the offending material and apply a Community Guidelines strike against the user's account.
Ways to alert us to potentially problematic content include:
- our flagging tool, found below the video player
- our Help & Safety Tool
- our privacy complaint process
- our legal issues process
When we remove content for violating our policies, the user who posted it receives a strike. The type of strike depends on the reason for the removal: copyright strikes are separate from Community Guidelines strikes.
In either case, the user is notified via email and via an alert that appears the next time the user logs in to YouTube.
Community Guidelines strikes last for six months from the date they are received. Notice is provided by email and at next log-in; for reference, copies of the notices are also logged within a user's account. Accrual of strikes results in penalties as follows:
- First Strike: The first strike on an account is considered a warning.
- Second Strike: If an account receives two strikes within a six-month period, the ability to post new content to YouTube from that account is disabled for two weeks. If there are no further issues, full privileges are restored automatically after the two-week period.
- Third Strike: If an account receives a third Community Guidelines strike within six months (before the first strike has expired), the account is terminated.
When a user has posting privileges temporarily disabled on one account, that user is also prohibited from posting material to YouTube using any other account for the duration of the suspension. Attempts to circumvent this rule may result in the immediate termination of all of the user's accounts, without warning. If you feel that your video was removed without just cause, you can appeal the strike on your account. Please click here to learn more.
Sometimes a video is removed for the safety and privacy of the user who posted it, or due to a first-party privacy complaint, court order, or similar issue. In these cases the user will not receive a strike and the account will not be penalized.
Copyright strikes are counted separately from Community Guidelines strikes. A copyright strike may expire after six months, subject to certain conditions, provided no additional copyright notifications are received during that time. Notice is provided by email and at next log-in; for reference, copies of the notices are also logged within a user's account. Accrual of three copyright strikes leads to account termination. A copyright strike can only be resolved if (a) the user submits a counter-notification and prevails in that process, or (b) we receive a message directly from the original claimant retracting the claim. Please note that there may be adverse legal consequences to filing a false counter-notification.
DISCLAIMER: WE ARE NOT YOUR ATTORNEYS, AND THE INFORMATION WE PRESENT HERE IS NOT LEGAL ADVICE. WE PRESENT THESE MATERIALS FOR INFORMATIONAL PURPOSES ONLY.
For more on our copyright policies, including information regarding the counter-notification process, please see here.
Accounts may be terminated due to:
- repeated claims of copyright infringement
- a single case of severe abuse (such as predatory behavior or spam)
For more information on account termination, please see here.
When videos violate our Community Guidelines, we remove them. Some videos don't violate our policies, but may not be appropriate for all audiences. We age-restrict these.
When a video is age-restricted, a warning screen displays before the video plays. Only users 18 years of age or older can then proceed to view the material. In order to reduce the chances of users accidentally stumbling across these videos, they are not shown in certain sections of YouTube (e.g. honors pages like 'Most Viewed').
In deciding whether to age-restrict content, we consider issues such as violence, disturbing imagery, nudity, sexually suggestive content, and the portrayal of dangerous or illegal activities.
There are exceptions for some educational, artistic, documentary and scientific content (e.g. health education, documenting human rights issues, etc.), but only if this is the sole purpose of the video and it is not gratuitously graphic. For example, a documentary on breast cancer would be appropriate, but posting clips out of context from a documentary might not be. Videos that qualify as educational, artistic, documentary or scientific that would otherwise have been removed are typically age-restricted instead.
When are videos considered 'sexually suggestive'?
Videos featuring sexually explicit content like real sex acts are not allowed. Other content, like nudity and dramatized or implied sexual conduct, may be considered sexually suggestive depending on whether it is intended or designed to arouse viewers. Nudity includes exposed or partially covered genitalia, buttocks, or breasts, as well as sheer clothing. Videos featuring individuals in minimal or revealing clothing may also be age-restricted if they're intended to elicit a sexual response.
Additional considerations include a combination of:
- Whether breasts, buttocks, or genitals (clothed or unclothed) are the focal point* of the video;
- Whether the video setting is sexually suggestive (e.g. a location generally associated with sexual activity, such as a bed);
- Whether the subject is depicted in a pose that is intended to sexually arouse the viewer;
- Whether the subject's actions in the video suggest a willingness to engage in sexual activity (e.g. kissing, provocative dancing, fondling); and
- If a subject is minimally clothed, whether the clothing would be acceptable in appropriate public contexts (e.g. swimwear vs. underwear).
* Focal point is determined by factors including the length of time an image appears in the video (fleeting vs. prolonged exposure) especially relative to the overall length of the video, the camera angle and focus, the relative clarity of the images in the video, the lighting, and the video thumbnail (content that appears in a thumbnail is also considered to be its focal point).
When are videos considered too dangerous for minors to view?
While it might not seem fair to say you can't show something because of what viewers theoretically might do in response, we draw the line at content that's intended to incite violence, encourage dangerous or illegal activities, or show activities that carry an inherent risk of serious physical harm or death. Depending on the severity of such content, we may remove it from the site or restrict access to viewers 18 and over to help ensure that the content is reaching the right audience. The decision whether to remove or restrict is influenced by the style of the video in question: are the depictions documentary in nature, or designed to encourage others to imitate them?
Videos involving children (anyone under the age of 18) are particularly sensitive. Videos containing children should never be violent. Children should not be shown participating in dangerous or illegal activities. Videos that contain this content may be removed from the site.
When are videos considered too gory or disturbing?
If you post content that contains violence, it's important to provide an appropriate amount of information about it. Some content is simply too gory, and these videos will be removed from the site. In other cases, the content may not be suitable for all audiences browsing YouTube. Similar to movie or television ratings, our age restrictions help viewers avoid content that they may not feel is acceptable for themselves or for their children.
If the violence being shown in your video has historical, educational or documentary value, please make sure to post as much information as possible in the title and metadata to help the viewer understand what they are seeing. Providing context through additional information can help the viewer understand why they may be seeing the disturbing content.
Some YouTube content partners choose to make their videos available only to certain countries. For instance, they may only have the licensing rights for a particular region.
Sometimes when a video is restricted in a particular region, the uploader cannot view the video, though users in other regions can view and interact with it as usual. The uploader can still view, moderate, and respond to comments on the video from the all comments page within their account.