What is CSAM?
What is Google’s approach to combating CSAM?
How does Google identify CSAM on its platform?
We invest heavily in fighting child sexual exploitation online and use technology to deter, detect, and remove CSAM from our platforms. We combine automated detection with human review, and we also act on reports submitted by our users and by third parties such as NGOs. We deploy hash matching, including YouTube’s CSAI Match, to detect known CSAM, and we deploy machine learning classifiers to discover never-before-seen CSAM, which is then confirmed by our specialist review teams. Detecting never-before-seen CSAM helps the child safety ecosystem in a number of ways, including by identifying child victims in need of safeguarding and by contributing new hashes that expand our ability to detect known CSAM. Building on our classifiers, Google created the Content Safety API, which we provide to others to help them prioritize abusive content for human review.
Both CSAI Match and the Content Safety API are available to qualifying entities that wish to fight abuse on their platforms; please see here for more details.
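As a rough illustration of how these pieces fit together, here is a minimal Python sketch of hash matching combined with classifier-based prioritization. It is not Google’s implementation: CSAI Match and the Content Safety API are proprietary, `KNOWN_HASHES` is a placeholder hashset, and the score threshold is an assumed value.

```python
import hashlib

# Placeholder hashset of fingerprints of known CSAM. Production systems such
# as CSAI Match use proprietary perceptual hashes that survive re-encoding;
# a plain cryptographic digest is used here only to keep the sketch simple.
KNOWN_HASHES: set[str] = set()

def fingerprint(media_bytes: bytes) -> str:
    """Return a hex digest used as this sketch's content fingerprint."""
    return hashlib.sha256(media_bytes).hexdigest()

def triage(media_bytes: bytes, classifier_score: float) -> str:
    """Route content into review buckets.

    A known-hash match is flagged outright; otherwise a classifier score
    decides how urgently human reviewers should examine the content.
    """
    if fingerprint(media_bytes) in KNOWN_HASHES:
        return "known_csam"        # hash match against the known hashset
    if classifier_score >= 0.9:    # assumed threshold, for illustration only
        return "priority_review"   # potentially never-before-seen material
    return "standard_review"
```

In a real pipeline, content that specialist reviewers confirm as new CSAM would be hashed and added back to the hashset, which is the feedback loop described above.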
What does Google do when it detects CSAM on its platform?
What is a CyberTipline report and what type of information does it include?
Once Google becomes aware of apparent CSAM, we make a report to NCMEC. These reports are commonly referred to as CyberTipline reports, or CyberTips. We also attempt to identify cases involving hands-on abuse of a minor, production of CSAM, or child trafficking; in those instances, we send a supplemental CyberTip report to NCMEC to help prioritize the matter. A report sent to NCMEC may include information identifying the user, the minor victim, and/or other helpful contextual facts.
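Purely to illustrate the categories of information mentioned above, the hypothetical sketch below groups the kinds of fields a report might carry. It is not NCMEC’s actual CyberTipline schema, and every name in it is assumed.

```python
from dataclasses import dataclass, field

@dataclass
class CyberTipSketch:
    """Hypothetical grouping of the report contents described above."""
    user_identifiers: dict[str, str]    # information identifying the user
    victim_information: dict[str, str]  # information identifying the minor victim
    contextual_facts: list[str] = field(default_factory=list)  # other helpful context
    supplemental: bool = False          # True when escalating suspected hands-on
                                        # abuse, CSAM production, or trafficking
```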
Below are some examples of the real-world impact of CyberTip reports Google has submitted. They offer a glimpse of the wide range of reports we make, but they are not comprehensive.
- A Google CyberTip reported numerous pieces of CSAM involving elementary school children, taken in a classroom setting. Some of the reported CSAM had not previously been identified by Google and appeared to have been produced by the Google Account holder. NCMEC forwarded the report to law enforcement, which led to the identification and safeguarding of two minor children depicted in the reported imagery.
- A Google CyberTip reported the solicitation and production of CSAM by an account holder, who requested numerous videos depicting the hands-on abuse of dozens of minor boys in exchange for money. NCMEC forwarded the report to law enforcement. The account holder was convicted of producing CSAM, and several dozen children were identified and safeguarded from ongoing abuse.
- A Google CyberTip reported a single piece of known CSAM, which led to the apprehension of the account holder. According to law enforcement, the account holder was found to be in possession of much more CSAM, was directly involved in the hands-on abuse of minors in their care, and provided those minors for others to abuse as well. Through the efforts of Google, NCMEC, and law enforcement, three children were rescued from sexual abuse.
- A Google CyberTip reported CSAM produced by the Google Account holder and solicited from minors the account holder had access to online. The account holder was later apprehended and determined by law enforcement to hold a position of trust as a medical professional: they used that position to abuse patients in their care and to solicit the production of CSAM from minors they had direct access to online.
What does Google do to deter users from seeking out CSAM on Search?
How does Google contribute to the child safety ecosystem to combat CSAM?
Google’s child safety team builds technology that accurately detects, reports, and removes CSAM to protect our users and prevent children from being harmed on Google products. To give the broader ecosystem access to this technology, we developed the Child Safety toolkit, which we share with the rest of the industry to help prevent the online proliferation of child sexual abuse material. Additionally, we provide Google’s Hash Matching API to NCMEC to help them prioritize and review CyberTipline reports more efficiently, allowing them to home in on the reports involving children who need immediate help.
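As a sketch of what hash-assisted triage of incoming reports could look like, the snippet below assumes a hypothetical report format and a local hashset. The actual Hash Matching API surface is not public, and the prioritization policy shown (surfacing reports with previously unseen media first) is an assumption, not NCMEC’s documented workflow.

```python
def unseen_media_count(report: dict, known_hashes: set[str]) -> int:
    """Count media hashes in a report that do NOT match the known hashset."""
    return sum(1 for h in report.get("media_hashes", []) if h not in known_hashes)

def prioritize(reports: list[dict], known_hashes: set[str]) -> list[dict]:
    """Order reports so those with the most previously unseen media come first,
    on the assumption that new imagery may point to a child not yet identified."""
    return sorted(reports, key=lambda r: unseen_media_count(r, known_hashes), reverse=True)
```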
We are also an active member of several coalitions, such as the Technology Coalition, the ICT Coalition, the WeProtect Global Alliance, INHOPE, and the Fair Play Alliance, that bring companies and NGOs together to develop solutions that disrupt the exchange of CSAM online and prevent the sexual exploitation of children.