If you suspect a child is in immediate danger in any way, contact the police immediately.
Disclaimer: The information on this page is for general informational purposes only and is not legal advice. It may not reflect the most current legal developments and should not be relied upon without seeking legal counsel. IFTAS and its members do not endorse or take responsibility for the content of any third-party links provided. Use of this page does not create a solicitor-client relationship.
User Generated Content #
If you operate an ActivityPub service, you are an electronic communications provider, and you likely meet the legal definitions of ESP, ISP, online service provider, and similar terms used in law to describe electronic communications services. Services that federate with third parties, make their content feeds visible to the public, or allow user account creation are liable in all jurisdictions for the content they host and display to end users.
National and Extranational Law #
Legal status of fictional pornography depicting minors #
Scroll down on the linked page to review, for most countries, the legality of real/realistic depictions, fictional depictions, and possession.
Regardless of its legality where your service is based, if content is available to end users in a jurisdiction where such content is illegal, you are liable for its availability there.
Detection #
Services exist that compare stored media against hashes of known material, or perform ML-assisted perceptual matching. For the most part, access to these services is heavily restricted, and you will need to sign legal agreements with the providers.
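As a rough illustration of how hash-based matching works, here is a minimal sketch in Python using the open-source imagehash package as a stand-in for proprietary perceptual hashing. The hash values and distance threshold below are placeholders; production services (PhotoDNA, PDQ, and similar) use their own algorithms and vendor-supplied thresholds.

```python
# Minimal sketch of perceptual hash matching, assuming a locally held
# hash list supplied under a vendor agreement. Requires:
#   pip install imagehash pillow
from PIL import Image
import imagehash

# Placeholder hashes for illustration only -- not real data.
KNOWN_HASHES = [imagehash.hex_to_hash("d1c1b1a191817161")]

MAX_DISTANCE = 5  # Hamming distance threshold; tune per vendor guidance.

def is_match(path: str) -> bool:
    """Return True if the image's perceptual hash is near a known hash."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= MAX_DISTANCE for known in KNOWN_HASHES)
```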
If you are an independent provider and would like to use IFTAS detection services, please fill out this Needs Assessment: https://cryptpad.fr/form/#/2/form/view/thnEBypiNlR6qklaQNmWAkoxxeEEJdElpzM7h2ZIwXA/
CDN #
- Cloudflare’s CSAM Scanning Tool uses hash-matching technology to detect known child sexual abuse material (CSAM) on websites using its services. When CSAM is detected, Cloudflare notifies the website owner or hosting provider, who is responsible for removing the content and reporting it to the appropriate authorities. This tool previously required NCMEC credentials, meaning only US entities could use it; however, this is no longer the case.
Hash and Match APIs #
Generally free of charge, these services let you call an API and receive a classification response (e.g. “CSAM, likely CSAM, unknown”). You will likely be required to sign binding agreements. A generic call pattern is sketched after this list.
- (Canada) Project Arachnid Shield: https://projectarachnid.ca/en/#shield
- (USA) Microsoft PhotoDNA: https://www.microsoft.com/en-us/photodna
- (Netherlands) Instant Image Identifier: https://web-iq.com/solutions/instant-image-identifier-to-fight-csam
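To show the general shape of such an integration, the sketch below is entirely hypothetical: the endpoint URL, authentication scheme, and response fields are invented for illustration. Each provider documents its actual API contract once you have signed its agreement.

```python
# Hypothetical hash-and-match API call; endpoint, auth, and response
# shape are illustrative only, not any provider's real contract.
import requests

API_URL = "https://api.example.org/v1/classify"  # hypothetical endpoint
API_KEY = "your-api-key"                         # issued by the provider

def classify_image(path: str) -> str:
    """Upload an image and return the provider's classification string."""
    with open(path, "rb") as f:
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"image": f},
            timeout=30,
        )
    resp.raise_for_status()
    # Assumed response shape: {"classification": "csam" | "likely-csam" | "unknown"}
    return resp.json()["classification"]
```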
Standalone Platforms #
- Thorn Safer (paid): https://get.safer.io/csam-detection-tool-for-child-safety
- Meta PDQ (open source; a matching sketch follows this list): https://github.com/facebook/ThreatExchange/tree/main/pdq
- AI Horde csam_checker (open source): https://github.com/Haidra-Org/horde-safety/blob/main/horde_safety/csam_checker.py
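If you want to experiment with PDQ locally, the sketch below assumes the community pdqhash Python bindings (pip install pdqhash); Meta's repository ships its own C++ and Java implementations. The distance threshold of 31 bits out of 256 is a commonly cited starting point, not a mandated value, so tune it to your own false-positive tolerance.

```python
# PDQ matching sketch, assuming the community "pdqhash" bindings.
import numpy as np
import pdqhash
from PIL import Image

def pdq_hash(path: str) -> np.ndarray:
    """Compute a 256-bit PDQ hash as a binary numpy vector."""
    image = np.asarray(Image.open(path).convert("RGB"))
    hash_vector, quality = pdqhash.compute(image)  # quality is 0-100
    return hash_vector

def hamming(a: np.ndarray, b: np.ndarray) -> int:
    """Number of differing bits between two PDQ hash vectors."""
    return int(np.count_nonzero(a != b))

# Example: treat distance <= 31 as a likely match (tune as needed).
# known = pdq_hash("known_image.jpg")
# print(hamming(pdq_hash("upload.jpg"), known) <= 31)
```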
ActivityPub Platform-specific #
- (Lemmy) A script that scans a Lemmy pict-rs object store and attempts to detect and remove illegal or unethical content (the general pattern is sketched after this list): https://github.com/db0/lemmy-safety
- (Firefish) Cloudflare configuration: https://socialweb.coop/blog/firefish-cloudflare-quickfix-r2-tutorial/
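For context, the general pattern behind a scanner like lemmy-safety is sketched below: enumerate an S3-compatible bucket that pict-rs stores media in, fetch each object, and run a checker against it. The endpoint, bucket name, credentials, and check_image() stub are hypothetical; see the repository for the real implementation.

```python
# Generic object-storage scanning pattern (hypothetical configuration).
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://object-store.example.org",  # your S3-compatible endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)
BUCKET = "pict-rs"  # hypothetical bucket name

def check_image(data: bytes) -> bool:
    """Stub checker: plug in a hash match or classifier here."""
    return False  # always passes in this sketch

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
        if check_image(body):
            s3.delete_object(Bucket=BUCKET, Key=obj["Key"])
            print("removed", obj["Key"])
```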