Child Safety Policy
Last Updated: March 20, 2025
At Clapper, we are committed to ensuring a safe and secure environment for all users, particularly with respect to the safety and protection of minors. As an 18+ social platform, we strictly prohibit any form of Child Sexual Abuse and Exploitation (CSAE), including Child Sexual Abuse Material (CSAM) and sexualized depictions of minors.
To maintain a safe and respectful community, the following Child Safety guidelines and measures are enforced:
1. Prohibited Content
Our platform strictly prohibits all forms of content and behavior listed below:
1.1 Inappropriate Interactions with Minors
We prohibit any interaction that could lead to the sexual abuse or exploitation of minors, including but not limited to:
- Grooming: Any behavior aimed at establishing an emotional connection with a minor for the purpose of sexual exploitation. This includes, but is not limited to: attempting to form a trusting relationship through online chats, video calls, text messages, or other private communication, with the intention of manipulating or controlling the minor into engaging in sexual activity.
- Sexualized Messaging: Any form of communication—whether verbal, visual, or through video—that places a minor in a sexualized context. This includes sending sexually suggestive or inappropriate private questions or comments, or referencing sexual behavior in inappropriate contexts.
- Sexual or Erotic Behavior: Any behavior or language, including references to sexual topics, bodily functions, or intimate relationships, intended to provoke or incite minors to engage in inappropriate sexual conduct.
1.2 Sexualized Depictions of Minors
We prohibit any form of sexualized depiction of minors, whether real or simulated. This includes:
- Real or Fictional Depictions of Sexualized Minors: Any content featuring minors in a sexualized or provocative manner (e.g., nudity or revealing clothing), including real or digitally manipulated images, videos, or animations, whether the content is real or simulated.
- Simulated Sexual Acts: Any content that simulates minors engaging in sexual acts or sexual contact, or portrays inappropriate sexual behavior, even if the content is virtual or digitally altered.
- Sexualized Minor Characters: Any content that involves minors or childlike characters in sexualized scenarios, such as provocative dancing, sexually suggestive behaviors, or other inappropriate themes.
1.3 CSAM Trafficking and Advertising
We absolutely prohibit any behavior related to the trafficking, sharing, or advertisement of Child Sexual Abuse Material (CSAM), including:
- CSAM Trafficking and Distribution: Any attempt to share, distribute, or trade CSAM on the platform, including but not limited to sharing links, private messages, file transfers, or live streaming CSAM content.
- CSAM Advertising or Promotion: Any attempt to advertise, promote, or encourage others to access or purchase CSAM, including sharing links to illegal websites or content, or promoting CSAM through any form of advertisement.
1.4 Combining Child-Friendly Themes with Adult Content
We explicitly prohibit the combination of child-friendly or family-friendly themes with adult content, including:
- Violence and Pornography Combination: Any content that merges child-friendly or family-appropriate themes with excessive violence or pornography. For example, using cartoon characters or children’s images alongside adult violence, abuse, or explicit sexual content.
- Sexualized Jokes or Mockery: Any content that combines child-friendly themes (such as cartoons, animal characters, or children’s shows) with adult sexualized content, crude humor, or inappropriate jokes.
- Sexualizing Child Characters: Using child or adolescent characters in sexualized scenes, or placing them in adult entertainment contexts or other inappropriate scenarios.
1.5 Content That May Attract Children but Contains Adult Themes
We restrict content that may appeal to children but includes adult themes or harmful material, such as:
- Excessive Violence and Gore: Content that includes graphic violence, blood, or other disturbing imagery that could be inappropriate for children, especially if the content is presented in a manner that is appealing to a younger audience.
- Depictions or Encouragement of Harmful or Dangerous Activities: Content that encourages or portrays activities that are dangerous, harmful, or illegal, including but not limited to self-harm, substance abuse, or reckless behavior.
- Promoting Negative Body or Self-Image: Content that promotes body shaming, unrealistic beauty standards, or encourages harmful behaviors (e.g., extreme dieting, excessive cosmetic surgery, or harmful body modifications) that can negatively impact children’s mental health or body image.
2. Detection and Moderation
We take a multi-layered approach to detecting and addressing CSAE content:
- AI Technology: Our platform uses third-party AI tools such as Hive and AWS to detect and flag potential underage users. This system helps ensure that underage users do not participate in inappropriate activities or expose themselves to harmful content.
- Human Review: All user-generated videos and livestreams on our platform are manually reviewed by our content review team to identify any inappropriate or illegal content.
- User Reporting System: We provide users with an easy and accessible reporting mechanism through which they can report any concerns about CSAE-related content or underage users. Reports are reviewed by our content review team as part of our swift response protocol.
3. Response Protocols
In the event that CSAE-related content or activity is detected or reported, we will take the following actions:
- Immediate Content Removal: Verified CSAM or CSAE-related content will be removed from the platform within 24 hours of identification.
- Account Suspension: Users found violating these policies will face account suspension, and repeat offenders may be permanently banned from the platform.
- Notification to Authorities: In compliance with legal requirements, we will immediately escalate CSAE cases to the U.S. National Center for Missing & Exploited Children (NCMEC) and, where applicable, to the relevant regional authorities.
4. Contact Information
For CSAE-related inquiries, you may contact our dedicated compliance team at:
contact@clapperapp.com.