Child Safety & CSAE Compliance

Contents

  • Overview
  • Prohibition of CSAE Content
  • Reporting Process
  • NCMEC Reporting
  • Prevention Measures
  • Contact Us
Last Updated: January 2, 2026

At Geml, we are committed to providing a safe platform and strictly prohibit any content or behavior that endangers or sexualizes children. This page outlines our policies and procedures for preventing, detecting, and reporting child sexual abuse and exploitation (CSAE) content.

Zero tolerance: We have zero tolerance for child sexual abuse material (CSAM) and child exploitation. Violations result in immediate account termination and reporting to law enforcement.

1. Overview

Geml is designed exclusively for adults aged 18 and over. We implement multiple safeguards to protect children and prevent the use of our platform for any form of child exploitation or abuse.

This policy applies to all users, content, and activities on the Geml platform. It is part of our broader commitment to user safety and complements our Terms & Conditions and Privacy Policy.

2. Prohibition of CSAE Content

Geml strictly prohibits any content that sexualizes or endangers children.

The following activities are absolutely prohibited on our platform and will result in immediate account termination and reporting to appropriate authorities:

2.1 Prohibited Content and Behavior

  • Child Sexual Abuse Material (CSAM): Any visual depiction, description, or content involving the sexual exploitation of minors
  • Child grooming: Any attempt to establish a relationship with a minor for the purpose of sexual abuse or exploitation
  • Solicitation of minors: Any request for sexual content, contact, or meetings with individuals under 18
  • Sexualized content involving minors: Any content that sexualizes, objectifies, or inappropriately portrays children
  • Trading or distribution: Sharing, requesting, or distributing CSAM or links to such content
  • Fantasy content: Written, drawn, or AI-generated content depicting minors in sexual situations
  • Age misrepresentation: Minors creating accounts by falsifying their age

2.2 Age Verification

To prevent minors from accessing our platform, we:

  • Use automated systems to detect potential age misrepresentation
  • Respond immediately to reports of underage users

2.3 Enforcement

When we identify violations of this policy:

  • Accounts are immediately and permanently terminated
  • Content is preserved as evidence for law enforcement
  • Reports are filed with the National Center for Missing & Exploited Children (NCMEC)
  • Law enforcement agencies are notified as required by law
  • Legal cooperation is provided for investigations and prosecutions

3. Reporting Process

We rely on our community to help identify and report content or behavior that violates our child safety policies.

3.1 How to Report CSAE Concerns

If you encounter content or behavior that may involve child exploitation or abuse, report it immediately using one of these methods:

Report Safety Concerns

In-App Reporting: Use the "Report & Safety" feature available in user profiles and chat interfaces

Email: safety@geml.co

Subject Line: Use "URGENT: Child Safety" for immediate priority handling

Include: User ID or profile information, description of the concern, and any relevant screenshots or evidence

3.2 What Happens When You Report

When you submit a report:

  • Immediate review: Our safety team reviews all CSAE reports on the day they are received
  • Content preservation: Reported content is immediately secured for investigation
  • Account action: Violating accounts are suspended pending investigation
  • Reporter protection: Your report is confidential; the reported user will not know who reported them
  • Follow-up: We may contact you for additional information if needed

3.3 External Reporting Resources

You can also report child exploitation directly to these organizations:

  • National Center for Missing & Exploited Children (NCMEC): report.cybertip.org or call 1-800-THE-LOST (1-800-843-5678)
  • FBI Internet Crime Complaint Center: ic3.gov
  • Local law enforcement: Call 911 for immediate threats or emergencies

3.4 False Reports

While we encourage reporting any genuine safety concern, knowingly filing false reports of child exploitation is a serious violation of our Terms & Conditions and may be illegal. False reporting:

  • Wastes resources needed for genuine cases
  • May harm innocent users
  • Can result in account termination
  • May be reported to law enforcement if malicious

4. NCMEC Reporting

Geml reports confirmed instances of child sexual abuse material (CSAM) to the National Center for Missing & Exploited Children (NCMEC) in accordance with applicable law.

4.1 Legal Obligations

Under federal law (18 U.S.C. § 2258A), electronic service providers must report known instances of CSAM to NCMEC's CyberTipline. Geml complies fully with these requirements.

4.2 What We Report

When we identify apparent CSAM on our platform, we file a report with NCMEC that includes:

  • Information about the suspected violator (account details, IP address, location data)
  • Description of the suspected violation
  • Preserved content and metadata
  • Dates and times of the activity
  • Any other relevant information for law enforcement investigation

4.3 Content Preservation

When we report content to NCMEC:

  • Content is preserved for 90 days (or longer if requested by law enforcement)
  • The violating account is permanently terminated
  • Content is not accessible to any users during preservation
  • All evidence is secured using forensic best practices

4.4 Cooperation with Law Enforcement

Following NCMEC reports, we:

  • Fully cooperate with law enforcement investigations
  • Respond promptly to legal requests for information
  • Preserve additional data as requested by authorities
  • Provide testimony or documentation when subpoenaed

5. Prevention Measures

Beyond reactive reporting, we implement proactive measures to prevent child exploitation on our platform:

5.1 Technical Safeguards

  • Age verification: Video-based checks to confirm users are adults
  • Content scanning: Automated detection systems that flag potential CSAM

5.2 Human Review

  • Safety team members review flagged content and reports
  • Regular audits of detection systems and processes
  • Continuous improvement based on emerging threats and trends

5.3 User Education

  • Clear communication of prohibited content in our Terms & Conditions
  • Easy-to-use reporting tools prominently displayed
  • Safety tips and resources provided to all users
  • Transparency about our child safety commitment

6. Contact Us

For questions about this policy or to report safety concerns:

Child Safety Team

Emergency Reports: safety@geml.co

Policy Questions: hello@geml.co

Response Time: Child safety reports are reviewed on the day they are received. Policy questions receive responses within 48 hours.

Note: If you have witnessed or have evidence of child exploitation, we strongly encourage you to also contact law enforcement directly:

  • Call 911 for emergencies
  • Report to NCMEC: report.cybertip.org
  • Contact the FBI: ic3.gov

© 2026 Geml, Inc. All rights reserved.