Child Safety Standards

Last updated: 11/1/2025

Overview

Vimli takes the safety of children on our platform seriously and is committed to keeping our service free of child sexual abuse and exploitation. As a social networking platform, we are required to comply with Google Play's Child Safety Standards policy. We are dedicated to maintaining a safe environment for all users, especially children.

This document outlines our standards against child sexual abuse and exploitation (CSAE), our mechanisms for reporting concerns, and our commitment to addressing child safety issues in accordance with applicable laws.

Our Standards Against Child Sexual Abuse and Exploitation (CSAE)

Vimli has zero tolerance for any form of child sexual abuse and exploitation (CSAE): any content or behavior that sexually exploits, abuses, or endangers children. This includes, but is not limited to:

  • Grooming a child for sexual exploitation
  • Sextorting a child
  • Trafficking a child for sex
  • Otherwise sexually exploiting a child
  • Creating, distributing, or sharing child sexual abuse material (CSAM)

We prohibit any content, behavior, or activity that violates child safety laws or our terms of service. Any user found to be engaging in such activities will be immediately banned from our platform, and we will report all instances to relevant law enforcement authorities and appropriate reporting organizations.

Child Sexual Abuse Material (CSAM)

CSAM stands for child sexual abuse material: any visual depiction, including but not limited to photos, videos, and computer-generated imagery, involving the use of a minor engaging in sexually explicit conduct. CSAM is illegal, and our Terms of Service prohibit using Vimli to store or share it.

We actively work to prevent CSAM from appearing on our platform through:

  • Automated detection systems
  • Content moderation procedures
  • User reporting mechanisms
  • Regular platform audits
  • Cooperation with law enforcement and child safety organizations

When we obtain actual knowledge of CSAM on our platform, we take immediate appropriate action, including:

  • Immediate removal of the content
  • Permanent ban of the user account
  • Reporting to the National Center for Missing & Exploited Children (NCMEC) or equivalent authorities in applicable jurisdictions
  • Reporting to relevant law enforcement agencies
  • Preservation of evidence for investigation

In-App Reporting Mechanism

Vimli provides an in-app feedback mechanism that allows users to report concerns related to child safety or any inappropriate content or behavior. Users can access this mechanism directly within the app without leaving the application.

To report a concern:

  • Access the reporting feature through the user profile menu or content options
  • Select the type of concern you are reporting
  • Provide relevant details and context
  • Submit your report for review

All reports are reviewed promptly by our dedicated safety team. Reports related to child safety are prioritized and handled with the utmost urgency. We maintain strict confidentiality and take all reports seriously.

For urgent child safety concerns, please also contact us directly at the email address provided in the Contact section below.

Compliance with Child Safety Laws

Vimli complies with all applicable child safety laws and regulations, including but not limited to:

  • Laws prohibiting the creation, distribution, or possession of CSAM
  • Laws against child exploitation and trafficking
  • Laws requiring reporting of suspected child abuse
  • Age verification and consent laws
  • Data protection laws protecting children's privacy

We regularly review and update our policies and procedures to ensure continued compliance with evolving legal requirements and industry best practices. Our standards are also inspired by the Tech Coalition Child Safety Standards.

Age Restrictions and User Eligibility

Vimli is designed for users aged 13 and older. While our platform has age restrictions in place, we recognize that child safety is paramount regardless of the intended user base. Even though our service is not directed at young children, we maintain comprehensive child safety measures because:

  • Age verification systems may not be perfect
  • Children may misrepresent their age
  • Adults may attempt to exploit children through our platform
  • Prevention and detection are essential regardless of user demographics

We implement multiple layers of protection, including age-gating features, content moderation, and proactive monitoring to prevent child exploitation.

Training and Awareness

All Vimli team members who handle user reports or content moderation receive specialized training on:

  • Recognizing signs of child exploitation
  • Identifying CSAM and prohibited content
  • Proper reporting procedures
  • Legal requirements and obligations
  • Trauma-informed response protocols

We are committed to continuous improvement and regularly update our training programs to reflect the latest best practices and legal requirements.

Partnerships and Collaboration

Vimli works closely with various organizations and authorities to enhance child safety:

  • Law Enforcement: We cooperate fully with law enforcement investigations
  • NCMEC: We report CSAM to the National Center for Missing & Exploited Children
  • Industry Partners: We participate in industry-wide initiatives to combat online child exploitation
  • Child Safety Organizations: We consult with experts in child protection to improve our policies

Frequently Asked Questions

What is CSAE?

CSAE refers to child sexual abuse and exploitation, including content or behavior that sexually exploits, abuses, or endangers children. This includes grooming, sextortion, trafficking, or any other form of sexual exploitation of minors.

What is CSAM?

CSAM stands for child sexual abuse material. It is illegal material consisting of any visual depiction, including photos, videos, and computer-generated imagery, involving the use of a minor engaging in sexually explicit conduct. We have zero tolerance for CSAM and immediately remove it and report it to authorities.

How do I report a concern?

You can report concerns directly through our in-app reporting mechanism, which is accessible within the app. For urgent child safety matters, you can also contact our Child Safety Point of Contact directly at the email address provided below.

What happens after I report?

All reports are reviewed promptly by our safety team. Reports related to child safety are prioritized. We investigate all reports thoroughly and take appropriate action, which may include content removal, account suspension or termination, and reporting to law enforcement or appropriate authorities.

Will my report remain confidential?

Yes, we maintain strict confidentiality regarding reports. However, we may be required to share information with law enforcement or child protection authorities as required by law or to prevent harm.

Does Vimli allow children to use the platform?

Vimli is designed for users aged 13 and older. However, child safety is a priority regardless of our user base. We maintain comprehensive child safety measures to prevent exploitation and protect any minors who may access the platform.

Updates to Our Standards

We regularly review and update our Child Safety Standards to ensure they remain comprehensive and aligned with best practices, legal requirements, and industry standards. Any significant changes will be reflected on this page with an updated "Last updated" date.

Child Safety Point of Contact

For questions, concerns, or to report child safety issues, please contact our designated Child Safety Point of Contact:

Email: childsafety@vimli.com

Subject Line: Child Safety Concern

Please include as much detail as possible, including any relevant account information, content links, screenshots, or other evidence. Urgent matters will be addressed immediately.

Our Child Safety Point of Contact is ready and able to speak to our organization's CSAM prevention practices and compliance with child safety standards. This contact is available to respond to inquiries from Google Play, law enforcement, and other relevant authorities.

Additional Resources

If you believe a child is in immediate danger, please contact your local law enforcement agency immediately.

Our Commitment

Vimli is committed to creating and maintaining a safe environment for all users. Child safety is not just a compliance requirement; it is a fundamental responsibility. We will continue to invest in technology, processes, and partnerships to prevent child exploitation and ensure that our platform remains a safe space for connecting and building communities.

Thank you for helping us maintain a safe platform. Your vigilance and reporting are essential to protecting children online.