Last updated: 11/1/2025
Vimli takes the safety of children on our platform seriously and is committed to keeping our service free of child sexual abuse and exploitation. As a social networking platform, we comply with Google Play's Child Safety Standards policy. We are dedicated to maintaining a safe environment for all users, especially children.
This document outlines our standards against child sexual abuse and exploitation (CSAE), our mechanisms for reporting concerns, and our commitment to addressing child safety issues in accordance with applicable laws.
Vimli has zero tolerance for any form of child sexual abuse and exploitation (CSAE): content or behavior that sexually exploits, abuses, or endangers children. This includes, but is not limited to, grooming, sextortion, trafficking, and any other form of sexual exploitation of minors.
We prohibit any content, behavior, or activity that violates child safety laws or our terms of service. Any user found to be engaging in such activities will be immediately banned from our platform, and we will report all instances to relevant law enforcement authorities and appropriate reporting organizations.
CSAM stands for child sexual abuse material. It is illegal and our Terms of Service prohibit using Vimli to store or share this content. CSAM consists of any visual depiction, including but not limited to photos, videos, and computer-generated imagery, involving the use of a minor engaging in sexually explicit conduct.
We actively work to prevent CSAM from appearing on our platform through content moderation and proactive monitoring.
When we obtain actual knowledge of CSAM on our platform, we take immediate action, including removing the content, suspending or terminating the responsible accounts, and reporting the material to law enforcement and appropriate reporting organizations.
Vimli provides an in-app feedback mechanism that allows users to report concerns related to child safety or any inappropriate content or behavior. Users can access this mechanism directly within the app without leaving the application.
To report a concern, use this in-app mechanism to flag the content, account, or behavior in question.
All reports are reviewed promptly by our dedicated safety team. Reports related to child safety are prioritized and handled with the utmost urgency. We maintain strict confidentiality and take all reports seriously.
For urgent child safety concerns, please also contact us directly at the email address provided in the Contact section below.
Vimli complies with all applicable child safety laws and regulations in the jurisdictions where we operate.
We regularly review and update our policies and procedures to ensure continued compliance with evolving legal requirements and industry best practices. Our standards are also inspired by the Tech Coalition Child Safety Standards.
Vimli is designed for users aged 13 and older. While our platform has age restrictions in place, we recognize that child safety is paramount regardless of the intended user base, and we maintain comprehensive child safety measures to protect any minors who may access the platform.
We implement multiple layers of protection, including age-gating features, content moderation, and proactive monitoring to prevent child exploitation.
All Vimli team members who handle user reports or content moderation receive specialized training on recognizing, handling, and escalating child safety issues.
We are committed to continuous improvement and regularly update our training programs to reflect the latest best practices and legal requirements.
Vimli works closely with law enforcement, child protection organizations, and other relevant authorities to enhance child safety.
What is CSAE?
CSAE refers to child sexual abuse and exploitation, including content or behavior that sexually exploits, abuses, or endangers children. This includes grooming, sextortion, trafficking, or any other form of sexual exploitation of minors.

What is CSAM?
CSAM stands for child sexual abuse material. It is illegal material consisting of any visual depiction, including photos, videos, and computer-generated imagery, involving the use of a minor engaging in sexually explicit conduct. We have zero tolerance for CSAM and immediately remove it and report it to authorities.

How do I report a concern?
You can report concerns directly through our in-app reporting mechanism, which is accessible within the app. For urgent child safety matters, you can also contact our Child Safety Point of Contact directly at the email address provided below.

What happens after I submit a report?
All reports are reviewed promptly by our safety team. Reports related to child safety are prioritized. We investigate all reports thoroughly and take appropriate action, which may include content removal, account suspension or termination, and reporting to law enforcement or appropriate authorities.

Are reports confidential?
Yes, we maintain strict confidentiality regarding reports. However, we may be required to share information with law enforcement or child protection authorities as required by law or to prevent harm.

Is Vimli intended for children?
Vimli is designed for users aged 13 and older. However, child safety is a priority regardless of our user base. We maintain comprehensive child safety measures to prevent exploitation and protect any minors who may access the platform.
We regularly review and update our Child Safety Standards to ensure they remain comprehensive and aligned with best practices, legal requirements, and industry standards. Any significant changes will be reflected on this page with an updated "Last updated" date.
For questions, concerns, or to report child safety issues, please contact our designated Child Safety Point of Contact:
Email: childsafety@vimli.com
Subject Line: Child Safety Concern
Please include as much detail as possible, including any relevant account information, content links, screenshots, or other evidence. Urgent matters will be addressed immediately.
Our Child Safety Point of Contact is ready and able to speak to our organization's CSAM prevention practices and compliance with child safety standards. This contact is available to respond to inquiries from Google Play, law enforcement, and other relevant authorities.
If you believe a child is in immediate danger, please contact your local law enforcement agency immediately.
Vimli is committed to creating and maintaining a safe environment for all users. Child safety is not just a compliance requirement—it is a fundamental responsibility. We will continue to invest in technology, processes, and partnerships to prevent child exploitation and ensure that our platform remains a safe space for connecting and building communities.
Thank you for helping us maintain a safe platform. Your vigilance and reporting are essential to protecting children online.