Child Safety Standards
Last updated: March 31, 2026
Ayoo is committed to creating a safe environment for all users, with particular attention to the protection of minors. This document outlines the specific measures, systems, and policies we have in place to prevent harm to young users on our platform.
1. Age Verification & Eligibility
Ayoo requires all users to be at least 13 years old to create an account. Age verification is enforced at the point of registration through a mandatory birthday gate:
- Mandatory date of birth: Every user must provide their date of birth before accessing any part of the app. This step cannot be skipped or deferred.
- Server-side validation: The date of birth is verified on our servers, not just on the device. Users who do not meet the minimum age requirement of 13 are immediately rejected and cannot proceed.
- Immutable birthdate: Once submitted, a user's date of birth is permanently locked. This prevents users from altering their age after registration to access content intended for a different age group.
- No access without verification: Users cannot view posts, interact with others, or access any app features until age verification is complete.
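The server-side check described above can be sketched as follows. This is a minimal illustration, not Ayoo's actual implementation; the function names and the `MIN_AGE` constant are assumptions.

```typescript
// Illustrative sketch of a server-side age gate (names and constant are hypothetical).
const MIN_AGE = 13;

// Compute full years elapsed between a date of birth and "now",
// accounting for whether this year's birthday has occurred yet.
function yearsBetween(dob: Date, now: Date): number {
  let age = now.getFullYear() - dob.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > dob.getMonth() ||
    (now.getMonth() === dob.getMonth() && now.getDate() >= dob.getDate());
  if (!hadBirthdayThisYear) age -= 1;
  return age;
}

// Run on the server at registration; clients never make this decision.
function meetsMinimumAge(dob: Date, now: Date = new Date()): boolean {
  return yearsBetween(dob, now) >= MIN_AGE;
}
```

Because the check runs server-side on the submitted date of birth, tampering with the client cannot bypass it.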
2. Age-Segregated Experience
Ayoo separates users into distinct age pools to ensure minors and adults do not interact with each other:
- Minor pool (13–17): Users under 18 are placed in a separate content pool. They can only see posts from and interact with other users in the same age range.
- Adult pool (18+): Users 18 and older are placed in a separate pool with no visibility into the minor pool.
- No cross-pool visibility: Feed queries, discovery, and all social features are filtered by age pool at the database level. There is no mechanism for an adult user to view, react to, or unlock a minor's profile, and vice versa.
- Pool assignment at registration: Each user's age pool is determined at the time of registration based on their verified date of birth and does not change.
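The pool model above can be sketched as a one-time assignment plus a pool-filtered read. All names here are hypothetical; an actual Firestore query would express the same filter as a `where("authorAgePool", "==", viewerPool)` clause.

```typescript
// Illustrative sketch of age-pool assignment and pool-filtered feeds.
type AgePool = "minor" | "adult";

// Assigned once at registration from the verified age; never changes.
function assignAgePool(ageAtRegistration: number): AgePool {
  return ageAtRegistration >= 18 ? "adult" : "minor";
}

// Every feed read filters by the viewer's pool, so cross-pool
// content is excluded at the query level, not hidden client-side.
function visibleTo(
  viewerPool: AgePool,
  posts: { authorAgePool: AgePool }[],
): { authorAgePool: AgePool }[] {
  return posts.filter((p) => p.authorAgePool === viewerPool);
}
```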
3. Content Moderation
Every post and story published on Ayoo passes through automated content moderation before it becomes visible to other users:
- AI-powered text moderation: All text content is analyzed by an AI moderation system that evaluates for hate speech, violence, sexual content, self-harm references, harassment, and threatening language.
- AI-powered image moderation: All images uploaded to stories are automatically analyzed for explicit content, nudity, violence, and other prohibited material. Images that violate our guidelines are blocked before publication.
- Stricter thresholds for minors: The moderation system applies significantly stricter sensitivity thresholds for content posted within the minor pool. Content that might pass moderation in the adult pool may be blocked or flagged for review in the minor pool. This applies to both text and images.
- Automatic blocking: Content (text or images) that exceeds moderation thresholds is automatically blocked from publication. The user is informed that their post or story violates community guidelines.
- Flagged for review: Content that falls in a borderline range is published but flagged internally for human review. If found to be in violation upon review, it is removed and the author is notified.
- Moderation audit log: All moderation decisions are logged with the content, scores, flagged categories, and action taken. This log is inaccessible to users and exists solely for safety review and accountability purposes.
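The tiered decision logic above (block / flag / publish, with stricter cutoffs for the minor pool) can be sketched as follows. The numeric thresholds are invented for illustration and do not reflect Ayoo's real tuning.

```typescript
// Illustrative sketch of pool-aware moderation thresholds (values hypothetical).
type ModerationAction = "publish" | "flag_for_review" | "block";
type Pool = "minor" | "adult";

interface Thresholds {
  block: number; // score at or above this: block before publication
  flag: number;  // score at or above this (but below block): publish + human review
}

// The minor pool uses lower (stricter) cutoffs for the same scores.
const THRESHOLDS: Record<Pool, Thresholds> = {
  adult: { block: 0.9, flag: 0.7 },
  minor: { block: 0.7, flag: 0.4 },
};

function moderate(score: number, pool: Pool): ModerationAction {
  const t = THRESHOLDS[pool];
  if (score >= t.block) return "block";
  if (score >= t.flag) return "flag_for_review";
  return "publish";
}
```

Note how the same score can yield different outcomes: a borderline score that is merely flagged in the adult pool is blocked outright in the minor pool.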
4. Reporting & User Safety Tools
Ayoo provides multiple tools for users to report harmful content and protect themselves:
- In-app reporting: Any user can report a post or story directly from their feed or the story viewer. Reports are categorized by type: spam, impersonation or catfishing, sexual content, harassment, underage user, or other.
- Underage user reporting: A dedicated “Underage User” report category allows any user to flag accounts they believe belong to someone under 13, triggering a priority review.
- Report and block: When reporting, users can choose to simultaneously block the reported user, immediately removing all of that user's content from their feed.
- Anonymous reporting: The identity of the reporter is never disclosed to the reported user.
- Blocking: Users can block any other user at any time. Blocked users cannot see each other's posts or interact in any way. Blocks can be managed and reversed from the Settings screen.
5. Strike System & Enforcement
Ayoo enforces a progressive discipline system to address violations:
- Automatic escalation: When a post receives 5 or more reports from different users, it is automatically hidden from the feed and the author receives a strike.
- Strike 1 — Warning: The user is notified that their post was removed for violating community guidelines. No restrictions are applied.
- Strike 2 — 24-hour posting ban: The user can still browse and interact but cannot publish new posts for 24 hours.
- Strike 3 — 7-day full suspension: The user is locked out of the app entirely for 7 days. All their active posts are hidden during the suspension.
- Strike 4+ — Permanent ban: The account is permanently banned and the device is blacklisted to prevent the user from creating new accounts on the same device.
- Strike decay: Strikes decay by one after 90 days of clean behavior, giving users a path to recovery while maintaining accountability.
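The escalation ladder above maps directly to a small lookup. This is an illustrative sketch of the policy as written, not production code.

```typescript
// Illustrative sketch of the strike-to-penalty mapping described above.
type Penalty =
  | { kind: "warning" }
  | { kind: "posting_ban"; hours: number }
  | { kind: "suspension"; days: number }
  | { kind: "permanent_ban"; blacklistDevice: true };

// Reports from distinct users needed before auto-hide + strike.
const REPORT_THRESHOLD = 5;

// One strike is removed after this many days of clean behavior.
const STRIKE_DECAY_DAYS = 90;

function penaltyForStrike(strikes: number): Penalty {
  if (strikes <= 1) return { kind: "warning" };
  if (strikes === 2) return { kind: "posting_ban", hours: 24 };
  if (strikes === 3) return { kind: "suspension", days: 7 };
  return { kind: "permanent_ban", blacklistDevice: true };
}
```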
6. Predatory Behavior & Grooming Prevention
- Age-pool isolation: The most fundamental protection against predatory behavior is the complete separation of minors and adults. Adults cannot discover, view, or contact minors through the app, and minors cannot discover or contact adults.
- No direct messaging: Ayoo does not include a direct messaging feature. All interactions occur through public posts, reactions, and the unlock/reveal system. This eliminates the possibility of private, unsupervised contact between users.
- Zero tolerance: Any attempt to groom, exploit, or engage in inappropriate contact with minors results in an immediate permanent ban with device blacklisting. We may also report such behavior to relevant law enforcement authorities as required by law.
7. Privacy Protections for Minors
- Limited data collection: We collect only the minimum data necessary to provide the service. For minors, this includes date of birth (for age verification), authentication credentials, and in-app activity data.
- No public profiles for anyone: User profiles are not indexed by search engines and are only visible within the app to users in the same age pool.
- Content expiration: All posts, stories, and story images automatically expire and are permanently deleted after 7 days. This limits the long-term exposure of any content posted by minors.
- Privacy controls: Users can control whether their online status and age are visible to others through in-app privacy settings.
- Snapchat username protection: A user's Snapchat username is not publicly visible. Another user must spend in-app currency to unlock it, creating a deliberate barrier that prevents mass harvesting of contact information. Users are notified whenever someone unlocks their Snapchat username.
8. Account & Device Controls
- Device account limits: Each physical device is limited to a maximum number of accounts to prevent abuse, ban evasion, and the creation of duplicate accounts.
- Device blacklisting: When a user is permanently banned, their device is blacklisted. No new accounts can be created from that device, preventing banned users from returning with a fresh account.
- Deleted account tracking: When an account is deleted, the associated device is tracked to prevent abuse of signup bonuses or other new-user incentives.
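The device gate described above can be sketched as a single check consulted at signup. The account cap here is illustrative; the document does not state the actual limit.

```typescript
// Illustrative sketch of the per-device signup gate (cap is hypothetical).
const MAX_ACCOUNTS_PER_DEVICE = 3;

interface DeviceRecord {
  accountCount: number; // accounts ever created on this device (incl. deleted)
  blacklisted: boolean; // set when an account on this device is permanently banned
}

// Evaluated server-side before any new account is created.
function canCreateAccount(device: DeviceRecord): boolean {
  return !device.blacklisted && device.accountCount < MAX_ACCOUNTS_PER_DEVICE;
}
```

Counting deleted accounts toward the cap is what prevents signup-bonus farming via repeated delete-and-recreate cycles.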
9. Data Security
- Server-side enforcement: All safety rules (age verification, age-pool separation, moderation thresholds, strike escalation, device limits) are enforced on the server through Cloud Functions and Firestore Security Rules. They cannot be bypassed from the client.
- Protected fields: Sensitive user fields such as strike counts, ban status, gem balances, and moderation metadata are server-only and cannot be modified by users.
- Report confidentiality: Reports and moderation logs are stored in collections that are completely inaccessible to any user, including the reporter and the reported party.
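Server-side enforcement of the protected fields above might look like the sketch below, which rejects any client write touching a server-only field; the field names are assumptions. (In practice Firestore Security Rules can express the same constraint declaratively.)

```typescript
// Illustrative sketch: reject client writes that touch server-only fields
// (field names are hypothetical).
const PROTECTED_FIELDS = new Set(["strikes", "banned", "gemBalance", "moderation"]);

// Returns true only if the update touches no protected field;
// the server would reject the write otherwise.
function isAllowedClientWrite(update: Record<string, unknown>): boolean {
  return Object.keys(update).every((field) => !PROTECTED_FIELDS.has(field));
}
```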
10. Appeals & Support
Users who believe their content was removed or their account was penalized in error can contact us at support@ayooapp.com. Every inquiry is reviewed by a human, and we respond within 7 business days.
11. Cooperation with Authorities
In cases involving illegal activity, imminent safety threats, or suspected child exploitation, Ayoo will cooperate fully with law enforcement authorities as required by applicable law. We take our legal obligations regarding child safety reporting seriously and will act promptly when such situations arise.
12. Continuous Improvement
We regularly review and update our child safety measures as the platform grows and as new risks emerge. This includes adjusting moderation thresholds, adding new safety features, and incorporating feedback from users and safety experts. This policy will be updated to reflect any changes, and the “Last updated” date at the top of this page will be revised accordingly.
Contact Us
If you have concerns about child safety on Ayoo, or if you believe a minor is at risk, please contact us immediately at support@ayooapp.com. For emergencies, please contact your local law enforcement.