
IRYS APP
CHILD SAFETY POLICY
Last Modification: 24 June 2025.
Purpose
Irys is committed to protecting children from sexual exploitation and abuse. While our platform is intended strictly for users aged 18 and over, we have adopted proactive child safety measures to detect, report, and remove any content or behaviour that violates these principles.
This policy aligns with obligations under UK law, including the Sexual Offences Act 2003, the Online Safety Act 2023, and applicable platform policies (e.g. Google Play Developer Policy).
Scope
This policy applies to:
- All users and user-generated content (UGC)
- All features and functionalities of the platform (e.g. profiles, media sharing)
- All employees, moderators, contractors, and third-party partners
Standards and Prohibited Content
Irys has a zero-tolerance policy for Child Sexual Abuse and Exploitation (CSAE) content or behaviour, including but not limited to:
- Sexual abuse material involving children (imagery, audio, video, text descriptions)
- Grooming or attempts to contact or exploit children for sexual purposes
- Sexualised depictions or simulated abuse involving minors (real or fictional)
- Content implying or promoting child sexual abuse or exploitation
- Attempts to solicit, trade, or share CSAE material
- Any user misrepresenting their age to appear as a minor, or engaging in sexual roleplay involving minors
All such content is strictly prohibited and will be removed and reported to relevant authorities.
Age Restriction and Access Controls
Irys is a platform for users aged 18 and over only. In line with a proportionate approach to age assurance, we rely on user reporting and post-registration moderation to uphold our 18+ policy:
- User reporting tools that allow suspected underage accounts to be flagged by others
- Manual review of all reports concerning underage use or potential CSAE behaviour
- Timely suspension and investigation of any account reported as potentially underage
If a child is identified or reasonably suspected to be using the platform, their account will be deactivated and escalated to our Safety and Compliance Team for further action.
Detection and Moderation Measures
We use a mix of tools and human oversight to keep our platform safe from CSAE content:
- Automated scanning of images using trusted tools
- Manual checks by trained moderators, who quickly escalate any suspected CSAE content
- Regular updates to our moderation policies and staff training
Reporting and Escalation Procedures
Users and moderators can report content using clearly labelled, accessible reporting tools. Reports are prioritised based on severity, and any CSAE-related reports follow this escalation pathway:
Internal Response:
- Timely content removal
- Temporary account suspension
- Logging of user metadata (e.g. account details, content type)
External Reporting:
- CSAM (imagery, video, or text) is reported to the Internet Watch Foundation (IWF): https://www.iwf.org.uk/report
- Grooming or child protection concerns are reported to NCA-CEOP: https://www.ceop.police.uk/
- In cases of imminent danger, the matter is referred to the police via 999
All incidents are documented for audit and follow-up purposes.
Data Retention and Privacy
CSAE-related data is handled securely and in line with the Data Protection Act 2018 and UK GDPR. Specifically:
- Content is retained only where necessary for investigation and law enforcement cooperation
- Access to CSAE content is strictly limited to trained personnel
- We do not store or duplicate illegal imagery beyond what is legally required
Compliance and Review
This policy is reviewed:
- Annually, or
- In response to new legislation, platform policy changes, or emerging threats
Contact and Escalation
All urgent concerns or questions related to this policy, including reports of CSAE content or suspected underage users, should be directed to our Safety and Compliance Team at safety@irysphotos.com. This inbox is monitored regularly.